CN117648683A - Service identity authentication method, device, equipment and medium based on gesture signature - Google Patents

Service identity authentication method, device, equipment and medium based on gesture signature

Info

Publication number
CN117648683A
Authority
CN
China
Prior art keywords
hand
image
target user
key point
signature
Prior art date
Legal status
Pending
Application number
CN202311604221.5A
Other languages
Chinese (zh)
Inventor
孙超
Current Assignee
Ping An Life Insurance Company of China Ltd
Original Assignee
Ping An Life Insurance Company of China Ltd
Priority date: 2023-11-27
Filing date: 2023-11-27
Publication date: 2024-03-05
Application filed by Ping An Life Insurance Company of China Ltd filed Critical Ping An Life Insurance Company of China Ltd


Landscapes

  • Collating Specific Patterns (AREA)

Abstract

The invention relates to a business identity authentication method, device, terminal equipment and medium based on gesture signature. The method comprises: acquiring hand images of a target user during signing; inputting the hand images into a trained hand feature extraction network model to obtain n first hand key point features; comparing the n first hand key point features with n preset second hand key point features to obtain a comparison result for each hand image; and, after the comparison results of the n hand key point features of the hand images pass, acquiring the complete signature of the target user, comparing the signature of the target user with a preset signature, and confirming the service identity of the target user according to the comparison result. Because hand feature information is used for identity authentication at the time of signing, identity authentication is safer and more reliable: detailed features are used to analyze the user's identity, reducing the possibility of identity theft.

Description

Service identity authentication method, device, equipment and medium based on gesture signature
Technical Field
The invention is applicable to the field of financial services, and in particular relates to a business identity authentication method, device, equipment and medium based on gesture signature.
Background
With the development of computer and network technology, more and more applications need to authenticate the identity of a user in order to protect the user's privacy and data security. Common identity authentication methods currently include static passwords, dynamic passwords, biometric features and the like. Static passwords and dynamic passwords are easy to use but easy to crack or steal; biometric features, while unique and non-replicable, are susceptible to environmental factors or counterfeiting attacks. Therefore, an identity authentication method that is safer, more reliable and easier to operate is needed in practical applications.
Disclosure of Invention
In view of the above, the embodiments of the present invention provide a gesture-signature-based service identity authentication method, apparatus, device, and medium, so as to solve the problem that conventional authentication methods are easily cracked or counterfeited.
In a first aspect, an embodiment of the present invention provides a method for authenticating a service identity based on gesture signature, where the method includes:
acquiring at least two hand images signed by a target user during service identity authentication;
inputting each hand image into a trained hand feature extraction network model to obtain n first hand key point features of each hand image, wherein n is more than 2;
comparing the n first hand key point features of each hand image with n preset second hand key point features of each hand image to obtain a comparison result of each hand image;
and after the n hand key point characteristic comparison results of each hand image pass, acquiring the complete signature of the target user, comparing the signature of the target user with a preset signature, and confirming the service identity of the target user according to the comparison result.
In a second aspect, an embodiment of the present invention provides a service identity authentication device based on gesture signature, including:
the hand image acquisition module is used for acquiring at least two hand images signed by a target user when carrying out service identity authentication;
the key point acquisition module is used for inputting each hand image into the trained hand feature extraction network model to obtain n first key point features of the hands of each hand image, wherein n is more than 2;
the image feature comparison module is used for comparing the n first hand key point features of each hand image with the n preset second hand key point features of each hand image to obtain a comparison result of each hand image;
and the identity confirmation module is used for acquiring the complete signature of the target user after the characteristic comparison results of the n hand key points of each hand image are passed, comparing the signature of the target user with a preset signature, and confirming the service identity of the target user according to the comparison results.
In a third aspect, an embodiment of the present invention provides a terminal device, the terminal device comprising a processor, a memory and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the identity authentication method as in the first aspect when the computer program is executed.
In a fourth aspect, embodiments of the present invention provide a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of the identity authentication method as in the first aspect.
In a fifth aspect, an embodiment of the present invention provides a computer program product for causing a terminal device to carry out the steps of the identity authentication method of the first aspect described above when the computer program product is run on the terminal device.
Compared with the prior art, the invention has the following beneficial effects: when a user signs during identity authentication for a financial service, hand images of the signing process are processed to extract a plurality of hand key point features, so that the user's identity features are analyzed in detail. A series of hand key point feature combinations with a temporal order is used to identify the target user; that is, the comparison result between the n first key point features extracted from the hand images and the n preset second key point features determines the user's identity. This makes identity authentication safer and more reliable, reduces the possibility of identity theft, makes the key point comparison systematic, simplifies the operation flow, and allows identity authentication to be carried out simply and conveniently.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of an application environment of an authentication method according to an embodiment of the present invention;
FIG. 2 is a flowchart of an identity authentication method according to an embodiment of the present invention;
FIG. 3 is a flow chart of acquiring hand key point features according to an embodiment of the invention;
FIG. 4 is a flow chart of comparing hand image features according to an embodiment of the present invention;
FIG. 5 is a flow chart of steps performed before acquiring a hand image according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a process prior to signature comparison in accordance with an embodiment of the present invention;
FIG. 7 is a schematic diagram of an identity authentication device according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in the present description and the appended claims, the term "if" may be interpreted as "when", "once", "in response to a determination" or "in response to detection", depending on the context. Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
Furthermore, the terms "first," "second," "third," and the like in the description of the present specification and in the appended claims, are used for distinguishing between descriptions and not necessarily for indicating or implying a relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the invention. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
It should be understood that the sequence numbers of the steps in the following embodiments do not mean the order of execution, and the execution order of the processes should be determined by the functions and the internal logic, and should not be construed as limiting the implementation process of the embodiments of the present invention.
In order to illustrate the technical scheme of the invention, the following description is made by specific examples.
The service identity authentication method based on gesture signature provided by the embodiment of the invention can be applied to an application environment as shown in fig. 1, wherein a client side and a server side communicate through a network. The client includes, but is not limited to, a handheld computer, a desktop computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cloud terminal, a personal digital assistant (personal digital assistant, PDA), and other terminal devices. The server may be implemented by a stand-alone server or a server cluster formed by a plurality of servers.
Referring to fig. 2, a flow chart of a gesture-signature-based service identity authentication method according to an embodiment of the present invention is shown. The identity authentication method may be applied to the client in fig. 1, where the terminal device corresponding to the client is connected to a target database through a preset application programming interface (Application Programming Interface, API). When the target database runs to execute a corresponding task, a corresponding task log is generated, and the task log can be acquired through the API. As shown in fig. 2, the identity authentication method may include the following steps:
step S101, at least two hand images signed by a target user during service identity authentication are obtained.
In one embodiment, when handling a finance-related business, identity authentication needs to be performed on the target user, and at least two images of the hands are taken while the target user signs, so that subsequent steps can determine whether the user is the genuine person.
In this step, the number of images may be two, three, or more, and the acquired images are a set of images captured continuously, in chronological order, within a preset period of time.
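For illustration only, the following minimal Python sketch shows one way such a timed, chronologically ordered burst of images could be captured with OpenCV; the camera index, frame count and capture interval are illustrative assumptions rather than values specified by this disclosure.

```python
import time
import cv2

def capture_hand_images(num_images=3, interval_s=0.5, camera_index=0):
    """Capture num_images frames, one every interval_s seconds, in time order."""
    cap = cv2.VideoCapture(camera_index)
    frames = []
    try:
        while len(frames) < num_images:
            ok, frame = cap.read()
            if ok:
                frames.append(frame)          # keep frames in chronological order
            time.sleep(interval_s)            # wait the preset interval between shots
    finally:
        cap.release()
    return frames
```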
Step S102, inputting each hand image into a trained hand feature extraction network model to obtain n first hand key point features of each hand image, wherein n is more than 2.
The hand feature extraction network model may be the open-source hand key point detection model OpenPose from the CMU Perceptual Computing Lab, or another neural network model capable of identifying hand key points, such as the multimedia machine learning framework MediaPipe.
In the above embodiment, at least two hand images taken while the target user signed have been acquired. The hand images are input into the MediaPipe framework, which extracts n=21 hand key points from each hand image and outputs n=21 first hand key point features accordingly.
In this step, the number of first hand key point features may be set to different values as required; in the prior art, 21 hand key points are generally used to represent the hand features.
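As a non-limiting illustration, a minimal Python sketch of extracting the 21 hand key points with the MediaPipe Hands solution mentioned above is given below; the helper name and the single-hand setting are assumptions made for the example.

```python
import cv2
import mediapipe as mp

def extract_hand_keypoints(bgr_image):
    """Return a list of 21 (x, y, z) hand key points, or None if no hand is detected."""
    with mp.solutions.hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
        # MediaPipe expects RGB input, so convert from OpenCV's BGR ordering.
        results = hands.process(cv2.cvtColor(bgr_image, cv2.COLOR_BGR2RGB))
    if not results.multi_hand_landmarks:
        return None
    landmarks = results.multi_hand_landmarks[0].landmark
    return [(lm.x, lm.y, lm.z) for lm in landmarks]   # 21 normalized coordinates
```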
Step S103, comparing the n first hand key point features of each hand image with the n preset second hand key point features of each hand image to obtain a comparison result of each hand image.
The preset second hand key point features are determined as follows: when the user transacts business for the first time, the hand images collected during the signing process are processed to obtain the hand key point features, which includes:
acquiring at least two hand images signed by a target user when carrying out service identity authentication for the first time;
inputting each hand image into the trained hand feature extraction network model to obtain n second key point features of the hands of each hand image, wherein n is more than 2.
In one embodiment, when the user handles a related business again, the hand images of the current signing are obtained, and their number is equal to the number of hand images obtained at the user's first signing. Each image is processed by the hand feature extraction network model to obtain its hand key point information, and the two groups of hand key point features are compared to obtain a comparison result for each hand image.
Specifically, in the above embodiment, the first transaction and the current transaction each yield 3 hand images at signing. Each image is processed by the OpenPose hand key point detection model to obtain 21 hand key points, and the two sets of images are compared key point feature by key point feature in one-to-one correspondence to obtain the comparison result.
In this step, the comparison result may pass or fail. In one embodiment, the comparison result is judged to pass when all 21 key point features are consistent, and to fail when even one of the 21 key point features is inconsistent. In another embodiment, the comparison result is judged to pass if more than 50% of the key point features match, and to fail if more than 50% of the key points do not match.
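For illustration, a minimal Python sketch of the two pass/fail rules described above follows; treating two key points as consistent when their Euclidean distance falls under a tolerance is an assumption of this sketch, since the disclosure does not fix a particular consistency test.

```python
import math

def keypoints_match(p, q, tol=0.05):
    """Treat two (x, y, z) key points as consistent when they are close (assumed test)."""
    return math.dist(p, q) <= tol

def comparison_passes(first_keypoints, second_keypoints, require_all=False):
    """Pass/fail rule: either all key points must match, or more than 50% of them."""
    matches = sum(keypoints_match(p, q) for p, q in zip(first_keypoints, second_keypoints))
    if require_all:
        return matches == len(first_keypoints)   # "all 21 consistent" embodiment
    return matches / len(first_keypoints) > 0.5  # ">50% match" embodiment
```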
Step S104, after the comparison results of the n hand key point features of each hand image pass, obtaining the complete signature of the target user, comparing the signature of the target user with the preset signature, and confirming the service identity of the target user according to the comparison result.
In one embodiment, when the comparison results pass, the complete signature of the target user is scanned and compared with the signature obtained at the first transaction to judge whether they are consistent, finally confirming whether the target user is the genuine person.
In the identity authentication method described above, identity authentication is performed using hand key point features, that is, the user's behavioral habits are used for authentication, which makes identity authentication safer and more reliable; the hand feature extraction network model used for recognition is simple and easy to operate, and overall makes the comparison result more dependable.
In other embodiments, as shown in fig. 3, in step S102, each hand image is input to a trained hand feature extraction network model to obtain n first key point features of the hand of each hand image, where n >2, including the following steps:
in step S201, the hand feature extraction network model includes a first hand feature extraction network model and a second hand feature extraction network model, which form a twin comparison network model.
Optionally, the ResNet50 network model is adopted to compare the first hand feature with the second hand feature, so as to judge the comparison result of the two features.
Step S202, inputting each hand image into the trained first hand feature extraction network model to obtain n hand first key point features of each hand image.
In one embodiment, the first hand image, obtained at the current signing, is input into the first hand feature extraction network model, namely a ResNet50 network model, for processing, finally yielding the key point features of each hand image.
Step S203, inputting each hand image into the trained second hand feature extraction network model to obtain n second key point features of each hand image.
In one embodiment, the second hand image, obtained at the first signing, is input into the second hand feature extraction network model, namely a ResNet50 network model, for processing, finally yielding the key point features of each hand image.
In this embodiment, the twin comparison network takes two samples as input and outputs their embeddings in a high-dimensional space in order to compare the similarity of the two samples. Specifically, the first hand image and the second hand image are each input into a ResNet50 network model; after key point feature extraction, the two sets of key point features are compared, the embeddings of the two images in the high-dimensional space are output, and the degree of similarity between the two images is compared.
According to the identity verification method of this embodiment, key point feature extraction is carried out on the first hand image and the second hand image using the twin comparison network model, which simplifies the identity verification operation and makes identity verification more convenient.
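As an illustrative sketch only, the twin comparison network described above could be realised in Python with a shared ResNet50 backbone as follows; the embedding dimension, the L2 normalisation and the cosine-similarity output are assumptions of this sketch rather than details fixed by the disclosure.

```python
import torch
import torch.nn.functional as F
from torchvision import models

class TwinComparisonNet(torch.nn.Module):
    """Twin (Siamese) comparison network: both branches share one ResNet50 backbone."""

    def __init__(self, embedding_dim=128):
        super().__init__()
        backbone = models.resnet50()  # training and weight loading are out of scope here
        backbone.fc = torch.nn.Linear(backbone.fc.in_features, embedding_dim)
        self.backbone = backbone

    def embed(self, x):
        # Map an image batch to unit-length embeddings in the high-dimensional space.
        return F.normalize(self.backbone(x), dim=1)

    def forward(self, first_images, second_images):
        # Cosine similarity between the embeddings of the two inputs, one value per pair.
        return (self.embed(first_images) * self.embed(second_images)).sum(dim=1)

# Usage: similarity = TwinComparisonNet()(current_signing_batch, first_signing_batch)
```

In this sketch the two branches literally share weights, which is the usual reading of a twin network; how the network is trained (for example with a contrastive loss) is outside the scope of the example.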
In other embodiments, as shown in fig. 4, in step S103, n hand first key point features of each hand image are compared with n preset hand second key point features of each hand image to obtain a comparison result of each hand image, which includes the following steps:
step S301, calculating the similarity between the first key point features and the second key point features of the n hands of each hand image to obtain a similarity comparison result of each hand image.
In one embodiment, to compare the N hand images obtained at signing in the current transaction with the N hand images obtained at the first signing, n hand key point features are extracted from each image. After extraction, the n hand key point features of each image are converted into a key point feature vector: the group of n first key point features is converted into a feature vector recorded as the target feature vector, and the n second key point features are converted into a feature vector recorded as the example feature vector. The similarity between the target feature vector and the example feature vector is then calculated to obtain the comparison result.
Specifically, the target feature vector and the example feature vector may be substituted into a preset similarity calculation formula to obtain the similarity between the target feature vector and the example feature vector;
the preset similarity calculation formula is as follows:
wherein X is i For the similarity of the ith target feature and the ith example feature, m i For the ith target feature vector, s i Is the i-th example feature vector, where i=1, 2, …, N.
In this embodiment, the cosine similarity between the target feature vector and the example feature vector is calculated, so that the cosine similarity is used to represent the similarity between the target feature and the corresponding example feature, and the greater the cosine similarity is, the greater the similarity between the target feature and the corresponding example feature is.
In this embodiment, the hand images acquired this time are placed in one-to-one correspondence with the hand images acquired the first time to form N groups of hand images, each group containing two hand images. The n hand key points of each image are jointly converted into a key point feature vector, the feature vectors within each group are compared, and N similarities X_i, i = 1, 2, …, N, are calculated. A similarity threshold X_0 is set: when the similarity is larger than the threshold, the similarity comparison result is judged to pass; when the similarity is smaller than the threshold, the similarity comparison result is judged to fail.
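A minimal Python sketch of this per-image similarity check, assuming the n key points of each image have already been flattened into a single feature vector, is shown below; the function names are illustrative and the threshold value merely echoes the example given later.

```python
import numpy as np

def cosine_similarity(target_vec, example_vec):
    """X_i = (m_i · s_i) / (||m_i|| ||s_i||)."""
    m = np.asarray(target_vec, dtype=float)
    s = np.asarray(example_vec, dtype=float)
    return float(np.dot(m, s) / (np.linalg.norm(m) * np.linalg.norm(s)))

def compare_hand_images(target_vectors, example_vectors, threshold=0.85):
    """Return the N similarities and a per-image pass flag (similarity > threshold)."""
    sims = [cosine_similarity(m, s) for m, s in zip(target_vectors, example_vectors)]
    return sims, [x > threshold for x in sims]
```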
Step S302, determining comparison results of the hand images according to the similarity comparison results of the hand images.
In one embodiment, 3 groups of hand images are obtained and each image has 21 key point features; 3 similarity values are calculated, namely 0.95, 0.87 and 0.96. With a similarity threshold of 0.85, the average similarity of the 3 groups of hand images is larger than the threshold, so the groups of hand images are judged to be similar.
The identity verification method of this embodiment calculates the similarity of the hand images at signing using a similarity formula; the result is accurate and reliable, and the matching accuracy between features is effectively improved.
In other embodiments, as shown in fig. 5, before step S101 of acquiring at least two hand images signed by the target user during business identity authentication, the method further includes:
in step S401, a first image of the target user is acquired, and living body detection is performed based on the first image of the target user.
In one embodiment, living body detection is performed on the first image of the target user using a single-frame input method: after the image of the user is obtained, specular reflection and distortion processing are applied to the image to obtain a series of statistical features, the processed statistical features are combined, the combined feature set is fed to a binary classifier, and the classification result is analyzed to determine whether the user is a living body.
Optionally, living body detection can also use cameras of different modalities to capture the input images: a light field camera is used to take a raw light field photo, and the sub-images formed by its microlens array are analyzed to judge whether the user is a living body.
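Purely as an illustration of the single-frame statistical approach described above, the sketch below derives a few simple image statistics and feeds them to a binary classifier; the specific statistics (a highlight ratio as a specular-reflection cue, a Laplacian variance as a distortion cue, HSV colour moments) and the choice of classifier are assumptions, not the features prescribed by this disclosure.

```python
import cv2
import numpy as np
from sklearn.linear_model import LogisticRegression

def liveness_features(bgr_image):
    """Combine a few simple statistics of a single frame into one feature vector."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    highlight_ratio = float(np.mean(gray > 240))              # crude specular-reflection cue
    sharpness = float(cv2.Laplacian(gray, cv2.CV_64F).var())  # crude distortion/blur cue
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    color_stats = np.concatenate([hsv.mean(axis=(0, 1)), hsv.std(axis=(0, 1))])
    return np.concatenate([[highlight_ratio, sharpness], color_stats])

# Two-class decision: fit on labelled live/spoof frames, then classify a new frame.
# classifier = LogisticRegression().fit(np.stack([liveness_features(f) for f in frames]), labels)
# is_live = classifier.predict(liveness_features(new_frame).reshape(1, -1))[0] == 1
```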
Step S402, after the living body detection passes, a second image of the target user is acquired, and face features in the second image are extracted.
In one embodiment, after the target is detected to be a living body, an image of the target user is acquired again, the face region is recognized, and the face region is cropped so that the face features in the image can be extracted.
Step S403, obtaining a pre-stored reference face feature of the target user, and matching the face feature with the reference face feature to obtain a face feature matching result.
In the above embodiment, the obtained face features are matched against the reference face features in the system sample library to obtain a face feature matching result, confirming the identity of the target user.
The identity authentication method of this embodiment performs living body detection, which prevents a user from passing the check with a photo, and recognizes face features so that the user's identity is confirmed from the face information, making identity authentication more comprehensive and thorough.
In other embodiments, in the step S103, the hand image used for extracting the hand key point feature is a left hand image and/or a right hand image of the target user in the preset body posture.
The preset body posture is the body posture when the user transacts the business for the first time, and can be a standing posture or a sitting posture.
In an embodiment, the hand image acquired in advance may be the hand image of the user signing with the right hand in a standing posture, or, when the user is left-handed or the right hand is impaired, the hand image of the user signing with the left hand in a standing posture.
In one embodiment, to avoid situations such as a hand injury changing the hand image at signing, multiple hand images of the user may be acquired according to the target user's own wishes, including but not limited to a right-hand signing image in a sitting posture, a left-hand signing image in a sitting posture, a right-hand signing image in a standing posture, and a left-hand signing image in a standing posture, with n hand images for each case; the user selects one or more of these cases for acquisition.
The identity verification method of the embodiment provides hand images under various postures, avoids authentication failure caused by the influence of certain factors on the hand images when a target user handles subsequent business, improves the accuracy and fault tolerance of identity authentication, and improves user experience.
In other embodiments, as shown in fig. 6, before step S104 (that is, after the comparison results of the n hand key point features of each hand image pass, acquiring the complete signature of the target user, comparing the signature of the target user with the preset signature, and confirming the service identity of the target user according to the comparison result), the method further includes:
step S601, detecting a plurality of pose information of the electronic pen within a preset time by using the electronic pen which is signed when the target user performs service identity authentication, and obtaining a stroke pose sequence of the signature.
The electronic pen for signing is internally provided with a gyroscope, and the electronic pen can collect pose information when a user signs.
In one embodiment, the target user signs with the electronic pen; during signing, the electronic pen senses the pose of the pen body in real time and records the sequence of pose changes, that is, the stroke pose sequence of the current signature.
Step S602, comparing the stroke pose sequence with a pre-stored reference stroke pose sequence to obtain a stroke comparison result of the signature.
The pre-stored reference stroke pose sequence is a stroke pose sequence acquired when a target user performs service handling for the first time.
In one embodiment, after the target user signs with the electronic pen, the stroke pose sequence is compared for similarity with the stroke pose sequence stored at registration to obtain the stroke comparison result.
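As a minimal sketch, and assuming the captured stroke pose sequence and the reference sequence are aligned lists of pose vectors of equal length, this similarity comparison could look as follows; the averaging of per-stroke cosine similarities and the threshold value are assumptions of the example.

```python
import numpy as np

def stroke_sequence_similarity(pose_sequence, reference_sequence):
    """Mean cosine similarity over aligned per-stroke pose vectors."""
    sims = []
    for p, r in zip(np.asarray(pose_sequence, dtype=float),
                    np.asarray(reference_sequence, dtype=float)):
        sims.append(np.dot(p, r) / (np.linalg.norm(p) * np.linalg.norm(r)))
    return float(np.mean(sims))

def stroke_comparison_passes(pose_sequence, reference_sequence, threshold=0.9):
    """Stroke comparison result: pass when the sequence similarity exceeds the threshold."""
    return stroke_sequence_similarity(pose_sequence, reference_sequence) > threshold
```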
Step S603, after the stroke comparison result of the current signature passes or the n hand key point feature comparison results of each hand image pass, obtaining the complete signature of the target user.
In one embodiment, if for some objective reason the comparison of the user's hand key point features does not pass, but the stroke pose sequence is consistent with the stroke pose sequence archived at registration, the comparison is judged to pass, and the next step of verifying the complete signature can proceed.
The identity verification method of this embodiment takes into account the stability of the target user's pen-holding and writing habits at signing, that is, a person's sequence of strokes is basically unchanged when signing. A dedicated signing electronic pen senses the pose of the pen body in real time, the stroke pose sequence at signing is extracted, and its similarity to the pose information recorded at initial registration is judged, adding another line of defense to identity authentication and improving the accuracy of the judgment result.
In other embodiments, in step S601, the number of pieces of pose information in the stroke pose sequence is less than or equal to the number of strokes in the target user's name.
The number of pieces of pose information in the stroke pose sequence may equal the number of strokes in the target user's name: when confirming the user's signature, the pieces of pose information of the electronic pen corresponding to all strokes of the name are detected as the user writes, yielding the stroke pose sequence of the user's full name.
In other embodiments, considering that when the user signs, the pen habit for one part of the characters may be fixed while the habit for another part varies, the number of pieces of pose information in the stroke pose sequence may be smaller than the number of strokes in the target user's name: when confirming the user's signature, only the pieces of pose information of the electronic pen corresponding to the strokes of the user's surname or given name are detected, yielding the stroke pose sequence of the surname or given name.
In one embodiment, the target user's name is Wang Hong; the pen habit varies when the user writes the character "Hong" in the name, but the habit of writing the character "Wang" is fixed, so only the pose information of the electronic pen when the user writes "Wang" needs to be detected, yielding the stroke pose information of the user's surname.
The identity verification method of this embodiment considers different conditions at signing and limits the acquisition of the stroke pose sequence while keeping verification accurate.
Fig. 7 shows a block diagram of a gesture signature-based service identity authentication device according to an embodiment of the present invention, where the identity authentication device is applied to a terminal device, and the terminal device is connected to a target database through a preset application program interface. When the target database is driven to run to execute the corresponding task, a corresponding task log is generated, and the task log can be acquired through an API. For convenience of explanation, only portions relevant to the embodiments of the present invention are shown.
Referring to fig. 7, the authentication apparatus includes: the hand image acquisition module 81, the key point acquisition module 82, the image feature comparison module 83 and the identity confirmation module 84.
The hand image acquisition module is used for acquiring at least two hand images signed by a target user when carrying out service identity authentication;
the key point acquisition module is used for inputting each hand image into the trained hand feature extraction network model to obtain n first key point features of the hands of each hand image, wherein n is more than 2;
the image feature comparison module is used for comparing the n first hand key point features of each hand image with the n preset second hand key point features of each hand image to obtain a comparison result of each hand image;
and the identity confirmation module is used for acquiring the complete signature of the target user after the characteristic comparison results of the n hand key points of each hand image are passed, comparing the signature of the target user with a preset signature, and confirming the service identity of the target user according to the comparison results.
Optionally, the key point obtaining module includes:
the network model construction unit is used for the hand feature extraction network model to comprise a first hand feature extraction network model and a second hand feature extraction network model, and the first hand feature extraction network model and the second hand feature extraction network model form a twin comparison network model;
the first key point feature acquisition unit is used for inputting each hand image into the trained first hand feature extraction network model to obtain n hand first key point features of each hand image;
the second key point feature acquisition unit is used for inputting each hand image into the trained second hand feature extraction network model to obtain n hand second key point features of each hand image.
Optionally, the image feature comparison module includes:
the similarity comparison unit is used for calculating the similarity between the first key point features and the second key point features of the n hands of each hand image to obtain n similarity comparison results of each hand image;
and the comparison result determining unit is used for determining the comparison result of each hand image according to the n similarity comparison results of each hand image.
Optionally, the hand image acquisition module includes:
a living body detection unit for acquiring a first image of a target user, and performing living body detection according to the first image of the target user;
the face feature extraction unit is used for acquiring a second image of the target user after the living body detection passes and extracting face features in the second image;
the matching result obtaining unit is used for obtaining the pre-stored reference face characteristics of the target user, and matching the face characteristics with the reference face characteristics to obtain a face characteristic matching result.
Optionally, the hand image acquisition module further includes:
the hand image construction unit is used for the hand image to be the hand image of the left hand and/or the hand image of the right hand of the target user under the preset body posture.
Optionally, the identity confirmation module includes:
the pose sequence acquisition unit is used for detecting a plurality of pose information of the electronic pen within preset time by using the electronic pen signed by the target user when carrying out service identity authentication to obtain a stroke pose sequence of the signature;
the comparison result acquisition unit is used for comparing the stroke pose sequence with a pre-stored reference stroke pose sequence to obtain a stroke comparison result of the signature;
the complete signature acquisition unit is used for acquiring the complete signature of the target user after the stroke comparison result of the signature passes or the characteristic comparison results of n hand key points of each hand image pass.
Optionally, the pose sequence obtaining unit includes:
and the pose information acquisition subunit is used for acquiring the pose information of the stroke pose sequence, wherein the number of the pose information is smaller than or equal to the number of name strokes of the target user.
It should be noted that, because the content of information interaction and execution process between the modules and the embodiment of the method of the present invention are based on the same concept, specific functions and technical effects thereof may be referred to in the method embodiment section, and details thereof are not repeated herein.
Fig. 8 is a schematic structural diagram of a terminal device according to a fourth embodiment of the present invention. As shown in fig. 8, the terminal device of this embodiment includes: at least one processor (only one shown in fig. 8), a memory, and a computer program stored in the memory and executable on the at least one processor, the processor implementing the steps in any of the authentication method embodiments described above when the computer program is executed.
The terminal device may include, but is not limited to, a processor, a memory. It will be appreciated by those skilled in the art that fig. 8 is merely an example of a terminal device and is not limiting of the terminal device, and that the terminal device may comprise more or less components than shown, or may combine some components, or different components, e.g. may further comprise a network interface, a display screen, an input device, etc.
The processor may be a CPU, but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory includes a readable storage medium, an internal memory, etc., where the internal memory may be a memory of the terminal device, and the internal memory provides an environment for the operation of an operating system and computer readable instructions in the readable storage medium. The readable storage medium may be a hard disk of the terminal device, and in other embodiments may be an external storage device of the terminal device, for example, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash Card (Flash Card), etc. that are provided on the terminal device. Further, the memory may also include both an internal storage unit of the terminal device and an external storage device. The memory is used to store an operating system, application programs, boot loader (BootLoader), data, and other programs such as program codes of computer programs, and the like. The memory may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present invention. The specific working process of the units and modules in the above device may refer to the corresponding process in the foregoing method embodiment, which is not described herein again. The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present invention may implement all or part of the flow of the method of the above-described embodiment, and may be implemented by a computer program to instruct related hardware, and the computer program may be stored in a computer readable storage medium, where the computer program, when executed by a processor, may implement the steps of the method embodiment described above. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, executable files or in some intermediate form, etc. The computer readable medium may include at least: any entity or device capable of carrying computer program code, a recording medium, a computer Memory, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium. Such as a U-disk, removable hard disk, magnetic or optical disk, etc. In some jurisdictions, computer readable media may not be electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
The present invention may also be implemented by a computer program product for implementing all or part of the steps of the method embodiments described above, when the computer program product is run on a terminal device, causing the terminal device to execute the steps of the method embodiments described above.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or detailed in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and are intended to be included in the scope of the present invention.

Claims (10)

1. A business identity authentication method based on gesture signature, the method comprising:
acquiring at least two hand images signed by a target user when carrying out service identity authentication;
inputting each hand image into a trained hand feature extraction network model to obtain n first hand key point features of each hand image, wherein n is more than 2;
comparing the n first hand key point features of each hand image with the n preset second hand key point features of each hand image to obtain comparison results of each hand image;
and after the comparison results of the hand images pass, acquiring a complete signature of the target user, comparing the signature of the target user with a preset signature, and confirming the service identity of the target user according to the comparison results.
2. The identity authentication method according to claim 1, wherein the inputting each of the hand images into the trained hand feature extraction network model to obtain n hand first key point features of each of the hand images, n >2, includes:
the hand feature extraction network model comprises a first hand feature extraction network model and a second hand feature extraction network model, and the first hand feature extraction network model and the second hand feature extraction network model form a twin comparison network model;
inputting each hand image into a trained first hand feature extraction network model to obtain n hand first key point features of each hand image;
and inputting each hand image into the trained second hand feature extraction network model to obtain n second hand key point features of each hand image.
3. The identity authentication method according to claim 1, wherein the comparing the n first hand key point features of each hand image with the n preset second hand key point features of each hand image to obtain a comparison result of each hand image includes:
calculating the similarity between the first key point features and the second key point features of n hands of each hand image to obtain a similarity comparison result of each hand image;
and determining the comparison result of each hand image according to the similarity comparison result of each hand image.
4. The authentication method according to claim 1, further comprising, before the acquiring at least two hand images signed by the target user at the time of service authentication:
acquiring a first image of a target user, and performing living body detection according to the first image of the target user;
after the living body detection passes, a second image of a target user is obtained, and face features in the second image are extracted;
and acquiring a prestored reference face feature of the target user, and matching the face feature with the reference face feature to obtain a face feature matching result.
5. The identity authentication method according to claim 1, wherein the hand image is a left hand image and/or a right hand image of the target user in a preset body posture.
6. The authentication method according to claim 1, further comprising, before comparing the signature of the target user with a preset signature:
detecting a plurality of pose information of the electronic pen in a preset time by using the electronic pen signed by the target user when carrying out service identity authentication to obtain a stroke pose sequence of the signature;
comparing the stroke pose sequence with a pre-stored reference stroke pose sequence to obtain a stroke comparison result of the signature;
and after the stroke comparison result of the signature passes or the characteristic comparison results of n hand key points of each hand image pass, acquiring the complete signature of the target user.
7. The identity authentication method according to claim 6, wherein the number of pieces of pose information in the stroke pose sequence is less than or equal to the number of strokes in the target user's name.
8. An identity authentication device, characterized in that the identity authentication device comprises:
the hand image acquisition module is used for acquiring at least two hand images signed by a target user when carrying out service identity authentication;
the key point acquisition module is used for inputting each hand image into the trained hand feature extraction network model to obtain n first key point features of the hand of each hand image, wherein n is more than 2;
the image feature comparison module is used for comparing the n first hand key point features of each hand image with the n preset second hand key point features of each hand image to obtain comparison results of each hand image;
and the identity confirmation module is used for acquiring the complete signature of the target user after the n hand key point characteristic comparison results of the hand images pass, comparing the signature of the target user with a preset signature, and confirming the service identity of the target user according to the comparison result.
9. A terminal device, characterized in that it comprises a processor, a memory and a computer program stored in the memory and executable on the processor, which processor, when executing the computer program, realizes the steps of the identity authentication method according to any one of claims 1 to 7.
10. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the identity authentication method according to any one of claims 1 to 7.
CN202311604221.5A (priority date 2023-11-27, filing date 2023-11-27) · Service identity authentication method, device, equipment and medium based on gesture signature · Pending · CN117648683A (en)

Priority Applications (1)

Application Number: CN202311604221.5A · Priority Date: 2023-11-27 · Filing Date: 2023-11-27 · Title: Service identity authentication method, device, equipment and medium based on gesture signature

Applications Claiming Priority (1)

Application Number: CN202311604221.5A · Priority Date: 2023-11-27 · Filing Date: 2023-11-27 · Title: Service identity authentication method, device, equipment and medium based on gesture signature

Publications (1)

Publication Number: CN117648683A · Publication Date: 2024-03-05

Family

ID=90048899

Family Applications (1)

Application Number: CN202311604221.5A · Title: Service identity authentication method, device, equipment and medium based on gesture signature · Priority Date: 2023-11-27 · Filing Date: 2023-11-27

Country Status (1)

Country Link
CN (1) CN117648683A (en)

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination