CN112883355A - Non-contact user identity authentication method based on RFID and convolutional neural network - Google Patents

Non-contact user identity authentication method based on RFID and convolutional neural network Download PDF

Info

Publication number
CN112883355A
CN112883355A (application CN202110311961.4A)
Authority
CN
China
Prior art keywords
phase
layer
identity authentication
data
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110311961.4A
Other languages
Chinese (zh)
Other versions
CN112883355B (en)
Inventor
肖甫
戴纪馨
盛碧云
周剑
刘海猛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Posts and Telecommunications
Original Assignee
Nanjing University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Posts and Telecommunications filed Critical Nanjing University of Posts and Telecommunications
Priority to CN202110311961.4A priority Critical patent/CN112883355B/en
Publication of CN112883355A publication Critical patent/CN112883355A/en
Application granted granted Critical
Publication of CN112883355B publication Critical patent/CN112883355B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30: Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31: User authentication
    • G06F 21/32: User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G06K: GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 17/00: Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations
    • G06K 17/0022: Arrangements or provisions for transferring data to distant stations, e.g. from a sensing device
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G06N 3/045: Combinations of networks
    • G06N 3/08: Learning methods


Abstract

A non-contact user identity authentication method based on RFID and a convolutional neural network deploys an RFID tag matrix that mimics the 3 × 3 pattern-unlock grid of a mobile phone, collects the phase information of the hand during movement, and achieves user identity authentication through a convolutional neural network. The method comprises the following implementation steps: 1. collect the phase information of the actions used for identity authentication; 2. preprocess the collected phase information; 3. extract the start-point information of each action sample from the preprocessed phase information; 4. make the acquired phase information into a data set and train it with a deep-learning convolutional neural network to obtain a CNN model for authentication; 5. when the user performs identity authentication, use the trained model to recognize the user's action; 6. obtain the authentication result and judge whether authentication succeeded. The invention uses a CNN to extract features from the phase and recognize actions, thereby performing identity authentication with good accuracy, security, and robustness.

Description

Non-contact user identity authentication method based on RFID and convolutional neural network
Technical Field
The invention relates to the technical field of action recognition, in particular to a non-contact user identity authentication method based on RFID and a convolutional neural network.
Background
In recent years, wireless sensing has developed rapidly in many fields, such as activity sensing, indoor positioning, smart homes, and behavior analysis. More and more households deploy Internet of Things devices, which bring intelligence and convenience, yet their security is often neglected. Some IoT devices control building access and the indoor environment, and may even monitor users' audio and video equipment, so identity authentication is of great importance.
Since conventional contact-based recognition techniques require the user to carry dedicated devices, non-contact recognition, which does not depend on equipment carried by the recognized subject, has been a research focus in recent years. It mainly exploits the interference and reflection that user actions impose on the wireless signals between transceivers, extracts features from the received signals, and then performs matching and recognition with a designed model. Non-contact action recognition is easy to operate, easy to deploy, and applicable in many scenarios, and can be widely used in smart homes, medical care, industrial manufacturing, and other areas.
Convolutional Neural Networks (CNN) are one of the representative algorithms of deep learning: feedforward neural networks with a deep structure that include convolution computations. They have succeeded in many areas, for example image recognition, image segmentation, speech recognition, and natural language processing. Their principle is to learn features from data automatically and generalize the result to unseen data of the same type; they can therefore be applied to the field of action recognition with high efficiency and accuracy.
Disclosure of Invention
The invention mainly aims to provide a non-contact user identity authentication method based on RFID and a convolutional neural network.
A non-contact user identity authentication method based on RFID and a convolutional neural network comprises the following steps:
step 1: arranging a passive RFID tag matrix in an indoor environment in a 3 x 3 squared structure, and acquiring phase data for identity authentication;
step 2: preprocessing the phase information acquired in the step 1;
step 3: acquiring the start-point and end-point information of each action sample from the phase information preprocessed in step 2;
step 4: making the phase information data into a data set for identity authentication, and training the data with a deep-learning Convolutional Neural Network (CNN) to obtain a CNN model for identity authentication;
step 5: when the user performs identity authentication, monitoring the user's action; the tag matrix collects the phase information, and after the processing of steps 2 and 3 the data is sent to the CNN model obtained in step 4 to identify the user;
step 6: judging whether the user identity authentication succeeded according to the result of step 5.
Further, in the step 1, the arranged passive RFID tag matrix consists of 9 tags which are equidistantly arranged in three rows and three columns on a plane, and the distance between every two tags is 12.5 cm; an antenna is placed 1.2m behind the tag; and acquiring the phase information of a plurality of labels in the label matrix through the mutual communication between the RFID reader and the labels.
Further, in the step 2, the preprocessing of the acquired phase information includes the following steps:
step 2-1, phase unwrapping: the original phase signal is a periodic function in the range of 0-2π rad; when the phase value approaches 0 or 2π, a phase jump occurs. Let the phase vector of the acquired original sample signal of tag m be θ_m = [θ_{m,1}, θ_{m,2}, ..., θ_{m,n}], m = 1, 2, ..., 9, and consider two successive phase values θ_{m,i} and θ_{m,i+1}. Because of the periodicity, the actual signals should be φ_{m,i} = θ_{m,i} + 2πN_{m,i} and φ_{m,i+1} = θ_{m,i+1} + 2πN_{m,i+1}, so the phase unwrapping problem is converted into solving the integers N_{m,i}, i = 1, 2, ..., n. Since the measured phase lies in the range 0-2π rad, the increment ΔN_{m,i} = N_{m,i+1} − N_{m,i} is calculated as:

ΔN_{m,i} = 1,  if θ_{m,i+1} − θ_{m,i} < −π
ΔN_{m,i} = −1, if θ_{m,i+1} − θ_{m,i} > π
ΔN_{m,i} = 0,  otherwise

from which N_{m,i} is accumulated (taking N_{m,1} = 0):

N_{m,i} = Σ_{j=1}^{i−1} ΔN_{m,j}

Then, according to φ_{m,i} = θ_{m,i} + 2πN_{m,i}, the unwrapped phase vector φ_m = [φ_{m,1}, φ_{m,2}, ..., φ_{m,n}] is derived.
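As a minimal numpy sketch of the rule above (the ±π jump test and the accumulated N_{m,i}); the sample values are illustrative, not patent data:

```python
import numpy as np

def unwrap_phase(theta):
    """Unwrap a phase sequence measured in [0, 2*pi).

    Implements the rule from step 2-1: a jump below -pi between successive
    samples increments the period counter N, a jump above +pi decrements it,
    and the unwrapped phase is phi = theta + 2*pi*N (with N_{m,1} = 0).
    """
    theta = np.asarray(theta, dtype=float)
    diff = np.diff(theta)
    delta_n = np.where(diff < -np.pi, 1, np.where(diff > np.pi, -1, 0))
    n = np.concatenate(([0], np.cumsum(delta_n)))
    return theta + 2 * np.pi * n

# A sequence that wraps once past 2*pi: the drop from 6.0 to 0.2 is really
# a continued increase, so the unwrapped phase is monotonic.
phi = unwrap_phase([5.5, 6.0, 0.2, 0.7])
```

The same result is produced by `np.unwrap` with its default π discontinuity threshold, which is what one would use in practice.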
Step 2-2, data normalization: in the static state, a phase sequence of length t is collected for each tag and its average phase value is computed; in subsequent processing of the phase data, this average is subtracted from each tag's phase values, so that each tag's static-state phase is normalized to 0.
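The static-phase normalization of step 2-2 is a per-tag mean subtraction; a short numpy sketch with illustrative data (9 tags, t = 50 static readings each):

```python
import numpy as np

# t static readings per tag (rows = 9 tags); values are illustrative.
static = np.random.default_rng(2).normal(loc=1.3, scale=0.02, size=(9, 50))

# Average static phase value of each tag, kept as a column for broadcasting.
static_mean = static.mean(axis=1, keepdims=True)

# Later phase data for the same tags is normalized by subtracting that mean,
# so each tag reads approximately 0 while nothing is moving.
normalized_static = static - static_mean
```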
Step 2-3, filtering the phase with a Butterworth low-pass filter: during filtering, the low-frequency signals generated by motion pass through normally, while the high-frequency components where the noise resides are attenuated or blocked. The Butterworth low-pass filter is described by its squared amplitude-frequency response:

|H(ω)|² = 1 / (1 + (ω/ω_c)^{2n})

where n is the order of the filter, ω_c is the cutoff frequency, and ω_p is the passband-edge frequency; when ω = ω_p, |H(ω)|² takes its value at the edge of the passband.
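The squared magnitude response above can be evaluated directly in numpy; the cutoff ω_c and the order n below are illustrative values, not taken from the patent (for actual filtering one would typically use `scipy.signal.butter` with `filtfilt`):

```python
import numpy as np

def butterworth_mag2(omega, omega_c, n):
    """Squared amplitude-frequency response |H(w)|^2 = 1 / (1 + (w/w_c)^(2n))."""
    return 1.0 / (1.0 + (omega / omega_c) ** (2 * n))

omega = np.linspace(0.0, 10.0, 200)
h2 = butterworth_mag2(omega, omega_c=2.0, n=4)
# |H|^2 is 1 at DC, 1/2 at the cutoff, and decays monotonically with
# frequency, so motion (low frequency) passes and noise is attenuated.
```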
Further, step 3 specifically includes the following steps:
step 3-1, determining the approximate range of the start or end of an action with a KL-divergence adaptive segmentation algorithm: set a sliding window, compute the discrete probability distribution function (PDF) of the data in each window, let P and Q be the PDFs of two consecutive windows, and compute their KL divergence:

D_KL(P ‖ Q) = Σ_x P(x) ln( P(x) / Q(x) )

When the divergence is large, the two windows are in different motion states, and the action is segmented at that point;
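A sketch of the KL-divergence test between two consecutive windows, using normalized histograms as the discrete PDFs; the window contents, bin count, and smoothing constant are illustrative assumptions:

```python
import numpy as np

def window_pdf(window, bins, range_):
    """Discrete PDF of one sliding window via a normalized histogram."""
    hist, _ = np.histogram(window, bins=bins, range=range_)
    pdf = hist.astype(float) + 1e-12   # avoid zero bins inside the log
    return pdf / pdf.sum()

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_x P(x) * log(P(x) / Q(x))."""
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(0)
still  = rng.normal(0.0, 0.01, 100)   # static phase: small noise around 0
moving = rng.normal(0.5, 0.20, 100)   # hand moving: shifted, spread out
p = window_pdf(still,  bins=20, range_=(-1, 1))
q = window_pdf(moving, bins=20, range_=(-1, 1))
# The divergence between a static window and a moving window greatly
# exceeds the (zero) divergence of a window against itself.
```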
step 3-2, determining the exact time of the start or end of the action: expand the sliding window obtained in step 3-1 by 5 times, obtaining a time period 10 times that of the previous sliding window; perform a sliding-window operation over this period for each tag and determine each tag's start or end time. The sliding window uses the amplitude and frequency of the signal as features; with the window length denoted L, the amplitude value A_i and frequency F_i of the i-th window are computed from the window's phase data:

[formulas for the amplitude A_i and the frequency F_i over the L points of window i]

where φ_{i,k} denotes the k-th data point of the i-th sliding window; the measurement difference function G is calculated as:

G(i) = C_A·|A_{i+1} − A_i| + C_F·|F_{i+1} − F_i|

where C_A and C_F are two constants serving as the weights of the two terms. The start and end times of each tag are thus obtained; the time when the first tag starts to change is the action start time, and the action end time is obtained in the same way.
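A sketch of the per-tag change detector in step 3-2. The amplitude and frequency features below are simplified stand-ins (window mean and mean absolute first difference), since the patent's exact formulas are not reproduced in this text, and C_A, C_F and the detection threshold are illustrative:

```python
import numpy as np

def window_features(phase, L):
    """Split a phase sequence into windows of length L; return a crude
    amplitude feature (window mean) and frequency feature (mean absolute
    first difference) per window -- stand-ins for the patent's A_i, F_i."""
    n_win = len(phase) // L
    wins = np.asarray(phase[: n_win * L], dtype=float).reshape(n_win, L)
    amp = wins.mean(axis=1)
    freq = np.abs(np.diff(wins, axis=1)).mean(axis=1)
    return amp, freq

def measurement_difference(amp, freq, c_a=1.0, c_f=1.0):
    """G(i) = C_A * |A_{i+1} - A_i| + C_F * |F_{i+1} - F_i|."""
    return c_a * np.abs(np.diff(amp)) + c_f * np.abs(np.diff(freq))

# Flat phase, then a ramp starting at sample 50: G first exceeds the
# threshold at the window boundary where this tag's phase begins to change.
phase = np.concatenate([np.zeros(50), np.linspace(0.0, 2.0, 50)])
A, F = window_features(phase, L=10)
G = measurement_difference(A, F)
start_boundary = int(np.argmax(G > 0.1))   # boundary between windows 4 and 5
```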
Further, in step 4, based on the action start-point information, the phase samples are cut out and a data set for identity authentication is constructed. Specifically, a phase matrix is built: an m × n matrix P is generated, where m is the number of tags and n is the number of sampling points, that is:

P = [ φ_{1,1} φ_{1,2} ... φ_{1,n}
      φ_{2,1} φ_{2,2} ... φ_{2,n}
      ...
      φ_{m,1} φ_{m,2} ... φ_{m,n} ]

Then the imresize(A, m) function in MATLAB is used, where A is the original matrix and m is the normalized size, normalizing each tag matrix into a data set of fixed size.
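The patent normalizes each m × n phase matrix with MATLAB's imresize(A, m). A rough numpy stand-in using separable linear interpolation is sketched below (imresize defaults to bicubic interpolation, so this is an approximation for illustration, and the 9 × 137 input and 9 × 64 target are assumed sizes):

```python
import numpy as np

def resize_matrix(a, out_shape):
    """Resize a 2-D phase matrix to out_shape with separable linear
    interpolation -- a rough stand-in for MATLAB's imresize(A, m)."""
    a = np.asarray(a, dtype=float)
    rows = np.linspace(0, a.shape[0] - 1, out_shape[0])
    cols = np.linspace(0, a.shape[1] - 1, out_shape[1])
    # Interpolate along rows first, then along columns.
    tmp = np.empty((out_shape[0], a.shape[1]))
    for j in range(a.shape[1]):
        tmp[:, j] = np.interp(rows, np.arange(a.shape[0]), a[:, j])
    out = np.empty(out_shape)
    for i in range(out_shape[0]):
        out[i, :] = np.interp(cols, np.arange(a.shape[1]), tmp[i, :])
    return out

# 9 tags x variable-length action sample -> fixed 9 x 64 training matrix.
sample = np.random.default_rng(1).normal(size=(9, 137))
fixed = resize_matrix(sample, (9, 64))
```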
Further, in step 4, the CNN model comprises: an input layer, convolutional layers, max-pooling layers, a flattening layer, and fully connected layers.

The convolutional layers comprise a first and a second convolutional layer; the max-pooling layers comprise a first and a second max-pooling layer; the fully connected layers comprise a first and a second fully connected layer.

The CNN model is constructed as follows: the input layer is at the front end and is connected to the first convolutional layer; the first convolutional layer is connected to the first max-pooling layer; the first max-pooling layer is connected to the second convolutional layer; the second convolutional layer is connected to the second max-pooling layer; the second max-pooling layer is connected to the flattening layer; the flattening layer is connected to the first fully connected layer; and the first fully connected layer is connected to the second fully connected layer.

The input layer processes the input data; the convolutional layers use convolution kernels for feature extraction and feature mapping; the max-pooling layers effectively reduce the number of parameters and thereby the network complexity; the flattening layer compresses the data into a one-dimensional array; the fully connected layers map the learned distributed feature representation to the sample label space; and the classification layer completes the classification of the identity-authentication actions.

The processed training data set is fed into the constructed CNN model, which learns continuously from the large number of data samples in the training set until a CNN model meeting the requirements is trained.
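The layer stack described above (conv, max-pool, conv, max-pool, flatten, two fully connected layers) can be checked with a small shape walkthrough. The 28 × 28 input (phase matrix after resizing), 3 × 3 kernels, 2 × 2 pools, and 16 channels are illustrative assumptions, since the patent does not list these hyperparameters here:

```python
def conv2d_shape(h, w, k, stride=1):
    """Output H x W of a 'valid' k x k convolution."""
    return (h - k) // stride + 1, (w - k) // stride + 1

def pool2d_shape(h, w, k):
    """Output H x W of a non-overlapping k x k max pool."""
    return h // k, w // k

h, w = 28, 28                     # phase matrix assumed resized to 28 x 28
h, w = conv2d_shape(h, w, k=3)    # conv layer 1: 26 x 26
h, w = pool2d_shape(h, w, k=2)    # max-pool layer 1: 13 x 13
h, w = conv2d_shape(h, w, k=3)    # conv layer 2: 11 x 11
h, w = pool2d_shape(h, w, k=2)    # max-pool layer 2: 5 x 5
channels = 16                     # assumed channel count of conv layer 2
flat = h * w * channels           # flatten output fed to the first FC layer
```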
Further, in step 5, the user begins identity authentication and the tag matrix collects the phase data of the user's authentication action; the raw data is processed by steps 2 and 3 to obtain the data related to the authentication action; the processed data is made into a data set as in step 4 and then fed into the trained CNN model for action classification and authentication.

Further, in step 6, the trained CNN model evaluates the action and judges whether the recognized action matches the trained action; if they match, identity authentication succeeds.
Compared with the prior art, the invention has the following beneficial effects. The non-contact user identity authentication technique based on RFID and a convolutional neural network addresses the lack of security authentication in existing Internet of Things systems. Compared with traditional identity authentication, it needs no expensive equipment: only a single reader, a single antenna, and 9 commercial passive RFID tags are used, so the cost is low, deployment is simple, user privacy is not infringed, and system security is improved. Compared with traditional action segmentation methods, the invention designs and implements a novel signal segmentation method that segments actions accurately and effectively. In addition, the invention has broad prospects for extension: it can serve as a terminal for operating the home Internet of Things and merits continued research. To improve recognition accuracy, the invention adopts a convolutional neural network for action recognition. The method was implemented in an indoor room simulating a home identity-authentication scenario, and the action recognition accuracy reached 96.3%.
Drawings
Fig. 1 is a system framework of the contactless user identity authentication technology based on RFID and convolutional neural network in the embodiment of the present invention.
Fig. 2 is a tag matrix layout of the non-contact user identity authentication technology based on RFID and convolutional neural network in the embodiment of the present invention.
Fig. 3 is a real view of a layout of an RFID device based on a non-contact user identity authentication technique of RFID and a convolutional neural network according to an embodiment of the present invention.
Fig. 4 illustrates an identification action of the non-contact user identity authentication technology based on the RFID and the convolutional neural network in the embodiment of the present invention.
Fig. 5 is a CNN model structure of the non-contact user identity authentication technology based on RFID and convolutional neural network in the embodiment of the present invention.
Detailed Description
The technical scheme of the invention is further explained in detail by combining the drawings in the specification.
Referring to fig. 1, the method includes: phase preprocessing flow, action extraction flow and identity authentication flow. The passive RFID tag matrix collects phase data information of user identification actions, and the position layout of tags is shown in FIGS. 2 and 3; wherein the collected phase information data comprises: 5 motion modes for identity authentication. The motion pattern of the collected motion is shown in fig. 4. The antenna is used for sending signals to the passive RFID tags, receiving the signals from the passive RFID tags and sending the signals to the RFID reader; the RFID reader is used for modulating and demodulating signals and decoding data packets; the phase preprocessing unit is used for performing phase expansion, normalization and data smoothing on the original data; an action extracting unit for determining each action start and end time, including the process of determining the action range and determining each label start or end time; the identity authentication unit constructs and trains a suitable CNN model based on a large amount of phase data of the recognition motion, and classifies the motion to be recognized as shown in fig. 5.
The phase preprocessing process comprises the following steps:
Phase unwrapping: the raw phase signal is a periodic function in the range of 0-2π rad. When the phase value approaches 0 or 2π, a phase jump may occur. Let the phase vector of the acquired raw sample signal of tag m be θ_m = [θ_{m,1}, θ_{m,2}, ..., θ_{m,n}]. Assume two successive phase values θ_{m,i} and θ_{m,i+1}; because of the periodicity, their actual signals should be φ_{m,i} = θ_{m,i} + 2πN_{m,i} and φ_{m,i+1} = θ_{m,i+1} + 2πN_{m,i+1}. The phase unwrapping problem translates into solving N_{m,i}, i = 1, 2, ..., n. Since the phase ranges from 0 to 2π rad, ΔN_{m,i} = N_{m,i+1} − N_{m,i} can be calculated as:

ΔN_{m,i} = 1,  if θ_{m,i+1} − θ_{m,i} < −π
ΔN_{m,i} = −1, if θ_{m,i+1} − θ_{m,i} > π
ΔN_{m,i} = 0,  otherwise

Then N_{m,i} can be accumulated (taking N_{m,1} = 0):

N_{m,i} = Σ_{j=1}^{i−1} ΔN_{m,j}

Then, according to φ_{m,i} = θ_{m,i} + 2πN_{m,i}, the phase vector after unwrapping, φ_m = [φ_{m,1}, φ_{m,2}, ..., φ_{m,n}], can be derived.
Data normalization processing: acquire t readings of each tag in the static state and compute the average static phase value of each tag:

θ̄_m = (1/t) Σ_{k=1}^{t} θ_{m,k}

When the phase data is processed subsequently, this average is subtracted from each tag's phase values, so that each tag's static-state phase is normalized to 0.
The phase is then filtered with a Butterworth low-pass filter. The principle is that the phase signal generated by human motion lies at low frequencies: when the Butterworth filter filters the signal, the low-frequency components generated by motion pass through normally, while the high-frequency components where the noise resides are attenuated or blocked. The Butterworth low-pass filter can be expressed by its squared amplitude-frequency response:

|H(ω)|² = 1 / (1 + (ω/ω_c)^{2n})

where n is the order of the filter, ω_c is the cutoff frequency, and ω_p is the passband-edge frequency; when ω = ω_p, |H(ω)|² takes its value at the edge of the passband.
The above-mentioned action extraction flow includes:
Determine the approximate range of the action start or end: using the KL-divergence adaptive segmentation algorithm, set a sliding window and compute the discrete probability distribution function (PDF) of the data in each window; let P and Q be the PDFs of two consecutive windows and compute their KL divergence:

D_KL(P ‖ Q) = Σ_x P(x) ln( P(x) / Q(x) )

When the divergence is large, the two windows are in different motion states, and the action is segmented there. Determine the exact time of the action start or end: expand the sliding window obtained in the previous step by 5 times, obtaining a time period 10 times that of the previous window. Perform a sliding-window operation over this period for each tag and determine each tag's start or end time. The sliding window uses the amplitude and frequency of the signal as features.
With the window length denoted L, the amplitude value A_i and frequency F_i of the i-th window can be computed from the window's phase data:

[formulas for the amplitude A_i and the frequency F_i over the L points of window i]

where φ_{i,k} represents the k-th data point of the i-th sliding window. The measurement difference function G can be calculated as:

G(i) = C_A·|A_{i+1} − A_i| + C_F·|F_{i+1} − F_i|

where C_A and C_F are two constants serving as the weights of the two terms. Thus the start and end times of each tag are obtained, and the time when the first tag starts to change is the action start time; the action end time is obtained in the same way.
The identity authentication process comprises the following steps:
the phase information data is made into an action data set for identity authentication. The construction method comprises the following steps: constructing a phase matrix, and generating an m × n matrix P, where m is the number of tags and n is the number of sampling points, that is:
Figure BDA0002989764390000111
and then, utilizing an imresize (A, m) function in matlab, wherein A is an original matrix, and m is the normalized size. The label matrix is normalized to a fixed size data set.
The CNN model comprises: an input layer, convolutional layers, max-pooling layers, a flattening layer, and fully connected layers.

The convolutional layers comprise a first and a second convolutional layer; the max-pooling layers comprise a first and a second max-pooling layer; the fully connected layers comprise a first and a second fully connected layer.

The structure of the CNN model is shown in FIG. 5: the input layer is at the front end and is connected to the first convolutional layer; the first convolutional layer is connected to the first max-pooling layer; the first max-pooling layer is connected to the second convolutional layer; the second convolutional layer is connected to the second max-pooling layer; the second max-pooling layer is connected to the flattening layer; the flattening layer is connected to the first fully connected layer; and the first fully connected layer is connected to the second fully connected layer.

The input layer processes the input data; the convolutional layers use convolution kernels for feature extraction and feature mapping; the max-pooling layers effectively reduce the number of parameters and thereby the network complexity; the flattening layer compresses the data into a one-dimensional array; the fully connected layers map the learned "distributed feature representation" to the sample label space; and the classification layer completes the classification of the identity-authentication actions.
And feeding the processed training data set into the constructed CNN model, wherein the CNN model is continuously learned based on a large number of data samples in the training data set, and finally training a CNN model meeting requirements.
The user starts to carry out user identity authentication, and the label matrix collects phase data of user authentication actions; the original data is subjected to phase preprocessing and action extraction to obtain data information related to the authentication action; and making the processed data into a data set, and sending the data set to the trained CNN model to perform classification and authentication on the actions. And evaluating the action by using the trained CNN model, and judging whether the recognized action is matched with the trained action. If the identity authentication is matched, the identity authentication is successful.
The above description is only a preferred embodiment of the present invention, and the scope of the present invention is not limited to the above embodiment, but equivalent modifications or changes made by those skilled in the art according to the present disclosure should be included in the scope of the present invention as set forth in the appended claims.

Claims (8)

1. A non-contact user identity authentication method based on RFID and a convolutional neural network is characterized in that: the method comprises the following steps:
step 1: arranging a passive RFID tag matrix in an indoor environment in a 3 x 3 squared structure, and acquiring phase data for identity authentication;
step 2: preprocessing the phase information acquired in the step 1;
step 3: acquiring the start-point and end-point information of each action sample from the phase information preprocessed in step 2;
step 4: making the phase information data into a data set for identity authentication, and training the data with a deep-learning Convolutional Neural Network (CNN) to obtain a CNN model for identity authentication;
step 5: when the user performs identity authentication, monitoring the user's action; the tag matrix collects the phase information, and after the processing of steps 2 and 3 the data is sent to the CNN model obtained in step 4 to identify the user;
step 6: judging whether the user identity authentication succeeded according to the result of step 5.
2. The non-contact user identity authentication method based on the RFID and the convolutional neural network as claimed in claim 1, wherein: in the step 1, the arranged passive RFID label matrix consists of 9 labels which are equidistantly arranged in three rows and three columns on a plane, and the distance between every two labels is 12.5 cm; an antenna is placed 1.2m behind the tag; and acquiring the phase information of a plurality of labels in the label matrix through the mutual communication between the RFID reader and the labels.
3. The non-contact user identity authentication method based on the RFID and the convolutional neural network as claimed in claim 1, wherein: in the step 2, the collected phase information is preprocessed, and the method specifically comprises the following steps:
step 2-1, phase unwrapping: the original phase signal is a periodic function in the range of 0-2π rad; when the phase value approaches 0 or 2π, a phase jump occurs; let the phase vector of the acquired original sample signal of tag m be θ_m = [θ_{m,1}, θ_{m,2}, ..., θ_{m,n}], and assume two successive phase values θ_{m,i} and θ_{m,i+1}; because of the periodicity, the actual signals should be φ_{m,i} = θ_{m,i} + 2πN_{m,i} and φ_{m,i+1} = θ_{m,i+1} + 2πN_{m,i+1}; the phase unwrapping problem is converted into solving N_{m,i}, i = 1, 2, ..., n; since the phase lies in the range 0-2π rad, ΔN_{m,i} = N_{m,i+1} − N_{m,i} is calculated as:

ΔN_{m,i} = 1,  if θ_{m,i+1} − θ_{m,i} < −π
ΔN_{m,i} = −1, if θ_{m,i+1} − θ_{m,i} > π
ΔN_{m,i} = 0,  otherwise

N_{m,i} is then accumulated (taking N_{m,1} = 0):

N_{m,i} = Σ_{j=1}^{i−1} ΔN_{m,j}

and according to φ_{m,i} = θ_{m,i} + 2πN_{m,i}, the phase vector after unwrapping, φ_m = [φ_{m,1}, φ_{m,2}, ..., φ_{m,n}], is derived.
Step 2-2, data normalization: in the static state, a phase sequence of length t is collected for each tag and its average phase value is computed; in subsequent processing of the phase data, this average is subtracted from each tag's phase values, so that each tag's static-state phase is normalized to 0.
Step 2-3, filtering the phase with a Butterworth low-pass filter: during filtering, the low-frequency signals generated by motion pass through normally, while the high-frequency components where the noise resides are attenuated or blocked; the Butterworth low-pass filter is described by its squared amplitude-frequency response:

|H(ω)|² = 1 / (1 + (ω/ω_c)^{2n})

where n is the order of the filter, ω_c is the cutoff frequency, and ω_p is the passband-edge frequency; when ω = ω_p, |H(ω)|² takes its value at the edge of the passband.
4. The non-contact user identity authentication method based on the RFID and the convolutional neural network as claimed in claim 1, wherein: the step 3 specifically comprises the following steps:
step 3-1, determining the approximate range of the start or end of an action with a KL-divergence adaptive segmentation algorithm: set a sliding window, compute the discrete probability distribution function (PDF) of the data in each window, let P and Q be the PDFs of two consecutive windows, and compute their KL divergence:

D_KL(P ‖ Q) = Σ_x P(x) ln( P(x) / Q(x) )

When the divergence is large, the two windows are in different motion states, and the action is segmented at that point;
step 3-2, determining the precise time of the start or end of the action: expand the sliding window obtained in step 3-1 by 5 times in each direction, obtaining a time period 10 times that of the previous sliding window; perform a sliding-window operation on each tag within this period to determine the start or end time of each tag; the sliding window uses the amplitude and frequency of the signal as features; with L denoting the length of the sliding window, the amplitude value and frequency of the i-th window are expressed as:
A_i = (1/L) Σ_{k=1}^{L} φ_{i,k}

and

F_i = (1/L) Σ_{k=1}^{L−1} |φ_{i,k+1} − φ_{i,k}|
where φ_{i,k} denotes the k-th data point of the i-th sliding window; the measurement difference function G is calculated as:

G(i) = C_A |A_{i+1} − A_i| + C_F |F_{i+1} − F_i|

where C_A and C_F are two constants serving as the weights of the two terms; the start time and end time of each tag are thereby obtained, and the moment at which the first tag starts to change is taken as the action start time; the action end time is obtained in the same way.
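A sketch of the per-tag refinement using window amplitude/frequency features and the difference function G (the exact feature definitions and the weights C_A, C_F here are illustrative assumptions):

```python
import numpy as np

def window_features(phi, L):
    """Mean amplitude A_i and mean absolute change F_i for windows of length L."""
    n = len(phi) // L
    wins = phi[:n * L].reshape(n, L)
    A = wins.mean(axis=1)                           # amplitude feature A_i
    F = np.abs(np.diff(wins, axis=1)).mean(axis=1)  # frequency-like feature F_i
    return A, F

def change_scores(phi, L, c_a=1.0, c_f=1.0):
    """G(i) = C_A |A_{i+1}-A_i| + C_F |F_{i+1}-F_i| for consecutive windows."""
    A, F = window_features(phi, L)
    return c_a * np.abs(np.diff(A)) + c_f * np.abs(np.diff(F))

# Flat phase for one tag, then an oscillating segment starting at sample 200
t = np.arange(400)
phi = np.where(t < 200, 0.0, np.sin(0.3 * t))

G = change_scores(phi, L=20)
# The largest G straddles the boundary window, locating this tag's start time
assert abs(int(np.argmax(G)) - (200 // 20 - 1)) <= 1
```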
5. The non-contact user identity authentication method based on the RFID and the convolutional neural network as claimed in claim 1, wherein: in step 4, phase samples are cut and extracted based on the action start-point information, and a data set related to identity authentication is constructed; specifically, a phase matrix is constructed by generating an m × n matrix P, where m is the number of tags and n is the number of sampling points, that is:

P = [ φ_{1,1}  φ_{1,2}  ...  φ_{1,n}
      φ_{2,1}  φ_{2,2}  ...  φ_{2,n}
      ...
      φ_{m,1}  φ_{m,2}  ...  φ_{m,n} ]
Then, the imresize(A, m) function in MATLAB is used, where A is the original matrix and m is the normalized size; the tag matrix is thereby normalized into a data set of fixed size.
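In Python, `scipy.ndimage.zoom` plays a role analogous to MATLAB's imresize for normalizing the phase matrix to a fixed size (the matrix dimensions and target size below are illustrative):

```python
import numpy as np
from scipy.ndimage import zoom

m, n = 6, 137                 # 6 tags, 137 sampling points (varies per action)
P = np.random.default_rng(2).normal(size=(m, n))   # phase matrix

target = (6, 100)             # fixed size expected by the CNN input layer
scale = (target[0] / P.shape[0], target[1] / P.shape[1])
P_fixed = zoom(P, scale, order=1)   # bilinear interpolation, like imresize

assert P_fixed.shape == target
```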
6. The non-contact user identity authentication method based on the RFID and the convolutional neural network as claimed in claim 1, wherein: in step 4, the CNN model comprises an input layer, convolutional layers, max-pooling layers, a flattening layer and fully connected layers;

the convolutional layers comprise a first convolutional layer and a second convolutional layer; the max-pooling layers comprise a first max-pooling layer and a second max-pooling layer; the fully connected layers comprise a first fully connected layer and a second fully connected layer;

the CNN model is constructed with the input layer at the front; the input layer is connected to the first convolutional layer, and the first convolutional layer is connected to the first max-pooling layer; the first max-pooling layer is connected to the second convolutional layer; the second convolutional layer is connected to the second max-pooling layer; the second max-pooling layer is connected to the flattening layer; the flattening layer is connected to the first fully connected layer; the first fully connected layer is connected to the second fully connected layer;

the input layer processes the input data; the convolutional layers use convolution kernels for feature extraction and feature mapping; the max-pooling layers effectively reduce the number of parameters, lowering the network complexity; the flattening layer compresses the data into a one-dimensional array; the fully connected layers map the learned distributed feature representation to the sample label space; the classification layer completes the classification of identity-authentication and recognition actions;

the processed training data set is fed into the constructed CNN model; the CNN model learns continuously from the large number of data samples in the training data set, finally producing a trained CNN model that meets the requirements.
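The layer stack above can be sanity-checked by propagating the input shape through each stage; the kernel sizes, strides, filter counts, and input dimensions below are illustrative assumptions (the claim does not fix them):

```python
def conv2d_out(h, w, k):
    """Output size of a k x k convolution, stride 1, no padding."""
    return h - k + 1, w - k + 1

def pool2d_out(h, w, k):
    """Output size of a k x k max-pooling layer with stride k."""
    return h // k, w // k

# Input layer: an m x n phase matrix (tags x normalized sampling points)
h, w, channels = 12, 100, 1

h, w = conv2d_out(h, w, k=3); channels = 16   # convolutional layer 1
h, w = pool2d_out(h, w, k=2)                  # max-pooling layer 1
h, w = conv2d_out(h, w, k=3); channels = 32   # convolutional layer 2
h, w = pool2d_out(h, w, k=2)                  # max-pooling layer 2

flat = h * w * channels                       # flattening layer: 1-D array
fc1_units, fc2_units = 64, 8                  # fully connected 1 and 2 (classes)

assert flat == 736                            # 1 x 23 x 32 after the stack
```

This kind of check catches input sizes that collapse to zero before the flattening layer, which is the usual failure mode when the normalized matrix size is chosen too small.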
7. The non-contact user identity authentication method based on the RFID and the convolutional neural network as claimed in claim 1, wherein: in step 5, the user begins user identity authentication, and the tag array collects the phase data of the user's authentication action; the raw data are processed through steps 2 and 3 to obtain the data information related to the authentication action; the processed data are made into a data set through step 4 and then fed into the trained CNN model for classification and authentication of the action.
8. The non-contact user identity authentication method based on the RFID and the convolutional neural network as claimed in claim 1, wherein: the action is evaluated using the CNN model trained in step 6, judging whether the recognized action matches the trained action; if it matches, the identity authentication succeeds.
CN202110311961.4A 2021-03-24 2021-03-24 Non-contact user identity authentication method based on RFID and convolutional neural network Active CN112883355B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110311961.4A CN112883355B (en) 2021-03-24 2021-03-24 Non-contact user identity authentication method based on RFID and convolutional neural network

Publications (2)

Publication Number Publication Date
CN112883355A true CN112883355A (en) 2021-06-01
CN112883355B CN112883355B (en) 2023-05-02

Family

ID=76042053

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110311961.4A Active CN112883355B (en) 2021-03-24 2021-03-24 Non-contact user identity authentication method based on RFID and convolutional neural network

Country Status (1)

Country Link
CN (1) CN112883355B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108664909A (en) * 2018-04-28 2018-10-16 上海爱优威软件开发有限公司 A kind of auth method and terminal
CN108776788A (en) * 2018-06-05 2018-11-09 电子科技大学 A kind of recognition methods based on brain wave
CN109299697A (en) * 2018-09-30 2019-02-01 泰山学院 Deep neural network system and method based on underwater sound communication Modulation Mode Recognition
CN110516740A (en) * 2019-08-28 2019-11-29 电子科技大学 A kind of fault recognizing method based on Unet++ convolutional neural networks
CN110598734A (en) * 2019-08-05 2019-12-20 西北工业大学 Driver identity authentication method based on convolutional neural network and support vector field description

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113993134A (en) * 2021-12-27 2022-01-28 广州优刻谷科技有限公司 IoT (Internet of things) equipment secure access method and system based on RFID (radio frequency identification) signals
CN113993134B (en) * 2021-12-27 2022-03-22 广州优刻谷科技有限公司 IoT (Internet of things) equipment secure access method and system based on RFID (radio frequency identification) signals

Also Published As

Publication number Publication date
CN112883355B (en) 2023-05-02

Similar Documents

Publication Publication Date Title
Feng et al. Wi-multi: A three-phase system for multiple human activity recognition with commercial wifi devices
CN106658590B (en) Design and implementation of multi-person indoor environment state monitoring system based on WiFi channel state information
CN107968689B (en) Perception identification method and device based on wireless communication signals
US10061389B2 (en) Gesture recognition system and gesture recognition method
CN110337066A (en) Based on channel state information indoor occupant activity recognition method, man-machine interactive system
CN109325399B (en) Stranger gesture recognition method and system based on channel state information
CN110287863A (en) A kind of gesture identification method based on WiFi signal
Arshad et al. Leveraging transfer learning in multiple human activity recognition using WiFi signal
CN104346503A (en) Human face image based emotional health monitoring method and mobile phone
CN113609976A (en) Direction-sensitive multi-gesture recognition system and method based on WiFi (Wireless Fidelity) equipment
CN112861813B (en) Method for identifying human behavior behind wall based on complex value convolution neural network
CN111082879B (en) Wifi perception method based on deep space-time model
Ding et al. Device-free location-independent human activity recognition using transfer learning based on CNN
CN113935373A (en) Human body action recognition method based on phase information and signal intensity
CN111142668B (en) Interaction method based on Wi-Fi fingerprint positioning and activity gesture joint recognition
CN110929242B (en) Method and system for carrying out attitude-independent continuous user authentication based on wireless signals
CN112883355B (en) Non-contact user identity authentication method based on RFID and convolutional neural network
CN114423034B (en) Indoor personnel action recognition method, system, medium, equipment and terminal
Karayaneva et al. Unsupervised Doppler radar based activity recognition for e-healthcare
CN110621038B (en) Method and device for realizing multi-user identity recognition based on WiFi signal detection gait
Liu et al. Human presence detection via deep learning of passive radio frequency data
WO2024041053A1 (en) Indoor passive human behavior recognition method and apparatus
CN116244673A (en) Behavior and identity recognition method based on wireless signals
Pandey et al. Csi-based joint location and activity monitoring for covid-19 quarantine environments
CN114676739B (en) Method for detecting and identifying time sequence action of wireless signal based on fast-RCNN

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant