CN111611859A - Gait recognition method based on GRU - Google Patents

Gait recognition method based on GRU

Info

Publication number
CN111611859A
CN111611859A
Authority
CN
China
Prior art keywords
gru
gait
data
training
layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010315195.4A
Other languages
Chinese (zh)
Other versions
CN111611859B (en
Inventor
耿艳利
蔡晓东
杨鹏
宣伯凯
陈玲玲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hebei University of Technology
Original Assignee
Hebei University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hebei University of Technology filed Critical Hebei University of Technology
Priority to CN202010315195.4A priority Critical patent/CN111611859B/en
Publication of CN111611859A publication Critical patent/CN111611859A/en
Application granted granted Critical
Publication of CN111611859B publication Critical patent/CN111611859B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/23Recognition of whole body movements, e.g. for sport training
    • G06V40/25Recognition of walking or running movements, e.g. gait recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention discloses a GRU-based gait recognition method in the technical field of artificial limbs. It addresses the complex computation and poor real-time performance of conventional gait recognition methods: the GRU eliminates laborious feature-extraction engineering, classification requires only the trained model parameters, computation speed is greatly improved, the gait phase is computed in real time, and the cumbersome pipeline that conventional gait recognition needs for online classification is dispensed with. The method comprises: collecting plantar pressure during walking with an FSR film pressure sensor worn on the sole of the prosthesis; attaching labels to the data queues according to the subject's typical walking characteristics and the timestamps; building a GRU network model; defining the GRU units, the fully connected layer, and the activation functions; and dividing the resulting data-label pairs into a training set and a test set, training the GRU network model on the training set, and, after training, evaluating the model's classification performance on the test set before performing online real-time classification.

Description

Gait recognition method based on GRU
Technical Field
The invention belongs to the technical field of artificial limbs and relates in particular to a GRU-based gait recognition method — a gait recognition system for a prosthetic lower limb that recognizes the gait phase of the prosthesis wearer and improves the accuracy and real-time performance of gait recognition.
Background
One complete walking cycle is referred to as the "gait cycle". A gait cycle is divided into two phases — the support (stance) phase and the swing phase — each of which may be further divided into several sub-phases. Gait recognition not only provides an important basis for analysis by rehabilitation physicians, but also supplies control signals that allow an intelligent prosthesis to adjust its control strategy and parameters, making the patient's movement more stable, smooth, and natural.
Most existing gait recognition methods suffer from complex computation, poor real-time performance, and low reliability. For example, Chinese patent application No. 201811241695.7 identifies gait phases by collecting electromyographic (EMG) signals and extracting features from them; however, the detected signal is affected by skin-surface temperature, sweat, and similar factors, so stability and accuracy are low; complex feature-extraction steps such as wavelet decomposition and computation of the Willison amplitude are required, making the process cumbersome and poorly suited to real time; and the electrodes must be in direct contact with the skin, reducing comfort. Chinese patent application No. 201910976122.7 uses IMU modules to collect the rotation angles of the left thigh, right thigh, and shank and applies a rule-based classification algorithm to identify walking gait in real time; however, gait-phase transitions are short, the signal changes are complex, and they differ greatly across road conditions and subjects, so while the rule-based algorithm is computationally simple, its classification accuracy is questionable and its reliability is low.
Disclosure of Invention
The invention aims to provide a GRU-based gait recognition method with high real-time performance, low computational complexity, and high accuracy.
In order to solve the above technical problems, the invention adopts the following technical scheme: a GRU-based gait recognition method classifies five walking phases (swing, heel strike, full-foot contact, forefoot contact, and toe-off) according to the control signals required by a powered lower-limb prosthesis. Based on deep recurrent neural network techniques, it classifies the acquired raw signals directly after preprocessing, and comprises the following steps:
step 1: collect plantar pressure during walking with an eight-cell, high-dynamic-range FSR film pressure sensor insole worn on the sole of the prosthesis, timestamp the collected data, and transmit it via a wireless communication module to a Raspberry Pi or a server;
step 2: analyze and process the data offline according to the subject's typical walking cadence and the timestamps, combined with the plantar pressure signals, and attach to each data queue the corresponding label among the five gait phases, forming a set of data-label pairs;
step 3: build a gated recurrent unit (GRU) network model with a four-layer structure: two GRU layers, a fully connected layer, and a softmax output layer. The input data feed the first GRU layer; the output of the first GRU layer feeds the second GRU layer; the output of the second GRU layer connects to a fully connected hidden layer, which connects to the softmax output layer. Define the GRU units, the fully connected layer, and each activation function;
step 4: divide the set of data-label pairs obtained in step 2 into a training set and a test set; feed the training set to the GRU network model built in step 3 for training, and use the test set to evaluate the model's classification performance after training;
step 5: use the best model obtained from the training and evaluation of step 4 for online real-time classification.
In step 1 of the invention, the FSR film pressure sensor is placed under the prosthetic foot, fitted against the sole.
In step 2 of the invention, the labels correspond to the five typical gait phases: swing, heel strike, full-foot contact, forefoot contact, and toe-off; preferably the labels are one-hot encoded.
Preferably, in step 2 of the invention, label calibration proceeds in three steps: (1) preliminary division — the prosthesis wearer walks a number of steps normally; the duration and cadence of each gait phase are tallied; individual steps are delimited using the start time, end time, and cadence from the timestamps; and the data queue of each gait phase is preliminarily divided according to each phase's duration ratio and the timestamps; (2) analysis of each phase's typical features — the prosthesis wearer holds the posture of each gait phase for several seconds while the data characteristics of that phase are analyzed; (3) final division — combining the preliminarily divided gait phases with each phase's typical data characteristics: the five gait phases are first divided by duration and cadence; then the first 25% and last 25% of each phase's data queue are discarded and the middle 50% is kept; the data matching each phase's typical characteristics form the training and test sets.
In step 3 of the invention, the input data are eight-dimensional feature vectors formed from the eight-channel plantar pressure signals after A/D conversion; each dimension is fed to one GRU node, so the whole sequence is 8 nodes long.
The invention uses an STM32 for data acquisition, with a Raspberry Pi and a server cooperating to run the recognition algorithm: during offline training, data are collected in step 1 and model training is performed on the server in steps 2 to 4; during online deployment, data are collected in step 1, and in step 5 the best model obtained from the training and evaluation of step 4 is loaded on the Raspberry Pi for real-time classification.
The number of output-layer nodes is defined as the number of gait classes in the current model, i.e., the five gait phases.
Preferably, the weight optimization method of the GRU network model is an Adam optimization algorithm.
Preferably, the activation function of the fully connected hidden layer is a ReLU, Sigmoid, or tanh function, and the output-layer activation function is the Softmax function.
Compared with the prior art, the invention has the beneficial effects that:
A deep recurrent neural network is constructed; the GRU units automatically extract the temporal and spatial features of the whole-sole plantar pressure data in each gait phase, directly classifying the five gait phases. The raw data need only simple A/D conversion before being fed to the network, eliminating laborious feature-extraction engineering; in the preferred embodiment the multi-class accuracy reaches 96.32%, improving both classification accuracy and decision efficiency.
Addressing the complex computation and poor real-time performance of conventional gait classification methods, the invention trains the model on the server; the Raspberry Pi then only loads the model parameters and performs a series of forward computations. The model occupies less than 2 MB of storage, and only 0.0028 s elapses from loading the model to producing a classification result — greatly improving classification efficiency, enabling real-time gait-phase computation, and dispensing with the cumbersome online classification pipeline of conventional gait recognition. The method thus has high practical value.
Drawings
The advantages and implementation of the invention will be more apparent from the following detailed description, given by way of example with reference to the accompanying drawings, which are for illustration only and are not to be construed as limiting the invention, and in which:
FIG. 1 is a flow chart of the GRU-based gait recognition method of the invention;
FIG. 2 is the force-sensitive resistor layout of the FSR film pressure sensor;
FIG. 3 is a schematic diagram of the plantar pressure acquisition device of the invention;
FIG. 4 is an example of the raw signal (100 samples shown) during normal gait in an embodiment of the invention;
FIG. 5 is a schematic block diagram of a GRU;
FIG. 6 is a structural diagram of the GRU network model constructed by the invention;
FIG. 7 is a schematic diagram of the recognition process of one sample in the network model;
FIG. 8 is a gait recognition effect diagram of the invention;
FIG. 9 is the confusion matrix on the test set in an embodiment.
Detailed Description
The invention will be further described with reference to the following examples and figures:
The invention provides a GRU-based gait recognition method; the flow chart is shown in FIG. 1, and the method comprises the following steps:
the first step is that the eight-unit high dynamic FSR film pressure sensor insole is placed on the sole of a right foot of a prosthetic limb and is attached to the whole sole of the prosthetic limb, HALLLUX in the figure 2 is enabled to coincide with the thumb of the prosthetic limb, TOES is enabled to coincide with the thumb of the prosthetic limb, a voltage division module, a Bluetooth host machine and an STM32F103RCT6 single chip microcomputer (STM 32 for short) are connected, the pressure division module and the Bluetooth host machine are fixed on the front side of the right lower leg through binding bands, and a raspberry group and a Bluetooth slave machine are bound between the waist.
The FSR film pressure sensors sense pressure changes at eight positions under the prosthetic foot and convert them into resistance changes. One end of each sensor is grounded and the other is connected to the STM32's onboard 3.3 V supply; the divided voltages are routed from the voltage-divider module to the lower eight channels of ADC1 on the STM32. The STM32 firmware starts A/D conversion with the ADC input clock set to 14 MHz and the sampling period to 239.5 clock cycles, i.e., a conversion time of 239.5 + 12.5 = 252 cycles ≈ 18 µs. The Bluetooth devices are HC05 modules: the master is connected to serial port 1 of the STM32 and the slave to the Raspberry Pi; the baud rate is set to 38400 with no parity bit, and a timestamp is appended to the tail of each transmitted data packet. The hardware schematic of the acquisition device is shown in FIG. 3.
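The ADC timing above can be checked with a quick calculation (a sketch; the 14 MHz input clock, 239.5-cycle sampling period, and 12.5-cycle conversion stage are the figures stated in the text):

```python
# STM32F1 ADC conversion time: sampling period plus fixed conversion cycles,
# divided by the ADC input clock frequency.
ADC_CLOCK_HZ = 14_000_000     # 14 MHz input clock, as configured in the text
SAMPLING_CYCLES = 239.5       # configured sampling period
CONVERSION_CYCLES = 12.5      # fixed successive-approximation stage on STM32F1

total_cycles = SAMPLING_CYCLES + CONVERSION_CYCLES      # 252 cycles
conversion_time_us = total_cycles / ADC_CLOCK_HZ * 1e6  # microseconds

print(f"{total_cycles:.0f} cycles -> {conversion_time_us:.0f} us")  # 252 cycles -> 18 us
```

This confirms the 18 µs per-channel conversion time quoted above.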
During offline training, after the Bluetooth slave receives data, it forwards them over the USB interface to the server for data analysis and for model training and evaluation; during online deployment, after the Bluetooth slave receives data, it forwards them over the USB interface to the pre-trained model on the Raspberry Pi for classification. In this embodiment, motion signals of the prosthesis wearer walking normally are collected for no fewer than 100 steps to obtain normal-gait plantar pressure information; an example of the raw signal during a segment of normal gait (100 samples shown) is given in FIG. 4.
Step 2: segment the raw data according to the subject's typical walking cadence and attach to each data queue the label of its gait class. Label calibration in this embodiment proceeds in three steps. For the five-class task the labels are numbered 0 to 4: swing phase 0, heel-strike phase 1, full-foot contact phase 2, forefoot-contact phase 3, and toe-off phase 4.
1. Preliminary division
In this embodiment the prosthesis wearer walks 100 steps normally; the duration and cadence of each gait phase are tallied; each step is delimited by the start time, end time, and cadence from the timestamps; and the data queue of each gait phase is preliminarily divided by duration ratio and timestamp. Statistics for this embodiment show that the swing phase accounts for 40% of the gait cycle, the heel-strike phase 1.19%, the full-foot contact phase 30.44%, the forefoot-contact phase 3.99%, and the toe-off phase 24.38%.
2. Analysis of typical characteristics of each stage
The prosthesis wearer then holds the posture of each gait phase for 10 s, and the data characteristics of each phase are analyzed:
(1) a swing period: the eight force-sensitive resistors are almost free from pressure, and the acquired voltage values are large; (2) heel strike period: the pressure on the force-sensitive resistors at the HEELs of the HEEL R and the HEEL L is the largest, and the voltage value is obviously smaller; (3) full-foot landing period: the voltage values corresponding to the force sensitive resistors at the heel, the left side and the middle side of the sole of the foot are obviously smaller; (4) forefoot strike period: the voltage value corresponding to the heel is obviously increased, and the voltage corresponding to the force sensitive resistors at the front sole and the toes is obviously reduced. (5) Toe-off period: the corresponding voltage of the two force-sensitive resistors at the middle part of the sole of the foot and the left side of the toe is obviously smaller than that of the other resistors.
TABLE 1 typical Voltage values at various gait phases
(The table of typical voltage values appears as an image in the original publication.)
3. Final partitioning
The final labeling combines the preliminarily divided gait phases with each phase's typical data characteristics: the five gait phases are first divided by duration and cadence; then the first 25% and last 25% of each phase's data queue are discarded (removing uncertain data at the boundaries between phases) and the middle 50% is kept; the data matching each phase's typical characteristics form the training and test sets.
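The trimming rule in the final division — discard the first and last 25% of each phase's queue and keep the middle 50% — can be sketched as follows (the function name is illustrative, not from the patent):

```python
def trim_phase_queue(queue):
    """Keep the middle 50% of a gait-phase data queue, discarding the
    first and last quarters where phase-transition data is uncertain."""
    n = len(queue)
    start, end = n // 4, n - n // 4
    return queue[start:end]

# Example: a 20-sample phase queue keeps samples 5..14 (10 samples).
samples = list(range(20))
kept = trim_phase_queue(samples)
print(len(kept))  # 10
```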
Step 3: build the gated recurrent unit (GRU) network model. The model has a four-layer structure — two GRU layers, a fully connected layer, and a softmax output layer — as shown in FIG. 6. The input data feed the first GRU layer; the output of the first GRU layer feeds the second GRU layer; the output of the second GRU layer connects to a fully connected hidden layer, which connects to the softmax output layer.
Referring to fig. 5, the GRU neural network unit specifically includes:
the GRU neural network is a modification of the RNN, a neural network for processing sequence data that is capable of capturing and recording dependencies between data within a sequence. The gate control circulation unit GRU introduces two gate signals, namely an update gate and a reset gate; since RNNs suffer from problems of gradient disappearance or gradient explosion, long-term dependencies within the sequence cannot be captured. The GRU becomes one of the solutions to this problem by updating and resetting the gate.
The hidden state h_t is calculated as follows:

h̃_t = tanh(W_h[r_t ⊙ h_{t-1}, x_t] + b_h) (1)

h_t = (1 − z_t) ⊙ h_{t-1} + z_t ⊙ h̃_t (2)
wherein z istTo refresh the door, rtTo reset the gate:
z_t = σ(W_z[h_{t-1}, x_t] + b_z) (3)

r_t = σ(W_r[h_{t-1}, x_t] + b_r) (4)

where W is a weight matrix, x_t is the input at time t, b is a bias term, and σ is the sigmoid activation function. The gates are computed with the same formula but different parameter matrices, and all gates have the same size as the hidden state.
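Equations (1)–(4) can be sketched as a single GRU step in NumPy (weight shapes and initialization are illustrative; the concatenation [h, x] follows the formulas above):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x_t, h_prev, Wz, bz, Wr, br, Wh, bh):
    """One GRU update following eqs. (1)-(4): update gate z_t, reset gate
    r_t, candidate state h~_t, and the interpolated new hidden state h_t."""
    z = sigmoid(Wz @ np.concatenate([h_prev, x_t]) + bz)           # eq. (3)
    r = sigmoid(Wr @ np.concatenate([h_prev, x_t]) + br)           # eq. (4)
    h_cand = np.tanh(Wh @ np.concatenate([r * h_prev, x_t]) + bh)  # eq. (1)
    return (1.0 - z) * h_prev + z * h_cand                         # eq. (2)

# Toy dimensions matching the text: a 1-D input per sequence node,
# 125-D hidden state.
rng = np.random.default_rng(0)
dim_x, dim_h = 1, 125
Wz, Wr, Wh = (rng.standard_normal((dim_h, dim_h + dim_x)) * 0.1 for _ in range(3))
bz = np.zeros(dim_h)
br = np.zeros(dim_h)
bh = np.zeros(dim_h)

h = np.zeros(dim_h)             # initial hidden state
x = rng.standard_normal(dim_x)  # one plantar-pressure channel value
h = gru_step(x, h, Wz, bz, Wr, br, Wh, bh)
print(h.shape)  # (125,)
```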
Referring to FIG. 6, in this embodiment the input to the first GRU layer is the eight channels of FSR plantar pressure data, with 8 sequence nodes in total, i.e., one channel's datum is input at each node. To mine the latent features of the data thoroughly, the dimension of the hidden state h_t was set to 125 after repeated tests. To prevent overfitting, Dropout with probability 0.5 is applied at the output of this GRU layer. Likewise, the input to the second GRU layer is the output of the first; its hidden state h_t is also 125-dimensional, with Dropout of 0.5 at its output. At the third, fully connected layer, because the output of the GRU unit at the last sequence node integrates the information of the whole sequence, the hidden state of the second GRU layer's last sequence node (125-dimensional) is taken as the input to the fully connected layer, whose output has 5 nodes, corresponding to the five-class task. Finally, the output layer uses softmax as the activation function:
ŷ_j = exp(w_j·h) / Σ_{k=0}^{4} exp(w_k·h) (5)
where w_j (j = 0, 1, 2, 3, 4) is the weight vector from the hidden layer to output node j. In this embodiment the loss function of the GRU network is the cross-entropy loss, of the form:
L = −(1/m) Σ_{i=1}^{m} Σ_{j=1}^{n} y_{ij} ln ŷ_{ij} (6)
where m is the number of samples in the current batch and n is the number of classes.
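A NumPy sketch of the softmax output (eq. 5) and the batch cross-entropy loss (eq. 6), with illustrative shapes (4 samples, 5 gait classes):

```python
import numpy as np

def softmax(logits):
    """Softmax over the last axis, shifted by the max for numerical stability."""
    shifted = logits - logits.max(axis=-1, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=-1, keepdims=True)

def cross_entropy(probs, onehot):
    """Mean cross-entropy over a batch of m samples (eq. 6)."""
    m = probs.shape[0]
    return -np.sum(onehot * np.log(probs + 1e-12)) / m

# Toy batch: 4 samples, 5 gait classes; each row's largest logit is its class.
logits = np.array([[2.0, 0.1, 0.1, 0.1, 0.1],
                   [0.1, 2.0, 0.1, 0.1, 0.1],
                   [0.1, 0.1, 2.0, 0.1, 0.1],
                   [0.1, 0.1, 0.1, 2.0, 0.1]])
labels = np.eye(5)[[0, 1, 2, 3]]  # one-hot true classes

p = softmax(logits)
loss = cross_entropy(p, labels)
print(p.shape)  # (4, 5)
```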
Step 4: convert the labels obtained in step 2 into one-hot codes (00001, 00010, 00100, 01000, 10000); after pairing data queues with labels one-to-one, take 70% as the training set and the remaining 30% as the test set. Feed the training set to the GRU network model of step 3 for training with five-fold cross-validation. Training uses the Adam optimizer; to prevent gradient explosion, the L2 norm of the gradient is clipped when the gradient is computed, with clipping threshold c (set to 5) and gradient g:
g ← (c / ||g||_2) · g, if ||g||_2 > c (7)
in this embodiment, the length of the data queue of the batch size, that is, the length of the data queue taken in each round of training is 100, the learning rate is 0.002, and the number of training rounds is 20. And evaluating the classification effect of the model by using the test set after the training is finished.
Verification shows that the model's five-class accuracy on the test set in this embodiment is 96.32%, with an F1 score (the harmonic mean of precision and recall) of 97.45%. FIG. 7 shows the classification process of the GRU network — specifically, the five-class probabilities after a group of test data passes through each GRU node. The true class of the example is 3; the network predicts the class correctly after passing through 3 GRU nodes, and after all 8 nodes the probability of the correct class is very high. In addition, classification results over several gait cycles are shown in FIG. 8, where the predicted and true values essentially coincide. Finally, 100 groups of data were drawn at random from the test set to illustrate the model's classification performance; the confusion matrix of these test results is shown in FIG. 9.
Step 5: load the best model from the training and evaluation of step 4 on the Raspberry Pi for online real-time classification. The model's input is the 8-dimensional plantar pressure data after the A/D conversion of step 1; classification requires only a forward pass of the input through the model. The Raspberry Pi is an RPi 4B running the TensorFlow 1.13.1 deep learning framework.
The embodiments of the present invention have been described in detail, but the description covers only preferred embodiments and should not be construed as limiting the scope of the invention. The implementation of the steps may be varied, and all equivalent changes and modifications made within the scope of the invention should be covered by this patent.

Claims (9)

1. A GRU-based gait recognition method, characterized in that it comprises the following steps:
step 1: collect plantar pressure during walking with an eight-cell, high-dynamic-range FSR film pressure sensor insole worn on the sole of the prosthesis, timestamp the collected data, and transmit it via a wireless communication module to a Raspberry Pi or a server;
step 2: analyze and process the data offline according to the subject's typical walking cadence and the timestamps, combined with the plantar pressure signals, and attach to each data queue the corresponding label among the five gait phases, forming a set of data-label pairs;
step 3: build a gated recurrent unit network model with a four-layer structure: two GRU layers, a fully connected layer, and a softmax output layer. The input data feed the first GRU layer; the output of the first GRU layer feeds the second GRU layer; the output of the second GRU layer connects to a fully connected hidden layer, which connects to the softmax output layer. Define the GRU units, the fully connected layer, and each activation function;
step 4: divide the set of data-label pairs obtained in step 2 into a training set and a test set; feed the training set to the GRU network model built in step 3 for training, and use the test set to evaluate the model's classification performance after training;
step 5: perform online real-time classification with the model obtained from the training and evaluation of step 4.
2. The GRU-based gait recognition method according to claim 1, characterized in that: in step 1, the FSR film pressure sensor is placed under the prosthetic foot, fitted against the sole.
3. The GRU-based gait recognition method according to claim 1, characterized in that: in step 2, the labels correspond to the five typical gait phases — swing, heel strike, full-foot contact, forefoot contact, and toe-off — and the labels are one-hot encoded.
4. The GRU-based gait recognition method according to claim 3, characterized in that: in step 2, label calibration proceeds in three steps: (1) preliminary division — the prosthesis wearer walks a number of steps normally; the duration and cadence of each gait phase are tallied; individual steps are delimited using the start time, end time, and cadence from the timestamps; and the data queue of each gait phase is preliminarily divided according to each phase's duration ratio and the timestamps; (2) analysis of each phase's typical features — the prosthesis wearer holds the posture of each gait phase for several seconds while the data characteristics of that phase are analyzed; (3) final division — combining the preliminarily divided gait phases with each phase's typical data characteristics: the five gait phases are first divided by duration and cadence; then the first 25% and last 25% of each phase's data queue are discarded and the middle 50% is kept; the data matching each phase's typical characteristics form the training and test sets.
5. The GRU-based gait recognition method according to claim 1, characterized in that: in step 3, the input data are eight-dimensional feature vectors formed from the eight-channel plantar pressure signals after A/D conversion; each dimension is fed to one GRU node, so the whole sequence is 8 nodes long.
6. The GRU-based gait recognition method according to claim 1, characterized in that: in step 3, the number of output-layer nodes equals the number of gait classes in the current model.
7. The GRU-based gait recognition method according to claim 1, characterized in that: in step 3, the weights of the GRU network model are optimized with the Adam algorithm; the activation function of the fully connected hidden layer is a ReLU, Sigmoid, or tanh function; and the output-layer activation function is the Softmax function.
8. The GRU-based gait recognition method according to claim 7, characterized in that: in step 3, the hidden state h_t of the GRU units is 125-dimensional.
9. The GRU-based gait recognition method according to claim 1, characterized in that: the method uses an STM32 for data acquisition, with the Raspberry Pi and a server cooperating to run the recognition algorithm: during offline training, data are collected in step 1 and model training is performed on the server in steps 2 to 4; during online deployment, data are collected in step 1, and in step 5 the best model obtained from the training and evaluation of step 4 is loaded on the Raspberry Pi for real-time classification.
CN202010315195.4A 2020-04-21 2020-04-21 Gait recognition method based on GRU Active CN111611859B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010315195.4A CN111611859B (en) 2020-04-21 2020-04-21 Gait recognition method based on GRU


Publications (2)

Publication Number Publication Date
CN111611859A true CN111611859A (en) 2020-09-01
CN111611859B CN111611859B (en) 2022-07-22

Family

ID=72204687

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010315195.4A Active CN111611859B (en) 2020-04-21 2020-04-21 Gait recognition method based on GRU

Country Status (1)

Country Link
CN (1) CN111611859B (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109784412A (en) * 2019-01-23 2019-05-21 复旦大学 The multiple sensor signals fusion method based on deep learning for gait classification
CN110537922A (en) * 2019-09-09 2019-12-06 北京航空航天大学 Human body walking process lower limb movement identification method and system based on deep learning


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112487902A (en) * 2020-11-20 2021-03-12 杭州电子科技大学 Gait phase classification method based on TCN-HMM and oriented to exoskeleton
CN112487902B (en) * 2020-11-20 2024-02-02 杭州电子科技大学 Exoskeleton-oriented gait phase classification method based on TCN-HMM
CN113780223A (en) * 2021-09-09 2021-12-10 北京信息科技大学 Gait recognition method and device for artificial limb and storage medium
CN113673788A (en) * 2021-09-23 2021-11-19 国网天津市电力公司 Photovoltaic power generation power prediction method based on decomposition error correction and deep learning
CN117079479A (en) * 2023-10-17 2023-11-17 之江实验室 Traffic signal control method and device for subsequent reinforcement learning of space-time prediction
CN117079479B (en) * 2023-10-17 2024-01-16 之江实验室 Traffic signal control method and device for subsequent reinforcement learning of space-time prediction

Also Published As

Publication number Publication date
CN111611859B (en) 2022-07-22

Similar Documents

Publication Publication Date Title
CN111611859B (en) Gait recognition method based on GRU
CN110537922B (en) Human body walking process lower limb movement identification method and system based on deep learning
CN110232412B (en) Human gait prediction method based on multi-mode deep learning
CN110363152B (en) Method for identifying road condition of lower limb prosthesis based on surface electromyographic signals
Wang et al. Human Gait Recognition System Based on Support Vector Machine Algorithm and Using Wearable Sensors.
CN112754468A (en) Human body lower limb movement detection and identification method based on multi-source signals
Chinimilli et al. Human activity recognition using inertial measurement units and smart shoes
CN113314209B (en) Human body intention identification method based on weighted KNN
Li et al. EEG signal classification method based on feature priority analysis and CNN
Zhang et al. Pathological gait detection of Parkinson's disease using sparse representation
Kang et al. Subject-independent continuous locomotion mode classification for robotic hip exoskeleton applications
CN111401435B (en) Human body motion mode identification method based on motion bracelet
Sun et al. Continuous estimation of human knee joint angles by fusing kinematic and myoelectric signals
CN113780223A (en) Gait recognition method and device for artificial limb and storage medium
Hu et al. A novel fusion strategy for locomotion activity recognition based on multimodal signals
CN112487902B (en) Exoskeleton-oriented gait phase classification method based on TCN-HMM
Benalcázar et al. A model for real-time hand gesture recognition using electromyography (EMG), covariances and feed-forward artificial neural networks
KR102194313B1 (en) Apparatus and method for identifying individuals by performing neural network analysis for various detection information
KR102302719B1 (en) Apparatus and method for classification of gait type by performing neural network analysis for various detection information
Negi et al. Human locomotion classification for different terrains using machine learning techniques
CN115019393A (en) Exoskeleton robot gait recognition system and method based on convolutional neural network
Nieuwoudt et al. Investigation of real-time control of finger movements utilising surface EMG signals
Cene et al. Upper-limb movement classification through logistic regression sEMG signal processing
KR102350593B1 (en) Apparatus and method for classifying gait pattern based on multi modal sensor using deep learning ensemble
Ling-Ling et al. Electromyographic movement pattern recognition based on random forest algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant