CN111481208A - Auxiliary system, method and storage medium applied to joint rehabilitation - Google Patents


Info

Publication number
CN111481208A
Authority
CN
China
Prior art keywords
joint
predicted
angle
image
network
Prior art date
Legal status
Granted
Application number
CN202010248427.9A
Other languages
Chinese (zh)
Other versions
CN111481208B (en)
Inventor
成亮
熊运生
朱勇
林涨源
Current Assignee
Xiangya Hospital of Central South University
Original Assignee
Xiangya Hospital of Central South University
Priority date
Filing date
Publication date
Application filed by Xiangya Hospital of Central South University
Priority to CN202010248427.9A
Publication of CN111481208A
Application granted
Publication of CN111481208B
Status: Active

Classifications

    • A61B5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B5/1128 Measuring movement of the entire body or parts thereof, e.g. mobility of a limb, using image analysis
    • A61B5/22 Ergometry; Measuring muscular strength or the force of a muscular blow
    • A61B5/7267 Classification of physiological signals or data, e.g. using neural networks, involving training the classification device

Abstract

The invention discloses an auxiliary system, a method and a storage medium applied to joint rehabilitation. The system comprises: a joint image acquisition module for acquiring a joint image of a patient and preprocessing it; a joint angle prediction module for predicting, through a joint rehabilitation evaluation model and based on the preprocessed joint image, a first predicted joint angle and three predicted joint key points; and an angle synthesizer for calculating a second predicted joint angle from the three predicted joint key points and averaging the first and second predicted joint angles to obtain a final joint angle evaluation value. Because only a joint image of the patient is needed for the model to calculate and predict the joint angle, the joint mobility can be sent directly to the surgeon or rehabilitation therapist. This removes the tedious steps of manually measuring and calculating the joint rehabilitation angle, improves the patient's medical compliance, and lets the surgeon or therapist learn of the patient's rehabilitation progress immediately.

Description

Auxiliary system, method and storage medium applied to joint rehabilitation
Technical Field
The invention relates to the technical field of medical equipment, in particular to an auxiliary system, a method and a storage medium applied to joint rehabilitation.
Background
Many elbow fracture patients are currently treated with internal fixation surgery or conservative plaster fixation. Because the limb is immobilized for a long time, muscle atrophy, intra-articular adhesion or ligament contracture easily develop. Early, properly dosed functional exercise promotes blood circulation in the affected limb, reduces muscle atrophy, prevents joint stiffness and promotes fracture healing; its clinical value is well established. The Chinese rehabilitation system consists of tertiary-hospital rehabilitation departments, secondary-hospital rehabilitation departments (specialized rehabilitation hospitals), community rehabilitation centers (outpatient clinics) and home rehabilitation. Tertiary-hospital rehabilitation departments mainly undertake acute-phase rehabilitation therapy, secondary-hospital departments undertake recovery-phase rehabilitation, and community rehabilitation centers or home rehabilitation mainly undertake maintenance-phase rehabilitation. At present, however, high-quality resources, in both facilities and personnel, are concentrated in tertiary hospitals in large cities. Many patients come from counties or remote rural areas with no good rehabilitation facilities nearby, and travelling to a big-city tertiary hospital for long-term rehabilitation exercise is both inconvenient and far from cheap. For patients of limited means who already strain to pay for the surgery itself, the additional high rehabilitation cost is hard to bear, so standardized flexion-extension rehabilitation training at a tertiary hospital after elbow joint surgery is out of reach for many.
A second problem is that, although doctors routinely teach patients how to perform flexion and extension exercises of the elbow joint after discharge, many patients and family members dare not carry them out, because of pain, difficulty in judging the proper training intensity, and fear of fracture re-displacement or dislocation. By the one-month postoperative follow-up visit the elbow joint is often already stiff, and recovery is unsatisfactory.
Disclosure of Invention
The invention provides an auxiliary system, a method and a storage medium applied to joint rehabilitation, and aims to solve the prior-art problem that patients cannot perform standardized joint rehabilitation training after discharge.
In a first aspect, there is provided an assistance system for joint rehabilitation, comprising:
the joint image acquisition module is used for acquiring a joint image of a patient and preprocessing the joint image;
the joint angle prediction module is used for predicting, through a joint rehabilitation assessment model and based on the preprocessed joint image of the patient, a first predicted joint angle and three predicted joint key points, wherein the three predicted joint key points comprise a predicted joint axis point, a predicted joint telecentric (distal) end point and a predicted joint proximal end point, and the joint rehabilitation assessment model is trained in advance on a collected joint image set;
and the angle synthesizer is used for calculating a second predicted joint angle based on the three predicted joint key points, and averaging the first predicted joint angle and the second predicted joint angle to obtain the final joint angle evaluation value of the patient.
With the above auxiliary system applied to joint rehabilitation, only a joint image of the patient need be acquired for the pre-trained joint rehabilitation assessment model to calculate and predict the joint angle, so that the joint mobility can be sent directly to the surgeon or rehabilitation therapist. This removes the tedious steps of manually measuring and calculating the joint recovery angle; it is accurate, digital, intuitive and convenient; it improves the patient's medical compliance; it lets the surgeon or therapist learn of the patient's rehabilitation condition immediately and give remote guidance in time; and it aids the collection and statistical analysis of follow-up data, from which clinical efficacy can be summarized and medical technique improved.
Further, the joint rehabilitation evaluation model is obtained by training through the following method:
collecting a plurality of joint images and preprocessing the joint images;
marking the joint axis point, the joint telecentric end point and the joint proximal end point of each preprocessed joint image, and calculating the joint angle; constructing a training sample set by taking a preprocessed joint image and a corresponding joint axis point, a joint telecentric end point, a joint proximal end point and a joint angle as a sample;
based on a training sample set, taking the preprocessed joint image as input, and taking a joint axis point, a joint telecentric end point, a joint proximal end point and a joint angle corresponding to the joint image as output, and training to obtain a joint rehabilitation assessment model;
the joint rehabilitation evaluation model comprises a feature extraction sub-network, a key point detection sub-network and an angle regression sub-network;
the feature extraction sub-network takes the preprocessed joint image as input and outputs a feature image based on a convolutional neural network;
the key point detection sub-network takes the feature image output by the feature extraction sub-network as input, and outputs a predicted joint axis point, a predicted joint telecentric endpoint and a predicted joint proximal endpoint based on the deconvolution layer neural network;
the angle regression sub-network takes the characteristic image output by the characteristic extraction sub-network as input and outputs and predicts the joint angle based on the full connection layer neural network.
In the joint rehabilitation evaluation model, the feature extraction sub-network, key point detection sub-network and angle regression sub-network form a multi-task learning system: the key point detection task and the angle regression task are carried out simultaneously. The two learning tasks complement each other, each contributing the useful information it contains to help the other obtain more accurate model parameters. In multi-task learning, information is shared between tasks and knowledge migrates from one task to another. On the one hand, this information sharing and knowledge transfer improves the overall learning effect; on the other hand, when sample data are insufficient, a single task can be supplemented by what the other tasks have learned, alleviating data sparsity. In addition, the angle regression sub-network obtains the joint's range-of-motion angle directly, while the key point detection sub-network obtains it indirectly through a formula; averaging the two as the final angle further reduces the error of either single task.
Further, the joint rehabilitation assessment model is obtained by training with the preprocessed joint image as input and the joint axis point, the joint telecentric end point, the joint proximal end point and the joint angle corresponding to the joint image as output, and specifically comprises the following steps:
a1, setting training parameters: the Adam optimizer is adopted with mean square error loss as the loss function; the batch size is preset as a, the number of training epochs as n, and the learning rate as β;
a2, initializing network parameters: downloading pre-trained ResNet50 model parameters in an ImageNet data set from the Internet, initializing parameters of a feature extraction sub-network into the pre-trained ResNet50 model parameters, and initializing parameters of a key point detection sub-network and an angle regression sub-network in a random mode;
a3, sequentially extracting a joint images (one mini-batch) from the training sample set and inputting them into the joint rehabilitation evaluation model, obtaining the predicted joint axis point, predicted joint telecentric end point and predicted joint proximal end point from the output of the key point detection sub-network, and the predicted joint angle from the output of the angle regression sub-network;
a4, after the a joint images have been passed through the joint rehabilitation evaluation model and the outputs obtained, calculating the key point mean square error loss and the joint angle mean square error loss;
a5, taking the sum of the key point mean square error loss and the joint angle mean square error loss as a final error value, and then updating the network parameters of the joint rehabilitation evaluation model by using a back propagation algorithm according to a preset learning rate;
a6, repeating the steps A3-A5 until n rounds of training are completed, and obtaining the final joint rehabilitation evaluation model.
Further, the system also comprises a pressure visualization module;
the pressure visualization module includes:
the flexible pressure sensor is used for being installed at the joint of a patient, collecting a pressure value applied when the joint moves passively and transmitting the pressure value to the processing module;
the processing module is used for receiving the pressure value transmitted by the flexible pressure sensor, converting the pressure value into a digital signal and transmitting the digital signal to the display module;
and the display module is used for receiving the digital signal of the processing module and displaying the pressure value applied when the joint is in passive motion.
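The processing module's conversion step can be sketched as a simple linear mapping from a raw sensor reading to a pressure value. The 10-bit ADC range and the 50 N full scale below are illustrative assumptions, not values from the patent; a real flexible pressure sensor would need its own calibration curve:

```python
def adc_to_newtons(raw: int, adc_max: int = 1023, full_scale_n: float = 50.0) -> float:
    """Convert a raw ADC reading from the pressure sensor into newtons.

    Assumes a linear sensor response; adc_max and full_scale_n are
    illustrative placeholders, not calibrated values.
    """
    if not 0 <= raw <= adc_max:
        raise ValueError("raw reading out of ADC range")
    return full_scale_n * raw / adc_max
```

The display module would then render the returned value so the patient can see the force applied during passive joint motion.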
In a second aspect, there is provided an assistance method applied to joint rehabilitation, comprising:
acquiring a joint image of a patient and preprocessing the joint image;
taking a preprocessed patient joint image as input, and predicting through a joint rehabilitation assessment model to obtain a first predicted joint angle and three predicted joint key points, wherein the three predicted joint key points comprise a predicted joint axis point, a predicted joint telecentric end point and a predicted joint proximal end point, and the joint rehabilitation assessment model is obtained by training in advance based on an acquired joint image set;
and calculating a second predicted joint angle based on the three predicted joint key points, and averaging the first predicted joint angle and the second predicted joint angle to obtain the final joint angle evaluation value of the patient.
Further, the joint rehabilitation evaluation model is obtained by training through the following method:
collecting a plurality of joint images and preprocessing the joint images;
marking the joint axis point, the joint telecentric end point and the joint proximal end point of each preprocessed joint image, and calculating the joint angle; constructing a training sample set by taking a preprocessed joint image and a corresponding joint axis point, a joint telecentric end point, a joint proximal end point and a joint angle as a sample;
based on a training sample set, taking the preprocessed joint image as input, and taking a joint axis point, a joint telecentric end point, a joint proximal end point and a joint angle corresponding to the joint image as output, and training to obtain a joint rehabilitation assessment model;
the joint rehabilitation evaluation model comprises a feature extraction sub-network, a key point detection sub-network and an angle regression sub-network;
the feature extraction sub-network takes the preprocessed joint image as input and outputs a feature image based on a convolutional neural network;
the key point detection sub-network takes the feature image output by the feature extraction sub-network as input, and outputs a predicted joint axis point, a predicted joint telecentric endpoint and a predicted joint proximal endpoint based on the deconvolution layer neural network;
the angle regression sub-network takes the characteristic image output by the characteristic extraction sub-network as input and outputs and predicts the joint angle based on the full connection layer neural network.
Further, the joint rehabilitation assessment model is obtained by training with the preprocessed joint image as input and the joint axis point, the joint telecentric end point, the joint proximal end point and the joint angle corresponding to the joint image as output, and specifically comprises the following steps:
a1, setting training parameters: the Adam optimizer is adopted with mean square error loss as the loss function; the batch size is preset as a, the number of training epochs as n, and the learning rate as β;
a2, initializing network parameters: downloading pre-trained ResNet50 model parameters in an ImageNet data set from the Internet, initializing parameters of a feature extraction sub-network into the pre-trained ResNet50 model parameters, and initializing parameters of a key point detection sub-network and an angle regression sub-network in a random mode;
a3, sequentially extracting a joint images (one mini-batch) from the training sample set and inputting them into the joint rehabilitation evaluation model, obtaining the predicted joint axis point, predicted joint telecentric end point and predicted joint proximal end point from the output of the key point detection sub-network, and the predicted joint angle from the output of the angle regression sub-network;
a4, after the a joint images have been passed through the joint rehabilitation evaluation model and the outputs obtained, calculating the key point mean square error loss and the joint angle mean square error loss;
a5, taking the sum of the key point mean square error loss and the joint angle mean square error loss as a final error value, and then updating the network parameters of the joint rehabilitation evaluation model by using a back propagation algorithm according to a preset learning rate;
a6, repeating the steps A3-A5 until n rounds of training are completed, and obtaining the final joint rehabilitation evaluation model.
Further, the mean square error loss of the key point is calculated by the following formula:
$$L_1=\frac{1}{a}\sum_{i=1}^{a}\Big[(x_{Ai}-x_{Di})^2+(y_{Ai}-y_{Di})^2+(x_{Bi}-x_{Ei})^2+(y_{Bi}-y_{Ei})^2+(x_{Ci}-x_{Fi})^2+(y_{Ci}-y_{Fi})^2\Big]$$

where $L_1$ denotes the key point mean square error loss; $x_{Ai}$ and $y_{Ai}$ denote the X and Y coordinates of the predicted joint axis point of the i-th joint image, and $x_{Di}$ and $y_{Di}$ those of its real joint axis point; $x_{Bi}$ and $y_{Bi}$ denote the coordinates of the predicted joint telecentric end point, and $x_{Ei}$ and $y_{Ei}$ those of the real telecentric end point; $x_{Ci}$ and $y_{Ci}$ denote the coordinates of the predicted joint proximal end point, and $x_{Fi}$ and $y_{Fi}$ those of the real joint proximal end point;
the joint angle mean square error loss is calculated by the following formula:
$$L_2=\frac{1}{a}\sum_{i=1}^{a}\left(J_{Ti}-J_{1i}\right)^2$$

where $L_2$ denotes the joint angle mean square error loss, $J_{Ti}$ the true joint angle of the i-th joint image, and $J_{1i}$ its predicted joint angle.
Further, collecting a plurality of joint images and preprocessing the joint images, wherein the preprocessing comprises scaling each joint image to a preset size;
the method comprises the following steps of after the collection of a plurality of joint images and the preprocessing of the joint images:
and (4) sequentially rotating each joint image by a preset angle for b times to generate b new joint images.
In a third aspect, a computer readable storage medium is provided, the storage medium comprising stored program instructions adapted to be loaded by a processor and to perform the assistance method applied to joint rehabilitation as described above.
Advantageous effects
The invention provides an auxiliary system, method and storage medium applied to joint rehabilitation. Only a joint image of the patient need be acquired for the pre-trained joint rehabilitation evaluation model to calculate and predict the joint angle, so that the joint mobility can be sent directly to the surgeon or rehabilitation therapist. This removes the tedious steps of manually measuring and calculating the joint recovery angle; it is accurate, digital, intuitive and convenient; it improves the patient's medical compliance; it lets the surgeon or therapist learn of the patient's rehabilitation condition immediately and give remote guidance in time; and it aids the collection and statistical analysis of follow-up data, from which clinical efficacy can be summarized and medical technique improved.
Drawings
Fig. 1 is a flowchart of an assisting method applied to joint rehabilitation according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of an auxiliary system for joint rehabilitation according to an embodiment of the present invention;
fig. 3 is a schematic diagram of the positions of key points of three joints according to an embodiment of the present invention.
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.
Example 1
The embodiment provides an auxiliary method applied to joint rehabilitation, as shown in fig. 1, including:
acquiring a joint image of a patient and preprocessing the joint image;
taking a preprocessed patient joint image as input, and predicting through a joint rehabilitation assessment model to obtain a first predicted joint angle and three predicted joint key points, wherein the three predicted joint key points comprise a predicted joint axis point, a predicted joint telecentric end point and a predicted joint proximal end point, and the joint rehabilitation assessment model is obtained by training in advance based on an acquired joint image set;
and calculating a second predicted joint angle based on the three predicted joint key points, and averaging the first predicted joint angle and the second predicted joint angle to obtain the final joint angle evaluation value of the patient. The second predicted joint angle J2 is calculated from the three predicted key points by the following formulas:
$$a=\sqrt{(x_B-x_C)^2+(y_B-y_C)^2}$$
$$b=\sqrt{(x_A-x_B)^2+(y_A-y_B)^2}$$
$$c=\sqrt{(x_A-x_C)^2+(y_A-y_C)^2}$$
$$J_2=\arccos\left(\frac{b^2+c^2-a^2}{2bc}\right)$$

where $x_A$ and $y_A$ denote the X and Y coordinates of the predicted joint axis point, $x_B$ and $y_B$ those of the predicted joint telecentric end point, and $x_C$ and $y_C$ those of the predicted joint proximal end point.
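The law-of-cosines computation above can be sketched as follows; the function name and the (x, y) tuple interface are illustrative, not from the patent:

```python
import math

def predicted_joint_angle(axis, distal, proximal):
    """Angle J2 (in degrees) at the joint axis point, computed from the
    predicted axis point, telecentric (distal) end point and proximal
    end point by the law of cosines."""
    a = math.dist(distal, proximal)   # side opposite the axis point
    b = math.dist(axis, distal)
    c = math.dist(axis, proximal)
    cos_j2 = (b ** 2 + c ** 2 - a ** 2) / (2 * b * c)
    # clamp against floating-point drift before taking the arccosine
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_j2))))
```

For example, with the axis point at (0, 0), the distal end point at (1, 0) and the proximal end point at (0, 1), the function returns 90 degrees.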
The joint rehabilitation assessment model is obtained by training through the following method:
collecting a plurality of joint images and preprocessing the joint images; wherein the preprocessing includes scaling each joint image to a preset size, in this embodiment, scaling each joint image to 224 × 224 pixels;
sequentially rotating each preprocessed joint image by a preset angle b times to generate b new joint images; in this embodiment the preset angle is 20 degrees and b is 18, so each joint image yields 18 augmented joint images;
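When an image is rotated for augmentation, its annotated key points must be rotated with it. A minimal sketch of that bookkeeping, assuming rotation about the centre of a 224x224 image (the helper and example coordinates are illustrative):

```python
import math

def rotate_point(p, center, deg):
    """Rotate 2-D point p about center by deg degrees, counter-clockwise."""
    t = math.radians(deg)
    dx, dy = p[0] - center[0], p[1] - center[1]
    return (center[0] + dx * math.cos(t) - dy * math.sin(t),
            center[1] + dx * math.sin(t) + dy * math.cos(t))

center = (112.0, 112.0)        # centre of a 224x224 image
axis_point = (150.0, 112.0)    # example annotated key point

# b = 18 rotations in 20-degree steps, as in this embodiment
augmented = [rotate_point(axis_point, center, 20 * k) for k in range(1, 19)]
```

Each of the 18 rotated images keeps a consistently rotated copy of its joint axis, telecentric and proximal annotations.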
marking the joint axis point, the joint telecentric end point and the joint proximal end point of each preprocessed joint image, and calculating the joint angle, wherein the positions of the joint axis point 4, the joint telecentric end point 5 and the joint proximal end point 6 are shown in figure 3; constructing a training sample set by taking a preprocessed joint image and a corresponding joint axis point, a joint telecentric end point, a joint proximal end point and a joint angle as a sample; wherein the joint angle is calculated by the following formula:
$$d=\sqrt{(x_E-x_F)^2+(y_E-y_F)^2}$$
$$e=\sqrt{(x_D-x_E)^2+(y_D-y_E)^2}$$
$$f=\sqrt{(x_D-x_F)^2+(y_D-y_F)^2}$$
$$J_T=\arccos\left(\frac{e^2+f^2-d^2}{2ef}\right)$$

where $x_D$ and $y_D$ denote the X and Y coordinates of the annotated joint axis point, $x_E$ and $y_E$ those of the joint telecentric end point, and $x_F$ and $y_F$ those of the joint proximal end point;
based on a training sample set, taking the preprocessed joint image as input, and taking a joint axis point, a joint telecentric end point, a joint proximal end point and a joint angle corresponding to the joint image as output, and training to obtain a joint rehabilitation assessment model;
the joint rehabilitation evaluation model comprises a feature extraction sub-network, a key point detection sub-network and an angle regression sub-network;
the feature extraction sub-network takes the preprocessed joint image as input and outputs a feature image based on a convolutional neural network;
the key point detection sub-network takes the feature image output by the feature extraction sub-network as input, and outputs a predicted joint axis point, a predicted joint telecentric endpoint and a predicted joint proximal endpoint based on the deconvolution layer neural network;
the angle regression sub-network takes the characteristic image output by the characteristic extraction sub-network as input and outputs and predicts the joint angle based on the full connection layer neural network.
In the joint rehabilitation evaluation model, the feature extraction sub-network, key point detection sub-network and angle regression sub-network form a multi-task learning system: the key point detection task and the angle regression task are carried out simultaneously. The two learning tasks complement each other, each contributing the useful information it contains to help the other obtain more accurate model parameters. In multi-task learning, information is shared between tasks and knowledge migrates from one task to another. On the one hand, this information sharing and knowledge transfer improves the overall learning effect; on the other hand, when sample data are insufficient, a single task can be supplemented by what the other tasks have learned, alleviating data sparsity. In addition, the angle regression sub-network obtains the joint's range-of-motion angle directly, while the key point detection sub-network obtains it indirectly through a formula; averaging the two as the final angle further reduces the error of either single task.
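A minimal PyTorch sketch of the three-sub-network layout described above. The tiny stand-in backbone and all layer sizes are illustrative assumptions; the patent uses a ResNet50 feature extractor, a deconvolutional key point head (which may output heatmaps rather than the six raw coordinates used here), and a fully connected angle regression head:

```python
import torch
import torch.nn as nn

class JointRehabModel(nn.Module):
    """Shared feature extractor feeding a deconvolutional key-point head
    and a fully connected angle head (multi-task layout, sizes illustrative)."""
    def __init__(self):
        super().__init__()
        # stand-in backbone: 224x224x3 input -> 7x7x64 feature image
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(7),
        )
        # key-point head: one deconvolution (7x7 -> 14x14), then six
        # coordinates (x, y for the axis, telecentric and proximal points)
        self.keypoint_head = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(), nn.Linear(32 * 14 * 14, 6),
        )
        # angle head: fully connected regression to a single joint angle
        self.angle_head = nn.Sequential(nn.Flatten(), nn.Linear(64 * 7 * 7, 1))

    def forward(self, x):
        feat = self.backbone(x)          # shared feature image
        return self.keypoint_head(feat), self.angle_head(feat)
```

Both heads read the same feature image, which is what makes the key point and angle tasks share information during training.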
In detail, the joint rehabilitation assessment model is obtained by training with the preprocessed joint image as input and the joint axis point, the joint telecentric end point, the joint proximal end point and the joint angle corresponding to the joint image as output, and specifically comprises the following steps:
a1, setting training parameters: the Adam optimizer is adopted with mean square error loss as the loss function; the batch size (the number of samples used in one training step) is preset as a, the number of training epochs as n, and the learning rate as β. In this embodiment, a is 32, n is 140 and the initial learning rate β is 0.001; after 100 epochs β is reduced to 0.0001, and after a further 20 epochs to 0.00001;
a2, initializing network parameters: downloading pre-trained ResNet50 model parameters in an ImageNet data set from the Internet, initializing parameters of a feature extraction sub-network into the pre-trained ResNet50 model parameters, and initializing parameters of a key point detection sub-network and an angle regression sub-network in a random mode;
a3, sequentially extracting a joint images (one mini-batch) from the training sample set and inputting them into the joint rehabilitation evaluation model, obtaining the predicted joint axis point, predicted joint telecentric end point and predicted joint proximal end point from the output of the key point detection sub-network, and the predicted joint angle from the output of the angle regression sub-network;
a4, after the a joint images are input into the joint rehabilitation evaluation model and the outputs are obtained, calculating the key point mean square error loss and the joint angle mean square error loss; wherein the key point mean square error loss is calculated by the following formula:
$$L_1 = \frac{1}{N}\sum_{i=1}^{N}\Big[(x_{Ai}-x_{Di})^2+(y_{Ai}-y_{Di})^2+(x_{Bi}-x_{Ei})^2+(y_{Bi}-y_{Ei})^2+(x_{Ci}-x_{Fi})^2+(y_{Ci}-y_{Fi})^2\Big]$$
wherein L1 represents the key point mean square error loss and N represents the number of joint images in the mini-batch; x_Ai and y_Ai respectively represent the X and Y coordinates of the predicted joint axis point of the ith joint image, and x_Di and y_Di those of the real joint axis point; x_Bi and y_Bi respectively represent the X and Y coordinates of the predicted joint telecentric end point, and x_Ei and y_Ei those of the real joint telecentric end point; x_Ci and y_Ci respectively represent the X and Y coordinates of the predicted joint proximal end point, and x_Fi and y_Fi those of the real joint proximal end point;
the joint angle mean square error loss is calculated by the following formula:
$$L_2 = \frac{1}{N}\sum_{i=1}^{N}\big(J_{Ti}-J_{1i}\big)^2$$
wherein L2 represents the joint angle mean square error loss, J_Ti represents the true joint angle value of the ith joint image, and J_1i represents the predicted joint angle value of the ith joint image.
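Both losses can be computed in a few lines of plain Python. A sketch, assuming the squared coordinate errors of the three key points are summed per image and then averaged over the N images of the mini-batch (the exact normalization is not spelled out in the text):

```python
def keypoint_mse(pred_points, true_points):
    """Key point mean square error loss L1.

    pred_points / true_points: lists of N triples
    ((xA, yA), (xB, yB), (xC, yC)) giving the axis, telecentric (distal)
    and proximal end points of each joint image.
    """
    n = len(pred_points)
    total = 0.0
    for pred, true in zip(pred_points, true_points):
        for (px, py), (tx, ty) in zip(pred, true):
            total += (px - tx) ** 2 + (py - ty) ** 2
    return total / n

def angle_mse(pred_angles, true_angles):
    """Joint angle mean square error loss L2."""
    n = len(pred_angles)
    return sum((p - t) ** 2 for p, t in zip(pred_angles, true_angles)) / n
```

In training step A5, the final error value is simply `keypoint_mse(...) + angle_mse(...)`.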
A5, taking the sum of the key point mean square error loss and the joint angle mean square error loss as a final error value, and then updating the network parameters of the joint rehabilitation evaluation model by using a back propagation algorithm according to a preset learning rate;
a6, repeating the steps A3-A5 until n rounds of training are completed, and obtaining the final joint rehabilitation evaluation model.
In this embodiment, the feature extraction sub-network is connected to the key point detection sub-network and the angle regression sub-network. It takes the joint image as input, extracts features from the original image with a convolutional neural network to obtain a feature map carrying abstract features, and sends the generated feature map to the key point detection sub-network and the angle regression sub-network. The feature extractor adopts the structure of the convolutional neural network ResNet50 and comprises 5 groups of convolution modules and 5 pooling layers, connected in the order: first convolution group - first pooling layer - second convolution group - second pooling layer - third convolution group - third pooling layer - fourth convolution group - fourth pooling layer - fifth convolution group - fifth pooling layer. The input of the first convolution group is the original joint image of size 224x224 pixels; after the first group of convolutions, a feature map T1 of size 224x224 is generated and serves as the input of the first pooling layer, which reduces the size of the feature map to half of T1, giving feature map C1. C1 then serves as the input of the second convolution group for the second stage of convolution and pooling, and so on until all 5 stages of convolution and pooling are finished, finally outputting the feature map C5 of size 7x7. The 5 stages of convolution and pooling gradually extract the useful features of the original image while gradually reducing its dimensionality, lowering the subsequent computational load.
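The spatial sizes in this description follow from repeated halving: convolution keeps the size and each of the 5 pooling layers halves it, so the final size is 224 / 2^5 = 7. A quick check of the stage sizes as stated in the text (the helper name is illustrative):

```python
def stage_sizes(input_size=224, stages=5):
    """Spatial size of the feature map after each conv+pool stage.
    Convolution keeps the size; each pooling layer halves it."""
    sizes = []
    size = input_size
    for _ in range(stages):
        size //= 2          # pooling layer: half the previous size
        sizes.append(size)
    return sizes
```

Calling `stage_sizes()` gives the sizes of C1 through C5: 112, 56, 28, 14 and 7.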
The key point detection sub-network is connected to the feature extraction sub-network and takes the feature map C5 output by it as input. It first enlarges the pixel size of the feature map through a deconvolution-layer network and then detects the key points of the joint, obtaining the coordinates of the three key points of the joint image: the joint axis point, the joint telecentric end point and the joint proximal end point. The key point detection sub-network consists of three deconvolution layers and one 1x1 convolution layer. Each deconvolution layer uses 256 convolution kernels of size 4x4 and doubles the size of the feature map, and each is followed by a Batch Normalization operation and a ReLU activation function. The 1x1 convolution layer is connected after the three deconvolution layers and is used to generate the 3 key points.
The angle regression sub-network is connected to the feature extraction sub-network, takes the feature map C5 output by the feature extractor as input, gradually extracts the high-level semantic information in the feature map through three fully connected layers, and directly regresses the predicted joint angle value. The angle regression sub-network consists of one flattening layer (B1) and three fully connected layers (named F1, F2 and F3). The flattening layer transforms the feature map C5 into a feature vector V0 (vector size 100352); the first fully connected layer takes V0 as input and generates a feature vector V1 (vector size 4096); the second takes V1 as input and generates a feature vector V2 (vector size 1000); the third takes V2 as input and generates the angle value V3 of the joint in the original image (V3 is a scalar).
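The stated flattened size is consistent with a 7x7 feature map of 2048 channels, since 2048 × 7 × 7 = 100352; the channel count is an inference here (it matches the standard ResNet50 output but is not stated in the text). The vector sizes through the head can be checked as:

```python
def regression_head_sizes(channels=2048, spatial=7):
    """Vector sizes through the angle regression head:
    flatten (B1) -> F1 -> F2 -> F3 (scalar angle)."""
    v0 = channels * spatial * spatial   # flattening layer output V0
    return [v0, 4096, 1000, 1]          # V0, V1, V2, V3
```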
Example 2
The present embodiment provides an assistance system for joint rehabilitation, as shown in fig. 2, including:
the joint image acquisition module 1 is used for acquiring a joint image of a patient and preprocessing the joint image;
the joint angle prediction module 2 is used for predicting a first predicted joint angle and three predicted joint key points through a joint rehabilitation assessment model based on a preprocessed patient joint image, wherein the three predicted joint key points comprise a predicted joint axis point, a predicted joint telecentric end point and a predicted joint proximal end point, and the joint rehabilitation assessment model is obtained by training based on an acquired joint image set in advance;
and the angle synthesizer 3 is used for calculating to obtain a second predicted joint angle based on the three predicted joint key points, and averaging the first predicted joint angle and the second predicted joint angle to obtain a final joint angle evaluation value of the patient.
With the aid of this auxiliary system applied to joint rehabilitation, only a joint image of the patient needs to be acquired, and the pre-trained joint rehabilitation assessment model can calculate and predict the joint angle, so that the joint mobility can be sent directly to the surgeon or rehabilitation therapist. This omits the tedious step of manually calculating the joint rehabilitation angle; it is accurate, digital, intuitive and convenient; it improves the patient's medical compliance, allows the patient to learn of his or her own recovery at the earliest opportunity, and enables timely remote guidance. It is also beneficial to the collection and statistical analysis of the surgeon's follow-up data, so as to summarize clinical efficacy and improve medical technique.
Specifically, the joint rehabilitation assessment model is obtained by training through the following method:
collecting a plurality of joint images and preprocessing the joint images;
marking the joint axis point, the joint telecentric end point and the joint proximal end point of each preprocessed joint image, and calculating the joint angle; constructing a training sample set by taking a preprocessed joint image and a corresponding joint axis point, a joint telecentric end point, a joint proximal end point and a joint angle as a sample;
based on a training sample set, taking the preprocessed joint image as input, and taking a joint axis point, a joint telecentric end point, a joint proximal end point and a joint angle corresponding to the joint image as output, and training to obtain a joint rehabilitation assessment model;
the joint rehabilitation evaluation model comprises a feature extraction sub-network, a key point detection sub-network and an angle regression sub-network;
the feature extraction sub-network takes the preprocessed joint image as input and outputs a feature image based on a convolutional neural network;
the key point detection sub-network takes the feature image output by the feature extraction sub-network as input, and outputs a predicted joint axis point, a predicted joint telecentric endpoint and a predicted joint proximal endpoint based on the deconvolution layer neural network;
the angle regression sub-network takes the feature image output by the feature extraction sub-network as input and outputs a first predicted joint angle based on the full-connection layer neural network.
More specifically, the joint rehabilitation assessment model is obtained by training with the preprocessed joint image as input and the joint axis point, the joint telecentric end point, the joint proximal end point and the joint angle corresponding to the joint image as output, and specifically comprises the following steps:
a1, setting training parameters: adopting an Adam optimizer, taking the mean square error loss as the loss function, presetting the batch size as a, the number of training rounds as n, and the learning rate as β;
a2, initializing network parameters: downloading pre-trained ResNet50 model parameters in an ImageNet data set from the Internet, initializing parameters of a feature extraction sub-network into the pre-trained ResNet50 model parameters, and initializing parameters of a key point detection sub-network and an angle regression sub-network in a random mode;
a3, sequentially extracting a joint images from a training sample set to be used as a mini-batch, inputting the joint images into a joint rehabilitation evaluation model, obtaining a predicted joint axis point, a predicted joint telecentric end point and a predicted joint proximal end point from the output end of a key point detection sub-network, and obtaining a predicted joint angle from the output end of an angle regression sub-network;
a4, after the a joint images are input into the joint rehabilitation evaluation model and the outputs are obtained, calculating the key point mean square error loss and the joint angle mean square error loss;
a5, taking the sum of the key point mean square error loss and the joint angle mean square error loss as a final error value, and then updating the network parameters of the joint rehabilitation evaluation model by using a back propagation algorithm according to a preset learning rate;
a6, repeating the steps A3-A5 until n rounds of training are completed, and obtaining the final joint rehabilitation evaluation model.
In this embodiment, the system further comprises a pressure visualization module;
the pressure visualization module includes:
the flexible pressure sensor is used for being installed at the joint of a patient, collecting a pressure value applied when the joint moves passively and transmitting the pressure value to the processing module;
the processing module is used for receiving the pressure value transmitted by the flexible pressure sensor, converting the pressure value into a digital signal and transmitting the digital signal to the display module;
and the display module is used for receiving the digital signal of the processing module and displaying the pressure value applied when the joint is in passive motion.
For details of other specific implementations, reference is made to the auxiliary method applied to joint rehabilitation provided in embodiment 1, and details are not described herein.
Example 3
The present embodiment provides a computer readable storage medium comprising stored program instructions adapted to be loaded by a processor and to perform the assistance method applied to joint rehabilitation as described in embodiment 1.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The invention provides an auxiliary system, method and storage medium applied to joint rehabilitation. Only a joint image of the patient needs to be acquired, and the pre-trained joint rehabilitation evaluation model can calculate and predict the joint angle, so that the joint mobility can be sent directly to the surgeon or rehabilitation therapist. This saves the tedious steps of manually measuring and calculating the joint recovery angle; it is accurate, digital, intuitive and convenient; it helps improve the patient's medical compliance and enables the surgeon or rehabilitation therapist to learn of the patient's recovery at the earliest opportunity and give timely remote guidance; and it is also beneficial to the collection and statistical analysis of follow-up data, so as to summarize clinical efficacy and improve medical technique. The joint rehabilitation assessment model is a deep neural network comprising a feature extraction sub-network, an angle regression sub-network and a key point detection sub-network. Information sharing and knowledge migration are achieved between the angle regression task and the key point detection task, with three effects: 1. the overall detection performance and regression performance are improved; 2. the situation of sparse sample data is effectively mitigated; 3. the accuracy of angle prediction is further improved through combining the regressed angle value with the angle value obtained indirectly from the key points.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. An assistance system for joint rehabilitation, comprising:
the joint image acquisition module is used for acquiring a joint image of a patient and preprocessing the joint image;
the joint angle prediction module is used for predicting to obtain a first predicted joint angle and three predicted joint key points through a joint rehabilitation assessment model based on the preprocessed joint image of the patient, wherein the three predicted joint key points comprise a predicted joint axis point, a predicted joint telecentric end point and a predicted joint proximal end point, and the joint rehabilitation assessment model is obtained by training based on an acquired joint image set in advance;
and the angle synthesizer is used for calculating to obtain a second predicted joint angle based on the three predicted joint key points, and averaging the first predicted joint angle and the second predicted joint angle to obtain a final joint angle evaluation value of the patient.
2. The assistance system for joint rehabilitation according to claim 1, wherein the joint rehabilitation evaluation model is trained by the following method:
collecting a plurality of joint images and preprocessing the joint images;
marking the joint axis point, the joint telecentric end point and the joint proximal end point of each preprocessed joint image, and calculating the joint angle; constructing a training sample set by taking a preprocessed joint image and a corresponding joint axis point, a joint telecentric end point, a joint proximal end point and a joint angle as a sample;
based on a training sample set, taking the preprocessed joint image as input, and taking a joint axis point, a joint telecentric end point, a joint proximal end point and a joint angle corresponding to the joint image as output, and training to obtain a joint rehabilitation assessment model;
the joint rehabilitation evaluation model comprises a feature extraction sub-network, a key point detection sub-network and an angle regression sub-network;
the feature extraction sub-network takes the preprocessed joint image as input and outputs a feature image based on a convolutional neural network;
the key point detection sub-network takes the feature image output by the feature extraction sub-network as input, and outputs a predicted joint axis point, a predicted joint telecentric endpoint and a predicted joint proximal endpoint based on the deconvolution layer neural network;
the angle regression sub-network takes the feature image output by the feature extraction sub-network as input and outputs a predicted joint angle based on the full connection layer neural network.
3. The auxiliary system applied to joint rehabilitation according to claim 2, wherein the joint rehabilitation assessment model is obtained by taking the preprocessed joint image as input and taking the joint axis point, the joint telecentric end point, the joint proximal end point and the joint angle corresponding to the joint image as output through training, and the method specifically comprises the following steps:
a1, setting training parameters: adopting an Adam optimizer, taking the mean square error loss as the loss function, presetting the batch size as a, the number of training rounds as n, and the learning rate as β;
a2, initializing network parameters: downloading pre-trained ResNet50 model parameters in an ImageNet data set from the Internet, initializing parameters of a feature extraction sub-network into the pre-trained ResNet50 model parameters, and initializing parameters of a key point detection sub-network and an angle regression sub-network in a random mode;
a3, sequentially extracting a joint images from a training sample set to be used as a mini-batch, inputting the joint images into a joint rehabilitation evaluation model, obtaining a predicted joint axis point, a predicted joint telecentric end point and a predicted joint proximal end point from the output end of a key point detection sub-network, and obtaining a predicted joint angle from the output end of an angle regression sub-network;
a4, after the a joint images are input into the joint rehabilitation evaluation model and the outputs are obtained, calculating the key point mean square error loss and the joint angle mean square error loss;
a5, taking the sum of the key point mean square error loss and the joint angle mean square error loss as a final error value, and then updating the network parameters of the joint rehabilitation evaluation model by using a back propagation algorithm according to a preset learning rate;
a6, repeating the steps A3-A5 until n rounds of training are completed, and obtaining the final joint rehabilitation evaluation model.
4. The assistance system for joint rehabilitation as claimed in claim 1, further comprising a pressure visualization module;
the pressure visualization module includes:
the flexible pressure sensor is used for being installed at the joint of a patient, collecting a pressure value applied when the joint moves passively and transmitting the pressure value to the processing module;
the processing module is used for receiving the pressure value transmitted by the flexible pressure sensor, converting the pressure value into a digital signal and transmitting the digital signal to the display module;
and the display module is used for receiving the digital signal of the processing module and displaying the pressure value applied when the joint is in passive motion.
5. An assistance method applied to joint rehabilitation, comprising:
acquiring a joint image of a patient and preprocessing the joint image;
taking a preprocessed patient joint image as input, and predicting through a joint rehabilitation assessment model to obtain a first predicted joint angle and three predicted joint key points, wherein the three predicted joint key points comprise a predicted joint axis point, a predicted joint telecentric end point and a predicted joint proximal end point, and the joint rehabilitation assessment model is obtained by training in advance based on an acquired joint image set;
and calculating to obtain a second predicted joint angle based on the three predicted joint key points, and averaging the first predicted joint angle and the second predicted joint angle to obtain a final joint angle evaluation value of the patient.
6. The assistance method for joint rehabilitation according to claim 5, wherein the joint rehabilitation evaluation model is trained by the following method:
collecting a plurality of joint images and preprocessing the joint images;
marking the joint axis point, the joint telecentric end point and the joint proximal end point of each preprocessed joint image, and calculating the joint angle; constructing a training sample set by taking a preprocessed joint image and a corresponding joint axis point, a joint telecentric end point, a joint proximal end point and a joint angle as a sample;
based on a training sample set, taking the preprocessed joint image as input, and taking a joint axis point, a joint telecentric end point, a joint proximal end point and a joint angle corresponding to the joint image as output, and training to obtain a joint rehabilitation assessment model;
the joint rehabilitation evaluation model comprises a feature extraction sub-network, a key point detection sub-network and an angle regression sub-network;
the feature extraction sub-network takes the preprocessed joint image as input and outputs a feature image based on a convolutional neural network;
the key point detection sub-network takes the feature image output by the feature extraction sub-network as input, and outputs a predicted joint axis point, a predicted joint telecentric endpoint and a predicted joint proximal endpoint based on the deconvolution layer neural network;
the angle regression sub-network takes the feature image output by the feature extraction sub-network as input and outputs a predicted joint angle based on the full connection layer neural network.
7. The auxiliary method applied to joint rehabilitation according to claim 6, wherein the joint rehabilitation assessment model is obtained by taking the preprocessed joint image as input and taking a joint axis point, a joint telecentric end point, a joint proximal end point and a joint angle corresponding to the joint image as output through training, and the method specifically comprises the following steps:
a1, setting training parameters: adopting an Adam optimizer, taking the mean square error loss as the loss function, presetting the batch size as a, the number of training rounds as n, and the learning rate as β;
a2, initializing network parameters: downloading pre-trained ResNet50 model parameters in an ImageNet data set from the Internet, initializing parameters of a feature extraction sub-network into the pre-trained ResNet50 model parameters, and initializing parameters of a key point detection sub-network and an angle regression sub-network in a random mode;
a3, sequentially extracting a joint images from a training sample set to be used as a mini-batch, inputting the joint images into a joint rehabilitation evaluation model, obtaining a predicted joint axis point, a predicted joint telecentric end point and a predicted joint proximal end point from the output end of a key point detection sub-network, and obtaining a predicted joint angle from the output end of an angle regression sub-network;
a4, after the a joint images are input into the joint rehabilitation evaluation model and the outputs are obtained, calculating the key point mean square error loss and the joint angle mean square error loss;
a5, taking the sum of the key point mean square error loss and the joint angle mean square error loss as a final error value, and then updating the network parameters of the joint rehabilitation evaluation model by using a back propagation algorithm according to a preset learning rate;
a6, repeating the steps A3-A5 until n rounds of training are completed, and obtaining the final joint rehabilitation evaluation model.
8. The assistance method for joint rehabilitation according to claim 7, wherein the key point mean square error loss is calculated by the following formula:
$$L_1 = \frac{1}{N}\sum_{i=1}^{N}\Big[(x_{Ai}-x_{Di})^2+(y_{Ai}-y_{Di})^2+(x_{Bi}-x_{Ei})^2+(y_{Bi}-y_{Ei})^2+(x_{Ci}-x_{Fi})^2+(y_{Ci}-y_{Fi})^2\Big]$$
wherein L1 represents the key point mean square error loss and N represents the number of joint images in the mini-batch; x_Ai and y_Ai respectively represent the X and Y coordinates of the predicted joint axis point of the ith joint image, and x_Di and y_Di those of the real joint axis point; x_Bi and y_Bi respectively represent the X and Y coordinates of the predicted joint telecentric end point, and x_Ei and y_Ei those of the real joint telecentric end point; x_Ci and y_Ci respectively represent the X and Y coordinates of the predicted joint proximal end point, and x_Fi and y_Fi those of the real joint proximal end point;
the joint angle mean square error loss is calculated by the following formula:
$$L_2 = \frac{1}{N}\sum_{i=1}^{N}\big(J_{Ti}-J_{1i}\big)^2$$
wherein L2 represents the joint angle mean square error loss, J_Ti represents the true joint angle value of the ith joint image, and J_1i represents the predicted joint angle value of the ith joint image.
9. The assistance method applied to joint rehabilitation according to claim 6, wherein a plurality of joint images are acquired and preprocessed, wherein preprocessing comprises scaling each joint image to a preset size;
the method comprises the following steps of after the collection of a plurality of joint images and the preprocessing of the joint images:
and (4) sequentially rotating each joint image by a preset angle for b times to generate b new joint images.
10. A computer-readable storage medium, characterized in that the storage medium comprises stored program instructions adapted to be loaded by a processor and to carry out the assistance method applied to joint rehabilitation according to any one of claims 5 to 9.
CN202010248427.9A 2020-04-01 2020-04-01 Auxiliary system, method and storage medium applied to joint rehabilitation Active CN111481208B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010248427.9A CN111481208B (en) 2020-04-01 2020-04-01 Auxiliary system, method and storage medium applied to joint rehabilitation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010248427.9A CN111481208B (en) 2020-04-01 2020-04-01 Auxiliary system, method and storage medium applied to joint rehabilitation

Publications (2)

Publication Number Publication Date
CN111481208A true CN111481208A (en) 2020-08-04
CN111481208B CN111481208B (en) 2023-05-12

Family

ID=71789566

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010248427.9A Active CN111481208B (en) 2020-04-01 2020-04-01 Auxiliary system, method and storage medium applied to joint rehabilitation

Country Status (1)

Country Link
CN (1) CN111481208B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114795192A (en) * 2022-07-01 2022-07-29 佛山科学技术学院 Joint motion degree intelligent detection method and system
CN116863383A (en) * 2023-07-31 2023-10-10 山东大学齐鲁医院(青岛) Walking-aid monitoring method and device, electronic equipment and storage medium

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030153817A1 (en) * 2001-12-28 2003-08-14 Petter Knagenhjelm Pattern analysis system and method
KR20130044473A (en) * 2011-10-24 2013-05-03 Kangwon National University Industry Academic Cooperation Foundation Method for measuring movement of arm using depth sensor, and medium recording the same
CN103251419A (en) * 2013-04-25 2013-08-21 Suzhou Research Institute of Xi'an Jiaotong University Data glove for hand function rehabilitation training and assessment, and monitoring method thereof
CN103340632A (en) * 2013-06-28 2013-10-09 Beihang University Human joint angle measuring method based on feature point spatial position
KR20150004461A (en) * 2013-07-02 2015-01-13 Industry-University Cooperation Foundation Hanyang University Method and apparatus for evaluating patients' status using depth image
KR20160025416A (en) * 2014-08-27 2016-03-08 Republic of Korea (National Rehabilitation Center) Apparatus for controlling upper limb rehabilitation equipment for hemiplegic patients using joint estimation, and method thereof
CN105787439A (en) * 2016-02-04 2016-07-20 Guangzhou Newtempo Intelligent Technology Co., Ltd. Depth-image human body joint positioning method based on a convolutional neural network
CN106127204A (en) * 2016-06-30 2016-11-16 South China University of Technology Multi-directional meter-reading region detection algorithm based on fully convolutional neural networks
CN106361346A (en) * 2016-10-25 2017-02-01 Foshan University Method for computing hand rehabilitation indexes based on sensing technology
CN107115102A (en) * 2017-06-07 2017-09-01 Southwest University of Science and Technology Bone and joint function assessment method and device
CN107349570A (en) * 2017-06-02 2017-11-17 Nanjing University of Posts and Telecommunications Kinect-based upper limb rehabilitation training and assessment method
CN108229489A (en) * 2016-12-30 2018-06-29 Beijing SenseTime Technology Development Co., Ltd. Key point prediction, network training, and image processing methods, apparatus, and electronic device
CN108376235A (en) * 2018-01-15 2018-08-07 Shenzhen Yicheng Autonomous Driving Technology Co., Ltd. Image detection method, device and computer-readable storage medium
CN108498102A (en) * 2018-05-31 2018-09-07 Beijing Shangda Medical Technology Co., Ltd. Rehabilitation training method and device, storage medium, and electronic device
CN109063778A (en) * 2018-08-09 2018-12-21 Electronic Science and Technology Institute of the General Office of the CPC Central Committee Image aesthetic quality determination method and system
US20190205643A1 (en) * 2017-12-29 2019-07-04 RetailNext, Inc. Simultaneous Object Localization And Attribute Classification Using Multitask Deep Neural Networks
CN110163045A (en) * 2018-06-07 2019-08-23 Tencent Technology (Shenzhen) Co., Ltd. Gesture motion recognition method, device and equipment


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Wang Jingzhong, et al.: "Research on human body angle fitting based on BP regression neural network" (in Chinese), Computer Systems & Applications, vol. 28, no. 8, 8 August 2019 (2019-08-08), pages 2 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114795192A (en) * 2022-07-01 2022-07-29 Foshan University Joint mobility intelligent detection method and system
CN114795192B (en) * 2022-07-01 2022-09-16 Foshan University Joint mobility intelligent detection method and system
CN116863383A (en) * 2023-07-31 2023-10-10 Qilu Hospital of Shandong University (Qingdao) Walking-aid monitoring method and device, electronic device and storage medium

Also Published As

Publication number Publication date
CN111481208B (en) 2023-05-12

Similar Documents

Publication Publication Date Title
CN107485844B (en) Limb rehabilitation training method and system and embedded equipment
CN111481208A (en) Auxiliary system, method and storage medium applied to joint rehabilitation
CN109994175A (en) Health detecting method and system based on artificial intelligence
Miao et al. Upper limb rehabilitation system for stroke survivors based on multi-modal sensors and machine learning
CN113974612B (en) Automatic evaluation method and system for upper limb movement function of stroke patient
CN107115102A (en) Bone and joint function assessment method and device
CN114782497B (en) Motion function analysis method and electronic device
CN111599433B (en) Auxiliary prescription method and device for medicinal materials, storage medium and terminal
CN114049683A (en) Post-healing rehabilitation auxiliary detection system, method and medium based on three-dimensional human skeleton model
CN113283373A (en) Method for enhancing detection of limb motion parameters by depth camera
CN112641441B (en) Posture evaluation method, system, device and computer readable storage medium
Saha et al. Rehabilitation using neighbor-cluster based matching inducing artificial bee colony optimization
Huang et al. Healthcare application of in-shoe motion sensor for older adults: Frailty assessment using foot motion during gait
TW201445493A (en) A self-care system for assisting quantitative assessment of rehabilitation movement
Liu et al. Estimation of muscle forces of lower limbs based on CNN–LSTM neural network and wearable sensor system
Varga et al. Serious gaming and AI supporting treatment in rheumatoid arthritis
CN112992312B (en) Qualified monitoring method and system for spinal cord injury rehabilitation training
CN116327199A (en) Multi-mode signal analysis method, device and equipment
Howard et al. Non-contact versus contact-based sensing methodologies for in-home upper arm robotic rehabilitation
CN110197727A (en) Upper limb modeling method and motion function assessment system based on artificial neural network
CN115530814A (en) Child motion rehabilitation training method based on visual posture detection and computer deep learning
CN115130851A (en) Clinical care control method and system
CN115019388A (en) Full-automatic gait analysis method for shooting gait video by using monocular camera
CN112086155A (en) Diagnosis and treatment information structured collection method based on voice input
Li et al. A Real-Time Control Method for Upper Limb Exoskeleton Based on Active Torque Prediction Model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant