CN116869490B - Vestibule rehabilitation training dynamic evaluation system based on artificial intelligence - Google Patents
- Publication number
- CN116869490B (application CN202311155294.0A)
- Authority
- CN
- China
- Prior art keywords
- training
- module
- patient
- visual
- somatosensory
- Prior art date
- Legal status (assumed, not a legal conclusion): Active
Classifications
- A61B5/4848—Monitoring or testing the effects of treatment, e.g. of medication
- A61B3/028—Subjective types, i.e. testing apparatus requiring the active assistance of the patient, for testing visual acuity; for determination of refraction, e.g. phoropters
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for determining or recording eye movement
- A61B5/1036—Measuring load distribution, e.g. podologic studies
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1116—Determining posture transitions
- A61B5/4023—Evaluating sense of balance
- A61H5/00—Exercisers for the eyes
- A61M21/00—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G16H20/30—ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
Abstract
The invention discloses an artificial-intelligence-based dynamic evaluation system for vestibular rehabilitation training, relating to the field of rehabilitation training, comprising: a user login module, through which a patient logs in to the training system and customizes personalized vestibular rehabilitation training; a training action input module for inputting training actions and standard action videos; and a condition evaluation module for evaluating the patient's condition according to the standard action videos stored in the training action input module. The system collects the patient's plantar pressure information through a plantar pressure sensor, so that the plantar force distribution can be analyzed while the patient performs a standard action, and the training effect can be evaluated indirectly by detecting that distribution. An inertial measurement device measures the patient's limb movement angle, as well as the perceived body angle during somatosensory training, so that the system can detect whether the patient performs the actions to standard during rehabilitation training.
Description
Technical Field
The invention relates to a rehabilitation training technology, in particular to a vestibular rehabilitation training dynamic evaluation system based on artificial intelligence.
Background
The vestibular system is an important component of the human balance system. It is equipped with specialized receptors that respond to appropriate stimulation and transmit the stimulus information via the vestibular nerves to the vestibular nuclei in the brain stem and to the cerebellum. After being integrated and processed together with other sensory information (such as visual and somatosensory information), this information is either transmitted through several neural pathways to higher centers in the brain for high-level processing, possibly forming subjective awareness, or transmitted through certain neural pathways to motor nuclei (such as the oculomotor nuclei and the anterior-horn motor nuclei of the spinal cord), thereby producing specific and nonspecific functional responses. The main symptoms after pathological changes of the vestibular system include dizziness, balance disorders and the like.
When existing rehabilitation training systems are used, a patient with vestibular dysfunction performs rehabilitation training with the help of therapists and rehabilitation instruments in the closed, flat indoor environment of a rehabilitation treatment room, which differs from the natural environment. The training mode is monotonous, so the patient easily becomes bored; moreover, the trajectory of the patient's movement cannot be recorded during treatment, the direction of the patient's movement during a deflecting motion cannot be judged, and targeted treatment therefore cannot be carried out.
Disclosure of Invention
The invention aims to provide a vestibular rehabilitation training dynamic evaluation system based on artificial intelligence, which aims to solve the defects in the prior art.
In order to achieve the above object, the present invention provides the following technical solutions: vestibular rehabilitation training dynamic evaluation system based on artificial intelligence includes:
the user login module, through which the patient logs in to the training system and customizes personalized vestibular rehabilitation training;
the training action input module is used for inputting training actions and standard action videos;
the system comprises a training action input module, a disease condition evaluation module, a disease detection module and a disease detection module, wherein the training action input module is used for inputting a standard action video to a patient;
the rehabilitation training plan making module is used for making a corresponding rehabilitation training plan according to the condition evaluation of the patient by the condition evaluation module, wherein the training plan comprises a visual training plan and a somatosensory training plan;
the rehabilitation training module, with which the patient performs rehabilitation training according to the rehabilitation training plan formulated by the rehabilitation training plan making module, wherein the rehabilitation training comprises visual training and somatosensory training;
the rehabilitation training evaluation module is used for evaluating training results of rehabilitation training of patients and sending the evaluation results to the rehabilitation training plan making module for updating the rehabilitation training plan.
Further, the user login module is used for patient registration and condition entry, and is further used for managing each patient using the system, wherein the management operations comprise viewing the patient list, adding and deleting patients, and viewing and modifying user attributes.
Further, the rehabilitation training planning module comprises:
the visual evaluation result identification module, which identifies and scores each item of the patient's visual evaluation results produced by the condition evaluation module;
the somatosensory evaluation result identification module, which identifies and scores each item of the patient's somatosensory evaluation results;
the training parameter input module, through which medical staff input the training duty-ratio parameters of visual training and somatosensory training, wherein the visual and somatosensory training parameters sum to 1;
the visual training plan making module, which multiplies each item identification score from the visual evaluation result identification module by the visual training duty-ratio parameter and sorts the results in ascending order;
and the somatosensory training plan making module, which multiplies each item identification score from the somatosensory evaluation result identification module by the somatosensory training duty-ratio parameter and sorts the results in ascending order.
Further, the specific working mode of the rehabilitation training planning module is as follows:
A1, the visual evaluation result identification module identifies and scores each item of the patient's visual evaluation results produced by the condition evaluation module;
A2, the somatosensory evaluation result identification module identifies and scores each item of the patient's somatosensory evaluation results;
A3, medical staff input the training duty-ratio parameters of visual training and somatosensory training through the training parameter input module;
A4, the visual training plan making module multiplies each item identification score from the visual evaluation result identification module by the visual training duty-ratio parameter and sorts the results in ascending order;
and A5, the somatosensory training plan making module multiplies each item identification score from the somatosensory evaluation result identification module by the somatosensory training duty-ratio parameter and sorts the results in ascending order.
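The A1-A5 flow above can be sketched as follows; the item names, scores and ratio values are illustrative assumptions, not data from the patent:

```python
def make_training_plan(visual_scores, somato_scores, visual_ratio, somato_ratio):
    """Weight each evaluation item's score by its training duty-ratio
    parameter and sort ascending, so the weakest items come first."""
    assert abs(visual_ratio + somato_ratio - 1.0) < 1e-9  # ratios must sum to 1
    visual_plan = sorted((score * visual_ratio, item)
                         for item, score in visual_scores.items())
    somato_plan = sorted((score * somato_ratio, item)
                         for item, score in somato_scores.items())
    return visual_plan, somato_plan

# Hypothetical evaluation items and scores for illustration only.
vp, sp = make_training_plan(
    {"gaze stabilization": 60, "smooth pursuit": 80},
    {"standing balance": 50, "gait": 90},
    visual_ratio=0.4, somato_ratio=0.6)
```

Sorting in ascending order puts the lowest weighted scores, i.e. the patient's weakest items, at the front of each plan.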
Further, the rehabilitation training module comprises:
a vision training module, the vision training module comprising:
the wearable VR module is used for the patient to perform visual training through VR;
the eyeball tracking module is used for tracking the eyeballs of a patient during vision training;
the somatosensory module is used for detecting the somatosensory of the patient when the patient performs visual training;
the somatosensory training module, which comprises:
the body type identification module is used for identifying the body type information of the patient;
the limb identification module is used for identifying limb information of the patient according to the patient body type information identified by the body type identification module;
the joint point identification module, which is used for identifying the patient's joint points according to the limb information identified by the limb identification module, and for adjusting the somatosensory training posture according to the joint point positions;
the plantar pressure sensor is used for collecting plantar pressure information of a patient;
and the inertia measurement device is used for measuring the limb movement angle of the patient and the somatosensory angle of the patient during somatosensory training.
Further, the specific method for calculating the somatosensory angle by the somatosensory training module comprises the following steps:
B1, the gravitational acceleration is expressed in the geographic coordinate system as g = (0, 0, 1)^T; after multiplying by the rotation matrix C_n^s that transforms from the geographic coordinate system to the sensor coordinate system, its representation g' = (g'_x, g'_y, g'_z) in the sensor coordinate system is obtained;
B2, the accelerometer measures the components a_x, a_y, a_z of the carrier's motion acceleration a at the current moment on the three axes of the sensor coordinate system, and the offset e of the gravitational acceleration measured at the current moment relative to the gravitational acceleration calculated at the previous moment;
B3, a proportional-integral controller added to the algorithm corrects this offset, so as to improve the stability of the system;
B4, the angular velocity ω is adjusted using the corrected offset;
B5, the quaternion is updated by solving its differential equation;
and B6, after the quaternion is updated, it is normalized and the attitude angle is calculated.
Further, the specific working mode of the rehabilitation training evaluation module is as follows:
c1, firstly, any group of human body joint point data is taken, and data information representing action characteristics is abstracted in a vector form;
c2, adding each standard rehabilitation training action into a standard posture vector database;
c3, detecting human body joint points of a certain action in actual training of a patient, and abstracting the human body joint points into vector data information;
and C4, similarity calculation is carried out between the joint point information of the identified posture and each standard posture in the standard posture vector database.
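The C1-C4 comparison can be sketched with cosine similarity between pose vectors; the patent does not name the similarity metric, so cosine similarity and the small posture database below are assumptions for illustration:

```python
import math

def cosine_similarity(a, b):
    """Similarity between an observed pose vector and a standard pose vector."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical standard-posture vector database (action name -> vector).
standard_db = {"arm raise": [1.0, 0.0, 0.5], "squat": [0.0, 1.0, 0.2]}

# A pose vector abstracted from the patient's detected joint points.
observed = [0.9, 0.1, 0.45]
best = max(standard_db, key=lambda k: cosine_similarity(observed, standard_db[k]))
```

An action can then be judged standard when its best similarity exceeds a chosen threshold.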
Compared with the prior art, the artificial-intelligence-based dynamic evaluation system for vestibular rehabilitation training provided by the invention collects the patient's plantar pressure information through a plantar pressure sensor, so that the plantar force distribution can be analyzed while the patient performs a standard action, and the training effect can be evaluated indirectly by detecting that distribution; an inertial measurement device measures the patient's limb movement angle and the perceived body angle during somatosensory training, so that the system can detect whether the patient performs the actions to standard during rehabilitation training.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the embodiments are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present invention; a person of ordinary skill in the art may obtain other drawings from them.
Fig. 1 is a schematic diagram of an overall structure according to an embodiment of the present invention.
Detailed Description
In order to make the technical scheme of the present invention better understood by those skilled in the art, the present invention will be further described in detail with reference to the accompanying drawings.
Referring to fig. 1, an artificial intelligence based dynamic vestibular rehabilitation training assessment system includes:
the user login module, through which the patient logs in to the training system and customizes personalized vestibular rehabilitation training;
the training action input module is used for inputting training actions and standard action videos;
the condition evaluation module, which is used for evaluating the patient's condition according to the standard action videos stored in the training action input module, wherein the evaluation comprises visual evaluation and somatosensory evaluation;
the rehabilitation training plan making module is used for making a corresponding rehabilitation training plan according to the condition evaluation of the patient by the condition evaluation module, wherein the training plan comprises a visual training plan and a somatosensory training plan;
the rehabilitation training module, with which the patient performs rehabilitation training according to the rehabilitation training plan formulated by the rehabilitation training plan making module, wherein the rehabilitation training comprises visual training and somatosensory training;
the rehabilitation training evaluation module is used for evaluating training results of rehabilitation training of patients and sending the evaluation results to the rehabilitation training plan making module for updating the rehabilitation training plan.
The system works as follows: the patient logs in to the training system through the user login module and customizes personalized vestibular rehabilitation training; training actions and standard action videos are input through the training action input module; the condition evaluation module evaluates the patient's condition according to the standard action videos stored in the training action input module; the rehabilitation training plan making module formulates a corresponding rehabilitation training plan according to that condition evaluation; the patient performs rehabilitation training through the rehabilitation training module according to the plan; and the rehabilitation training evaluation module evaluates the training results of the patient's rehabilitation training and sends the evaluation results to the rehabilitation training plan making module to update the rehabilitation training plan.
The user login module is used for patient registration and condition entry, and is further used for managing each patient using the system, wherein the management operations comprise viewing the patient list, adding and deleting patients, and viewing and modifying user attributes.
The rehabilitation training planning module comprises:
the visual evaluation result identification module, which identifies and scores each item of the patient's visual evaluation results produced by the condition evaluation module;
the somatosensory evaluation result identification module, which identifies and scores each item of the patient's somatosensory evaluation results;
the training parameter input module, through which medical staff input the training duty-ratio parameters of visual training and somatosensory training, wherein the visual and somatosensory training parameters sum to 1;
the visual training plan making module, which multiplies each item identification score from the visual evaluation result identification module by the visual training duty-ratio parameter and sorts the results in ascending order;
and the somatosensory training plan making module, which multiplies each item identification score from the somatosensory evaluation result identification module by the somatosensory training duty-ratio parameter and sorts the results in ascending order.
The specific working mode of the rehabilitation training plan making module is as follows:
A1, the visual evaluation result identification module identifies and scores each item of the patient's visual evaluation results produced by the condition evaluation module;
A2, the somatosensory evaluation result identification module identifies and scores each item of the patient's somatosensory evaluation results;
A3, medical staff input the training duty-ratio parameters of visual training and somatosensory training through the training parameter input module;
A4, the visual training plan making module multiplies each item identification score from the visual evaluation result identification module by the visual training duty-ratio parameter and sorts the results in ascending order;
and A5, the somatosensory training plan making module multiplies each item identification score from the somatosensory evaluation result identification module by the somatosensory training duty-ratio parameter and sorts the results in ascending order.
The rehabilitation training module comprises:
the vision training module, the vision training module includes:
the wearable VR module is used for the patient to perform visual training through VR;
the eyeball tracking module is used for tracking the eyeballs of a patient during vision training;
the somatosensory module is used for detecting the somatosensory of the patient when the patient performs visual training;
the somatosensory training module, which includes:
the body type identification module is used for identifying the body type information of the patient;
the limb identification module is used for identifying limb information of the patient according to the patient body type information identified by the body type identification module;
the joint point identification module, which is used for identifying the patient's joint points according to the limb information identified by the limb identification module, and for adjusting the somatosensory training posture according to the joint point positions;
the plantar pressure sensor is used for collecting plantar pressure information of a patient;
and the inertia measurement device is used for measuring the limb movement angle of the patient and the somatosensory angle of the patient during somatosensory training.
The rehabilitation training module works as follows. The patient performs visual training through the wearable VR module; during visual training the eyeball tracking module tracks the patient's eyeballs, so that whether the eyes are in the training state can be detected in real time, while the somatosensory module performs somatosensory detection on the patient to avoid accidents such as falls and slips. When the patient performs somatosensory training, the body type identification module identifies the patient's body type information, the limb identification module identifies the patient's limb information from that body type information, and the joint point identification module identifies the patient's joint points from the identified limb information and adjusts the somatosensory training posture according to the joint point positions, so that recognition errors do not occur during action recognition. Meanwhile, the plantar pressure sensor collects the patient's plantar pressure information, so that the plantar force distribution can be analyzed while the patient performs a standard action and the training effect can be evaluated indirectly by detecting that distribution, and the inertial measurement device measures the patient's limb movement angle and the perceived body angle during somatosensory training, so that whether the patient's actions are standard during rehabilitation training can be detected.
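One way to analyze the plantar force distribution described above is to compute the center of pressure over the sensor grid; the grid coordinates and force values below are illustrative assumptions, not taken from the patent:

```python
def center_of_pressure(readings):
    """readings: list of (x, y, force) tuples from a plantar pressure grid.
    Returns the force-weighted (x, y) center of pressure."""
    total = sum(f for _, _, f in readings)
    if total == 0:
        raise ValueError("no load detected")
    cx = sum(x * f for x, _, f in readings) / total
    cy = sum(y * f for _, y, f in readings) / total
    return cx, cy

# Hypothetical three-cell grid sample: most force over the cell at (0, 2).
cop = center_of_pressure([(0, 0, 10.0), (1, 0, 10.0), (0, 2, 20.0)])
```

A drift of the center of pressure away from its position during the standard action could then serve as the indirect training-effect indicator mentioned in the text.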
The specific method for calculating the somatosensory angle by the somatosensory training module comprises the following steps:
B1, the gravitational acceleration is expressed in the geographic coordinate system as g = (0, 0, 1)^T. After multiplying by the rotation matrix C_n^s that transforms from the geographic coordinate system to the sensor coordinate system, its representation g' = (g'_x, g'_y, g'_z) in the sensor coordinate system is obtained, where g'_x, g'_y and g'_z are the components of g on the X, Y and Z axes. Since C_s^n is obtained by rotating in turn about the X axis, the Y axis and the Z axis, the geographic coordinate system can be converted into the sensor coordinate system by rotating in turn about the Z axis, the Y axis and the X axis, and C_n^s = (C_s^n)^T. The expression of the gravitational acceleration in the sensor coordinate system is:
g' = C_n^s g,
where, because C_n^s is calculated with the quaternion of the previous integration period, g' represents the coordinates of the gravitational acceleration g in the sensor coordinate system at the previous moment;
b2, the accelerometer measures the components of the motion acceleration a of the carrier at the current moment on three axes of a sensor coordinate system, namely a respectively x 、a y 、a z The expression is as follows:
,
because ofAccording to the limit theorem, when the offset angle alpha is small enough, the sine sin father alpha of the offset angle= father alpha;
B3. A proportional-integral controller added to the algorithm corrects the deviation so as to improve the stability of the system, where $K_p$ represents the proportional gain, $K_i$ the integral gain, and $dt$ the sampling period; the corrected angle offset $e_c$ is:

$$
e_c = K_p\, e + K_i \int e \, dt;
$$
B4. The angular velocity $\omega$ is adjusted with the corrected offset, giving the corrected angular velocity $\omega_c$:

$$
\omega_c = \omega + e_c,
$$

where $\omega_{cx}$, $\omega_{cy}$, $\omega_{cz}$ are the components of the adjusted angular velocity on the X, Y and Z axes respectively;
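Steps B2 to B4 can be sketched as follows (not part of the patent text; the function names, list-based vectors and argument order are assumptions). The cross product of the measured acceleration and the estimated gravity gives the small-angle error, and a PI term feeds it back into the gyroscope reading:

```python
def cross(u, v):
    """Cross product u x v of two 3-vectors."""
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def correct_angular_velocity(omega, a, g_est, integral, kp, ki, dt):
    """Steps B2-B4: compute the gravity-direction error e = a x g',
    apply the PI correction e_c = Kp*e + Ki*integral(e dt), and add
    it to the measured angular velocity."""
    e = cross(a, g_est)                                        # B2
    integral = [i + ei * dt for i, ei in zip(integral, e)]
    e_c = [kp * ei + ki * ii for ei, ii in zip(e, integral)]   # B3
    omega_c = [w + ec for w, ec in zip(omega, e_c)]            # B4
    return omega_c, integral
```

When the measured acceleration is already aligned with the estimated gravity, the error is zero and the angular velocity passes through unchanged.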
B5. The quaternion is updated by solving the quaternion differential equation according to the first-order Runge-Kutta method. The quaternion differential equation is:

$$
\dot{q} = \frac{1}{2} R\, q,
$$

where $R$ represents the matrix form of the corrected angular velocity $\omega_c$ in quaternion space, namely:

$$
R = \begin{pmatrix}
0 & -\omega_{cx} & -\omega_{cy} & -\omega_{cz} \\
\omega_{cx} & 0 & \omega_{cz} & -\omega_{cy} \\
\omega_{cy} & -\omega_{cz} & 0 & \omega_{cx} \\
\omega_{cz} & \omega_{cy} & -\omega_{cx} & 0
\end{pmatrix};
$$
let $T$ be the sampling period; according to the first-order Runge-Kutta expansion, the updated quaternion is:

$$
q(t+T) = q(t) + \frac{T}{2} R\, q(t);
$$
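A minimal sketch of the step-B5 update (not part of the patent text; function name and scalar-first convention assumed). The body expands $q(t+T) = q(t) + \tfrac{T}{2} R\, q(t)$ component by component, matching the matrix $R$ above row for row:

```python
def update_quaternion(q, omega_c, T):
    """Step B5: one first-order Runge-Kutta (Euler) step of
    q_dot = 0.5 * R * q for a scalar-first quaternion q."""
    q0, q1, q2, q3 = q
    wx, wy, wz = omega_c
    return (
        q0 + 0.5 * T * (-wx * q1 - wy * q2 - wz * q3),
        q1 + 0.5 * T * ( wx * q0 + wz * q2 - wy * q3),
        q2 + 0.5 * T * ( wy * q0 - wz * q1 + wx * q3),
        q3 + 0.5 * T * ( wz * q0 + wy * q1 - wx * q2),
    )
```

With zero angular velocity the quaternion is unchanged, as expected from the differential equation.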
B6. After the quaternion is updated it is normalized and the attitude angle is calculated; the normalization formula is:

$$
q \leftarrow \frac{q}{\sqrt{q_0^2 + q_1^2 + q_2^2 + q_3^2}},
$$
and the quaternion is converted into the Euler angle representation:

$$
\varphi = \arctan\!\left(\frac{2(q_0 q_1 + q_2 q_3)}{1 - 2(q_1^2 + q_2^2)}\right),
$$

where $\varphi$ is the limb rotation attitude angle converted from the sensor coordinate system to the geographic coordinate system.
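Step B6 can be sketched as follows (not part of the patent text; function names and the scalar-first convention are assumptions, and the full roll/pitch/yaw conversion is shown even though the patent names only the angle $\varphi$, which corresponds to roll here):

```python
import math

def normalize(q):
    """Step B6: renormalize the quaternion after each update."""
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

def quaternion_to_euler(q):
    """Convert a unit scalar-first quaternion to (roll, pitch, yaw)
    in radians; roll is the patent's limb rotation angle phi."""
    q0, q1, q2, q3 = q
    roll = math.atan2(2 * (q0 * q1 + q2 * q3), 1 - 2 * (q1 * q1 + q2 * q2))
    # Clamp guards against tiny numerical overshoot outside [-1, 1].
    pitch = math.asin(max(-1.0, min(1.0, 2 * (q0 * q2 - q1 * q3))))
    yaw = math.atan2(2 * (q0 * q3 + q1 * q2), 1 - 2 * (q2 * q2 + q3 * q3))
    return roll, pitch, yaw
```

A quaternion representing a 90-degree rotation about X, $q = (\cos 45°, \sin 45°, 0, 0)$, yields a roll of approximately $\pi/2$.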
The rehabilitation training evaluation module specifically works in the following manner:
C1. First, any group of human body joint point data is taken, and the data information representing the action characteristics is abstracted in vector form;
C2. Each standard rehabilitation training action is added to a standard posture vector database, where a standard posture vector is denoted $b = (x_2, y_2)$;
C3. The human body joint points of a given action in the patient's actual training are then detected and abstracted into vector data information, where the actual human action vector is denoted $a = (x_1, y_1)$;
C4. The similarity between the joint point information of the recognized posture and each standard posture in the standard posture vector database is calculated from the cosine similarity of the two vectors:

$$
\cos\theta = \frac{a \cdot b}{|a|\,|b|} = \frac{x_1 x_2 + y_1 y_2}{\sqrt{x_1^2 + y_1^2}\,\sqrt{x_2^2 + y_2^2}},
$$
where the function value is 0 when the two postures are perpendicular; the more standard the action, the closer the function value is to 1.
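Steps C1 to C4 can be sketched as follows (not part of the patent text; the function name is an assumption). The score is the cosine of the angle between the recognized pose vector and a standard pose vector:

```python
import math

def pose_similarity(a, b):
    """Steps C1-C4: cosine similarity between the actual action
    vector a = (x1, y1) and a standard posture vector b = (x2, y2).
    Returns 0 for perpendicular poses, approaching 1 as the
    action nears the standard."""
    dot = a[0] * b[0] + a[1] * b[1]
    return dot / (math.hypot(*a) * math.hypot(*b))
```

Parallel vectors score (approximately) 1 regardless of magnitude, so the measure compares direction of movement rather than amplitude.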
Working principle: when the system is used, the patient logs into the training system through the user login module for personalized vestibular rehabilitation training customization. Training actions and standard action videos are input through the training action input module. The illness state evaluation module evaluates the patient's condition according to the standard action videos stored in the training action input module, and the rehabilitation training plan formulation module formulates a corresponding rehabilitation training plan according to that evaluation. The rehabilitation training module then carries out rehabilitation training according to the formulated plan, the rehabilitation training evaluation module evaluates the patient's training, and the evaluation result is sent back to the rehabilitation training plan formulation module to update the rehabilitation training plan.
While certain exemplary embodiments of the present invention have been described above by way of illustration only, it will be apparent to those of ordinary skill in the art that modifications may be made to the described embodiments in various different ways without departing from the spirit and scope of the invention. Accordingly, the drawings and description are to be regarded as illustrative in nature and not as restrictive of the scope of the invention, which is defined by the appended claims.
Claims (2)
1. Vestibular rehabilitation training dynamic evaluation system based on artificial intelligence, which is characterized by comprising:
the user login module is used for registering patients and inputting illness states, and is also used for managing each patient using the system, wherein management operation comprises checking, adding and deleting lists of the patients, checking and modifying attributes of the users, and the user login module is used for personalized vestibular rehabilitation training customization of the patient login training system;
the training action input module is used for inputting training actions and standard action videos;
the system comprises a training action input module, a disease condition evaluation module, a disease detection module and a disease detection module, wherein the training action input module is used for inputting a standard action video to a patient;
the rehabilitation training plan making module is used for making a corresponding rehabilitation training plan according to the condition evaluation of the patient by the condition evaluation module, the training plan comprises a visual training plan and a somatosensory training plan, and the rehabilitation training plan making module comprises a visual evaluation result identification module, a somatosensory evaluation result identification module, a training parameter input module, a visual training plan making module and a somatosensory training plan making module:
the visual evaluation result identification module is used for carrying out identification scoring on each item of the visual evaluation results of the patient produced by the illness state evaluation module;
the somatosensory evaluation result identification module is used for carrying out identification scoring on each item of the somatosensory evaluation results of the patient produced by the illness state evaluation module;
the training parameter input module is used for inputting training duty ratio parameters of visual training and somatosensory training by medical staff, wherein the sum of the visual training parameters and the somatosensory training parameters is 1;
the visual training plan making module is used for multiplying the identification scores of each item of the visual evaluation result identification module with the visual training duty ratio parameters and arranging the visual evaluation result identification scores from small to large;
the somatosensory training plan making module is used for multiplying each item identification score of the somatosensory evaluation result identification module with the somatosensory training duty ratio parameter and arranging the items from small to large;
the rehabilitation training module is used for performing rehabilitation training by a patient according to a rehabilitation training plan formulated by the rehabilitation training plan formulation module, the rehabilitation training comprises visual training and somatosensory training, the rehabilitation training module comprises a visual training module, a wearable VR module, an eyeball tracking module, a somatosensory module and a somatosensory training module, the wearable VR module is used for performing visual training by the patient through VR, the eyeball tracking module is used for performing eyeball tracking by the patient during visual training, and the somatosensory module is used for performing somatosensory detection when the patient performs visual training;
the body feeling training module comprises a body type identification module, a limb identification module, a joint point identification module, a plantar pressure sensor and an inertia measurement device, wherein the body type identification module is used for identifying body type information of a patient, the limb identification module is used for identifying body information of the patient according to the body type information of the patient identified by the body type identification module, the joint point identification module is used for identifying joint points of the patient according to the body information of the user identified by the limb identification module and adjusting the body feeling training posture according to the joint point position, the plantar pressure sensor is used for acquiring plantar pressure information of the patient, and the inertia measurement device is used for measuring body feeling angles of the patient when the patient performs body feeling training;
the rehabilitation training evaluation module is used for evaluating training results of rehabilitation training of patients and sending the evaluation results to the rehabilitation training plan making module for updating a rehabilitation training plan, and the rehabilitation training evaluation module specifically works in the following manner:
C1. First, any group of human body joint point data is taken, and the data information representing the action characteristics is abstracted in vector form;
C2. Each standard rehabilitation training action is added to a standard posture vector database;
C3. The human body joint points of a given action in the patient's actual training are detected and abstracted into vector data information;
C4. The similarity between the joint point information of the recognized posture and each standard posture in the standard posture vector database is calculated from the cosine similarity of the two vectors:

$$
\cos\theta = \frac{a \cdot b}{|a|\,|b|},
$$

where $a$ is the actual action vector and $b$ a standard posture vector,
and where the function value is 0 when the two postures are perpendicular; the more standard the action, the closer the function value is to 1.
2. The vestibular rehabilitation training dynamic evaluation system based on artificial intelligence according to claim 1, wherein the specific working mode of the rehabilitation training planning module is as follows:
A1. The visual evaluation result identification module carries out identification scoring on each item of the visual evaluation results of the patient produced by the illness state evaluation module;
A2. The somatosensory evaluation result identification module carries out identification scoring on each item of the somatosensory evaluation results of the patient;
A3. Medical staff input the training duty ratio parameters of visual training and somatosensory training through the training parameter input module;
A4. The visual training plan making module multiplies each item identification score of the visual evaluation result identification module by the visual training duty ratio parameter and arranges the results from small to large;
A5. The somatosensory training plan making module multiplies each item identification score of the somatosensory evaluation result identification module by the somatosensory training duty ratio parameter and arranges the results from small to large.
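The plan-making steps A1 to A5 can be sketched as follows (not part of the patent text; the function name and the example item names are hypothetical). Each evaluation item's score is weighted by the training duty ratio parameter and the items are sorted ascending, so the weakest items come first in the plan:

```python
def rank_training_items(scores, ratio):
    """Steps A4/A5 (a sketch): multiply each evaluation item's
    identification score by the training duty ratio parameter and
    arrange the weighted items from small to large."""
    weighted = {item: s * ratio for item, s in scores.items()}
    return sorted(weighted.items(), key=lambda kv: kv[1])
```

For example, with a visual duty ratio of 0.5, an item scored 60 is prioritized ahead of one scored 80.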
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311155294.0A CN116869490B (en) | 2023-09-08 | 2023-09-08 | Vestibule rehabilitation training dynamic evaluation system based on artificial intelligence |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116869490A CN116869490A (en) | 2023-10-13 |
CN116869490B true CN116869490B (en) | 2024-01-09 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||