CN111048200A - System, method and terminal for assessing stereotypy behavior of autistic patient - Google Patents

System, method and terminal for assessing stereotypy behavior of autistic patient

Info

Publication number
CN111048200A
CN111048200A (application CN201911169004.1A)
Authority
CN
China
Prior art keywords
virtual reality
tested person
body posture
stereotypy
subsystem
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911169004.1A
Other languages
Chinese (zh)
Inventor
翟广涛
段慧煜
范磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiaotong University filed Critical Shanghai Jiaotong University
Priority to CN201911169004.1A priority Critical patent/CN111048200A/en
Publication of CN111048200A publication Critical patent/CN111048200A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G16 — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H — HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 — ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/20 — ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics, for computer-aided diagnosis, e.g. based on medical expert systems
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 — Manipulating 3D models or images for computer graphics
    • G06T 19/006 — Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Epidemiology (AREA)
  • Computer Hardware Design (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Computer Graphics (AREA)
  • Pathology (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The invention provides a stereotypy behavior evaluation system, method and terminal for autistic patients. The system comprises a virtual reality subsystem, a body posture tracking subsystem and a comprehensive processing subsystem. The virtual reality subsystem presents a virtual reality scene to the tested person and allows the tested person to interact with the scene; the body posture tracking subsystem acquires the spatial position information and body posture information of the tested person and transmits them to the comprehensive processing subsystem for processing and analysis; and the comprehensive processing subsystem processes the posture data to obtain a comprehensive auxiliary diagnosis result. The invention yields a comprehensive stereotypy behavior evaluation result, provides a quantitative evaluation index of autistic stereotypy behavior, and thereby aids the auxiliary diagnosis of autism.

Description

System, method and terminal for assessing stereotypy behavior of autistic patient
Technical Field
The invention relates to stereotypy behavior evaluation for autistic patients, and in particular to a virtual reality-based stereotypy behavior evaluation system, method and terminal for autistic patients.
Background
Autism Spectrum Disorder (ASD) is classified as a neurodevelopmental disorder. Individuals with autism spectrum disorder typically present two types of symptoms: deficits in social communication or social interaction, and restricted or repetitive patterns of behavior, interests, or activities. Autistic patients experience long-lasting difficulties, including difficulty creating and maintaining social relationships as well as difficulty holding a job and performing routine tasks.
Existing methods for diagnosing autism involve complicated procedures and long observation periods, and require professionally qualified physicians to observe the patient before a conclusion can be reached. The prevalence of autism is currently high, yet the number of physicians qualified to diagnose it is small; a system that can help physicians diagnose autism quickly is therefore of great value to society.
Through retrieval, the patent application with application number 201810670529.2 discloses an autism auxiliary diagnosis method based on a neural network, which introduces a three-layer BP neural network as a classifier into the autism analysis and diagnosis process. The BP neural network is used to analyze the observed person's eye-gaze behavior. The method reduces the professional requirements on the personnel drawing the diagnostic conclusion and is efficient.
The invention patent with Chinese patent application No. 201380015493.X discloses an autism diagnosis assistance system and apparatus, which include a camera unit that images the subject's eyeball, or an electrode unit attached to the subject's head that detects eyeball movement, together with a line-of-sight detection unit that detects the subject's gaze on a display unit, in order to assist in diagnosing the subject's autism.
The above patents focus mainly on analyzing the eye movements of autistic patients; they cannot assess autistic stereotypy behavior as a whole and therefore cannot be used accurately for auxiliary diagnosis.
Disclosure of Invention
The invention aims to provide a virtual reality-based system, method and terminal for assessing the stereotypy behavior of autistic patients, which assist a doctor in diagnosing whether a tested person exhibits stereotypy behavior.
According to a first aspect of the present invention, there is provided an autistic patient stereotypy behavior assessment system, comprising:
the virtual reality subsystem is used for presenting a virtual reality scene to the tested person and allowing the tested person to carry out interactive activities with the virtual reality scene;
the body posture tracking subsystem is used for acquiring the space position information and the body posture information of the tested person;
and the comprehensive processing subsystem is used for receiving and processing the spatial position information and the body posture information of the tested person acquired by the body posture tracking subsystem to obtain a stereotypy behavior evaluation result.
Optionally, the virtual reality subsystem comprises:
the head-mounted virtual reality display module is used for presenting a virtual reality scene to the tested person;
and the hand-held control module enables the tested person to interact with the virtual reality scene by touching objects or pressing keys in the scene.
Optionally, the body posture tracking subsystem comprises:
the whole body posture tracking module is used for shooting the tested person with a depth camera or an ordinary monocular camera, computing the depth image or motion image of the tested person, and extracting the tested person's human skeleton from the image with a deep neural network model;
the fine body posture tracking module comprises an accelerometer sensor and a gyroscope sensor and is used for tracking the overall motion trajectory of the tested person's hand and the displacement and acceleration of key hand joints;
and the position tracking module is used for acquiring the spatial position and angle information of the head-mounted virtual reality display module and the hand-held control module.
Optionally, the comprehensive processing subsystem takes as input the tested person's real-time human skeleton, the overall hand motion trajectory, the displacement and acceleration of key hand joints, and the spatial position and angle information of the head-mounted virtual reality display module and the hand-held control module, as acquired by the whole body posture tracking module, the fine body posture tracking module and the position tracking module; it extracts features with a deep neural network model and feeds back, in the form of a probability, a judgment of the likelihood that the tested person exhibits stereotypy behavior.
Optionally, the training process of the deep neural network includes:
training two sub-classification models for the autism group and the standard development group according to the motion information acquired by the whole body posture tracking module, the fine body posture tracking module and the position tracking module, and performing weighted integration of the trained sub-classification models in a neural network ensemble manner to obtain a final classification model.
Optionally, the likelihood of stereotypy behavior of the tested person is judged according to the final classification model.
Optionally, the judgment result is expressed as a probability that the tested person is at risk of stereotypy behavior.
According to a second aspect of the present invention, there is provided an autism patient stereotypy behavior assessment method, applied to the above assessment system, including:
inputting the human motion information of the tested person into the trained deep neural network model, the motion information comprising the tested person's real-time human skeleton, the overall hand motion trajectory, the displacement and acceleration of key hand joints, and the spatial position and angle information of the head-mounted virtual reality display module and the hand-held control module;
the deep neural network extracts human motion features and returns, in the form of a probability, a judgment of the likelihood that the tested person exhibits stereotypy behavior.
Optionally, the training process of the deep neural network includes:
training two sub-classification models for the autism group and the standard development group according to the motion information, and performing weighted integration of the trained sub-classification models in a neural network ensemble manner to obtain a final classification model;
and judging the likelihood of stereotypy behavior of the tested person according to the final classification model, the judgment result being expressed as a probability.
According to a third aspect of the present invention, there is provided an autism patient stereotypy behavior assessment terminal, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, is configured to perform the above-mentioned virtual reality-based autism patient stereotypy behavior assessment method.
Compared with the prior art, the invention has at least one of the following beneficial effects:
the autism board behavior evaluation system based on virtual reality comprises a virtual reality module for displaying a virtual reality picture and performing interaction; the body posture tracking module is used for acquiring the behavior data of the testee; the invention can rapidly collect the body posture behavior data of the tested person, extract the data characteristics by using a deep learning method, comprehensively analyze and compare to obtain the final evaluation result, and assist a doctor to diagnose whether the tested person has stereotypy behavior.
Compared with the prior art, the method focuses on evaluating autistic stereotypy behavior in order to realize auxiliary diagnosis of autism: it presents a controllable, simulated visual environment through the virtual reality module and analyzes body posture and movement data with the comprehensive processing module based on a deep neural network, achieving high speed and high accuracy.
Drawings
FIG. 1 is a block diagram of a stereotypical behavior assessment system for autistic patients in accordance with a preferred embodiment of the present invention;
FIG. 2 is a flowchart of a method for assessing stereotypical behavior of an autistic patient in accordance with a preferred embodiment of the present invention;
in the figure: the system comprises a virtual reality subsystem 1, a head-mounted virtual reality display module 11, a hand-held control module 12, a body posture tracking subsystem 2, an integral body posture tracking module 21, a fine body posture tracking module 22, a position tracking module 23 and a comprehensive processing subsystem 3.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the invention, but are not intended to limit the invention in any way. It should be noted that variations and modifications can be made by persons skilled in the art without departing from the spirit of the invention; all such variations and modifications fall within the scope of the present invention. Parts not described in detail below can be implemented using prior art techniques.
Fig. 1 is a block diagram of a stereotypical behavior evaluation system for autistic patients according to a preferred embodiment of the present invention.
Referring to fig. 1, the stereotypy behavior evaluation system for autistic patients in this embodiment comprises: a virtual reality subsystem 1, a body posture tracking subsystem 2 and a comprehensive processing subsystem 3. The virtual reality subsystem 1 presents a virtual reality scene to the tested person and allows the tested person to interact with the scene, for example by clicking an object in the virtual reality scene through the hand-held control module 12. The body posture tracking subsystem 2 collects the spatial position information and body posture information of the tested person. The comprehensive processing subsystem 3 receives and processes the spatial position information and body posture information acquired by the body posture tracking subsystem 2 to obtain a stereotypy behavior evaluation result, which can be used to comprehensively assist a doctor in diagnosing autism.
In a preferred embodiment, the virtual reality subsystem 1 comprises a head-mounted virtual reality display module 11 and a hand-held control module 12. The head-mounted virtual reality display module 11, connected to a main controller in a wired or wireless manner or operating without a main controller, presents the virtual reality scene to the tested person; the hand-held control module 12, which may be a handle or similar device, enables the tested person to interact with the virtual reality scene by touching objects or pressing keys in the scene.
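The patent does not prescribe a specific VR runtime. As a neutral illustration of how controller interactions from module 12 might be logged for later analysis by the comprehensive processing subsystem, the sketch below records timestamped events with a plain Python dataclass; the event names and fields are hypothetical assumptions, not part of the patent.

```python
from dataclasses import dataclass, field
from time import time
from typing import List, Tuple

@dataclass
class InteractionEvent:
    timestamp: float
    kind: str                                   # e.g. "button_press" or "object_touch"
    controller_position: Tuple[float, float, float]

@dataclass
class InteractionLog:
    events: List[InteractionEvent] = field(default_factory=list)

    def record(self, kind: str, position: Tuple[float, float, float]) -> None:
        # Store each interaction with the time it occurred.
        self.events.append(InteractionEvent(time(), kind, position))

# Would be called from the input callback of whatever VR SDK is used.
log = InteractionLog()
log.record("button_press", (0.12, 1.05, -0.30))
log.record("object_touch", (0.15, 1.02, -0.28))
print(len(log.events), "interaction events captured for later analysis")
```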
Referring to fig. 1, on the basis of any of the above embodiments, the body posture tracking subsystem 2 comprises: a whole body posture tracking module 21, a fine body posture tracking module 22 and a position tracking module 23.
The whole body posture tracking module 21 shoots the tested person with a depth camera, computes the depth image of the tested person, and extracts the tested person's human skeleton from the depth image with a deep neural network model; alternatively, it may shoot the tested person with an ordinary monocular camera, compute the captured motion images, and extract the human skeleton from those images with a deep neural network model. The deep neural network model here can be implemented with existing techniques.
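The patent leaves the skeleton-extraction network to existing techniques. The sketch below is a minimal example of one possible off-the-shelf implementation for the monocular-camera case, assuming MediaPipe Pose; the library choice and parameter values are illustrative assumptions rather than part of the patent.

```python
# Illustrative sketch only: extract a per-frame human skeleton from monocular
# video with MediaPipe Pose, one possible "existing technique" for module 21.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

def extract_skeletons(video_path):
    """Return one skeleton per frame, each a list of (x, y, z, visibility) landmarks."""
    skeletons = []
    cap = cv2.VideoCapture(video_path)
    with mp_pose.Pose(static_image_mode=False, model_complexity=1) as pose:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB images; OpenCV reads BGR.
            result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.pose_landmarks:
                skeletons.append([(lm.x, lm.y, lm.z, lm.visibility)
                                  for lm in result.pose_landmarks.landmark])
            else:
                skeletons.append(None)  # no person detected in this frame
    cap.release()
    return skeletons
```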
The fine body posture tracking module 22 tracks the overall motion trajectory of the hand and the displacement and acceleration of key hand joints; the position tracking module 23 acquires the spatial position and angle information of the head-mounted virtual reality display module 11 and the hand-held control module 12.
Specifically, in some embodiments, the fine body posture tracking module 22 may include a sensor-equipped bracelet, a somatosensory glove, and the like; the sensors include an accelerometer and a gyroscope. The somatosensory glove collects the position, motion and gesture information of the user, and the bracelet collects position and motion information. In addition, a positioning module is arranged in the head-mounted virtual reality display module to obtain three-dimensional position information and its correspondence with time, from which head motion information is derived.
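As a rough illustration of how displacement could be recovered from accelerometer samples in module 22, the sketch below double-integrates acceleration over time with NumPy. Real wristband or glove SDKs, gravity removal, gyroscope fusion and drift correction are outside this example; the function name and sample values are hypothetical.

```python
import numpy as np

def displacement_from_acceleration(accel, dt):
    """Double-integrate acceleration samples (N x 3, m/s^2) taken every dt seconds.

    Returns per-sample velocity and displacement. A real implementation would
    also remove gravity, fuse gyroscope data and correct integration drift.
    """
    accel = np.asarray(accel, dtype=float)
    velocity = np.cumsum(accel, axis=0) * dt         # v(t) = integral of a(t)
    displacement = np.cumsum(velocity, axis=0) * dt  # x(t) = integral of v(t)
    return velocity, displacement

# Example: a hand accelerating along x at 0.5 m/s^2 for one second at 100 Hz.
accel = np.tile([0.5, 0.0, 0.0], (100, 1))
vel, disp = displacement_from_acceleration(accel, dt=0.01)
print(disp[-1])  # roughly [0.25, 0, 0] metres, consistent with x = 0.5 * a * t^2
```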
On the basis of any of the above embodiments, the comprehensive processing subsystem 3 takes as input the tested person's real-time human skeleton, the overall hand motion trajectory, the displacement and acceleration of key hand joints, and the spatial position and angle information of the head-mounted virtual reality display module 11 and the hand-held control module 12, as acquired by the whole body posture tracking module 21, the fine body posture tracking module 22 and the position tracking module 23; it extracts features with a deep neural network model and feeds back, in the form of a probability, a judgment of the likelihood that the tested person exhibits stereotypy behavior. The deep neural network model in this embodiment is implemented using prior art techniques.
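The patent does not fix a network architecture. The sketch below is a minimal PyTorch stand-in for the comprehensive processing subsystem: it concatenates skeleton, hand-trajectory and headset/controller pose features and emits a probability of stereotypy behavior. The class name, layer sizes and feature dimensions are assumptions made only for illustration.

```python
import torch
import torch.nn as nn

class StereotypyClassifier(nn.Module):
    """Toy stand-in for the comprehensive processing subsystem's network."""
    def __init__(self, skeleton_dim=75, hand_dim=24, pose_dim=12, hidden=128):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(skeleton_dim + hand_dim + pose_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
        )
        self.head = nn.Linear(hidden, 1)  # logit for "stereotypy behavior present"

    def forward(self, skeleton, hand, pose):
        # Concatenate the three feature groups collected by modules 21-23.
        x = torch.cat([skeleton, hand, pose], dim=-1)
        features = self.backbone(x)
        return torch.sigmoid(self.head(features)).squeeze(-1)  # probability in [0, 1]

# One test session, flattened to fixed-length feature vectors (random data here).
model = StereotypyClassifier()
prob = model(torch.randn(1, 75), torch.randn(1, 24), torch.randn(1, 12))
print(f"estimated probability of stereotypy behavior: {prob.item():.2f}")
```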
Fig. 2 is a flowchart of a stereotypy behavior assessment method for an autistic patient according to an embodiment of the invention.
As shown in fig. 2, the virtual reality-based stereotypy behavior assessment method for autism patients according to one embodiment of the present invention includes the following steps:
and S1, inputting the motion information of the human body into the trained deep neural network model.
Further, the human motion information that is input comprises: the tested person's real-time human skeleton, the overall hand motion trajectory, and the displacement and acceleration of key hand joints, obtained from the whole body posture tracking module, the fine body posture tracking module and the position tracking module, together with the spatial position and angle information obtained from the head-mounted virtual reality display module and the hand-held control module.
Further, the training process of the deep neural network comprises the following steps: training two sub-classification models for the autism group and the standard development group according to the motion information obtained by the whole body posture tracking module, the fine body posture tracking module and the position tracking module, and then performing weighted integration of the trained sub-classification models in a neural network ensemble manner, where the integration may use linear classification, SVM classification or neural network classification, to obtain the final classification model. In this embodiment, the weighted integration of the trained sub-classification models may adopt prior art techniques: for example, two sub-deep neural networks analyze and process the acquired data, and the outputs of the two sub-networks are then combined by linear classification, SVM classification, neural network classification or the like to obtain the final result.
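A minimal sketch of the ensemble idea with scikit-learn follows, assuming pre-extracted motion feature vectors and binary labels (autism group vs. standard development group). Here two sub-classifiers are trained on the same motion features and a stacking combiner plays the role of the integration step; this is one possible interpretation, and the data shapes and estimator choices are placeholders rather than the patent's method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.ensemble import StackingClassifier

# Placeholder data: 200 sessions x 111 motion features, label 1 = autism group.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(200, 111)), rng.integers(0, 2, size=200)

# Two sub-classifiers trained on the same motion features, then combined.
sub_models = [
    ("mlp", MLPClassifier(hidden_layer_sizes=(64,), max_iter=500)),
    ("svm", SVC(probability=True)),
]
# The final estimator weights and integrates the sub-models' outputs; the patent
# also allows a plain linear, SVM or neural-network combiner at this stage.
ensemble = StackingClassifier(estimators=sub_models,
                              final_estimator=LogisticRegression())
ensemble.fit(X, y)
print(ensemble.predict_proba(X[:1]))  # columns: [P(standard development), P(autism group)]
```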
And S2, the deep neural network extracts the motion features of the human body and returns, in the form of a probability, a judgment of the likelihood that the tested person exhibits stereotypy behavior.
Further, using the final classification model obtained from the training process, the output produced for the tested person's behavior is compared, in terms of distance (the difference between output results), with the outputs of the autism group and the standard development group, and the likelihood that the tested person exhibits stereotypy behavior is judged. The judgment result is expressed as the probability that the tested person has stereotypy behavior, and the doctor can combine this probability with other observed data to assist in diagnosing autism.
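One simple way to turn the two group distances described above into a probability is a softmax over negative distances, sketched below. This is only an assumed realization of the comparison step; the function name, the use of group mean outputs as reference points, and the example vectors are all hypothetical.

```python
import numpy as np

def stereotypy_probability(subject_output, autism_center, typical_center):
    """Convert distances to the two group reference outputs into a probability.

    subject_output: model output vector for the tested person.
    autism_center / typical_center: mean model output of each reference group,
    estimated from training data.
    """
    d_autism = np.linalg.norm(subject_output - autism_center)
    d_typical = np.linalg.norm(subject_output - typical_center)
    # Softmax over negative distances: the closer the subject's output is to the
    # autism-group reference, the higher the probability of stereotypy behavior.
    logits = np.array([-d_autism, -d_typical])
    p_autism, _ = np.exp(logits) / np.exp(logits).sum()
    return p_autism

print(stereotypy_probability(np.array([0.8, 0.2]),
                             autism_center=np.array([0.9, 0.1]),
                             typical_center=np.array([0.2, 0.8])))
```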
In summary, the invention can rapidly collect the body posture and behavior data of the tested person, extract data features with a deep learning method, and comprehensively analyze and compare them to obtain a final evaluation result that assists the doctor in diagnosing whether the tested person exhibits stereotypy behavior.
In addition, in combination with the above system and method, an embodiment of the present invention further provides a virtual reality-based autism patient stereotypy behavior assessment terminal, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the program, may be configured to execute the above virtual reality-based autism patient stereotypy behavior assessment method.
Of course, in another embodiment, a computer readable storage medium is also provided, on which a computer program is stored, which when executed by a processor is operable to perform the above-mentioned virtual reality based autism patient stereotypy behavior assessment method.
Optionally, the memory is used for storing programs. The memory may include volatile memory, such as random access memory (e.g., static random access memory or double data rate synchronous dynamic random access memory), and may also include non-volatile memory, such as flash memory. The memory is used to store computer programs (e.g., applications or functional modules that implement the above-described methods), computer instructions and data, which may be stored in one or more memories in a partitioned manner and may be invoked by a processor.
The processor is used for executing the computer program stored in the memory to implement the steps of the method according to the above embodiments; reference may be made to the description of the preceding method embodiment for details.
The processor and the memory may be separate structures or may be an integrated structure integrated together. When the processor and the memory are separate structures, the memory, the processor may be coupled by a bus.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes and modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention.

Claims (10)

1. An autistic patient stereotypy behavior assessment system, comprising:
the virtual reality subsystem is used for presenting a virtual reality scene to the tested person and allowing the tested person to carry out interactive activities with the virtual reality scene;
the body posture tracking subsystem is used for acquiring the space position information and the body posture information of the tested person;
and the comprehensive processing subsystem is used for receiving and processing the spatial position information and the body posture information of the tested person acquired by the body posture tracking subsystem to obtain a stereotypy behavior evaluation result.
2. The autistic patient stereotypy behavior assessment system according to claim 1, wherein the virtual reality subsystem comprises:
the head-mounted virtual reality display module is used for presenting a virtual reality scene to the tested person;
and the hand-held control module enables the tested person to interact with the virtual reality scene by touching objects or pressing keys in the scene.
3. The autistic patient stereotypy behavior assessment system according to claim 1, wherein the body posture tracking subsystem comprises:
the whole body posture tracking module is used for shooting the tested person with a depth camera or an ordinary monocular camera, computing the depth image or motion image of the tested person, and extracting the tested person's human skeleton from the image with a deep neural network model;
the fine body posture tracking module comprises an accelerometer sensor and a gyroscope sensor and is used for tracking the overall motion trajectory of the tested person's hand and the displacement and acceleration of key hand joints;
and the position tracking module is used for acquiring the spatial position and angle information of the head-mounted virtual reality display module and the hand-held control module.
4. The autistic patient stereotypy behavior assessment system according to claim 3, wherein the comprehensive processing subsystem takes as input the tested person's real-time human skeleton, the overall hand motion trajectory, the displacement and acceleration of key hand joints, and the spatial position and angle information of the head-mounted virtual reality display module and the hand-held control module, as acquired by the whole body posture tracking module, the fine body posture tracking module and the position tracking module; it extracts features with a deep neural network model and feeds back, in the form of a probability, a judgment of the likelihood that the tested person exhibits stereotypy behavior.
5. The autistic patient stereotypy behavior assessment system according to claim 4, wherein the training process of the deep neural network comprises:
training two sub-classification models for the autism group and the standard development group according to the motion information acquired by the whole body posture tracking module, the fine body posture tracking module and the position tracking module, and then performing weighted integration of the trained sub-classification models in a neural network ensemble manner to obtain a final classification model.
6. The autistic patient stereotypy behavior assessment system according to claim 5, wherein the likelihood of stereotypy behavior of the tested person is judged according to the final classification model to obtain a judgment result.
7. The autistic patient stereotypy behavior assessment system according to claim 6, wherein the judgment result is expressed as a probability that the tested person is at risk of stereotypy behavior.
8. A stereotypy behavior assessment method for an autistic patient, comprising:
inputting the human motion information of the tested person into the trained deep neural network model, the motion information comprising the tested person's real-time human skeleton, the overall hand motion trajectory, the displacement and acceleration of key hand joints, and the spatial position and angle information of the head-mounted virtual reality display module and the hand-held control module;
the deep neural network extracts human motion features and returns, in the form of a probability, a judgment of the likelihood that the tested person exhibits stereotypy behavior.
9. The autistic patient stereotypy behavior assessment method according to claim 8, wherein the training process of the deep neural network comprises:
training two sub-classification models for the autism group and the standard development group according to the motion information, and performing weighted integration of the trained sub-classification models in a neural network ensemble manner to obtain a final classification model;
and judging the likelihood of stereotypy behavior of the tested person according to the final classification model, the judgment result being expressed as a probability.
10. An autism patient stereotypy behavior assessment terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the program is operable to perform the virtual reality based autism patient stereotypy behavior assessment method of claim 8 or 9.
CN201911169004.1A 2019-11-25 2019-11-25 System, method and terminal for assessing stereotypy behavior of autistic patient Pending CN111048200A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911169004.1A CN111048200A (en) 2019-11-25 2019-11-25 System, method and terminal for assessing stereotypy behavior of autistic patient

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911169004.1A CN111048200A (en) 2019-11-25 2019-11-25 System, method and terminal for assessing stereotypy behavior of autistic patient

Publications (1)

Publication Number Publication Date
CN111048200A (en) 2020-04-21

Family

ID=70233367

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911169004.1A Pending CN111048200A (en) 2019-11-25 2019-11-25 System, method and terminal for assessing stereotypy behavior of autistic patient

Country Status (1)

Country Link
CN (1) CN111048200A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111883252A (en) * 2020-07-29 2020-11-03 济南浪潮高新科技投资发展有限公司 Auxiliary diagnosis method, device, equipment and storage medium for infantile autism

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105980856A (en) * 2013-09-05 2016-09-28 Fio公司 Biomarkers For Early Determination Of A Critical Or Life Threatening Response To Illness And/Or Treatment Response
CN106355010A (en) * 2016-08-30 2017-01-25 深圳市臻络科技有限公司 Self-service cognition evaluation apparatus and method
CN107529648A (en) * 2017-08-11 2018-01-02 太原理工大学 The schizophrenia sorting technique of task based access control conversion testing behaviouristics data
CN107847464A (en) * 2015-07-01 2018-03-27 杜克大学 Diagnosis and the method for the treatment of acute respiratory infections
CN109509552A (en) * 2018-12-05 2019-03-22 中南大学 A kind of mental disease automatic distinguishing method of the multi-level features fusion based on function connects network
CN110197724A (en) * 2019-03-12 2019-09-03 平安科技(深圳)有限公司 Predict the method, apparatus and computer equipment in diabetes illness stage
CN110349674A (en) * 2019-07-05 2019-10-18 昆山杜克大学 Autism-spectrum obstacle based on improper activity observation and analysis assesses apparatus and system
CN110459327A (en) * 2019-08-07 2019-11-15 上海市精神卫生中心(上海市心理咨询培训中心) A kind of self-closing disease stereotypic behavior analysis method and system based on deep learning

Similar Documents

Publication Publication Date Title
Kasprowski et al. Guidelines for the eye tracker calibration using points of regard
Lv et al. Evaluation of Kinect2 based balance measurement
CN111887867A (en) Method and system for analyzing character formation based on expression recognition and psychological test
US10849532B1 (en) Computer-vision-based clinical assessment of upper extremity function
US20140307927A1 (en) Tracking program and method
CN116507276A (en) Method and apparatus for machine learning to analyze musculoskeletal rehabilitation from images
Surer et al. Methods and technologies for gait analysis
Rechy-Ramirez et al. Impact of commercial sensors in human computer interaction: a review
WO2022060432A1 (en) Activity assistance system
CN113974589B (en) Multi-modal behavior paradigm evaluation optimization system and cognitive ability evaluation method
Metsis et al. Computer aided rehabilitation for patients with rheumatoid arthritis
Sharma et al. Real-time recognition of yoga poses using computer vision for smart health care
US20210287014A1 (en) Activity assistance system
CN111048200A (en) System, method and terminal for assessing stereotypy behavior of autistic patient
US20230367398A1 (en) Leveraging machine learning and fractal analysis for classifying motion
TW201445493A (en) A self-care system for assisting quantitative assessment of rehabilitation movement
CN112741620A (en) Cervical spondylosis evaluation device based on limb movement
Kastaniotis et al. Using kinect for assesing the state of Multiple Sclerosis patients
Iskander et al. A k-nn classification based vr user verification using eye movement and ocular biomechanics
KR20140132864A (en) easy measuring meathods for physical and psysiological changes on the face and the body using users created contents and the service model for healing and wellness using these techinics by smart devices
KR20220160131A (en) Appaeatus and method for providing artificial intelligence based virtual reality psychological test service
Al-Jubouri et al. A Survey On Movement Analysis (Hand, Eye, Body) And Facial Expressions-Based Diagnosis Autism Disorders Using Microsoft Kinect V2
Boyce et al. Electrodermal activity analysis for training of military tactics
US20230027320A1 (en) Movement Disorder Diagnostics from Video Data Using Body Landmark Tracking
Jeon et al. Assessing goal-directed three-dimensional movements in a virtual reality block design task

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination