CN112001209A - Student classroom learning behavior monitoring system based on artificial intelligence - Google Patents


Info

Publication number
CN112001209A
CN112001209A (application CN201910447608.1A)
Authority
CN
China
Prior art keywords
student
sitting posture
unit
students
classroom
Prior art date
Legal status
Pending
Application number
CN201910447608.1A
Other languages
Chinese (zh)
Inventor
刘军
Current Assignee
Shenzhen Scope Co ltd
Original Assignee
Shenzhen Scope Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Scope Co ltd
Priority to CN201910447608.1A
Publication of CN112001209A
Legal status: Pending

Classifications

    • G06V 40/16 — Image or video recognition: human faces, e.g. facial parts, sketches or expressions
    • A61B 5/1116 — Determining posture transitions
    • A61B 5/1118 — Determining activity level
    • A61B 5/1122 — Determining geometric values of movement trajectories
    • A61B 5/1128 — Measuring movement of the body or parts thereof using image analysis
    • G06Q 50/205 — Education administration or guidance


Abstract

The invention provides an artificial-intelligence-based system for monitoring students' classroom learning behavior, comprising the following program modules: a registration unit; a camera unit; a storage unit; a data processing center; a body type detection unit; and a virtual robot. The data processing center includes: a classroom position picture processing unit; a student behavior trajectory synthesis unit; a sitting posture model library; and a student sitting posture matching unit. The virtual robot reports, as feedback information to the outside, the classroom behavior trace map for a student's identity over a set period of time, provided by the student behavior trajectory synthesis unit, together with any sitting posture abnormality reported by the student sitting posture matching unit. The system achieves personalized, traceless monitoring without affecting the students' normal classroom learning experience.

Description

Student classroom learning behavior monitoring system based on artificial intelligence
Technical Field
The invention relates to image recognition and data processing, in particular to monitoring of classroom learning behaviors of students.
Background
The classroom is where students spend most of their study time at school, so monitoring and managing students' learning state in the classroom is very important. At present, classes are large, and teachers can hardly supervise and record the learning condition of every student throughout the day, especially their sitting postures. The myopia rate among primary school students in the country is as high as 34%–60%, among junior middle school students it is 68%, and among senior high school students it is as high as 90%. One very important factor behind the high myopia rate is that students do not maintain a scientific sitting posture; another is that each student's height, weight and build differ, and teachers do not have the energy to correct and remind every student's sitting posture individually.
In addition, students move around in the classroom; which students are active, and in which areas of the classroom, is something a teacher cannot observe continuously, so students' sitting postures and learning states in the classroom cannot be monitored accurately and scientifically.
Existing systems for monitoring students' classroom learning behavior require students to wear specific communication-capable devices for data acquisition and positioning. Counting, analyzing and warning about students' sitting postures and behavior trajectories in the classroom through wearable devices requires every student to be equipped with an extra device during learning; traceless monitoring cannot be achieved, and the students' normal classroom learning experience is affected to some degree.
Disclosure of Invention
The technical problem to be solved by the invention is to provide an artificial-intelligence-based system for monitoring students' classroom learning behavior that achieves personalized, traceless monitoring without affecting the students' normal classroom learning experience.
The technical solution adopted by the invention to solve this problem is as follows: an artificial-intelligence-based system for monitoring students' classroom learning behavior, comprising the following program modules:
the registration unit is used for realizing the registration of students in the monitoring system;
the camera shooting unit is used for acquiring face images and classroom images of all students in a class in real time and acquiring images of the whole body of the successfully identified students;
the storage unit is used for storing the pictures collected by the camera shooting unit;
the data processing center is used for calculating characteristic parameters of the pictures provided by the camera unit, and identifying the face of each student and confirming and comparing the identity of each student;
the body type detection unit is used for acquiring body type detection data of each student in a school hospital or a sports department, and the body type detection data is bound with the identity information of the students; and
the virtual robot is used for providing an interface for the monitoring system to interact with the outside;
wherein, this data processing center includes:
the classroom position picture processing unit is used for carrying out face recognition on the classroom scene picture shot by the camera shooting unit and remarking labels for the face features successfully recognized in the picture so that the picture is attached with the identity information of the students successfully recognized;
the student behavior track synthesis unit is used for synthesizing pictures of each student successfully registered in the system within a synthesis time period preset by the system;
the sitting posture model library is used for judging the sitting posture by taking the table contour value and the shooting proportion as reference standards after the face recognition of the picture acquired by the camera shooting unit is successful;
the student sitting posture matching unit is used for matching the sitting posture parameters provided by the sitting posture model base with standard sitting posture parameters prestored in the system;
the virtual robot reports the classroom behavior trace graph corresponding to the student identity in the set stage time provided by the student behavior trace synthesis unit and the sitting posture abnormal condition provided by the student sitting posture matching unit as feedback information, and provides feedback for the outside.
In some embodiments, the student behavior trajectory synthesis unit performs picture synthesis in groups of a fixed number of pictures, with each subsequent group iterated on the basis of the previous group's result.
In some embodiments, the student behavior trace synthesis unit presets a threshold range of the synthesized picture face label information overlapping region, and performs color marking on a part exceeding the threshold to obtain a classroom behavior trace map of the identified object in a preset time period.
In some embodiments, the sitting posture model library comprises standard body-type characteristics, issued by an authority, for different heights and weights in each age group and sex, together with the proportion ranges between the head, hands, shoulders, neck and other body parts when the student is seated; the normal body-proportion parameters are pre-stored in the system as comparison parameters for each student's correct sitting posture.
In some embodiments, the student sitting posture matching unit matches the age and height of the successfully identified students in identity, outputs the corresponding standard body proportion, and matches the relative size proportion value of the head, hand, shoulder and neck parts of each successfully identified face feature calculated in the sitting posture model library with the standard proportion parameters prestored in the system.
In some embodiments, the virtual robot includes a virtual robot image or cartoon image, which is connected to the data processing center, and when receiving an inquiry instruction from a user, a teacher, or a parent, the virtual robot can timely retrieve a classroom behavior trace map and a report of abnormal sitting posture corresponding to the identity of a student in a set period of time of the data processing center.
In some embodiments, the virtual robot is further configured to set an abnormal threshold for the sitting posture proportion of any student, and determine that the sitting posture is abnormal if the comparison value exceeds the threshold range.
In some embodiments, the sitting posture model library calculates a β value from β = (registered height-detected height)/registered height, and determines that the identification object is in a sitting posture when the β value exceeds a set value.
In some embodiments, the data processing center performs sitting posture recognition of the picture for students who successfully recognize facial features and perform identity authentication, and calculates relative proportion data of the head, hands, shoulder and neck parts of the person in the picture and the outline of the table.
In some embodiments, the virtual robot provides the feedback information to a smart class board of the class in which the student is located; and/or the virtual robot provides the feedback information to the mobile phones of the teacher and/or the parents of the class in which the student is located.
The invention has the advantage that the registration unit, camera unit, storage unit, data processing center, body type detection unit and virtual robot are skillfully coordinated. Further, by configuring the classroom position picture processing unit, the student behavior trajectory synthesis unit, the sitting posture model library and the student sitting posture matching unit within the data processing center, and by having the virtual robot report as feedback information the classroom behavior trace map for a student's identity over a set period of time (from the student behavior trajectory synthesis unit) together with any sitting posture abnormality (from the student sitting posture matching unit), feedback is provided to the outside, personalized traceless monitoring is achieved, and the students' normal classroom learning experience is not affected.
Drawings
The invention will be further described with reference to the accompanying drawings and examples, in which:
fig. 1 illustrates a framework structure of the monitoring system for the classroom learning behavior of the student based on artificial intelligence of the invention.
Fig. 2 illustrates the process of the present invention to obtain a result picture of the track composition of a specific student.
Wherein the reference numerals are as follows: 100 monitoring system; 10 registration unit; 20 camera unit; 30 storage unit; 40 data processing center; 41 classroom position picture processing unit; 42 student behavior trajectory synthesis unit; 43 sitting posture model library; 44 student sitting posture matching unit; 50 body type detection unit; 60 virtual robot; 65 feedback information; 70 smart class board; 80 mobile phone; 411, 412, 413, 41n pictures; 410 synthesized picture; 430 result picture; A41 statistical area; A43 result area.
Detailed Description
The preferred embodiments of the present invention will now be described in detail with reference to the accompanying drawings.
Referring to fig. 1 and 2, fig. 1 illustrates the framework of the artificial-intelligence-based monitoring system for students' classroom learning behavior according to the present invention, and fig. 2 illustrates how the invention obtains the result picture of a specific student's trajectory synthesis. The invention provides a monitoring system 100 for students' classroom learning behavior based on artificial intelligence, comprising: a registration unit 10, a camera unit 20, a storage unit 30, a data processing center 40, a body type detection unit 50, and a virtual robot 60. The data processing center 40 includes a classroom position picture processing unit 41, a student behavior trajectory synthesis unit 42, a sitting posture model library 43, and a student sitting posture matching unit 44. Based on the monitoring results from the data processing center 40, the virtual robot 60 can transmit corresponding feedback information 65 to the smart class board 70 in the students' classroom and/or to the mobile phone 80 of a parent and/or teacher.
It is understood that the monitoring system 100 is a software system running on a processor of a particular computing device. The registration unit 10, the camera unit 20, the storage unit 30, the data processing center 40, the body type detection unit 50, and the virtual robot 60 can all be understood as program modules constituting this software system. For example, a particular computing device includes a computer disposed in a classroom of a particular class and a number of cameras connected to the computer. Particular computing devices may also include servers and other computers communicatively connected to the computer via a network.
And the registration unit 10 is used for realizing the registration of students in the monitoring system 100. The registration information comprises the filling of personal information such as names, school numbers, classes and the like, the input of face images and the authentication and binding of identity information.
The image pickup unit 20 picks up images of the whole or local area of the classroom by means of a single or a plurality of camera devices, and is used for acquiring face images of all students in the class in real time, acquiring classroom images, and acquiring images of the whole body of the identified students.
And the storage unit 30 is used for storing the pictures acquired by the camera unit 20.
And the data processing center 40 is used for calculating characteristic parameters of the pictures provided by the camera unit 20, and performing face recognition, identity confirmation and comparison on each student.
The data processing center 40 recognizes the sitting posture of the picture of the student who successfully recognizes the facial features and performs the identity authentication, and calculates the relative proportion data of the head, the hand, the shoulder and the neck of the person in the picture and the outline of the table.
The body type detection unit 50 is used for acquiring body type detection data of each student in a school hospital or a sports department, the data comprises parameterized data of height, weight, abnormal skeleton, body type fatness and thinness and the like, and the body type detection data is bound with identity information of the students.
The classroom position image processing unit 41 is configured to perform face recognition on a scene image of a classroom captured by the imaging unit 20, and remark labels for face features successfully recognized in the image, so that the image is accompanied with identity information of students successfully recognized in the image.
And the student behavior track synthesis unit 42 is used for synthesizing pictures of each student successfully registered in the system in a synthesis time period preset by the system. The synthesis time period preset by the system can be time, day, week and the like according to the needs of practical application.
It can be understood that the longer the synthesis time period, the more pictures there are to synthesize and the slower the synthesis becomes. The method therefore synthesizes pictures in groups of a fixed number, with each subsequent group iterated on the result of the previous one; that is, picture synthesis proceeds periodically in fixed-size batches, balancing the length of the synthesis period against synthesis speed.
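A minimal sketch of this grouped, iterative synthesis, assuming each picture contributes a per-pixel mask of where the identified student's face label was remarked; the function and parameter names (`synthesize`, `group_size`) are illustrative, not from the patent:

```python
import numpy as np

def synthesize(label_masks, group_size=50):
    """Accumulate per-pixel occurrence counts of a student's face label.

    label_masks: iterable of HxW boolean arrays, True where the student's
    label appears in that picture. Pictures are summed in fixed-size
    groups; each group's partial sum is iterated onto the running result,
    so memory stays bounded regardless of the synthesis period.
    """
    acc = None
    group = []
    for mask in label_masks:
        group.append(mask)
        if len(group) == group_size:
            partial = np.sum(group, axis=0)
            acc = partial if acc is None else acc + partial
            group = []
    if group:  # flush the final, possibly short, group
        partial = np.sum(group, axis=0)
        acc = partial if acc is None else acc + partial
    return acc  # per-pixel counts over the whole period
```

Thresholding these counts against the total number of pictures then yields the overlap regions used for the trace map.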
It is worth mentioning that the picture for synthesis is taken from the picture with student information (i.e., the aforementioned tagged remarked picture) that was successfully identified in the classroom position picture processing unit 41.
The student behavior trace synthesizing unit 42 presets a threshold range of the synthesized picture face label information overlapping region, and color marks the part exceeding the threshold to obtain a classroom behavior trace map of the identified object in a preset time period. And filtering and normalizing the part outside the color marking area to identify the identity information of the object and store the final picture.
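A plausible sketch of this threshold-and-mark step; the 80% default matches the repetition-region threshold used in the worked example later in the text, and all names and array shapes are assumptions:

```python
import numpy as np

def trace_map(counts, total_pictures, threshold=0.8):
    """counts: HxW array of how often the student's face label appeared
    at each position. Positions that appear in more than `threshold` of
    the pictures form the colour-marked result region; everything else
    is filtered out (normalised to zero)."""
    ratio = counts / float(total_pictures)
    marked = ratio > threshold          # region to colour-mark (e.g. red)
    filtered = np.where(marked, counts, 0)
    return marked, filtered
```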
A sitting posture model library 43, configured to determine a sitting posture by using the table contour value and the shooting ratio as reference standards after successful face recognition is performed on the picture acquired by the camera unit 20; specifically, the value of β is calculated from β = (registered height-detected height)/registered height. If the value of β exceeds a predetermined range, it is determined that the recognition target is in a "sitting" state, and the relative size ratio of the head, hand, shoulder, neck, and the like of the recognition target is calculated.
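The β test can be written directly from the formula above; the 0.25 cut-off below is an assumed illustration, since the text only specifies "a predetermined range":

```python
def beta(registered_height, detected_height):
    # beta = (registered height - detected height) / registered height
    return (registered_height - detected_height) / registered_height

def is_sitting(registered_height, detected_height, cutoff=0.25):
    """A seated student's detected (apparent) height drops well below the
    registered standing height, so a large beta indicates a sitting state.
    The 0.25 cutoff is a hypothetical 'set value'."""
    return beta(registered_height, detected_height) > cutoff
```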
The sitting posture model library 43 contains standard body-type characteristics, issued by authorities (such as national health departments), for different heights and weights in each age group and sex, together with the proportion ranges between the head, hands, shoulders, neck and other body parts when the student is seated; the corresponding normal body-proportion parameters are pre-stored in the system as comparison parameters for each student's correct sitting posture.
And the student sitting posture matching unit 44 is used for matching the ages and heights in the identities of the successfully identified students, outputting the corresponding standard body proportion, and matching the relative size proportion value of the head, the hand, the shoulder and neck parts of each successfully identified face feature calculated in the sitting posture model library 43 with the standard proportion parameters prestored in the system.
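A minimal sketch of this matching step, with standard ranges modelled on the 12-year-old example in the embodiment below; the dictionary layout, parameter names and the shoulder-to-head ratio range are hypothetical:

```python
# Hypothetical standard-proportion table keyed by (sex, age); the angle
# ranges follow the 150 deg +/- 10 deg and 170 deg +/- 10 deg example values.
STANDARD = {
    ("male", 12): {
        "shoulder_head_ratio": (1.2, 1.6),         # assumed alpha1..alpha2
        "arm_shoulder_angle_deg": (140.0, 160.0),
        "head_chest_angle_deg": (160.0, 180.0),
    }
}

def match_sitting_posture(sex, age, measured):
    """Compare measured proportions against the student's standard ranges;
    returns the out-of-range parameters (an empty dict means the sitting
    posture matches the standard)."""
    abnormal = {}
    for name, (lo, hi) in STANDARD[(sex, age)].items():
        value = measured[name]
        if not lo <= value <= hi:
            abnormal[name] = value
    return abnormal
```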
And a virtual robot 60 for providing an interface for the monitoring system 100 to interact with the outside. The virtual robot 60 includes a virtual robot image or cartoon image, which is connected to the data processing center 40, and when receiving the query instruction from the user, teacher, or parent, the virtual robot can timely retrieve the classroom behavior trace map (provided by the student behavior trace synthesis unit 42) and the abnormal sitting posture situation report (provided by the student sitting posture matching unit 44) corresponding to the student identity of the data processing center 40 within the set period of time.
The virtual robot 60 is further configured to set an abnormal threshold for the sitting posture proportion of any student, and determine that the sitting posture is abnormal if the comparison value exceeds the threshold range; further, the virtual robot 60 may interactively remind the user on the intelligent class card 70 by including the sitting posture abnormality information in the feedback information 65, and send reminding or warning information to the mobile phone 80 of the teacher and/or the parents.
The feedback information 65 includes a composite trace map of the phase time of each student, and abnormal reminding information and abnormal proportion condition of the sitting posture of each student, and parents and teachers can check the behavior trace of the student and the abnormal sitting posture condition of the student in the mobile phone 80 or the intelligent class board 70 through the feedback information 65 and perform human intervention and correction.
The present invention will be described in more detail with reference to specific examples.
Referring to fig. 1, for the student behavior trajectory synthesis unit 42, the teacher can set the stage time to "day" in the system; the camera unit 20 continuously takes pictures of conditions in the classroom over one day, and the system 100 synthesizes the face pictures corresponding to each student's identity information in the registration unit 10. Referring to fig. 2, taking a student, Xiao Ming, as an example: the data processing center 40 performs face recognition on all of the camera unit 20's pictures from one day and gathers every picture containing Xiao Ming's identity information. The camera unit 20 collects pictures 411, 412, 413 through 41n at time points T1, T2, T3 through Tn; the pictures are first passed to the classroom position picture processing unit 41 for remarking identity information, the circles representing Xiao Ming's remarked position in each picture. The student behavior trajectory synthesis unit 42 then synthesizes Xiao Ming's identity-tagged pictures transmitted from the classroom position picture processing unit 41, obtaining a synthesized picture 410.
Assuming the system presets the repetition-region threshold at 80%, the student behavior trajectory synthesis unit 42 filters the synthesized picture 410, keeping the part of statistical area A41 that exceeds 80% to obtain result area A43, which it labels in red, yielding result picture 430 (i.e., the classroom behavior trace map); the red result area A43 is the main region of Xiao Ming's activity that day. The result picture 430 can be recorded in the storage unit 30 and saved for later use.
Further, the teacher can look up which classmates also appear in result area A43, to learn whom Xiao Ming is in contact with in the classroom and whether they are studying or playing together.
Suppose Xiao Ming's information in the registration unit 10 includes: male, age 12. The sitting posture model library 43 pre-stores, for 12-year-old boys, the height range (H1–H2) cm, the shoulder-to-head length ratio α1–α2 with an included angle of 90°±10°, an arm-to-shoulder angle of 150°±10°, and a head-to-chest angle of 170°±10°. The body type detection unit 50 contains the school hospital's body-type scan information for the whole class in which Xiao Ming is enrolled, including each classmate's height, weight and skeletal development. The sitting posture model library 43 then analyzes Xiao Ming's pictures for the day, calculates β according to β = (registered height − detected height)/registered height, and when β exceeds a certain range (i.e., the set value), determines that the identified subject is in a sitting posture; it further calculates Xiao Ming's shoulder-to-head length ratio and included angle, and the angle between arm and shoulder.
And the student sitting posture matching unit 44 is used for matching the calculation result provided by the sitting posture model base 43 with the standard value, judging that the sitting posture is small and clear if the calculated result exceeds the threshold range of the standard value, and feeding back the abnormal parameters. For example, if the angle between the head and the chest is detected to be 100 °, and the number of times is 20, then the sitting posture abnormality is determined, and the sitting posture abnormality is reported to the virtual robot 60. The abnormal sitting posture report can be recorded in the storage unit 3 and stored for later use.
The virtual robot 60 includes the abnormal parameters and items of the included angle between the head and the chest in the student sitting posture matching unit 44 in the feedback information 65, and pushes the abnormal parameters and items to the mobile phone 80 of the teacher/the parents, and displays the abnormal parameters and items in the relevant area of the intelligent class card 70, so that the teacher can timely remind and intervene the abnormal sitting posture.
Compared with the prior art, the monitoring system 100 of the invention has the beneficial effects that:
1. through the body type detection unit 50, a school hospital or a sports department can be remotely linked, and the body characteristic data of each student is called; establishing a standard sitting posture suitable for each student through the sitting posture model library 43; the sitting posture of each student is monitored and analyzed in real time through the camera unit 20; the calculation result of the sitting posture model base 43 is matched with the standard sitting posture by the student sitting posture matching unit 44, the system sets a matching threshold value, and personalized feedback and early warning information are respectively carried out on the sitting posture with abnormal probability of each student to the mobile phone 80 of the parents of the student and/or the teacher.
2. By acquiring each student's activity trajectory in the classroom through face recognition, pictures can be acquired continuously within a specified time period and analyzed; the pictures are superimposed and synthesized in stages, a region-probability threshold is set, and the region where each student appears with high probability within the specified time is output, revealing each student's behavior trajectory in the classroom. A teacher or parent can learn in real time through the system 100 where a student stays longest in the classroom, and by looking up which classmates share that area, the teacher can know how long and in what way each student interacts with classmates every day.
3. Through the virtual robot 60, information can be fed back and exchanged with the intelligent class card 70 and the mobile phone 80 of the teacher and/or the parents, so that users promptly learn about the operation of the system 100 and any abnormal conditions of the students.
In conclusion, the monitoring system 100 of the present invention can seamlessly monitor students' sitting postures and issue reminders, so that parents and teachers know in real time whether a student's sitting posture is abnormal and can take the necessary intervention; it can analyze the behavior tracks of students in the classroom, so that teachers promptly understand the learning behavior of every student in the class; and it can analyze and report each student's condition according to different standards and different situations, achieving personalized monitoring.
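The staged superposition and region-probability thresholding described in benefit 2 above can be sketched as a simple frequency count over per-frame face-label detections; the region names and the 0.3 threshold are assumptions for illustration only.

```python
from collections import Counter

REGION_PROBABILITY_THRESHOLD = 0.3  # assumed value; the disclosure does not fix one

def high_probability_regions(detections):
    """detections: one classroom-region label per analyzed frame, i.e. the
    region where a recognized student's face label landed in that frame.
    Returns the regions whose appearance probability exceeds the threshold."""
    total = len(detections)
    counts = Counter(detections)
    return {region: count / total
            for region, count in counts.items()
            if count / total > REGION_PROBABILITY_THRESHOLD}

frames = ["seat", "seat", "seat", "bookshelf", "seat", "door",
          "seat", "seat", "seat", "bookshelf"]
print(high_probability_regions(frames))  # the seat dominates: {'seat': 0.7}
```

Regions below the threshold ("bookshelf" at 0.2, "door" at 0.1 in this sample) are dropped, leaving only the high-probability areas that would appear in the student's behavior track map.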
It should be understood that the above embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same, and those skilled in the art can modify the technical solutions described in the above embodiments, or make equivalent substitutions for some technical features; and such modifications and substitutions are intended to be included within the scope of the appended claims.

Claims (10)

1. An artificial-intelligence-based monitoring system for student classroom learning behavior, characterized by comprising the following program modules:
the registration unit is used for realizing the registration of students in the monitoring system;
the camera unit is used for acquiring face images and classroom images of all students in a class in real time, and for acquiring whole-body images of successfully identified students;
the storage unit is used for storing the pictures collected by the camera unit;
the data processing center is used for calculating characteristic parameters of the pictures provided by the camera unit, recognizing the face of each student, and confirming and comparing each student's identity;
the body type detection unit is used for acquiring body type detection data of each student in a school hospital or a sports department, and the body type detection data is bound with the identity information of the students; and
the virtual robot is used for providing an interface for the monitoring system to interact with the outside;
wherein the data processing center comprises:
the classroom position picture processing unit is used for performing face recognition on the classroom scene pictures shot by the camera unit and labeling the successfully recognized face features in the pictures, so that the pictures carry the identity information of the successfully recognized students;
the student behavior track synthesis unit is used for synthesizing pictures of each student successfully registered in the system within a synthesis time period preset by the system;
the sitting posture model library is used for judging the sitting posture, after face recognition of the pictures acquired by the camera unit succeeds, by taking the table contour value and the shooting proportion as reference standards;
the student sitting posture matching unit is used for matching the sitting posture parameters provided by the sitting posture model base with standard sitting posture parameters prestored in the system;
the virtual robot reports, as feedback information, the classroom behavior track map corresponding to the student's identity within the set stage time, as provided by the student behavior track synthesis unit, together with the abnormal sitting posture condition provided by the student sitting posture matching unit, and provides the feedback externally.
2. The monitoring system of claim 1, wherein: the student behavior track synthesis unit synthesizes pictures by combining a certain number of student behavior tracks into a group; subsequent pictures are iterated on the basis of the previous group.
3. The monitoring system of claim 1, wherein: the student behavior track synthesis unit presets a threshold range for the overlap area of face label information in the synthesized pictures, and color-marks the parts exceeding the threshold to obtain the classroom behavior track map of the identified object within the preset time period.
4. The monitoring system of claim 1, wherein: the sitting posture model library comprises standard body type characteristics for different heights and weights of each age group and sex, as issued by an authoritative body, together with the proportion ranges among body parts such as the head, hands, shoulders and neck when a student sits; the normal body proportion parameters of each student are pre-stored in the system and serve as comparison parameters for the correct sitting posture of that student.
5. The monitoring system of claim 4, wherein: the student sitting posture matching unit matches the age and height in the identity of a successfully identified student, outputs the corresponding standard body proportion, and matches the relative size proportion values of the head, hands, shoulders and neck, as calculated in the sitting posture model library for each successfully recognized face feature, against the standard proportion parameters pre-stored in the system.
6. The monitoring system of claim 1, wherein: the virtual robot comprises a virtual robot image or cartoon image connected with the data processing center; upon receiving a query instruction from a user, that is, a teacher or a parent, the virtual robot promptly retrieves from the data processing center the classroom behavior track map and the abnormal sitting posture report corresponding to the student's identity within the set stage time.
7. The monitoring system of claim 6, wherein: the virtual robot is further used for setting an abnormal threshold for any student's sitting posture proportion, and judging the sitting posture abnormal if the comparison value exceeds the threshold range.
8. The monitoring system of claim 1, wherein: the sitting posture model library calculates β = (registered height − detected height)/registered height, and determines that the recognition target is in a sitting posture when β exceeds a set value.
9. The monitoring system of claim 1, wherein: the data processing center performs sitting posture recognition on the pictures of students whose facial features are successfully recognized and whose identities are authenticated, and calculates the relative proportion data of the head, hands, shoulders and neck of the person in the picture against the table contour.
10. The monitoring system according to any one of claims 1 to 9, wherein: the virtual robot provides the feedback information to the intelligent class card of the class where the student is located; and/or the virtual robot provides the feedback information to the mobile phone of the teacher and/or the parent of the class where the student is located.
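The sitting/standing criterion of claim 8 can be sketched numerically; the 0.25 threshold and the sample heights are illustrative assumptions, since the claim leaves the set value unspecified.

```python
BETA_THRESHOLD = 0.25  # assumed set value; claim 8 does not specify one

def is_sitting(registered_height_cm: float, detected_height_cm: float) -> bool:
    """Claim 8: beta = (registered height - detected height) / registered height;
    the recognition target is judged to be sitting when beta exceeds the set value."""
    beta = (registered_height_cm - detected_height_cm) / registered_height_cm
    return beta > BETA_THRESHOLD

print(is_sitting(160.0, 110.0))  # beta = 50/160 = 0.3125 -> True
print(is_sitting(160.0, 155.0))  # beta = 5/160  = 0.03125 -> False
```

Intuitively, a seated student presents a much smaller apparent height in the frame than the registered standing height, so a large β signals a seated pose.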
CN201910447608.1A 2019-05-27 2019-05-27 Student classroom learning behavior monitoring system based on artificial intelligence Pending CN112001209A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113470209A (en) * 2021-07-01 2021-10-01 厦门悦讯信息科技股份有限公司 Method and system for realizing classroom roll call through face recognition
CN115990012A (en) * 2022-11-08 2023-04-21 广东保伦电子股份有限公司 Comprehensive prevention and control method for preventing myopia in specific space
CN116935721A (en) * 2023-07-20 2023-10-24 云启智慧科技有限公司 Auxiliary teaching method and system for special student groups

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107169456A (en) * 2017-05-16 2017-09-15 湖南巨汇科技发展有限公司 A kind of sitting posture detecting method based on sitting posture depth image
CN107609517A (en) * 2017-09-15 2018-01-19 华中科技大学 A kind of classroom behavior detecting system based on computer vision
CN108509896A (en) * 2018-03-28 2018-09-07 腾讯科技(深圳)有限公司 A kind of trace tracking method, device and storage medium


Similar Documents

Publication Publication Date Title
CN108256433B (en) Motion attitude assessment method and system
Samet et al. Face recognition-based mobile automatic classroom attendance management system
CN112001209A (en) Student classroom learning behavior monitoring system based on artificial intelligence
CN107241572B (en) Training video tracking evaluation system for students
CN105303632A (en) Mobile signing monitoring system and working method
CN105956960A (en) College physical education equipment management system
CN110135476A (en) A kind of detection method of personal safety equipment, device, equipment and system
WO2017161734A1 (en) Correction of human body movements via television and motion-sensing accessory and system
CN110478862A (en) A kind of exercise guide system and its guidance method
CN110464356B (en) Comprehensive monitoring method and system for exercise capacity
CN207264314U (en) A kind of movement Compare System based on wearable device
CN116092199B (en) Employee working state identification method and identification system
CN114495169A (en) Training data processing method, device and equipment for human body posture recognition
CN109961039A (en) A kind of individual's goal video method for catching and system
CN111507301B (en) Video processing method, video processing device, computer equipment and storage medium
CN114783043B (en) Child behavior track positioning method and system
CN113947742A (en) Person trajectory tracking method and device based on face recognition
AU2021203869A1 (en) Methods, devices, electronic apparatuses and storage media of image processing
CN111652192A (en) Tumble detection system based on kinect sensor
CN114241375A (en) Monitoring method used in movement process
CN114565976A (en) Training intelligent test method and device
CN210933642U (en) Body-building guidance system
CN117292288A (en) Sports test method, system, electronic device, chip and storage medium
CN206948499U (en) The monitoring of student's real training video frequency tracking, evaluation system
CN116229507A (en) Human body posture detection method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201127