CN115002343A - Method and system for objectively evaluating classroom performance of student based on machine vision - Google Patents


Info

Publication number
CN115002343A
CN115002343A
Authority
CN
China
Prior art keywords
classroom
student
behavior
module
performance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210487171.6A
Other languages
Chinese (zh)
Inventor
李敏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Institute of Engineering
Original Assignee
Chongqing Institute of Engineering
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Institute of Engineering filed Critical Chongqing Institute of Engineering
Priority to CN202210487171.6A
Publication of CN115002343A
Legal status: Pending

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/08: Learning methods
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/272: Means for inserting a foreground image in a background image, i.e. inlay, outlay

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Molecular Biology (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention relates to the technical field of artificial intelligence, in particular to a method and a system for objectively evaluating the classroom performance of students based on machine vision. The method comprises the steps of: constructing a classroom behavior judgment model; generating a classroom seat background reference map based on monitoring data; recognizing and tracking the face and body data of the students to obtain a seat correspondence; filling the seat correspondence into the classroom seat background reference map to obtain a classroom behavior snapshot; and inputting the classroom behavior snapshot into the classroom behavior judgment model to judge the classroom behavior of the students and obtain a classroom performance evaluation result. The classroom participation, classroom activity, speaking and attendance performance of the students are judged from their face data, which solves the problem that the ordinary (usual-performance) scores produced by existing methods for evaluating student classroom performance have low authenticity.

Description

Method and system for objectively evaluating classroom performance of student based on machine vision
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a method and a system for objectively evaluating classroom performance of students based on machine vision.
Background
In most colleges and universities, the total score of a student in each course consists of an examination score and an ordinary (usual-performance) score; the ordinary score covers the student's participation in classroom activities, speaking in class and attendance.
For attendance checks, students are typically required to scan a code with their mobile phones to sign in. Because online sign-in does not require real names, signing in on behalf of others is easy and the authenticity of attendance cannot be verified in real time; as a result, ordinary scores cannot be given objectively and fairly, and their authenticity is reduced.
Disclosure of Invention
The invention aims to provide a method and a system for objectively evaluating the classroom performance of students based on machine vision, so as to solve the problem that the ordinary scores produced by existing methods for evaluating student classroom performance have low authenticity.
In order to achieve the above object, in a first aspect, the present invention provides a method for objectively evaluating classroom performance of a student based on machine vision, comprising the following steps:
constructing a classroom behavior judgment model;
generating a classroom seat background reference map based on the monitoring data;
recognizing and tracking the face and body data of the student to obtain the seat corresponding relation;
filling the seat corresponding relation in the classroom seat background reference picture to obtain a classroom behavior snapshot picture;
and inputting the classroom behavior snapshot picture into the classroom behavior judgment model to judge the classroom performance of the student to obtain a classroom performance evaluation result.
The specific way for constructing the classroom behavior judgment model is as follows:
constructing a behavior analysis network, a training data set and a test data set;
training the behavior analysis network by using the training data set based on a semi-supervised learning mode to obtain a training model;
and evaluating and optimizing the training model by using the test data set to obtain a classroom behavior judgment model.
The specific mode for generating the classroom seat background reference map based on the monitoring data is as follows:
acquiring monitoring data;
judging the angle and the aspect ratio of the target area of the monitoring data to obtain a judgment result;
and correcting the image of the target area based on the judgment result to obtain a background reference image.
The specific way of identifying and tracking the face and body data of the student to obtain the seat corresponding relation is as follows:
recognizing student face data in a classroom based on the monitoring data;
comparing the student face data with a face database to obtain student information;
recording the clothing and headwear feature information of the corresponding students based on the student information;
tracking the seating behavior and seat numbers of the students based on the clothing and headwear feature information to obtain the seating situation;
and generating a corresponding relation between the student information and a seat based on the seating condition.
The student information includes student name, school number and class schedule information.
After the step of inputting the classroom behavior snapshot into the classroom behavior judgment model to judge the classroom performance of the student and obtain the classroom performance evaluation result, the method further comprises the following steps:
generating a classroom atmosphere trend graph according to the classroom performance evaluation result based on a preset time period;
analyzing based on the classroom atmosphere trend graph to obtain an analysis result;
and adjusting the classroom instruction effect of the teacher based on the analysis result.
In a second aspect, the invention provides a system for objectively evaluating the classroom performance of students based on machine vision, which comprises a construction module, a generation module, a tracking module, a filling module and a judgment module, connected in sequence;
the building module is used for building a classroom behavior judgment model;
the generation module generates a classroom seat background reference map based on the monitoring data;
the tracking module is used for identifying and tracking the face and body data of the student to obtain the seat corresponding relation;
the filling module is used for filling the seat corresponding relation in the classroom seat background reference picture to obtain a classroom behavior snapshot picture;
and the judgment module is used for inputting the classroom behavior snapshot into the classroom behavior judgment model to judge the classroom performance of the students and obtain a classroom performance evaluation result.
The invention provides a method for objectively evaluating the classroom performance of students based on machine vision, which comprises: constructing a classroom behavior judgment model; generating a classroom seat background reference map based on monitoring data; recognizing and tracking the face and body data of the students to obtain a seat correspondence; filling the seat correspondence into the classroom seat background reference map to obtain a classroom behavior snapshot; and inputting the classroom behavior snapshot into the classroom behavior judgment model to judge the classroom performance of the students and obtain a classroom performance evaluation result. The classroom participation, classroom activity, speaking and attendance performance of the students are judged from their face data, which solves the problem that the ordinary scores produced by existing methods for evaluating student classroom performance have low authenticity.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a flowchart of a method for objectively evaluating classroom performance of students based on machine vision according to the present invention.
Fig. 2 is a workflow diagram of a classroom behavior determination model.
Fig. 3 is a comparison before and after correction of the monitoring data.
Fig. 4 is a schematic structural diagram of a classroom behavior snapshot.
Fig. 5 is a schematic structural diagram of objective evaluation of student classroom performance based on machine vision according to the present invention.
In the figures: 1, construction module; 2, generation module; 3, tracking module; 4, filling module; 5, judgment module.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
Referring to fig. 1 to 4, in a first aspect, the present invention provides a method for objectively evaluating classroom performance of a student based on machine vision, including the following steps:
s1, constructing a classroom behavior judgment model;
the concrete mode is as follows:
s11, constructing a behavior analysis network, a training data set and a test data set;
Specifically, the training data set and the test data set are constructed as follows: classroom behavior data are collected and divided into a training data set and a test data set according to a conventional data analysis ratio. The classroom behaviors include at least attending class, taking notes, sleeping, playing with a mobile phone, being absent and newly entering. The behavior analysis network uses YOLOv5 as its backbone framework.
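The data-set construction described above can be sketched as follows; the behavior labels and the 8:2 split ratio are illustrative assumptions (the text only says "a conventional data analysis proportion"):

```python
import random

# Hypothetical sketch of the train/test split described above; the behavior
# labels and the 8:2 ratio are assumptions, not taken from the patent text.
BEHAVIORS = ["attending", "note_taking", "sleeping", "phone_playing",
             "absent", "newly_entered"]

def split_dataset(samples, train_ratio=0.8, seed=42):
    """Shuffle labeled samples and split them into train/test sets."""
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_ratio)
    return shuffled[:cut], shuffled[cut:]

# Toy data: (image_id, behavior_label) pairs.
samples = [(i, BEHAVIORS[i % len(BEHAVIORS)]) for i in range(100)]
train_set, test_set = split_dataset(samples)
```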
S12, training the behavior analysis network by using the training data set based on a semi-supervised learning mode to obtain a training model;
Specifically, the behavior analysis network is iteratively trained and repeatedly evaluated on the training data set in a semi-supervised manner; training ends when a preset number of iterations or a preset performance level is reached, yielding a training model.
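A minimal pseudo-labelling sketch of such a semi-supervised loop is shown below with a toy 1-D threshold "model"; the real system trains a YOLOv5 detector, so the model, the confidence margin and the data here are purely illustrative assumptions:

```python
# Toy semi-supervised loop: train on labeled data, pseudo-label confident
# unlabeled samples, and repeat for a preset number of rounds.
def fit_threshold(labeled):
    neg = [x for x, y in labeled if y == 0]
    pos = [x for x, y in labeled if y == 1]
    return (sum(neg) / len(neg) + sum(pos) / len(pos)) / 2

def semi_supervised_train(labeled, unlabeled, rounds=5, margin=1.0):
    labeled = list(labeled)
    pool = list(unlabeled)
    for _ in range(rounds):  # preset number of training iterations
        t = fit_threshold(labeled)
        confident = [x for x in pool if abs(x - t) >= margin]
        # Attach pseudo-labels to high-confidence samples, then re-train.
        labeled += [(x, int(x > t)) for x in confident]
        pool = [x for x in pool if abs(x - t) < margin]
    return fit_threshold(labeled)

labeled = [(0.0, 0), (1.0, 0), (9.0, 1), (10.0, 1)]
unlabeled = [0.5, 1.5, 8.5, 9.5, 5.0]   # 5.0 stays unlabeled (not confident)
model_threshold = semi_supervised_train(labeled, unlabeled)
```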
S13, the training model is evaluated and optimized by using the test data set, and a classroom behavior judgment model is obtained.
Specifically, the training model is evaluated and optimized using the test data set to obtain an optimized model; the optimized model is then verified on data collected in real scenes covering a variety of scales, sizes, illumination conditions and viewing angles, confirming its generalization and robustness, to obtain the classroom behavior judgment model.
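The evaluation step can be sketched as computing per-class precision and recall on the test set; the metric choice and the toy predictions below are assumptions, since the text does not name its evaluation criteria:

```python
from collections import Counter

# Hedged sketch: per-class precision/recall of a behavior classifier on a
# test set. The labels and predictions are toy data for illustration.
def per_class_metrics(truth, preds):
    tp, fp, fn = Counter(), Counter(), Counter()
    for t, p in zip(truth, preds):
        if t == p:
            tp[t] += 1
        else:
            fp[p] += 1   # predicted p where the truth was something else
            fn[t] += 1   # missed an instance of t
    classes = set(truth) | set(preds)
    return {c: {"precision": tp[c] / (tp[c] + fp[c]) if tp[c] + fp[c] else 0.0,
                "recall": tp[c] / (tp[c] + fn[c]) if tp[c] + fn[c] else 0.0}
            for c in classes}

truth = ["sleeping", "note_taking", "sleeping", "phone", "note_taking"]
preds = ["sleeping", "note_taking", "phone",    "phone", "note_taking"]
metrics = per_class_metrics(truth, preds)
```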
S2, generating a classroom seat background reference map based on the monitoring data;
the concrete mode is as follows:
s21, acquiring monitoring data;
specifically, the monitoring data is data collected by a camera in a classroom.
S22, judging the angle and the aspect ratio of the target area of the monitoring data to obtain a judgment result;
Specifically, since most classroom monitoring cameras are installed in corners, their field of view is oblique to the seating direction, so the angle and aspect ratio of the target area of the monitoring data must be judged to obtain a judgment result.
S23 corrects the image of the target area based on the determination result, and obtains a background reference map.
Specifically, when the judgment result is an oblique view, the monitoring data is corrected to obtain the background reference map; when the judgment result is a frontal view, the monitoring data is used directly as the background reference map. As shown in fig. 3, the left image is the monitoring data before correction, and the right image is the background reference map after correction.
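The angle/aspect-ratio judgment of S22 and S23 might be sketched as below; the quadrilateral corner representation and the 10% tolerance are illustrative assumptions, and a real correction would warp the oblique view with a perspective (homography) transform:

```python
import math

# Hedged sketch: decide whether the seat-area quadrilateral seen by the
# camera is oblique or frontal by comparing opposite edge lengths.
def edge_len(a, b):
    return math.dist(a, b)

def judge_target_area(corners, tolerance=0.10):
    """corners: (top_left, top_right, bottom_right, bottom_left)."""
    tl, tr, br, bl = corners
    top, bottom = edge_len(tl, tr), edge_len(bl, br)
    left, right = edge_len(tl, bl), edge_len(tr, br)
    oblique = (abs(top - bottom) / max(top, bottom) > tolerance or
               abs(left - right) / max(left, right) > tolerance)
    return "oblique" if oblique else "frontal"

# A corner-mounted camera sees the seat block as a trapezoid:
trapezoid = ((100, 50), (500, 80), (620, 400), (20, 380))
rectangle = ((0, 0), (640, 0), (640, 480), (0, 480))
```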
S3, recognizing and tracking the face and body data of the student to obtain the seat corresponding relation;
the concrete mode is as follows:
s31 recognizing the face data of the students in the classroom based on the monitoring data;
s32, comparing the student face data with a face database to obtain student information;
specifically, the student information includes student name, school number and schedule information.
S33, recording the clothing and headwear feature information of the corresponding students based on the student information;
S34, tracking the seating behavior and seat numbers of the students based on the clothing and headwear feature information to obtain the seating situation;
s35 generates a correspondence between the student information and a seat based on the seating situation.
S4, filling the seat corresponding relation in the classroom seat background reference picture to obtain a classroom behavior snapshot picture;
specifically, the seat correspondence is filled in the classroom seat background reference picture to obtain a classroom behavior snapshot picture, and the classroom behavior snapshot picture is displayed on a terminal.
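Filling the correspondence into the reference map can be sketched as below; rendering the snapshot as a text grid rather than an image, and the row-major 1-based seat numbering, are simplifying assumptions:

```python
# Hedged sketch: fill a seats grid (the "background reference map") with the
# occupants from the seat correspondence to form a text snapshot.
def render_snapshot(rows, cols, seat_map):
    """seat_map: seat_number (1-based, row-major) -> occupant label."""
    lines = []
    for r in range(rows):
        cells = []
        for c in range(cols):
            seat = r * cols + c + 1
            cells.append(seat_map.get(seat, "-----").center(7))
        lines.append("|".join(cells))
    return "\n".join(lines)

snapshot = render_snapshot(2, 3, {1: "Li", 5: "Wang"})
```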
S5, inputting the snapshot of classroom behavior into the classroom behavior judgment model to judge the classroom performance of students, and obtaining classroom performance evaluation results.
Specifically, during each class period (from the start of the first session to the end of the second session), the classroom behavior snapshot is input into the classroom behavior judgment model, which judges the classroom behavior of each student and produces the classroom performance evaluation result. Classroom behavior includes at least whether the student is absent or late, undesirable behaviors such as sleeping or playing with a mobile phone, and behaviors worth encouraging such as attentively attending class and taking notes. Where relevant, the arrival time, the duration of sleeping or phone use and the number of notes taken are also recorded. On a terminal (a teaching device, a terminal display screen, smart glasses and the like), the teacher can intervene directly based on the classroom behavior snapshot, for example by calling on a student by name to answer a question, or by warning a student about bad classroom behavior identified in the classroom performance evaluation result. Information about absent students can be pushed automatically and promptly to the instructor so that the student's safety and whereabouts can be confirmed.
Attendance data and information are recorded and stored automatically, including absence, leaving midway (present in the first session but absent from the second), changing seats, newly entering (absent from the first session but present in the second), and so on.
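Deriving these attendance records from per-session presence observations might look like the following sketch; the status labels and the two-session rule are assumptions based on the description above:

```python
# Hedged sketch: classify a student's attendance from presence in the
# first and second sessions of a class period.
def attendance_status(present_first, present_second):
    if present_first and present_second:
        return "full attendance"
    if present_first and not present_second:
        return "left midway"
    if not present_first and present_second:
        return "newly entered"
    return "absent"

records = {sid: attendance_status(a, b) for sid, (a, b) in {
    "s1": (True, True), "s2": (True, False),
    "s3": (False, True), "s4": (False, False)}.items()}
```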
Data resource preparation: the data used by the method come from the educational administration system and the monitoring machine room. The data sources to be interfaced include teacher and teaching information, at least the name and staff number of the teacher giving the lesson, the class, student numbers, the teaching classroom, the week, the session and the date and time (matched against the online holiday schedule so that clashing lessons are detected adaptively and reminders to reschedule or make up lessons can be issued in advance), face data, and the camera number (corresponding to each classroom). By comparing the class schedule information in the student information with the name of the teacher giving the lesson, the teaching classroom and so on, students not belonging to the class can be distinguished directly on the terminal, so the teacher can better understand attendance in the class.
The algorithm principle is as follows: the algorithm network (the classroom behavior judgment model) is based on the YOLOv5 object detection framework. Stacked CSPNet blocks extract strong semantic features of the targets and fuse them with shallow texture features, which improves the detection of small targets such as taking notes and playing with a mobile phone (targets appear small because the camera views the classroom from a long distance). To improve localization accuracy for small targets, a bottom-up PAN network structure is attached afterwards to propagate the strong localization features of the lower layers upwards. Finally, decoding results are output at the network's different scale layers, yielding the regressed target box coordinates and target confidence.
The algorithm implementation scheme is as follows:
input end: the size of the input image of the network is N × M, and the stage generally includes an image preprocessing stage, i.e., scaling the input image to the input size of the network, and performing normalization and other operations. In the network training stage, in order to improve the robustness and the generic capability of the model, a complex data enhancement technology is adopted during model training, so that the algorithm model is suitable for scenes such as fuzzy scenes, light reflection scenes, complex postures and the like, and a data enhancement strategy (program-driven data diversity or quantity increase) includes but is not limited to: motion blur, random clipping, random rotation (angle-40 °), brightness and color change, random scaling, Mosaic data enhancement, and the like.
Backbone network (Focus + CSP): the original N×M×3 classroom behavior snapshot image is input to the Focus structure, where a slicing operation turns it into an (N/2)×(M/2)×12 feature map; a convolution with 32 kernels then turns it into an (N/2)×(M/2)×32 feature map. CSP structure: the CSP1 structure uses a cross-stage partial network, i.e., the feature map is split into two parts, one part undergoes convolution and the other part is concatenated with the result of that convolution. Fusing the strong semantic features of deep targets with shallow texture features relieves a large amount of inference computation, enhances the learning capability of the CNN (convolutional neural network), and maintains accuracy while reducing weight.
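The Focus slicing operation can be illustrated in pure Python (lists instead of tensors); sampling every other pixel in a 2×2 pattern and stacking the four copies on the channel axis turns an N×M×3 image into an (N/2)×(M/2)×12 feature map:

```python
# Illustration of Focus slicing: each output cell concatenates the channels
# of the four pixels in a 2x2 block, quadrupling channels and halving H/W.
def focus_slice(img):
    n, m = len(img), len(img[0])
    out = []
    for i in range(0, n, 2):
        row = []
        for j in range(0, m, 2):
            # Concatenate the 3 channels of each pixel in the 2x2 block.
            row.append(img[i][j] + img[i + 1][j]
                       + img[i][j + 1] + img[i + 1][j + 1])
        out.append(row)
    return out

img = [[[r, c, 0] for c in range(6)] for r in range(4)]  # a 4×6×3 image
feat = focus_slice(img)                                  # 2×3×12 feature map
```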
Neck network: the CSP2 structure designed with CSPNet (PAN, bottom-up) sits between the backbone network and the head network to strengthen the network's feature fusion capability.
Head network output: multi-scale features are fused to produce the output of the detected behaviors.
S6, generating a classroom atmosphere trend graph according to the classroom performance evaluation result based on a preset time period;
specifically, the classroom performance evaluation result in a classroom in a certain time period can be selected to generate a classroom atmosphere trend graph.
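Aggregating evaluation results into a trend might be sketched as follows; the engagement score (share of students showing positive behaviors per snapshot) is an assumed rule, not specified by the source:

```python
# Hedged sketch: average per-snapshot engagement over preset time buckets
# to obtain a classroom atmosphere trend.
POSITIVE = {"attending", "note_taking"}

def atmosphere_trend(snapshots, bucket_size):
    """snapshots: time-ordered lists of per-student behaviors.
    Returns one average engagement score per bucket of snapshots."""
    trend = []
    for start in range(0, len(snapshots), bucket_size):
        bucket = snapshots[start:start + bucket_size]
        scores = [sum(b in POSITIVE for b in snap) / len(snap)
                  for snap in bucket]
        trend.append(sum(scores) / len(scores))
    return trend

snaps = [["attending", "sleeping"], ["attending", "attending"],
         ["sleeping", "sleeping"], ["note_taking", "sleeping"]]
trend = atmosphere_trend(snaps, bucket_size=2)
```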
S7, analyzing based on the classroom atmosphere trend graph to obtain an analysis result;
and S8, adjusting the classroom instruction effect of the teacher based on the analysis result.
Referring to fig. 5, in a second aspect, the present invention provides a system for objectively evaluating the classroom performance of students based on machine vision, comprising a construction module 1, a generation module 2, a tracking module 3, a filling module 4 and a judgment module 5, connected in sequence;
the building module 1 is used for building a classroom behavior judgment model;
the generation module 2 is used for generating a classroom seat background reference map based on monitoring data;
the tracking module 3 is used for identifying and tracking the face and body data of the student to obtain the seat corresponding relation;
the filling module 4 is used for filling the seat corresponding relation in the classroom seat background reference image to obtain a classroom behavior snapshot image;
and the judgment module 5 is used for inputting the classroom behavior snapshot into the classroom behavior judgment model to judge the classroom performance of the students and obtain a classroom performance evaluation result.
Specifically, the construction module 1 constructs a classroom behavior judgment model; the generation module 2 generates a classroom seat background reference map based on the monitoring data; the tracking module 3 recognizes and tracks the face and body data of the students to obtain the seat correspondence; the filling module 4 fills the seat correspondence into the classroom seat background reference map to obtain a classroom behavior snapshot; and the judgment module 5 inputs the classroom behavior snapshot into the classroom behavior judgment model to judge the classroom performance of the students and obtain a classroom performance evaluation result. The classroom participation, classroom activity, speaking and attendance performance of the students are judged from their face data, solving the problem that the ordinary scores produced by existing methods for evaluating student classroom performance have low authenticity.
While the present invention has been described with reference to a preferred embodiment, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (7)

1. A method for objectively evaluating classroom performance of a student based on machine vision is characterized by comprising the following steps:
constructing a classroom behavior judgment model;
generating a classroom seat background reference map based on the monitoring data;
recognizing and tracking the face and body data of the student to obtain the seat corresponding relation;
filling the seat corresponding relation in the classroom seat background reference picture to obtain a classroom behavior snapshot picture;
and inputting the classroom behavior snapshot picture into the classroom behavior judgment model to judge the classroom performance of the student to obtain a classroom performance evaluation result.
2. The method for machine-vision-based objective assessment of student classroom performance according to claim 1,
the specific mode for constructing the classroom behavior judgment model is as follows:
constructing a behavior analysis network, a training data set and a test data set;
training the behavior analysis network by using the training data set based on a semi-supervised learning mode to obtain a training model;
and evaluating and optimizing the training model by using the test data set to obtain a classroom behavior judgment model.
3. The method for machine vision based objective assessment of student classroom performance of claim 1,
the specific mode for generating the classroom seat background reference map based on the monitoring data is as follows:
acquiring monitoring data;
judging the angle and the aspect ratio of the target area of the monitoring data to obtain a judgment result;
and correcting the image of the target area based on the judgment result to obtain a background reference image.
4. The method for machine vision based objective assessment of student classroom performance of claim 1,
the specific way of identifying and tracking the face and body data of the student to obtain the seat corresponding relation is as follows:
recognizing student face data in a classroom based on the monitoring data;
comparing the student face data with a face database to obtain student information;
recording the clothing and headwear feature information of the corresponding students based on the student information;
tracking the seating behavior and seat numbers of the students based on the clothing and headwear feature information to obtain the seating situation;
and generating the corresponding relation between the student information and the seat based on the seating condition.
5. The method for machine vision based objective assessment of student classroom performance of claim 4,
the student information comprises student names, student numbers and class schedule information.
6. The method for machine-vision-based objective assessment of student classroom performance according to claim 1,
after the step of inputting the classroom behavior snapshot into the classroom behavior judgment model to judge the classroom performance of the student and obtain the classroom performance evaluation result, the method further comprises the following steps:
generating a classroom atmosphere trend graph according to the classroom performance evaluation result based on a preset time period;
analyzing based on the classroom atmosphere trend graph to obtain an analysis result;
and adjusting the classroom tutoring effect of the teacher based on the analysis result.
7. A system for objectively evaluating student classroom performance based on machine vision, which is applied to the method for objectively evaluating student classroom performance based on machine vision according to any one of claims 1 to 6,
the device comprises a construction module, a generation module, a tracking module, a filling module and a judgment module, wherein the construction module, the generation module, the tracking module, the filling module and the judgment module are sequentially connected;
the building module is used for building a classroom behavior judgment model;
the generation module generates a classroom seat background reference map based on the monitoring data;
the tracking module is used for identifying and tracking the face and body data of the student to obtain the seat corresponding relation;
the filling module is used for filling the seat corresponding relation in the classroom seat background reference picture to obtain a classroom behavior snapshot picture;
and the judgment module is used for inputting the classroom behavior snapshot into the classroom behavior judgment model to judge the classroom performance of the students and obtain a classroom performance evaluation result.
CN202210487171.6A 2022-05-06 2022-05-06 Method and system for objectively evaluating classroom performance of student based on machine vision Pending CN115002343A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210487171.6A CN115002343A (en) 2022-05-06 2022-05-06 Method and system for objectively evaluating classroom performance of student based on machine vision


Publications (1)

Publication Number Publication Date
CN115002343A true CN115002343A (en) 2022-09-02

Family

ID=83025616

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210487171.6A Pending CN115002343A (en) 2022-05-06 2022-05-06 Method and system for objectively evaluating classroom performance of student based on machine vision

Country Status (1)

Country Link
CN (1) CN115002343A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105206126A (en) * 2015-10-15 2015-12-30 于璐菁 Auxiliary system and method for taking lessons
CN107609517A (en) * 2017-09-15 2018-01-19 华中科技大学 A kind of classroom behavior detecting system based on computer vision
CN109345156A (en) * 2018-12-12 2019-02-15 范例 A kind of Classroom Teaching system based on machine vision
CN110837795A (en) * 2019-11-04 2020-02-25 防灾科技学院 Teaching condition intelligent monitoring method, device and equipment based on classroom monitoring video
CN111242962A (en) * 2020-01-15 2020-06-05 中国平安人寿保险股份有限公司 Method, device and equipment for generating remote training video and storage medium
CN111666809A (en) * 2020-04-20 2020-09-15 阜阳师范大学 Intelligent system for evaluating classroom performance of students
CN113723250A (en) * 2021-08-23 2021-11-30 华中师范大学 Classroom intelligent analysis method and system for helping teacher to grow up mentally

Similar Documents

Publication Publication Date Title
CN112183238B (en) Remote education attention detection method and system
CN103761894A (en) Interaction classroom implementing method and interactive platform
CN111597305B (en) Entity marking method, entity marking device, computer equipment and storage medium
Shu et al. Research on the learning behavior of university students in blended teaching
CN115810163A (en) Teaching assessment method and system based on AI classroom behavior recognition
CN110660285A (en) Scene self-adaptive customized intelligent training method and system
CN114581271A (en) Intelligent processing method and system for online teaching video
JP2022014890A (en) Concentration determination program
CN111489595B (en) Method and device for feeding back test information in live broadcast teaching process
CN115002343A (en) Method and system for objectively evaluating classroom performance of student based on machine vision
CN114998440B (en) Multi-mode-based evaluation method, device, medium and equipment
KR100971475B1 (en) A system of vitural- reality interview using Stereoscopic Display and interview method of the same
WO2023152584A1 (en) Meeting session control based on attention determination
US20230316949A1 (en) Communication skills training
CN110378261A (en) A kind of student's recognition methods and device
TWM636247U (en) Sports Event Evaluation Device
CN115829234A (en) Automatic supervision system based on classroom detection and working method thereof
CN114882424A (en) Sports item evaluation method, related equipment and computer readable storage medium
CN113240559A (en) Multimedia teaching device is used in hotel management teaching
CN114037358A (en) Academic situation analysis method and system based on college interactive classroom activity data
CN113409635A (en) Interactive teaching method and system based on virtual reality scene
CN114581835A (en) Intelligent video teaching method and system for realizing motion recognition
CN116797090B (en) Online assessment method and system for classroom learning state of student
CN111105651A (en) AR-based waste classification teaching method and system
Takahashi et al. Improvement of detection for warning students in e-learning using web cameras

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination