CN110992229B - Scientific teaching effect evaluation method based on knowledge migration - Google Patents
- Publication number: CN110992229B (Application CN201911259418.3A, China)
- Legal status: Active (an assumption by Google Patents, not a legal conclusion)
Classifications
- G06Q50/205 — Education administration or guidance
- G06F18/214 — Generating training patterns; bootstrap methods, e.g. bagging or boosting
- G06Q10/06393 — Score-carding, benchmarking or key performance indicator [KPI] analysis
Abstract
The invention discloses a scientific teaching effect evaluation method based on knowledge migration, which comprises two main stages: collecting a large amount of data to establish a general model for teaching effect evaluation, and collecting a small amount of data to establish a special model for teaching effect evaluation. The first stage comprises four steps: collecting data and initializing, generating pseudo labels, training a teaching effect prediction classifier, and predicting the teaching effect. The second stage likewise comprises four steps: collecting data and initializing, fine-tuning parameters and calculating a domain difference index, training a teaching effect prediction classifier, and predicting the teaching effect. The invention solves the problems of class imbalance, long training time, and an uncertain smoothness assumption in constructing a teaching effect prediction model for teaching evaluation, and can rapidly obtain a special model adapted to a new environment from only a small amount of data collected there, giving the method a degree of self-adaptability.
Description
Technical Field
The invention relates to the technical field of teaching, in particular to a scientific teaching effect evaluation method based on knowledge migration.
Background
Teaching evaluation is the activity of judging the value of a teaching process and its results against teaching objectives in order to inform teaching decisions; it is a process of judging the actual or potential value of teaching activities. As living standards rise, people pay ever more attention to education, yet the evaluation of teachers' teaching effect in schools mostly considers children's grades, often neglecting other factors, and is therefore not comprehensive.
Patent CN109559260A discloses a teaching effect evaluation system comprising a test module and an evaluation module: the test module examines students' mid-term and final examination scores, while the evaluation module lets students, leaders or colleagues, and students' parents comprehensively score a teacher's teaching method, teaching environment, teaching management, and teaching attitude. Evaluating a teacher's teaching effect thus considers not only students' scores but also the teacher's everyday teaching practice, and the multi-perspective scoring avoids a teacher receiving a negative evaluation for a term solely because students' examination scores are not ideal. Patent CN110457641A discloses a practical teaching effect evaluation system for the classroom: by providing a test-exercise entry and test module, a student evaluation unit, an information entry unit, an evaluation data comparison module, and a data integration and management unit, the results of different students' tests and ratings can be compared with the results of in-class evaluation, tracking actual classroom effectiveness more accurately and in real time, so that the teacher understands students' cognition, experience, and mood during learning and the evaluation results better match reality.
Patent CN110084508A provides a method and device for evaluating classroom teaching effect: electroencephalogram (EEG) data of every classroom participant are collected over the same time period at the same sampling frequency, the correlation between the EEG data of each pair of participants is computed and used as a consistency index for that pair, the consistency index across all participants is then obtained by averaging, and the classroom teaching effect is evaluated through this index.
From the above, although some studies have recognized the one-sidedness of existing teaching effect evaluation, the evaluation is still not sufficiently scientific. On the one hand, the teaching effect is influenced by many factors in complex ways that cannot be captured by simply thresholding a handful of indicators; on the other hand, invasive sensing such as electroencephalography is costly and may itself disturb the teaching process. It is therefore necessary to design a more advanced and scientific teaching effect evaluation method and system.
Disclosure of Invention
The invention provides a scientific teaching effect evaluation method based on knowledge migration, which specifically comprises the following steps:
step 1: general evaluation model building
Collecting a large amount of data, and establishing a general model for teaching effect evaluation, which comprises the following steps:
Step 101: collect teaching content, mode, and effect data; establish a labeled sample set {(x_p, y_p)} and an unlabeled sample set {x_q}, with p ∈ {1, 2, ..., l} and q ∈ {l+1, l+2, ..., n}, where l denotes the number of labeled samples and n the total number of samples. Each sample x_p, x_q ∈ R^15 is a 15-dimensional vector whose features are: teaching subject, depth, whether there is blackboard writing, whether there is multimedia, teaching assessment mode, teacher gender, teacher age, teacher education, average homework time, class size, degree of interaction, attendance rate, student feedback, average score, and language of instruction. y_p ∈ {χ1, χ2, χ3, χ4, χ5} is the label denoting the teaching effect, where χ1, ..., χ5 stand for very poor, poor, normal, good, and very good respectively, and R denotes the real number field.
Manually set the maximum number of training iterations T > 0 (a positive integer) and initialize the training counter t = 0; manually set the robustness factor μ ∈ (0,1), the loss coefficient C > 0, the propagation coefficient α ∈ (0,1), and the number of hidden nodes N (a positive integer larger than 15). Randomly generate N input weights w_1, w_2, ..., w_N, each a 15-dimensional column vector matching the sample dimension, and N input biases b_1, b_2, ..., b_N, each a real number. Let L_p = 0.
Step 102: assign pseudo labels to the unlabeled sample set {x_q} from step 101, as follows:
Step 10201: randomly choose a Gaussian bandwidth coefficient σ > 0 and build the affinity matrix W by

(W)_ij = exp(−‖x_i − x_j‖² / (2σ²)) for i ≠ j, and (W)_ii = 0,

where (W)_ij is the element in row i and column j of W, i, j = 1, 2, ..., n.

Step 10202: compute the final label matrix F:

F = (1 − α)(I − αS)^(−1) Y

where I is the identity matrix; D is the degree matrix of W, a diagonal matrix whose i-th diagonal element is D_ii = Σ_j (W)_ij; the propagation matrix is S = D^(−1/2) W D^(−1/2); and Y is the initial label matrix with elements

Y_pr = 1 if y_p = χ_r and Y_pr = 0 otherwise (all rows for unlabeled samples are zero),

where r = 1, 2, 3, 4, 5. The pseudo label of sample x_q is then χ_r* with r* = arg max_r F_qr, where F_qr is the element in row q and column r of the matrix F.
Step 103: train the teaching effect prediction classifier, as follows:
Step 10301: for r = 1, 2, 3, 4, 5 in turn, train the base classifier f_r corresponding to class χ_r, as follows. Take from the labeled and pseudo-labeled sets all n_r samples whose label or pseudo label is χ_r, obtaining a sample set; from it, the classifier for class χ_r is

f_r(z) = 1 if h(z)^T β_r ≥ ξ_r, and 0 otherwise,

where z denotes a sample to be classified and the output weights are

β_r = (I/C + H^T H)^(−1) H^T e,

in which I is the identity matrix, C > 0 is the loss coefficient, and e is an n_r-dimensional all-ones column vector (every sample in the set belongs to class χ_r);

h(x) = [G(w_1, b_1, x), ..., G(w_N, b_N, x)]^T

is the nonlinear mapping, G(w, b, x) is the activation function, and H = [h(x_1), ..., h(x_{n_r})]^T is the hidden-layer output matrix. The offset threshold is

ξ_r = Ξ_⌊μ·n_r⌋, with Ξ = max2min(h(x_1)^T β_r, ..., h(x_{n_r})^T β_r),

where ⌊·⌋ is the floor function, the function max2min(·) sorts its input sequence from largest to smallest and outputs the sorted sequence, Ξ_i is the i-th element of Ξ, and μ ∈ (0,1) is the robustness factor.
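Under the reading that each base classifier is an ELM-style randomized single-hidden-layer network (consistent with the H, β, and max2min descriptions above), step 10301 can be sketched as follows. The cos activation is one of the options given later in the text; the all-ones target vector and the exact β formula are reconstructions, not verbatim from the patent:

```python
import numpy as np

def train_base_classifier(X, t, w, b, C=1.0, mu=0.5):
    """ELM-style base classifier for one class chi_r (step 10301, assumed form).

    X : (n_r, d) samples whose label or pseudo label is chi_r
    t : (n_r,) targets (all ones here, since every sample belongs to chi_r)
    w : (N, d) random input weights; b : (N,) random input biases
    """
    H = np.cos(X @ w.T + b)                     # hidden-layer outputs, G = cos
    N = w.shape[0]
    # Regularized least squares: beta = (I/C + H^T H)^(-1) H^T t
    beta = np.linalg.solve(np.eye(N) / C + H.T @ H, H.T @ t)
    out = H @ beta                              # outputs on the training samples
    # Robust threshold: the floor(mu * n_r)-th largest training output.
    xi = np.sort(out)[::-1][int(np.floor(mu * len(out))) - 1]
    return beta, xi

def classify(z, w, b, beta, xi):
    """Binary decision: 1 if the network output clears the threshold xi."""
    return 1 if np.cos(z @ w.T + b) @ beta >= xi else 0
```

Because the threshold is taken at the ⌊μ·n_r⌋-th largest training output rather than at zero, at least that many of the class-χ_r training samples are guaranteed to be accepted, which is one way to read the "robustness factor" μ.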
Step 10302: integrate the base classifiers f_1, ..., f_5 obtained in step 10301 as follows. Input a sample z into f_1, ..., f_5; if exactly one classifier outputs 1, the final classification result is the class corresponding to that classifier; in every other case, the final classification result is the class corresponding to the minimum element of {ξ_r(z), r = 1, 2, 3, 4, 5}, where ξ_r(z) denotes the threshold margin ξ_r − h(z)^T β_r.
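The one-vs-rest integration rule of step 10302 can be sketched as below; the semantics of the margin values (smaller ξ_r(z) means closer to class χ_r) is an assumed reading of the tie-breaking rule:

```python
def integrate(outputs, margins):
    """One-vs-rest integration (step 10302, a sketch).

    outputs : five binary decisions from the base classifiers
              (1 = "sample belongs to class chi_r").
    margins : five values xi_r(z); the class with the minimum value
              wins whenever the vote is ambiguous (assumed semantics).
    Returns the 0-based index r of the predicted class.
    """
    positives = [r for r, o in enumerate(outputs) if o == 1]
    if len(positives) == 1:
        return positives[0]                      # exactly one classifier fired
    # Zero or several classifiers fired: fall back to the minimum margin.
    return min(range(len(margins)), key=lambda r: margins[r])
```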
Step 10303: using the classifier integration of step 10302, predict on {x_i, i = 1, 2, ..., n} to obtain the results {δ_i, i = 1, 2, ..., n}; let δ = [δ_1, ..., δ_n]^T and compute L_c = δ^T L δ, where L = D − W is the graph Laplacian matrix. If L_c < L_p or t = 0, retain the current set of base classifiers, i.e. let f̂_r = f_r for r = 1, ..., 5, and let L_p ← L_c. Increase t by 1; if t ≤ T, jump to step 102, otherwise stop training and proceed to the next step.
Step 104: predict the teaching effect with the trained classifiers f̂_1, ..., f̂_5 and the integration rule described in step 10302.
Step 2, establishing a special evaluation model
A small amount of data is collected, and a special model for teaching effect evaluation is established as follows:
Step 201: collect teaching content, mode, and effect data in the new environment; establish a labeled sample set {(x_s, y_s)}, s ∈ {n+1, ..., n+l_2}, where l_2 denotes the number of labeled samples collected for establishing the special teaching effect evaluation model. Each x_s ∈ R^15 is a 15-dimensional vector whose features are the same as those in step 101; y_s ∈ {χ1, χ2, χ3, χ4, χ5} is the label, with χ1, ..., χ5 standing for very poor, poor, normal, good, and very good respectively, and R denotes the real number field.
Manually set the maximum number of training iterations T_2 > 0 (a positive integer) and initialize t = 0; manually set the migration trade-off coefficient γ > 0; let O_p = 0.
Step 202: based on training in step 1Trimming w1,w2,...,wNAnd b1,b2,...,bNObtaining an adjusted set of base classifiersBy usingPerforming integrated predictionsTo obtainIn thatHas an error rate of a e [0,1 ]](ii) a The integrated prediction rules are as follows:
inputting a sample z intoIf only one classifier outputs 1, the final classification result is the class corresponding to the classifier; for other cases, then The category corresponding to the minimum element in the classification is the final classification result,is xir(z) adjusting w1,w2,...,wNAnd b1,b2,...,bNThe latter result;
Step 203: calculate the domain difference index Ω, which measures the distribution difference between the source-domain samples of step 101 and the new samples of step 201.

Step 204: calculate the current migration index O_c = Ω + γ·a, where γ > 0 is the migration trade-off coefficient. If O_c < O_p or t = 0, let f̂_r ← f̃_r for r = 1, ..., 5 and let O_p ← O_c. Increase t by 1; if t ≤ T_2, jump to step 202; otherwise stop training and proceed to the next step.
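The accept/reject loop of steps 202–204 can be sketched as a random-perturbation search. The `evaluate(w, b) -> (omega, a)` callback signature is an assumption of this sketch: it stands in for computing the domain difference index Ω of step 203 and the error rate a of step 202:

```python
import numpy as np

def transfer_finetune(evaluate, w, b, T2=20, gamma=1.0,
                      sigma_w=0.1, sigma_b=0.1, seed=0):
    """Random-perturbation transfer loop (steps 202-204, a sketch).

    evaluate(w, b) -> (omega, a): domain-difference index and error
    rate on the new labeled set, both computed by the caller.
    Keeps a perturbation only when O_c = omega + gamma*a improves.
    """
    rng = np.random.default_rng(seed)
    best_w, best_b, best_o = w, b, None
    for _ in range(T2):
        # Step 202: Gaussian fine-tuning around the retained parameters.
        cand_w = best_w + np.sqrt(sigma_w) * rng.standard_normal(np.shape(best_w))
        cand_b = best_b + np.sqrt(sigma_b) * rng.standard_normal(np.shape(best_b))
        omega, a = evaluate(cand_w, cand_b)
        o_c = omega + gamma * a            # step 204: migration index O_c
        if best_o is None or o_c < best_o:
            best_w, best_b, best_o = cand_w, cand_b, o_c   # retain improvement
    return best_w, best_b
```

Because only improving candidates are retained, the migration index O_c of the kept parameters is non-increasing over the T_2 iterations, mirroring the O_c < O_p test of step 204.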
Step 205: predict the teaching effect with the trained classifiers f̂_1, ..., f̂_5 and the integration rule described in step 202.
The activation function involved may be taken as the sigmoid function

G(w, b, x) = 1 / (1 + exp(−(w^T x + b)))

or

G(w, b, x) = cos(w^T x + b).
The fine-tuning method involved in step 202 is as follows: denote the fine-tuned w_1, w_2, ..., w_N and b_1, b_2, ..., b_N by w̃_1, ..., w̃_N and b̃_1, ..., b̃_N. Each w̃_τ is drawn at random from N(w_τ, Σ_w), the Gaussian distribution with mean w_τ and covariance Σ_w, and each b̃_τ is drawn at random from N(b_τ, σ_b), the Gaussian distribution with mean b_τ and variance σ_b, for τ ∈ {1, 2, ..., N}, where Σ_w = σ_w·I, I being the identity matrix of the same dimension as w_τ, and σ_w, σ_b > 0.
Compared with the prior art, the invention has the following advantages: it solves the problems of class imbalance, long training time, and an uncertain smoothness assumption in constructing a teaching effect prediction model for teaching evaluation, and it can rapidly obtain a special model adapted to a new environment from only a small amount of data collected there, giving the method a degree of self-adaptability.
Drawings
FIG. 1 is a flow chart of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in detail with reference to the accompanying drawings and specific embodiments.
A scientific teaching effect evaluation method based on knowledge migration is shown in FIG. 1 and follows the steps set out above in the Disclosure of Invention: step 1 (general evaluation model establishment, steps 101-104) and step 2 (special evaluation model establishment, steps 201-205), together with the activation functions and the fine-tuning method described there.
In carrying out this patent, example feature values are given below:
1. "Teaching subject": Chinese, mathematics, foreign language, etc.; finer-grained subjects such as discrete mathematics or combinatorial mathematics may also be used.
2. "Depth": introductory, intermediate, or advanced.
3. "Whether there is blackboard writing": yes or no.
4. "Whether there is multimedia": yes or no.
5. "teaching assessment mode": examination + attendance, reporting + attendance, attendance only.
6. "teacher gender": male, female, and others.
7. "teacher age": taking a positive integer.
8. "Teacher education": doctorate, master's, bachelor's, high school or below.
9. "Average homework time": a real number greater than or equal to 0.
10. "Class size": a positive integer (the number of students in the class).
11. "degree of interaction": active, general, no interaction.
12. "Attendance rate": a real number between 0 and 1. The absence rate of each class session is (number of students absent from the session / number of students in the session), and the attendance rate is one minus the average of the per-session absence rates.
13. "Student feedback": excellent, average, or poor; take the mode of the collected feedback.
14. "Average score": an integer between 0 and 100.
15. "language in class": the language used in lessons includes Chinese, English, etc.
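For concreteness, a sketch of how one 15-dimensional sample from step 101 might be packed into a numeric vector. All category codes below are illustrative assumptions; the patent does not prescribe an encoding:

```python
# Illustrative numeric codes for the categorical features (assumptions).
SUBJECT = {"Chinese": 0, "mathematics": 1, "foreign language": 2}
DEPTH = {"introductory": 0, "intermediate": 1, "advanced": 2}
ASSESSMENT = {"exam+attendance": 0, "report+attendance": 1, "attendance only": 2}

def encode_sample(subject, depth, board, multimedia, assessment,
                  gender, age, degree, homework_hours, class_size,
                  interaction, attendance, feedback, avg_score, language):
    """Pack the 15 features of step 101 into one numeric vector."""
    assert 0.0 <= attendance <= 1.0 and 0 <= avg_score <= 100
    return [SUBJECT[subject], DEPTH[depth], int(board), int(multimedia),
            ASSESSMENT[assessment], gender, age, degree, homework_hours,
            class_size, interaction, attendance, feedback, avg_score,
            language]
```

A labeled sample then pairs such a vector with an expert-assigned teaching-effect label χ_1 to χ_5.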
The teaching effect labels in the sample sets can be assigned by expert evaluation.
In the implementation process, the general teaching effect evaluation model requires a large amount of data for training; once trained, it can be used directly. When applied to a new environment (such as a new school), the prediction performance of the general model is often poor; a special teaching effect evaluation model can then be obtained by migration on the basis of the general model, requiring only a small amount of newly collected data, and can be used directly once training is complete.
The above examples are provided only to describe the present invention and are not intended to limit its scope, which is defined by the appended claims. Various equivalent substitutions and modifications can be made without departing from the spirit and principles of the invention and are intended to fall within its scope.
Claims (3)
1. A scientific teaching effect evaluation method based on knowledge migration is characterized by specifically comprising the following steps:
step 1: general evaluation model building
Collecting a large amount of data, and establishing a general model for teaching effect evaluation, which comprises the following steps:
step 101: collect teaching content, mode, and effect data; establish a labeled sample set {(x_p, y_p)} and an unlabeled sample set {x_q}, with p ∈ {1, 2, ..., l} and q ∈ {l+1, l+2, ..., n}, where l denotes the number of labeled samples and n the total number of samples; each sample x_p, x_q ∈ R^15 is a 15-dimensional vector whose features are: teaching subject, depth, whether there is blackboard writing, whether there is multimedia, teaching assessment mode, teacher gender, teacher age, teacher education, average homework time, class size, degree of interaction, attendance rate, student feedback, average score, and language of instruction; y_p ∈ {χ1, χ2, χ3, χ4, χ5} is the label denoting the teaching effect, where χ1, ..., χ5 stand for very poor, poor, normal, good, and very good respectively, and R denotes the real number field;
manually set the maximum number of training iterations T > 0 (a positive integer) and initialize the training counter t = 0; manually set the robustness factor μ ∈ (0,1), the loss coefficient C > 0, the propagation coefficient α ∈ (0,1), and the number of hidden nodes N (a positive integer larger than 15); randomly generate N input weights w_1, w_2, ..., w_N, each a 15-dimensional column vector matching the sample dimension, and N input biases b_1, b_2, ..., b_N, each a real number; let L_p = 0;
step 102: assign pseudo labels to the unlabeled sample set {x_q} from step 101, as follows:
step 10201: randomly choose a Gaussian bandwidth coefficient σ > 0 and build the affinity matrix W by

(W)_ij = exp(−‖x_i − x_j‖² / (2σ²)) for i ≠ j, and (W)_ii = 0,

where (W)_ij is the element in row i and column j of W, i, j = 1, 2, ..., n;

step 10202: compute the final label matrix F:

F = (1 − α)(I − αS)^(−1) Y

where I is the identity matrix; D is the degree matrix of W, a diagonal matrix whose i-th diagonal element is D_ii = Σ_j (W)_ij; the propagation matrix is S = D^(−1/2) W D^(−1/2); and Y is the initial label matrix with elements

Y_pr = 1 if y_p = χ_r and Y_pr = 0 otherwise (all rows for unlabeled samples are zero),

where r = 1, 2, 3, 4, 5; the pseudo label of sample x_q is χ_r* with r* = arg max_r F_qr, where F_qr is the element in row q and column r of the matrix F;
step 103: train the teaching effect prediction classifier, as follows:
step 10301: for r = 1, 2, 3, 4, 5 in turn, train the base classifier f_r corresponding to class χ_r, as follows: take from the labeled and pseudo-labeled sets all n_r samples whose label or pseudo label is χ_r, obtaining a sample set; from it, the classifier for class χ_r is

f_r(z) = 1 if h(z)^T β_r ≥ ξ_r, and 0 otherwise,

where z denotes a sample to be classified and the output weights are

β_r = (I/C + H^T H)^(−1) H^T e,

in which I is the identity matrix, C > 0 is the loss coefficient, and e is an n_r-dimensional all-ones column vector;

h(x) = [G(w_1, b_1, x), ..., G(w_N, b_N, x)]^T

is the nonlinear mapping, G(w, b, x) is the activation function, and H = [h(x_1), ..., h(x_{n_r})]^T is the hidden-layer output matrix; the offset threshold is

ξ_r = Ξ_⌊μ·n_r⌋, with Ξ = max2min(h(x_1)^T β_r, ..., h(x_{n_r})^T β_r),

where ⌊·⌋ is the floor function, the function max2min(·) sorts its input sequence from largest to smallest and outputs the sorted sequence, Ξ_i is the i-th element of Ξ, and μ ∈ (0,1) is the robustness factor;
step 10302: integrate the base classifiers f_1, ..., f_5 obtained in step 10301 as follows: input a sample z into f_1, ..., f_5; if exactly one classifier outputs 1, the final classification result is the class corresponding to that classifier; in every other case, the final classification result is the class corresponding to the minimum element of {ξ_r(z), r = 1, 2, 3, 4, 5}, where ξ_r(z) denotes the threshold margin ξ_r − h(z)^T β_r;
step 10303: using the classifier integration of step 10302, predict on {x_i, i = 1, 2, ..., n} to obtain the results {δ_i, i = 1, 2, ..., n}; let δ = [δ_1, ..., δ_n]^T and compute L_c = δ^T L δ, where L = D − W is the graph Laplacian matrix; if L_c < L_p or t = 0, retain the current set of base classifiers, i.e. let f̂_r = f_r for r = 1, ..., 5, and let L_p ← L_c; increase t by 1; if t ≤ T, jump to step 102, otherwise stop training and proceed to the next step;
step 104: predict the teaching effect with the trained classifiers f̂_1, ..., f̂_5 and the integration rule described in step 10302.
Step 2: establishing the special evaluation model
A small amount of data is collected, and the special model for teaching-effect evaluation is established as follows:
step 201: collect teaching content, mode, and effect data, and establish a labeled sample set {(x_s, y_s)}, s ∈ {n+1, ..., n+l_2}, where l_2 denotes the total number of labeled samples collected for establishing the special teaching-effect evaluation model; x_s ∈ R^15 is a 15-dimensional vector whose features are consistent with those involved in step 101; y_s ∈ {χ_1, χ_2, χ_3, χ_4, χ_5} denotes a label, with χ_1, ..., χ_5 respectively denoting the five grades from poor to excellent; R denotes the real-number domain;
manually set the maximum number of training iterations T_2 > 0 (a positive integer) and the training counter t = 0; manually set the migration trade-off coefficient γ > 0 and let O_p = 0;
step 202: based on the ξ*_r trained in step 1, fine-tune w_1, w_2, ..., w_N and b_1, b_2, ..., b_N to obtain an adjusted set of base classifiers ξ̃_1, ..., ξ̃_5; use them to perform integrated prediction on the labeled sample set, obtaining an error rate a ∈ [0, 1] on that set. The integrated prediction rule is as follows:
input a sample z into ξ̃_1, ..., ξ̃_5; if exactly one classifier outputs 1, the final classification result is the class corresponding to that classifier; otherwise, the class corresponding to the minimum element of {ξ̃_r(z), r = 1, ..., 5} is the final classification result, where ξ̃_r is ξ_r after adjusting w_1, w_2, ..., w_N and b_1, b_2, ..., b_N;
step 203: calculate the domain-difference index Ω as follows:
step 204: calculate the current migration index O_c = Ω + γa, where γ > 0 is the migration trade-off coefficient. If O_c < O_p or t = 0, retain the current adjusted set of base classifiers; let O_p ← O_c and increase t by 1; if t ≤ T_2, jump to step 202, otherwise stop training and enter the next step;
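The O_p/O_c bookkeeping of step 204 amounts to keeping the classifier set with the smallest migration index O_c = Ω + γa seen so far. A sketch, under the assumption that each training round yields an (Ω, a) pair for its candidate classifier set:

```python
def select_by_migration_index(candidates, gamma=1.0):
    # candidates: per-round tuples (Omega, a, classifier_set), where Omega
    # is the domain-difference index and a the error rate on the labeled
    # target-domain set. Mirrors the O_p / O_c loop of step 204.
    best, O_p = None, None
    for t, (Omega, a, clf) in enumerate(candidates):
        O_c = Omega + gamma * a        # current migration index
        if t == 0 or O_c < O_p:        # retain on first round or improvement
            best, O_p = clf, O_c
    return best, O_p

rounds = [(0.5, 0.4, "round0"), (0.2, 0.1, "round1"), (0.3, 0.5, "round2")]
best, O_p = select_by_migration_index(rounds, gamma=1.0)
```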
3. The method for evaluating the scientific teaching effect based on knowledge migration as claimed in claim 1, wherein the fine-tuning method involved in step 202 is as follows:
let the fine-tuned w_1, w_2, ..., w_N and b_1, b_2, ..., b_N be w̃_1, ..., w̃_N and b̃_1, ..., b̃_N respectively, where w̃_τ is randomly drawn from N(w_τ, Σ_w) and b̃_τ is randomly drawn from N(b_τ, σ_b²), τ ∈ {1, 2, ..., N}; N(w_τ, Σ_w) denotes a Gaussian distribution with mean w_τ and covariance Σ_w, N(b_τ, σ_b²) denotes a Gaussian distribution with mean b_τ and variance σ_b², Σ_w = σ_w I_N, I_N is the N-dimensional identity matrix, and σ_w, σ_b > 0.
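The claim-3 fine-tuning draw can be sketched as follows, with Σ_w = σ_w·I so each weight coordinate gets variance σ_w (hence standard deviation √σ_w); the parameter defaults are illustrative:

```python
import numpy as np

def finetune_draw(W, b, sigma_w=0.05, sigma_b=0.05, rng=None):
    # Draw tilde-w_tau ~ N(w_tau, sigma_w * I) and tilde-b_tau ~
    # N(b_tau, sigma_b^2): a random jitter of the frozen hidden-layer
    # parameters, to be kept or discarded via the step-204 migration index.
    rng = np.random.default_rng(rng)
    W_t = W + np.sqrt(sigma_w) * rng.standard_normal(W.shape)
    b_t = b + sigma_b * rng.standard_normal(b.shape)
    return W_t, b_t

W = np.zeros((4, 3))
b = np.zeros(4)
W_t, b_t = finetune_draw(W, b, rng=0)
```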
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911259418.3A CN110992229B (en) | 2019-12-10 | 2019-12-10 | Scientific teaching effect evaluation method based on knowledge migration |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110992229A CN110992229A (en) | 2020-04-10 |
CN110992229B true CN110992229B (en) | 2021-02-26 |
Family
ID=70091967
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911259418.3A Active CN110992229B (en) | 2019-12-10 | 2019-12-10 | Scientific teaching effect evaluation method based on knowledge migration |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110992229B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102686106B1 (en) | 2022-03-30 | 2024-07-19 | 한국과학기술기획평가원 | Method for analyzing impact of science and technology training |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10521734B2 (en) * | 2018-04-20 | 2019-12-31 | Sas Institute Inc. | Machine learning predictive labeling system |
US10599769B2 (en) * | 2018-05-01 | 2020-03-24 | Capital One Services, Llc | Text categorization using natural language processing |
CN109781411B (en) * | 2019-01-28 | 2020-05-19 | 西安交通大学 | Bearing fault diagnosis method combining improved sparse filter and KELM |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Purwanto | Education research quantitative analysis for little respondents: comparing of Lisrel, Tetrad, GSCA, Amos, SmartPLS, WarpPLS, and SPSS | |
Naser et al. | Predicting student performance using artificial neural network: In the faculty of engineering and information technology | |
CN112508334B (en) | Personalized paper grouping method and system integrating cognition characteristics and test question text information | |
Ozberk et al. | Investigating the factors affecting Turkish students' PISA 2012 mathematics achievement using hierarchical linear modeling | |
Šarić-Grgić et al. | Student clustering based on learning behavior data in the intelligent tutoring system | |
Beytekin et al. | Quality of Faculty Life and Lifelong Learning Tendencies of University Students. | |
Alves | Making diagnostic inferences about student performance on the Alberta education diagnostic mathematics project: An application of the Attribute Hierarchy Method | |
Tu et al. | A polytomous model of cognitive diagnostic assessment for graded data | |
Zhang et al. | Formative evaluation of college students’ online English learning based on learning behavior analysis | |
CN110992229B (en) | Scientific teaching effect evaluation method based on knowledge migration | |
Turhan et al. | Estimation of student success with artificial neural networks | |
Huang et al. | Developing argumentation processing agents for computer-supported collaborative learning | |
CN113934846A (en) | Online forum topic modeling method combining behavior-emotion-time sequence | |
Shapovalova et al. | Adaptive testing model as the method of quality knowledge control individualizing | |
Guldemond et al. | Group effects on individual learning achievement | |
Akdeniz et al. | Investigating individual innovativeness levels and lifelong learning tendencies of students in TMSC | |
Zhou | Research on teaching resource recommendation algorithm based on deep learning and cognitive diagnosis | |
Zeman et al. | Complex cells decrease errors for the Müller-Lyer illusion in a model of the visual ventral stream | |
Suniasih | The Effectiveness of Discovery Learning Model and Problem-Based Learning Using Animated Media to Improve Science Learning Outcomes | |
Shi | Building a Diversified College English Teaching Evaluation Model Using Fuzzy K-means Clustering in E-Learning | |
Komaravalli et al. | Detecting Academic Affective States of Learners in Online Learning Environments Using Deep Transfer Learning | |
Sun | A Comprehensive Evaluation Scheme of Students’ Classroom Learning Status Based on Analytic Hierarchy Process | |
Pagudpud et al. | Mining the national career assessment examination result using clustering algorithm | |
Chen | The Role of Information Convergence Technology in Reshaping the Multiple Directions of Ideological and Political Education in Colleges and Universities | |
McCallum et al. | Using data for monitoring and target setting: A practical guide for teachers |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CP03 | Change of name, title or address |
Address after: Zone 50268, Zhongke Dadaozhen Building, No. 767 Yulan Avenue, High tech Zone, Hefei City, Anhui Province, 230088; Patentee after: Anhui Xinzhi Digital Technology Co.,Ltd.
Address before: 230088 building 210-c2, A3 / F, Hefei Innovation Industrial Park, 800 Wangjiang West Road, high tech Zone, Hefei City, Anhui Province; Patentee before: Anhui Xinzhi digital media information technology Co.,Ltd.