WO2003050782A1 - Exercise setting system - Google Patents

Exercise setting system Download PDF

Info

Publication number
WO2003050782A1
WO2003050782A1 (PCT/JP2002/012811, JP0212811W)
Authority
WO
WIPO (PCT)
Prior art keywords
user
question
level
answer
exercise
Prior art date
Application number
PCT/JP2002/012811
Other languages
French (fr)
Japanese (ja)
Inventor
Makoto Ito
Keisuke Takaishi
Original Assignee
Hogakukan Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hogakukan Co., Ltd. filed Critical Hogakukan Co., Ltd.
Priority to AU2002361082A priority Critical patent/AU2002361082A1/en
Priority to JP2003551760A priority patent/JPWO2003050782A1/en
Publication of WO2003050782A1 publication Critical patent/WO2003050782A1/en

Links

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B7/00: Electrically-operated teaching apparatus or devices working with questions and answers
    • G09B7/02: Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student

Definitions

  • The present invention is an exercise setting system that helps students preparing for various qualification examinations, university entrance examinations, and the like to understand what they have studied and to confirm their degree of understanding of it. Background art
  • To measure how well the studied content has been understood and to consolidate it, students repeatedly solve exercises on the material they have completed and take practice tests. In particular, a student preparing for a qualification examination or the like may work through a collection of questions that imitates the real examination, or through questions actually asked in past examinations, so-called past questions. After solving such a collection, the student scores himself or herself using the accompanying answer key.
  • The object of the present invention is to provide an exercise setting system that sets, within the learner's available time, only as many exercises as should be solved for the learner's purpose, such as repeatedly reviewing completed material or judging the learner's ability at that point in time. Disclosure of the invention
  • A first invention comprises a display, a storage unit that stores user information and question information, an input unit for entering information, and a processing unit that specifies questions based on the information entered from the input unit and the question information in the storage unit.
  • The user information stored in the storage unit includes the user's ability level.
  • The question information consists of the question itself and attribute information of the question.
  • The attribute information of each question includes the difficulty level of the question and a standard answer time set for each question. The processing unit is characterized by having a function of selecting, from among the questions corresponding to the user's ability level, questions such that the sum of their standard answer times fits within the desired exercise time specified by the user.
  • The user's ability level may be determined in any manner. For example, it may be determined automatically from the learner's correct answer rate or from the results on a specific group of questions, or the user may declare it himself or herself. Any number of levels may be defined.
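  • As a minimal sketch of the data model implied above (purely illustrative; the field names and level values are assumptions, not taken from the patent), the stored question attributes and user information might look like this:

```python
from dataclasses import dataclass

@dataclass
class Question:
    """One exercise question together with the attribute information described above."""
    question_id: int
    text: str
    difficulty: str            # e.g. "a", "b", "c" (ascending difficulty)
    importance: str            # e.g. "a" (most important) .. "e"
    standard_answer_time: int  # seconds a passing examinee should need
    correct_answer: str
    explanation: str

@dataclass
class User:
    """User information kept in the storage unit."""
    user_id: str
    ability_level: str         # e.g. "A" (elementary), "B", "C" (advanced)
    history: list              # past questions set, answers, rates, etc.
```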
  • A second invention further comprises a user-side terminal. The processing unit has a function of displaying the selected questions on the user-side terminal, a function of calculating an answer rate and a correct answer rate for the answers to the questions entered from the user-side terminal, and a function of setting the user's ability level based on the calculated answer rate and correct answer rate and storing it in the storage unit as user information.
  • A third invention is characterized in that, when a desired number of exercise questions is entered from the user-side terminal, the processing unit has a function of matching the number of selected questions to that desired number.
  • A fourth invention is characterized in that a question ratio for each difficulty level, corresponding to each user ability level, is stored in the storage unit, and the processing unit has a function of selecting the questions to be set based on the question ratio corresponding to the user's ability level.
  • A fifth invention is characterized in that the question attributes include one or more of a question format classification, a question range classification, and an importance level.
  • The question range classification corresponds to the curriculum and resembles the table of contents of a textbook used by a user preparing for a specific examination.
  • A sixth invention is characterized in that the processing unit determines the difficulty level of each question based on the average correct answer rate of all users for that question and stores it as question information.
  • A seventh invention is characterized in that the range in which the all-user correct answer rate for a question is at or above a preset reference value is divided into a plurality of difficulty levels.
  • An eighth invention is characterized by comprising a confidence input means for entering, together with the answer to a question, the user's degree of confidence in that answer, and in that the processing unit has a function of associating the answer with the confidence and storing them in the storage unit.
  • A ninth invention is characterized in that the processing unit has a function of displaying, as the user's exercise result, a list of the correctness of each answer together with the user's confidence.
  • A tenth invention is characterized in that the processing unit has a function of extracting questions based on the degree of confidence.
  • An eleventh invention is characterized in that the storage unit stores trainer comments, each associated with a condition defined in terms of the correctness of answers and the degree of confidence, and the processing unit has a function of comparing the correctness and confidence in the user's exercise results with those conditions and extracting the corresponding trainer comment.
  • A twelfth invention is characterized in that the processing unit has a function of calculating and displaying, as the user's exercise result and for each difficulty level, the assured-answer rate, which is the ratio of the number of answers the user got right with confidence to the total number of correct answers.
  • A thirteenth invention, based on the fifth invention, is characterized in that the importance level of questions with a low difficulty level is set high.
  • FIG. 1 is a diagram showing the overall configuration of the system of the first embodiment.
  • FIG. 2 is a flowchart showing a procedure for using the system of the first embodiment.
  • FIG. 3 is an answer time ratio table, by difficulty level, of the first embodiment.
  • FIG. 4 is a diagram illustrating the overall configuration of the system according to the second embodiment.
  • FIG. 5 is a diagram showing a comment table of the second embodiment.
  • FIG. 6 is a flowchart showing a procedure for using the system of the second embodiment.
  • FIG. 7 is a diagram showing the configuration of the result display screen of the second embodiment.
  • FIG. 8 is a diagram showing a result table of the second embodiment.
  • FIG. 9 is a diagram showing an answer table of the second embodiment.
  • FIG. 10 is a diagram showing a question selection screen according to the second embodiment.
  • FIG. 11 is a diagram showing a question screen of the second embodiment.
  • FIG. 12 is a table showing the user's exercise results of the second example.
  • FIG. 13 is a table showing the answers and the degrees of confidence of the users of the second embodiment.
  • The system of the first embodiment shown in FIGS. 1 to 3 comprises a management center 5 and a plurality of user-side terminals 1 connected to it via a communication network N such as the Internet.
  • the management center 5 is a provider of exercises, such as a school or a prep school.
  • the above-mentioned user is a member of the system and a learner.
  • The user-side terminal 1 is a personal computer or the like and includes the display and the input unit of the present invention.
  • The management center 5 is provided with a question database 2 that stores question information, a user database 3 that stores user information, and a processing unit 4 that processes the data.
  • The question database 2 and the user database 3 correspond to the storage unit of the present invention.
  • Specifically, a server having the functions of the databases 2 and 3 and the processing unit 4 may be installed in the management center 5. However, the processing unit 4 and the databases 2 and 3 need not be installed together inside the management center 5.
  • The server provided with the processing unit 4 and the databases may be installed in separate locations. In short, it is only necessary that the processing unit 4 and databases 2 and 3 able to exchange data with it are provided.
  • Instead of the user-side terminal 1, the user may also use this system through an input unit and a display connected directly to the processing unit 4 of the management center 5.
  • The question database 2 stores exercise questions. These are questions for various examinations, for example bar examinations, patent attorney examinations, university entrance examinations, or corporate training.
  • A question database 2 may be provided for each type of examination, or each question may be stored in a single database with an attribute naming its examination type (e.g., an 'XX examination' attribute).
  • Each question is associated with the standard answer time required to solve it, the question's difficulty level and importance level, the correct answer, an explanation of the content, and so on.
  • The standard answer time may be a time determined from the times in which users of this system actually answered, or it may be the time within which the question ought to be solved in order to pass the actual examination.
  • The difficulty level is a measure of how hard it is for users to answer the question correctly.
  • In practice it is determined from the correct answer rate for the question. For example, a question with a high correct answer rate among all answerers can be defined as a low-difficulty question, and a question with a low correct answer rate among all answerers as a high-difficulty question.
  • The importance level expresses how important a question is for the purpose of improving the user's ability, and what counts as important can be set freely in the system. For example, it can be determined from the likelihood of the question appearing in the examination as predicted from the frequency of past questions, from the difficulty of the question, from whether the question is basic or applied, or from various other criteria.
  • The difficulty and importance levels help the user learn efficiently. For example, a low-difficulty question can be answered correctly by almost anyone, so the user should at least be able to do that, whereas a high-difficulty question that most people cannot answer may be deferred.
  • When learning time is limited, these levels can also serve as criteria for deciding the order of study, such as tackling the most important questions first.
  • The question attributes also include a format classification and a content classification of the questions.
  • The format classification describes how a question is asked or answered: for example, a multiple-choice format in which the correct answer is selected from several candidate answers, a sorting format in which sentences are put in order, a fill-in format in which blanks in the question are completed, and a counting format in which the number of correct or incorrect statements is answered.
  • The content classification of the questions includes the examination type, the subject, and ranges that follow the curriculum of each subject. Specifically, when the examination type is the bar examination, the subjects are 'Constitutional law', 'Civil law', 'Criminal law', and so on; the ranges under the Constitution are, for example, 'Rights and duties of the people' and 'The Diet'; and, more specifically within 'Rights and duties of the people', there are 'persons who enjoy human rights', 'the duty to pay taxes', and 'property rights'.
  • How the ranges and the format classification are defined can be decided freely by the management center 5, but it is preferable to use classifications and expressions that are easy to understand for users studying for the specific examination, for example for users studying for the bar examination.
  • The user database 3 stores, as user information, in addition to basic attributes such as name, address, age, and gender, the learning history of each user who has learned with the system of the present invention.
  • This learning history can include when and which questions were set, the contents of the user's answers to those questions, their correctness, the correct answer rate, and the score. The user's ability level is also stored.
  • A user ID and a password are set for each user, and the user is identified by this ID and password.
  • In step S1, the user accesses the home page of the exercise system from his or her own terminal 1.
  • The user then enters the user ID and password (step S2).
  • In step S3, the processing unit 4 identifies the user in the user database 3 from the entered ID and password and determines whether an ability level is stored for that user. If an ability level is stored, the process proceeds to step S7; if not, it proceeds to step S4.
  • In step S4, the user-side terminal 1 displays a message indicating that no ability level is stored and prompting the user to declare one.
  • If the user indicates from the user-side terminal 1 that he or she will not make a self-declaration, the processing unit 4 sets level B as the user's ability level in step S5 and proceeds to step S7.
  • If the user self-declares his or her ability level in step S4, the process proceeds to step S6, the declared level is entered, and the process proceeds to step S7.
  • In this way, when no declaration is made, the processing unit 4 automatically sets an initial level. This initial level is of course not limited to B.
  • Here, A, B, and C are set as the user ability levels, where A corresponds to the elementary level, B to the intermediate level, and C to the advanced level.
  • In step S7, the user enters a desired exercise time T.
  • The desired exercise time T is the time the user has available for solving this exercise.
  • In step S8, the processing unit 4 refers to the answer time ratio table shown in FIG. 3.
  • The answer time ratio table in Fig. 3 determines, for each user ability level, the proportion of the question set to be devoted to each difficulty level.
  • The difficulty levels are a, b, and c, set in ascending order of difficulty.
  • The proportion for each difficulty level is a proportion of the answer time.
  • From the user's desired exercise time and the ratios for the user's ability level, the answer time allotted to each difficulty level is calculated: Ta, Tb, and Tc.
  • Here the proportion for each difficulty level is applied to the answer time, but it may instead be applied to the number of questions.
  • In that case, a question-count ratio table is used instead of the answer time ratio table of Fig. 3. For example, for a user of ability level C, three out of ten questions would be of difficulty a, five of difficulty b, and two of difficulty c.
  • In step S9, questions corresponding to the per-difficulty answer times Ta, Tb, and Tc are extracted. That is, from the questions of difficulty a, questions are extracted such that the total of their standard answer times is approximately equal to the time Ta allotted to that difficulty level. If no combination can be extracted whose total exactly matches the time Ta calculated in step S8, questions are extracted such that the total of the standard answer times is at most Ta.
  • Questions may be extracted by any method, such as using a random number table, but it is preferable that the same question is not set repeatedly for the same user. For that purpose, the numbers of the questions already set must be stored for each user. Questions of difficulty b and difficulty c are extracted in the same way, and together they are determined as the questions to be set.
  • The total standard answer time of the questions thus determined is within the user's desired exercise time T (a rough sketch of this selection is given below).
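  • As a rough sketch of the selection in steps S7 to S9 (an assumed implementation for illustration only; the ratio values are placeholders of the kind the Fig. 3 table might contain, not the actual ones), the desired time T can be split across the difficulty levels by the ratio for the user's ability level, and questions drawn until each per-level time budget would be exceeded:

```python
import random

# Hypothetical answer-time ratios per ability level (a stand-in for the Fig. 3 table).
TIME_RATIO = {
    "A": {"a": 0.6, "b": 0.3, "c": 0.1},   # elementary users: mostly easy questions
    "B": {"a": 0.4, "b": 0.4, "c": 0.2},
    "C": {"a": 0.3, "b": 0.5, "c": 0.2},   # advanced users: more difficult questions
}

def select_questions(questions, ability_level, desired_time, already_set_ids):
    """Pick questions so that, for each difficulty level, the total standard
    answer time stays within that level's share (Ta, Tb, Tc) of the desired time."""
    selected = []
    for difficulty, ratio in TIME_RATIO[ability_level].items():
        budget = desired_time * ratio                       # Ta, Tb or Tc
        pool = [q for q in questions
                if q["difficulty"] == difficulty
                and q["id"] not in already_set_ids]         # avoid repeating questions
        random.shuffle(pool)                                # e.g. random extraction
        used = 0
        for q in pool:
            if used + q["standard_answer_time"] <= budget:  # keep the total <= budget
                selected.append(q)
                used += q["standard_answer_time"]
    return selected
```

  • Because each per-level total is kept at or below its budget, the overall standard answer time of the returned questions cannot exceed the desired exercise time T.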
  • In step S10, the number of questions set this time is added to the number of questions set for this user in the past to obtain a cumulative count.
  • In step S11, the processing unit 4 displays the questions on the user-side terminal 1. From this point the user starts solving them, and in step S12 the processing unit 4 starts timing.
  • The time limit is the sum of the standard answer times of the set questions, and this sum is within the user's desired exercise time T.
  • In step S13, the user solves the questions and enters the answers in the answer fields on the display of the user-side terminal 1.
  • In step S14, the processing unit 4 determines whether the time is up, that is, whether the time limit has been reached. If so, the process proceeds to step S16 and the questions are erased from the display of the user-side terminal 1. If the time is not up, the process proceeds to step S15, where it is determined whether the user has submitted the answers. The user may submit the answers within the time limit; for example, if the submit button shown on the display is clicked, the process proceeds to step S16 and the questions disappear from the display. If the submit button is not clicked, the questions remain displayed and the user can keep entering answers until the time runs out in step S14.
  • In other words, after a time-out or when the user chooses to submit the answers, the questions are erased from the display of the user-side terminal 1 so that they can no longer be solved.
  • When the questions disappear from the user-side terminal 1 in step S16, the user's answers are sent to the processing unit 4 in step S17.
  • In step S18, the processing unit 4 scores the answers and calculates the answer rate and the correct answer rate.
  • The answer rate is the ratio of the number of answered questions to the number of questions set, and the correct answer rate is the ratio of the number of correct answers to the number of questions set.
  • In step S19, it is determined whether the answer rate is 90% or more. If it is less than 90%, the process proceeds to step S20 and the user's ability level is demoted, for example from level C to B, or from B to A.
  • Reviewing the user's ability level based on the answer rate is done for the following reasons.
  • A low answer rate means that some questions were left unanswered. The user may have judged a question unsolvable at a glance and skipped it, may have spent so much time on other questions that none was left for it, or may have lost motivation partway through and submitted without working up to the time limit. In any case, it can be judged that the user's ability as an examinee is still insufficient.
  • The 90% criterion above should be set to an appropriate value according to the learning content and the passing level of the examination.
  • From step S20, the process then proceeds to step S23, where the results, such as the demoted ability level, the scoring result, the correct answer rate, and the answer rate, are displayed.
  • In step S21, it is determined whether the cumulative number of questions has reached a predetermined value. If it has not, the process proceeds to step S23 and the results, including the current ability level, are displayed.
  • If the cumulative number has reached the predetermined value, the process proceeds to step S22, where the user's ability level is reviewed.
  • The ability level is reviewed based on the correct answer rate calculated earlier. The correct answer rates required to qualify for each ability level are set in advance.
  • The ability level is set in light of both the current level and the correct answer rate of this exercise. That is, when a user currently at ability level A reaches the predetermined correct answer rate, the user may be recognized as level B or C; conversely, when the rate falls below the predetermined value, a user at level B or C may be set back to level A.
  • In this step, the ability level at that time may thus be either promoted or demoted. Note that the level set here may be changed again depending on the results of subsequent exercises.
  • The ability level is reviewed in light of the correct answer rate only when the cumulative number of questions has reached the predetermined value in step S21. This is to prevent the level from fluctuating: if, for example, the level were raised immediately after a single exercise with a very high correct answer rate, the next exercise would contain harder questions and the rate would immediately drop again. The ability level is therefore reset only for users who have practiced to some extent.
  • The reference value for the cumulative number of questions in step S21 can be set freely in the system, or it may be omitted (a sketch of this level review follows below).
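  • The level review of steps S19 to S22 might be sketched as follows (an assumption for illustration; the 90% answer-rate threshold comes from the text, while the promotion thresholds and the cumulative-question reference value are made-up placeholders):

```python
LEVELS = ["A", "B", "C"]                 # elementary, intermediate, advanced
PROMOTION_RATE = {"B": 0.7, "C": 0.85}   # hypothetical correct-rate thresholds
MIN_CUMULATIVE_QUESTIONS = 100           # hypothetical reference value for step S21

def review_level(level, answered, correct, total, cumulative_questions):
    """Return the (possibly) updated ability level after one exercise."""
    answer_rate = answered / total
    correct_rate = correct / total

    # Steps S19/S20: too many unanswered questions -> demote one level.
    if answer_rate < 0.9 and level != "A":
        return LEVELS[LEVELS.index(level) - 1]

    # Step S21: review against the correct answer rate only once enough
    # questions have accumulated, so one lucky result does not swing the level.
    if cumulative_questions < MIN_CUMULATIVE_QUESTIONS:
        return level

    # Step S22: promote or demote according to the correct answer rate.
    if correct_rate >= PROMOTION_RATE["C"]:
        return "C"
    if correct_rate >= PROMOTION_RATE["B"]:
        return "B"
    return "A"
```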
  • In step S23, the processing unit 4 displays the result of this exercise on the user-side terminal 1, and in step S24 it stores all of the data entered and the results calculated in the above steps as user information.
  • Specifically, which questions were set and when, what was answered, and the answer rate and correct answer rate are stored in the user information of each user.
  • In this way, the user can practice, in a test format and within the time he or she has available, exercises matched to his or her level.
  • The learning history data, such as each user's results, stored in the user database 3 can also be used as reference material when the user receives study counselling for an examination.
  • Each question may also be stored together with its correct answer rate, calculated over the answers of all users to whom the question was set. Since the difficulty of a question can be measured from this rate, the correct answer rate may, for example, be recalculated periodically and the difficulty level of the question reset. In this way, difficulty levels that match actual results can be maintained.
  • In the example above, the user enters only his or her desired exercise time.
  • However, the user may also enter other requests, such as the classification of the questions or the number of questions.
  • These can likewise be used as extraction conditions for the questions to be set.
  • The question range, the question format, the importance level, and indeed any attribute stored in association with a question as a question attribute can be used as an extraction condition.
  • Such desired conditions may be entered by the user at any step before the questions are displayed in step S11.
  • In the second embodiment, the management center 5 is further provided with a comment database 6.
  • The comment database 6 stores comments that the processing unit 4 extracts using the user's results as the extraction condition.
  • Components that are the same as in the first embodiment of FIG. 1 are given the same reference numerals.
  • The degree of confidence expresses how confident the user was when selecting an answer.
  • The confidence when answering with certainty that the answer is correct is expressed by ○, the confidence when the answer is thought to be probably correct but the user is not sure is expressed by △, and the confidence when answering by guesswork is expressed by ×. The user enters one of these confidence levels together with each answer.
  • The degree of confidence is not limited to the three levels above; it can be set arbitrarily.
  • Examples of the comments recorded in the comment database 6 are shown in comment table 7 of Fig. 5. The details are described later, but as shown in comment table 7, each display comment in the right-hand comment column 7a is associated with the condition for selecting it, described in the left-hand condition column 7b, and stored in this form in the comment database 6.
  • The processing unit 4 extracts a comment whose condition is met, based on the data of comment table 7, and displays it on the user-side terminal 1.
  • In step S101, the user accesses the home page of the exercise system from his or her own terminal 1.
  • Steps S101 to S106 are the same as steps S1 to S6 of the first embodiment (see FIG. 2), so their description is omitted.
  • In step S107, the processing unit 4 of the management center 5 displays the user's past results on the user-side terminal 1 as the results display screen 8 of FIG. 7.
  • The results display screen 8 has a results section 9 showing the results of the previous exercise, a comment section 10 with comments from the trainer, and an answer section 11 for the questions set in the previous exercise.
  • In the results section 9, the results table 12 shown in FIG. 8 is displayed.
  • This results table 12 shows the results aggregated by the importance of the questions.
  • In this embodiment, the importance of a question corresponds to the difficulty level of the present invention.
  • Five importance levels a to e are set, and an importance column 12a is provided at the top of the results table 12.
  • The lower the difficulty level, the higher the importance of the question.
  • In other words, questions of importance a are the most important and have the lowest difficulty.
  • This assignment can be set freely in the system; the importance may also be set based on difficulty together with some other criterion.
  • Low-difficulty questions are given high importance for the following reason.
  • In an examination intended to screen out candidates, failing to answer an easy question that many examinees answer correctly is fatal for the candidate. Therefore, before studying difficult questions that hardly anyone can solve, it was considered important to master the less difficult ones.
  • In other words, the importance setting of the second embodiment is based on the idea that it is more important to answer low-difficulty questions correctly and reliably than to answer high-difficulty questions unreliably.
  • This system therefore aims to develop the ability to give stable, correct answers, starting preferentially from the less difficult questions.
  • The importance level of each question is determined from the correct answer rate of all users of this system. For example, here a question whose all-user correct answer rate is 80% or more has importance a, 60% or more but less than 80% has importance b, 50% or more but less than 60% has importance c, 30% or more but less than 50% has importance d, and less than 30% has importance e.
  • Questions classified by the all-user correct answer rate in this way are treated as questions of the importance corresponding to that class.
  • The all-user correct answer rate used to set the importance of the questions is displayed in the column 12b provided below the importance column 12a.
  • The all-user correct answer rate is the ratio of the number of correct answers to the number of all users who attempted the question.
  • This correct answer rate is recalculated each time results are evaluated, and the importance of each question is determined from the recalculated rate. The importance of a question may therefore change from one evaluation to the next, but in practice the fluctuation usually decreases as the number of answers grows (see the sketch below).
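  • The mapping from the all-user correct answer rate to the importance level can be sketched as follows (the bands are the example figures given above; the function name is an assumption):

```python
def importance_from_correct_rate(correct_rate):
    """Map an all-user correct answer rate (0.0 to 1.0) to importance a..e,
    using the example bands given in the text."""
    if correct_rate >= 0.8:
        return "a"   # easiest, most important
    if correct_rate >= 0.6:
        return "b"
    if correct_rate >= 0.5:
        return "c"
    if correct_rate >= 0.3:
        return "d"
    return "e"       # hardest, least important

# Example: a question answered correctly by 72% of all users is rated "b".
assert importance_from_correct_rate(0.72) == "b"
```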
  • Columns 12d, 12e, and 12f display correct answer rates broken down by the degree of confidence.
  • The value displayed in the assured-answer rate column 12d is the ratio of the number of answers that were correct and given with confidence ○ to the number of correct answers.
  • The △ rate is the ratio of correct answers given with confidence △ to the number of correct answers, and the × rate is the ratio of correct answers given with confidence × to the number of correct answers.
  • Each of these is calculated for each importance level of the questions.
  • The assured-answer rate is the rate of answers that were correct and answered with certainty, and corresponds to the assured-answer rate of the present invention. From it the user can judge that real ability has been acquired, not merely from the overall correct answer rate but from a high rate of confident correct answers.
  • The △ rate and × rate are the proportions of answers that were correct even though the user was not confident; if these values are high, the user can recognize the lack of confidence and judge that confidence still needs to be built.
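  • A sketch of how the per-importance rates in columns 12d to 12f might be computed is given below (an illustrative assumption; answer records are taken to be (importance, is_correct, confidence) tuples, with the ○/△/× marks written as 'o', 'tri', and 'x'):

```python
from collections import defaultdict

def confidence_rates(answers):
    """answers: iterable of (importance, is_correct, confidence) records.
    Returns, for each importance level, the share of correct answers given with
    each confidence mark ('o' = certain, 'tri' = probably, 'x' = guess)."""
    correct_total = defaultdict(int)
    correct_by_conf = defaultdict(lambda: defaultdict(int))

    for importance, is_correct, confidence in answers:
        if is_correct:
            correct_total[importance] += 1
            correct_by_conf[importance][confidence] += 1

    return {
        importance: {conf: correct_by_conf[importance][conf] / total
                     for conf in ("o", "tri", "x")}     # columns 12d, 12e, 12f
        for importance, total in correct_total.items()
    }

# Example: of four correct importance-a answers, three were given with certainty.
sample = [("a", True, "o"), ("a", True, "o"), ("a", True, "o"),
          ("a", True, "tri"), ("a", False, "x")]
print(confidence_rates(sample)["a"]["o"])   # -> 0.75 (assured-answer rate)
```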
  • In the comment section 10, the processing unit 4 displays the comment extracted from comment table 7 of FIG. 5.
  • In the comments, ranks such as 'a rank' refer to the importance of the questions.
  • The answer time referred to in the comments is the standard answer time set in this system; practicing at 80% or 90% of the time means practicing so as to solve the questions in less than the standard answer time.
  • In the answer section 11 of the results display screen 8 of FIG. 7, the answer table 13 shown in FIG. 9 is displayed.
  • This answer table 13 has a question number column 13a, a right/wrong column 13b showing the correctness of the user's answer to each question, an answer column 13c showing the user's answer, and a correct answer column 13d showing the correct answer. Furthermore, when the display position of a question number is clicked, an explanation of that question is displayed in a window (not shown).
  • The results display screen 8 described above need not, of course, always be displayed, and it can be withheld when the number of questions actually solved is still small.
  • This system lets the user practice even when the time available is short; however, if the number of answers is too small, aggregating them and evaluating the results is not very meaningful.
  • A minimum number of answers required for evaluation is therefore set in advance. Once the cumulative number of answers exceeds this minimum, past exercise results are evaluated over the most recent answers, going back just far enough from the latest answer to include at least the minimum number.
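  • The look-back rule just described might be implemented roughly as follows (an assumed sketch; the minimum value is a placeholder):

```python
MIN_ANSWERS = 30   # hypothetical minimum number of answers required for evaluation

def answers_to_evaluate(exercises):
    """exercises: list of past exercises in chronological order, each a list of
    answer records. Returns the answers of the most recent exercises, going back
    just far enough to include at least MIN_ANSWERS answers, or None if the
    cumulative total has not yet reached the minimum."""
    if sum(len(ex) for ex in exercises) < MIN_ANSWERS:
        return None                          # too few answers to evaluate meaningfully
    selected = []
    for exercise in reversed(exercises):     # start from the latest exercise
        selected = exercise + selected
        if len(selected) >= MIN_ANSWERS:
            break
    return selected
```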
  • When the past results have been displayed in step S107 of Fig. 6, in step S108 the user refers to them and decides on the next exercise to perform.
  • In step S109, the question selection screen 14 shown in FIG. 10 is displayed, and the user enters the necessary items on it.
  • The question selection screen 14 has a plurality of input fields for entering the desired conditions for selecting questions: a time field 14a for the time available for practice, a desired question count field 14b for the desired number of questions, an ability level field 14c that displays the user's ability level, a question format field 14d for the question format described in the first embodiment, a time ratio field 14e for the answer time ratio, a confidence field 14f, an answer result field 14g, and an importance field 14h.
  • The user's ability level is filled into the ability level field 14c automatically; it is displayed on the screen rather than being a field for entering a wish. However, users may be allowed to modify their ability level from this screen.
  • The time ratio field 14e is used when the time limit for the exercise is to be set as a percentage of the standard time.
  • The confidence field 14f is used to enter a confidence level when the confidence the user attached to each answer in previous exercises is to be used as an extraction condition for the questions.
  • The importance field 14h is used to enter the importance of the questions the user wants to practice.
  • In step S110, the processing unit 4 uses the items entered on the question selection screen 14 and the user's exercise history stored in the user database 3 as extraction conditions and identifies the questions to set from the question database 2. Then, in step S111, the cumulative number of questions is calculated.
  • On the question screen 15 of FIG. 11, each question number has a question display column 15a, an answer display column 15b, an answer selection column 15c, and a confidence input column 15d.
  • Reference numerals 16 and 17 denote check boxes with which the user checks the selected answer and the degree of confidence.
  • The processing unit 4 starts timing in step S113 and keeps the question screen 15 displayed until the time runs out in step S115 or the answers are submitted in step S116. Meanwhile, in step S114, the user solves the questions and enters his or her confidence. In step S117 the display of the questions is ended and the exercise is finished, and in step S118 the answers and the confidence levels selected by the user are sent to the processing unit 4.
  • In step S119, the answers are scored, and the answer rate and the correct answer rate are calculated.
  • The scoring judges the correctness of the answers of the exercise performed this time, but when the number of questions in this exercise is small, the correct answer rate may be calculated including the answer results of past exercises.
  • In step S120, it is determined whether the answer rate is 90% or more; if it is less than 90%, the process proceeds to step S121, the ability level is demoted, and the process proceeds to step S125.
  • Step S122 determines whether the cumulative number of questions has reached the predetermined value; if it has not, the process goes to step S125, and if it has, to step S123.
  • In step S123, the ability level is reviewed in light of the correct answer rate and updated if necessary.
  • In step S124, a results table is created.
  • The table created here is the results table 12 shown in Fig. 8, including the results of this exercise.
  • In step S125, the answer table 13 shown in FIG. 9 is created and the comments are extracted.
  • The condition for extracting comment No. 1 in Fig. 5 is that 'N1 or more (correct → wrong) transitions occur among a-rank questions'.
  • This condition is satisfied when, among the questions of importance a, there are N1 or more questions whose answer this time was incorrect, that is, 'wrong', even though the answer the previous time was 'correct'.
  • To judge this condition, data such as that shown in FIG. 12, stored in the user database 3 as the exercise history, is used.
  • Table 18 in Figure 12 tabulates the results of a particular user's exercises. Table 18 associates each question number with the importance of the question, the correctness of the user's answer, and the confidence level.
  • The 'previous' column 18b and the 'two times before' column 18c show the results of the previous exercise and of the exercise before that.
  • Here, 'two times before' means the exercise before the previous one, counted back from this time.
  • The answer results in the 'previous' column 18b correspond to questions set in one and the same exercise.
  • The results in column 18c, on the other hand, do not necessarily correspond to questions set in the same exercise.
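  • A rough sketch of how the condition for comment No. 1 could be checked against the exercise history of Fig. 12 is shown below (an illustrative assumption; the value of N1 and the record layout are placeholders):

```python
N1 = 3   # hypothetical threshold for comment No. 1

def comment_1_applies(history):
    """history: mapping question_id -> {'importance', 'this_time', 'previous'},
    where 'this_time' / 'previous' hold 'correct' or 'wrong' (or None if the
    question was not set in that exercise). The condition is met when N1 or
    more importance-a questions were correct last time but wrong this time."""
    regressions = sum(
        1 for rec in history.values()
        if rec["importance"] == "a"
        and rec["previous"] == "correct"
        and rec["this_time"] == "wrong"
    )
    return regressions >= N1
```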
  • When the processing unit 4 has created the results table 12 and the answer table 13 and extracted the comments, it displays them on the user-side terminal 1 in step S126.
  • The screen used to display them is almost the same as the results display screen 8 of FIG. 7, except that the screen displayed here shows the results of 'this time', whereas FIG. 7 shows the previous results.
  • When demotion was performed in step S121, or when the cumulative number of questions has not reached the predetermined value in step S122, a new results table 12 of Fig. 8 is not created; even in such cases, however, the answer table 13 (see Figure 9) and the comments are displayed. If the ability level has been updated, that fact may also be shown as a comment.
  • Table 19 of Fig. 13 shows at a glance, regardless of the importance level of the questions, how many of 50 questions were answered with confidence, how many of those were correct or incorrect, and how many were answered without confidence. Not only the user but also an advisor giving study counselling can provide guidance using such data.
  • In step S127, all the results are stored in the user database 3, and the process ends.
  • In this embodiment, the difficulty of a question, to which its importance corresponds, is determined from the all-user correct answer rate. The difficulty and importance therefore reflect the users' actual ability.
  • Questions with a correct answer rate of 50% or more, that is, questions that more than half of the users answer correctly, are regarded as medium difficulty or easier, and this range is divided into the plurality of difficulty levels a to c, with a being the easiest.
  • The all-user correct answer rate of 50% corresponds to the preset reference value of the present invention. By dividing the medium level and the levels above it into multiple levels in this way, the user can master the questions that are at or below medium difficulty step by step, working through one level at a time.
  • In the embodiments above, the user-side terminal 1 stays connected to the processing unit 4 of the management center 5 from the time the user accesses the home page until the exercise results are displayed. However, once the processing unit 4 has determined the questions to set, they may be downloaded to the user-side terminal 1 and the connection closed.
  • In that case, steps S12 to S16 of the flowchart of FIG. 2 are processed by the user-side terminal 1 alone.
  • Likewise, steps S113 to S117 of the flowchart of FIG. 6 are processed by the user-side terminal 1 alone.
  • If the connection to the processing unit 4 is cut while the exercise is being solved on the user-side terminal 1, there is the advantage that no communication cost is incurred during that time.
  • Furthermore, if the user-side terminal 1 itself is provided with the databases and the processing unit, communication with the management center 5 becomes unnecessary.
  • In that case, the user database 3 need only store the user data of the user of that terminal 1.
  • On the other hand, if the management center 5 manages the question database 2 as in the first and second embodiments described above, questions can be added or deleted and their attributes changed at the convenience of the management center 5, which keeps maintenance simple.
  • Moreover, the processing unit 4 can then also aggregate data such as the results and levels of the many users preparing for the same kind of examination.
  • Alternatively, daily exercises may be performed on the user's own terminal 1 alone, with the data transmitted to the management center 5 periodically.
  • As described above, the user can, at any time and within the time he or she has available, practice exercises suited to his or her level in a test format. The user therefore does not need to prepare many question books or search them for questions at the right level. Being able to easily work on questions suited to one's own level makes learning more efficient, for example when studying to pass an examination.
  • The user can also answer the exercises from the user-side terminal.
  • Because the ability level is set from the user's answer rate and correct answer rate, an ability level appropriate to each user can be set.
  • The user can solve just as many questions as desired in the time available.
  • Since the user's answer rate and correct answer rate are stored in the storage unit, they can be used as each user's learning history when the user receives study counselling for an examination.
  • Questions can also be selected by question format, question range, and importance level.
  • By specifying the question format, the user can practice in the desired format.
  • By specifying the question range, the user can practice at his or her own learning pace.
  • Specifying the importance level enables even more efficient practice.
  • Furthermore, the difficulty level of the questions can be set to match actual results.
  • According to the seventh invention, questions can be set at a level that matches the user's ability level.
  • When mastering the questions, users can also step up gradually from goals that are within reach.
  • The user's confidence in each individual answer can also be grasped. From the confidence and the answer, the user's weak points, study attitude, or even personality can be inferred, so this information can serve as a reference when the user or an advisor considers how to proceed with future study.
  • The user can see the answers and the degrees of confidence at a glance.
  • Questions with a specific degree of confidence can be set repeatedly. For example, by practicing questions answered with low confidence and thereby raising the user's confidence, stable, good results can be obtained.
  • According to the eleventh invention, comments that are more useful to the user can be given, taking the user's confidence into account.
  • The rate of correct answers given with confidence can also be grasped. From this rate, the user can estimate how stable his or her current exercise results are.
  • The thirteenth invention is based on the idea that it is more important to answer low-difficulty questions correctly and reliably than to answer high-difficulty questions unreliably, and the importance can be set accordingly. As a result, users can be trained to give stable, correct answers, starting preferentially from the less difficult questions.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Electrically Operated Instructional Devices (AREA)

Abstract

An exercise setting system for setting a number of exercises for review or test which are to be solved within the time available for a learner. The system includes a display, storage units (2, 3) for storing user information and question information, an input unit for inputting information, and a processing unit (4) for specifying a question according to the input information of the input unit and the question information of the storage unit. The user information to be stored in the storage unit includes a user ability level, and the question information includes a question itself and attribute information of the question. The question attribute information includes the difficulty level of each question and a standard answering time set for each question. The processing unit has a function for selecting questions from questions of the user ability level, so that the total of the standard answering times is within the time available for the user.

Description

Specification

Exercise setting system. Technical field

The present invention is an exercise setting system that helps students preparing for various qualification examinations, university entrance examinations, and the like to understand what they have studied and to confirm their degree of understanding of it. Background art

To measure how well the studied content has been understood and to consolidate it, students repeatedly solve exercises on the material they have completed and take practice tests. In particular, a student preparing for a qualification examination or the like may work through a collection of questions that imitates the real examination, or through questions actually asked in past examinations, so-called past questions. After solving such a collection, the student scores himself or herself using the accompanying answer key.

In addition, for public mock examinations and at-home tests that simulate a real examination, a system that sets questions via the Internet or the like under a strict time limit has also been considered (see Japanese Patent Laid-Open No. 2001-256314).

As described above, when exercises are solved using a collection of questions, the questions to be solved can be chosen to suit one's own purpose. For example, a university entrance examinee can select only questions in the range already studied, in step with his or her progress, or can select questions from every range, as in the actual entrance examination. However, if the exercises in the collection are not organized to correspond to the learner's curriculum, selecting the questions one has already covered is quite troublesome.

Furthermore, every learner has strong and weak areas, and it is usual to have to practice the weak areas far more often than the strong ones. Yet if the same questions are solved repeatedly, the learner ends up remembering the answers without understanding the content. To avoid this, many different collections of questions must be prepared, which places a large financial burden on the examinee. Moreover, no collection sets questions only from the areas a particular learner is weak in; each also contains questions that learner hardly needs. The more collections that are prepared, the more unused material is wasted.

On the other hand, when solving a test-format collection of questions in order to test one's own ability, or when taking an at-home test as in the above publication, the test time must be observed or the ability cannot be grasped accurately. The learner therefore has the inconvenience of being able to take such a test only when a certain block of time is available.

The object of the present invention is to provide an exercise setting system that sets, within the learner's available time, only as many exercises as should be solved for the learner's purpose, such as repeatedly reviewing completed material or judging the learner's ability at that point in time. Disclosure of the invention
In order to solve the above problems, the present invention has the following features. A first invention comprises a display, a storage unit that stores user information and question information, an input unit for entering information, and a processing unit that specifies questions based on the information entered from the input unit and the question information in the storage unit. The user information stored in the storage unit includes the user's ability level. The question information consists of the question itself and attribute information of the question, and the attribute information includes the difficulty level of each question and a standard answer time set for each question. The processing unit is characterized by having a function of selecting, from among the questions corresponding to the user's ability level, questions such that the sum of their standard answer times fits within the desired exercise time specified by the user. The user's ability level may be determined in any manner: for example, it may be determined automatically from the learner's correct answer rate or from the results on a specific group of questions, or the user may declare it himself or herself. Any number of levels may be defined.

A second invention further comprises a user-side terminal; the processing unit has a function of displaying the selected questions on the user-side terminal, a function of calculating an answer rate and a correct answer rate for the answers entered from the user-side terminal, and a function of setting the user's ability level based on the calculated answer rate and correct answer rate and storing it in the storage unit as user information.

A third invention is characterized in that, when a desired number of exercise questions is entered from the user-side terminal, the processing unit has a function of matching the number of selected questions to that desired number.

A fourth invention is characterized in that a question ratio for each difficulty level, corresponding to each user ability level, is stored in the storage unit, and the processing unit has a function of selecting the questions to be set based on the question ratio corresponding to the user's ability level.

A fifth invention is characterized in that the question attributes include one or more of a question format classification, a question range classification, and an importance level.

The question range classification corresponds to the curriculum and resembles the table of contents of a textbook used by a user preparing for a specific examination.

A sixth invention is characterized in that the processing unit determines the difficulty level of each question based on the average correct answer rate of all users for that question and stores it as question information. A seventh invention is characterized in that the range in which the all-user correct answer rate for a question is at or above a preset reference value is divided into a plurality of difficulty levels.

An eighth invention is characterized by comprising a confidence input means for entering, together with the answer to a question, the user's degree of confidence in that answer, and in that the processing unit has a function of associating the answer with the confidence and storing them in the storage unit.

A ninth invention is characterized in that the processing unit has a function of displaying, as the user's exercise result, a list of the correctness of each answer together with the user's confidence.

A tenth invention is characterized in that the processing unit has a function of extracting questions based on the degree of confidence.

An eleventh invention is characterized in that the storage unit stores trainer comments, each associated with a condition defined in terms of the correctness of answers and the degree of confidence, and the processing unit has a function of comparing the correctness and confidence in the user's exercise results with those conditions and extracting the corresponding trainer comment.

A twelfth invention is characterized in that the processing unit has a function of calculating and displaying, as the user's exercise result and for each difficulty level, the assured-answer rate, which is the ratio of the number of answers the user got right with confidence to the total number of correct answers.

A thirteenth invention, based on the fifth invention, is characterized in that the importance level of questions with a low difficulty level is set high. BRIEF DESCRIPTION OF THE FIGURES
FIG. 1 is a diagram showing the overall configuration of the system of the first embodiment.

FIG. 2 is a flowchart showing the procedure for using the system of the first embodiment. FIG. 3 is an answer time ratio table, by difficulty level, of the first embodiment.

FIG. 4 is a diagram showing the overall configuration of the system of the second embodiment.

FIG. 5 is a diagram showing the comment table of the second embodiment.

FIG. 6 is a flowchart showing the procedure for using the system of the second embodiment. FIG. 7 is a diagram showing the configuration of the results display screen of the second embodiment.

FIG. 8 is a diagram showing the results table of the second embodiment.

FIG. 9 is a diagram showing the answer table of the second embodiment.

FIG. 10 is a diagram showing the question selection screen of the second embodiment.

FIG. 11 is a diagram showing the question screen of the second embodiment.

FIG. 12 is a table showing a user's exercise results in the second embodiment.

FIG. 13 is a table showing a user's answers and degrees of confidence in the second embodiment. BEST MODE FOR CARRYING OUT THE INVENTION
The system of the first embodiment shown in FIGS. 1 to 3 comprises a management center 5 and a plurality of user terminals 1 connected to it via a communication network N such as the Internet. The management center 5 is on the side of the provider of the exercises, for example a school or a preparatory school.

The users are members of this system and are the learners. Each user terminal 1 is a personal computer or the like, and the user terminal 1 includes the display and the input unit of the present invention.

The management center 5 is provided with a question database 2 that stores question information, a user database 3 that stores user information, and a processing unit 4 that processes the data. The question database 2 and the user database 3 correspond to the storage unit of the present invention.

Specifically, it is sufficient to install in the management center 5 a server having the functions of the databases 2 and 3 and of the processing unit 4. The processing unit 4 and the databases 2 and 3 do not, however, have to be installed together inside the management center 5; the server holding the processing unit 4 and the databases may be installed in separate places. In short, all that is required is the processing unit 4 and databases 2 and 3 that can exchange data with it.

Instead of the user terminal 1, the user may also use the system through an input unit and a display connected directly to the processing unit 4 of the management center 5.
The question database 2 stores the exercise questions. These are questions for various examinations, for example for the bar examination, the patent attorney examination, university entrance examinations, or corporate training. A separate question database 2 may be provided for each type of examination, or each question may be stored in a single database with an attribute such as "XX examination" attached to it.

Each question is associated with the standard answer time needed to solve it, the difficulty level and importance level of the question, the correct answer, an explanation of its content, and so on. The standard answer time may be determined from the times in which users of this system have actually answered the question, but it is essentially the time within which the question must be solved in order to pass the actual examination.

The difficulty level is a measure of how hard it is for users to produce the correct answer. In practice it is decided on the basis of, for example, the correct answer rate for that question: a question that most answerers get right can be defined as a question with a low difficulty level, and a question that few answerers get right as a question with a high difficulty level.

The importance level indicates how important the question is for the purpose of improving the user's ability, and what is treated as important for that purpose can be set freely in the system. For example, it may be decided according to the likelihood that the question will appear in the examination, predicted from how often it has appeared in past examinations, according to the difficulty of the question, or according to criteria such as whether it is a basic question or an applied question.

Setting difficulty levels and importance levels in this way helps users to study efficiently. For example, anyone can answer a low-difficulty question correctly, so the user must at the very least be able to get those right, whereas a high-difficulty question that hardly anyone can answer can sometimes be postponed. When study time is limited, these levels also serve as criteria for deciding the order of study, such as tackling the most important questions first.

The question attributes further include a format classification and a content classification of the questions.

The format classification describes how a question is posed or how it is to be answered: for example, a multiple-choice format in which the correct answer is selected from several candidate answers, a reordering format in which sentences are rearranged, a fill-in-the-blank format in which blanks in the question text are completed, and a counting format in which the user judges the statements in the question as true or false and answers with the number of true or false statements.

The content classification of the questions includes the type of examination, the subject, and ranges corresponding to the curriculum of each subject. Specifically, when the type of examination is the bar examination, the subjects are "Constitution", "Civil Law", "Criminal Law", and so on, and the ranges are, within "Constitution", items such as "Rights and Duties of the People" or "The Diet", and, more finely within "Rights and Duties of the People", items such as "Subjects Entitled to Human Rights", "Duty to Pay Taxes", and "Property Rights". How these ranges and the format classification are set can be decided freely on the management center 5 side, but it is preferable to use classifications and expressions that are easy to understand for users studying for a particular examination, for example the bar examination.

The user database 3 stores, as user information, basic attributes such as name, address, age, and sex, together with the learning history built up when the user studies with the system of this invention. The learning history can include when and which questions were set, the content of the user's answers to them, their correctness, the correct answer rate, and the score. The user's ability level is also stored. A user ID and a password are assigned to each user, and users are identified by this ID and password.
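Purely as an illustration, and not as part of the claimed invention, the question and user records described above could be held in structures along the following lines; every class and field name here is a hypothetical choice, since the embodiment does not prescribe any particular schema.

    from dataclasses import dataclass, field

    @dataclass
    class Question:                    # one record in question database 2 (illustrative)
        question_id: int
        exam_type: str                 # e.g. "bar examination"
        subject: str                   # e.g. "Constitution"
        topic_range: str               # e.g. "Rights and Duties of the People"
        question_format: str           # multiple-choice, reordering, fill-in-the-blank, counting
        difficulty: str                # 'a', 'b', 'c' (low to high in the first embodiment)
        importance: str                # importance level, set freely by the system
        standard_time: int             # standard answer time in minutes
        correct_answer: str
        explanation: str

    @dataclass
    class UserRecord:                  # one record in user database 3 (illustrative)
        user_id: str
        name: str
        ability_level: str             # 'A' (beginner), 'B' (intermediate), 'C' (advanced)
        history: list = field(default_factory=list)   # per-exercise results: question ids,
                                                       # answers, correctness, confidence, rates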
The procedure by which a user uses the system of the first embodiment is described below with reference to the flowchart in FIG. 2. It is assumed that the user has already registered as a member. The flowchart in FIG. 2 shows both the user's steps and the processing steps of the processing unit 4 at the management center 5.

First, in step S1, the user accesses the home page of the exercise system from his or her terminal 1. When the home page is displayed on the display of the user terminal 1, the user enters a user ID and password (step S2). In step S3, the processing unit 4 identifies the user in the user database 3 from the entered user ID and password and determines whether an ability level is stored for that user. If the user's ability level is stored, the procedure goes to step S7; if not, it goes to step S4.

In step S4, the user terminal 1 displays a message stating that no ability level is stored and prompting the user to declare a level. If the user chooses not to declare one, the user terminal 1 sends notice to that effect, and in step S5 the processing unit 4 sets level B as that user's ability level and the procedure goes to step S7.

If, at step S4, the user does declare his or her own ability level, the procedure goes to step S6, where the declared level is entered, and then to step S7.

In this first embodiment, users are asked to declare their own level when no ability level is stored, but the processing unit 4 could instead set an initial level automatically without any self-declaration, and this initial level is of course not limited to B. In this first embodiment the user ability levels are A, B, and C, where A is beginner, B is intermediate, and C is advanced.

In step S7, the user enters a desired exercise time T, that is, the time the user can spend on these exercises right now. In step S8, the processing unit 4 refers to the answer time ratio table shown in FIG. 3 and divides the desired exercise time T into per-difficulty times Ta, Tb, and Tc. The answer time ratio table in FIG. 3 determines, for each user ability level, the ratio of the difficulty levels among the questions to be set. In this first embodiment the difficulty levels are a, b, and c, in ascending order of difficulty, and the ratio for each difficulty level is a ratio of answer time.

For example, for a user of ability level C, the ratio of the answer time Ta for difficulty level a, the answer time Tb for difficulty level b, and the answer time Tc for difficulty level c is Ta : Tb : Tc = 3 : 5 : 2. Concretely, if a user of ability level A enters a desired exercise time of 100 minutes, then Ta = 30 minutes, Tb = 50 minutes, and Tc = 20 minutes. In this way, step S8 calculates the proportion of questions at each difficulty level according to the user's desired exercise time and ability level.
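A minimal sketch of the allocation performed in step S8 is shown below, assuming a ratio table of the kind in FIG. 3; the concrete rows of the table and the function name are placeholders rather than values taken from the embodiment.

    # Hypothetical answer time ratio table: ability level -> ratio of (a, b, c) answer time.
    # Only the 3:5:2 split appears in the text; the rows here are placeholders.
    RATIO_TABLE = {
        "A": (3, 5, 2),
        "B": (2, 5, 3),
        "C": (3, 5, 2),
    }

    def allocate_exercise_time(desired_time_t, ability_level):
        """Split the desired exercise time T into per-difficulty times Ta, Tb, Tc (step S8)."""
        ratios = RATIO_TABLE[ability_level]
        total = sum(ratios)
        return tuple(desired_time_t * r / total for r in ratios)

    # Example from the text: 100 minutes split in the ratio 3:5:2 gives 30, 50 and 20 minutes.
    print(allocate_exercise_time(100, "A"))   # (30.0, 50.0, 20.0)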
In this first embodiment the per-difficulty question ratio is applied to the answer time, but it may instead be applied to the number of questions. In that case, a question-count ratio table of the same form is used in place of the answer time ratio table of FIG. 3 to determine the proportions. For example, for a user of ability level C, three out of ten questions would be set at difficulty a, five at difficulty b, and two at difficulty c.

In step S9, questions are extracted to match each of the calculated per-difficulty answer times Ta, Tb, and Tc. That is, from among the questions of difficulty level a, questions are extracted such that the sum of their standard answer times is approximately equal to the per-difficulty answer time Ta. If no set of questions can be extracted whose total standard answer time exactly matches the Ta calculated in step S8, the questions are chosen so that the total standard answer time is less than Ta.

Any method of extraction may be used, such as a random number table, but it is preferable to avoid setting the same question for the same user repeatedly; for that purpose, the numbers of the questions set for each user must be stored. Questions of difficulty level b and difficulty level c are extracted in the same way, and together these are determined as the questions to be set. The total standard answer time of the questions decided on in this way is within the user's desired exercise time T.
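The embodiment leaves the extraction method open (a random number table is mentioned as one option), so the following is only one possible reading of step S9: a greedy selection that fills each per-difficulty time budget without exceeding it and skips questions the user has already seen. It reuses the illustrative Question record sketched earlier.

    import random

    def pick_questions(candidates, time_budget, already_seen):
        """Pick questions whose standard answer times sum to at most time_budget (step S9).

        candidates   -- list of Question records of one difficulty level
        time_budget  -- Ta, Tb or Tc from step S8, in minutes
        already_seen -- set of question ids previously set for this user
        """
        pool = [q for q in candidates if q.question_id not in already_seen]
        random.shuffle(pool)                   # random order in place of a random number table
        selected, remaining = [], time_budget
        for q in pool:
            if q.standard_time <= remaining:
                selected.append(q)
                remaining -= q.standard_time
        return selected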
In step S10, the number of questions set this time is added to the number of questions set for that user in the past to give a cumulative total.

In step S11, the processing unit 4 displays the selected questions on the user terminal 1. From this point the user begins solving the questions, so in step S12 the processing unit 4 starts measuring time. The time limit is the sum of the standard answer times of the questions set, and this time is within the user's desired exercise time T.

In step S13 the user solves the questions and enters the answers in the answer fields on the display of the user terminal 1. In step S14 the processing unit 4 determines whether the time is over, that is, whether the time limit has been reached. If the time is up, the procedure goes to step S16 and the questions are erased from the display of the user terminal 1. If the time is not up at step S14, the procedure goes to step S15, where it is determined whether the user is submitting the answers. The user can submit the answers even within the time limit, for example by clicking a submit button shown on the display, in which case the procedure goes to step S16 and the questions disappear from the display. If the user does not click the submit button, the questions remain displayed and answers can be entered until the time runs out at step S14.

In this first embodiment, once the time is up or the user has chosen to submit, the questions are erased from the display of the user terminal 1 so that they can no longer be solved, but the questions do not necessarily have to be erased; for example, they may remain displayed while answer input is disabled.

After the questions disappear from the user terminal 1 in step S16, the user's answers are input to the processing unit 4 in step S17.

In step S18, the processing unit 4 scores the answers and calculates the answer rate and the correct answer rate. The answer rate is the ratio of the number of answers given to the number of questions set, and the correct answer rate is the ratio of the number of correct answers to the number of questions set.
In step S19 it is determined whether the answer rate is 90% or higher. If the answer rate is below 90%, the procedure goes to step S20 and the user's ability level is demoted, for example from level C to B or from B to A.

The reason for reviewing the user ability level on the basis of the answer rate in this way is as follows.

A low answer rate means that some questions were left unanswered. A question may be left unanswered because the user judged at a glance that it could not be solved and skipped it, because other questions took so long that no time was left for it, or because the user lost motivation partway through and submitted the answers without using the full time limit. In any of these cases it can be judged that the user's ability as an examination candidate is insufficient.

The 90% threshold, however, should be set to a suitable value according to the learning content, the passing level of the examination, and so on.

The procedure then goes from step S20 to step S23, where the results, including the demoted user ability level, the score, the correct answer rate, and the answer rate, are displayed.

If, on the other hand, the answer rate is 90% or higher, the procedure goes to step S21, where it is determined whether the cumulative number of questions set has reached a predetermined value. If it has not, the procedure goes to step S23 and the results, including the current user ability level, are displayed.

If the cumulative number of questions has reached the predetermined value at step S21, the procedure goes to step S22 and the user ability level is reviewed on the basis of the correct answer rate calculated earlier. The correct answer rates required to be recognized as ability level A or B are set in advance, and the level is set according to the correct answer rate of this exercise together with the current ability level. That is, a user currently at ability level A may be recognized as level B or C on reaching a predetermined correct answer rate, and conversely a user at ability level B or C may be set to ability level A when the correct answer rate falls below a predetermined value. In this step the current ability level may thus be either promoted or demoted. The level set here may be changed again depending on the results of subsequent exercises.
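Steps S18 to S22 can be summarized as the sketch below. Only the 90% answer rate threshold comes from the text; the review threshold, the promotion and demotion rates, and the function name are assumptions made for illustration.

    LEVELS = ["A", "B", "C"]                      # beginner, intermediate, advanced

    def update_ability_level(level, answer_rate, correct_rate, cumulative_questions,
                             review_after=200, promote_rate=0.8, demote_rate=0.4):
        """Level adjustment of steps S19-S22; every threshold except the 90% answer rate is assumed."""
        if answer_rate < 0.90:                    # steps S19/S20: unanswered questions -> demote
            return LEVELS[max(LEVELS.index(level) - 1, 0)]
        if cumulative_questions < review_after:   # step S21: not enough accumulated practice yet
            return level
        if correct_rate >= promote_rate:          # step S22: review by correct answer rate
            return LEVELS[min(LEVELS.index(level) + 1, len(LEVELS) - 1)]
        if correct_rate < demote_rate:
            return "A"                            # text: level B or C users may drop back to A
        return level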
In this first embodiment the ability level is reset in consideration of the correct answer rate only when the cumulative number of questions has reached the predetermined value at step S21. This is to prevent a situation in which, for example, raising the ability level immediately after a single very high correct answer rate makes the next set of questions harder, so that the correct answer rate promptly falls again. The ability level is therefore reset only for users who have accumulated a certain amount of practice. The reference value for the cumulative number of questions at step S21 can, however, be set freely in the system, or need not be provided at all.

In step S23 the processing unit 4 displays the results of this exercise on the user terminal 1, and in step S24 all of the data entered and the results calculated in the above steps are stored as user information. The user information of each user thus records when and which exercise questions were set and what was answered, together with the answer rate and correct answer rate, and the exercise ends.

Through the above procedure, the user can take exercises matched to his or her own level, in test format, within the time the user has chosen.

The learning history data, such as each user's results, stored in the user database 3 can also be used as reference material when the user receives advice about the examination.

Each question can also be stored together with its own correct answer rate, calculated from the answers of all the users to whom the question has been set. Since this rate is also a measure of the question's difficulty, the difficulty levels of the questions may be reset, for example by recalculating this correct answer rate periodically. This allows difficulty levels to be set that match actual conditions more closely.

In the first embodiment the user enters only the desired exercise time, but the user could also be allowed to enter preferences such as the question classification or the number of questions, and these could be used as conditions for extracting the questions to be set. Anything stored as a question attribute in association with a question, such as the subject, the question range, the question format, or the importance level described earlier, can serve as a question extraction condition.

Such user preferences may be entered at any point before the questions are displayed in step S11.
A second embodiment, in which the user can enter various preferences based on exercise results as conditions for extracting questions, and in which the user's confidence in each answer is reflected in the results, is described below with reference to FIGS. 4 to 12. In this second embodiment, as shown in FIG. 4, the management center 5 is additionally provided with a comment database 6, which stores comments that the processing unit 4 extracts using the user's results as extraction conditions. Components that are the same as in the first embodiment of FIG. 1 are given the same reference numerals.

The confidence level indicates how much confidence the user had when selecting an answer. In this second embodiment, the confidence level is 〇 when the user answered with conviction that the answer was correct, △ when the user was not certain but chose the answer thinking it was probably correct, and X when the user had no confidence at all and answered by guessing. The user enters these confidence levels in association with each answer. The confidence scale is not limited to the three grades above and can be set arbitrarily.

Examples of the comments stored in the comment database 6 are shown in the comment table 7 of FIG. 5. As described in detail later, the comment database 6 stores, as in comment table 7, the displayed comments in the right-hand comment column 7a in association with the conditions for selecting each comment in the left-hand condition column 7b. Based on the data in this comment table 7, the processing unit 4 extracts the comments that match the conditions and displays them on the user terminal 1.

Next, the procedure by which a user uses the system in the second embodiment is described with reference to the flowchart in FIG. 6.

In step S101, the user accesses the home page of the exercise system from his or her user terminal 1. Steps S101 to S106 are the same as steps S1 to S6 of the first embodiment (see FIG. 2), so their description is omitted here.

Once the ability level has been set by step S106, the procedure goes to step S107, where the processing unit 4 of the management center 5 displays that user's past results on the user terminal 1 as the results display screen 8 of FIG. 7. The results display screen 8 has a training results column 9 that shows the results of the previous exercise, a column 10 for comments from the trainer, and an answer column 11 for the questions set in the previous exercise.

The training results column 9 displays the grade table 12 shown in FIG. 8, which shows the results aggregated by question importance. In this second embodiment, the importance of a question is made to correspond to the difficulty level of this invention. Here five grades a to e are used, and an importance column 12a is provided at the top of the grade table 12. Questions are treated as more important the lower their difficulty, starting from level a; that is, the importance a questions are the most important and the least difficult. Such settings can be made freely in the system, and the importance may also be set by criteria other than difficulty.

In this second embodiment, low-difficulty questions are given high importance for the following reason. In an examination intended to narrow down the number of candidates, failing to answer the easy questions that many candidates get right is fatal for a candidate, so it was considered important to master the low-difficulty questions before studying difficult questions that hardly anyone can solve.

That is, the importance setting in this second embodiment is based on the idea that reliably answering low-difficulty, high-correct-rate questions is more important than inconsistently answering high-difficulty ones, and the system aims to build, as a priority, the ability to answer the less difficult questions stably. The importance level of a question, in other words its difficulty level, is determined by the correct answer rate of all users of the system: here, importance a questions have an overall correct answer rate of 80% or more, importance b questions 60% or more but less than 80%, importance c questions 50% or more but less than 60%, importance d questions 30% or more but less than 50%, and importance e questions less than 30%.
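A minimal sketch of this classification rule follows; the thresholds are those stated above, while the function name is an assumption.

    def importance_from_overall_rate(overall_correct_rate):
        """Map the all-user correct answer rate of a question to an importance grade a-e."""
        if overall_correct_rate >= 0.80:
            return "a"        # easiest questions, treated as the most important
        if overall_correct_rate >= 0.60:
            return "b"
        if overall_correct_rate >= 0.50:
            return "c"
        if overall_correct_rate >= 0.30:
            return "d"
        return "e"            # hardest questions, lowest importance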
Of the questions answered in the exercises covered by the evaluation, each question is classified according to the overall-user correct answer rate and is treated as having the importance corresponding to that class.

The overall-user correct answer rate used in this way to set the importance of questions is shown in a column 12b provided below the importance column 12a. The overall-user correct answer rate is the ratio of the number of correct answers to the number of all users who have attempted the question. This rate is recalculated every time results are evaluated, and the importance of each question is determined on that basis. The importance of each question may therefore change from one evaluation to the next, but in practice the variation in the results usually becomes smaller as the number of calculations increases.

Below column 12b is an individual correct answer rate column 12c that shows that user's own correct answer rate for each level of question importance. From this the user can see, for example, whether he or she is answering the important questions correctly and whether he or she is getting right the questions that other people answer correctly.

Columns 12d, 12e, and 12f show the confidence levels relative to the correct answers. The value shown in the 〇 rate column 12d is the proportion of correctly answered questions to which the user attached confidence level 〇. Likewise, the △ rate is the proportion of confidence level △ among the correct answers, and the X rate is the proportion of confidence level X among the correct answers, each calculated by question importance. Providing these confidence columns lets the user infer, when an answer was correct, whether it was answered with confidence or was a fluke. In particular, the 〇 rate is the proportion answered correctly with conviction and corresponds to the confident-answer rate of this invention; the user can judge that real ability has been gained not merely from a high overall correct answer rate but from a high confident-answer rate. The △ rate and the X rate are the proportions answered correctly despite a lack of confidence, so when these values are high the user can conclude that it is necessary to identify the points of uncertainty and build confidence in them.
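As a sketch of how the 〇, △ and X rates of columns 12d to 12f might be computed from per-question results; the record format and the ASCII labels used for the three confidence symbols are assumptions.

    from collections import Counter

    def confidence_rates(results):
        """Compute the 〇 / △ / X rates among correct answers (columns 12d-12f).

        results -- iterable of (is_correct, confidence) pairs,
                   confidence being one of 'o', 'triangle', 'x'
        """
        counts = Counter(conf for is_correct, conf in results if is_correct)
        correct_total = sum(counts.values())
        if correct_total == 0:
            return {"o": 0.0, "triangle": 0.0, "x": 0.0}
        return {c: counts[c] / correct_total for c in ("o", "triangle", "x")}

    # Example: three correct answers, two given with full confidence and one guessed.
    print(confidence_rates([(True, "o"), (True, "o"), (True, "x"), (False, "triangle")]))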
The column 10 for comments from the trainer on the results display screen 8 of FIG. 7 shows comments that the processing unit 4 has extracted from the comment table 7 of FIG. 5. For example, a comment such as No. 1, "Practice the a-rank questions at (answer time) x 80% or 90%", or No. 3, "Try solving again the a-rank questions you got wrong", is displayed.

Here a rank such as "a rank" refers to the importance of the questions, and the answer time is the standard answer time set in this system; practicing at 80% or 90% of it means practicing solving the questions in less than the standard answer time.

The answer column 11 of the results display screen 8 of FIG. 7 shows the answer table 13 of FIG. 9, which has a question number column 13a, a right/wrong column 13b showing whether the user's answer to each question was correct, an answer column 13c showing the user's answer, and a correct answer column 13d showing the correct answer. In addition, clicking the position where a question number is displayed brings up an explanation of that question in a window (not shown).

The results display screen 8 described above is of course not displayed when the system is used for the first time, and it can also be suppressed when the number of questions actually solved is small. This system is designed so that meaningful practice is possible even when the time the user can spend on exercises is short, but when the number of answers is very small there may be little point in aggregating them and evaluating the results; a minimum number of answers to be evaluated is therefore set in advance. Once the cumulative number of answers exceeds this minimum, past exercise results are evaluated by going back from the latest answers until the minimum number of answers is covered.
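One way to read this evaluation window is sketched below, under the assumption that the answer history is stored newest-first; the minimum count of 50 is only a placeholder, as the embodiment leaves the value to the system.

    def evaluation_window(answer_history, minimum_answers=50):
        """Collect the most recent answers, going back until the minimum count is reached.

        answer_history -- list of per-exercise answer lists, newest first
        Returns the answers to evaluate, or None when too few answers exist overall.
        """
        if sum(len(batch) for batch in answer_history) < minimum_answers:
            return None                      # too few answers: no grade table is produced
        window = []
        for batch in answer_history:         # newest exercise first
            window.extend(batch)
            if len(window) >= minimum_answers:
                break
        return window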
When the past results described above have been displayed in step S107 of FIG. 6, the question selection screen 14 shown in FIG. 10 is displayed in step S108 so that the user, referring to those results, can select the next exercise. In step S109 the user enters the necessary items on the question selection screen 14.

The question selection screen 14 has a number of input fields for entering the conditions for selecting questions: a time field 14a for entering the time available for the exercise, a desired question count field 14b for entering the desired number of questions, an ability level field 14c that shows the user's ability level, a question format field 14d for entering the question format described in the first embodiment, a time percentage field 14e for entering a percentage of the answer time, a confidence field 14f, and an answer result field 14g.

The ability level field 14c is filled in automatically from the user information that the processing unit 4 identified from the user ID entered in step S102 of FIG. 6, so it is not a preference field; the user could, however, be allowed to correct his or her ability level from this screen.

The time percentage field 14e is used when the time limit for the exercise is to be set as a percentage of the standard time.

The confidence field 14f is for entering a confidence level when the confidence data the user attached to each answer in the previous exercise is to be used as a question extraction condition, and the question importance field 14h is for entering the importance of the questions the user wants to practice.

For example, as explained above, if the trainer's comment column says "Practice the a-rank questions at (answer time) x 80% or 90%", then entering "80" in the time percentage field 14e and "a" in the question importance field 14h causes the processing unit 4 to extract rank-a questions as the exercise and, when it sets them, to declare time over at 80% of the standard time.

If the user wants to try again the questions that he or she could not answer with confidence, confidence level △ or X is entered in the confidence field 14f.

In step S110, the processing unit 4 extracts the questions to be set from the question database 2, using the items entered on the question selection screen 14 and that user's exercise history stored in the user database 3 as extraction conditions. In step S111 the cumulative number of questions set is calculated.
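A sketch of the extraction in step S110, reusing the illustrative Question record from earlier; the field names and the way the conditions are combined are assumptions about one possible implementation, not the claimed one.

    def extract_questions(question_db, history, importance=None, question_format=None,
                          confidence=None, time_percentage=100):
        """Filter questions by the selection-screen conditions and the user's history (step S110).

        importance, question_format -- optional values from fields 14h and 14d
        confidence      -- optional 〇/△/X value from field 14f; keeps only questions the user
                           previously answered with that confidence level
        time_percentage -- value from field 14e; scales each question's time limit
        history         -- dict mapping question_id to that user's previous result
        """
        selected = []
        for q in question_db:
            if importance and q.importance != importance:
                continue
            if question_format and q.question_format != question_format:
                continue
            if confidence and history.get(q.question_id, {}).get("confidence") != confidence:
                continue
            limit = q.standard_time * time_percentage / 100   # e.g. 80% of the standard time
            selected.append((q, limit))
        return selected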
In step S112 the questions are displayed. As on the question screen 15 of FIG. 11, each question number has a question display field 15a, an answer display field 15b, an answer selection field 15c, and a confidence input field 15d. In the figure, reference numerals 16 and 17 denote the check boxes with which the user marks the selected answer and the confidence level.

When several exercise questions are set, they are displayed one after another by scrolling the screen.

The processing unit 4 starts measuring time in step S113 and keeps the question screen 15 displayed until the time runs out in step S115 or the answers are submitted in step S116. During that time, in step S114, the user solves the questions and enters the confidence levels. When the display of the questions is stopped and the exercise ends in step S117, the answers and confidence levels selected by the user are input to the processing unit 4 in step S118.

In step S119 the answers are scored and the correct answer rate is calculated. The scoring here judges the correctness of the answers in the current exercise, but when the number of questions in the current exercise is small, the correct answer rate may be calculated so as to include the results of past exercises as well.

In step S120 it is determined whether the answer rate is 90% or higher; if it is below 90%, the procedure goes to step S121, the ability level is demoted, and the procedure goes to step S125.

If, on the other hand, the answer rate is 90% or higher, the procedure goes to step S122, where it is determined whether the cumulative number of questions set has reached a predetermined value; if it has not, the procedure goes to step S125, and if it has, the procedure goes to step S123.

In step S123 the ability level is reviewed according to the correct answer rate and updated if necessary.

In step S124 a grade table is created; the table created here is the table 12 shown in FIG. 8, aggregated so as to include the results of the current exercise.

In step S125 the answer table 13 shown in FIG. 9 is prepared and the comments are extracted.
The way in which the processing unit 4 extracts comments from the comment database 6 is briefly explained here.

For example, the condition for extracting comment No. 1 in FIG. 5 is "(correct, wrong) occurs N1 or more times for a-rank questions". This is the condition that, among the questions of importance a, there are N1 or more questions for which the answer this time was incorrect, that is, "wrong", although the answer last time was "correct". To judge such conditions, data of the kind shown in FIG. 12, stored as the exercise history in the user database 3, is used. Table 18 of FIG. 12 tabulates the exercise results of one particular user, associating each question number with the importance of that question and with the correctness and confidence level of the user's answers.

The rightmost "this time" column 18a records the correctness and confidence of the answers in the current exercise, and the adjacent "last time" column 18b and "time before last" column 18c show the earlier results. Since not every question is necessarily set in the current exercise, the results from when a question was last set are recorded for questions not set this time. Strictly speaking, therefore, "this time" means, for each question, the most recent occasion on which it was set, and "last time" and "the time before last" mean the occasions immediately before that; the answer results in column 18b do not all correspond to questions set in the same exercise session, and neither do the results in column 18c.

When the results of the next exercise are obtained, they are entered in the "this time" column 18a, and the results previously held in the cells that receive new entries are moved to the "last time" column 18b.

To check the extraction condition of comment No. 1 against table 18, the questions for which the previous answer was "correct" and the current answer is "wrong", such as questions 4 and 5, are counted; when that count is N1 or more, the condition is judged to be satisfied and comment No. 1 is extracted. Conditions that combine the answer with the confidence level, such as Nos. 6 to 11, can likewise be judged from the data of FIG. 12.
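A sketch of this check, assuming each exercise-history row carries the previous and current results as in table 18; the threshold name N1 follows the text, everything else is illustrative.

    def comment_no1_applies(history_rows, n1):
        """Check the condition for comment No. 1: previously correct, now wrong, on a-rank questions."""
        slipped = [row for row in history_rows
                   if row["importance"] == "a"
                   and row["previous"] == "correct"
                   and row["current"] == "wrong"]
        return len(slipped) >= n1

    rows = [
        {"importance": "a", "previous": "correct", "current": "wrong"},   # like question 4
        {"importance": "a", "previous": "correct", "current": "wrong"},   # like question 5
        {"importance": "b", "previous": "wrong",   "current": "correct"},
    ]
    print(comment_no1_applies(rows, n1=2))   # True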
Depending on the results of an exercise, more than one condition may be satisfied. If all the comments corresponding to all the satisfied conditions were displayed at once, the user might be confused about which comment to act on, so when several conditions are satisfied, a priority order may be set for choosing the comment to display.

When the processing unit 4 has created the grade table 12 and the answer table 13 and extracted the comments as described above, it displays them on the user terminal 1 in step S126. The results display screen on which they are shown is almost the same as the results display screen 8 of FIG. 7, except that where FIG. 7 refers to "the previous ...", the screen displayed here refers to "the current ...".

In this second embodiment, however, when the level has been demoted in step S121, or when the cumulative number of questions has not reached the predetermined value in step S122, the grade table 12 of FIG. 8 is not newly created; even in those cases the answer table 13 for the current answers (see FIG. 9) and the comments are displayed. When the ability level has been updated, that fact may also be shown as a comment.

If the user's confidence level for each answer is collected as in this second embodiment, the relationship between confidence and the numbers of correct and wrong answers can also be displayed as table 19, as shown in FIG. 13. Irrespective of levels such as question importance, table 19 shows at a glance, for 50 questions, how many were answered with conviction, how many of those were correct or incorrect, and how many questions the user had no confidence in. Not only the users themselves but also advisers who respond to study consultations can give guidance with reference to such data.
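The kind of confidence-versus-correctness summary shown in table 19 could be produced along the following lines; the two-way breakdown and the ASCII labels for the confidence symbols are assumptions.

    from collections import Counter

    def confidence_correctness_summary(results):
        """Tally answers by (confidence, correctness), as in table 19 of FIG. 13.

        results -- iterable of (confidence, is_correct) pairs,
                   confidence being one of 'o', 'triangle', 'x'
        """
        return Counter((conf, "correct" if ok else "wrong") for conf, ok in results)

    # Example with four answers: two confident (one of them wrong), one probable, one guess.
    print(confidence_correctness_summary(
        [("o", True), ("o", False), ("triangle", True), ("x", False)]))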
For example, when the number of correct answers is large but the number answered with confidence 〇 is small, the user can be encouraged to be more confident, or advised to repeat exercises until confidence is gained. Conversely, when confidence 〇 was attached to incorrect answers, that is, when many answers given with conviction were nevertheless wrong, some misunderstanding or lack of understanding can be presumed.

Then, in step S127, all the results are stored in the user database 3 and the procedure ends.

In the second embodiment described above, the difficulty level of a question, to which its importance corresponds, is decided on the basis of the overall-user correct answer rate, so the difficulty and importance levels reflect the users' actual ability. Questions that 50% or more of the users, that is, more than half of them, can answer correctly are treated as being of medium difficulty, and questions with higher correct answer rates are assigned to the several difficulty levels a to c. In this case the overall-user correct answer rate of 50% corresponds to the preset reference value of this invention. Dividing the range at and above the medium level into several levels in this way lets the user tackle, level by level and in stages, the questions of lower than medium difficulty that must be mastered without fail.

For example, even when a user's correct answer rate over all questions is low, a comment such as "solve the a-rank questions ..." is issued first, and once the correct answer rate for that rank has risen, a comment such as "the b-rank questions ..." is issued. The finer the levels are set, the easier it becomes to extract questions suited to the user's ability level, and the user can work on questions that suit him or her and raise his or her level step by step.

In each of the first and second embodiments above, the user terminal 1 remains connected to the processing unit 4 of the management center 5 from the time the user accesses the home page until the exercise results are displayed, but once the processing unit 4 has decided which questions to set, they may instead be downloaded to the user terminal 1 and the connection cut. In the first embodiment, steps S12 to S16 of the flowchart in FIG. 2 would then be processed by the user terminal 1 alone, and in the second embodiment, steps S113 to S117 of the flowchart in FIG. 6 would be processed by the user terminal 1 alone.

In that case, a program that measures the time and disables answer input when the time limit is reached must also be downloaded to the user terminal 1, and when the time limit is reached the answers are transmitted to the processing unit 4.

Cutting the connection to the processing unit 4 in this way while the exercises are being solved on the user terminal 1 has the advantage that no communication costs are incurred.

Furthermore, if the user terminal 1 is provided with the processing unit 4, the question database 2, the user database 3, and the comment database 6, the management center 5 becomes unnecessary. When the system is configured with the user terminal 1 alone in this way, the user database 3 need only store the user data of the user corresponding to that terminal 1. If, however, the question database 2 is managed on the management center 5 side as in the first and second embodiments, questions can be added or deleted and attributes changed easily at the convenience of the management center 5. Moreover, if the management center 5 is connected to a plurality of user terminals 1, the processing unit 4 can also aggregate data such as the results and levels of a plurality of users aiming at the same kind of examination.

In this case too, everyday exercises may be performed on the user terminal 1 alone, with the data transmitted to the management center 5 periodically.

Industrial applicability
第 1の発明から第 1 3の発明によれば、 ユーザ一は、 いつでも、 自分が希望し た時間内に、自分のレベルに合った演習問題を、テスト形式で行うことができる。 従って、 ユーザーは、 何冊もの問題集を用意したり、 自分のレベルに合った問 題を探したりする必要がない。 このように、 簡単に、 自分のレベルに合った問題 を解くことができることにより、 例えば、 試験合格へ向けて、 より効率のよい学 習ができる。  According to the first invention to the thirteenth invention, the user can perform the exercises suitable for his / her level in the test format at any time within the time desired by the user. Therefore, the user does not need to prepare many question books or search for a problem that suits his or her level. In this way, you can easily solve problems that suit your level, so that you can learn more efficiently, for example, to pass an exam.
According to the second invention, the user can answer the exercises using the user-side terminal. Moreover, since the ability level is set according to the user's answer rate and correct-answer rate, the ability level can be set in accordance with the user's actual performance. According to the third invention, the user can solve as many questions as he or she wishes within the time available.
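A minimal sketch of how the answer rate, correct-answer rate, and a derived ability level might be computed follows; the level thresholds and the exact denominators are assumptions, since the text leaves those details to the implementation.

```python
def update_ability_level(num_set: int, num_answered: int, num_correct: int) -> dict:
    """Compute the answer rate and correct-answer rate for one exercise session
    and map them to a coarse ability level (thresholds are illustrative)."""
    answer_rate = num_answered / num_set if num_set else 0.0
    correct_rate = num_correct / num_set if num_set else 0.0
    if correct_rate >= 0.8:
        level = 3
    elif correct_rate >= 0.5:
        level = 2
    else:
        level = 1
    return {"answer_rate": answer_rate, "correct_rate": correct_rate, "level": level}

print(update_ability_level(num_set=10, num_answered=9, num_correct=7))
# -> {'answer_rate': 0.9, 'correct_rate': 0.7, 'level': 2}
```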
Furthermore, if the user's answer rate and correct-answer rate are stored in the storage unit, they can also be used as each user's learning history when the user receives counseling about examinations.
According to the fourth invention, questions suited to the user's ability level can be set. According to the fifth invention, questions can be selected by question format, question range, and importance level. By specifying the question format, the user can practice in the desired format; by specifying the question range, the user can practice in step with his or her own learning pace. Specifying the importance level enables even more efficient practice.
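The ratio-based selection of the fourth invention could look something like the sketch below, where the storage unit holds, for each ability level, the share of questions to draw from each difficulty level; the concrete ratios are invented for the example.

```python
import random

# Hypothetical question ratios per ability level: difficulty level -> share of the set.
RATIOS = {
    1: {1: 0.6, 2: 0.3, 3: 0.1},
    2: {1: 0.3, 2: 0.5, 3: 0.2},
    3: {1: 0.1, 2: 0.4, 3: 0.5},
}

def pick_by_ratio(pools: dict, ability_level: int, total: int) -> list:
    """pools maps a difficulty level to the list of candidate questions at that level."""
    chosen = []
    for difficulty, share in RATIOS[ability_level].items():
        count = round(total * share)
        available = pools.get(difficulty, [])
        chosen += random.sample(available, min(count, len(available)))
    return chosen

# Example: 10 questions for a level-2 user drawn as roughly 3 easy, 5 medium, 2 hard.
pools = {1: list(range(100, 120)), 2: list(range(200, 220)), 3: list(range(300, 320))}
print(pick_by_ratio(pools, ability_level=2, total=10))
```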
According to the sixth invention, the difficulty level of each question can be set in accordance with actual results.
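One plausible reading of how a question's difficulty level could be derived from the all-user average correct-answer rate is sketched below; the reference values are illustrative only.

```python
def difficulty_from_correct_rate(avg_correct_rate: float) -> int:
    """Map a question's all-user average correct-answer rate to a difficulty level;
    higher correct rates mean easier questions. The bands are illustrative."""
    bands = [(0.8, 1), (0.6, 2), (0.4, 3)]  # (reference value, difficulty level)
    for reference_value, level in bands:
        if avg_correct_rate >= reference_value:
            return level
    return 4  # below every reference value: hardest band

print(difficulty_from_correct_rate(0.72))  # -> 2
```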
According to the seventh invention, questions at a level matched to the user's ability level can be set. In particular, while mastering the questions, the user can step up gradually from goals that are within reach.
According to the eighth and ninth inventions, the user's degree of confidence at the time of each answer can be grasped. From the relationship between this confidence level and the answers, the user's weak points in learning, study attitude, and even personality can be inferred, so the information can serve as a reference for the user or an advisor when considering how to proceed with future study.
In particular, according to the ninth invention, the user can grasp the answers and confidence levels at a glance.
According to the tenth invention, questions with a specific confidence level can be set repeatedly. For example, raising the user's confidence by practicing questions answered with low confidence leads to consistently good results.
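Extraction by confidence level, as in the tenth invention, might be sketched as follows: previously answered questions whose recorded confidence fell at or below a chosen cut-off are collected so they can be set again. The record fields and the confidence scale are assumptions.

```python
def questions_to_repeat(history: list, max_confidence: int) -> list:
    """Collect question ids whose recorded confidence was at or below max_confidence.
    history holds records like {"question_id": 5, "correct": True, "confidence": 1},
    where confidence is assumed to be an ordinal scale (1 = unsure ... 3 = sure)."""
    return sorted({r["question_id"] for r in history if r["confidence"] <= max_confidence})

log = [
    {"question_id": 5, "correct": True,  "confidence": 1},
    {"question_id": 7, "correct": False, "confidence": 3},
    {"question_id": 9, "correct": True,  "confidence": 2},
]
print(questions_to_repeat(log, max_confidence=2))  # -> [5, 9]
```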
According to the eleventh invention, comments that are more useful to the user can be given by taking the user's confidence level into account.
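The comment selection of the eleventh invention can be pictured as a small lookup keyed on the combination of correctness and confidence; the comment texts and the two-way confidence split below are purely illustrative.

```python
# Hypothetical trainer-comment table keyed by (answer correct?, answered with confidence?).
COMMENTS = {
    (True,  True):  "Well understood - keep this pace.",
    (True,  False): "Correct, but review the reasoning to build confidence.",
    (False, True):  "You were confident but wrong - re-read this topic carefully.",
    (False, False): "Not yet mastered - go back to the basics for this area.",
}

def trainer_comment(correct: bool, confidence: int, threshold: int = 2) -> str:
    """Pick the comment whose condition matches the answer's correctness and confidence."""
    return COMMENTS[(correct, confidence >= threshold)]

print(trainer_comment(correct=False, confidence=3))
```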
According to the twelfth invention, the confident-answer rate, that is, the rate at which the user answered correctly with confidence, is obtained. From this rate, the user can estimate how stable his or her current exercise results are.
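The confident-answer rate described here, the share of correct answers that were also given with confidence, computed per difficulty level, might be calculated as in the following sketch (record fields and the confidence threshold are assumed):

```python
from collections import defaultdict

def confident_answer_rate(history: list, confident_threshold: int = 2) -> dict:
    """Per difficulty level: (correct answers given with confidence) / (correct answers)."""
    correct = defaultdict(int)
    confident_correct = defaultdict(int)
    for r in history:
        if r["correct"]:
            correct[r["difficulty"]] += 1
            if r["confidence"] >= confident_threshold:
                confident_correct[r["difficulty"]] += 1
    return {level: confident_correct[level] / correct[level] for level in correct}

log = [
    {"difficulty": 1, "correct": True,  "confidence": 3},
    {"difficulty": 1, "correct": True,  "confidence": 1},
    {"difficulty": 2, "correct": False, "confidence": 2},
    {"difficulty": 2, "correct": True,  "confidence": 2},
]
print(confident_answer_rate(log))  # -> {1: 0.5, 2: 1.0}
```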
According to the thirteenth invention, importance can be set based on the idea that reliably answering low-difficulty, high-accuracy questions correctly is more important than answering high-difficulty questions correctly but inconsistently. This makes it possible to train users to answer correctly and consistently, giving priority to the less difficult questions.

Claims

1. An exercise question setting system comprising a display, a storage unit that stores user information and question information, an input unit for entering information, and a processing unit that specifies questions based on the information entered through the input unit and the question information in the storage unit, wherein the user information stored in the storage unit includes the user's ability level, the question information comprises the questions themselves and attribute information of the questions, the attribute information of the questions includes a difficulty level of each question and a standard answer time set for each question, and the processing unit has a function of selecting, from among the questions corresponding to the user's ability level, questions such that the total of the standard answer times of the questions falls within a desired exercise time specified by the user.
2. The exercise question setting system according to claim 1, further comprising a user-side terminal, wherein the processing unit has a function of displaying the selected questions on the user-side terminal, a function of calculating an answer rate and a correct-answer rate for the answers to the questions entered from the user-side terminal, and a function of setting the user's ability level based on the calculated answer rate and correct-answer rate and storing it in the storage unit as user information.
3. The exercise question setting system according to claim 2, wherein the processing unit has a function of matching the number of selected questions to a desired number of exercise questions when that number is entered from the user-side terminal.
4. The exercise question setting system according to any one of claims 1 to 3, wherein question ratios by difficulty level corresponding to the user's ability level are stored in the storage unit, and the processing unit has a function of selecting the questions to be set based on the question ratio corresponding to the user's ability level.
5. The exercise question setting system according to any one of claims 1 to 4, wherein the question attributes include one or more of a question format classification, a question range classification, and an importance level.
6. The exercise question setting system according to any one of claims 1 to 5, wherein the processing unit determines the difficulty level of each question based on the average correct-answer rate of all users for that question and stores it as question information.
7. The exercise question setting system according to claim 5, wherein ranges in which the correct-answer rate of all users for each question is equal to or higher than preset reference values are set as a plurality of difficulty levels.
8. The exercise question setting system according to any one of claims 1 to 7, further comprising confidence input means for entering, together with the answer to a question, the user's degree of confidence in that answer, wherein the processing unit has a function of storing the answer and the confidence level in the storage unit in association with each other.
9. The exercise question setting system according to claim 8, wherein the processing unit has a function of displaying, as the user's exercise results, a list of the correctness of each answer together with the user's confidence level.
10. The exercise question setting system according to claim 8 or 9, wherein the processing unit has a function of extracting questions based on the confidence level.
11. The exercise question setting system according to any one of claims 8 to 10, wherein the storage unit stores trainer comments associated with conditions corresponding to the correctness of answers and confidence levels, and the processing unit has a function of comparing the correctness and confidence levels of the user's exercise results with the conditions and extracting the corresponding comments.
12. The exercise question setting system according to any one of claims 8 to 11, wherein the processing unit has a function of calculating, as part of the user's exercise results and for each difficulty level of the questions, a confident-answer rate, that is, the ratio of the number of questions the user answered correctly with confidence to the number of correct answers, and displaying it.
13. The exercise question setting system according to claim 5, wherein the importance level of questions with a low difficulty level is set to a high level.
PCT/JP2002/012811 2001-12-12 2002-12-05 Exercise setting system WO2003050782A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2002361082A AU2002361082A1 (en) 2001-12-12 2002-12-05 Exercise setting system
JP2003551760A JPWO2003050782A1 (en) 2001-12-12 2002-12-05 Exercise question system

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2001378705 2001-12-12
JP2001-378705 2001-12-12
JP2002269520 2002-09-17
JP2002-269520 2002-09-17

Publications (1)

Publication Number Publication Date
WO2003050782A1 true WO2003050782A1 (en) 2003-06-19

Family

ID=26625018

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2002/012811 WO2003050782A1 (en) 2001-12-12 2002-12-05 Exercise setting system

Country Status (3)

Country Link
JP (1) JPWO2003050782A1 (en)
AU (1) AU2002361082A1 (en)
WO (1) WO2003050782A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7345149B2 (en) * 2020-03-31 2023-09-15 株式会社大阪教育研究所 Assignment recommendation system


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04145481A (en) * 1990-10-08 1992-05-19 Brother Ind Ltd Electronic learning machine
JPH0830187A (en) * 1994-07-20 1996-02-02 Shiyuusui:Kk Automatic marking device, marking display method and answer paper used therefor
JPH10319826A (en) * 1997-05-19 1998-12-04 Katsuichi Tashiro Teaching material management system
JPH11282826A (en) * 1998-03-31 1999-10-15 Nippon Telegr & Teleph Corp <Ntt> Electronic education system using internet
JP2000267554A (en) * 1999-03-16 2000-09-29 Gakushu Kankyo Kenkyusho:Kk Device and method for supporting learning, and recording medium thereof
WO2001050439A1 (en) * 1999-12-30 2001-07-12 Cerego Cayman, Inc. System, apparatus and method for maximizing effectiveness and efficiency of learning, retaining and retrieving knowledge and skills
JP2001318583A (en) * 2000-05-11 2001-11-16 Kengo Ito Learning system

Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005122079A (en) * 2003-10-15 2005-05-12 Digital Boutique Inc SELF-MANAGEMENT TYPE e-LEARNING TOOL
SG119225A1 (en) * 2004-06-04 2006-02-28 Education Learning House Co Lt Method of multi-level analyzing personal learning capacity
WO2006000632A1 (en) * 2004-06-24 2006-01-05 Nokia Corporation Knowledge assessment
JP2006023566A (en) * 2004-07-08 2006-01-26 Matsushita Electric Ind Co Ltd Degree-of-comprehension determining system and method therefor
WO2006093065A1 (en) * 2005-03-02 2006-09-08 The Japan Institute For Educational Measurement, Inc. Learning level judging device and learning level judging program
JPWO2007066451A1 (en) * 2005-12-09 2009-05-14 パナソニック株式会社 Information processing system, information processing apparatus and method
WO2007066451A1 (en) * 2005-12-09 2007-06-14 Matsushita Electric Industrial Co., Ltd. Information processing system, information processing apparatus and method
US7945865B2 (en) 2005-12-09 2011-05-17 Panasonic Corporation Information processing system, information processing apparatus, and method
US8382483B2 (en) 2006-06-21 2013-02-26 Panasonic Corporation Service providing system
JP2008026583A (en) * 2006-07-21 2008-02-07 Yamaguchi Univ Adaptive test system and its method
US8521271B2 (en) 2006-11-06 2013-08-27 Panasonic Corporation Brain wave identification method adjusting device and method
JP2008129032A (en) * 2006-11-16 2008-06-05 Casio Comput Co Ltd Practice procedure generator and practice procedure generation processing program
JP4742288B2 (en) * 2006-11-16 2011-08-10 カシオ計算機株式会社 Practice procedure generation device and practice procedure generation processing program
JP2010122254A (en) * 2008-11-17 2010-06-03 Kankyo Keiei Senryaku Soken:Kk User education system
JP2011076407A (en) * 2009-09-30 2011-04-14 Oki Electric Industry Co Ltd Document processing system
WO2011110570A1 (en) 2010-03-09 2011-09-15 Glaxosmithkline Biologicals S.A. Treatment of streptococcal infections
WO2011129048A1 (en) * 2010-04-14 2011-10-20 株式会社ソニー・コンピュータエンタテインメント Game support server, game device, game support system and game support method
US9613540B2 (en) 2011-03-16 2017-04-04 Fujitsu Limited Examination support apparatus, and examination support method
KR20150007194A (en) 2011-03-16 2015-01-20 후지쯔 가부시끼가이샤 Test execution assistance device, test execution assistance method, and storage medium
JP5686180B2 (en) * 2011-03-18 2015-03-18 富士通株式会社 Questioning apparatus and questioning method
US9536440B2 (en) 2011-03-18 2017-01-03 Fujitsu Limited Question setting apparatus and method
WO2012127580A1 (en) * 2011-03-18 2012-09-27 富士通株式会社 Question-setting device and question-setting method
JP5686183B2 (en) * 2011-03-30 2015-03-18 富士通株式会社 Questioning apparatus and questioning method
WO2012131948A1 (en) * 2011-03-30 2012-10-04 富士通株式会社 Problem-providing device and problem-providing method
US9711057B2 (en) 2011-03-30 2017-07-18 Fujitsu Limited Question setting apparatus and method
WO2013102966A1 (en) * 2012-01-06 2013-07-11 Flens株式会社 Learning assistance server, learning assistance system, and learning assistance program
JP2013142718A (en) * 2012-01-06 2013-07-22 Flens Co Ltd Learning-support server, learning-support system, and learning-support program
US9672752B2 (en) 2012-01-06 2017-06-06 Flens Inc. Learning assistance server, learning assistance system, and learning assistance program
JP2015018096A (en) * 2013-07-10 2015-01-29 正樹 後藤 Collective education method and collective education computer system
JP2015121682A (en) * 2013-12-24 2015-07-02 富士通株式会社 Learning assist program, learning assist device, and learning assist method
JP2015197460A (en) * 2014-03-31 2015-11-09 株式会社サイトビジット Information processor, information processing method, and program
JP2016177306A (en) * 2014-11-28 2016-10-06 株式会社サイトビジット E-learning system
JP2016206619A (en) * 2015-04-16 2016-12-08 RISU Japan株式会社 Electronic publication containing teaching material for learning and learning support system using the same
CN105139709A (en) * 2015-09-24 2015-12-09 广西德高仕安全技术有限公司 Generation system of safety training examination paper and method
JP6308482B1 (en) * 2017-03-06 2018-04-11 弘道 外山 Learning support system, learning support server, learning support method, learning support program, and operation method of learning support system
JP2018146799A (en) * 2017-03-06 2018-09-20 弘道 外山 Learning support system, learning support server, learning support method, learning support program, and operation method for learning support system
JP2019174580A (en) * 2018-03-28 2019-10-10 株式会社High−Standard&Co. Teaching material data creation program and teaching material creation method
JP7195570B2 (en) 2018-03-28 2022-12-26 株式会社High-Standard&Co. Teaching material data creation program and teaching material creation method
CN109086431A (en) * 2018-08-13 2018-12-25 广东小天才科技有限公司 Learning method and electronic equipment are consolidated in a kind of knowledge point
CN109086431B (en) * 2018-08-13 2020-11-03 广东小天才科技有限公司 Knowledge point consolidation learning method and electronic equipment
WO2020115171A1 (en) 2018-12-06 2020-06-11 Glaxosmithkline Biologicals Sa Immunogenic compositions
WO2021064050A1 (en) 2019-10-01 2021-04-08 Glaxosmithkline Biologicals Sa Immunogenic compositions
EP3799884A1 (en) 2019-10-01 2021-04-07 GlaxoSmithKline Biologicals S.A. Immunogenic compositions
JP2021092725A (en) * 2019-12-12 2021-06-17 カシオ計算機株式会社 Learning support device, learning support method and program
JP2021093064A (en) * 2019-12-12 2021-06-17 株式会社ナレロー Information processing method, information processing device, storage medium and program
JP7415520B2 (en) 2019-12-12 2024-01-17 カシオ計算機株式会社 Learning support device, learning support method, and program
JP7421207B2 (en) 2019-12-12 2024-01-24 株式会社ナレロー Information processing method, information processing device, storage medium, and program
WO2022175423A1 (en) 2021-02-22 2022-08-25 Glaxosmithkline Biologicals Sa Immunogenic composition, use and methods

Also Published As

Publication number Publication date
JPWO2003050782A1 (en) 2005-04-21
AU2002361082A1 (en) 2003-06-23

Similar Documents

Publication Publication Date Title
WO2003050782A1 (en) Exercise setting system
US9672752B2 (en) Learning assistance server, learning assistance system, and learning assistance program
US20060246411A1 (en) Learning apparatus and method
Wood Multiple choice: A state of the art report
US20090061408A1 (en) Device and method for evaluating learning
KR101031304B1 (en) Electroinc media based universal learning system and method thereof
Shin et al. Evaluating different standard-setting methods in an ESL placement testing context
JP7077533B2 (en) Learning support device, learning support system, learning support method and computer program
US20030186206A1 (en) Method for presenting most suitable question and apparatus for presenting most suitable question
JP2002221893A (en) Learning support system
Flannelly Using feedback to reduce students' judgment bias on test questions
WO2018051844A1 (en) Management system, management method, and program
JP3634856B2 (en) Test result analysis apparatus, method and program
Brown et al. Functional assessment of immediate task planning and execution by adults with acquired brain injury
JP2002287608A (en) Learning support system
JP6551818B1 (en) Information processing apparatus and program
JP2019053270A (en) Learning ability evaluation system, learning ability evaluation method, and computer program
JP2020003690A (en) Learning support device, method, and computer program
Kwon et al. Reading speed as a constraint of accuracy of self‐perception of reading skill
JP4940213B2 (en) Education system using electronic blackboard
KR20230136265A (en) Personalized and customized education support system based on elo rating
JP2005331650A (en) Learning system, information processor, information processing method and program
JP6313515B1 (en) Learning ability evaluation system, learner terminal, and computer program
JP2002072857A (en) Method and system for performing simulated examination while utilizing communication network
JP2019191388A (en) Learning assist device and learning assist program

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LU MC NL PT SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2003551760

Country of ref document: JP

122 Ep: pct application non-entry in european phase