
GB2627259A - A computer implemented adaptive testing and question generation system and method - Google Patents

A computer implemented adaptive testing and question generation system and method

Info

Publication number
GB2627259A
Authority
GB
United Kingdom
Prior art keywords
question
questions
level
learner
student
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2302270.0A
Other versions
GB202302270D0 (en)
Inventor
Krishna Boya Rama
Enuga Prasanthi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to GB2302270.0A
Publication of GB202302270D0
Publication of GB2627259A
Legal status: Pending


Classifications

    • G — PHYSICS
    • G09 — EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B — EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 7/00 — Electrically-operated teaching apparatus or devices working with questions and answers
    • G09B 7/02 — Apparatus of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student
    • G09B 7/04 — G09B 7/02 apparatus characterised by modifying the teaching programme in response to a wrong answer, e.g. repeating the question, supplying a further explanation
    • G09B 7/06 — Apparatus of the multiple-choice answer-type, i.e. where a given question is provided with a series of answers and a choice has to be made from the answers
    • G09B 7/08 — G09B 7/06 apparatus characterised by modifying the teaching programme in response to a wrong answer, e.g. repeating the question, supplying further information

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Electrically Operated Instructional Devices (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

A computer implemented adaptive testing and question generation system comprises an adaptive testing module, a question generation module (132), an answers evaluation module, and a moderator. The question generation module (132) generates questions based on four aspects: successful control for difficulty, novelty, fluency and latency (112). The answers evaluation module observes indirectly connected latent events in the learning history, such as partial understanding, slips and guessed answers (114). The system detects guesswork (114) by evaluating answers against the min/max time of response (120) required to answer the question, and declares the performance of the learner based on views and likes (122) and the response time for each question, estimating a benchmark from the previous year's marks (136) to estimate the regional and national level position of the learner (110).

Description

Title of the Invention
"A COMPUTER IMPLEMENTED ADAPTIVE TESTING AND QUESTION GENERATION SYSTEM AND METHOD"
Technical field of the Invention
[0001] The present invention generally relates to a network-based knowledge assessment and learning system. More particularly, it relates to a novel adaptive testing and question generation system.
Background of the invention
[0002] Traditional multiple choice testing techniques to assess the extent of a person's knowledge in a subject matter include varying numbers of possible choices that are selectable by one-dimensional or Right/Wrong (RW) answers. A typical multiple choice test might include questions with three possible answers, where generally one of such answers can be eliminated by the test subject as incorrect as a matter of first impression. This gives rise to a significant probability that a guess on the remaining answers could result in a correct response.
[0003] Under this situation, a successful guess would mask the true extent or the state of knowledge of the test subject, as to whether he or she is informed (i.e., confident with a correct response), misinformed (i.e., confident in the response, which response, however, is not correct) or lacked information (i.e., having no information). Accordingly, the traditional multiple choice one-dimensional testing technique is highly ineffectual as a means to measure the true extent of knowledge of the test subject.
[0004] Despite this significant drawback, the traditional one-dimensional, multiple choice testing techniques are widely used by information-intensive and information-dependent organizations such as banking, insurance, utility companies, educational institutions and governmental agencies.
[0005] US10579654B2 discloses generation of questions for a to-be-learned knowledge point. A node path is determined in a knowledge graph for questions, the path including a target node indicating the to-be-learned knowledge point, with the nodes in the knowledge graph indicating question-answering steps and the questioning styles corresponding to those steps. The questions required by the user are generated according to the question-answering steps, the knowledge points tested in those steps, and the questioning styles corresponding to the question-answering steps indicated by the nodes on the node path.
[0006] CN109255998B discloses a system comprising an information acquisition module, a structured question bank, an online test module, a special subject training module, a test paper acquisition module, an accurate analysis module, a task issuing module, a learning condition tracking module, an electronic lesson preparation module and a social interaction module. The accurate analysis module analyzes the information from the test paper acquisition module and the online test module to obtain diagnosis information.
[0007] None of the existing models assesses learners' overall performance in all subjects, analyzes the areas for improvement, and estimates their competitive standing.
[0008] Therefore, there is a need to construct personalized learning models for a better learning process: a model that assesses learners' overall performance in all subjects, analyzes the areas for improvement, and estimates their competitive standing.
Brief Summary of the invention
[0009] The following presents a simplified summary of the disclosure in order to provide a basic understanding to the reader. This summary is not an extensive overview of the disclosure and it does not identify key/critical elements of the invention or delineate the scope of the invention. Its sole purpose is to present some concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.
[0010] It is an object of the present invention to develop a novel adaptive testing and question generation system.
[0011] It is yet another object of the present invention to range the difficulty levels of assessment from high to low based on the struggle of the learner.
[0012] It is yet another object of the present invention to generate questions for assessing the subject knowledge and performance of learner.
[0013] It is yet another object of the present invention to evaluate answers for guess work.
[0014] It is yet another object of the present invention to declare the position of learner comparing with regional/national level.
[0015] According to an aspect of the present invention, an adaptive testing and question generation method is disclosed. The method comprises a step of configuring a novel adaptive testing and question generation system for automatically generating question-answer pairs with varying phrases, difficulty levels, and variables, and intelligently showing questions in each test as per the learner's ability and characteristics.
[0016] In accordance with an aspect of the present invention, wherein the method of working the novel adaptive testing and question generation system includes selecting a concept for generation of questions based on a difficulty level.
[0017] In accordance with an aspect of the present invention, wherein the method of working the novel adaptive testing and question generation system includes displaying a plurality of questions as per 'learning ability of a learner' based on a previous year's result.
[0018] In accordance with an aspect of the present invention, wherein the method of working the novel adaptive testing and question generation system includes generating questions by changing question structure, changing variables, shuffling answer choices and real time difficulty level.
[0019] In accordance with an aspect of the present invention, wherein the method of working the novel adaptive testing and question generation system includes ranging difficulty from easy to hard by evaluating struggle of learner.
[0020] In accordance with an aspect of the present invention, wherein the method of working the novel adaptive testing and question generation system includes evaluating the questions generated by question generation module for fluency and novelty.
[0021] In accordance with an aspect of the present invention, wherein the method of working the novel adaptive testing and question generation system includes evaluating answers considering factors like partial understanding, slips and guessing answers.
[0022] In accordance with an aspect of the present invention, wherein the method of working the novel adaptive testing and question generation system includes concluding the results on performance as poor subject knowledge/good subject knowledge.
[0023] In accordance with an aspect of the present invention, wherein the method of working the novel adaptive testing and question generation system includes displaying the analytics about performance in terms of guessed answers, wrong answers, time of response to each question, and subject knowledge.
[0024] In accordance with an aspect of the present invention, wherein the method of working the novel adaptive testing and question generation system includes recommending the study material to revise the concepts for improving knowledge.
[0025] In accordance with an aspect of the present invention, wherein the method of working the novel adaptive testing and question generation system includes declaring the position of learner comparing with regional/national level.
[0026] In accordance with an aspect of the present invention, wherein the method of working the novel adaptive testing and question generation system includes guiding teachers and parents on learner's strengths and areas of improvement.
[0027] In accordance with an aspect of the present invention, wherein the said adaptive testing module enables the user to select a concept or subject to test knowledge and performance.
[0028] In accordance with an aspect of the present invention, wherein the said output device is used to display the plurality of questions generated by question generation module.
[0029] In accordance with an aspect of the present invention, wherein the said question generation module generates questions by changing question structure, changing variables, shuffling answer choices and real time difficulty level based on difficulty level.
[0030] In accordance with an aspect of the present invention, wherein the said evaluation module sets the level of difficulty ranging from hard to easy based on the struggle of the learner to reach the benchmark and to assess the level of subject knowledge to conclude.
[0031] In accordance with an aspect of the present invention, wherein moderator evaluates for novelty and fluency in questions generated by question generation model.
[0032] In accordance with an aspect of the present invention, wherein the adaptive testing module determines guess work based on time of response and target level reached by learner.
[0033] In accordance with an aspect of the present invention, wherein the adaptive testing module evaluates the guesswork of the learner by analyzing the min/max response time; if the response time is less than the average minimum response time, then it concludes that it is guesswork.
[0034] In accordance with an aspect of the present invention, wherein the evaluation module evaluates answers considering factors like partial understanding, slips and guessing answers.
[0035] In accordance with an aspect of the present invention, wherein the adaptive testing module provides analytics of learner performance in all subjects, analyzing the areas of improvement, assessing the response time (guesswork or not), ranking the resources based on views and likes, evaluating the regional and national position of the learner.
[0036] In accordance with an aspect of the present invention, the adaptive testing module utilizes resources like videos, Time-based tests, quizzes, exam questions, flashcards, randomly generated questions and teacher assigned tests to promote active recalling and spaced repetitions.
[0037] In accordance with an aspect of the present invention, wherein the adaptive testing module diagnoses the potential areas to be concerned by learner, enables the learner to get a picture of their learning journey to change and/or create brand new schemes of learning.
[0038] According to another aspect of present invention, a novel adaptive testing and question generation system is disclosed. The system comprises an adaptive testing module, a question generation module, an answers evaluation module, and a moderator.
[0039] In accordance with the aspect of present invention, the output and input devices are connected to terminals through a network. The adaptive testing module ranges the difficulty levels of assessment from high to low based on the previous year marks of a learner and determines target based on response time and correctly answered questions in mock/previous tests.
[0040] In accordance with the aspect of present invention, the question generation module generates questions based on four aspects like successful control for difficulty, novelty, fluency and latency. The module manipulates the question structure, generates more distractors, creates more descriptiveness and changes the variables to assess the subject knowledge and performance.
[0041] In accordance with the aspect of present invention, the moderator evaluates fluency and novelty in the questions generated by the question generation module.
[0042] In accordance with the aspect of present invention, the answers evaluation module observes indirectly connected latent events for learning history like partial understanding, slips and guessing answers.
[0043] In accordance with the aspect of present invention, the said system detects the guess work by evaluating answers based on min/max time of response required to answer the question.
[0044] In accordance with the aspect of present invention, the said system declares the performance of the learner based on views and likes and the response time for each question, and estimates the benchmark comparing the previous year marks for estimation of the regional and national level position of the learner.
[0045] Further objects, features, and advantages of the invention will be readily apparent from the following description of the preferred embodiments thereof, taken in conjunction with the accompanying drawings.
Brief Description of the Drawings
[0046] The invention will be further understood from the following detailed description of a preferred embodiment taken in conjunction with an appended drawing, in which: Fig. 1 illustrates a method of testing learner's performance and ability in accordance with an exemplary embodiment of the present invention.
Fig. 2 illustrates the block diagram showing various features of adaptive testing module in accordance with an exemplary embodiment of the present invention.
Fig. 3 illustrates the block diagram showing features of answer evaluation module in accordance with an exemplary embodiment of the present invention.
Fig. 4 illustrates the block diagram showing question generation module generating questions from multiple choices to descriptive questions in accordance with an exemplary embodiment of the present invention.
Fig. 5 illustrates the evaluation of the learner reaching the target level in accordance with an exemplary embodiment of the present invention.
Fig. 6 illustrates the block diagram in which the questions generated by the question generation algorithm are evaluated for fluency and novelty by a moderator, in accordance with an exemplary embodiment of the present invention.
Fig. 7 illustrates the working of adaptive learning module evaluating the guesswork of the students in accordance with an exemplary embodiment of the present invention.
Fig. 8 illustrates the adaptive testing process of analyzing the performance and displaying analytics in accordance with an exemplary embodiment of the present invention.
Fig. 9 illustrates the graphical curves comparing initial study with reviews in accordance with an exemplary embodiment of the present invention.
Fig. 10 illustrates the target level set by evaluating the time of response, the previous years' benchmark and the number of questions answered in the test in accordance with an exemplary embodiment of the present invention.
Fig. 11 illustrates the flow diagram representing average weight for predicting level of a student in accordance with an exemplary embodiment of the present invention.
Fig. 12 illustrates the flow diagram representing extra average weight for predicting level of a student in accordance with an exemplary embodiment of the present invention.
Fig. 13 illustrates the block diagram representing the working of the question generation system, with the question generation module and answer option generators creating hypernyms and distractors, in accordance with an exemplary embodiment of the present invention.
Fig. 14 illustrates the architectural diagram representing Azure Functions as services, in accordance with an exemplary embodiment of the present invention.
Detailed Description of the invention
[0047] It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the drawings. The present disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. In addition, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting.
[0048] The use of "including", "comprising" or "having" and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. The terms "a" and "an" herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced items. Further, the use of terms "first", "second", and "third", and the like, herein do not denote any order, quantity, or importance, but rather are used to distinguish one element from another.
[0049] According to an exemplary embodiment of the present invention, an adaptive testing and question generation method is disclosed. The method comprises a step of configuring a novel adaptive testing and question generation system for automatically generating question-answer pairs with varying phrases, difficulty levels, and variables, and intelligently showing questions in each test as per the learner's ability and characteristics.
[0050] In accordance with the exemplary embodiment of the present invention, wherein the method of working the novel adaptive testing and question generation system includes selecting a concept for generation of questions based on a difficulty level.
[0051] In accordance with the exemplary embodiment of the present invention, wherein the method of working the novel adaptive testing and question generation system includes displaying a plurality of questions as per 'learning ability of a learner' based on a previous year's result.
[0052] In accordance with the exemplary embodiment of the present invention, wherein the method of working the novel adaptive testing and question generation system includes generating questions by changing question structure, changing variables, shuffling answer choices and real time difficulty level.
[0053] In accordance with the exemplary embodiment of the present invention, wherein the method of working the novel adaptive testing and question generation system includes ranging difficulty from easy to hard by evaluating struggle of learner.
[0054] In accordance with the exemplary embodiment of the present invention, wherein the method of working the novel adaptive testing and question generation system includes evaluating the questions generated by question generation module for fluency and novelty.
[0055] In accordance with the exemplary embodiment of the present invention, wherein the method of working the novel adaptive testing and question generation system includes evaluating answers considering factors like partial understanding, slips and guessing answers.
[0056] In accordance with the exemplary embodiment of the present invention, wherein the method of working the novel adaptive testing and question generation system includes concluding the results on performance as poor subject knowledge/good subject knowledge.
[0057] In accordance with the exemplary embodiment of the present invention, wherein the method of working the novel adaptive testing and question generation system includes displaying the analytics about performance in terms of guessed answers, wrong answers, time of response to each question, and subject knowledge.
[0058] In accordance with the exemplary embodiment of the present invention, wherein the method of working the novel adaptive testing and question generation system includes recommending the study material to revise the concepts for improving knowledge.
[0059] In accordance with the exemplary embodiment of the present invention, wherein the method of working the novel adaptive testing and question generation system includes declaring the position of the learner comparing with the regional/national level.
[0060] In accordance with the exemplary embodiment of the present invention, wherein the method of working the novel adaptive testing and question generation system includes guiding teachers and parents on the learner's strengths and areas of improvement.
[0061] In accordance with the exemplary embodiment of the present invention, wherein the said adaptive testing module enables the user to select a concept or subject to test knowledge and performance.
[0062] In accordance with the exemplary embodiment of the present invention, wherein the said output device is used to display the plurality of questions generated by question generation module.
[0063] In accordance with the exemplary embodiment of the present invention, wherein the said question generation module generates questions by changing question structure, changing variables, shuffling answer choices and real time difficulty level based on difficulty level.
[0064] In accordance with the exemplary embodiment of the present invention, wherein the said evaluation module sets the level of difficulty ranging from hard to easy based on the struggle of the learner to reach the benchmark and to assess the level of subject knowledge to conclude.
[0065] In accordance with the exemplary embodiment of the present invention, wherein moderator evaluates for novelty and fluency in questions generated by question generation model.
[0066] In accordance with the exemplary embodiment of the present invention, wherein the adaptive testing module determines guess work based on time of response and target level reached by learner.
[0067] In accordance with the exemplary embodiment of the present invention, wherein the adaptive testing module evaluates the guesswork of the learner by analyzing the min/max response time; if the response time is less than the average minimum response time, then it concludes that it is guesswork.
[0068] In accordance with the exemplary embodiment of the present invention, wherein the evaluation module evaluates answers considering factors like partial understanding, slips and guessing answers.
[0069] In accordance with the exemplary embodiment of the present invention, wherein the adaptive testing module provides analytics of learner performance in all subjects, analyzing the areas of improvement, assessing the response time (guesswork or not), ranking the resources based on views and likes, evaluating the regional and national position of the learner.
[0070] In accordance with the exemplary embodiment of the present invention, the adaptive testing module utilizes resources like videos, Time-based tests, quizzes, exam questions, flashcards, randomly generated questions and teacher assigned tests to promote active recalling and spaced repetitions.
[0071] In accordance with the exemplary embodiment of the present invention, wherein the adaptive testing module diagnoses the potential areas to be concerned by learner, enables the learner to get a picture of their learning journey to change and/or create brand new schemes of learning.
[0072] Referring to the figures now, Fig. 1 illustrates the flow process of testing the learner's performance and ability in accordance with the exemplary embodiment of the present invention. In accordance with the exemplary embodiment of the present invention, the learner is enabled to select the concept or subject to test his knowledge and performance. The questions in the test are generated by the question generation module and are evaluated for fluency and novelty by the moderator. The difficulty level of the questions displayed to the learner is chosen by analyzing his performance in the mock/previous tests in the last academic year, and those marks are benchmarked. After analyzing the difficulty level, the test gets started. The questions range from easy to difficult and from short descriptive to long descriptive questions. The tests range from time-based tests, quizzes, exam questions, flashcards, and randomly generated questions to teacher-assigned tests.
[0073] At the foundation level, the student is offered content related to their ability, such as resources, videos, and practice tests, to master their level and progress them to the next level. If the student is of the medium level, then content related to their ability, such as resources, videos and practice tests, is likewise offered to master their level and progress them to the next level. If the student is of the advanced level, then the content displayed is advanced-level content, in the form of resources, videos and practice tests.
[0074] Fig. 2 illustrates the block diagram showing various features of the adaptive testing module in accordance with the exemplary embodiment of the present invention. One of the key features of the adaptive testing module is personalized learning: providing analytics about the learner's performance in all subjects, analyzing the areas of improvement, assessing the response time (120) (guesswork or not), ranking the resources based on views and likes (122), and estimating the regional and national position of the learner as per the performance in the assessment (124). The module is curated to display questions in such a way that it intelligently generates questions in each test as per the learner's ability and characteristics, thereby enhancing the learning effect conveniently and effectively. Another key feature of the module is that it judges the performance but does not evaluate pass or fail; it suggests learning material and reduces the difficulty level in the mock tests.
[0075] Fig. 3 illustrates the block diagram showing features of the answer evaluation module in accordance with the exemplary embodiment of the present invention. In accordance with the exemplary embodiment of the present invention, it evaluates partial understanding, slips, and guessed answers; the learner's state also fluctuates during the learning process, which leads to the student's stochastic behaviors when practicing the concepts, and the module evaluates the answers by considering these factors. The answer evaluation module observes that the minimum target level is reached by the student (126). It involves changing variables in questions (128) and changing the question itself (130). After observing the minimum target level, it observes whether the target set by the tutor is reached (132).
[0076] Fig. 4 illustrates the block diagram showing the question generation module (132) generating questions from multiple choice to descriptive questions in accordance with the exemplary embodiment of the present invention. The question generation module randomly changes the question's structure (118), generates more distractors, shows questions that are more descriptive (134), or changes the variables in the question. This is done to assess the subject knowledge and performance. If the learner reaches the target level in spite of all constraints, it is declared that the learner has mastered the concepts.
[0077] The question generation module generates questions in three levels, i.e., easy, medium, and difficult (138). The difficulty level is chosen based on the benchmark of previous years (136) and the learner's performance in previous academic years. Based on the level, the questions are generated by paraphrasing and by changing the values if it is a mathematics question. Different similar questions are generated for each user with different values.
[0078] The question type is predicted by a convolutional neural network, which also adds the semantics of the question type to the process of generating questions. This question generation technique can also be applied using a neural encoder-decoder model to generate meaningful and different questions from natural language sentences. The encoder reads the input text and the position of the answer to produce an answer-sensitive input representation, which is sent to the decoder to generate an answer-oriented question. The difficulty of the generated questions can be controlled by the easy, medium and difficult levels of distractors and hypernyms.
[0079] Fig. 5 illustrates the evaluation of the learner reaching the target level in accordance with the exemplary embodiment of the present invention. If the target level is reached, the difficulty in the mocks will increase. Otherwise, the difficulty level is set to easy or medium to improve the subject knowledge and performance, and study material is recommended to the student to revise the concepts. After reaching the highest difficulty level, if the student can still reach the set target level, it can be declared that the student has mastered the concepts and can skip that topic.
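This difficulty-adjustment decision can be summarized in a short sketch. The level names, the step ordering and the recommend_study_material() helper below are illustrative assumptions; the description does not prescribe concrete identifiers.

```python
# Illustrative sketch of the Fig. 5 difficulty-adjustment decision.
# Level names and helper functions are hypothetical.

LEVELS = ["easy", "medium", "hard"]

def recommend_study_material() -> None:
    print("Recommending study material to revise the concepts.")

def adjust_difficulty(current_level: str, target_reached: bool) -> str:
    """Raise mock-test difficulty when the target level is reached;
    otherwise drop back toward easy/medium and recommend revision."""
    idx = LEVELS.index(current_level)
    if target_reached:
        if idx == len(LEVELS) - 1:
            # Highest difficulty passed: the learner has mastered the
            # concept and may skip this topic.
            return "mastered"
        return LEVELS[idx + 1]
    # Target missed: ease off and point the learner at revision material.
    recommend_study_material()
    return LEVELS[max(idx - 1, 0)]

print(adjust_difficulty("medium", target_reached=True))   # hard
print(adjust_difficulty("hard", target_reached=True))     # mastered
```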
[0080] Fig. 6 illustrates the block diagram in which the questions generated by the question generation algorithm are evaluated for fluency and novelty by a moderator in accordance with the exemplary embodiment of the present invention.
[0081] Fig. 7 illustrates the working of the adaptive learning module evaluating the guesswork of the students in accordance with the exemplary embodiment of the present invention. It evaluates in such a way that it can detect guessed answers (140). The adaptive learning model analyses the minimum and maximum response time required to answer every question. This analysis is done using every student's response time (142). The machine learning algorithms analyse all these response times and predict the response time for every question based on the level of difficulty, the description of the question, etc.
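A minimal sketch of this guesswork check follows, assuming the per-question minimum response times are available as plain historical samples (in the full system they would come from the machine learning response-time model):

```python
# An answer is flagged as a guess when its response time falls below the
# average minimum response time observed for that question. The data
# layout here is a hypothetical simplification.

from statistics import mean

def is_guesswork(response_time_s: float, historical_min_times_s: list[float]) -> bool:
    """Flag an answer as guesswork if it was given faster than the
    average minimum response time for the question."""
    avg_min = mean(historical_min_times_s)
    return response_time_s < avg_min

# Example: past minimum times of 12s, 15s and 9s average to 12s,
# so a 4-second answer is treated as a guess.
print(is_guesswork(4.0, [12.0, 15.0, 9.0]))  # True
```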
[0082] Fig. 8 illustrates the adaptive testing process in accordance with the exemplary embodiment of the present invention. It includes a question generation repository and a learner's profile repository. The question generation module generates questions as per the learner's performance in the previous year, and these questions are delivered to learners. To improve subject knowledge and performance, the difficulty level of mocks is increased if the target level is reached; otherwise, the level is set to easy or medium, and study material is recommended to the student to revise the concepts. If, after reaching the highest difficulty level, the student still reaches the set target level, it is declared that the student has mastered the concepts and can skip that topic. The question generation module analyses this mastery level.
[0083] Fig. 9 illustrates the graphical curves comparing initial study with reviews in accordance with the exemplary embodiment of the present invention. Fig. 10 illustrates the target level set by evaluating the time of response, the previous years' benchmark and the number of questions answered in the test in accordance with the exemplary embodiment of the present invention.
[0084] Fig. 11 illustrates the flow diagram representing the average weight for predicting the level of a student in accordance with the exemplary embodiment of the present invention. From the collected questions, the system creates a test with 3 easy, 4 medium and 3 difficult questions, displays the questions and collects the answers. It then calculates the average weight of the student through the equation

$$\text{Average Weight} = \frac{\sum_{i=0.1}^{0.3} W_i \times P(A/Q=i)}{\sum_{i=0.1}^{0.3} W_i \times T_i}$$

Based on the average weight, the level of the student is determined: if the average weight is < 0.5, then the student is at Foundation level; if the average weight is = 0.5, then the student is at Medium level; if the average weight is > 0.5, then the student is at Advanced level.
[0085] Fig. 12 illustrates the flow diagram representing the extra average weight for predicting the level of a student in accordance with the exemplary embodiment of the present invention. The system compensates students from disadvantaged backgrounds (poor education, poor regions or economically backward). The extra average weight is calculated as follows:

$$\text{Average Weight} = \frac{\sum_{i=0.1}^{0.3} W_i \times P(A/Q=i)}{\sum_{i=0.1}^{0.3} W_i \times T_i} + W_p \times \frac{\sum_{i=0.1}^{0.3} W_i \times P(A/Q=i)}{\sum_{i=0.1}^{0.3} W_i \times T_i}$$

where W_p is the extra weight for the disadvantaged background.

[0086] Fig. 13 illustrates the block diagram representing the working of the question generation system, with the question generator creating hypernyms and distractors and the answer option generators creating hypernyms and distractors, in accordance with the exemplary embodiment of the present invention.
[0087] Personalized Learning Average Weight Formula Predictor: To predict the learning level of a student and personalize their learning, the latest technologies like Artificial Intelligence and Machine Learning are used to predict the student assessment level.
[0088] At the initial level, the students are assigned a test which contains 3 Easy, 4 Medium, and 3 Hard questions. The weightage provided for each level of question is:
W_E - Weight of Easy-level Question = W_0.1 = 0.1
W_M - Weight of Medium-level Question = W_0.2 = 0.2
W_H - Weight of High-level Question = W_0.3 = 0.3
By using the average weightage formula, the level of the student is decided.
[0089] Average Weight Prediction Formula: At the initial level, the students are assigned a test which contains Easy, Medium and Hard questions. Each level of question carries a unique weight, as mentioned above.
$$\text{Average Weight} = \frac{\sum_{i=0.1}^{0.3} W_i \times P(A/Q=i)}{\sum_{i=0.1}^{0.3} W_i \times T_i} = \frac{\sum_{i=0.1}^{0.3} W_i \times N_i}{\sum_{i=0.1}^{0.3} W_i \times T_i}$$

wherein:
P(A/Q=H) - probability that the student answered the questions where the question difficulty is High
P(A/Q=M) - probability that the student answered the questions where the question difficulty is Medium
P(A/Q=E) - probability that the student answered the questions where the question difficulty is Easy
N_E - number of easy-level questions that are correct = P(A/Q=E)
N_M - number of medium-level questions that are correct = P(A/Q=M)
N_H - number of high-level questions that are correct = P(A/Q=H)
T_E - total number of easy-level questions
T_M - total number of medium-level questions
T_H - total number of high-level questions

[0090] Sample Flow Diagram for Average Weight: The below flow diagram gives the addition of weight for each level while calculating the average weight.
The flow diagram associated is FIG. 11.
The student level prediction in the exam will be decided by the average weight equation. The below process is used to predict student performance:
* Create a test with 3 easy, 4 medium and 3 difficult questions.
* Display the test to the given student and collect their answers.
* Use the average weight formula to calculate the average weight of the student's test.
* If the average weight is < 0.5, then the student is at Foundation level, so display only Foundation level course contents, resources, videos and tests, etc.
* If the average weight is = 0.5, then the student is at Medium level, so display only Medium level course contents, resources, videos and tests, etc.
* If the average weight is > 0.5, then the student is at Advanced level, so display only Advanced level course contents, resources, videos and tests, etc.

[0091] Average Weight Formula for Historical Test Results: Example of predicting student level in a given class of 9 students. Assume that there are 10 questions in a given test, with 3 easy, 4 medium and 3 difficult questions. The table below shows the statistics of the score and average weight obtained by the students in the test.
| Total 10 Questions Level in a Test | 3 Easy | 4 Medium | 3 Hard | Average Weight |
|---|---|---|---|---|
| Questions Per Level | 3 | 4 | 3 | |
| Weight of each question | 0.1 | 0.2 | 0.3 | |
| Student 1 - Correctly Answered Questions | 1 | 1 | 1 | 0.3 |
| Student 2 - Correctly Answered Questions | 2 | 2 | 2 | 0.6 |
| Student 3 - Correctly Answered Questions | 1 | 2 | 3 | 0.7 |
| Student 4 - Correctly Answered Questions | 1 | 3 | 3 | 0.8 |
| Student 5 - Correctly Answered Questions | 3 | 3 | 3 | 0.9 |
| Student 6 - Correctly Answered Questions | 2 | 2 | 1 | 0.45 |
| Student 7 - Correctly Answered Questions | 1 | 2 | 3 | 0.7 |
| Student 8 - Correctly Answered Questions | 3 | 3 | 2 | 0.75 |
| Student 9 - Correctly Answered Questions | 3 | 4 | 3 | 1 |

[0092] To calculate the average weight of student 1 in the above table: student 1 answered 1 easy, 1 medium and 1 hard question out of the 10 questions in the test. Applying the average weight formula:

Average Weight of student 1 = (0.3x1 + 0.2x1 + 0.1x1) / (0.3x3 + 0.2x4 + 0.1x3) = 0.6 / 2 = 0.3

[0093] The average weight of student 1 is 0.3, which is less than the threshold weightage of 0.5. Therefore, the student is at the Foundation level. As the student is at the Foundation level, the next test displayed will mainly have foundation-level questions in order for them to master it and be able to progress.
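The calculation of [0084]-[0093] can be sketched as follows. The dictionary-based data layout and function names are illustrative; the weights (0.1/0.2/0.3), the 3-4-3 test layout and the 0.5 threshold come from the description:

```python
# Sketch of the average weight predictor. Identifiers are illustrative.

WEIGHTS = {"easy": 0.1, "medium": 0.2, "hard": 0.3}
TOTALS = {"easy": 3, "medium": 4, "hard": 3}   # questions per level

def average_weight(correct: dict[str, int], totals: dict[str, int] = TOTALS) -> float:
    """Average Weight = sum(Wi * Ni) / sum(Wi * Ti)."""
    num = sum(WEIGHTS[lvl] * n for lvl, n in correct.items())
    den = sum(WEIGHTS[lvl] * t for lvl, t in totals.items())
    return num / den

def student_level(avg: float) -> str:
    if avg < 0.5:
        return "Foundation"
    if avg == 0.5:
        return "Medium"
    return "Advanced"

# Student 1 from the table: 1 easy, 1 medium, 1 hard correct.
w = average_weight({"easy": 1, "medium": 1, "hard": 1})
print(w, student_level(w))  # 0.3 Foundation
```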
[0094] Extra Weightage Adjustment: The extra weightage adjustment is used to compensate students from disadvantaged backgrounds, i.e., poor education, poor regions or economically backward.
Extra weight adjustment for disadvantaged backgrounds:

$$\text{Average Weight} = \frac{\sum_{i=0.1}^{0.3} W_i \times P(A/Q=i)}{\sum_{i=0.1}^{0.3} W_i \times T_i} + W_p \times \frac{\sum_{i=0.1}^{0.3} W_i \times P(A/Q=i)}{\sum_{i=0.1}^{0.3} W_i \times T_i}$$

[0095] Poor Regional Students Weightage Calculation: An extra weightage of 0.05 is added to the overall weight for students from poorer regions, so that there is no disadvantage for these students compared to other students who have better education facilities (e.g., access to the internet, tuition and better teaching). If a student achieved the maximum score in a test, then no extra weight is given, to avoid a score > 1.
| Total 10 Questions Level in a Test | 3 Easy | 4 Medium | 3 Hard | Average Weight | Poor Region at Test Level (+0.05): (Avg Wt + (Avg Wt x Region Weight)) |
|---|---|---|---|---|---|
| Questions Per Level | 3 | 4 | 3 | | |
| Weight of each question | 0.1 | 0.2 | 0.3 | | 0.05 |
| Student 1 - Correctly Answered Questions | 1 | 1 | 1 | 0.3 | 0.315 |
| Student 2 - Correctly Answered Questions | 2 | 2 | 2 | 0.6 | 0.63 |
| Student 3 - Correctly Answered Questions | 1 | 2 | 3 | 0.7 | 0.735 |
| Student 4 - Correctly Answered Questions | 1 | 3 | 3 | 0.8 | 0.84 |
| Student 5 - Correctly Answered Questions | 3 | 3 | 3 | 0.9 | 0.945 |
| Student 6 - Correctly Answered Questions | 2 | 2 | 1 | 0.45 | 0.4725 |
| Student 7 - Correctly Answered Questions | 1 | 2 | 3 | 0.7 | 0.735 |
| Student 8 - Correctly Answered Questions | 3 | 3 | 2 | 0.75 | 0.7875 |
| Student 9 - Correctly Answered Questions | 3 | 4 | 3 | 1 | 1 |

[0096] Sample Flow Diagram for Extra Average Weight: The below flow diagram gives the way of adding the extra weight while calculating the average weight. The flow diagram associated is FIG. 12.
[0097] Response Time Weightage: If some students performed well by answering questions correctly in less time compared to other students, then we can say they are more skilled, so we can award extra weightage to predict their level correctly compared to other students. The test average time is 5 minutes out of the allocated 10 minutes; for an intelligent student, it will take a minimum of 2.5 minutes to complete. Therefore, if a student finishes a test in under 2.5 minutes, even though they might have answered all questions correctly, no extra weightage is awarded, as the threshold minimum limit of 2.5 minutes to complete a test is breached; we do not give any weightage to students who completed the test below 2.5 minutes. An extra weight of 0.02 will be added by considering the response time taken, and the new average weight will be taken into consideration for the calculation shown in the below table.
| Total 10 Questions Level in a Test | 3 Easy | 4 Medium | 3 Hard | Average Weight | Time Advantage (+0.02): (Avg Wt + (Avg Wt x Time Advantage Weight)) |
|---|---|---|---|---|---|
| Questions Per Level | 3 | 4 | 3 | | |
| Weight of each question | 0.1 | 0.2 | 0.3 | | |
| Student 1 - Correctly Answered Questions | 1 | 1 | 1 | 0.3 | 0.306 |
| Student 2 - Correctly Answered Questions | 2 | 2 | 2 | 0.6 | 0.612 |
| Student 3 - Correctly Answered Questions | 1 | 2 | 3 | 0.7 | 0.714 |
| Student 4 - Correctly Answered Questions | 1 | 3 | 3 | 0.8 | 0.816 |
| Student 5 - Correctly Answered Questions | 3 | 3 | 3 | 0.9 | 0.918 |
| Student 6 - Correctly Answered Questions | 2 | 2 | 1 | 0.45 | 0.459 |
| Student 7 - Correctly Answered Questions | 1 | 2 | 3 | 0.7 | 0.714 |
| Student 8 - Correctly Answered Questions | 3 | 3 | 2 | 0.75 | 0.765 |
| Student 9 - Correctly Answered Questions | 3 | 4 | 3 | 1 | 1 |

[0098] Poor School Weightage within a Region: There are possibilities where both good and average schools exist within a single region. If a student belongs to a poor school, then an extra weight of 0.03 is added, considering the poor school weightage within the region. If the student belongs to a good or average school, then no extra weight is added to the average weight.
[0099] Below is a sample calculation of the average weight after adding extra weight.
| Total 10 Questions Level in a Test | 3 Easy | 4 Medium | 3 Hard | Average Weight | Poor School Weight (+0.03): (Avg Wt + (Avg Wt x Poor School Weight)) |
|---|---|---|---|---|---|
| Questions Per Level | 3 | 4 | 3 | | |
| Weight of each question | 0.1 | 0.2 | 0.3 | | |
| Student 1 - Correctly Answered Questions | 1 | 1 | 1 | 0.3 | 0.309 |
| Student 2 - Correctly Answered Questions | 2 | 2 | 2 | 0.6 | 0.618 |
| Student 3 - Correctly Answered Questions | 1 | 2 | 3 | 0.7 | 0.721 |
| Student 4 - Correctly Answered Questions | 1 | 3 | 3 | 0.8 | 0.824 |
| Student 5 - Correctly Answered Questions | 3 | 3 | 3 | 0.9 | 0.927 |
| Student 6 - Correctly Answered Questions | 2 | 2 | 1 | 0.45 | 0.4635 |
| Student 7 - Correctly Answered Questions | 1 | 2 | 3 | 0.7 | 0.721 |
| Student 8 - Correctly Answered Questions | 3 | 3 | 2 | 0.75 | 0.7725 |
| Student 9 - Correctly Answered Questions | 3 | 4 | 3 | 1 | 1 |

[0100] Combining all Additional Weights to Predict Student Level: The below table is a sample of the average weight after considering all additional weights like time advantage (0.02), poor region (0.05), no resources (0.02) and school significance (0.03). We can observe the difference between the average weight and the average weight obtained after considering the additional weights.
| Total 10 Questions Level in a Test | 3 Easy | 4 Medium | 3 Hard | Average Weight | Average Weight After Additional Weights (Time Advantage + Poor Region + No Resources + School Significance) |
|---|---|---|---|---|---|
| Questions Per Level | 3 | 4 | 3 | | |
| Weight of each question | 0.1 | 0.2 | 0.3 | | |
| Student 1 - Correctly Answered Questions | 1 | 1 | 1 | 0.3 | 0.336 |
| Student 2 - Correctly Answered Questions | 2 | 2 | 2 | 0.6 | 0.672 |
| Student 3 - Correctly Answered Questions | 1 | 2 | 3 | 0.7 | 0.784 |
| Student 4 - Correctly Answered Questions | 1 | 3 | 3 | 0.8 | 0.896 |
| Student 5 - Correctly Answered Questions | 3 | 3 | 3 | 0.9 | 1 |
| Student 6 - Correctly Answered Questions | 2 | 2 | 1 | 0.45 | 0.504 |
| Student 7 - Correctly Answered Questions | 1 | 2 | 3 | 0.7 | 0.784 |
| Student 8 - Correctly Answered Questions | 3 | 3 | 2 | 0.75 | 0.84 |
| Student 9 - Correctly Answered Questions | 3 | 4 | 3 | 1 | 1 |
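A sketch combining the additional-weight adjustments of [0095]-[0100]. The multiplicative application and the cap at 1 follow the tables above; the parameter names and the exact handling of the 2.5-minute response-time floor are illustrative assumptions:

```python
# Sketch of the combined adjustments: poor region (+0.05), response-time
# advantage (+0.02), no resources (+0.02) and poor school (+0.03). Each
# applicable extra multiplies the base average weight; the result is
# capped at 1, matching the sample tables.

def adjusted_weight(avg: float, *, poor_region: bool = False,
                    time_taken_min: float | None = None,
                    no_resources: bool = False,
                    poor_school: bool = False) -> float:
    extra = 0.0
    if poor_region:
        extra += 0.05
    if no_resources:
        extra += 0.02
    if poor_school:
        extra += 0.03
    # Assumed rule: +0.02 for finishing faster than the 5-minute test
    # average, but nothing at all if the 2.5-minute floor is breached.
    if time_taken_min is not None and 2.5 <= time_taken_min < 5.0:
        extra += 0.02
    return min(avg + avg * extra, 1.0)

# Student 1 (avg 0.3) with all four adjustments: 0.3 * 1.12 = 0.336.
print(adjusted_weight(0.3, poor_region=True, time_taken_min=3.0,
                      no_resources=True, poor_school=True))
```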
[0101] Calculation of Average Weight Based on Month: The average weight is calculated monthly for every student based on his/her score in the month. We add extra weight for every month and decrease the weight for the next coming month by a slight difference. More weight is given in the first month to motivate the students initially; as the months pass, the weight of the month decreases, and this is taken into consideration in the calculation of the average weight.

$$\text{Monthly Average Weight of Student} = \frac{\sum_{i=1}^{n} LW_i \times M_i}{\sum_{i=1}^{n} LW_i \times T_i}$$

where LW_i = weight of level/month i, M_i = marks scored in the current month, and T_i = total marks of the tests in the month.
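A sketch of this monthly calculation, using the decaying month weights (0.15 down to 0.11) and the 10-mark tests from the sample table below; identifiers are illustrative:

```python
# Sketch of the monthly average weight. The month weights come from the
# sample table; each month's test is assumed to be out of 10 marks.

MONTH_WEIGHTS = [0.15, 0.14, 0.13, 0.12, 0.11]  # May ... Sep

def monthly_average_weight(marks: list[int], total: int = 10) -> float:
    """Monthly Average Weight = sum(LWi * Mi) / sum(LWi * Ti)."""
    num = sum(w * m for w, m in zip(MONTH_WEIGHTS, marks))
    den = sum(w * total for w in MONTH_WEIGHTS)
    return num / den

# Student-1 from the table scored 4, 6, 8, 5, 6 across the five months.
print(monthly_average_weight([4, 6, 8, 5, 6]))  # ~0.5753846154
```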
The below table is a sample of the average weight and also of the average weight after considering all additional weights like time advantage (0.02), poor region (0.05), no resources (0.02) and school significance (0.03). We can observe the difference between the average weight and the average weight obtained after considering the additional weights.

| Levels | L1 | L2 | L3 | L4 | L5 | Average weight of past score | Average Weight After Additional Weights (Time Advantage + Poor Region + No Resources + School Significance) |
|---|---|---|---|---|---|---|---|
| Month | May | June | July | August | Sep | | |
| Sample Score | 4/10 | 6/10 | 8/10 | 5/10 | 6/10 | | |
| Weight | 0.15 | 0.14 | 0.13 | 0.12 | 0.11 | | |
| Student-1 Score | 4 | 6 | 8 | 5 | 6 | 0.5753846154 | 0.6386769231 |
| Student-2 Score | 1 | 1 | 1 | 1 | 1 | 0.1 | 0.111 |
| Student-3 Score | 1 | 2 | 3 | 4 | 5 | 0.2846153846 | 0.3159230769 |
| Student-4 Score | 5 | 4 | 3 | 2 | 1 | 0.3153846154 | 0.3500769231 |
| Student-5 Score | 7 | 5 | 6 | 4 | 2 | 0.4969230769 | 0.5515846154 |
| Student-6 Score | 6 | 6 | 6 | 6 | 6 | 0.6 | 0.666 |
| Student-7 Score | 3 | 3 | 3 | 3 | 3 | 0.3 | 0.333 |
| Student-8 Score | 9 | 9 | 9 | 9 | 9 | 0.9 | 0.999 |
| Student-9 Score | 2 | 5 | 3 | 6 | 9 | 0.4769230769 | 0.5293846154 |
| Student-10 Score | 9 | 2 | 1 | 2 | 7 | 0.4261538462 | 0.4730307692 |

[0102] Overall Percentage Calculation: The ranking is provided based on the total number of questions attempted out of the total number of questions. The ranking calculation differs based on the region.
Below is the sample ranking pattern calculated based on overall percentage.
| Overall Calculation | Overall Marks out of 50 | Overall Percentage (All Subject Score/All Subject Questions) x 100 | Rank |
|---|---|---|---|
| Student-1 Score | 48 | 96 | 1 |
| Student-2 Score | 45 | 90 | 2 |
| Student-3 Score | 40 | 80 | 3 |
| Student-4 Score | 35 | 70 | 4 |
| Student-5 Score | 30 | 60 | 5 |
| Student-6 Score | 25 | 50 | 6 |
| Student-7 Score | 20 | 40 | 7 |
| Student-8 Score | 18 | 36 | 8 |
| Student-9 Score | 15 | 30 | 9 |
| Student-10 Score | 10 | 20 | 10 |
| Student-11 Score | 5 | 10 | 11 |
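The ranking of [0102] reduces to simple arithmetic; a sketch follows, with illustrative names and the 50-mark total from the sample:

```python
# Percentage = all-subject score over the total marks, rank assigned in
# descending order of percentage. Data mirrors part of the sample table.

def rank_students(scores: dict[str, int], total_marks: int = 50) -> list[tuple[str, float, int]]:
    pct = {name: score / total_marks * 100 for name, score in scores.items()}
    ordered = sorted(pct.items(), key=lambda kv: kv[1], reverse=True)
    return [(name, p, rank) for rank, (name, p) in enumerate(ordered, start=1)]

for name, p, rank in rank_students({"Student-1": 48, "Student-2": 45, "Student-3": 40}):
    print(name, p, rank)  # Student-1 96.0 1, Student-2 90.0 2, Student-3 80.0 3
```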
[0103] Question Generation: The goal of Unsupervised Relationship Extraction is to identify the main named entities and then use distractors, without any prior knowledge of the semantic types of the relationships. One of the challenges in exam generation is the selection of a question set that is of appropriate difficulty with good coverage of the material. Based on the input level provided to the model, the question is generated by paraphrasing and by changing the values if it is a question from the topic of Maths. Different similar questions are generated for each user with different values. We can use a Convolutional Neural Network (CNN) to predict the question type and then incorporate the question type semantics into the Question Generation process.
[0104] This Question Generation technique can also be applied using a neural encoder-decoder model to generate meaningful and different questions from natural language sentences. The encoder reads the input text and the position of the answer to produce an answer-sensitive input representation, which is sent to the decoder to generate an answer-oriented question. We need to generate adequate distractors, i.e. reasonable but incorrect alternative entity-words. A good exam should contain questions with a range of difficulty, so that the student's understanding is accurately assessed. An exam should contain a mix of easy, medium and high difficulty questions to distinguish students with a good understanding of the subject from intermediate or weak students. The difficulty of the generated questions can be controlled by the easy, medium and difficult levels of distractors and hypernyms, etc. The rule-based question generation with distractors and hypernyms can achieve reasonably good performance.
[0105] Rules and algorithm of question generation module (132):
Step 1: Read the question.
Step 2: Add the appropriate distractors to the question.
Step 3: Add the appropriate hypernyms to the question.
Step 4: Based on the input difficulty level, add the multiplication factor to the question.
Step 5: If the subject is Maths and the difficulty level is Easy, apply a multiplication factor between 3 and 5; else if the subject is not Maths, go to Step 8.
Step 6: If the subject is Maths and the difficulty level is Medium, apply a multiplication factor between 6 and 9; else if the subject is not Maths, go to Step 8.
Step 7: If the subject is Maths and the difficulty level is Hard, apply a multiplication factor between 10.51 and 14.49; else if the subject is not Maths, go to Step 8.
Step 8: Generate the question.

[0106] The benefits of using the question generation system are:
* offering students practice retrieving information from memory;
* providing students with feedback about their misunderstandings;
* focusing students' attention on the important concepts;
* reinforcing learning by repeating core concepts in multiple ways to study; and
* motivating students to engage in learning activities by providing accurate level estimation.
[0107] Math question generation based on difficulty level as input to the model: The rules of the question generating system for very easy, easy, medium and hard levels are shown below.
Rule 1: If the input is very easy, then change every number in the given question by multiplying it by 2 to generate a new variety of very easy question.
Rule 2: If the input is easy, then change every number in the given question by multiplying it by a random number generated from 3 to 5 to generate a new variety of easy question.
Rule 3: If the input is medium, then change every number in the given question by multiplying it by a random number generated from 6 to 9 to generate a new variety of medium-level question.
Rule 4: If the input is hard, then change every number in the given question by multiplying it by a random number generated from 10.51 to 14.49 to generate a new variety of hard-level question.
Consider the below example for the generation of maths questions based on the multiplication factor.
Easy Level Question Multiplied by factor 4.
Medium Level Question Multiplied by factor 7.
Hard Level Question Multiplied by factor 12.3.

[0108] Question Generator Sample Flow Diagram:

| Difficulty Level | Question Generation |
|---|---|
| Very Easy | Luis bought a pizza which costs £6. Find the cost of 4 pizzas |
| Easy | Luis bought a pizza which costs £24. Find the cost of 16 pizzas |
| Medium | Luis bought a pizza which costs £42. Find the cost of 28 pizzas |
| Hard | Luis bought a pizza which costs £73.8. Find the cost of 49 pizzas |
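The multiplication-factor rules of [0105]-[0108] can be sketched as follows. The regex-based number substitution and the template text are illustrative; distractor and hypernym insertion (Steps 2-3) is omitted:

```python
# Sketch of rule-based maths question generation: every number in the
# template is scaled by a level-dependent factor.

import random
import re

FACTOR_RANGES = {
    "very easy": (2, 2),
    "easy": (3, 5),
    "medium": (6, 9),
    "hard": (10.51, 14.49),
}

def generate_math_question(template: str, level: str) -> str:
    lo, hi = FACTOR_RANGES[level]
    # Hard uses a real-valued factor range; the other levels use integers.
    factor = random.uniform(lo, hi) if isinstance(lo, float) else random.randint(lo, hi)
    def scale(match: re.Match) -> str:
        return f"{float(match.group()) * factor:g}"
    return re.sub(r"\d+(?:\.\d+)?", scale, template)

base = "Luis bought a pizza which costs £6. Find the cost of 4 pizzas"
print(generate_math_question(base, "easy"))  # e.g. factor 4 -> £24, 16 pizzas
```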
[0109] Architecture Design for Adaptive Testing and Question Generation Model: To start with, our application will be in Azure, where we have different API functions which are managed by Azure API Management. All the functionalities related to Question Generation, Question Creation, Prediction Level and Results are managed by APIs.
[0110] Fig. 14 illustrates the architectural diagram representing Azure Functions as services, in accordance with an exemplary embodiment of the present invention. It gives the end-to-end process for the Adaptive Testing and Question Generation Model.
[0111] Below are the APIs that are triggered or called for different functionalities: 1. Admin Questions Repo API: All the questions of the different levels (Easy, Medium and Hard) are fetched using this Admin Questions Repo API.
2. Question Generator API: To generate questions based on the multiplication factor and level, this Question Generator API is triggered.
3. Student Quiz API: To allow students to take the quiz, the Student Quiz API is called, and through the service bus it communicates with the Prediction Level API to predict the score and generate the analytics.
All the reports and error call failures from the API calls are stored in Azure Cosmos DB.
4. Teacher Analytics API: Each action of a student, in terms of taking tests, scores, averages, results, performance and ranks, is stored in the Azure DB. Using all this data, the Teacher Analytics API gives a dashboard of each student with all parameters, which helps the students to improve their skills.
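As a purely hypothetical sketch of how one of these APIs might be exposed, the snippet below wires a minimal version of the question-generation rules into an HTTP-triggered function using the Azure Functions Python v2 programming model; the route, parameter names and inline helper are assumptions, not the patented API contract:

```python
# Hypothetical Question Generator API as an HTTP-triggered Azure Function.

import json
import random
import re

import azure.functions as func

app = func.FunctionApp()

FACTOR_RANGES = {"easy": (3, 5), "medium": (6, 9), "hard": (10.51, 14.49)}

def scale_numbers(template: str, level: str) -> str:
    """Minimal inline version of the rule-based generator sketched above."""
    lo, hi = FACTOR_RANGES[level]
    factor = random.uniform(lo, hi) if isinstance(lo, float) else random.randint(lo, hi)
    return re.sub(r"\d+(?:\.\d+)?", lambda m: f"{float(m.group()) * factor:g}", template)

@app.route(route="question-generator", auth_level=func.AuthLevel.FUNCTION)
def question_generator(req: func.HttpRequest) -> func.HttpResponse:
    level = req.params.get("level", "easy")
    template = req.params.get("template", "")
    if not template or level not in FACTOR_RANGES:
        return func.HttpResponse("Missing or invalid parameters.", status_code=400)
    body = json.dumps({"level": level, "question": scale_numbers(template, level)})
    return func.HttpResponse(body, mimetype="application/json")
```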
[0112] Below are a few definitions that describe the architecture: Azure Functions: Azure Functions is a serverless compute service that enables the user to run event-triggered code without having to provision or manage infrastructure.
Serverless Microservices: Serverless microservices combine both serverless and microservice architectures, which gives the advantage of scalable, loosely coupled services without managing physical servers.
API Management: Azure API Management provides an API gateway that sits in front of the HTTP function. You can use API Management to publish and manage APIs used by client applications. Using a gateway helps to decouple the front-end application from the back-end APIs.
Cosmos DB: Azure Cosmos DB is a multi-model database service. The function application fetches documents from Cosmos DB in response to HTTP GET requests from the client.
Azure Service Bus: It is a reliable cloud messaging service for simple hybrid integration.

Claims (1)

CLAIMS
I/We Claim:
1. A computer implemented adaptive testing and question generation system (100), comprising:
a. an adaptive testing module, a question generation module (132), an answers evaluation module, and a moderator;
b. an output device to display the 'multiple choice and/or descriptive questions' and an input device to allow selection of one or more answers by connecting the terminals through a network;
c. the adaptive testing module:
i. ranges the difficulty levels of assessment from high to low (132) based on the previous year marks of a learner;
ii. determines the target based on response time (120) and correctly answered questions in mock/previous tests;
d. the question generation module (132):
i. generates questions based on four aspects like successful control for difficulty, novelty, fluency and latency;
ii. manipulates the question structure, generates more distractors, creates more descriptiveness and changes the variables to assess the subject knowledge and performance (118);
e. the said moderator:
i. evaluates the fluency and novelty in the questions generated by the question generation module (132);
f. the answers evaluation module:
i. observes indirectly connected latent events for learning history like partial understanding, slips and guessing answers;
wherein the said system detects the guesswork by evaluating answers based on the min/max time of response (120) required to answer the question; and the said system declares the performance of the learner based on views and likes (122) and the response time for each question, and estimates the benchmark comparing the previous year marks (136) for estimation of the regional and national level position of the learner (116).
    2. The system as claimed in claim 1, wherein the question generation module (132) follows the rules and algorithm comprising the steps of: a. reading a question; b. adding appropriate distractors to the question; c. adding appropriate hypernyms to the question; d. adding a multiplication factor to the question based on an input difficulty level; e. generating the question if the subject is not math; f. applying a multiplication factor between 3 and 5 if the subject is math and the difficulty level is easy; g. applying a multiplication factor between 6 and 9 if the subject is math and the difficulty level is medium; h. applying a multiplication factor between 10.51 and 14.49 if the subject is math and the difficulty level is hard; and i. generating the question after applying the multiplication factor based on the difficulty level.
    3. The system as claimed in claim 1, wherein the question generation module (132), with the difficulty level as input to the model, generates very easy, easy, medium and hard level math questions following the steps comprising of: a. changing every number in a given question by multiplying it by 2 to generate a new variety of very easy question; b. changing every number in a given question by multiplying it by a random number generated from 3 to 5 to generate a new variety of easy question; c. changing every number in a given question by multiplying it by a random number generated from 6 to 9 to generate a new variety of medium level question; and d. changing every number in a given question by multiplying it by a random number generated from 10.51 to 14.49 to generate a new variety of hard level question (a minimal sketch of this procedure appears after the claims).
    4. The system as claimed in claim 1, wherein the adaptive testing and question generation system comprises different Application Program Interface (API) functions configured to be managed by an API management subsystem, and all the functionalities related to question generation, question creation, prediction level and results are managed by an Admin Questions Repo API, a Question Generator API, a Student Quiz API, and a Teacher Analytics API.
    5. The system as claimed in claim 1, wherein the Student Quiz API is configured to allow students to take the quiz, and through a service bus it communicates with the prediction level API to predict the score and generate the analytics.
    6. The system as claimed in claim 1, wherein the Teacher Analytics API, associated with a database, is configured to store each action of a student in terms of tests taken, scores, averages, results, performance, and ranks.
    7. The system as claimed in claim 1, wherein the adaptive testing and question generation system, using the Analytics API and all the stored data, provides a dashboard for each student with all parameters, which helps the students to improve their skills.
    8. A computer implemented adaptive testing and question generation method, wherein the method comprises the steps of: a. configuring the novel adaptive testing and question generation system for automatically generating question-answer pairs with varying phrases, difficulty levels and variables, and intelligently showing questions in each test as per the learner's ability and characteristics; b. operating the novel adaptive testing and question generation system for: c. selecting a concept (102) for generation of questions based on a difficulty level; d. displaying a plurality of questions as per the learning ability of a learner based on a previous year's result (104); e. generating questions by changing the question structure, changing variables, shuffling answer choices and adjusting the real-time difficulty level (118); f. ranging the difficulty from easy to hard (106) by evaluating the struggle of the learner; g. evaluating the questions generated by the question generation module for fluency and novelty (112); h. evaluating answers considering factors such as partial understanding, slips and guessed answers (114); i. concluding the results on performance as poor subject knowledge/good subject knowledge; j. displaying the analytics about performance (108) in terms of guessed answers (120), wrong answers, time of response to each question, and subject knowledge; k. recommending the study material to revise the concepts for improving knowledge (116); l. declaring the position of the learner compared with the regional/national level (110); and m. guiding teachers and parents on the learner's strengths and areas of improvement.
    9. The method as claimed in claim 8, wherein the said adaptive testing module enables the user to select a concept or subject (102) to test knowledge and performance.
    10. The method as claimed in claim 8, wherein the said output device is used to display the plurality of questions generated by the question generation module.
    11. The method as claimed in claim 8, wherein the said question generation module generates questions by changing the question structure, changing variables, shuffling answer choices and adjusting the real-time difficulty level based on the input difficulty level (118).
    12. The method as claimed in claim 8, wherein the said evaluation module sets the level of difficulty, ranging from hard to easy (106), based on the struggle of the learner to reach the benchmark and to assess and conclude the level of subject knowledge.
    13. The method as claimed in claim 8, wherein the moderator evaluates the novelty and fluency (112) in questions generated by the question generation model.
    14. The method as claimed in claim 8, wherein the adaptive testing module determines guesswork (114) based on the time of response and the target level reached by the learner.
    15. The method as claimed in claim 8, wherein the adaptive testing module evaluates guesswork of the learner by analyzing the min/max response time; if the response time is less than the average minimum response time, it concludes that it is guesswork (see the sketch after the claims).
    16. The method as claimed in claim 8, wherein the evaluation module evaluates answers considering factors such as partial understanding, slips and guessed answers (114).
    17. The method as claimed in claim 8, wherein the adaptive testing module provides analytics of the learner's performance in all subjects, analyzing the areas of improvement, assessing the response time (guesswork or not), ranking the resources based on views and likes, and evaluating the regional and national position of the learner (110).
    18. The method as claimed in claim 8, wherein the adaptive testing module utilizes resources such as videos, time-based tests, quizzes, exam questions, flashcards, randomly generated questions and teacher-assigned tests to promote active recall and spaced repetition.
    19. The method as claimed in claim 8, wherein the adaptive testing module diagnoses the potential areas of concern for the learner and enables the learner to get a picture of their learning journey to change and/or create brand new schemes of learning.
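The numeric-variation procedure recited in claims 2 and 3 can be sketched as follows. The function names, the factor table and the number-matching regular expression are illustrative assumptions, not the claimed implementation.

```python
# Illustrative sketch of the multiplication-factor rules of claims 2 and 3:
# every number found in a math question is scaled by a factor drawn from a
# difficulty-dependent range. Names and the regex are assumptions.
import random
import re

# Factor ranges per difficulty level (claim 2, steps f-h; claim 3, steps a-d).
FACTOR_RANGES = {
    "very_easy": (2, 2),        # fixed factor of 2
    "easy": (3, 5),
    "medium": (6, 9),
    "hard": (10.51, 14.49),     # fractional factors yield non-round numbers
}


def pick_factor(difficulty: str) -> float:
    low, high = FACTOR_RANGES[difficulty]
    if difficulty == "hard":
        return random.uniform(low, high)           # any real value in range
    return random.randint(int(low), int(high))     # whole-number factor


def generate_variant(question: str, difficulty: str) -> str:
    """Return a new variant of a math question by multiplying every number
    in it by a factor sampled for the given difficulty level."""
    factor = pick_factor(difficulty)

    def scale(match: re.Match) -> str:
        return f"{float(match.group()) * factor:g}"

    return re.sub(r"\d+(?:\.\d+)?", scale, question)


# Example: generate_variant("Add 4 and 12.", "medium") with factor 7
# yields "Add 28 and 84.", a new medium-level variety of the question.
```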
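Similarly, the guess-detection rule of claim 15 could be sketched as below. How the "average minimum response time" is accumulated (here, the mean of the learner's per-test minimums) is an assumption for illustration.

```python
# Sketch of the claim-15 heuristic: an answer given faster than the
# learner's average minimum response time is flagged as guesswork.
from statistics import mean


def is_guesswork(response_time_s: float, per_test_min_times_s: list[float]) -> bool:
    """Flag a response as guesswork when it is faster than the average of
    the learner's historical per-test minimum response times."""
    if not per_test_min_times_s:
        return False    # no history yet; cannot judge
    return response_time_s < mean(per_test_min_times_s)
```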
GB2302270.0A 2023-02-17 2023-02-17 A computer implemented adaptive testing and question generation system and method Pending GB2627259A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2302270.0A GB2627259A (en) 2023-02-17 2023-02-17 A computer implemented adaptive testing and question generation system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2302270.0A GB2627259A (en) 2023-02-17 2023-02-17 A computer implemented adaptive testing and question generation system and method

Publications (2)

Publication Number Publication Date
GB202302270D0 GB202302270D0 (en) 2023-04-05
GB2627259A true GB2627259A (en) 2024-08-21

Family

ID=85772320

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2302270.0A Pending GB2627259A (en) 2023-02-17 2023-02-17 A computer implemented adaptive testing and question generation system and method

Country Status (1)

Country Link
GB (1) GB2627259A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN119600857B (en) * 2024-11-26 2025-11-07 深圳中远海运数字科技有限公司 Video training method and system in offline environment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
GB202302270D0 (en) 2023-04-05
