student test performance
Recently Published Documents

TOTAL DOCUMENTS: 14 (five years: 0)
H-INDEX: 4 (five years: 0)
2020, Vol 34 (01), pp. 654-661
Author(s): Michael Geden, Andrew Emerson, Jonathan Rowe, Roger Azevedo, James Lester

Modeling student knowledge is critical in adaptive learning environments. Predictive student modeling enables formative assessment of student knowledge and skills, and it drives personalized support to create learning experiences that are both effective and engaging. Traditional approaches to predictive student modeling utilize features extracted from students’ interaction trace data to predict student test performance, aggregating student test performance as a single output label. We reformulate predictive student modeling as a multi-task learning problem, modeling questions from student test data as distinct “tasks.” We demonstrate the effectiveness of this approach by utilizing student data from a series of laboratory-based and classroom-based studies conducted with a game-based learning environment for microbiology education, Crystal Island. Using sequential representations of student gameplay, results show that multi-task stacked LSTMs with residual connections significantly outperform baseline models that do not use the multi-task formulation. Additionally, the accuracy of predictive student models is improved as the number of tasks increases. These findings have significant implications for the design and development of predictive student models in adaptive learning environments.
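A minimal sketch of the architecture the abstract describes, assuming a PyTorch implementation: a stacked LSTM over sequential gameplay features, a residual connection around each LSTM layer, and one binary output head per test question (one "task"). All module names, shapes, and hyperparameters here are illustrative assumptions, not the authors' released code.

```python
# Hypothetical multi-task stacked LSTM with residual connections.
# Each test question is a separate binary prediction task.
import torch
import torch.nn as nn

class MultiTaskStackedLSTM(nn.Module):
    def __init__(self, input_dim: int, hidden_dim: int,
                 num_layers: int, num_tasks: int):
        super().__init__()
        # Project inputs to the hidden size so residual sums line up.
        self.input_proj = nn.Linear(input_dim, hidden_dim)
        self.layers = nn.ModuleList(
            [nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
             for _ in range(num_layers)]
        )
        # One head per question: predicts correct/incorrect for that task.
        self.heads = nn.ModuleList(
            [nn.Linear(hidden_dim, 1) for _ in range(num_tasks)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, input_dim) sequence of gameplay features.
        h = self.input_proj(x)
        for lstm in self.layers:
            out, _ = lstm(h)
            h = h + out  # residual connection around each LSTM layer
        final = h[:, -1, :]  # last-timestep summary of the session
        # Per-task logits stacked into (batch, num_tasks).
        return torch.cat([head(final) for head in self.heads], dim=1)

# Training would apply a per-task binary cross-entropy loss, e.g.:
# loss = nn.BCEWithLogitsLoss()(model(batch_x), batch_y)  # batch_y: 0/1 per question
```

Sharing the LSTM trunk across all question heads is what makes this multi-task: the sequence representation is trained by every task's loss at once, which is consistent with the abstract's finding that accuracy improves as the number of tasks grows.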


2016, Vol 8, pp. 2
Author(s): Stephen Lippi

The testing effect is a phenomenon that predicts increased retention of material when individuals are tested on soon-to-be-recalled information (McDaniel, Anderson, Derbish, & Morrisette, 2007). Although this effect is well documented in numerous studies, no study has examined the impact that computer-based quizzes or online companion tools in a course can have on test performance. Beyond the use of online programs, it is also important to understand whether the presentation of different question types increases or decreases student test performance. Although other pedagogical studies have examined the effect of question order on student performance (Norman, 1954; Balch, 1989), none has tested whether exposing students to short-answer questions (testing free recall) before a multiple-choice test (testing recognition memory) leads to higher exam scores. The present study sought to understand how use of an online learning system (MindTap, Cengage) and test format order affect final test scores. Five exams, each worth 150 points and consisting of separate short-answer and multiple-choice sections, were given to each set of Physiological Psychology students at George Mason University. Results indicate that testing order (whether the short-answer or multiple-choice section was taken first) impacts student test performance, and this effect may be mediated by whether an online computer program is required. This research has implications for course organization and selection of test format, which may improve student performance.
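To make the design concrete, the sketch below shows one conventional way to test the reported effects: a two-way ANOVA on exam scores with section order, online-program requirement, and their interaction. The abstract does not specify the authors' statistical procedure, so the column names, file name, and choice of statsmodels are assumptions for illustration only.

```python
# Illustrative analysis sketch for the design described above (not the
# study's actual code or data). Assumed columns: score (0-150),
# order ("SA_first"/"MC_first"), online ("required"/"not_required").
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.read_csv("exam_scores.csv")  # hypothetical per-exam score records

# Two-way ANOVA: main effects of section order and online requirement,
# plus their interaction (the abstract's "mediated by" pattern would show
# up here as a significant order x online interaction).
model = ols("score ~ C(order) * C(online)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```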

