Grade Configurations and Student Test Performance: Evidence from Recent National Data

2018 ◽  
Author(s):  
Richard DiSalvo
1989 ◽  
Vol 16 (2) ◽  
pp. 77-78 ◽  
Author(s):  
Paul W. Foos

Effects of student-written test questions on student test performance were examined in an Introductory Psychology class. Before each of three tests, randomly assigned students wrote essay questions, multiple-choice questions, or no questions. All tests contained essay and multiple-choice items but no questions written by students. Question writers performed significantly better than nonwriters on the first two tests; the difference on the third test was marginally significant. No differences were found between students who wrote essay and those who wrote multiple-choice questions. Question writing appears to be an effective study technique.


2002 ◽  
Vol 29 (1) ◽  
pp. 10-15 ◽  
Author(s):  
Helen C. Harton ◽  
Deborah S. Richardson ◽  
Ricardo E. Barreras ◽  
Matthew J. Rockloff ◽  
Bibb Latané

Focused Interactive Learning (FIL) is a tool for teaching psychological concepts through student participation in a focused discussion with other class members. Students from 5 upper and lower level psychology courses participated in FIL exercises in which they answered several multiple-choice or opinion questions on their own and then systematically discussed each item for about 2 min with other students before giving a final answer. FIL increased student test performance, helped students get to know other members of the class, and had a small effect on students' self-reported participation and interest in psychology.


2020 ◽  
Vol 34 (01) ◽  
pp. 654-661 ◽  
Author(s):  
Michael Geden ◽  
Andrew Emerson ◽  
Jonathan Rowe ◽  
Roger Azevedo ◽  
James Lester

Modeling student knowledge is critical in adaptive learning environments. Predictive student modeling enables formative assessment of student knowledge and skills, and it drives personalized support to create learning experiences that are both effective and engaging. Traditional approaches to predictive student modeling utilize features extracted from students’ interaction trace data to predict student test performance, aggregating student test performance as a single output label. We reformulate predictive student modeling as a multi-task learning problem, modeling questions from student test data as distinct “tasks.” We demonstrate the effectiveness of this approach by utilizing student data from a series of laboratory-based and classroom-based studies conducted with a game-based learning environment for microbiology education, Crystal Island. Using sequential representations of student gameplay, results show that multi-task stacked LSTMs with residual connections significantly outperform baseline models that do not use the multi-task formulation. Additionally, the accuracy of predictive student models is improved as the number of tasks increases. These findings have significant implications for the design and development of predictive student models in adaptive learning environments.
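The core reformulation in this abstract — treating each test question as its own prediction "task" rather than aggregating performance into a single label — can be illustrated with a minimal Python sketch. The field names and data shapes here are hypothetical; the paper's actual feature pipeline and LSTM architecture are not reproduced.

```python
# Sketch: reformulating aggregate test scores as multi-task labels.
# The traditional approach collapses a student's per-question results
# into one output label; the multi-task approach keeps one binary
# target per question, so a model can learn each question separately.

def aggregate_label(question_scores):
    """Traditional single-output label: overall proportion correct."""
    return sum(question_scores) / len(question_scores)

def multitask_labels(question_scores):
    """Multi-task labels: one binary target ("task") per test question."""
    return {f"q{i + 1}": int(score) for i, score in enumerate(question_scores)}

# Hypothetical post-test responses for one student (1 = correct, 0 = incorrect).
student = [1, 0, 1, 1, 0]

print(aggregate_label(student))   # one scalar label for the whole test
print(multitask_labels(student))  # one label per question
```

Under this framing, a shared sequence encoder (such as the stacked LSTM the abstract describes) would feed one output head per question, which is what allows accuracy to improve as the number of tasks grows.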

