Open-source, Online Homework for Statics and Mechanics of Materials Using WeBWorK: Assessing Effects on Student Learning

2016 ◽  
Author(s):  
Michael Swanbom ◽  
Daniel Moller ◽  
Katie Evans ◽  
Timothy Reeves

Author(s):  
Agnes G. D’Entremont ◽  
Negar M. Harandi ◽  
Jonathan Verrett

Online homework systems provide immediate feedback to students, enhancing student learning. However, paid online homework systems from textbook publishers or other sources can be costly and also raise concerns about student data privacy. WeBWorK is an open-source online homework system that can be set up on local servers, is free for students, and has been in use since its development in the mid-1990s. Prior to this work, only around 200 engineering problems were openly shared on the WeBWorK platform, limiting opportunities for adoption. To address this, we have developed, deployed, and evaluated nearly 1000 new engineering problems across a wide range of second-year engineering topics. Student perceptions of WeBWorK were evaluated using surveys at the start and end of the courses in which it was deployed. These surveys indicate that students generally prefer WeBWorK to other online homework systems they have used. They also indicate that students were generally motivated to both attempt and complete all assigned problems that contributed to their grade, and that students believed WeBWorK enhanced their learning. Creating error-free WeBWorK questions was difficult; however, the hope is that the ability to reuse and share these questions gives them higher long-term value than paper-based homework problems.


2018 ◽  
Vol 17 (1) ◽  
pp. 32
Author(s):  
Agnieszka Jach

Preparing Moodle quizzes that are data-based and up to date tends to be tedious and time-consuming. By using innovative tools, this process can be simplified and automated, providing a substantial benefit to teachers wishing to employ such quizzes and ultimately improving the student learning experience. The purpose of this article is to show how to create data-driven, up-to-date quizzes for Moodle in an easy fashion. The methodology is based on several popular, free, open-source tools, and its implementation details are demonstrated with an example. This makes the methodology readily available to practitioners.
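The abstract does not detail the specific toolchain here, but as a hedged illustration of the general idea, a data-driven quiz can be generated programmatically in Moodle's GIFT import format. The question names, stems, and random-data scheme below are hypothetical examples, not the article's actual methodology:

```python
# Sketch: generate a data-driven Moodle quiz in the GIFT import format.
# GIFT is a real Moodle text format; {#answer:tolerance} marks a
# numerical question. Everything else here (names, stems, data) is
# an illustrative assumption.

import random


def make_gift_question(name, stem, answer, tolerance=0.01):
    """Format one numerical question in Moodle's GIFT import syntax."""
    return f"::{name}:: {stem} {{#{answer}:{tolerance}}}"


def build_quiz(n_questions=3, seed=None):
    """Draw fresh random data and emit one GIFT question per draw."""
    rng = random.Random(seed)
    questions = []
    for i in range(1, n_questions + 1):
        a, b = rng.randint(2, 9), rng.randint(2, 9)
        questions.append(make_gift_question(
            name=f"auto-q{i}",
            stem=f"Compute {a} * {b}.",
            answer=a * b,
        ))
    return "\n\n".join(questions)


if __name__ == "__main__":
    # The printed text can be saved to a .txt file and imported into
    # Moodle via its GIFT question-import option.
    print(build_quiz(seed=42))
```

Regenerating the quiz with a new seed or a fresh data source yields an up-to-date question set without manual editing, which is the kind of automation the article advocates.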


Author(s):  
Sally Jordan

A linguistically based authoring tool has been used to write e-assessment questions requiring short free-text answers of up to about 20 words in length (typically a single sentence). The answer matching is sophisticated, and students are provided with instantaneous, targeted feedback on incorrect and incomplete responses. They are able to use this feedback in reattempting the question. Seventy-five questions of this type have been offered to students on an entry-level interdisciplinary science module, and they have been well received. Students have been observed attempting the questions and have been seen to respond in differing ways to both the questions themselves and the feedback provided. The answer matching has been demonstrated to be of similar or greater accuracy than that of specialist human markers. The software described is all either open source or commercially available, but the purpose of this paper is not to advertise these products but rather to encourage reflection on e-assessment's potential to support student learning.
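The paper's tool performs linguistically based matching, which is far more sophisticated than the deliberately simple stand-in below. This sketch only illustrates the underlying idea of matching a free-text response against required concepts and returning targeted feedback on what is missing; the marking scheme, stems, and hints are invented for illustration:

```python
# Toy short-answer matcher with targeted feedback. This is NOT the
# paper's linguistically based approach, only a keyword/stem sketch
# of the feedback loop it describes.

import re

# Hypothetical marking scheme: each required concept maps to acceptable
# word stems and to a hint shown when the concept is absent.
SCHEME = {
    "evaporation": (["evaporat", "vaporis", "vaporiz"],
                    "Think about what happens to the liquid water."),
    "condensation": (["condens"],
                     "What happens when the vapour cools?"),
}


def mark_response(response):
    """Return (is_complete, feedback_hints) for a free-text answer."""
    words = re.findall(r"[a-z]+", response.lower())
    feedback = []
    for concept, (stems, hint) in SCHEME.items():
        # Concept is credited if any word in the answer starts with
        # one of its accepted stems.
        if not any(w.startswith(s) for s in stems for w in words):
            feedback.append(hint)
    return (len(feedback) == 0, feedback)
```

An incomplete answer such as "Water evaporates" would be marked incomplete and returned with only the condensation hint, mirroring the paper's pattern of feedback targeted at what is missing so the student can reattempt the question.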

