Development and Preliminary Assessment of an Open-source, Online Homework Suite for Advanced Mechanics of Materials using WeBWorK

2018 ◽  
Author(s):  
Michael Swanbom ◽  
Madeline Collins ◽  
Katie Evans
Author(s):  
Agnes G. D’Entremont ◽  
Negar M. Harandi ◽  
Jonathan Verrett

Online homework systems provide students with immediate feedback, enhancing learning. However, paid online homework systems from textbook publishers or other sources can be costly and raise concerns about student data privacy. WeBWorK is an open-source online homework system that can be set up on local servers, is free to students, and has been in use since its development in the mid-1990s. Prior to this work, only around 200 engineering problems were openly shared on the WeBWorK platform, limiting opportunities for adoption. To address this, we have developed, deployed, and evaluated nearly 1000 new engineering problems across a wide range of second-year engineering topics. Student perceptions of WeBWorK were evaluated using surveys at the start and end of courses where it was deployed. These surveys indicate that students generally prefer WeBWorK to other online homework systems they have used. Surveys also indicate that students were generally motivated to both attempt and complete all assigned problems that contributed to their grade, and believed WeBWorK enhanced their learning. Creating error-free WeBWorK questions was difficult; however, the hope is that the ability to re-use and share these questions ensures they provide higher value over the long term than paper-based homework problems.


Author(s):  
Agnes D'Entremont

WeBWorK is an open-source online homework platform used in mathematics as well as engineering, where students can be assigned calculated-answer engineering science problems. Problems with staged answers ("multi-box" problems) are possible on this system and can offer feedback on the answer at each intermediate step of the solution. This allows students to determine the step where they made an error (or have a deficit in understanding), similar to the hints on specific errors provided by adaptive feedback systems.

Second-year students in a mechanical engineering program were exposed to both single- and multi-box questions in WeBWorK and were asked to give feedback about their preferences. The vast majority of students reported that the multi-box questions gave them good feedback on which step or calculation contained error(s). They also noted that the multi-box problems sped up finding errors in their solutions. However, a large minority indicated concern that multi-box problems constrained the solution to a particular path.

Based on these results, providing some multi-box problems may help students find their errors through more detailed feedback on their solutions. This may be most effective early in a topic, or in the first problems at any given level of complexity.
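The per-step feedback that multi-box problems provide can be illustrated with a short sketch. WeBWorK problems are actually authored in its own PG (Perl-based) language; the Python below is a hypothetical illustration only, assuming each intermediate result is graded against a reference value with a relative tolerance, and that the first incorrect box localizes the student's error.

```python
# Hypothetical sketch (not WeBWorK's actual PG code): per-step grading for a
# staged ("multi-box") problem. Each submitted intermediate value is compared
# to the reference solution within a relative tolerance.

def check_staged_answers(submitted, reference, rel_tol=0.01):
    """Return (feedback, first_error): a list of (step, correct?) pairs and
    the 1-based index of the first incorrect step (None if all correct)."""
    feedback = []
    first_error = None
    for i, (got, want) in enumerate(zip(submitted, reference), start=1):
        ok = abs(got - want) <= rel_tol * abs(want)
        feedback.append((i, ok))
        if not ok and first_error is None:
            first_error = i
    return feedback, first_error

# Example: an axial-stress problem staged as force -> area -> stress.
reference = [5000.0, 2.5e-4, 2.0e7]   # N, m^2, Pa (reference solution)
submitted = [5000.0, 2.5e-4, 2.0e6]   # student slipped in the final step
feedback, first_error = check_staged_answers(submitted, reference)
print(first_error)  # -> 3: the error is localized to the stress calculation
```

The design point is the one students reported valuing: instead of a single right/wrong verdict on the final answer, the grader pinpoints the first step where the work diverges, at the cost of constraining the solution to the staged path.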


1984 ◽  
Vol 15 (4) ◽  
pp. 267-274 ◽  
Author(s):  
Harriet B. Klein

Formal articulation test responses are often used by the busy clinician as a basis for planning intervention goals. This article describes a 6-step procedure for efficiently using the single-word responses elicited with an articulation test. The procedure involves assessing all consonants within a word rather than only the test-target consonants. Responses are organized within a Model and Replica chart to yield information about an individual's (a) articulation ability, (b) frequency of target attainment, substitutions, and deletions, (c) variability in production, and (d) phonological processes. This procedure is recommended as a preliminary assessment measure. It is advised that more detailed analysis of continuous speech be undertaken in conjunction with early treatment sessions.


Author(s):  
Fadi P. Deek ◽  
James A. M. McHugh