Using Web-Based Video as an Assessment Tool for Student Performance in Organic Chemistry

2014 ◽  
Vol 91 (7) ◽  
pp. 982-986 ◽  
Author(s):  
John Tierney ◽  
Matthew Bodek ◽  
Susan Fredricks ◽  
Elizabeth Dudkin ◽  
Kurt Kistler

2013 ◽  
Vol 29 (1) ◽  
pp. 117-147 ◽  
Author(s):  
Cynthia J. Khanlarian ◽  
Rahul Singh

ABSTRACT Web-based homework (WBH) is an increasingly important phenomenon, yet there is little research on its character, the nature of its impact on student performance, and how that impact evolves over an academic term. The primary research questions addressed in this study are: What factors in a WBH learning environment affect students' performance? And how does the impact of those factors change over the course of an academic term? We studied more than 300 students who used WBH extensively for their coursework throughout a semester in an undergraduate class at a large public university, and we identify factors in the WBH learning environment that had a significant impact on student performance over the course of the semester. In addition to individual and technological factors, our findings demonstrate that frustration with IT use is a component of the learning environment and, as a construct, has a larger impact than usefulness on student performance at the end of a course. Our results indicate that educators may benefit from training students and engaging them in cooperative learning assignments to mitigate frustration with the software in the WBH learning environment and improve student performance.


2019 ◽  
Vol 97 (Supplement_1) ◽  
pp. 79-79
Author(s):  
Lauren R Thomas ◽  
Jeremy G Powell ◽  
Elizabeth B Kegley ◽  
Kathleen Jogan

Abstract In 2015, the University of Arkansas Department of Animal Science developed a strategy for assessing student-learning outcomes within its undergraduate teaching program. The first recognized outcome states that students will demonstrate foundational scientific knowledge in the general animal science disciplines of physiology, genetics, nutrition, muscle foods, and production animal management. Subsequently, a 58-item assessment tool was developed for direct assessment of student knowledge, focusing primarily on freshman and senior students. Over the past 3 academic years, 381 students (196 freshmen, 48 sophomores, 19 juniors, 113 seniors, 5 graduate students) were assessed, either during an introductory animal science course or by appointment with outgoing seniors majoring in animal science. Scores were categorized using demographic data collected at the beginning of the assessment tool. Comparison categories included academic class, major, and general student background (rural or urban). Data analyses were performed using the GLIMMIX procedure of SAS, with student serving as the experimental unit and significance set at P ≤ 0.05. Animal science majors performed better (P < 0.01) than students from other majors, and students with a rural background performed better (P < 0.01) than their peers with an urban background. Overall, senior assessment scores averaged 23 percentage points higher (P < 0.01) than freshman scores; the average scores for freshmen and seniors were 43% and 66%, respectively. Within the individual disciplines, seniors averaged a 24-percentage-point improvement over freshmen in every measured discipline except muscle foods, which showed only a 10-percentage-point improvement between the two classes.
While the overall improvement in scores is indicative of increased student knowledge, the department would like to see greater improvement in all discipline scores for seniors majoring in animal science.


2019 ◽  
Vol 35 (5) ◽  
pp. 723-731 ◽  
Author(s):  
Gurdeep Singh ◽  
Dharmendra Saraswat ◽  
Naresh Pai ◽  
Benjamin Hancock

Abstract. Standard practice for setting up the Soil and Water Assessment Tool (SWAT) involves the use of a single land use (LU) layer, under the assumption that LU conditions do not change over the simulation period, regardless of its length. This assumption leads to erroneous conclusions about the efficacy of management practices in watersheds where land use changes (LUCs) (e.g., agriculture to urban, forest to agriculture) occur during the simulation period. To overcome this limitation, we developed a user-friendly, web-based tool named LUU Checker that helps create a composite LU layer by integrating multiple years of LU layers available for a watershed of interest. The results show that using a composite LU layer for hydrologic response unit (HRU) delineation in the 2474-km2 L'Anguille River Watershed in Arkansas captured changed LU at the subbasin level using LU data from 1999 and 2006. The web-based tool is applicable to large watersheds and is accessible to multiple users from anywhere in the world.
Keywords: Land use, Web-based tool, SWAT, LUU Checker.


2006 ◽  
Vol 15 (3) ◽  
pp. 234-256 ◽  
Author(s):  
Mick Short

This article reports on research conducted in the Department of Linguistics and English Language at Lancaster University from 2002 to 2005 on first-year undergraduate student performance in, and reaction to, a web-based introductory course in stylistic analysis. The main focus of this report is a comparison of student responses to the varying ways in which the web-based course was used from year to year. The description of student responses is based on an analysis of end-of-course questionnaires and a comparison of exit grades. In 2002–3, students accessed the first two-thirds of the course in web-based form and the last third through more traditional teaching. In 2003–4 the entire course was accessed in web-based form, and in 2004–5 web-based course workshops were used as part of a combined package that also involved weekly lectures and seminars. Some comparison is also made with student performance in, and responses to, the traditional lecture + seminar form of the course, as typified by the 2001–2 version.


2019 ◽  
Vol 2 (1) ◽  
pp. 109-119
Author(s):  
Corinne M Gist ◽  
Natalie Andzik ◽  
Elle E Smith ◽  
Menglin Xu ◽  
Nancy A Neef

The use of competitive games to increase classroom engagement has become common practice among many teachers. However, it is unclear whether using games as an assessment tool is a viable way to increase student performance. This study examined the effects of administering quizzes through a game-based system, Kahoot!, versus privately on an electronic device. The quiz scores of 56 undergraduate students, enrolled in one of two special education courses, were evaluated. A linear regression was used to compare student scores across the two conditions, as well as performance over the course of a 15-week semester. No significant difference in quiz scores was found between the two conditions, and quiz scores in both conditions improved similarly over time. Sixty-eight percent of the students reported preferring to take quizzes privately on an electronic device rather than on Kahoot!. Limitations and recommendations for practitioners are discussed.


Author(s):  
Ademir Garcia Reberti ◽  
Nayme Hechem Monfredini ◽  
Olavo Franco Ferreira Filho ◽  
Dalton Francisco de Andrade ◽  
Carlos Eduardo Andrade Pinheiro ◽  
...  

Abstract: The Progress Test is an objective assessment, consisting of 60 to 150 multiple-choice questions, designed to assess the cognitive skills expected at the end of undergraduate medical education. The test is applied to all students on the same day, making it possible to compare results across class years and to analyze the development of knowledge throughout the course. This study aimed to carry out a systematic literature review of the Progress Test in medical schools in Brazil and around the world, examining the benefits of its implementation for the learning of students, teachers, and institutions. The review was conducted from July 2018 to April 2019 and covered articles published from January 2002 to March 2019. The keywords used were "Progress Test in Medical Schools" and "Item Response Theory in Medicine" in the PubMed, SciELO, and Lilacs databases. No language restriction was applied in article selection, although the searches were carried out in English. A total of 192,026 articles were identified, and after advanced search filters were applied, 11 articles were included in the study. The Progress Test (PTMed) has been applied in medical schools, either individually or in groups of partner schools, since the late 1990s. The test results build students' performance curves, which make it possible to identify students' weaknesses and strengths across the several areas of knowledge related to the course. The Progress Test is not only an instrument for assessing student performance; it is also a valuable tool for academic management, so it is crucial that institutions take an active role in preparing and analyzing the assessment data. Assessments designed to test clinical competence in medical students must be valid and reliable, and for an evaluative method to be valid, it must be extensively reviewed and studied, with a view to improvements and adjustments in test performance.


2011 ◽  
Vol 23 (1) ◽  
pp. 68-77 ◽  
Author(s):  
Sara Kim ◽  
Doug Brock ◽  
Carolyn D. Prouty ◽  
Peggy Soule Odegard ◽  
Sarah E. Shannon ◽  
...  
