A new course evaluation process

2000 · Vol. 43 (2) · pp. 125-131 · Author(s): K. Scoles, N. Bilgutay, J. Good

2015 · Vol. 27 (5) · pp. 3-13 · Author(s): Jacqueline E. McLaughlin, Amy Sloane, Elizabeth Billings, Mary T. Roth

1993 · Vol. 76 (3) · pp. 995-1000 · Author(s): Patricia L. Dwinell, Jeanne L. Higbee

This research examined how 187 students assessed a course evaluation form, the anonymity of the evaluation process, the fairness and accuracy students attribute to completing evaluations of instruction, and students' perceptions of the extent to which teachers and administrators use the information evaluations provide. Of the participants, 92% believed that the rating forms provided an effective means of evaluating instruction, and the majority thought instructors pay attention to evaluation results and change their behavior accordingly. Only 2% believed that their anonymity was not protected. Students appeared to have more faith in their own evaluations than in those of other students, and they lacked confidence in the use of evaluations for determining salary increases, tenure, or promotion.


1973 · Vol. 10 (2) · pp. 115-124 · Author(s): Kent L. Granzin, John J. Painter

This study of the student course evaluation process found significant correlations between course ratings and variables representing commitment and course-end attitudes toward the course. It found weaker relationships for attitude-change measures, while demographic variables yielded generally nonsignificant correlations. Stepwise regression equations developed to predict course ratings relied most heavily on course-end attitude variables. Factor analysis of the variable set revealed six factors underlying the course evaluation structure studied, and this analysis guided the formulation of new regression equations with reduced predictive power but greater independence among the included predictor variables. The conclusions focused on the study's contributions to understanding the course evaluation process and suggested steps an instructor might take to improve their ratings.
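For readers who want to see the two techniques this abstract names in action, here is a minimal sketch of forward stepwise regression and factor analysis on synthetic data. The variable names (commitment, attitude_end, and so on), the p-value entry rule, and the two-factor extraction are illustrative assumptions, not the study's actual 1973 variable set or procedure.

```python
# Sketch of the two techniques the abstract names: forward stepwise
# regression to predict an overall course rating, then factor analysis
# of the predictor set. Data and variable names are hypothetical.
import numpy as np
import statsmodels.api as sm
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 200
X = {
    "commitment":   rng.normal(size=n),
    "attitude_end": rng.normal(size=n),
    "attitude_chg": rng.normal(size=n),
    "age":          rng.normal(size=n),
}
# Make the rating load mostly on course-end attitude, as the study reports.
y = 0.8 * X["attitude_end"] + 0.4 * X["commitment"] + rng.normal(scale=0.5, size=n)

def forward_stepwise(X, y, alpha=0.05):
    """Repeatedly add the candidate predictor with the smallest p-value below alpha."""
    selected, remaining = [], list(X)
    while remaining:
        pvals = {}
        for name in remaining:
            cols = np.column_stack([X[v] for v in selected + [name]])
            fit = sm.OLS(y, sm.add_constant(cols)).fit()
            pvals[name] = fit.pvalues[-1]   # p-value of the newly added candidate
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha:
            break
        selected.append(best)
        remaining.remove(best)
    return selected

print("selected predictors:", forward_stepwise(X, y))

# Factor analysis of the full variable set. The study extracted six
# factors from a larger set; two components is just for this toy example.
fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(np.column_stack(list(X.values())))
print("loadings:\n", fa.components_.T.round(2))
```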


2014 · Vol. 123 · pp. 380-388 · Author(s): Nur Fadhlina Zainal Abedin, Jamaliah Mohd Taib, Hajah Makiah Tussaripah Jamil

2017 · Vol. 7 (1) · p. 78 · Author(s): Muhammet Damar, Aysun Kapucugil Ikiz, Guzin Ozdagoglu, Cenk Ozler, Yasemin Arbak, …

2013 · Vol. 6 (3) · pp. 333-338 · Author(s): Faruk Guder, Mary Malliaris

This paper studies the reasons for low response rates in online evaluations. Survey data were collected from students to understand factors that might affect student participation in the course evaluation process. When course evaluations were opened to the student body, an email announcement was sent to all students, and a reminder email was sent a week later. The study showed that participation rates increased not only when emails were sent, but also when faculty used in-class time to emphasize the importance of completing the evaluations.
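A difference in participation rates between reminder conditions like those described here is typically checked with a two-proportion test. A minimal sketch with made-up counts; the paper reports survey findings, not these numbers or this test:

```python
# Illustrative only: testing whether an in-class appeal on top of email
# reminders raised the response rate. The counts below are hypothetical.
from statsmodels.stats.proportion import proportions_ztest

responded = [310, 415]    # completions: email-only vs. email + in-class appeal
enrolled  = [1000, 1000]  # students invited in each condition
stat, pvalue = proportions_ztest(responded, enrolled)
print(f"z = {stat:.2f}, p = {pvalue:.4f}")
```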


2021 · Vol. 7 (1) · Author(s): Melissa Courvoisier, Richelle Baddeliyanage, Linda Wilhelm, Lorraine Bayliss, Sharon E. Straus, …

Background: In the past decade, patient-oriented research (POR) has been at the forefront of healthcare research in Canada because it has the potential to make research more meaningful and relevant to patient needs. Despite this growing emphasis on and expectation to conduct POR, there is limited guidance about how to apply POR in practice. To address this capacity-building need, the Knowledge Translation (KT) Program and patient partners co-designed, delivered, and evaluated Partners in Research (PiR), a 2-month online course for patients and researchers to collectively learn how to conduct and engage in POR.

Methods: PiR was delivered to 4 cohorts of patients and researchers between 2017 and 2018. For each cohort, we evaluated the impact of the course on participants' knowledge, self-efficacy, intentions, and use of POR using surveys at 3 time points: baseline, post-course, and 6 months post-course. We also monitored the process of course design and delivery by assessing the implementation quality of the PiR course. Participants were asked to rate their satisfaction with course format, course materials, quality of delivery, and their level of engagement via a 7-point Likert scale in the post-course survey.

Results: A total of 151 participants enrolled in the PiR course across the 4 cohorts. Of these, 49 patients and 33 researchers (n = 82 participants) consented to participate in the course evaluation. Process and outcome evaluations collected over a 9-month period indicated that participation in the PiR course increased knowledge of POR concepts for patients (p < .001) and for researchers (p < .001) from pre-course to post-course timepoints. Likewise, self-efficacy to engage in POR increased from baseline to post-course for both patients (p < .001) and researchers (p < .001). Moreover, participants reported high levels of satisfaction with the content, delivery, and interactive components of the course.

Conclusions: The PiR course increased capacity in POR for both researchers and patients. This work enhances our understanding of how to design useful and engaging education opportunities to increase patient and researcher capacity in POR.
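The pre/post comparisons reported above (baseline vs. post-course, p < .001) are the kind of analysis a paired test handles. A minimal sketch on synthetic matched scores; the abstract does not say which specific test the authors used, so this is illustrative only:

```python
# A paired pre/post comparison like the ones the abstract reports
# (baseline vs. post-course knowledge). Scores are synthetic, and the
# choice of test is an assumption, not the paper's stated method.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pre  = rng.normal(loc=60, scale=10, size=82)      # baseline knowledge scores
post = pre + rng.normal(loc=8, scale=5, size=82)  # same people, higher post-course mean

t, p = stats.ttest_rel(post, pre)                 # paired t-test
w, p_w = stats.wilcoxon(post, pre)                # nonparametric alternative
print(f"paired t: t = {t:.2f}, p = {p:.3g}")
print(f"Wilcoxon: W = {w:.1f}, p = {p_w:.3g}")
```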


2021 · Vol. 6 (2) · pp. 95-111 · Author(s): Gülnihal Gül

This study aimed to determine the views of secondary-school music teachers on music lessons conducted via distance education during the COVID-19 pandemic. The study group consisted of 11 music teachers selected by convenience sampling. Data were collected using a six-question semi-structured interview and a demographic information form. The findings showed that the participating music teachers had difficulties with classroom management, parents' approaches, technological equipment, internet problems, and student motivation. The teachers also reported that they used a variety of technological teaching materials in their lessons, that the objectives in the curriculum were only partially achieved, and that the course evaluation process could not be carried out effectively enough. Keywords: COVID-19, pandemic, distance education, secondary education, music education


2006 · Vol. 3 (8) · Author(s): Kelly D. Bradley, James W. Bradley

In higher education, course evaluations receive considerable attention, with results directly affecting decisions such as merit review and tenure/promotion, so the accurate presentation and proper use of evaluation results are critical. The typical course evaluation process involves distributing a Likert-type survey to a class, compiling the data, and reporting means and standard deviations (the classical test theory, CTT, approach). One alternative analytical technique is the Rasch model. A theoretical review of each model and an empirical example, using end-of-semester course evaluations from an introductory statistics course taught at a Midwest community college, demonstrate the step-by-step process of feedback under each model. The paper contends that the CTT summary does not produce a valid picture of the evaluation data. The survey research community and institutions analyzing similar rating-scale data will benefit from the results of this study, as it provides a sound methodology for analyzing such data; the education community will also benefit by receiving better-informed results.
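To make the CTT-versus-Rasch contrast concrete, the sketch below computes the CTT summary (item means and standard deviations) and then fits a Rasch model to the same simulated responses. Note the assumptions: responses are dichotomous and the estimator is plain joint maximum likelihood, whereas the paper analyzes the full rating scale, so this illustrates the modeling contrast rather than the authors' procedure.

```python
# CTT vs. Rasch on the same response matrix, as a toy illustration.
# Dichotomous responses and a basic joint-maximum-likelihood fit are
# simplifying assumptions; the paper works with full Likert data.
import numpy as np

rng = np.random.default_rng(2)
n_students, n_items = 120, 8

# Simulate Rasch-conforming data: P(correct) = sigmoid(ability - difficulty).
theta_true = rng.normal(size=n_students)
b_true = np.linspace(-1.5, 1.5, n_items)
prob = 1 / (1 + np.exp(-(theta_true[:, None] - b_true[None, :])))
X = (rng.random((n_students, n_items)) < prob).astype(float)

# Drop persons with all-0 or all-1 scores (their estimates are infinite).
keep = (X.sum(axis=1) > 0) & (X.sum(axis=1) < n_items)
X = X[keep]

# --- CTT summary: item means and standard deviations ---
print("CTT item means:", X.mean(axis=0).round(2))
print("CTT item SDs:  ", X.std(axis=0).round(2))

# --- Rasch: joint maximum likelihood via alternating Newton updates ---
theta = np.zeros(X.shape[0])
b = np.zeros(n_items)
for _ in range(50):
    P = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
    info = P * (1 - P)
    theta += (X - P).sum(axis=1) / info.sum(axis=1)   # person ability update
    b += (P - X).sum(axis=0) / info.sum(axis=0)       # item difficulty update
    b -= b.mean()                                     # anchor the logit scale

print("Rasch item difficulties (logits):", b.round(2))
```

Unlike the CTT means, the Rasch difficulties sit on an interval (logit) scale and separate item properties from the particular sample of respondents, which is the core of the paper's argument.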

