Author(s):
Anthony Poon
Sarah Giroux
Parfait Eloundou-Enyegue
François Guimbretière
Nicola Dell

2020
Vol 28 (2)
Author(s):
Vanya Ivanova
Gergana Petkova

Idioms are metaphorical expressions that cannot be translated literally. They are widely used in English because they make everyday speech more interesting and entertaining for native speakers. It is estimated that there are about twenty-five thousand idiomatic expressions in English, and one of the most common thematic areas for idioms is colour. Idiomatic expressions are a fun way to enhance the vocabulary and cultural knowledge of learners of English. However, mastering these expressions causes difficulties for students, not only because their meaning cannot be deduced from the meanings of the words that comprise them, but also because of the different meanings of colours in cultures around the world. For instance, purple is usually connected with aristocracy, affluence, and piousness across the globe, but in Thailand and Brazil it is the colour of bereavement. In this article we describe an approach to checking the acquisition of idiomatic expressions and facilitating their long-term retention by using online practice tests. These tests are designed by the teacher and taken by students on their personal computers or mobile phones at their own convenience. Furthermore, specifically developed criteria for test construction are listed together with typical test questions based on them. Examples of test items are presented to illustrate the process of test creation. Finally, an appendix with a selection of the most widely used idiomatic expressions with colours is compiled.


2000
Vol 86 (1)
pp. 127-128
Author(s):  
David E. Clarke

The use of a computerized, multiple-choice test bank to present practice and assessment tests on a network was evaluated with 46 men and 119 women from a first-year class in psychology. A correlation of .65 (p < .001) between scores on a traditional paper-and-pencil test and scores on a computerized test provided some validity for the computerized assessment. Regression analysis showed that ability (previous academic performance) and motivation (number of practice tests taken) accounted for 73% of the explained variance in computerized test scores. Sex differences did not enter the regression equation significantly.
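The validation step described above hinges on a Pearson correlation between paired scores. The sketch below illustrates that computation on hypothetical data; the scores, variable names, and sample size are invented for illustration and are not the study's data.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two paired score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired scores for a handful of students:
# each student's paper-and-pencil score alongside their computerized score.
paper = [55, 62, 70, 48, 81, 66]
computer = [58, 60, 75, 50, 78, 70]
r = pearson_r(paper, computer)  # a positive r supports concurrent validity
```

In practice the significance of the correlation (the reported p < .001) would be checked with a t test on r given the sample size, rather than read off the coefficient alone.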


2019
Author(s):
Jeff T. Parker
Quentin Docter
2018
Vol 3 (4)
Author(s):
Amanda Burke Aaronson

Review sessions for final exams can be beneficial to student preparation. However, little research has been done on how to structure these sessions optimally. Using a common standardized nursing test as a final exam, two semesters were compared using two different review session designs. In the first semester, a general review session with student-led topics was used. In the second, a targeted review session, using practice tests to pre-assess gaps in knowledge, was used. Final exam scores were significantly higher in the second semester than in the first, suggesting that targeted review sessions might play a role in student success.


Author(s):  
David Metcalfe
Harveer Dev

This chapter presents two practice tests with a mix of question types (e.g. multiple choice or ranking), content (e.g. domains tested), and styles (e.g. patient, colleague, or personal). Each includes 30 questions and broadly reflects the type of questions likely to be asked in the SJT. To make the most of each test, you should complete it in one sitting within an hour before checking your answers. When checking your answers to ranking questions, remember that credit is still given for ‘near misses’, so there is no need to hit the ‘correct’ sequence every time. The practice test answers are not accompanied by detailed explanations. For this reason, it would be preferable to complete all the questions in Section 2 before attempting the test. To replicate the SJT as closely as possible, you should ideally complete these questions within an hour under formal examination conditions. Once you have attempted all the questions, check your answers. It is difficult to interpret your final score, as your rank will depend entirely on how well your colleagues (and every other medical student in the country) fare. If you are organized, you could arrange a study group to work through this book and/or complete the practice test. Marking your answers as a group will give some indication of your performance relative to others. It will also provide an opportunity to discuss the various options (including disagreements with our answers) and so gain a deeper understanding of the issues tested by the SJT.


Author(s):  
Avgoustos Tsinakos
Ioannis Kazanidis

<p>Student testing and knowledge assessment is a significant aspect of the learning process. In a number of cases, it is expedient not to present the exact same test to all learners all the time (Pritchett, 1999). This may be desired so that cheating in the exam is harder to carry out, or so that learners can take several practice tests on the same subject as part of the course.</p><p>This study presents an e-testing platform, namely PARES, which aims to provide assessment services to academic staff by facilitating the creation and management of question banks and powering the delivery of nondeterministically generated test suites. PARES uses a conflict detection algorithm based on the vector space model to compute the similarity between questions and to exclude questions deemed to have an unacceptably large similarity from appearing in the same test suite. The conflict detection algorithm and a statistical evaluation of its accuracy are presented. Evaluation results show that PARES succeeds in detecting conflicting questions at a rate of about 90%, and its efficiency can be further increased through continuing education and enrichment of the system’s correlation vocabulary.</p>
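The core idea behind vector-space conflict detection can be sketched briefly: represent each question as a term-frequency vector and flag pairs whose cosine similarity exceeds a threshold. The tokenizer, the 0.8 threshold, and the function names below are illustrative assumptions, not the actual PARES implementation, which also draws on a correlation vocabulary.

```python
from collections import Counter
from math import sqrt

def cosine_similarity(a: str, b: str) -> float:
    """Cosine similarity between two questions as bag-of-words term vectors."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[t] * vb[t] for t in va)
    norm = (sqrt(sum(c * c for c in va.values()))
            * sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

def conflicting(q1: str, q2: str, threshold: float = 0.8) -> bool:
    """Flag two questions as near-duplicates that should not share a test suite."""
    return cosine_similarity(q1, q2) >= threshold
```

A test generator would then sample questions from the bank and reject any candidate that conflicts with a question already placed in the suite.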


Author(s):  
Wesley J. Wilson
Ali Brian
Luke E. Kelly

Novice teachers struggle with assessing fundamental motor skills. With growing time constraints, not to mention the current COVID-19 pandemic, professional development needs to be streamlined, asynchronous, and online to meet the needs of current teachers. The purpose of this study was to test the feasibility and efficacy of the Motor Skill Assessment Program (MSAP) in increasing the assessment competency of the underhand roll among physical educators, and to examine which factors were associated with posttest assessment scores. Twenty-nine physical educators (female = 21, male = 8) completed the program. Paired-samples t tests were used to determine the efficacy of the program in improving assessment accuracy from pretest to posttest. Associations with posttest scores were examined to assess which factors predicted success within the program, addressing feasibility. Program completion resulted in significantly better posttest assessment scores among participants. Guided practice attempts and average scores on guided practice tests correlated most strongly and positively with posttest scores. The assessment training program increased the assessment competency of physical educators. Guided practice and the use of practice tests best predicted participant learning. Now that the MSAP has been shown to result in teacher learning and to be feasible, this efficacy trial should be scaled up to include a control group and more skills.
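The pretest-to-posttest comparison described above rests on the paired-samples t statistic: the mean of each participant's score difference divided by its standard error. A minimal sketch, using hypothetical scores rather than the study's data:

```python
from math import sqrt

def paired_t(pre, post):
    """Paired-samples t statistic: mean difference over its standard error."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample variance of the differences (n - 1 in the denominator).
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / sqrt(var_d / n)

# Hypothetical pretest and posttest assessment scores for five teachers.
pre = [60, 55, 70, 65, 58]
post = [72, 60, 78, 70, 66]
t = paired_t(pre, post)  # a positive t indicates posttest gains
```

The resulting t would be compared against a t distribution with n − 1 degrees of freedom to judge significance, as in the study's pretest–posttest analysis.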

