Post-exam item analysis: implications for intervention
Abstract

Post-exam item analysis enables teachers to reduce bias in the assessment of student achievement and to improve their instruction. Difficulty indices, discrimination power and distracter efficiency are the measures most commonly investigated in item analysis. This research investigated the difficulty and discrimination indices, distracter efficiency, whole-test reliability and construct defects of a summative test for a freshman common course at Gondar CTE. In this study, 176 exam papers were analyzed in terms of difficulty index, point-biserial correlation and distracter efficiency. Internal-consistency reliability and construct defects such as meaningless stems, punctuation errors and inconsistent option formats were also examined. Results revealed that the summative test as a whole had a moderate difficulty level (0.56 ± 0.20) and good distracter efficiency (85.71% ± 29%). However, the exam was poor in terms of discrimination power (0.16 ± 0.28) and internal-consistency reliability (KR-20 = 0.58). Only one item showed good discrimination power, and one further item showed excellent discrimination. About 41.9% of the items were either too easy or too difficult. Inconsistent option formats or inappropriate options, punctuation errors and meaningless stems were also observed. Thus, future test development interventions should give due emphasis to item reliability, discrimination coefficients and item construct defects.
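The three indices reported above (difficulty, point-biserial discrimination and KR-20 reliability) can be computed directly from a 0/1 scored response matrix. The sketch below is illustrative only: the function name and sample data are not from the study, and it assumes the common conventions of a rest-of-test corrected point-biserial and a population (ddof = 0) variance in KR-20.

```python
import numpy as np

def item_analysis(responses):
    """Post-exam item analysis on a 0/1 scored response matrix.

    responses: (n_students, n_items) array of dichotomous item scores.
    Returns per-item difficulty p, per-item point-biserial r_pb
    (item vs. rest-of-test score), and whole-test KR-20 reliability.
    """
    X = np.asarray(responses, dtype=float)
    n_students, n_items = X.shape
    total = X.sum(axis=1)

    # Difficulty index: proportion of students answering the item correctly.
    p = X.mean(axis=0)

    # Corrected point-biserial: correlate each item with the total score
    # excluding that item, to avoid part-whole inflation.
    r_pb = np.empty(n_items)
    for j in range(n_items):
        rest = total - X[:, j]
        r_pb[j] = np.corrcoef(X[:, j], rest)[0, 1]

    # KR-20 internal-consistency reliability; q = 1 - p, and the total-score
    # variance is taken as a population variance (ddof = 0), a common choice.
    q = 1.0 - p
    var_total = total.var()
    kr20 = (n_items / (n_items - 1)) * (1.0 - (p * q).sum() / var_total)
    return p, r_pb, kr20
```

For example, applying the function to a small hypothetical matrix of six students by four items yields one difficulty value and one discrimination value per item, plus a single KR-20 for the test; items with p near 0 or 1 would be flagged as too difficult or too easy, matching the screening described in the abstract.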