Can online student performance be forecasted by learning analytics?

2016 ◽  
Vol 8 (1) ◽  
pp. 26 ◽  
Author(s):  
Kenneth David Strang

Author(s):  
Mohamed H Haggag ◽  
Mahmood Abdel Latif ◽  
Deena Mostafa Helal

Author(s):  
Darcio Costa Nogueira Junior ◽  
Isadora Valle Sousa ◽  
Frederico Cordeiro Martins ◽  
Marta Macedo Kerr Pinheiro

The present work addresses how the use of Learning Analytics (LA) has enabled students themselves to retrieve information about their own learning, by analyzing data availability, self-management, and student autonomy in learning processes inside and outside virtual environments. The bibliographic research was qualitative in nature and consisted of a narrative literature review anchored in the theoretical foundations of information (information retrieval and representation) and Learning Analytics. Two relevant user case studies dealing with LA were selected from the researched articles: the first analyzed the user approach in an adapted learning context with LA, whereas the second analyzed the user approach in a personalized learning context with LA. It was concluded that the student, as an information user, still has little access to an effective retrieval of what was consolidated throughout their own learning process. In addition, regarding the effectiveness of LA in the context of adapted and personalized learning, there was a perceived increase in student performance with regard to the use of activities and tasks.


2014 ◽  
Vol 1 (1) ◽  
pp. 129-139 ◽  
Author(s):  
John P Buerck

Academic analytics and learning analytics have been increasingly adopted by institutions of higher learning to improve student performance and retention. While several studies have reported the implementation details and successes of specific analytics initiatives, relatively few studies in the literature describe the possible constraints that can prevent an academic or learning analytics initiative from fully succeeding, that is, from meeting the criteria of success as defined by the stakeholders affected by such initiatives. Our aim in this article is to describe the constraints that precluded the successful completion of our analytics initiative, and how we re-envisioned our approach and scope to achieve our primary goals while operating within the constraints and tools of our academic environment.


Author(s):  
David Santandreu Calonge ◽  
Karina M. Riggs ◽  
Mariam Aman Shah ◽  
Tim A. Cavanagh

Academic research in the past decade has indicated that using data and learning analytics in curriculum design decisions can lead to improved student performance and success. As learning in many instances has evolved into the flexible online format, available anywhere at any time, learning analytics could provide impactful insights into student engagement in massive open online courses (MOOCs). These may contribute to the early identification of "at risk" participants and give MOOC facilitators, educators, and learning designers insights into how to provide effective interventions, ensuring that participants meet the course learning outcomes and encouraging retention and completion of a MOOC. This chapter uses the Essential Human Biology MOOC within the Australian AdelaideX initiative to implement learning analytics to investigate and compare participant demographics and navigation patterns, including participation and engagement, for passers and non-passers across two iterations of the MOOC: one instructor-led and one self-paced.


2016 ◽  
Vol 2 (3) ◽  
pp. 81-110 ◽  
Author(s):  
Vitomir Kovanovic ◽  
Dragan Gašević ◽  
Shane Dawson ◽  
Srećko Joksimovic ◽  
Ryan Baker

With the widespread adoption of Learning Management Systems (LMS) and other learning technologies, large amounts of data – commonly known as trace data – are being recorded and are readily accessible to educational researchers. Among their different uses, trace data have been extensively used to calculate the time that students spend on different learning activities – commonly referred to as student time-on-task. Extracted time-on-task measures are then used to build predictive models of student learning in order to understand and improve learning processes. While time-on-task measures have been extensively used in Learning Analytics research, the details of their estimation are rarely described, and the consequences that this process entails are not fully examined. This paper presents findings from two experiments that looked at different time-on-task estimation methods and how they influence the final research findings. Based on modeling different student performance measures with popular statistical methods in two datasets (one online and one blended), our findings indicate that time-on-task estimation methods play an important role in shaping the final study results. This is particularly true for online settings, where the amount of interaction with the LMS is typically higher. The primary goal of this paper is to raise awareness and initiate a debate on the important issue of time-on-task estimation within the broader learning analytics community. Finally, the paper provides an overview of commonly adopted time-on-task estimation methods in educational and related research fields.
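The estimation problem the abstract describes can be illustrated with a minimal sketch. One commonly adopted heuristic (not necessarily the method used in the paper itself) is to sum the gaps between consecutive trace-data events, capping any gap longer than a threshold to account for idle time; the function name, event data, and the 600-second cap below are all illustrative assumptions.

```python
from datetime import datetime

def estimate_time_on_task(timestamps, cap_seconds=600):
    """Estimate time-on-task (in seconds) from a chronologically
    sorted list of event timestamps.

    Gaps between consecutive events longer than cap_seconds are
    truncated to cap_seconds, a common heuristic for treating long
    pauses as off-task/idle time. The cap value is an assumption;
    different choices of cap can change downstream model results,
    which is the paper's central point.
    """
    total = 0.0
    for prev, curr in zip(timestamps, timestamps[1:]):
        gap = (curr - prev).total_seconds()
        total += min(gap, cap_seconds)
    return total

# Hypothetical trace data for one student session:
events = [
    datetime(2016, 3, 1, 10, 0, 0),
    datetime(2016, 3, 1, 10, 4, 0),   # 240 s gap -> counted fully
    datetime(2016, 3, 1, 11, 0, 0),   # 3360 s gap -> capped at 600 s
    datetime(2016, 3, 1, 11, 2, 0),   # 120 s gap -> counted fully
]
print(estimate_time_on_task(events))  # 240 + 600 + 120 = 960.0
```

Note how sensitive the estimate is to the cap: with no cap the same session would yield 3720 seconds, nearly four times the capped value, which illustrates why undocumented estimation choices can shape study findings.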

