Student Vulnerability, Agency and Learning Analytics: An Exploration

2016 ◽  
Vol 3 (1) ◽  
Author(s):  
Paul Prinsloo ◽  
Sharon Slade

The increasing potential and practice of collecting, analysing and using student data necessitate that higher education institutions (HEIs) critically examine their assumptions, paradigms and practices regarding student data. There is a real danger that some current approaches to learning analytics within higher education ignore the fiduciary duty of HEIs and the impact and scope of the asymmetrical power relationship between students and the institution. In the light of increasing concerns regarding surveillance, higher education cannot afford a simple paternalistic approach to the use of student data. Very few HEIs have regulatory frameworks in place and/or share information with students regarding the scope of data that may be collected, analysed, used and shared. It is clear from the literature that simple opting in or opting out does not sufficiently address many of the complex issues in the nexus of privacy, consent, vulnerability and agency. The notion of vulnerability (institutional and individual) offers an interesting and useful lens on the collection and use of student data. Though both institutional and individual vulnerability need to be considered, this paper focuses specifically on student vulnerability. An earlier framework developed by Prinsloo and Slade provides tentative pointers for considering a range of responses to decrease students' vulnerability, increase students' agency and move students as participants in learning analytics from quantified selves to qualified selves.

Big Data ◽  
2016 ◽  
pp. 1717-1735
Author(s):  
Paul Prinsloo ◽  
Sharon Slade

Learning analytics is an emerging but rapidly growing field seen as offering unquestionable benefit to higher education institutions and students alike. Indeed, given its huge potential to transform the student experience, it could be argued that higher education has a duty to use learning analytics. In the flurry of excitement and eagerness to develop ever slicker predictive systems, few pause to consider whether the increasing use of student data also leads to increasing concerns. This chapter argues that the issue is not whether higher education should use student data, but under which conditions, for what purpose, for whose benefit, and in ways in which students may be actively involved. The authors explore issues including the constructs of general data and student data, and the scope for student responsibility in the collection, analysis and use of their data. As an example of student engagement in practice, the chapter reviews the policy created by the Open University in 2014. The chapter concludes with an exploration of general principles for a new deal on student data in learning analytics.


Author(s):  
Neerja Singh

Learning analytics is receiving increased attention because it helps educational institutions to improve student retention, enhance student satisfaction, and ease the burden of accountability. Although these large-scale issues are worthy of attention, institutions may also be interested in how they can use learning analytics in their own courses to assist their students. In this chapter, the authors define learning analytics, describe how it has been used in educational institutions and what learning analytics tools are available, and explain how faculty can use data from their courses to monitor student performance. Finally, the authors articulate several problems and uncertainties around the use of learning analytics in higher education.


2022 ◽  
pp. 137-161
Author(s):  
Paula Miranda ◽  
Pedro Isaías ◽  
Sara Pifano

The impact of the swift evolution of technology has rippled across all areas of society with technological developments presenting solutions to some of society's greatest challenges. Within higher education, technology is welcomed with the necessary caution of a sector that is responsible for educating and empowering the future workforce. The progressive, and more recently accelerated, digitalisation of education causes the core practices and procedures associated with teaching and learning, including assessment, to be delivered in innovative formats. Technology plays a central role in the delivery of e-assessment, widening its possibilities and broadening its methods and strategies. This chapter aims to examine how innovative technologies are shaping and improving the delivery of e-assessment in the context of higher education. More specifically, it examines the role of artificial intelligence, gamification, learning analytics, cloud computing, and mobile technology in how e-assessment can be delivered.


Author(s):  
Samira ElAtia ◽  
Donald Ipperciel

In this chapter, the authors provide an overview of the use of learning analytics (LA) and educational data mining (EDM) in higher education, addressing issues related to their uses and applications. They aim to provide meaningful and substantial answers to how both LA and EDM can advance higher education from a large-scale, big data educational research perspective. They present various tasks and applications that already exist in the fields of EDM and LA in higher education, categorising them by their purposes, their uses, and their impact on various stakeholders. They conclude the chapter by critically analysing various forecasts regarding the impact that EDM will have on future educational settings, especially in light of the current situation that has shifted education worldwide into some form of eLearning model. They also discuss and raise fundamental considerations of ethics and privacy in using EDM and LA in higher education.




2020 ◽  
Vol 36 (6) ◽  
pp. 1-6
Author(s):  
Linda Corrin ◽  
Maren Scheffel ◽  
Dragan Gašević

The field of learning analytics has evolved over the past decade to provide new ways to view, understand and enhance learning activities and environments in higher education. It brings together research and practice traditions from multiple disciplines to provide an evidence base to inform student support and effective design for learning. This has resulted in a plethora of ideas and research exploring how data can be analysed and utilised not only to inform educators, but also to drive online learning systems that offer personalised learning experiences and/or feedback for students. However, a core challenge that the learning analytics community continues to face is how the impact of these innovations can be demonstrated. Where impact is positive, there is a case for continuing or increasing the use of learning analytics; however, there is also the potential for negative impact, which needs to be identified quickly and managed. As more institutions implement strategies to take advantage of learning analytics as part of core business, it is important that impact can be evaluated and addressed to ensure effectiveness and sustainability. In this editorial of the AJET special issue dedicated to the impact of learning analytics in higher education, we consider what impact can mean in the context of learning analytics and what the field needs to do to ensure that there are clear pathways to impact that result in the development of systems, analyses, and interventions that improve the educational environment.


NASPA Journal ◽  
2005 ◽  
Vol 42 (2) ◽  
Author(s):  
Martha F. Cleveland-Innes ◽  
Claudia Emes

The nature of interaction in higher education environments impacts not only end outcomes, but also the approach to learning itself. Using a quasi-experimental research design, this empirical study tests the impact of social and academic interaction on student approaches to learning. Findings demonstrate significant correlations between contextual variables and approaches to learning. Most importantly, Peer Interaction and Faculty Interaction have an effect on change in approach to learning over time. This demonstrates the potential of interaction in the learning context to affect not only learning outcomes, but also the way learning itself takes place.


10.29007/vrbv ◽  
2018 ◽  
Author(s):  
Peter Saunders ◽  
Ehsan Gharaie ◽  
Andrea Chester ◽  
Cathy Leahy

Learning analytics is an emerging field that has been gaining momentum in higher education. Learning analytics is the analysis and reporting of learner-related data. Research has examined the benefits of learning analytics in higher education, but there has been limited research about the impact of showing students their own learning data. The aim of this study was to provide students with their own learner data, obtain feedback about the usefulness of this information, and investigate whether providing learning data leads to an increase in self-efficacy and self-reflection. The sample consisted of 78 students studying construction management, project management, and property and valuation. Students were provided with weekly learner reports that included data about their behaviour in a learning management system, their level of interaction in lectures, and their performance on assessments. A suggested target was provided toward an individualised behaviour goal, as well as comparisons with both the contemporary class average and previous class averages. Students completed measures of self-efficacy and self-reflection pre- and post-intervention, and feedback about the reports was obtained through surveys and a focus group. Results showed no significant change in self-efficacy and self-reflection; however, students reported finding the learning analytics reports helpful, believed the reports helped them reflect on their own learning, and wanted to see more analytics in other subjects. These results support the use of learning analytics in the classroom and suggest that it may enhance the student experience.

