Personalising the Pathway: Learning Analytics for Leveraging Pedagogically Purposeful Indicators of Academic Performance

DOI: 10.4018/978-1-4666-9983-0.ch009

Abstract

The Eynesbury Institute of Business and Technology (EIBT) is one of a growing number of private providers partnering with universities to establish pre-university pathway programs worldwide. Offering a second chance to prospective students who do not meet initial Australian Higher Education (HE) entrance requirements, pathway providers attract students early in their tertiary lifecycle to secure their destination. EIBT has an abundance of empirically rich and ‘big’ data that, carefully evaluated, may yield pedagogically useful indicators, predictors and recommendations for advancing teaching and learning. The work of ‘researchers’ often resides in isolation from that of ‘educators’; this ‘gap’ may reflect a poor cycle of communication and interaction between empirical studies and praxis. This chapter is limited to a select and somewhat brief discussion of specific uses of Learning Analytics (LA) in the context of EIBT and the related sense-making and prediction of student performance.

Introduction

In the Higher Education (HE) sector, learning quality assurance data are typically derived from Student Experience Surveys (Lockyer, Heathcote, & Dawson, 2013) alongside measures of attrition, progression and assessment scores (Monson, Bunney, & Lawrence, 2013). The adoption of education technologies—such as Learning Management Systems (LMS)—has generated a vast set of accessible data. These data, however, are commonly used ‘retrospectively’: to improve future iterations of program delivery, to determine impact on learning outcomes, and to benchmark overall institutional performance. Yet digital footprints can also be collected and analysed to establish indicators of teaching quality and to provide a more proactive assessment of student learning and engagement.

This chapter employs the definition of Learning Analytics (LA) set out in the first International Conference on Learning Analytics and Knowledge (LAK 2011) and adopted by the Society for Learning Analytics Research (SoLAR) that says, ‘LA is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs’ (Siemens, 2011). Indeed, the educational ‘context’ plays a key role as ‘an in-depth understanding of the institution and how it functions will include knowledge of its operational and strategic objectives and direction’ (Prinsloo, Archer, Barnes, Chetty, & Van Zyl, 2015). As it stands, this definition is usually coupled with two underlying assumptions: (1) that LA make use of pre-existing and machine-readable data; and (2) that LA techniques are used to handle ‘big data’ i.e., large data sets that would not be practicable to deal with manually.

The Eynesbury Institute of Business and Technology (EIBT) has an abundance of empirically rich and ‘big’ data that, carefully evaluated, may yield pedagogically useful indicators, predictors and recommendations for advancing teaching and learning. In agreement with Slade and Prinsloo (2013, p. 5), ‘[h]igher education institutions have always analyzed data to some extent, but the sheer volume of data continues to rise along with institutions’ computational capacity...’ EIBT is conceptually bridging technical and educational domains (Lockyer et al., 2013, p. 1446). EIBT’s LMS is embedded within a ‘wider network of platforms and systems’ involved in supporting its institutional teaching and learning mission, and is increasingly viewed as a core component of EIBT’s educational infrastructure (Macfadyen & Dawson, 2012, p. 150). That is, EIBT is increasing its focus on data-driven decision-making and the integration of the technical with the social/pedagogical dimensions of HE (Romero & Ventura, 2013, p. 13). Added to this, however, is the need to utilise such analytical tools to reduce the timeframe between (a) analysis and (b) action. Ideally, effective tools promptly surface real-time user feedback and enable manipulation and visualisation tailored to the interests of researchers, practitioners and stakeholders.
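To make the analysis-to-action point concrete, the short Python sketch below shows how weekly engagement indicators might be derived from an LMS activity export and flagged promptly, within the teaching period rather than after it. It is an assumption-laden illustration only: the file name lms_activity.csv and the columns student_id, timestamp, resource and action are hypothetical, not EIBT’s actual LMS schema.

# Minimal, illustrative sketch only: the export file and column names are
# hypothetical assumptions, not EIBT's actual LMS data model.
import pandas as pd

# Load clickstream-style activity records exported from the LMS.
logs = pd.read_csv("lms_activity.csv", parse_dates=["timestamp"])

# Aggregate per-student weekly engagement indicators (event counts and
# distinct resources accessed).
logs["week"] = logs["timestamp"].dt.isocalendar().week
engagement = (
    logs.groupby(["student_id", "week"])
        .agg(events=("action", "count"), resources=("resource", "nunique"))
        .reset_index()
)

# Flag a sharp week-on-week drop in activity for prompt follow-up,
# shortening the gap between analysis and action.
engagement["events_prev"] = engagement.groupby("student_id")["events"].shift(1)
engagement["sharp_drop"] = engagement["events"] < 0.5 * engagement["events_prev"]
print(engagement[engagement["sharp_drop"]])

Visualisation of the same aggregates (for example, per-course views) could then be layered on top, according to the interests of the researchers, practitioners and stakeholders mentioned above.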

EIBT is successfully using its data for many and varied purposes, including linking available datasets with its fellow pathway colleges and partner universities in order to implement more learner-oriented services and thus improve student performance. With analytical tools growing more powerful and their reach increasing, the aim is to assist EIBT educators in interpreting instructor- and learner-centric data to inform future pedagogical decisions. Objectives related to EIBT’s motivation to pursue LA include, for example:

  • To develop a deep(er) understanding of student learning at an individual level to support the personalisation of students’ educative experiences;

  • To embed emergent feedback on student learning into practice by enabling timely adaptations to EIBT’s teaching and learning; and

  • To identify Students-At-Risk (STAR) of poor learning experiences/outcomes, in real time and with sufficient insight to allow for meaningful intervention (a brief sketch of such flagging follows this list).
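As a hedged illustration of the STAR objective only, the Python sketch below trains a simple logistic regression on a past cohort and scores the current cohort while intervention is still possible. The file names, feature names (attendance_rate, lms_logins, assignment_1) and the 0.6 risk threshold are illustrative assumptions, not EIBT’s actual model or data.

# A hedged sketch of Students-At-Risk (STAR) flagging, not EIBT's actual model.
# All file names, feature names and thresholds below are illustrative assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Historical records: early-semester indicators plus the known final outcome (passed = 0/1).
history = pd.read_csv("past_cohorts.csv")
features = ["attendance_rate", "lms_logins", "assignment_1"]
model = LogisticRegression().fit(history[features], history["passed"])

# Score the current cohort while there is still time to intervene.
current = pd.read_csv("current_cohort.csv")
current["risk"] = 1 - model.predict_proba(current[features])[:, 1]  # probability of not passing
star = current[current["risk"] > 0.6].sort_values("risk", ascending=False)
print(star[["student_id", "risk"]])

In practice, any such scoring would need to be validated against EIBT’s own cohorts and paired with educator judgement before any intervention is made.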
