Unpacking the 'Learning' in Student-Facing Analytics: Metacognition and the Zone of Proximal Development

Joy Galaige, Geraldine Torrisi-Steele
Copyright: © 2019 | Pages: 12
DOI: 10.4018/IJAVET.2019010101

Abstract

Founded on the need to help university students develop greater academic metacognitive capacity, student-facing learning analytics are considered useful tools for making students overtly aware of their own learning processes, helping them develop control over their learning, and subsequently supporting more effective learning. However, early research on the effectiveness of student-facing analytics is yielding mixed results, casting some doubt over their usefulness. One factor contributing to this doubt is that the design and implementation of student-facing learning analytics remain firmly rooted in the technical domain, with virtually no grounding in the knowledge base of learning and teaching. If the growing investment of resources into the development of student-facing learning analytics systems is to be fruitful, then there is an obvious, urgent need to re-position student-facing learning analytics within learning and teaching frameworks. With this in mind, we use Schraw and Dennison's model of metacognition and Vygotsky's zone of proximal development to unpack the 'learning' in student-facing analytics and work towards an understanding of student-facing analytics that is more conducive to supporting metacognition and effective learning.
Article Preview

Introduction

Data related to the learning activities of students is produced as a matter of course whenever students enter learning management systems and interact with content. Hence, learning analytics are essentially a by-product of student interaction with online learning systems. With the proliferation of online learning in higher education, data to sustain learning analytics systems is in abundant supply. Volume and diversity of data is not an issue. What is an issue is the lack of a pedagogical knowledge base on which to ground the effective design and implementation of learning analytics. To better understand how this state of affairs has arisen, it is useful to briefly discuss the context of learning analytics.

Analytics, or the systematic analyses of data, are not new. In the domain of business and marketing, analytics have long been used, largely successfully, to acquire business intelligence and provide a knowledge base for business decision-making. In higher education too, initial interest in analytics was sparked by business needs since, in recent times, higher education has experienced a shift towards universities running as 'competitive businesses'. Social and economic forces are bearing down on universities, putting institutions under pressure to become competitive. Hence, akin to other business organisations, universities' use of analytics has roots in the need to understand their customers, improve the customer experience, and meet accreditation policies. The motivation for applying analytics to improving student learning can, in some respects, be considered secondary to policy and business needs; in reality, however, this framing obscures teaching and learning as a core business of universities. Institutions must recognize that effective learning is integral to the student experience, and thus must be considered central to the needs of the higher education organization.

As discussion turns towards effective student learning, we are compelled to drill down beneath the institutional perspective to consider educator perspectives on learning analytics. A core activity of educators is trying to better understand their learners. The data generated by student interaction with online learning systems (learning analytics) thus naturally attracts interest as an avenue for creating more effective learning environments built on an understanding of learners. Accordingly, the first implementations of learning analytics systems targeted the needs of educators (Teasley, 2017). Educators could use analytics to gauge the progress of students, identify troublesome concepts and, importantly, identify struggling students; a minimal illustrative sketch of this kind of analysis follows below. Informed by learning analytics, educators can make evidence-based changes to their teaching practices and strategies or put in place additional support mechanisms to assist students (Siemens, 2013; Arnold & Pistilli, 2012; Essa & Ayad, 2012). It did not take long for educators to conceive that students, too, may stand to benefit from being able to access data about their own learning.
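
To make the educator-facing use cases above concrete, the following is a minimal, hypothetical sketch (not drawn from the article) of how LMS interaction logs might be aggregated to flag potentially struggling students. The event format, field names, and thresholds are illustrative assumptions only, not a description of any particular learning analytics system.

# Hypothetical sketch: aggregate LMS event logs per student and flag
# students with markedly low activity or low average quiz scores.
# Event format, names, and thresholds are invented for illustration.

from collections import defaultdict
from statistics import mean

# Each event is a (student_id, event_type, score) tuple; score is None
# for non-assessment events such as page views or forum posts.
events = [
    ("s01", "page_view", None),
    ("s01", "quiz", 0.85),
    ("s02", "page_view", None),
    ("s02", "quiz", 0.40),
    ("s03", "quiz", 0.30),
]

interactions = defaultdict(int)   # total LMS interactions per student
quiz_scores = defaultdict(list)   # quiz scores per student

for student, event_type, score in events:
    interactions[student] += 1
    if event_type == "quiz" and score is not None:
        quiz_scores[student].append(score)

avg_interactions = mean(interactions.values())

# Flag students whose activity falls well below the cohort average,
# or whose mean quiz score falls below an assumed cut-off of 0.5.
for student in interactions:
    low_activity = interactions[student] < 0.5 * avg_interactions
    low_scores = quiz_scores[student] and mean(quiz_scores[student]) < 0.5
    if low_activity or low_scores:
        print(f"Possible at-risk student: {student}")

In practice, a production system would draw on far richer interaction data and validated indicators; the point of the sketch is simply the kind of aggregation and flagging that educator-facing analytics perform.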
