Development, Sustainment, and Scaling of Self-Regulated Learning Analytics: Prediction Modeling and Digital Student Success Initiatives in University Contexts

Matthew L. Bernacki
DOI: 10.4018/978-1-6684-6500-4.ch012

Abstract

Undergraduates who engage in high-structure, high-enrollment, active learning courses tend to perform best when they self-regulate their learning. Such self-regulation is encouraged by active learning designs that provide resources learners can use to enact strategies and to plan, monitor, and evaluate their learning processes. Some learners, however, may need to “learn to learn” before these resources can be leveraged effectively. This chapter documents a scalable co-design and learning analytics project led by learning scientists and science, technology, engineering, and math (STEM) instructors who enhanced the digital resources of large-lecture courses’ learning management systems (LMSs). Students’ engagement with these courses produced data about self-regulated learning processes, affording feature engineering, prediction modeling, and the provision of timely support to students predicted to struggle. The authors report initial results through the lens of self-regulated learning theory and elaborate cases demonstrating institutionalization and replication at research and regional institutions.

Introduction

In response to issues with student performance, retention, progression, and completion (Arroway et al., 2016), many universities and their educational software providers are embracing learning analytics (Lang et al., 2017, 2022) to understand the student learning experience through the learning event data that undergraduates produce while they learn in their courses (Bernacki, 2018). Those data can be used to improve instructional practices, support for student learning, and institutional resources that encourage students’ retention, progression, and completion. Rising interest among university administrators and information technology leaders is well documented in trade publications such as the Campus Computing Project’s annual report (Green, 2019) and industry trend reports such as EDUCAUSE’s Horizon Report for Higher Education (Pelletier et al., 2022). However, the fervor about learning analytics’ promise to revolutionize higher education forecast in earlier editions (e.g., Adams Becker et al., 2017) has been tempered in recent years. Pelletier et al. (2022, p. 9) describe “learning analytics and big data” with more reservation:

The promise of harnessing big data to improve student outcomes and increase automation of services such as the LMS has been enticing to many institutional leaders over the past five years. However, the implementation of big-data systems has not resulted in much change for campuses. Large amounts of data are collected but rarely harnessed to effect meaningful change in outcomes or even systems. Mature, institution-wide data mining strategies are needed to see progress on learning analytics and big data. Some barriers to success include student privacy and equity concerns, lack of buy-in from faculty, and investment in staff and resources for data reporting. A path forward needs to rely heavily on collaboration with all the key stakeholders at the table, including faculty, and starts with a clear vision of what big questions need to be answered.

Key Terms in this Chapter

Log (or Logfile): Any record of timestamped events recorded by software that can be used to observe traces of learner actions, which can be interpreted through theory to reflect learning events.
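
To make this concrete, below is a minimal sketch, assuming a hypothetical comma-separated LMS logfile format, of how one raw log line can be parsed into a structured, timestamped record of a learner action. The field names, action labels, and example values are illustrative assumptions, not drawn from the chapter’s data.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class LogEvent:
    """One timestamped learner action recovered from a logfile."""
    student_id: str
    timestamp: datetime
    action: str        # e.g., "resource_view", "quiz_attempt" (hypothetical labels)
    resource_id: str

def parse_log_line(line: str) -> LogEvent:
    """Parse one comma-separated log line into a LogEvent."""
    student_id, ts, action, resource_id = line.strip().split(",")
    return LogEvent(student_id, datetime.fromisoformat(ts), action, resource_id)

# One raw line becomes a trace that theory can later interpret as a learning event.
event = parse_log_line("s042,2023-01-15T09:30:00,resource_view,practice_quiz_1")
```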

Learning Resource: Elaborated in recommended readings (Bernacki, 2018), a learning resource is a digital or physical piece of educational media provided to a learner within a learning environment. Educators can consider how a resource aligns to learning objectives and assessments, and draw on learning theories to determine how access to and use of the resource might produce traces that reflect a learning event.

Learning Event: Elaborated in recommended readings (Bernacki, 2018), a learning event is a timestamped event captured as a trace from process data in a learning environment that can reliably be interpreted to reflect a theoretically grounded learning process. This interpretation requires validation of the event using corroborating data from the learner and a theory that describes the learning process the data reflect.

Trace: A recorded observation of an event that occurs during a session. The observation may be captured from the logfile produced by a learning technology or from an audio or video recording. This isolated trace then needs to be considered through the lens of a learning theory to determine whether it reflects a specific learning event.

Feature Engineering: Any actions taken to create a new trace from raw event data captured in a logfile. Learning analytics practitioners will engage in feature engineering as they label and organize learning events of interest to educators who aim to make decisions or inferences about learners in a learning environment.
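
As one hedged illustration, the sketch below rolls raw log events (as parsed in the earlier sketch) up into per-student features. The mapping of actions such as syllabus views to “planning” and quiz attempts to “monitoring” is an assumed, theory-inspired labeling for illustration only, not the chapter’s actual feature set.

```python
from collections import Counter

def engineer_features(events):
    """Aggregate raw log events into per-student feature counts.

    The action-to-process mapping below is an illustrative assumption.
    """
    per_student = {}
    for e in events:
        per_student.setdefault(e.student_id, Counter())[e.action] += 1
    return {
        sid: {
            "n_planning": counts["syllabus_view"] + counts["calendar_view"],
            "n_monitoring": counts["quiz_attempt"] + counts["grade_check"],
            "n_total": sum(counts.values()),
        }
        for sid, counts in per_student.items()
    }
```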

Log(file) Analysis: Any variety of descriptive, inferential, or machine learning analysis of data derived from a logfile and inferred to trace learning events.
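
For instance, a minimal machine learning sketch of such analysis, assuming features like those engineered above and an illustrative binary “struggled” outcome label, might use scikit-learn’s logistic regression to estimate each student’s probability of struggling. The feature values and labels here are toy placeholders, not results from the chapter.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative per-student features: [n_planning, n_monitoring, n_total].
X = np.array([[3, 7, 42], [0, 1, 5], [5, 12, 60], [1, 2, 9]])
y = np.array([0, 1, 0, 1])  # 1 = struggled on the outcome of interest (toy labels)

model = LogisticRegression().fit(X, y)
risk = model.predict_proba(X)[:, 1]  # estimated probability each student struggles
```

In practice, such a model would be trained on past cohorts and validated before being used to trigger timely supports for students predicted to struggle, as the chapter describes.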

Learning Environment: Addressed in this chapter and further described by Matcha and colleagues in recommended additional readings, a learning environment includes the digital technology platform on which a learner engages in a task and the instructional design of the task. The learning goal set out by the designer, instructor, or learner and the assessment practices used to evaluate performance will determine the types of learning resources provided to the learner, and these will vary across the course sites on a learning management system. Tracing learning events is thus context-dependent, and definitions will vary from one environment to the next.
