Predicting Student Satisfaction and Outcomes in Online Courses Using Learning Activity Indicators


Kenneth David Strang
DOI: 10.4018/IJWLTT.2017010103

Abstract

The premise for this study was that learner interaction in an online web-based course, that is, e-learning, could be assessed in relation to academic performance. Although some studies reveal that learner interaction with online content is related to student academic performance, it remains unproven whether this relationship is causal, or even whether there is a significant correlation. Thus, this study seeks to measure whether there is a directional and then a causal relationship between student online academic performance, engagement analytics, and other online activity factors. A unique aspect of this study is that data are collected from Moodle engagement analytics as well as from the activity logs. Student academic performance is measured by the grade achieved on an assessment designed to map to the course learning objectives.
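To illustrate the directional (correlational) part of this design, the following is a minimal sketch, not the study's actual procedure, of how per-student Moodle activity counts could be tested for association with the assessment grade; the file name and column names (logins, resource_views, forum_posts, grade) are hypothetical.

```python
# Minimal sketch (hypothetical, not the study's actual procedure): test whether
# per-student Moodle activity counts are associated with the assessment grade.
# File and column names (logins, resource_views, forum_posts, grade) are assumed.
import pandas as pd
from scipy import stats

# One row per student (hypothetical export of activity-log counts plus grade).
df = pd.read_csv("moodle_activity_grades.csv")

activity_columns = ["logins", "resource_views", "forum_posts"]

for column in activity_columns:
    # Spearman rank correlation tolerates the skewed, non-normal count
    # distributions typical of LMS activity logs.
    rho, p_value = stats.spearmanr(df[column], df["grade"])
    print(f"{column}: rho = {rho:.3f}, p = {p_value:.4f}")
```

A significant rank correlation in a check like this would only indicate a directional association; establishing the causal relationship the study aims to test would additionally require controlling for confounders such as prior academic ability.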

Introduction

Learning analytics have been applied over the last decade to study and visualize the relationship between student activity and performance in online university-level courses (Dyckhoff, Zielke, Bültmann, Chatti & Schroeder, 2012; Ferguson, 2012; Gunn, 2014; Nieto Acevedo & Montenegro Marín, 2015; Retalis, Papasalouros, Psaromiligkos, Siscos & Kargidis, 2006; Scanlon, McAndrew & O'Shea, 2015). The authors of 11 relevant studies published in peer-reviewed scholarly journals all found some benefits, but they also cited many problems when trying to assess student learning through combinations of learning analytics, learning management system (LMS) activity data logs, and graded performance results (Agudo-Peregrina, Iglesias-Pradas, Conde-González & Hernández-García, 2014; Fidalgo-Blanco, Sein-Echaluce, García-Peñalvo & Conde, 2015; Gómez-Aguilar, Hernández-García, García-Peñalvo & Therón, 2015; Iglesias-Pradas, Ruiz-de-Azcárate & Agudo-Peregrina, 2015; Nieto Acevedo et al., 2015; Reyes, 2015; Ruipérez-Valiente, Muñoz-Merino, Leony & Delgado Kloos, 2015; Scheffel, Drachsler, Stoyanov & Specht, 2014; Xing, Guo, Petakovic & Goggins, 2015; Yahya, Messoussi & Touahni, 2015).

Each of the scholarly manuscripts cited above made unique contributions to the literature beyond the inferential deduction that previous GPA predicts future GPA. In addition to explaining how learning analytics may be used to measure and visualize student learning activity in online courses, all of those researchers raised the concern that deep learning may not be reliably predicted by learning analytics. In fact, only two groups of researchers (Agudo-Peregrina et al., 2014; Gómez-Aguilar et al., 2015) found any statistically significant correlations between student online activity reported in learning analytics and academic performance. More than one researcher confirmed there was no statistically significant relationship between learning analytics, LMS activity data, and student learning outcomes (Iglesias-Pradas et al., 2015). Therefore, more studies are needed to test whether learning analytics data relate to or predict student performance.

It is difficult to generalize the positive, negative, or null findings of the above studies to any business school population because of differences in research design, unit of analysis, LMS context, and the subject matter disciplines from which the samples were drawn. For example, only a few studies tested for and used objective measures of student learning performance as the dependent variable. All of those researchers called for further studies to explore how learning analytics and LMS data could be used to assess student performance.

In numerous empirical studies published in scholarly peer-reviewed journals beyond the scope of learning analytics, researchers have found statistically significant links between student performance in online courses and activity-related factors (Blumenthal et al., 2014; Chang, Wu, Kuo & You, 2012; Farrington, 2014; Gibson & Dunning, 2012; Hu, Lo & Shih, 2014; Kaufman & Schunn, 2011; Lu & Law, 2012; Lu & Zhang, 2012; Mirriahi & Alonzo, 2015; Pombo & Talaia, 2012; Russell, 2015; Shih, 2011; Strang, 2010, 2011, 2013a; Thomas, Reyes & Blumling, 2014; Wichadee, 2014; Zacharis, 2015). Thus, other domains may offer relevant concepts that show how student activity in online courses relates to estimating performance, as well as how that knowledge could be applied to improve teaching.
