Identifying Latent Classes and Differential Item Functioning in a Cohort of E-Learning Students

Andrew Sanford (Monash University, Australia), Paul Lajbcygier (Monash University, Australia) and Christine Spratt (Royal Australian and New Zealand College of Psychiatrists, Australia)
Copyright: © 2009 | Pages: 23
DOI: 10.4018/978-1-60566-410-1.ch011

A differential item functioning analysis is performed on a cohort of E-Learning students undertaking a unit in computational finance. The motivation for this analysis is to identify differential item functioning based on attributes of the student cohort that are unobserved. The authors find evidence that a model containing two distinct latent classes of students is preferred, and identify those examination items that display the greatest level of differential item functioning. On reviewing the attributes of the students in each of the latent classes, and the items and categories that mostly distinguish those classes, the authors conclude that the bias associated with the differential item functioning is related to the a priori background knowledge that students bring to the unit. Based on this analysis, they recommend changes in unit instruction and examination design so as to remove this bias.
Chapter Preview


The aim of this chapter is to discuss the identification of latent classes or groups within a cohort of E-Learning students. These latent classes are determined by the existence of differential item functioning, or item bias, experienced by students within these different classes.1 In our illustrative case study, we are able to identify and interpret a number of these latent classes. Our thesis is that the differential item functioning (DIF), and the latent class structures identified, are a consequence of the students’ diverse educational backgrounds. The case study looks at a unit in computational finance in which students are taking either a single major in commerce or information technology, or a double major in both. We argue that, given the multi-disciplinary nature of the unit, those taking a double major are advantaged in terms of background or a priori knowledge over those taking a single major, resulting in the identified DIF.

DIF analysis seeks to determine whether there are systematic differences in item responses among groups of students, caused by some factor, or factors, other than the innate ability or proficiency of the students. By ‘innate’ ability we mean the student trait that the test items have been designed to measure. DIF analysis thus seeks to identify test items that discriminate amongst students on the basis of factors other than their ability. Item Response Theory (IRT) modeling has a long association with educational and psychometric research, and has proven to be a popular method for detecting DIF.2 Usually, when investigating DIF with IRT models, students are placed into groups based on the presence or absence of these observable, non-ability factors. An IRT model is then estimated for each group, and the parameter estimates are compared to see whether they differ significantly. If they do, DIF is considered to exist.
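
The two-group comparison logic can be sketched in code. The snippet below is a minimal illustration, not the chapter's actual procedure: it fits a simple Rasch (1PL) model to each observed group by joint maximum likelihood and flags items whose difficulty estimates differ markedly. The data, the flagging threshold, and the fit_rasch helper are all hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def fit_rasch(responses):
    """Fit a simple Rasch (1PL) model by joint maximum likelihood.
    responses: (n_students, n_items) binary matrix of item scores."""
    n_students, n_items = responses.shape

    def neg_log_lik(params):
        theta = params[:n_students]               # student abilities
        beta = params[n_students:]                # item difficulties
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - beta[None, :])))
        p = p.clip(1e-9, 1 - 1e-9)                # guard the log terms
        ll = responses * np.log(p) + (1 - responses) * np.log(1 - p)
        return -ll.sum()

    res = minimize(neg_log_lik, np.zeros(n_students + n_items),
                   method="L-BFGS-B")
    beta_hat = res.x[n_students:]
    return beta_hat - beta_hat.mean()             # centre for comparability

# Fit the model separately for each observed group and compare item
# difficulties; large gaps flag candidate DIF items (threshold is illustrative).
rng = np.random.default_rng(0)
group_a = (rng.random((100, 8)) > 0.4).astype(int)   # placeholder responses
group_b = (rng.random((100, 8)) > 0.5).astype(int)   # placeholder responses
gap = np.abs(fit_rasch(group_a) - fit_rasch(group_b))
for item, g in enumerate(gap):
    if g > 0.5:
        print(f"Item {item}: difficulty gap {g:.2f} - possible DIF")
```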

The DIF analysis discussed in this chapter uses examination items and student responses from a unit in computational finance, which has been taught by one of the authors for many years.3 Materials in this unit have been designed to suit online E-Learning, and all assessment has been prepared in a multiple choice format appropriate for automated delivery and scoring. The unit attracts students from a diverse range of educational and cultural backgrounds, and thus provides a range of observed factors (e.g. gender, major area of study, years of academic attainment, international status, ethnic background) that can be used to place students into groups.

Although using observed factors is common, it might not be valid in all circumstances. For example, DIF may be due to factors that are not readily observable, such as a student’s level of motivation, learning intentions, language deficiencies, anxiety, or problem-solving strategies. An alternative to specifying student class membership before carrying out the DIF analysis is to infer that membership as an output of the analysis. In our case study, a predefined number of latent classes is specified within the IRT model, and students are allocated to those classes based on their test item responses.
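
To illustrate how class membership can be inferred from the responses themselves, the sketch below fits a two-class latent class model to binary item responses with an EM algorithm. It is a deliberate simplification of the approach described in this chapter: the full mixture IRT model also includes a continuous ability dimension, whereas this toy version models only class-specific item response probabilities. The function name and data are hypothetical.

```python
import numpy as np

def fit_two_class_lca(X, n_classes=2, n_iter=200, seed=0):
    """Latent class analysis for binary item responses, fitted by EM.
    X: (n_students, n_items) 0/1 matrix. Returns class proportions,
    class-conditional item probabilities, and posterior memberships."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    pi = np.full(n_classes, 1.0 / n_classes)          # class proportions
    p = rng.uniform(0.3, 0.7, size=(n_classes, m))    # P(correct | class, item)
    for _ in range(n_iter):
        # E-step: posterior probability of each latent class for each student
        log_lik = (X[:, None, :] * np.log(p)[None] +
                   (1 - X[:, None, :]) * np.log(1 - p)[None]).sum(axis=2)
        log_post = np.log(pi)[None, :] + log_lik
        log_post -= log_post.max(axis=1, keepdims=True)
        post = np.exp(log_post)
        post /= post.sum(axis=1, keepdims=True)
        # M-step: re-estimate class proportions and item probabilities
        pi = post.mean(axis=0)
        p = ((post.T @ X) / post.sum(axis=0)[:, None]).clip(1e-4, 1 - 1e-4)
    return pi, p, post

# Students are allocated to the class with the highest posterior probability.
X = (np.random.default_rng(1).random((150, 10)) > 0.5).astype(int)  # placeholder
pi_hat, p_hat, post = fit_two_class_lca(X)
assigned_class = post.argmax(axis=1)
```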

Within the case study, DIF analysis is carried out using a polytomous IRT model previously developed by Bolt, Cohen and Wollack (2001).4 A valuable output common to most IRT models, and one which provides a useful visual display of DIF, is the Item Response Function (IRF) or Item Category Characteristic Curve (ICCC). These functions display the probability of selecting each item and/or category as a function of a student’s ability (or proficiency) level.5 The correct category response usually records the highest probability. We reproduce a number of ICCCs to illustrate the differences in response probabilities for the different latent classes of student. (See Figure 2 and Figure 3.)
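
For readers who want to see how such curves arise, the following sketch plots illustrative ICCCs for a single multiple-choice item under a nominal-response-style model, in which each category's selection probability is a softmax function of ability. The slope and intercept values are invented for illustration and are not the chapter's estimates.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical parameters for one four-category item:
# one slope (a) and one intercept (c) per response category.
a = np.array([-0.8, 0.2, 1.5, -0.9])      # category slopes
c = np.array([0.5, 0.3, -0.2, -0.6])      # category intercepts

theta = np.linspace(-4, 4, 200)            # ability (proficiency) scale
z = np.exp(np.outer(theta, a) + c)         # unnormalised category propensities
probs = z / z.sum(axis=1, keepdims=True)   # softmax over categories

for k in range(probs.shape[1]):
    plt.plot(theta, probs[:, k], label=f"category {k}")
plt.xlabel("ability (theta)")
plt.ylabel("P(category | theta)")
plt.title("Item category characteristic curves (illustrative)")
plt.legend()
plt.show()
```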

In the following sections, we review the current IRT and assessment literature and discuss features of the computational finance unit and the student response data. We then describe in greater detail the polytomous IRT model and the methodology used to estimate and compare models. Finally, the results of the DIF analysis are presented and discussed, and recommendations are made for E-Learning assessment.

Complete Chapter List

Editorial Advisory Board
Table of Contents
Gary Poole
Christine Spratt, Paul Lajbcygier
Chapter 1
Selby Markham, John Hurt
Reliability and validity have a well-established place in the development and implementation of educational assessment devices. With the advent of...
Re-Assessing Validity and Reliability in the E-Learning Environment
Chapter 2
Päivi Hakkarainen, Tarja Saarelainen, Heli Ruokamo
In this chapter the authors report on the assessment framework and practices that they applied to the e-learning version of the Network Management...
Assessing Teaching and Students' Meaningful Learning Processes in an E-Learning Course
Chapter 3
Charlotte Brack
Within the notion of Web 2.0, social software has characteristics that make it particularly relevant to E-Learning, aligning well with a social...
Collaborative E-Learning Using Wikis: A Case Report
Chapter 4
Mike Hobbs, Elaine Brown, Marie Gordon
This chapter provides an introduction to learning and teaching in the virtual world Second Life (SL). It focuses on the nature of the environment...
Learning and Assessment with Virtual Worlds
Chapter 5
Paul White, Greg Duncan
This chapter describes innovative approaches to E-Learning and related assessment, driven by a Faculty Teaching and Learning Technologies Committee...
A Faculty Approach to Implementing Advanced, E-Learning Dependent, Formative and Summative Assessment Practices
Chapter 6
Christine Armatas, Bernard Colbert
Two challenges with online assessment are making sure data collected is secure and authenticating the data source. The first challenge relates to...
Ensuring Security and Integrity of Data for Online Assessment
Chapter 7
Robyn Benson
This chapter addresses some issues relating to the use of e-learning tools and environments for implementing peer assessment. It aims to weigh up...
Issues in Peer Assessment and E-Learning
Chapter 8
Paul Lajbcygier, Christine Spratt
This chapter presents recent research on group assessment in an e-learning environment as an avenue to debate contemporary issues in the design of...
The Validity of Group Marks as a Proxy for Individual Learning in E-Learning Settings
Chapter 9
Robert S. Friedman, Fadi P. Deek, Norbert Elliot
In order to offer a unified framework for the empirical assessment of e-learning (EL), this chapter presents findings from three studies conducted...
Validation of E-Learning Courses in Computer Science and Humanities: A Matter of Context
Chapter 10
Richard Tucker, Jan Fermelis, Stuart Palmer
There is considerable evidence of student scepticism regarding the purpose of team assignments and high levels of concern for the fairness of...
Designing, Implementing and Evaluating a Self-and-Peer Assessment Tool for E-Learning Environments
Chapter 11
Andrew Sanford, Paul Lajbcygier, Christine Spratt
A differential item functioning analysis is performed on a cohort of E-Learning students undertaking a unit in computational finance. The motivation...
Identifying Latent Classes and Differential Item Functioning in a Cohort of E-Learning Students
Chapter 12
Christine Armatas, Anthony Saliba
A concern with E-Learning environments is whether students achieve superior or equivalent learning outcomes to those obtained through traditional...
Is Learning as Effective When Studying Using a Mobile Device Compared to Other Methods?
Chapter 13
Thomas C. Reeves, John G. Hedberg
Evaluation falls into the category of those often neglected human practices such as exercise and eating right. All of us involved in education or...
Evaluation Strategies for Open and Distributed Learning Environments
Chapter 14
Madhumita Bhattacharya
This chapter presents a description and analysis of salient issues related to the development of an integrated e-portfolio application implemented...
Introducing Integrated E-Portfolio Across Courses in a Postgraduate Program in Distance and Online Education
Chapter 15
John LeBaron, Carol Bennett
Teachers and designers of computer-networked settings increasingly acknowledge that active learner engagement poses unique challenges, especially...
Practical Strategies for Assessing the Quality of Collaborative Learner Engagement
Chapter 16
Som Naidu
Many teachers commonly use assessment as the starting point of their teaching activities because they believe that assessment drives learning and...
Afterword: Learning-Centred Focus to Assessment Practices
About the Contributors