An Argument-Based Approach to Test Fairness: The Case of Multiple-Form Equating in the College English Test


Yan Jin (School of Foreign Languages, Shanghai Jiao Tong University, Shanghai, China) and Eric Wu (University of California, Los Angeles (UCLA), Temple City, CA, USA)
DOI: 10.4018/IJCALLT.2017070104


This article aims to demonstrate how innovative testing practices can effectively prevent high-tech mass cheating and improve fairness in language assessment. It first introduces Xi's (2010) view of validity and fairness and her proposal of an argument-based approach to empirically examining test fairness. It then describes the threat to fair testing posed by high-tech cheating on the College English Test (CET) and reports a study of multiple-form equating aimed at achieving alternate-form reliability when multiple versions and multiple forms are used in the CET. The article concludes with a discussion of the usefulness of an argument-based approach to empirically examining test fairness.

1. Introduction

In the field of language testing and assessment, technology is generally seen as a blessing: the latest developments in information and communications technology (ICT), for example, have improved not only the efficiency of assessment practices, through delivering tests on computers or over the internet, but also the validity of language tests, through introducing new language constructs and bringing about changes in language teaching and learning (Chalhoub-Deville & Deville, 1999; Chapelle, 2008; Chapelle & Douglas, 2006; Chapelle & Voss, 2016). However, the blessing of technology may well become a threat if used improperly by ill-intentioned people. Technology-assisted cheating on the College English Test (CET), a national language testing system in China (see Jin, 2010, 2016; Yang, 2003; Zheng & Cheng, 2008), is a case in point. In recent years, as the stakes of the CET have risen, high-tech cheating has become a major concern for the test developer and its stakeholders. By "high-tech cheating on tests", we mean the use of communications devices such as mobile phones, needle cameras, invisible watches, mini-earpieces, and multifunctional Bluetooth transmitters to send and/or receive messages in a testing situation. The purpose of this article is to discuss the misuse of technology for cheating and to introduce an innovative, preventive measure to combat high-tech cheating.

There is no denying that cheating on tests constitutes a serious threat to test fairness (AERA, APA, & NCME, 1999, 2014). Cheating also raises ethical concerns that require ongoing consideration by test developers and administrators (Cizek, 1999). To prevent high-tech cheating on the CET, the National Education Examinations Authority (NEEA), the government institution in charge of the operation of the CET, designed a measure called multiple versions and multiple forms (MVMF), which has been implemented by the National College English Testing Committee (NCETC), the developer of the CET, since 2013. One of the technical challenges in implementing MVMF is score equating: differences in difficulty across alternate forms must be adjusted so that the forms yield comparable score scales. In this article, we adopt the argument-based approach advocated in Xi (2010) to examine, from the CET developer's perspective, the issue of high-tech cheating that threatens the fairness of the test.
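To illustrate the general idea of score equating mentioned above, the sketch below implements linear (mean-sigma) equating, one of the simplest classical equating methods. This is only an illustration of the concept under a random-groups assumption, not the equating procedure actually used for the CET; the score data are hypothetical.

```python
# Illustrative sketch of linear (mean-sigma) score equating under a
# random-groups design. NOT the CET's actual equating procedure;
# the raw scores below are hypothetical.
from statistics import mean, pstdev


def linear_equate(x_scores, y_scores):
    """Return a function that maps a raw score on form X onto the
    scale of reference form Y, so equated X scores share Y's mean
    and standard deviation."""
    mu_x, sd_x = mean(x_scores), pstdev(x_scores)
    mu_y, sd_y = mean(y_scores), pstdev(y_scores)
    slope = sd_y / sd_x
    return lambda x: mu_y + slope * (x - mu_x)


# Hypothetical raw scores from two randomly equivalent groups,
# each taking a different form of the same test:
form_x = [52, 58, 61, 64, 70, 75, 77, 81, 85, 90]
form_y = [55, 60, 63, 66, 72, 76, 79, 83, 86, 92]

equate = linear_equate(form_x, form_y)
# By construction, a raw score equal to form X's mean is mapped
# exactly to form Y's mean, and the transformation is monotonic.
```

In practice, operational programs typically use more robust designs (e.g., anchor-item or equipercentile equating) rather than this simple linear mapping, but the core goal is the same: adjusting for form difficulty so that scores from different forms can be reported on one comparable scale.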
