A System to Manage Grammatical Level Tests in the Context of Language Schools
Antonio Sarasa Cabezuelo (Department of Computer Systems and Programming, Complutense University of Madrid, Madrid, Spain)
Copyright: © 2016 |Pages: 17
DOI: 10.4018/JCIT.2016100104


A common problem in language schools is the preliminary assessment of students' knowledge in order to assign them a skill level in the language they wish to learn. Level tests normally consist of an oral test, a written test and a listening test. All of these tests can be performed from a remote location, although the written test is the least complicated to administer at a distance, since it generally does not require the presence of an examiner. This article describes how the administration and management of written level tests has been automated in the specific case of teaching Spanish to foreigners at the University of Zaragoza.

1. Introduction

Most universities have language training services that offer students and staff the opportunity to learn a language. In order to access these courses, students must take a set of tests (Kumaravadivelu, 1994) that determine their level of knowledge of the language they would like to study. Generally, three types of tests are performed (Finocchiaro, 1983): an oral test (McKay, 2002) that measures the ability to understand and be understood by a native speaker, a comprehension test that measures the ability to understand conversations between native speakers (Rivers, 1981), and a written test (Hadley, 1993) that measures mastery of the vocabulary and grammatical constructions of the language. Normally the oral test (Lado, 1961) is performed with a teacher from the school, and a multimedia resource such as a video or a recorded conversation is used in the listening test (Pinto-Llorente, 2016). The written test (Nuttall, 1996) usually consists of exercises such as writing an essay on a proposed theme, rearranging the components of a sentence so that the sentence is correct, or completing empty parts of a phrase with the most appropriate words (Pinto-Llorente, 2014).
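The last two exercise types mentioned above (sentence reordering and fill-in-the-blank) are the ones most amenable to automatic grading. As a minimal illustrative sketch, not taken from the article, the checks could look like the following (function names and the Spanish examples are hypothetical):

```python
def check_reordering(submitted_words, correct_sentence):
    """Check a sentence-reordering answer: the student's word order,
    joined with spaces, must match the expected sentence exactly."""
    return " ".join(submitted_words) == correct_sentence


def check_fill_in(answer, accepted_words):
    """Check a fill-in-the-blank answer against a set of accepted words,
    ignoring surrounding whitespace and letter case."""
    return answer.strip().lower() in {w.lower() for w in accepted_words}


# Hypothetical examples
print(check_reordering(["El", "gato", "duerme"], "El gato duerme"))  # True
print(check_fill_in("está", ["está", "esta"]))                       # True
```

A real grader would likely also normalize punctuation and accents, but the core comparison logic is this simple.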

There are several ways to automate level testing (Yang, 2009), which can be classified into language-specific assessment tools and general assessment tools (Squires, 1997). The first category includes tools (Voorheis, 2004) that allow only a particular language to be evaluated. These are usually not free, and it is necessary to pay for some kind of license. Their main advantage (Gottliebson, 2010) is their specificity with regard to the language; their disadvantages are that they cannot be reused with other languages (Pinto-Llorente, 2015) and that they offer minimal capacity for adaptation and configuration (usually the software cannot be modified, its interfaces offer few options for adaptation, and it does not usually provide user management services). The second category (Bachmann, 2005) comprises generic assessment tools that provide the ability to create tests with different types of questions (Shohamy, 1988), such as multiple choice, free text, and jumbled words (Abello, 2008). Their main features are: a) they are flexible and adaptable to different contexts (Elbeck, 2014); b) in many cases, they allow the evaluation process to be configured (conditional navigation, free navigation, etc.); c) they allow the creation of a repository of questions; and d) it is possible to set the way the questions are shown (e.g., random presentation of questions and answers). Both paid and free tools of this type exist; however, the number of free tools (both online and desktop) is greater than the number of paid ones. Examples include forms from Google Drive (Ji, 2015), as well as QuestionPro, ThatQuiz, and Testmoz.
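Features c) and d) above (a question repository plus randomized presentation) can be sketched in a few lines. The following is a hypothetical illustration, not code from the article or from any of the tools named; the data layout and function name are assumptions:

```python
import random

# Hypothetical question repository: each entry has a prompt,
# its answer options, and the correct answer.
questions = [
    {"text": "___ gato duerme.",
     "options": ["El", "La", "Los"], "answer": "El"},
    {"text": "Nosotros ___ español.",
     "options": ["hablamos", "habla", "hablan"], "answer": "hablamos"},
]


def present_test(repo, rng=None):
    """Return the questions in random order, each with its
    answer options shuffled, without mutating the repository."""
    rng = rng or random.Random()
    presented = []
    for q in rng.sample(repo, len(repo)):  # random question order
        opts = q["options"][:]
        rng.shuffle(opts)                  # random option order
        presented.append({"text": q["text"],
                          "options": opts,
                          "answer": q["answer"]})
    return presented


for q in present_test(questions):
    print(q["text"], q["options"])
```

Passing a seeded `random.Random` instance would make the presentation reproducible, which is useful for testing the tool itself.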
