Assessing the Composition Program on Our Own Terms


Sonya Borton (Philadelphia University, USA), Alanna Frost (University of Alabama at Huntsville, USA) and Kate Warrington (Lindsey Wilson College, USA)
DOI: 10.4018/978-1-60566-667-9.ch010

As Jacqueline Jones Royster articulated at the 2006 Conference on College Composition and Communication, English departments are already assessing themselves and should resist suggestions by the Spellings Commission on the Future of Higher Education that a standardized method of assessing students and programs in higher education is needed. In anticipation of the University of Louisville's fall 2006 review by the Southern Association of Colleges and Schools (SACS), the First-Year Composition program chose to conduct an internal assessment in the fall of 2004. This chapter details that program assessment and includes a comprehensive analysis of its rationale, theoretical foundations, methodologies, and results. It also articulates the difficulties of such a large-scale assessment, as well as the uniquely local challenges faced during the process.
Chapter Preview


“Treat program development, including formal assessment, as an adventurous space, open to explore” (Haswell, 2001, p. 188).

The Spellings Commission report on higher education, A Test of Leadership: Charting the Future of U.S. Higher Education (2006), has caused much debate and concern among postsecondary educators. One of the primary concerns educators have about this report is its call for widespread standardized assessment of institutions of higher education in order to encourage “accountability.” Specifically, the report recommends the development of a database that houses information comparing the performance, generally based upon standardized testing, of diverse groups of students across institutions of higher learning. According to the report, this collection of data will allow “meaningful interstate comparison of student learning” so that “state policymakers can [. . .] identify shortcomings as well as best practices” (p. 23). Brian Huot (2007), in his critique of the Spellings Commission report, responds to this recommendation and its goals, pointing out, “There appears to be an assumption that all students can learn equally well at all institutions, when in fact it has become increasingly apparent that educational success or failure is about whether or not students can establish relevant and productive learning relationships within a specific educational environment” (p. 519). According to Huot, as well as numerous other scholars (McLeod, Horn, and Haswell, 2005; Whithaus, 2005; Contreras-McGavin and Kezar, 2007), these kinds of standardized assessments provide little useful information about situated student learning. Rather, assessments that take into consideration the local context and culture of the institution yield significantly more information that can be used to reform higher education in a meaningful way while addressing specific student needs.

Standardized methods of assessment cannot adequately measure the abilities of the diverse student populations at all institutions of higher education. Nevertheless, the Spellings Commission report relies on a standardized instrument, the National Assessment of Adult Literacy (NAAL), to claim that “the percentage of college graduates proficient in prose literacy has actually declined from 40 to 31 percent in the past decade” (p. 3). As Huot notes, these results may indicate that “there is a different population of students entering our doors that we must become more able to teach” (p. 518). However, as he also explains, a more appropriate response might be that we “need to find better ways of testing what people can really do, rather than creating tests that ensure their poor performance and the condemnation of the institutions charged with educating them” (p. 518). Focusing on student abilities is a more useful way of establishing benchmarks within a specific academic program, institution, or higher education system. The alternative is to attempt to assess student learning with a narrow measure of skills valued by an outside testing authority with little familiarity with the institution to be assessed and possibly, in the case of the Spellings Commission report, little familiarity with higher education instruction in general.

Key Terms in this Chapter

Triangulation: The use of a variety of methodological instruments to gain separate perspectives on an issue.

Inter-Rater Reliability: The degree to which different readers assess the same document with the same score.
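
Inter-rater reliability is commonly quantified with a chance-corrected agreement statistic such as Cohen's kappa. The chapter itself does not prescribe a particular statistic; the following is a minimal illustrative sketch in Python, using invented scores for two raters on a 1-4 holistic rubric:

```python
# Cohen's kappa for two raters scoring the same set of documents.
# Illustrative only: the scores below are invented, not from the chapter.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance agreement."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of documents given the same score.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if each rater scored independently at random,
    # following their own observed score distribution.
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    expected = sum(counts_a[s] * counts_b[s] for s in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two raters holistically scoring ten portfolios on a 1-4 rubric.
a = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
b = [3, 2, 3, 3, 1, 2, 3, 4, 2, 2]
print(round(cohens_kappa(a, b), 2))  # prints 0.71
```

A kappa of 1.0 indicates perfect agreement, while 0.0 indicates agreement no better than chance; norming sessions (defined below) aim to push this value upward before live scoring begins.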

Norming: A process for building inter-rater reliability in which readers score documents that have been pre-scored in order to develop a shared understanding of the scoring criteria and how those criteria are represented in the documents.

Rubric: A guide outlining the scoring criteria for a specific set of documents.

E-Portfolios: A collection of student work compiled in an electronic environment that documents students’ progress toward meeting specific standards.

Holistic Scoring: A scoring system where readers score documents based upon their impressions of the whole document rather than upon its separate elements.

Portfolio: A collection of student work that documents students’ progress toward meeting specific standards.

Complete Chapter List

Edited by Christopher S. Schreiner

Chapter 1. Multi-Tier Design Assessment in the Development of Complex Organizational Systems (Melissa A. Dyehouse, John Y. Baek, Richard A. Lesh)

Chapter 2. A Critical Thinking Rubric as the Basis of Assessment and Curriculum (Hedva Lewittes)

Chapter 3. A Survey of Effective Technologies to Assess Student Learning (Suzanne Pieper, Erika Edwards, Brandon Haist, Walter Nolan)

Chapter 4. Assessing Creativity Using the Consensual Assessment Technique (John Baer, Sharon S. McKool)

Chapter 5. Creativity Assessment in Higher Education (Christine Charyton, Zorana Ivcevic, Jonathan A. Plucker, James C. Kaufman)

Chapter 6. The Technology of Writing Assessment and Racial Validity (Asao B. Inoue)

Chapter 7. Qualitative and Quantitative Methods as Complementary Assessment Tools (Sheila S. Thompson, Annemarie Vaccaro)

Chapter 8. Effects of Assessment Results on a Writing and Thinking Rubric (Teresa Flateby)

Chapter 9. Assessing Outcomes in a Technical Communication Capstone (Barbara D’Angelo, Barry Maid)

Chapter 10. Assessing the Composition Program on Our Own Terms (Sonya Borton, Alanna Frost, Kate Warrington)

Chapter 11. A Case Study of Instructional Delivery Formats (Joan Aitken)

Chapter 12. Inverting the Remedial Mathematics Classroom with Alternative Assessment (Victor W. Brunsden)

Chapter 13. A Case Study of Authentic Assessment (David A. Eubanks)

Chapter 14. Outcomes Assessment in Japanese Language Instruction (P. Tokyo Kang, David Gugin)

Chapter 15. Assessing the Effectiveness of a Basic Writing Course (Barika Barboza, Frances Singh)

Chapter 16. Peer Assessment for Development of Preservice Teachers (Lorraine Gilpin, Yasar Bodur, Kathleen Crawford)

Chapter 17. Workshops and E-Portfolios as Transformational Assessment (Charlotte Brammer, Rhonda Parker)

Chapter 18. A Neglected Necessity in Liberal Arts Assessment: The Student as the Unit of Analysis (Daniel F. Chambliss)

Chapter 19. Redefining Writing Reality: Multi-Modal Writing and Assessment (Deirdre Pettipiece, Timothy Ray, Justin Everett)

Chapter 20. Engaging Faculty as a Strategic Choice in Assessment (Sean A. McKitrick)

Chapter 21. Developing a Receptive and Faculty-Focused Environment for Assessment (Steven M. Culver, Ray VanDyke)

Chapter 22. New Collaborations for Writing Program Assessment (John Wittman)

Chapter 23. Reporting Race and Ethnicity in International Assessment (Mya Poe)

Chapter 24. Method Development for Assessing a Diversity Goal (Joan Hawthorne, Tatyana Dumova, April Bradley, Daphne Pederson)