Framework for Evaluating Written Explanations of Numerical Reasoning

Copyright © 2023 | Pages: 28
DOI: 10.4018/978-1-6684-8262-9.ch005

Abstract

Reasoning in mathematics requires communication and is most often measured through one-to-one think-alouds. Reasoning can also be measured through disciplinary writing in mathematics, which uses a variety of representations to convey meaning, including mathematical words, abstract symbols, and visuals. This study measured numerical reasoning as conveyed in the written explanations of 417 fourth-grade students. Whole-group administration procedures were followed to collect data on reasoning about fraction magnitude. Explanations were scored using a validated framework for evaluating numerical reasoning. Results include descriptive statistics and qualitative analyses of examples across the framework's five categories. Written explanations coded as “conceptual” reflected the richest examples of disciplinary writing in mathematics. Conclusions address the usability of this framework for measuring numerical reasoning at the elementary level. Implications include the importance of using formative assessment tools in the classroom.
Chapter Preview

Background

Prompting students to share their reasoning provides an opportunity to see into the why of mathematical knowledge (i.e., conceptual understanding), and not just the how of mathematical knowledge (i.e., procedural understanding). Applying algorithms that produce correct answers does not, by itself, demonstrate understanding. Relatedly, teachers may not realize that identical answers can reflect different levels of understanding (Geller et al., 2017), and these levels of understanding require different instructional approaches. Thus, student explanations contribute to the design and delivery of effective mathematics instruction (Burns, 2011).

Mathematical reasoning has been defined as the ability to communicate understanding through justification (English, 2004; NCTM, 2009; Stein et al., 1996). Or, as described by Boesen et al. (2010), reasoning entails one’s ability to make and test assertions. Engagement in the act of reasoning is critical for developing students’ competence in mathematics (Battista, 2017) and is often captured through think-alouds. Think-aloud protocols have been studied extensively by Schoenfeld and his colleagues (1992), and student explanations of their mathematical thinking have long been valued by mathematics educators (NCTM, 2000). Indeed, two of the five NCTM mathematical Process Standards are (a) Communication and (b) Reasoning and Proof, both of which include a strong focus on student explanations. Unfortunately, think-aloud procedures, although effective for measuring students’ mathematical reasoning, require an inordinate amount of time to administer. As an example, Liu and Xin (2016) categorized three types of mathematical reasoning, but their method relies on lengthy student explanations subjected to extensive conversational analysis. A more succinct measure is Listening to Learn (Burns & Zolli, 2023). During this assessment, students are prompted to answer multiple questions across various mathematical domains and then explain the thought processes behind their answers. The interviewer records both the accuracy of answers and the strategies employed to justify those answers. A positive feature of this inventory is that student responses are coded in real time. A drawback, however, is the inventory’s use of multiple coding options across different items (e.g., 36 codes exist for the Number and Operations subtest) and its one-to-one administration protocol.

Understanding student reasoning is valued by researchers and educators alike, but its measurement has been hindered by a historical reliance on think-aloud methodologies that require one-to-one administration, time-intensive procedures, and tedious coding. Hence, the empirically validated framework presented in this chapter was used to provide a simple method for coding student reasoning as conveyed in their written explanations of fraction magnitude problems.

Key Terms in this Chapter

Disciplinary Writing in Mathematics: Writing that is precise and often involves mathematical language as well as symbols and visual representations.

Algorithmic: A procedure or method applied to solve a mathematics problem without additional justification or reasoning.

Conceptual: Fully developed understanding that reflects reasoning founded on the intrinsic mathematical properties of the components of the task, with or without describing a specific procedure.

Framework: A scoring rubric consisting of multiple discrete (non-continuous) categories.

Numerical Reasoning: Communicating understanding of the meaning behind numbers and operations.

Disciplinary Literacy: Reading, writing, listening, and speaking specific to unique disciplines of study, such as writing in mathematics.

Fraction Sense: A fully developed conceptual understanding of fractions as numbers.
