Teaching math for understanding


American Educational Research Journal http://aerj.aera.net

Teaching Mathematics for Understanding: An Analysis of Lessons Submitted by Teachers Seeking NBPTS Certification
Edward A. Silver, Vilma M. Mesa, Katherine A. Morris, Jon R. Star, and Babette M. Benken
American Educational Research Journal, 2009, 46, 501. Originally published online January 5, 2009. DOI: 10.3102/0002831208326559
The online version of this article can be found at: http://aer.sagepub.com/cgi/content/abstract/46/2/501

Published on behalf of the American Educational Research Association (http://www.aera.net) by SAGE Publications (http://www.sagepublications.com).

Downloaded from http://aerj.aera.net by Armando Loera on October 31, 2009


American Educational Research Journal, June 2009, Vol. 46, No. 2, pp. 501-531. DOI: 10.3102/0002831208326559. © 2009 AERA. http://aerj.aera.net

Teaching Mathematics for Understanding: An Analysis of Lessons Submitted by Teachers Seeking NBPTS Certification

Edward A. Silver, University of Michigan
Vilma M. Mesa, University of Michigan
Katherine A. Morris, Sonoma State University
Jon R. Star, Harvard University
Babette M. Benken, California State University, Long Beach

The authors present an analysis of portfolio entries submitted by candidates seeking certification by the National Board for Professional Teaching Standards in the area of Early Adolescence/Mathematics. Analyses of mathematical features revealed that the tasks used in instruction included a range of mathematics topics but were not consistently intellectually challenging. Analyses of key pedagogical features of the lesson materials showed that tasks involved hands-on activities or real-world contexts and technology but rarely required students to provide explanations or demonstrate mathematical reasoning. The findings suggest that, even in lessons that teachers selected for display as best practice examples of teaching for understanding, innovative pedagogical approaches were not systematically used in ways that supported students’ engagement with cognitively demanding mathematical tasks.

KEYWORDS: mathematics teaching, mathematical understanding, middle school teachers, NBPTS, teacher assessment

In recent years there has been unprecedented public demand for and professional interest in improving the quality of mathematics teaching and learning in American schools, with much attention to annual student test results and data regarding teacher qualifications as key sources of guidance for decisions regarding education policy and practice. As informative as they may be, however, data regarding student achievement and teacher quality are unlikely to provide adequate guidance for the design of interventions


intended to improve mediating processes, such as classroom instruction, that have a major influence on the outcomes (Cohen, Raudenbush, & Ball, 2003). Improving the quality of mathematics teaching and learning hinges also on access to sound evidence regarding what happens in classrooms, that is, what teachers actually do with students to promote the development of students’ mathematical proficiency.

What we know about mathematics teaching in the United States is largely derived from two kinds of data and associated research analyses. One set of studies over several decades has involved direct observation of classroom teaching (e.g., Hiebert et al., 2005; Stake & Easley, 1978; Stigler, Gonzalez, Kawanaka, Knoll, & Serrano, 1999; Stodolsky, 1988), and another has used teacher self-report data from surveys (e.g., Grouws, Smith, & Sztajn, 2004; Weiss, Banilower, McMahon, & Smith, 2001). Looking across these studies of teachers’ instructional practices in mathematics classrooms, whether in the past or more recently, a remarkably consistent characterization of mathematics teaching in the United States emerges. With amazingly little variation across teachers and over time, research has found that mathematics instruction and instructional tasks tend to emphasize low-level rather than high-level cognitive processes (i.e., memorizing and recalling facts and procedures rather than reasoning about and connecting ideas or solving complex problems), require students to work alone and in silence (with little opportunity for discussion and collaboration), focus attention on a

EDWARD A. SILVER is William A. Brownell Collegiate Professor of Education in the School of Education, University of Michigan, 610 East University, Ann Arbor, MI 48109-1259; e-mail: easilver@umich.edu. His scholarly interests include the study of mathematics teaching and learning, assessment of student proficiency, and teacher professional development.

VILMA M. MESA is assistant professor of mathematics education in the School of Education, University of Michigan, 610 East University, Ann Arbor, MI 48109-1259; e-mail: vmesa@umich.edu. She investigates the development of teaching expertise in collegiate mathematics.

KATHERINE A. MORRIS is associate professor of elementary mathematics education in the School of Education, Sonoma State University, 1801 East Cotati Avenue, Rohnert Park, CA 94928-3609; e-mail: Kathy.morris@sonoma.edu. Her current research is on mathematical discourse in the classroom and teacher education settings as well as teaching and learning using mathematical class routines.

JON R. STAR is assistant professor in the Harvard Graduate School of Education, Harvard University, Longfellow Hall, 13 Appian Way, Cambridge, MA 02138; e-mail: jon_star@gse.harvard.edu. He investigates the learning and teaching of mathematics.

BABETTE M. BENKEN is assistant professor in the College of Natural Sciences and Mathematics, California State University, 1250 Bellflower Boulevard, Long Beach, CA 90840-4501; e-mail: bbenken@csulb.edu. Her research explores models of mathematics teacher education and how to best facilitate teacher learning, and investigates the role that teachers’ knowledge, beliefs, and context play in shaping practice.



narrow band of mathematics content (i.e., arithmetic in the elementary and middle grades), and do little to help students develop a deep understanding of mathematical ideas (rarely asking for explanations, using physical models, or calling for connections to real-world situations). Although a direct link has not been established between the generally low levels of student achievement and this portrait of canonical school mathematics teaching practices in the American schools, it is difficult to deny the plausibility of a connection. Thus, it is not surprising that there has been in the past few decades an array of reform initiatives aimed at changing what and how mathematics is taught and learned in American schools. Although reformers have disagreed on many issues, there is a widely shared concern for enhancing opportunities for students to learn mathematics with understanding and thus a strong interest in promoting teaching mathematics for understanding.

Research Perspectives on Teaching Mathematics for Understanding

Over at least the past 60 years, a solid body of research evidence has amassed that points to the benefits of teaching for understanding (sometimes called by various names, including authentic instruction, ambitious instruction, higher order instruction, problem-solving instruction, and sense-making instruction) in mathematics (e.g., Brownell & Moser, 1949; Brownell & Sims, 1946; Carpenter, Fennema, & Franke, 1996; Carpenter, Fennema, Peterson, Chiang, & Loef, 1989; Cohen, McLaughlin, & Talbert, 1993; Fuson & Briars, 1990; Hiebert & Wearne, 1993; Hiebert et al., 1996; Newmann & Associates, 1996). Although there are many unanswered questions about precisely how teaching practices are linked to students’ learning with understanding (see Hiebert & Grouws, 2007), there has been increasing emphasis in the mathematics education community on teaching practices that deviate from the canonical version of classroom mathematics instruction noted above and that appear to be more oriented toward the development of students’ conceptual understanding.

Among the hallmarks of this conceptually oriented version of instruction are (a) mathematical features, or tasks that are drawn from a broad array of school mathematics content domains and are cognitively demanding, and (b) pedagogical features, or teaching practices that are suitable to support multiperson collaboration and mathematical discourse among students, as well as their engagement with mathematical reasoning and explanation, consideration of real-world applications, and use of technology or physical models (e.g., Fennema & Romberg, 1999; Hiebert & Carpenter, 1992).

Mathematical Features

The mathematics curriculum in the United States, especially in the elementary and middle grades, has long been characterized as incoherent, cursory, and repetitive (e.g., Balfanz, Mac Iver, & Byrnes, 2006). In particular,



many have argued that excessive attention to number and operations (and corresponding inattention to other content topics) has restricted students’ opportunities to learn other interesting and important mathematics content. Reflecting this concern about teachers’ preoccupation with number, the National Council of Teachers of Mathematics (NCTM) Standards (1989, 2000) noted the importance of including topics in algebra, geometry, measurement, and data analysis in the middle grades. Those who advocate treating a broad range of topics suggest that this breadth would not only enrich students’ exposure to more mathematical topics but also make salient the connections that exist among different content domains in mathematics—connections that are viewed by psychologists as hallmarks of student understanding (Bransford, Brown, & Cocking, 1999).

Beyond the issue of content topics in the curriculum, efforts to improve mathematics instruction have attended to the central role that mathematics instructional tasks play in daily lessons in providing opportunities for students to learn mathematics. For example, the Professional Standards for Teaching Mathematics (NCTM, 1991) claimed that student learning of worthwhile mathematics depended to a great extent on the teacher using “mathematical tasks that engage students’ interests and intellect” (p. 1). Such tasks, when implemented well in the classroom, can help develop students’ understanding, maintain their curiosity, and invite them to communicate with others about mathematical ideas. As noted above, however, research on instructional practice in mathematics classrooms has found that daily mathematics instruction in elementary and middle grades usually involves teachers and students engaging in cognitively undemanding activities, such as recalling facts and applying well-rehearsed procedures to answer simple questions (Porter, 1989; Stake & Easley, 1978; Stigler & Hiebert, 1999; Stodolsky, 1988).
Although research has shown that it is not easy for teachers to use cognitively demanding tasks well in mathematics classrooms (Henningsen & Stein, 1997; Stein, Grover, & Henningsen, 1996), the regular use of cognitively demanding tasks in ways that maintain high levels of cognitive demand can lead to increased student understanding and the development of problem solving and reasoning (Stein & Lane, 1996) and greater overall student achievement (Hiebert et al., 2005).

Pedagogical Features

In addition to attention to content topics and the cognitive complexity of tasks, reformers have also advocated a broader array of pedagogical strategies that might be used to increase students’ understanding of mathematics. As noted earlier, research studies have characterized mathematics classrooms for students in the upper elementary school and middle grades as places where students often work alone and in silence, with little or no access to classmates or to suitable computational or visualization tools, on low-level tasks that make little or no connection to the world outside of school, in



order to produce answers quickly and efficiently without much attention to explanation, justification, or the development of meaning (e.g., Stigler & Hiebert, 1999; Stodolsky, 1988). Such pedagogy is at odds with current conceptualizations of how people learn best when the goal is developing understanding (Bransford et al., 1999).

Advocates for conceptually oriented teaching in school mathematics (e.g., NCTM, 1989, 2000) have suggested the potential value of fostering communication and interaction among students in mathematics classrooms through the use of complex tasks that are suitable for cooperative group work and that provide settings in which students need to explain and justify their solutions. Moreover, to increase students’ engagement with mathematical tasks and their understanding of concepts, instructional reform efforts have also encouraged the use of hands-on learning activities and technological tools, as well as connecting work done in the mathematics classroom to other subjects and to the world outside school. Beyond exhortations, there is also some research evidence to support these hypotheses about pedagogy that might support students’ development of mathematical understanding (e.g., Boaler, 1998; Fawcett, 1938; Fuson & Briars, 1990; Good, Grouws, & Ebmeier, 1983; Hiebert & Wearne, 1993; Stein & Lane, 1996).

Focus of This Study

In this study, we examined samples of instructional practice—including both lesson artifacts and teachers’ commentaries on lessons—that were intended to be displays of teaching oriented toward the development of students’ mathematical understanding. The instructional samples were systematically probed for evidence of the mathematical and pedagogical features noted above as associated with conceptually oriented mathematics instruction. Through our analysis of the extent to which these mathematical and pedagogical features were present in the lessons, we characterize what mathematical learning opportunities were provided to students and how these opportunities were provided by the teachers.

The teaching samples were available in the form of portfolio submissions by applicants seeking certification of highly accomplished teaching by the National Board for Professional Teaching Standards (NBPTS). These records constituted a strategic site for an investigation into teaching for understanding because such teaching has been rarely observed in other studies of mathematics instruction in American classrooms, because the NBPTS portfolio process required teachers to submit two entries based on classroom lessons oriented toward mathematical understanding, and because teachers selected the entries to display their best practice. Thus, these instructional samples afford a rare opportunity to examine what American teachers actually do when they attempt to teach mathematics for understanding. More details about our research methods, including the context, sample, data, and data analysis methods, are presented next.




Method

Study Context: NBPTS Certification in 1998–1999

The NBPTS was established in 1987 as a nonprofit, nonpartisan organization to promote the recognition of highly accomplished teaching practice. A voluntary national system was established under the auspices of NBPTS to certify accomplished practice in a number of fields. Except for generalist certifications, each field is defined by content area (e.g., Mathematics) and student development level (e.g., Early Adolescence, ages 11–15). The NBPTS certification system began with the specification of standards for professional practice and then the creation of a process to assess the extent to which an applicant meets the standards. Technical analyses of the reliability, and some aspects of the validity, of the assessment have been conducted (e.g., Bond, Smith, Baker, & Hattie, 2000), and there have been a number of studies investigating the relationship between NBPTS certification and measures of teaching practice and teacher effectiveness, especially in regard to student achievement (Hakel, Koenig, & Elliott, 2008).

Teachers receive NBPTS recognition by demonstrating knowledge and professional practice of many kinds via a complex assessment system. The details of the assessment process have varied across certification areas and over time, so we describe here the process as it existed in 1998–1999, which was the year in which the data analyzed in this article were submitted, and for the area of Early Adolescence/Mathematics (EA/M), which was the certification domain studied herein. The EA/M assessment consisted of two parts: in one, teachers completed an on-demand, test center–administered set of exercises to evaluate certain aspects of their content and pedagogical content knowledge; in the other, candidates submitted a portfolio that included contextualized samples of their teaching practices and reflections on their work. The portfolio component of the EA/M assessment consisted of six entries.
Two portfolio entries pertained to teachers’ participation in professional activity and their outreach to families and local communities. The other four were classroom-based entries. Two portfolio entries (Engaging a Whole Class in Mathematical Discussions and Engaging Small Groups in Mathematical Interactions) relied on videotapes and accompanying reflective commentary, and two portfolio entries (Developing Mathematical Understanding and Assessing Mathematical Understanding) captured teaching practice via classroom artifacts, samples of student work, and teachers’ reflective narratives. The instructions for submission required that the four classroom-based portfolio entries be drawn from different time points and, whenever possible, different class groups. Thus, the video entries depicted different lessons than those in the artifact-based entries—lessons not specifically aimed at mathematical understanding—and they were not analyzed in this study.

The portfolio entries that teachers submit when applying for NBPTS certification offer a glimpse at instructional practice that teachers consider worthy of evaluation in a process designed to identify highly accomplished teaching.

Downloaded from http://aerj.aera.net by Armando Loera on October 31, 2009


Table 1
Demographic Characteristics and National Board for Professional Teaching Standards Certification Status of All Early Adolescence/Mathematics Candidates (1998–1999) and of Candidates in (or not in) the Random Sample for the Study (entries are percentages)

                 All Candidates   In Random Sample   Not in Sample
                 (N = 250)        (n = 32)           (n = 218)
Gender
  Male           17.6             12.5               18.3
  Female         82.4             87.5               81.7
Ethnicity
  Asian           2.0              3.1                1.8
  Black          10.0             12.5                9.6
  Hispanic        1.2              0                  1.4
  Indian          1.2              3.1                0.9
  White          85.2             81.3               85.8
Urbanicity
  Rural          35.6             25.0               37.2
  Suburban       30.0             34.4               29.4
  Urban          34.0             40.6               33.0
Certification
  Pass           41.6             40.6               41.7
  Fail           58.4             59.4               58.3
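Because the random sample and its complement partition the applicant pool, each All Candidates percentage in Table 1 should equal the n-weighted average of the two subsample percentages. A minimal sketch of that consistency check (values transcribed from the table; rounding to one decimal mirrors the table's reporting):

```python
def pooled_percent(p_sample, p_rest, n_sample=32, n_rest=218):
    """n-weighted average of two subgroup percentages, rounded to one decimal."""
    total = n_sample + n_rest
    return round((n_sample * p_sample + n_rest * p_rest) / total, 1)

# Certification pass rate: 40.6% of 32 and 41.7% of 218 pool to 41.6%
print(pooled_percent(40.6, 41.7))  # 41.6
# Gender, male: 12.5% of 32 and 18.3% of 218 pool to 17.6%
print(pooled_percent(12.5, 18.3))  # 17.6
```

The same identity holds, to rounding, for every row of the table.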

The entries examined in this study illuminated instructional practices associated with teaching for understanding and assessing understanding.

Sample

When they seek NBPTS certification, applicants agree that their submissions can be examined for research purposes. With the cooperation of the National Board, we obtained test center and portfolio exercise score data for all candidates (N = 250) who applied for NBPTS EA/M certification in 1998–1999. From this set of 250 applicants, a random sample of 32 candidates (nearly 13% of the total) was selected. Our random sample was demographically similar to the entire population of EA/M applicants in 1998–1999 (see Table 1) and contained a comparable ratio of successful applicants and unsuccessful applicants to that of the full applicant pool; our sample included 13 individuals who obtained NBPTS certification and 19 who did not. We decided to include in our sample both successful and unsuccessful applicants for NBPTS certification because we felt that the composite sample would represent the instructional practice of a broader range of teachers of mathematics in the middle grades. Moreover, because the awarding of NBPTS certification in 1998–1999 was based on 10 performance indicators, each with an independent contribution to an applicant’s overall score, there was little reason to think that those applicants with the highest overall performance would necessarily be highest also with respect to these two specific



portfolio entries. We did contrast the portfolio entries submitted by applicants in our sample who were awarded NBPTS certification with those not awarded certification, but we do not report those analyses in this article.

Data

For each of the 32 individuals in our sample, we obtained copies of the two artifact-based portfolio entries—Developing Mathematical Understanding (DU) and Assessing Mathematical Understanding (AU). These artifact-based entries contained extensive textual portrayals of instructional practice related to developing and assessing student understanding of mathematical ideas, along with supporting artifacts (e.g., students’ work, tests, photographs). In this study, we examined more than 1,000 pages of material submitted by our sample of 32 applicants.

The two portfolio entries examined in this study shared a number of common characteristics. For both entries, candidates were instructed to provide examples of their instructional practices and to present clear, consistent, and convincing evidence that they were able to use these lessons and activities in their classrooms to build students’ conceptual understanding of mathematics. Candidates were asked to follow a similar format in documenting examples in both portfolio entries. In particular, they were instructed to provide all of the following information: a written description of the instructional context (e.g., grade, subject, class characteristics); a written description of teacher planning (e.g., substantive math idea, goals for instructional sequence, challenges inherent in teaching these activities); analysis of student responses (actual student work samples for these specific students were appended to the entry); and the candidate’s reflections on the outcomes of each lesson. Finally, for both entries, candidates were instructed to select activities in which students were engaged in thinking and reasoning mathematically (e.g., interactive demonstrations, long-term projects, journal assignments, problem solving); they were instructed not to select activities that focused on rote learning (e.g., students memorizing procedures).

There was one important difference between the two entries that is germane to the analysis reported here. The DU entry required two instructional activities, both focused on the same mathematical idea, which could come from consecutive lessons or from nonconsecutive lessons. In contrast, the AU entry required only one activity, and its focal mathematical idea was required to be different from the idea that was the focus of the DU entry.

The data available in the NBPTS portfolio submissions are in many ways quite similar to the data being used by several other researchers to study classroom practice using alternatives to direct observation and survey methods, such as “scoop” sampling of instructional artifacts (e.g., lesson plans, student work) to characterize instructional activity (Borko, Stecher, & Kuffner, 2007) and using classroom assignments to judge instructional quality (Clare & Aschbacher, 2001; Matsumura, Garnier, Pascal, & Valdés, 2002). The NBPTS portfolio submissions include lesson artifacts and samples of student work, as



well as teacher narratives to provide context and explanation for the lesson artifacts. Although the NBPTS data might appear limited in a study of instructional practice because the records do not include direct observation of actual teaching, the findings of other researchers regarding the validity of inferences about actual classroom practice drawn from organized collections of classroom artifacts (see, e.g., Borko, Stecher, Alonzo, Moncure, & McClam, 2005; Clare & Aschbacher, 2001) support the use of data like those in the NBPTS portfolio entries to examine classroom teaching practice.

Data Analysis

Our examination of the NBPTS data consisted of quantitative and qualitative analyses of the two portfolio entries submitted by our random sample of 32 applicants as samples of instructional activity related to teaching and assessing understanding. We probed the entries for evidence of selected mathematical and pedagogical features.

Mathematical Features of NBPTS Portfolio Submissions

Our analysis of mathematical features attended to two main issues: the frequency and distribution of mathematical content topics treated in the activities and the extent to which the activities were judged to be cognitively demanding for students. In this section, we provide details about how this analysis was conducted.

Mathematical topics. We used the five mathematics strands of the National Assessment of Educational Progress (1988) to classify each activity: Number and Operations; Algebra and Functions; Measurement; Geometry; and Data Analysis, Statistics, and Probability. Two raters independently assigned each activity (two in each DU entry and one in each AU entry for each applicant) to one or more topic categories. There was near unanimity in this coding, with disagreements arising only in a few cases of tasks with multiple topic designations; all disagreements were resolved via consensus.

Mathematical cognitive demands.
To determine the cognitive demand requirements of the mathematical tasks in the portfolio entries, we developed coding criteria for high-demand and low-demand activities. To develop our coding criteria, we examined frameworks that have been used to distinguish different levels of demands in mathematical tasks. These were drawn from the assessment literature (e.g., Beaton et al., 1996; Silver & Kenney, 2000), from cognitive science (Anderson et al., 2001; Schoenfeld, 1985, 1992), and from analyses of instruction (Stein, Smith, Henningsen, & Silver, 2000) or student proficiency (Kilpatrick, Swafford, & Findell, 2001). Although these frameworks differ in many ways, they are quite consistent in distinguishing the extremes of cognitive demand. Low-demand tasks exclusively involve recalling, remembering, implementing, or applying facts and procedures,



Table 2
Criteria for Coding Activities as High Demand and Low Demand

High Cognitive Demand
- Tasks require students to explain, describe, justify, compare, or assess.
- Tasks require students to make decisions and choices, to plan, or to formulate questions or problems.
- Tasks require students to be creative in some way (e.g., to apply a known procedure in a novel way).
- Tasks require students to work with more than one form of representation in a meaningful way (e.g., to translate from one representation to another, interpreting meaning across two or more representations).

Low Cognitive Demand
- Tasks require students to make exclusively routine applications of known procedures.
- Tasks that are potentially demanding are made routine because of a highly guided or constrained task structure (e.g., a complex task is subdivided into nondemanding subtasks; a potentially challenging task is made routine because a particular solution method is imposed by the teacher).
- Task complexity or demand is targeted at nonchallenging or nonmathematical issues (e.g., explaining, assessing, and describing work is targeted at procedures rather than at justification; required explanations are about nonmathematical aspects of a plan or solution).

which is in contrast to high-demand tasks, which require students to analyze, create, or evaluate facts, procedures, and concepts or to engage in metacognitive activity.

Two raters independently examined the 32 AU activities and classified each as high demand or low demand. To make this determination, we examined the descriptions of the activities provided in the teacher planning section and the associated student work samples provided in the analysis of students’ work section. If the purpose of an activity was not clear, we also read the instructional context section of the portfolio and the candidate’s reflection. Agreement on the rating was achieved for about 70% of the AU activities. For each disagreement, the two raters met with a third member of the research team to review the basis for the different ratings. After each discrepancy was discussed, a consensus rating was derived. Based on this experience, the two raters developed a list of specific criteria and characteristics that were useful to distinguish the AU activities designated as high demand from those designated as low demand. This scheme was discussed and refined as needed to reach consensus. Each rater then used this list to independently classify each of the 64 DU activities as a high-demand or low-demand task. Each rater also provided a rationale for each classification. The two raters were able to use the preliminary list of criteria and characteristics for the DU entries, and they achieved agreement on 80% of the activities. As with the AU ratings, all instances of disagreement were discussed, and a consensus rating was derived.
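The agreement figures reported above (about 70% for the AU activities, 80% for the DU activities) are simple percent agreement. A short sketch, using hypothetical rater codes rather than data from the study:

```python
def percent_agreement(rater_a, rater_b):
    """Share of activities to which two raters assigned the same demand code."""
    if len(rater_a) != len(rater_b):
        raise ValueError("raters must code the same set of activities")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Hypothetical codes for five activities; the raters disagree on one
a = ["high", "low", "high", "low", "high"]
b = ["high", "low", "low",  "low", "high"]
print(percent_agreement(a, b))  # 0.8
```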



The framework used to code the cognitive demand of instructional activities is provided in Table 2. The application of these criteria required nuanced judgment on the part of the raters, and the classifications were sometimes difficult to make. Activities typically contained many parts, and it was difficult to decide how to weigh the more demanding and less demanding aspects of an activity to derive an overall classification. The final decision was that raters would apply the criteria liberally; that is, if a portion of an activity exhibited high-demand characteristics, it would be classified as a high-demand task, even if some other parts of the activity did not exhibit high-demand characteristics. Raters also considered the typical grade-level placement of topics treated in activities. For example, a task requiring application of basic knowledge about place value was less likely to be classified as high demand in Grade 6 than was a task involving concepts encountered in trigonometry. Finally, in classifying tasks with respect to cognitive demand, the raters ignored features of an activity that might increase performance complexity but that were related to nonmathematical or extracurricular features (such as collecting data or writing a letter about a mathematical task). Thus, tasks requiring complex performances but involving only routine mathematical knowledge and skills tended to be classified as low-demand activities.

To illustrate how these criteria were applied to the AU and DU tasks, two examples of activities classified as high demand and two examples classified as low demand are provided in Table 3.
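The liberal aggregation rule described above can be sketched as a tiny decision function; the part-level codes here are hypothetical inputs, not data from the portfolios:

```python
def classify_activity(part_codes):
    """An activity counts as high demand if ANY of its parts exhibits
    high-demand characteristics (the raters' liberal rule)."""
    return "high" if "high" in part_codes else "low"

print(classify_activity(["low", "high", "low"]))  # high
print(classify_activity(["low", "low"]))          # low
```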
Our interpretation of the research usage agreements governing the NBPTS materials does not allow us to provide verbatim reproductions, but the narrative summaries convey the essential aspects of the tasks that bear on decisions regarding cognitive demand, and Table 3 also contains a brief rationale for each classification.

Pedagogical Features of NBPTS Portfolio Submissions

Our analysis of pedagogical features attended to aspects of the organization and enactment of mathematics instruction (teaching and assessment). We focused on five pedagogical features identified above as having potential to generate opportunities for students to develop or display mathematical understanding: tasks that involved multiperson collaboration and mathematical discourse among students, required mathematical reasoning and explanation, considered applications in contexts other than mathematics itself, employed technology, or used physical (hands-on) materials. Given that the examined materials did not include explicit records of enactment, careful judgment was required to determine whether an activity displayed one of these pedagogical features. We considered an activity to have evidence of feature use when the teacher’s description of the instructional context explicitly indicated its use in connection with the submitted activity. Because a teacher’s explanation of instructional context was not generally activity specific for the two activities in the DU entry, we decided to treat the

Downloaded from http://aerj.aera.net by Armando Loera on October 31, 2009


Table 3
Examples of Activities Coded as High Demand and Low Demand

High Cognitive Demand

Activity: Miniature golf course. Students have to design a miniature golf course using at least four solids; they have to produce nets for each shape—showing dimensions—and an isometric drawing of the station. Students have to pass teacher- and peer-inspections that look for description of the station, nets, isometric drawing, and overall appearance of the course. Comments are expected to be addressed after the inspection. (Developing Mathematical Understanding)
Coding rationale: Students have the liberty to choose the solids and have to come up with a sensible course; they have many constraints to consider, and the net production involves considering reasonable measures for each of the shapes considered. There are also many extracurricular activities involved that make the task even more complex.

Activity: Assessment based on textbook companion materials. There are three questions: Question 1 has 7 items, asking about conditions under which systems of equations have one, none, or multiple solutions (tell how you know that a system of two equations has no solutions). Students have to provide examples; for the case of one solution, students have to solve the system in at least two different ways. One item asks the students to write a word problem that can be solved using a system of two equations. Question 2 has three items to be solved using a graphing calculator. Question 3 has three items, all related to a diagram of a shaded region between two lines in the same plane. Students write a system of inequalities that corresponds to the diagram and give a point that is a solution and a point that is not a solution. (Assessing Mathematical Understanding)
Coding rationale: This is not a straightforward activity. The questions are interesting in that they are "flipped": they are not asking for a solution but for the conditions to get one or another solution. The demands are higher than when the standard problem/solution is asked for. Students have to create problems that will satisfy a given solution.

Low Cognitive Demand

Activity: Find the sale price. Worksheet illustrating how to calculate the price of an item on sale. (Developing Mathematical Understanding)
Coding rationale: Students have to repeat step-by-step procedures modeled in the example provided.

Activity: Two-part assessment activity: "geometry walk" and "who am I." In the "geometry walk," students are given a list of 12 shapes and have to sketch a real-world object that has each shape; then they pick three objects and explain why each example has that shape. In the "who am I" part, students are given 14 statements (e.g., "My angle degree is 63°, who am I?"). (Assessing Mathematical Understanding)
Coding rationale: The assessment confuses two-dimensional shapes and three-dimensional objects. Although the task is nonstandard, the performance demanded from students is largely based on recalling memorized information; scoring is tilted toward reproduction rather than creativity.


entire DU entry, rather than each activity, as the unit of analysis for the coding of pedagogical features. Thus, 64 items (rather than 96) were coded in this analysis—32 AU entries and 32 DU entries.

The first phase of rating consisted of three raters independently examining 8 of the 32 activities in the AU entries. For each entry, the raters judged whether each pedagogical feature was used and then generated a written rationale—typically, verbatim excerpts from the portfolio entries—as evidence that the selected features were represented as coded. Raters were nearly unanimous in their classifications, although they sometimes chose different aspects of the portfolio entry to support a common classification. All disagreements were resolved through discussion to reach consensus. Given the high level of agreement, the remaining AU entries and all DU entries were each coded by only one rater, but a reliability check was conducted for 8 randomly selected entries (2 in AU, and 6 in DU). For these cases, the second rater agreed with the first rater's classifications in all instances. Table 4 displays the judgment criteria used in the coding and, for each feature, an example of an excerpt from a portfolio entry that was taken as evidence of its use.

Relating Mathematical and Pedagogical Features of the Portfolio Entries

Recognizing that innovative pedagogy need not be used in ways that support the development of students' understanding, we sought evidence in the portfolio entries of connections between the two. Because the portfolio entry requirements were developed to support a different purpose (evaluation of teaching samples to ascertain alignment with NBPTS criteria), they did not direct teachers to make explicit their thinking about pedagogical choices in relation to students' understanding. Some teachers did address this issue in their narratives, but it was often quite difficult to interpret their descriptive prose on this point.
Although it was not possible to probe fully the association between pedagogy and understanding, it was possible to examine teachers’ use of the pedagogical features in relation to cognitive demand. Thus, we undertook an analysis of the extent to which teachers in our sample used the pedagogical strategies to support students’ engagement with high-demand mathematics tasks. In particular, we were interested in which pedagogical features, if any, were strongly associated with teachers’ use of high-demand tasks. For the 32 AU and 32 DU portfolio entries, we created 2 × 2 contingency tables, crossing cognitive demand (high or low) with pedagogical feature (present or absent). For each pedagogical feature, each contingency table displayed the number of teachers in our sample who submitted entries that were coded with the corresponding pair of characteristics. For the DU entries, we collapsed the cognitive demand coding for the two submitted activities, and we considered an entry to be high demand if it contained at least one activity coded as high demand. We analyzed the data in these tables using chi-square tests.
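The 2 × 2 chi-square analysis described above can be sketched in a few lines. The illustrative counts below are reconstructed from Table 6 for technology use in the AU entries (feature present in 5 high-demand and 4 low-demand entries; absent in 7 and 16, given 12 high- and 20 low-demand AU entries in all); this worked example is our reconstruction, not the authors' published computation.

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], without continuity correction (df = 1)."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    chi2 = 0.0
    for obs, row, col in [(a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)]:
        expected = row * col / n
        chi2 += (obs - expected) ** 2 / expected
    return chi2

# Technology use crossed with cognitive demand in the 32 AU entries
# (counts reconstructed from Table 6).
stat = chi_square_2x2(5, 4, 7, 16)
print(round(stat, 3))   # about 1.742
print(stat < 3.841)     # True: below the .05 critical value for df = 1
```

Because the statistic falls below the critical value, the test would not reject independence, consistent with the authors' report that none of the 10 analyses reached significance.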




Table 4
Descriptions of Criteria and Sample Excerpts Used in Coding Pedagogical Features

Context outside mathematics
Description: Tasks that involve real-world contexts encountered outside of school, including those related to students' neighborhoods, interests, and cultures.
Sample excerpt: "The assessment is based on a single situation: choosing a car to rent."

Hands-on materials
Description: Tasks that involve materials used to create some object (e.g., a poster, a physical model) or to make or serve as concrete models of abstract notions (e.g., colored chips to illustrate operations with negative numbers).
Sample excerpt: "I gave each pair of students a ball, a cylindrical tube, a ruler, and a recording sheet. Students built ramps."

Multiperson collaboration
Description: Tasks that require that work be done with a partner or in a larger group of students.
Sample excerpt: "They were heterogeneously arranged in carefully selected learning groups of four to five students within that homogeneously grouped class."

Technology
Description: Tasks in which technological tools—such as calculators, computers, software (e.g., electronic spreadsheets or word processors), and the Internet—are used.
Sample excerpt: "Nineteen students used computer-generated graphs to illustrate their data, while five used pencil and paper."

Student-generated explanation
Description: Tasks that require students to make explicit their reasoning and thinking processes or to provide a rationale or justification for their solutions or approaches to a problem.
Sample excerpt: "Their next step was to justify if a combination was really possible with a drawing or written explanation if it were impossible."

Results

Mathematical Features of NBPTS Portfolio Submissions

Mathematical topics. In general, the submitted activities tended to focus on a single mathematics topic area rather than on multiple topics. In particular, 84% of the activities (81 activities: 54 in DU entries and 27 in AU entries) involved a single mathematics topic area. There were, however, differences in topical focus between DU and AU entries (see Figure 1). Algebra was the most frequent topic in single-topic AU activities (37%), whereas Geometry, Data Analysis, Number and Operations, and Algebra and Functions were approximately equally represented in the single-topic DU activities (28%, 24%, 22%, and 22%, respectively). Measurement was the least encountered



[Figure 1: bar chart comparing DU and AU entries across Number & Operations, Measurement, Geometry, Algebra & Functions, and Data Analysis, Statistics, & Probability.]

Figure 1. Percentage of activities treating each mathematics topic area in Developing Mathematical Understanding (DU) and Assessing Mathematical Understanding (AU) entries.

focus of single-topic entries, but it was the most frequently encountered topic in multiple-topic entries. Among the 15 activities (10 in DU entries and 5 in AU entries) that spanned multiple topics, 12 involved Measurement in combination with other topics.

Cognitive demand. We examined the cognitive demand of the 96 submitted activities—1 in each of the 32 AU portfolio entries and 2 in each of the 32 DU entries. Overall, about 1 in 3 was classified as a high-demand task using the criteria given in Table 2: 38% of the AU activities and 30% of the DU activities were coded as high-demand tasks.

Considering activities simultaneously with respect to their content topic and cognitive demand classifications, we examined the distribution of high-demand activities across mathematics content topics. Table 5 presents the distribution of high-demand and low-demand tasks across the five topic areas. For activities that were judged to have more than one topic focus, we allocated proportionally to topic areas (i.e., if two topic areas were identified, then each received 0.5 in the tally).

Several observations can be made about the findings presented in Table 5. In particular, the frequency of cognitively demanding tasks varied across topic areas, but no mathematics topic area was associated with more high-demand activities than low-demand activities. Nevertheless, some areas had a definite tendency to be associated with low-demand activities. For example, activities that treated topics in Number and Operations were 6½ times as likely to be low demand as high demand. In contrast, activities that treated topics in Measurement or Data Analysis were only about 1½



Table 5
Frequency of High-Demand and Low-Demand Tasks in Assessing Mathematical Understanding (AU) and Developing Mathematical Understanding (DU) Entries by Mathematics Topic

                                                  AU Entries            DU Entries
Mathematics Topic                             Low   High  Total     Low   High  Total
Number and Operations                         5.0   1.2    6.2     11.8   1.5   13.3
Measurement                                   1.5   1.5    3.0      3.7   1.7    5.3
Geometry                                      4.0   1.5    5.5     11.3   6.0   17.3
Algebra and Functions                         6.5   4.5   11.0      8.7   4.7   13.3
Data Analysis, Probability, and Statistics    3.0   3.2    6.2      9.5   5.2   14.7
Total                                        20    12     32       45    19     64

Note. Activities treating more than one mathematics topic were apportioned across all topics involved.
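The fractional tallies in Table 5 arise from the proportional allocation described earlier: an activity touching k topic areas contributes 1/k to each area's count. A minimal sketch of that bookkeeping (the topic lists below are hypothetical, not the study's data):

```python
from collections import defaultdict

def tally_topics(activities):
    """Apportion each activity equally across its topic areas.

    activities: list of topic-name lists, one list per activity.
    An activity spanning two topics adds 0.5 to each topic's tally,
    so tallies can be fractional while their sum equals the number
    of activities.
    """
    tally = defaultdict(float)
    for topics in activities:
        weight = 1 / len(topics)
        for topic in topics:
            tally[topic] += weight
    return dict(tally)

# Three activities: two single-topic, one spanning Measurement and Geometry.
print(tally_topics([["Algebra"], ["Geometry"],
                    ["Measurement", "Geometry"]]))
# {'Algebra': 1.0, 'Geometry': 1.5, 'Measurement': 0.5}
```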

times as likely to be low demand as high demand. Activities that treated topics in Geometry or Algebra were about twice as likely to be low demand as high demand.

It is interesting to note that the ratios of low-demand to high-demand activities for topic areas generally tended to be smaller for AU entries than for DU entries, with Geometry being the lone exception. That is, teachers in our sample were more likely to use cognitively demanding tasks in assessment to evaluate student understanding (38%, or 12 of 32 activities) than they were to use such tasks in instruction to develop student understanding (30%, or 19 of 64 activities).

In addition to examining variation in cognitive demand across topics, we also examined variation across teachers. Not all teachers included high-demand tasks in their portfolio entries. In fact, high-demand tasks were submitted by about one half of the teachers in our sample: 53% of the teachers submitted at least one high-demand task, and 47% submitted only low-demand tasks. Some teachers submitted only one cognitively demanding task, but others submitted more than one. About one third of the total sample submitted two or three cognitively demanding tasks (see Figure 2).

Pedagogical Features

Pedagogical features were observed in the portfolio entries with varying frequency. As can be seen in Figure 3, the pedagogical features found most frequently in the portfolio entries were applications to contexts outside of mathematics itself and the use of hands-on materials, and the feature found least often was the requirement for student-produced explanations or justifications. Nonmathematical contexts were nearly ubiquitous in both the DU and AU entries, whereas only about 1 in 4 AU entries and 1 in 5 DU entries was judged to require student explanation of process or solution. As Figure 3 also indicates, the frequency of occurrence of some features varied between AU



[Figure 2: bar chart showing the percentage of teachers submitting no cognitively demanding tasks versus some (one, two, or three).]

Figure 2. Percentage of teachers submitting activities with no or some cognitively demanding tasks.

and DU entries. For example, the use of hands-on materials was evident in only about half of the AU entries, in contrast to nearly 8 of every 10 DU entries. Except for the requirement that students provide an explanation or justification, all other pedagogical features were found more frequently in the activities provided in DU portfolio entries than in those in AU entries.

The data summary provided in Figure 3 depicts the frequency of use across portfolio entries, but one might also wish to examine the extent to which each teacher in our sample employed each pedagogical feature in his or her submitted entry. For each pedagogical feature, Figure 4 shows the percentage of teachers who submitted at least one portfolio entry in which the feature was evident. Figure 4 shows that, with the exception of student-generated explanations, these pedagogical features were widely present in our sample. Almost all of the teachers in our sample submitted at least one entry that used application contexts and hands-on activities, and about two thirds used cooperative learning and technology. Fewer than half of the teachers in our sample submitted any entry with a task that required students to provide an explanation.

Relating Mathematical and Pedagogical Features of the Portfolio Entries

Table 6 provides a summary of data used in our analyses of the relation between pedagogical feature and the cognitive demand level of corresponding



[Figure 3: bar chart comparing DU and AU entries on five pedagogical features: outside math, hands on, group work, technology, and explanations.]

Figure 3. Percentage of activities in Developing Mathematical Understanding (DU) and Assessing Mathematical Understanding (AU) entries exhibiting selected pedagogical features.

tasks. None of the 10 chi-square analyses indicated a significant positive relationship between any pedagogical feature and the use of cognitively demanding tasks in either the AU or DU entries. Although the teachers in our sample used many innovative pedagogical strategies, the results suggest that they were not using them in any systematic way to support students' engagement with cognitively demanding mathematics tasks.

A partial explanation for the apparent disconnection between pedagogy and mathematical demand may be found in the diverse array of instructional goals and motivations that teachers mentioned in the description sections (e.g., instructional context, planning) of their AU and DU portfolio entries. Some of the goals expressed by teachers related to mathematical content (e.g., understanding ratios, solving equations, organizing and interpreting data). Others related more to mathematical processes as outlined in the Principles and Standards for School Mathematics (National Council of Teachers of Mathematics, 2000), such as developing problem-solving strategies, providing opportunities for communication, and connecting mathematics to the real world. Yet perhaps most interesting was the substantial number of goals related to nonmathematical issues confronting middle-grade mathematics teachers (e.g., building students' self-confidence in mathematics, addressing students' learning styles, preparing students for standardized tests and future courses).

In general, we found that teachers stated many goals for their teaching, some expressing as many as 50 in an entry. In addition to the number of



[Figure 4: bar chart showing, for each of the five pedagogical features (outside math, hands on, group work, technology, explanations), the percentage of teachers whose entries exhibited it.]

Figure 4. Percentage of teachers submitting activities with each pedagogical feature.

goals, the varied and possibly competing content of those goals may play a role in teachers' difficulty in implementing innovative pedagogical strategies in the service of cognitively demanding mathematical tasks.

Discussion

In this section, we discuss findings with respect to the mathematics learning opportunities for students that were provided in the lessons and assessments in the NBPTS portfolio entries. We first discuss findings from our analysis of mathematical features and then findings from our analysis of pedagogical features.

What Mathematics Learning Opportunities Were Provided in These Teachers' Classrooms?

We examined the tasks included in portfolio entries with regard to topical treatment and cognitive demand. The submitted tasks were reasonably well distributed across a range of mathematics topic areas—with activities frequently treating topics in geometry and data analysis as well as in the traditionally more heavily emphasized area of number and the recently heavily emphasized area of algebra. The frequent inclusion of activities related to data analysis and geometry is somewhat surprising given the findings of prior research on mathematics instruction in the upper elementary and middle grades that has reported a dominance of number and algebra (e.g., Porter,




Table 6
Number of Teachers Submitting Activities With Pedagogical Feature by Type of Portfolio Entry and Level of Cognitive Demand

                                     Assessing Mathematical       Developing Mathematical
                                     Understanding Entries        Understanding Entries
Pedagogical Feature                  High Demand   Low Demand     High Demand (1 or 2)   Low Demand
Use outside mathematical contexts         9            20                 15                 14
Use hands-on activities                   6            12                 14                 11
Use cooperative grouping                  4             8                 11                  9
Use technology                            5             4                 10                  9
Solicit explanations                      5             4                  4                  2

1989; Stodolsky, 1988). We can think of three plausible explanations for this observed difference: (a) It may reflect a shift in modal instructional practice in the middle grades, (b) it may reflect a tendency of teachers applying for NBPTS certification to choose lessons related to topics they viewed as more difficult or less common, or (c) it may suggest that teachers are more comfortable with or more inclined toward engaging in teaching for understanding in topic areas that have been less emphasized in the past and less associated with the canonical instructional practices noted in prior research. This finding merits further research to understand more fully its implications.

Our analysis of the cognitive demand of the tasks submitted in the portfolio entries indicated that about one half of the teachers submitted at least one task that we judged to be cognitively demanding. This finding may be interpreted as quite positive because it represents a much higher incidence of cognitively demanding tasks than would be predicted from prior studies of mathematics teaching practice in the middle grades (e.g., Jacobs et al., 2006; Stigler & Hiebert, 1999). As noted above, prior research on instructional practice in mathematics classrooms has found that daily instruction in elementary and middle-grade mathematics classrooms almost always involves teachers and students engaging in cognitively undemanding activities, such as recalling facts and applying well-rehearsed procedures to answer simple questions. On the other hand, the fact that about half of the teachers in our sample failed to include in their portfolio entries even a single task that was judged to be cognitively demanding can also be viewed as disappointing because these teachers were showcasing their best practice.
Nothing in the directions provided by the NBPTS for the assembly of portfolios for EA/M certification, or in the accompanying materials (such as the teaching standards themselves), appears to discourage the inclusion of cognitively demanding tasks. In fact, the NBPTS process encourages submission of lessons that showcase students' thoughtful engagement with mathematics. Thus, our finding suggests either that these teachers did not use such tasks



in their instruction (and hence they were unavailable for selection as portfolio entries), that they did not consider mathematical demand to be a characteristic of highly accomplished mathematics teaching (and hence they chose not to display it in their instructional samples), or that their definition of demanding tasks was related to pedagogical rather than cognitive features (and hence demand was interpreted to be about increasing pedagogical complexity and innovation). Moreover, we used a rather generous criterion to determine cognitive demand—if some part of an activity exhibited high-demand characteristics, it was classified as highly demanding, even if other parts of the activity did not. If we had applied a more stringent criterion—such as requiring that more than one half of an activity be judged to be cognitively demanding—the number of portfolio entries containing high-demand tasks would have been considerably smaller. Nevertheless, some of the activities were quite demanding and would likely have been judged so even under this more stringent criterion.

Our analysis of cognitive demands in the mathematics tasks submitted by teachers in our sample indicated that the assessment (AU) entries were more balanced with respect to cognitive demands than were the teaching (DU) entries. That is, in all topic areas except geometry and number and operations, the ratio of low-demand to high-demand tasks in AU entries was closer to one than in the DU entries (see Table 5). We can propose at least two plausible explanations for this finding: One relates to the nature of classroom teaching and testing, and the other to the specific conditions under which the samples of teaching and assessment in this study were collected. First, it may be that instruction and assessment operate under different didactical contracts (Brousseau, 1997).
During instruction, for example, teachers and students alike may feel that the teacher’s job is to explain mathematical ideas and “make things clear.” Operating in accord with such expectations, teachers and students may together undermine the cognitive complexity of mathematics tasks. During assessment, however, there is no such expectation for a teacher to explain—students are supposed to know the material. Thus, these expectations provide fewer opportunities for teachers to undermine cognitively demanding tasks when they include them in a student assessment. A second plausible explanation is that teachers viewed cognitively demanding tasks as potentially too risky to use when portraying samples of their teaching practices in the portfolio entries. Given that the goal of these portfolio entries was to showcase exemplary teaching and given that complex tasks can be difficult to enact well in the classroom, teachers might have opted to use a less demanding task for the teaching entry, even if they included a more demanding task in the assessment entry. There might be, of course, a third possibility that we consider less plausible, namely, that teachers do not understand well the connection between teaching and assessing with respect to cognitive demand. That is, they might not view it as problematic if students are given more cognitively complex tasks in assessments than those from which they have had opportunities to learn




in classroom instruction. Although the portfolio data are not sufficient to determine which explanation best accounts for our findings, this result does suggest that the field would benefit from additional attention to teachers' assessment practices in relation to their instructional practices and with respect to the issue of cognitive demand.

How Were the Learning Opportunities Provided to the Students?

To interpret our findings regarding the pedagogical features evident in the examined portfolio entries, it may be useful to compare our results to findings reported in the 2000 National Survey of Science and Mathematics Teachers (Weiss, Banilower, McMahon, & Smith, 2001). The Weiss et al. report is based on responses from a nationally representative sample of about 6,000 teachers of mathematics in Grades 1 through 12 in about 1,200 schools. Weiss et al. report their findings for teachers in three grade-level clusters, one of which is Grades 5–8; it is this cluster that corresponds to the age and grade specialization of applicants to the NBPTS for EA/M certification.

In general, teachers in the NBPTS sample exhibited somewhat different use of the pedagogical features than was found in the national survey data. Although the direction of the difference was not the same for all features, the teachers in the NBPTS sample tended to be somewhat more adventurous pedagogically than the picture of normative practice painted by the Weiss et al. (2001) report. In particular, our analysis of portfolio entries found a higher-than-expected incidence of three features (using technology, using tasks set in application contexts, and using hands-on materials). Regarding the use of technology, 49% of Grade 5–8 teachers in the Weiss et al.
(2001) survey reported using calculators or computers for developing concepts and skills at least once each week, whereas 59% of the teachers in our sample actually used technology in either the AU or DU portfolio entries, or in both (see Figure 4). For the use of tasks set in application contexts and the use of concrete materials, Weiss et al. indicated that 57% of Grade 5–8 teachers reported using concrete materials in their lessons and 71% reported having students use mathematical concepts to interpret and solve applied problems at least once each week. In contrast, more than 80% of the teachers in the NBPTS sample included at least one activity involving hands-on activities (84%) or contexts outside of mathematics (91%). We acknowledge that, in these latter cases, the observed difference may be at least partially attributable to somewhat different definitions of the pedagogical features being considered. For example, the term concrete materials used in the Weiss et al. survey may invoke a narrower interpretation than the term hands-on activities used in our analysis.

For one pedagogical feature (cooperative groups), the frequency detected in our sample was about the same as in the national survey. The percentage of teachers in our NBPTS sample who exhibited use of cooperative groups in at least one portfolio entry (66%) was lower than the percentage of teachers in Grades 5–8 in the national sample who reported



employing this pedagogical feature at least once each week (83%). But if we include the additional 16% of the teachers who wrote in their portfolio narratives about using cooperative groups in their teaching, then the totals in the two groups are very similar. It is possible that the artifacts available in the NBPTS portfolios tend to underestimate active pedagogical features like the use of cooperative groups.

Requiring student-generated explanations was less evident in our sample of teachers than is reported in the national survey. In the Weiss et al. (2001) survey, 56% of the Grade 5–8 teachers indicated that they required explanations every day or almost every day in their classrooms. The percentage of teachers reporting daily use of tasks requiring explanations thus exceeds the 44% of our NBPTS sample who submitted tasks that required students to produce explanations (see Figure 4). Even if we include the additional 3% of our NBPTS sample whose portfolio narratives mentioned requiring student explanations in their teaching, the totals in the two groups remain quite different. Moreover, Weiss et al. reported that 71% of the Grade 5–8 mathematics teachers indicated that they assessed students at least monthly using tests that required them to provide descriptions or explanations for open-ended questions. This is more than double the 28% of our NBPTS sample who submitted an AU portfolio activity that required students to provide an explanation.

The low frequency of this feature is surprising, especially given the emphasis on explanation and mathematical justification promoted by the NCTM Standards (1989, 2000). We suspect that this variation is due in large part to a difference between the meaning that explanation holds for teachers responding to a survey question and the way that we examined it in our study.
In particular, it is not at all uncommon for teachers to require that students show their work by listing the sequence of steps used to solve a problem. Typically, there is no requirement that each step be justified mathematically, nor is there an expectation that an explanation be given for the reasonableness of either the approach or the solution. We suspect that survey respondents consider this an example of teaching that corresponds to description or explanation of solutions for open-ended problems, but our coding of this feature required that mathematical justification be included, and so the frequency of incidence was lower.

Nevertheless, given the centrality of explanation and justification to teaching and learning mathematics with understanding, the relatively low frequency of this feature in the portfolio entries suggests that teachers' thinking about this aspect of instructional practice, and their proficiency in employing it, both merit further attention and careful study. In our analysis of the submitted lesson materials, we noted what we considered to be many missed opportunities to develop mathematical ideas more deeply, and the intentional use of explanation could have played a role in enhancing the mathematical value of the tasks. Our findings suggest that teachers will need additional support to learn to solicit mathematical explanations as a tool for developing and assessing students' mathematical understanding.


523


Although teachers used several innovative pedagogical strategies in their classrooms more frequently than might be expected from national survey data, they did not do so in a way that was closely linked to supporting students’ engagement with challenging tasks. Even in this highly select sample of teachers seeking NBPTS certification, we detected little evidence that innovative pedagogy was being used effectively to support students’ work with cognitively demanding tasks in the mathematics classroom. In this respect, our findings are consistent with those of some other studies (e.g., Cohen, 1990; Ferrini-Mundy & Schram, 1997) and with many anecdotes suggesting that teachers may implement reform pedagogy in a superficial manner that does not realize its potential to advance students’ learning. Other research evidence indicates both that teachers in the middle grades find it difficult to enact cognitively challenging tasks in the mathematics classroom (Stein et al., 1996) and that the consistent, effective use of cognitively demanding tasks in the mathematics classroom increases student achievement (Stein & Lane, 1996). Thus, our findings, based on self-selected samples of practice chosen by a self-selected group of teachers seeking special recognition, suggest that the larger population of mathematics teachers in Grades 5–8 is likely to need considerable assistance in learning to use innovative pedagogical features effectively to support students’ engagement with cognitively challenging tasks in the mathematics classroom. The findings of this investigation add a third element to the portrait of mathematics teaching in middle-grade classrooms in the United States at the transition point from the 20th century to the 21st century, with the first two elements provided by the survey data of Weiss et al.
(2001) and the observational data from the Third International Mathematics and Science Study (TIMSS) 1999 Video Study (Hiebert et al., 2005; Jacobs et al., 2006). This portrait offers not only an important indicator of the impact of the NCTM-inspired standards-based reforms of the 1990s but also a baseline for examining the effects on mathematics teaching of the No Child Left Behind reforms of the first decade of the 2000s.

Conclusion

Our analysis of the portfolio entries submitted by candidates seeking certification as highly accomplished teachers by the NBPTS in the area of EA/M has offered a rare glimpse at the instructional practice of American teachers as they attempt to teach mathematics for understanding. Our analysis has revealed a form of instructional practice that, at least in some ways, deviates from the canonical portrayal of mathematics teaching derived from several decades of observational and survey research. In particular, the lessons that teachers submitted in their portfolio entries contained activities that treated a broad range of content topics (rather than being narrowly focused on number and algebra), that very often involved tasks situated in contexts outside mathematics itself, and that frequently called for multiperson



collaboration as well as the use of technology and hands-on materials. These findings suggest greater diversity in content treatment and pedagogical approaches than has been evident in most other research on mathematics teaching in the middle grades, and they may reflect both the penetration of “reform” ideas (e.g., NCTM, 1989, 1991) into the instructional practice of those teachers who wish to display highly accomplished teaching and the feasibility of those pedagogical approaches in American classrooms. On the other hand, our findings also suggest that there are several aspects of teaching mathematics for understanding that will require more systematic attention if they are to become a regular feature of instruction in U.S. classrooms. The lower frequency of high-demand tasks, when compared to the higher incidence of innovative pedagogical features (contexts outside mathematics, collaboration, technology, hands-on materials), may suggest a need to explicate clearly the role and value of cognitively demanding tasks in the mathematics classroom. The critical link between student learning and the cognitive demand of tasks used in instruction has become more apparent in recent years as a result of the publication of research reports from QUASAR (Stein & Lane, 1996) and the 1999 TIMSS Video Study (Hiebert et al., 2005), and professional development materials have been developed to promote attention to the centrality of cognitive demand and the challenges that teachers encounter in using high-demand tasks in the mathematics classroom (e.g., Stein et al., 2000).
It is possible that these ideas might be taking hold more firmly than was evident in our sample, drawn from samples of teaching in 1998–1999, but our finding here of an apparent disconnect between innovative pedagogy and cognitive demand signals that teachers are likely to need explicit guidance about how pedagogical innovations could play a key role in increasing the cognitive demands of mathematical tasks and thereby enhancing students’ opportunities to learn mathematics. An approach similar to what the French call didactical engineering (Artigue, 1994) and what the Japanese call lesson study (Fernandez & Yoshida, 2004) might be particularly appropriate. In such work, teachers analyze lessons that are designed to develop or assess student understanding and that include different pedagogical features. Through such analytic activity they could become better able to identify the strengths and limitations of pedagogical approaches under varying conditions of use and more skilled in the deliberate and effective utilization of pedagogical practices to support students’ attainment of important mathematical goals. Analytic, reflective opportunities of this type could enhance teachers’ awareness of a connection between mathematical and pedagogical intentionality that was not generally evident in the sample of instructional artifacts we examined in this study. One particular aspect of the apparent disconnection between pedagogical innovation and cognitively demanding mathematical tasks that might merit further careful exploration is suggested by the array of goals and intentions expressed by the teachers in our sample as they described their planning and the instructional context of their portfolio entries. In general, we



found that teachers stated many goals for their teaching. Although we did not report herein a careful, complete analysis of these goals, our initial analyses suggest that the mathematical goals expressed by teachers were not very specific (e.g., promote problem solving/posing), nor were they related to high cognitive demands (e.g., use ratio, proportion, and percent and their conversions), nor often tied to pedagogical strategies (e.g., involve community). On the other hand, it was our impression that the diverse set of goals related to nonmathematical issues confronting middle-grade mathematics teachers (e.g., building students’ self-confidence in mathematics, addressing students’ learning styles, preparing students for standardized tests and future courses), although diffuse and often also stated quite generally, were more likely to be tied to pedagogical strategies. A careful analysis of the varied, and possibly competing, goals that teachers hold for their instructional activities—by researchers and also by teachers themselves—might shed more light on how to assist teachers to employ innovative pedagogical strategies more effectively in the service of cognitively demanding mathematical tasks in the classroom.

Our examination of portfolio entries in this study suggests that teachers would likely benefit from more opportunities to explain what they do and why they do it. Many of the teachers in our sample did not clearly articulate their goals for and approaches to mathematics teaching. The narrative portions of the portfolio entries were sometimes quite difficult to understand and interpret, as many teachers used educational jargon quite liberally and some presented ideas in a seemingly disconnected way.
Thus, it seems that teachers might need and could benefit from professional development experiences that assist them in becoming more adept at examining and reflecting on their instructional practices and then expressing in writing the results of their examinations and reflections. The use of narrative and video cases and written reflections seems especially promising in this regard (e.g., Schifter, 1996; Stein et al., 2000), as these strategies can help teachers learn to use frameworks systematically to analyze and express key aspects of their teaching and assessment practices.

We close by commenting on three special aspects of the data analyzed in this study that we think merit attention from researchers seeking to understand classroom instruction. First, the lesson materials and artifacts analyzed in this study were specifically solicited in the NBPTS certification process as samples of complex teaching practice—teaching mathematics for understanding—heretofore rarely found and studied in American classrooms. Moreover, the lesson materials were selected by teachers and submitted for evaluation in an assessment intended to identify highly accomplished teaching, so it is reasonable to assume that the samples represented lessons that the teachers considered to be their best practice. In large-scale observational studies of teaching and in surveys, it is common to request samples of or information about typical teaching practice. Some scholars (e.g., Silver, 2003) have suggested the potential value of also examining instruction that is atypical in some way to detect, for example, what teachers might be capable of



doing or inclined to do when they try to exhibit their very best work. The NBPTS portfolio entries offer one example of what such atypical data might look like, and our analysis of these data offers one example of what might be learned.

Second, the data examined were of a hybrid form that combines some features of the data collected via direct observation and data collected via survey responses. Like direct observation, the portfolio entries displayed important details of classroom lessons; similar to survey data, the portfolio entries permitted access to the teacher’s perspective. We join several other researchers (e.g., Borko et al., 2005; Clare & Aschbacher, 2001) in advocating the careful consideration of research methods to study instructional practice that provide alternatives to direct observation methods (which are invasive, labor intensive, expensive, and impractical on a large scale) and survey methods (which involve questions susceptible to multiple interpretations, have questionable validity, and provide little information about the details of instructional lessons).

Third, the data analyzed in this study included instructional artifacts associated not only with classroom teaching but also with classroom assessment. In general, large-scale research studies of teachers’ instructional practices, such as the TIMSS video analysis (Stigler et al., 1999), tend to focus exclusively on the teaching of classroom lessons rather than on the assessment activities used by teachers. But assessment is a key aspect of instructional practice, and our findings indicated that teachers’ assessment practices and instructional practices were not identical with respect to the features examined in this study. Thus, our findings suggest that focusing exclusively on teaching rather than on assessing may underestimate the extent to which cognitively demanding tasks are used in American classrooms.
In recent years, there has been a growing interest in examining teachers’ classroom assessment practices (e.g., Black & Wiliam, 2004). Beyond the arguments found in that literature to support such a focus, our findings suggest that more attention to assessment practices could provide valuable information about key mathematical and pedagogical features of instruction that would complement the kind of information already available from surveys and observational studies of teaching.

Note

This study was supported in part by Grant No. ESI-0083276 from the National Science Foundation (NSF) to the Educational Testing Service (ETS), under the direction of Gail P. Baxter and Edward A. Silver. We are grateful to the National Board for Professional Teaching Standards (NBPTS) for granting access to the portfolio data and to members of the ETS staff, especially Rick Tannenbaum, for facilitating access to the data used in this investigation. Any opinions expressed herein, however, are ours and do not necessarily reflect the views of the NSF, NBPTS, or ETS. We are grateful to Angus Mairs, Douglas Corey, and Hala Ghousseini for their assistance with aspects of the data coding and to Drew Gitomer for his encouragement and for his helpful comments on an earlier report of our results. We also thank several anonymous reviewers and the editor for valuable comments on the manuscript.



References

Anderson, L. W., Krathwohl, D. R., Airasian, P. W., Cruikshank, K. A., Mayer, R. E., Pintrich, P. R., et al. (Eds.). (2001). A taxonomy for learning, teaching, and assessing. New York: Longman.
Artigue, M. (1994). Didactical engineering as a framework for the conception of teaching products. In R. Biehler, R. W. Scholz, R. Strässer, & B. Winkelman (Eds.), Didactics of mathematics as a scientific discipline (pp. 27–39). Dordrecht, Netherlands: Kluwer.
Balfanz, R., Mac Iver, D., & Byrnes, V. (2006). The implementation and impact of evidence-based mathematics reforms in high-poverty middle schools: A multi-site, multi-year study. Journal for Research in Mathematics Education, 37, 33–64.
Beaton, A. E., Mullis, I. V. S., Martin, M. O., Gonzalez, E. J., Kelly, D. L., & Smith, T. A. (1996). Mathematics achievement in the middle school years: IEA’s Third International Mathematics and Science Study (TIMSS). Chestnut Hill, MA: Boston College.
Black, P., & Wiliam, D. (2004). The formative purpose: Assessment must first promote learning. In M. Wilson (Ed.), Towards coherence between classroom assessment and accountability (One hundred third yearbook of the National Society for the Study of Education, Part II; pp. 20–50). Chicago: University of Chicago Press.
Boaler, J. (1998). Open and closed mathematics: Student experiences and understandings. Journal for Research in Mathematics Education, 29, 41–62.
Bond, L., Smith, T. W., Baker, W., & Hattie, J. (2000). The certification system of the National Board for Professional Teaching Standards: A construct and consequential validity study. Greensboro, NC: Center for Educational Research and Evaluation.
Borko, H., Stecher, B., Alonzo, A., Moncure, S., & McClam, S. (2005). Artifact packages for measuring instructional practice: A pilot study. Educational Assessment, 10, 73–104.
Borko, H., Stecher, B., & Kuffner, K. (2007). Using artifacts to characterize reform-oriented instruction: The scoop notebook and rating guide (CSE Tech. Rep. No. 707). Los Angeles: Center for the Study of Evaluation, National Center for Research on Evaluation, Standards, and Student Testing (CRESST). Retrieved January 23, 2008, from http://www.cse.ucla.edu/products/reports/R707.pdf
Bransford, J. D., Brown, A. L., & Cocking, R. R. (1999). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press.
Brousseau, G. (1997). Theory of didactic situations in mathematics (N. Balacheff, Trans.). Dordrecht, Netherlands: Kluwer.
Brownell, W. A., & Moser, H. E. (1949). Meaningful vs. mechanical learning: A study in Grade III subtraction (Duke University Research Studies in Education, No. 8). Durham, NC: Duke University Press.
Brownell, W. A., & Sims, V. M. (1946). The nature of understanding. In N. B. Henry (Ed.), The measurement of understanding (Forty-fifth yearbook of the National Society for the Study of Education, Part I; pp. 27–43). Chicago: University of Chicago Press.
Carpenter, T. P., Fennema, E., & Franke, M. (1996). Cognitively guided instruction: A knowledge base for reform in primary mathematics instruction. Elementary School Journal, 97(1), 3–20.
Carpenter, T. P., Fennema, E., Peterson, P. L., Chiang, C. P., & Loef, M. (1989). Using knowledge of children’s mathematical thinking in classroom teaching: An experimental study. American Educational Research Journal, 26, 499–531.
Clare, L., & Aschbacher, P. R. (2001). Exploring the technical quality of using assignments and student work as indicators of classroom practice. Educational Assessment, 7, 39–59.



Cohen, D. K. (1990). A revolution in one classroom: The case of Mrs. Oublier. Educational Evaluation and Policy Analysis, 12, 327–345.
Cohen, D. K., McLaughlin, M., & Talbert, J. (Eds.). (1993). Teaching for understanding: Challenges for policy and practice. San Francisco: Jossey-Bass.
Cohen, D. K., Raudenbush, S. W., & Ball, D. L. (2003). Resources, instruction, and research. Educational Evaluation and Policy Analysis, 25, 119–142.
Fawcett, H. P. (1938). The nature of proof: A description and evaluation of certain procedures used in a senior high school to develop an understanding of the nature of proof. New York: Teachers College, Columbia University.
Fennema, E., & Romberg, T. A. (Eds.). (1999). Mathematics classrooms that promote understanding. Mahwah, NJ: Lawrence Erlbaum.
Fernandez, C., & Yoshida, M. (2004). Lesson study: A Japanese approach to improving mathematics teaching and learning. Mahwah, NJ: Lawrence Erlbaum.
Ferrini-Mundy, J., & Schram, T. (Eds.). (1997). The Recognizing and Recording Reform in Mathematics Education project: Insights, issues, and implications (JRME Monograph No. 8). Reston, VA: National Council of Teachers of Mathematics.
Fuson, K. C., & Briars, D. J. (1990). Using a base-ten blocks learning/teaching approach for first- and second-grade place-value and multidigit addition and subtraction. Journal for Research in Mathematics Education, 21, 180–206.
Good, T. L., Grouws, D. A., & Ebmeier, H. (1983). Active mathematics teaching. New York: Longman.
Grouws, D. A., Smith, M. S., & Sztajn, P. (2004). The preparation and teaching practices of United States mathematics teachers: Grades 4 and 8. In P. Kloosterman & F. K. Lester (Eds.), Results and interpretations of the 1990–2000 mathematics assessments of the National Assessment of Educational Progress (pp. 221–267). Reston, VA: National Council of Teachers of Mathematics.
Hakel, M. D., Koenig, J. A., & Elliott, S. W. (2008). Assessing accomplished teaching: Advanced-level certification programs. Washington, DC: National Academy Press.
Henningsen, M., & Stein, M. K. (1997). Mathematical tasks and student cognition: Classroom-based factors that support and inhibit high-level mathematical thinking and reasoning. Journal for Research in Mathematics Education, 29, 514–549.
Hiebert, J., & Carpenter, T. P. (1992). Learning and teaching with understanding. In D. A. Grouws (Ed.), Handbook of research on mathematics teaching and learning (pp. 65–97). New York: Macmillan.
Hiebert, J., Carpenter, T. P., Fennema, E., Fuson, K., Human, P., Murray, H., et al. (1996). Problem solving as a basis for reform in curriculum and instruction: The case of mathematics. Educational Researcher, 25(4), 12–21.
Hiebert, J., & Grouws, D. A. (2007). The effects of classroom mathematics teaching on students’ learning. In F. K. Lester (Ed.), Second handbook of research on mathematics teaching and learning (pp. 371–404). Charlotte, NC: Information Age.
Hiebert, J., Stigler, J., Jacobs, J., Givvin, K., Garnier, H., Smith, M., et al. (2005). Mathematics teaching in the United States today (and tomorrow): Results from the TIMSS 1999 Video Study. Educational Evaluation and Policy Analysis, 27, 111–132.
Hiebert, J., & Wearne, D. (1993). Instructional tasks, classroom discourse, and students’ learning in second-grade arithmetic. American Educational Research Journal, 30, 393–425.
Jacobs, J. K., Hiebert, J., Givvin, K. B., Hollingsworth, H., Garnier, H., & Wearne, D. (2006). Does eighth-grade mathematics teaching in the United States align with the NCTM standards? Results from the TIMSS 1995 and 1999 video studies. Journal for Research in Mathematics Education, 37, 5–32.
Kilpatrick, J., Swafford, J., & Findell, B. (2001). Adding it up: Helping children learn mathematics. Washington, DC: National Academy Press.



Matsumura, L. C., Garnier, H., Pascal, J., & Valdés, R. (2002). Measuring instructional quality in accountability systems: Classroom assignments and student achievement. Educational Assessment, 8, 207–229.
National Assessment of Educational Progress. (1988). Mathematics objectives: 1990 assessment (Rep. No. 21-M-10). Princeton, NJ: Educational Testing Service and Author.
National Council of Teachers of Mathematics. (1989). Curriculum and evaluation standards for school mathematics. Reston, VA: Author.
National Council of Teachers of Mathematics. (1991). Professional standards for teaching mathematics. Reston, VA: Author.
National Council of Teachers of Mathematics. (2000). Principles and standards for school mathematics. Reston, VA: Author.
Newmann, F. M., & Associates. (1996). Authentic achievement: Restructuring schools for intellectual quality. San Francisco: Jossey-Bass.
Porter, A. (1989). A curriculum out of balance: The case of elementary school mathematics. Educational Researcher, 18(5), 9–15.
Schifter, D. (Ed.). (1996). What’s happening in math class? (Vol. 1). New York: Teachers College Press.
Schoenfeld, A. (1985). Metacognitive and epistemological issues in mathematical problem solving. In E. A. Silver (Ed.), Teaching and learning mathematical problem solving: Multiple research perspectives (pp. 361–379). Hillsdale, NJ: Lawrence Erlbaum.
Schoenfeld, A. (1992). Learning to think mathematically: Problem solving, metacognition, and sense making in mathematics. In D. A. Grouws (Ed.), Handbook of research on mathematics teaching and learning (pp. 334–370). New York: Macmillan.
Silver, E. A. (2003). Lessons learned from examining mathematics teaching around the world. Education Statistics Quarterly, 5(1). Retrieved January 23, 2008, from http://nces.ed.gov/programs/quarterly/Vol_5/5_1/q2_3.asp
Silver, E. A., & Kenney, P. A. (Eds.). (2000). Results from the seventh mathematics assessment of the National Assessment of Educational Progress. Reston, VA: National Council of Teachers of Mathematics.
Stake, R. E., & Easley, J. (1978). Case studies in science education. Urbana: University of Illinois.
Stein, M. K., Grover, B. W., & Henningsen, M. (1996). Building capacity for mathematical thinking and reasoning: An analysis of mathematical tasks used in reform classrooms. American Educational Research Journal, 33, 455–488.
Stein, M. K., & Lane, S. (1996). Instructional tasks and the development of student capacity to think and reason: An analysis of the relationship between teaching and learning in a reform mathematics project. Educational Research and Evaluation, 2(1), 50–80.
Stein, M. K., Smith, M. S., Henningsen, M., & Silver, E. A. (2000). Implementing standards-based mathematics instruction. New York: Teachers College Press.
Stigler, J. W., Gonzalez, P., Kawanaka, T., Knoll, S., & Serrano, A. (1999). The TIMSS videotape classroom study: Methods and findings from an exploratory research project on eighth-grade mathematics instruction in Germany, Japan, and the United States (NCES 1999-074). Washington, DC: U.S. Department of Education, National Center for Education Statistics.
Stigler, J. W., & Hiebert, J. (1999). The teaching gap. New York: Free Press.
Stodolsky, S. S. (1988). The subject matters: Classroom activities in math and social sciences. Chicago: University of Chicago Press.



Weiss, I. R., Banilower, E. R., McMahon, K. C., & Smith, P. S. (2001). Report of the 2000 National Survey of Science and Mathematics Education. Chapel Hill, NC: Horizon Research.

Manuscript received October 27, 2007
Revision received June 2, 2008
Accepted September 6, 2008


