Designing Video-Based Professional Development for Mathematics Teachers in Low-Performing Schools
Journal of Teacher Education, Volume 60, Number 1, January/February 2009, pp. 38-51. DOI: 10.1177/0022487108328485. © 2009 Sage Publications.
Rossella Santagata
University of California, Irvine

This article describes the theoretical framework, research base, structure, and content of a video-based professional development program implemented during 2 consecutive years with sixth-grade mathematics teachers from five low-performing schools. First, difficulties teachers encountered in responding to video-based prompts during the 1st year are summarized. Problematic questions deal with teachers' (a) basic understanding of target mathematics topics, (b) knowledge of their students' understanding, and (c) ability to analyze students' work and reasoning beyond classification into right and wrong answers. Changes that were made to the program to address teachers' needs in the 2nd year are then described. These are structured around three principles for designing video-based professional development: (a) attending to content-specific understanding, (b) scaffolding analysis of student thinking, and (c) modeling a discourse of inquiry and reflection on the teaching and learning process.

Keywords: teacher professional development; teacher learning; videotape recordings; mathematics education; low-performing schools
Teacher educators have been using videos as learning tools for teachers since the late 1960s and early 1970s. At that time, publications primarily reported results from microteaching studies. These articles summarized teachers' learning from watching brief clips of classroom instruction that featured specific instructional techniques to be modeled (among others, Acheson & Zigler, 1971; Allen & Clark, 1967; Limbacher, 1971; Ward, 1970). More recently, with the advent of digital technologies, video has often been embedded in complex multimedia databases and accompanied by a variety of instructional materials (e.g., transcripts, handouts the videotaped teacher gave to her students, samples of students' work from the videotaped lesson). In addition, the learning objective for teachers has shifted from learning specific instructional techniques to deepening pedagogical content knowledge and developing reflective knowledge of teaching and learning (Santagata, Gallimore, & Stigler, 2005). As a result, more recent publications focus on reporting teacher progress in identifying important instructional moments, analyzing student thinking, and reflecting on content (Davies & Walker, 2005; Jacob, Lamb, Philipp, Schappelle, & Burke, 2007; Lampert & Ball, 1998; Santagata, Zannoni, & Stigler, 2007).
Author's Note: This research was supported by the Institute of Education Sciences, Teacher Quality Program, under Grant R305M030154. Any opinions, findings, and conclusions expressed in this article are those of the author and do not necessarily reflect the views of the funding agency. This article is in part based on a poster presented in June 2007 at the Institute of Education Sciences annual conference in Washington, DC, and on a paper presented in April 2008 at the American Educational Research Association annual meeting in New York City. I would like to thank Karen Givvin and Nicole Kersting for comments on a previous draft and Laurie Hansen for proofreading the manuscript. This project would not have been possible without the teachers who participated in the professional development program and who graciously allowed our research team to collect data on their learning process. I also thank the professional development program facilitators and all the members of the research team for the time, energy, and passion they devoted to this project. Correspondence concerning this article should be addressed to Rossella Santagata, Department of Education, University of California, Irvine, CA 92697; e-mail: r.santagata@uci.edu.

Although limitations still exist in the methodologies used to gather evidence of teacher learning from analyzing videotaped instruction—in most cases restricted to qualitative studies of small groups of teachers—evidence of the positive effects on teachers' overall understanding of the teaching-learning process, knowledge of
subject-matter specific instructional strategies, and understanding of student thinking is rapidly increasing (among others, Borko, Jacobs, Eiteljorg, & Pittman, 2008; Jacob et al., 2007; Santagata et al., 2007; Sherin & Han, 2004; Sherin & van Es, 2005; van Es & Sherin, 2008; van Es & Sherin, 2002). In particular, authors of recent studies praise the use of video for allowing in-depth analyses of students' learning in action—analyses that teachers would not be able to do while teaching a lesson (Clarke & Hollingsworth, 2000; LeFevre, 2004; Sherin, 2004; Sherin & van Es, 2005; van Es & Sherin, 2002).

Less frequent in the literature are detailed descriptions of questions and tasks professional development providers use to guide teachers' analyses of videos. If we are to understand the learning processes in which teachers engage when analyzing video and what specifically helps them to acquire knowledge that is useful in teaching, we need to make public the detailed descriptions of questions and tasks that accompany our video-based professional development programs. In a recent American Educational Research Association symposium on the use of video to study teaching, Hilda Borko (2007) called for collaboration among researchers to share the specifics of materials and tasks used with teachers.

The Video Cases for Mathematics Professional Development, developed by Seago, Mumme, and Branca (2004), provide a good example of video-based material accompanied by questions that structure teachers' analyses. The CD includes videos of real classroom teaching, and the accompanying guide assists teachers in exploring the topic of linear functions as well as pedagogical strategies to foster student conceptual understanding, such as choosing and using various representations and interpreting and responding to students' methods and errors. The Supporting the Transition from Arithmetic to Algebraic Reasoning Project is another example. This project describes in detail a professional development model, the "Problem Solving Cycle," that involves teachers in sharing their practices with their colleagues through video and other records of practice (Borko et al., 2008; Koellner et al., 2007). A third example is Sherin and van Es's (2008) research on teacher learning in the context of video clubs, in which the authors describe in detail teachers' and facilitators' roles in discussions around video clips of classroom interactions.

This article contributes to this literature by describing in detail a video-based professional development program implemented for 2 consecutive years with sixth-grade teachers from five middle schools in a low-performing district. The study can be categorized as what many would call "design research," in that we attempted to engineer an innovative educational environment and simultaneously
conduct an experimental study (Brown, 1992). It also shares with design research some of the tensions that come with wanting to refine an intervention to improve practice, while attempting to develop a deeper understanding of the general principles—the theory—at the basis of the intervention (Bielaczyc & Collins, 2007). What distinguishes this article from previously published studies is its focus on the difficulties teachers may encounter in using video to deepen their content and pedagogical content knowledge as opposed to the benefits of such forms of professional development. Specifically, this article reports on challenges that particular groups of teachers, such as those working in low-performing schools, may face. It describes the process through which our team of researchers and facilitators modified tasks and questions used in the 1st year version of the program to develop a new version for the 2nd year that would address teachers' needs.

Before I introduce the theoretical framework and research at the basis of the professional development program under investigation, I provide a context for the study by briefly summarizing research on teachers working in low-performing schools. Although teachers everywhere are faced with a similar challenge—that of teaching students fairly complex content in a relatively short period of time—there is a set of attitudes, beliefs, and characteristics that, according to research, tends to distinguish teachers who work in low-performing schools. These beliefs and characteristics may affect ways teachers participate in professional development programs and can be obstacles to program success. Although some of the difficulties teachers in this study encountered with the professional development material may be similar to the ones teachers working in higher performing schools would face, teachers in low-performing schools are at a particular disadvantage for the following reasons:

a. They do not think they are able to affect students' learning. They tend to attribute students' poor performance solely to factors outside the school, such as neighborhood violence, lack of parental monitoring, and economic conditions. As a consequence, they underestimate their role in the students' learning progress (Oakes, Joseph, & Muir, 2003).

b. They tend to have low expectations for their students. Teachers working in low-performing schools tend to underestimate students' potential. In particular, they tend to believe that their students are not capable of conceptual understanding and the only route to the improvement of their performance is highly procedural teaching (Anyon, 1981; Oakes et al., 2003; Spencer, 2006).
c. They are more likely to hold an emergency credential or to be asked to teach outside their subject matter area than teachers working in higher achieving schools with less difficult working conditions (Lankford, Loeb, & Wyckoff, 2002; Loeb, Darling-Hammond, & Luczak, 2005). As a consequence, these teachers' content and pedagogical content knowledge tend to be poor (Hill, 2007). This often results in teachers not being able to value what students do know and not being able to leverage that knowledge to build more sophisticated mathematical understanding.

d. Finally, in the current No Child Left Behind climate, they are particularly pressured to improve students' performance on standardized tests. Many school administrators do not see teaching for understanding as the most direct way—or see it as having any potential at all—to improve students' scores. Lacking leader support, teachers are often left alone in implementing innovative practices (Apple, 2004; Gutstein, 2003).

The article is structured into four sections, each accomplishing a specific goal: (a) provide a theoretical framework and a brief overview of the research at the basis of the various components of the video-based professional development program under study, (b) describe in detail the program as it was developed for the 1st year of implementation, (c) summarize what we learned about teachers' difficulties, and (d) illustrate how we modified the program for the 2nd year of implementation to respond to these difficulties.
Theoretical Framework and Research Base

The main objective of the professional development program described here was that of assisting teachers to choose mathematically rich problems and to maintain that richness as they guide students through problem solutions. Findings from the video portion of the Third International Mathematics and Science Study (Hiebert et al., 2003) provide the basis both for the main objective of the professional development program and for the specific teacher knowledge and skills that we chose to target. In summary, the Third International Mathematics and Science Study video research found that, compared to the United States, high-achieving countries engage students more frequently in rigorous mathematical reasoning. In particular, students in these countries are presented with complex mathematics problems that require them to make connections between
mathematical ideas. In addition, and contrary to U.S. lessons in which solution methods are reduced mostly to procedures to be followed, the complexity of the mathematics in high-achieving countries is maintained throughout the solution process. Two hypotheses may explain these results: (a) U.S. teachers do not possess a deep understanding of the mathematics they are asked to teach and (b) teaching mathematics with attention to conceptual underpinnings is not consistent with the tradition of school mathematics in the United States. Thus, although conceptual understanding has become a shared objective for student learning within the mathematics education community, U.S. teachers seldom have the opportunity to observe examples of teaching in which the complexity of the mathematics involved is maintained. With these two hypotheses in mind, we designed the professional development program to include opportunities for teachers to deepen their own understanding of key concepts of the curriculum they teach, improve their knowledge of ways students understand the content, and learn about instructional strategies that can be used to maintain the mathematical richness of the problems they pose.

Our approach to teacher learning about instructional strategies centers on the analysis of videotaped lessons. We view teaching as a cyclical process that goes beyond what happens in the classroom to include planning and reflecting. Improvements in planning and reflection have great potential for improving teaching (Ball & Cohen, 1999; Hiebert, Gallimore, & Stigler, 2002). In particular, through the analysis of videotaped lessons, cultural routines can be brought to awareness, evaluated, and changed (Lewis & Tsuchida, 1997; Santagata et al., 2007; Stigler & Hiebert, 1999). This is also in agreement with research on teacher learning that finds that effective professional development must provide multiple opportunities for teachers to observe, analyze, and discuss classroom practice (Darling-Hammond & Sykes, 1999; Elmore, 2002; Garet, Porter, Desimone, Birman, & Yoon, 2001; Kennedy, 1999; Whitehurst, 2002).
Professional Development Structure and Content

The program consisted of three modules, each targeting a key content area of the California sixth-grade curriculum (i.e., fractions, ratios and proportions, and expressions and equations) and the respective core concepts. Teachers met face to face at the district office in groups of 8 to 10 led by one or both of two facilitators, each having a strong background in mathematics and several years of teaching experience. Each teacher was
provided with a laptop computer connected to the Internet. Video-based analyses were supported by Visibility, a multimedia platform developed by LessonLab. Each module was structured into three main folders: (a) Content Exploration, (b) Lesson Analysis, and (c) Link to Practice. Within each folder, numbered pages guided teachers through a series of video-based analysis tasks, each containing a few questions for them to answer. Teachers were asked to watch preselected video segments and type their responses to the analysis questions into a textbox. Some questions also required teachers to point the reader's attention to specific moments of the video. A feature of the software allowed them to click on a button that inserted in their text a time stamp corresponding to a moment of the video they had chosen to cite. Written responses were saved on a server accessible by both the facilitator during the professional development sessions and the researchers, who later were able to analyze them. Independent work at the computer was interspersed with whole-group discussions led by the facilitator, who sometimes projected participants' written responses onto a big screen. Teachers spent roughly half of the time working independently at their computers and half of the time working and discussing as a whole group.

After an initial introductory meeting, teachers met six times throughout the school year and spent 2 full days working on each module. Content Exploration and Lesson Analysis were each addressed for a full day, usually a week apart. A teaching window followed, after which teachers met at their school sites for 1 hour to share their teaching experiences as part of the Linking to Practice phase. Once a module was completed, they moved on to the next one. The modules were distributed across the school year so that teachers would participate in the professional development sessions on a particular topic area immediately before they were to teach it to their students. Finally, an end-of-year meeting concluded the professional development program. This meeting mainly involved the collection of research data. The following is a description of the content and structure of each folder:
Content Exploration

Content Exploration was aimed at deepening teachers' understanding of core mathematics concepts and was accomplished through a combination of written documents and video. Instead of watching a videotaped classroom lesson, here teachers were exposed through video to a mathematics-focused discussion among other teachers led by a mathematics educator. This had the dual purpose of (a) providing a dynamic setting for teachers to learn mathematics concepts that
would make the task more engaging than simply reading mathematics content documents and (b) creating an atmosphere in which teachers feel comfortable sharing doubts they themselves may have on mathematics concepts. In the videotaped discussion, in fact, the mathematics educator often highlighted inconsistencies and misconceptions in teachers’ mathematical ideas, making it easier for the participating teachers to share their own difficulties. Teachers participating in the professional development program watched selected segments of the videotaped discussion and posted online answers to questions aimed at fostering their conceptual understanding. Individual teachers’ responses were then shared in a group discussion led by a facilitator. Occasionally, concrete materials were provided to teachers for them to engage in the same activities in which the videotaped teachers engaged. A list of concepts that were targeted in the ratio and proportion module and of questions (organized in what we called “tasks”) that were posed to teachers online follows in Table 1. For brevity, I list there only the title of each question. For a few tasks, questions are presented in their entirety later in the summary of findings. Complete questions for each module can be requested from the author. The fraction and expression and equation modules followed the same structure.
Lesson Analysis

For each module, the second day of professional development was dedicated to Lesson Analysis. On this day, teachers began by solving a rich problem that made use of one or more of the core concepts studied the prior day. They then studied a lesson plan that incorporated the rich problem and watched the video of the lesson in which the rich problem was taught. This videotaped lesson provided teachers with a model for engaging students in conceptual thinking. Lessons were filmed in the teachers' district; thus, the students portrayed in the video were from the same population as the participating teachers' students. During this phase, teachers answered a series of questions aimed at the analysis of students' learning and understanding as evidenced in the video and in samples of students' work. At the end of the second day, teachers discussed ways the lesson plan could be improved and proposed modifications before going back to their classrooms to teach the resulting lesson to their students. Table 2 presents the problem taught in the lesson videotaped for the ratio and proportion module and the titles of the questions that guided teachers' analysis. Similar questions were asked in the other two modules. As
Table 1
Content Exploration–Ratio and Proportion Module

Target ratio and proportion core concepts: Ratio as comparison by division; constant ratios/constant multiples; proportional relationships.

Task 1: Exploring ratio and proportion
Brainstorm about ratio and proportion
Comment on what teachers in videotaped professional development session say about the definition of ratio and proportion

Task 2: Ratios as comparisons
Give examples of real-life situations involving ratios
Analyze example provided by teachers in the videotaped professional development session

Task 3: Examining equal ratios
Solve a problem about finding a missing value
Analyze solution methods provided by teachers in the videotaped professional development session
Explain a common student mistake with this kind of problem

Task 4: Solving proportions
1. Discuss the cross-multiplying procedure for solving proportions

Task 5: Issues in teaching
1. Reflect on problems on proportional reasoning without numbers
2. What do students gain from activities such as these?

Task 6: Ratio tables
1. Describe understanding students need to set up and complete a ratio table
2. Describe how you might help students use the properties of cross-products in proportions while continuing to develop proportional reasoning
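Task 4 in Table 1 asks teachers to discuss the cross-multiplying procedure conceptually rather than as a rote rule. As one possible illustration (mine, not drawn from the program materials), cross-multiplication can be derived from the meaning of equivalent ratios by multiplying both sides of a proportion by the product of the denominators:

\[
\frac{a}{b} = \frac{c}{d}, \quad b, d \neq 0
\;\;\Longrightarrow\;\;
bd \cdot \frac{a}{b} = bd \cdot \frac{c}{d}
\;\;\Longrightarrow\;\;
ad = bc .
\]

Seen this way, "cross-multiplying" is shorthand for an operation that preserves equality, which is the kind of conceptual underpinning the task invites teachers to articulate.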
Table 2
Lesson Analysis–Ratio and Proportion Module

Focus problem in videotaped lesson: Your class has been asked to organize a project called "Holding Hands Across L.A." You will need to have people line up and hold hands from Downtown to Long Beach (a distance of 87,000 ft.). Your job is to determine how many people you will need to form the line.

Task 1: Analyze the main problem
Solve the problem
Predict students' strategies
Describe prerequisite knowledge and skills
Describe opportunities for student growth or misunderstanding

Task 2: Lesson plan and lesson goals
Make hypotheses on how the lesson will help students achieve the goals
Predict students' responses
Collect evidence of students' learning

Task 3: Verify hypotheses and predictions
Analysis of classroom video
Analysis of student work
Evidence of student understanding
Analyze students' opportunities (and missed opportunities) to deepen understanding of core concepts

Task 4: Lesson improvement
Discuss ways lesson can be improved and adapted to the specific needs of each teacher's classroom
for the content exploration tasks, full text of sample questions is presented later and the full version of the questions can be requested from the author.
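To make concrete the proportional reasoning the Table 2 focus problem targets, here is a brief worked sketch. It is my illustration, not part of the lesson materials, and it assumes, hypothetically, that each person spans about 5 feet of the line with arms extended:

\[
\frac{1 \text{ person}}{5 \text{ ft}} = \frac{x \text{ people}}{87{,}000 \text{ ft}}
\;\;\Longrightarrow\;\;
x = \frac{87{,}000}{5} = 17{,}400 \text{ people}.
\]

A ratio table scaling 1 person : 5 ft up to 10 : 50, 100 : 500, and so on reaches the same result and mirrors the less formal strategies the lesson anticipates from students.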
Link to Practice

During this phase, teachers taught the lesson they had analyzed and participated in a facilitator-led 1-hr. meeting at their school sites. This phase was aimed at facilitating the
application of what was learned during the professional development sessions to teachers’ daily practices. At the meeting, teachers were asked to share with their colleagues samples of student work from the lesson they had taught. They were told to select work that represented a range of student performance. Each person took a turn and summarized his or her experience teaching the lesson and shared student work to discuss both aspects of the lesson that went as planned and aspects that did not.
Participating Schools

Sixth-grade teachers from five low-performing Title 1 (i.e., having a poverty rate of 50% or higher), inner-city middle schools participated in the professional development program. Two of the schools followed a regular September to June calendar. The remaining schools followed a year-round calendar, with three or four groups of students rotating throughout the calendar year. All five schools were very large in size, accommodating on average 2,270 students in grades 6 through 8. Each school included approximately 20 sixth-grade classes, and most sixth-grade teachers taught mathematics and another subject matter (usually science) to three groups of students. The student population was predominantly Hispanic (from 62% to 95%) and Black (from 18% to 38%). Between 30% and 47% of the students at each school were English-language learners. Student mathematics achievement at these schools, as measured by standardized tests, was among the lowest in the state, with only 6.2% of students on average reaching a proficient level.

The program was implemented for 2 consecutive years and was made mandatory by the district. During the 1st year, a randomly selected half of all sixth-grade teachers at these five schools attended the professional development. The remaining teachers were included in a no-treatment control group. During the 2nd year, all teachers participated in the program. Further details about the experimental design are included in a separate article (Santagata et al., 2008). The data presented in this article are based on the experience of 33 teachers who participated during the 1st year.

Now that I have delineated the context in which the professional development program was implemented, I describe the methodology I used to analyze the difficulties teachers encountered during the 1st year of implementation.
Method

Data Sources

The main source of data for the findings presented here is teachers' written responses to all video-based online tasks included in the three professional development modules. These amounted to responses to 72 questions. Responses were downloaded into Word documents to facilitate analyses. Other data sources were also available. As mentioned, the effectiveness of the professional development program was studied in the context of an experimental study. As part of that study, quantitative measures of
teacher and student learning were collected along with measures of teaching practices and program implementation (Santagata et al., 2008). In this section, I describe in detail only the measures that are relevant to the analyses presented in this article. Teacher-specialized content and pedagogical content knowledge was measured through a presurvey and postsurvey administered prior to and at the completion of each year of the professional development program. The survey consisted of 38 multiple-choice item stems, which resulted in a total of 52 item scores, organized into three subscales, each targeting one of the three focus content areas. The majority of items were drawn from an item bank developed by Deborah Ball, Heather Hill, and colleagues at the University of Michigan as part of the Learning Mathematics for Teaching project (Hill, Schilling, & Ball, 2004). Other items were taken from assessments developed for the Mathematics Professional Development Institutes, a program designed to improve mathematics instruction in California under the direction of the University of California Office of the President (Hill & Ball, 2004). Finally, a small subset of items was developed directly for this project. In addition the following data sources were available: (a) field notes from professional development sessions and classroom visits collected by the professional development facilitators; (b) field notes from meetings with school and district staff collected by project team members; and (c) memos addressed to district staff and school administrators summarizing project progress. Field notes from the professional development sessions were structured into four sections: (a) issues that arose during the meetings, such as activities that did not go as planned; (b) initial suggestions for solutions to difficulties encountered; (c) overall impressions and reflections, including aspects of the professional development sessions that were effective and went as planned; and (d) reflections on individual participants, including both behaviors that showed participation and understanding and behaviors that showed difficulties or disinterest in the content of the professional development. This structure allowed the project team to use the information collected to both supplement quantitative measures of teacher learning and study the program implementation with a focus on improvements for the 2nd year of implementation. Fieldwork from classroom visits captured the content of the lesson being taught and the extent to which teachers maintained the complexity of the mathematics they presented to students and incorporated teaching strategies discussed during the professional development sessions. At meetings with school and district staff, project team members shared aspects of the professional development program that had been implemented
with success and difficulties they had encountered. Field notes from these meetings summarized both professional development program successes and difficulties and reported on evidence (or lack thereof) of support provided by each school and the district to participating teachers.
Data Analyses

As described, each professional development module was designed following the same structure and included similar tasks and questions. This structure guided the analyses of teachers' written responses to the video-based questions. Teachers' responses to Content Exploration task questions across the three modules were reviewed first; then responses to Lesson Analysis task questions across the three modules were reviewed. A three-step process was used to identify the most common difficulties teachers encountered. First, teachers' responses that were not appropriate (i.e., did not answer the question) or revealed difficulties (examples of these are provided later) were marked. Second, when at least two thirds of teachers had difficulties with a particular question, the question was included in a list of problematic questions. Third, the list of problematic questions was reviewed and questions were grouped into categories based on the kind of analyses teachers were required to complete and the knowledge and skills necessary to do so in effective ways.

Memos and field notes from professional development sessions, classroom lessons, and meetings were finally reviewed in search of confirming and disconfirming evidence of teachers' difficulties that emerged from the review of teachers' responses. As mentioned earlier, field notes always included both comments on aspects of the professional development that had been successfully implemented and aspects that had not. These were used as disconfirming and confirming evidence. Findings presented here prompted the changes that were made to professional development tasks in the 2nd year of implementation.
Findings

Overall, there were many aspects of the professional development meetings that worked well: teachers were engaged in discussions; they completed planned activities; and their attendance rate was high. Analyses of teachers' written responses to professional development questions, however, revealed three sets of difficulties that teachers encountered. These were confirmed by analyses of memos and field notes. The three difficulties were common across the three modules and can be grouped into the following
three categories based on the types of knowledge and analysis skills required to answer the questions: (a) difficulties with questions that relied on teachers' basic conceptual understanding of the target mathematical concepts, (b) difficulties with questions that built on teacher knowledge of their students' understanding, and (c) difficulties with questions that requested analyses of student work and reasoning that go beyond classifications into right and wrong answers. In the following sections, I describe each difficulty by providing samples of teachers' answers to video-based questions and quotes from field notes. In choosing examples of teachers' answers, I included a variety of responses to illustrate ways different answers underlined common difficulties. To keep the presentation of the findings within the space limit of a journal article, I also chose examples based on the brevity of the responses.

Difficulties With Questions That Relied on Teachers' Basic Conceptual Understanding of the Target Mathematical Concepts

Although we expected teachers to come to the professional development not fully mastering the conceptual basis of the topics we targeted, most of the content-focused questions we designed assumed a basic understanding of those topics. We were surprised to observe that many of the participating teachers found those questions difficult. One of the facilitators commented, "A lack of content knowledge is evident in teachers' responses. Some are very candid about this, mentioning they just didn't know, or had never thought about it." She continued by providing a few examples of concepts teachers had difficulty with. The first example she provided was "Rational vs. irrational number discussion: 1/3 is rational, but when it is changed to a repeating decimal, it becomes irrational." Another example read, "Participants could not come up with a variety of ways to compare fractions, even if one fraction was more than half, and one was less than half. Everyone seemed to be content with using common denominators exclusively." Similar observations were included in researchers' field notes, and the lack of mathematics conceptual understanding was summarized in memos written for meetings with school and district staff.

This finding was supported by the multiple-choice survey used to measure teacher content and pedagogical content knowledge prior to and at completion of each year of the program. As mentioned in the Method section, this survey largely consisted of questions drawn from an item bank developed in the context of the
Learning Mathematics for Teaching project. For three of these items, data from a national sample of middle school teachers were available from a study conducted by Heather Hill (2007). Teachers in our sample scored lower on the survey administered prior to their participation in the professional development program than those in the larger sample: Twenty-nine percent of them answered the three items correctly compared to 47% of teachers in the national sample.
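To clarify the mathematics behind the two misconceptions the facilitator cited (my illustration, not part of the program materials): a repeating decimal is still a rational number, and fractions can be compared against a benchmark such as one half without finding common denominators.

\[
\frac{1}{3} = 0.\overline{3} = \frac{3}{9} \quad\text{(still a ratio of integers, hence rational)};
\qquad
\frac{3}{8} < \frac{1}{2} < \frac{5}{9} \;\;\Longrightarrow\;\; \frac{3}{8} < \frac{5}{9}.
\]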
Difficulties With Questions That Built on Teacher Knowledge of Their Students' Understanding

During the content exploration days, teachers also found particularly challenging questions that required an analysis of their students' understanding and consideration of ways in which that could be facilitated. Following is one of these types of questions included in the expression and equations module (including the transcript of the video clips teachers were asked to watch) and a few examples of teachers' responses:

The following segments show Dr. B. posing a question, and participants explaining their solution methods. The participants used three methods: guess and check, write expressions, and write an equation. Which of these methods are your students likely to use on their own and which would you need to be prepared to present? If your class were to discuss this problem, what aspects of individual methods or the relation between methods would you want to highlight?

Following is the transcript of the video clips teachers were asked to watch (M indicates the math educator who led the videotaped professional development session, T a teacher in the videotaped session).

Clip 1
M: Can you tell me the number I'm thinking of when I multiply it by 2 and then add 3, I get the same answer as when I multiply it by 3 and then subtract 2? So I want everybody to try to figure out—do I have to read it—say it again? Everyone have it? All right, so first work it, see if you can come up with what number I'm thinking of and then once you've done it, talk to the people at your table and see, did they get the same one and how did they come up with it . . . How did we do it? Wait—see if you—see if you did it the same way. Okay?
T: (inaudible)
M: Well, we're talking about that, so (inaudible). . . . Does everybody have it?
T: Yeah.
M: All right, we'll take a minute. Okay. All right. So who'd like to share the method that was done at your table—yeah, what'd you do?
T: My method was, uh, guess and check. I started with 4 first and then I used 5.

Clip 2
M: You said, 2x plus 3 equals 3x minus 2.
T: Mm-hm.
M: Then what'd you do?
T: Okay, so, then I began to move around, um the—as I tell my students—the letters so for instance—2x—I'm gonna subtract 2x on both sides to get rid of—not get rid of—but to choose—to move the 2x so that it's on one side of the equation. So I subtract 2x from both sides and I receive—
M: You're gonna take 2x plus 3 and subtract 2x from that side.
T: Is equal to 3x.
M: 3x.
T: Minus 2.
M: Minus 2. Subtract 2x from that side. All right.
T: And so now I have 3 is equal to 3x—
M: Simplify this you get 3—
T: Right. Three is equal to 3x—I'm sorry, um—1x, uh, minus 2. Now, I add 2 on both sides, because I now want to move that negative 2 onto the other side of the equation sign. So I add 2, so I have 3 plus 2 is equal to x minus 2, plus 2. And then result is 5 is equal to x.
M: And what does that tell you?
T: That tells me that the answer set is—for x is 5.
M: Okay. Now, if you recognize that up here, say well, let's see, three equal sums and take away 2, it must be 5. Do you have to do this step? What do you think?
T: Depends on the teacher.
M: So, so, because—let's face it—part of school—a big part of school—is pleasing the teacher or doing it the teacher's way.
T: Or the test.
M: Or the test way. Exactly right. I mean—but the reality is, if you're trying to find out what's the solution set for this, the idea is when—all you're doing is making equivalent equations. You want to get to a point where you say, "Oh, I know what will make this true. 5. It's obvious." Well, someone may say, "Well, I can see 5 is gonna work here." All right. I mean, that's okay. I mean, I would think, too, that's okay, as long as you realize 5 because you know 5 minus 2 is 3. Alright. If, on the other hand, I'm teaching, I want you to show me ways to make something equivalent to get down to this that looks like a letter alongside of the number—then that's a different skill that
I’m teaching. Alright? Alright. So you know there’s only one solution to this equation, which means there’s only one solution to that equation. Implicit in the question (i.e., “Which of these methods are your students likely to use on their own, and which would you need to be prepared to present? If your class were to discuss this problem, what aspects of individual methods or the relation between methods would you want to highlight?”) is the fact that the least sophisticated strategy—that of guessing and checking—is a strategy that can be valued. Teachers should make explicit connections to more sophisticated strategies so students can move forward in the learning process by building on their initial understanding. Teachers’ answers missed this implicit suggestion. Some teachers admitted that they did not know what strategy their students would use. Most teachers agreed that students would try the guess-and-check strategy but could not think of instructional strategies that would assist students in learning more sophisticated solution methods. The following are a few examples of teachers’ answers: Teacher A: I have no idea what my students would be able to do with this information. Considerable prodding and coaching would be needed in order to complete these steps. Teacher B: I would expect my students to use a guessand-check system. I don’t understand the rest of the question. Not sure as to how to answer. Teacher C: I think that my students are more likely to guess and check. I would have to present them with the algebraic solution to the problem. Teacher D: Students will most likely use guess and check on their own. I have a few that might opt for the other two, but most of my class will only write expressions and solve and algebra equation when instructed and taught to do so. Teacher E: My students are more likely to guess and check or ask to work with partners. There is then a stampede to partner with the smartest students who are then pressed to give up their answers. Few of my students have the tenacity to sit and try to work this out for themselves. The use of expressions such as “most of my class will only write expressions and solve an algebra equation when instructed and taught to do so” and “present them with the algebraic solution” reflects teachers’ approaches to students’ reasoning. There is no mention of the possibility to build on what students already know to assist them in understanding more sophisticated ways of solving the problem.
Difficulty in Analyzing Student’ Work and Reasoning Beyond Classifications Into Right and Wrong Answers As described earlier, after engaging with the mathematical content, teachers spent a day working on the analysis of a videotaped lesson for each of the target topics. The portion of this analysis that teachers found most challenging was focused on student learning. The following sample task was part of the ratio and proportion module. Teachers were asked to watch selected segments of the classroom lesson and to review samples of students’ work. This task included three questions. The first question focused on the analysis of classroom video and asked teachers to assess the effectiveness of the lesson as it relates to students’ progress toward the learning goals. The same task structure was kept constant across the three modules: Analysis of classroom video: View the following video segments and verify whether the learning goals were achieved: Did the students understand that ratios are multiplicative comparisons? Did the students understand that proportions are equivalent ratios? Did students understand how to set up a proportion with a variable? Have students learned to use various strategies to solve problems involving proportions? Please cite evidence from the video (by marking specific moments of the video) to answer these questions.
Teachers’ responses to these questions were limited to superficial and global analysis of student learning. These analyses seldom referred to specific instances of the video. The following are a few examples of teachers’ answers: Teacher A: I think most of the students understood the multiplicative comparisons between ratios; however, I think that many still held misconceptions about adding. Teacher B: Based on the students that were participating and their responses, it seems that the students did learn all of the goals that were stated in the lesson plan. The only thing that I noticed is that there seemed to not be a wide variety of student involvement, which the teacher should have directed it more to other students not really participating. The students that went up to the board seemed to have various ways of solving the problem. A limited group seemed to understand the ideas that were trying to be taught. Teacher C: [It] looks like students understood the lesson through a variety of methods, but the most effective seemed to be the use of ratio tables.
Teacher D: I believe they were responding and were responding multiplicative relationships. The students were finding relationships that were proportions that were equivalent ratios.

Teacher E: The students were setting up equations.

The second set of questions within the same task focused on the analysis of student work. Teachers were asked to review the work that four students did on the questions that the teacher assigned at the end of the lesson and to answer the same four questions related to the lesson learning goals that were asked in the classroom video analysis. Teachers were also asked again to provide evidence—this time from the student work—for why they thought students reached or did not reach the learning goals. Teacher analyses of student work included similar problematic features. They tended to remain at the surface level and/or did not analyze each learning goal separately. Efforts to try to make sense of student solutions and infer their level of understanding were absent or minimal in most cases. The following are a few examples of teacher answers:

Teacher A: Most of the students were able to answer Question 1 but had difficulty with question 2. They did not set it up as 5/20 = 2700/x

Teacher B: Yes, they were able to figure out that they had to divide and then multiply.

Teacher C: Yes, they seem to figure out equivalent fractions by using the "horizontal chart."

Teacher D: Yes, they were able to set up a proportion with a variable using a question mark.

Teacher E: Yes, they seem to be able to use various ways to solve this problem.

Teacher F: The work that I reviewed demonstrated that the students not only had a good understanding of the problem, but they also have many math skills that my students are lacking. They were able to demonstrate the concept of using their prior knowledge of how many students per feet and apply it correctly to get how many students per 5,200 feet. Constant.

Teacher G: It seems, by the examples in the hard copies of the students work, that they did understand how to solve and set up problems and to use various strategies to solve them—one student used a proportion table, one did equivalent fractions, and another solved for x or the variable.

The third question provided another opportunity for teachers to cite evidence of student understanding from
the classroom video and the student work. This question asked them to compare the two sources of evidence and discuss in what ways they pointed to the same or different degrees of student understanding. This question also went a step further and asked teachers to suggest what the videotaped teacher should do next. The following are a few sample answers:

Teacher A: The students understood that there was a relationship between ratios. From the video, it seemed like the kids understood the concepts. From the students' work, however, students had some difficulty.

Teacher B: Teaching is done in questioning mode. Students give appropriate responses time after time to pointed questions by teacher on proportions, ratios, and equivalent ratios.

Teacher C: Students are able to display the skills to explain how they arrived at their answers. They are able to interpret their situation very well. Just like when they are in trouble, they are recalling step by step what happened.

Teacher D: I see it in the notes that various students took that they are capable of understanding the concept of ratios and proportions and also in the video when various students solve the problems using their methods and how they all came up with the same answers but used different methods.

As noted earlier, teachers' comparison of the two sources of evidence remained rather superficial. Although teachers were explicitly asked to suggest next steps, often their answers did not address that part of the question. Overall, the difficulties that teachers found in analyzing students' solutions and in understanding students' trajectories in the learning process are problematic. We would expect teachers to build on knowledge of their students' thinking to plan instruction. In addition, an analysis of students' thinking that is superficial and limited to the categorization into right or wrong answers reinforces the assumption that students in low-performing schools are not capable of reasoning mathematically. Teachers who can place student performance on a positive trajectory will be more likely to value their students' answers because they will see them not as failures but rather as indicative of understanding not yet fully developed.

Teachers' low expectations for students often emerged during the discussion of the videotaped lessons. Teachers commented, "These kids aren't ready for this lesson yet" and "My students can't multiply." One teacher told the facilitator that she did not understand urban kids and that
with them you need to go in “baby steps.” Most teachers felt that they needed to provide more preteaching before completing one of our lessons (although the lesson provided was deliberately created to introduce a concept) and many did so in the “Link to Practice” phase of the professional development program.
Year 2 Improvements

To address the difficulties teachers encountered, we introduced four changes in the modules for the second year of implementation: (a) increased specificity of content-related questions, (b) focus on common students' misconceptions, (c) refinement of facilitators' planning and variation in professional development discourse structure, and (d) increased guidance in the analysis of student thinking. I now describe each change in detail.

Increased Specificity of Content-Related Questions

Teacher difficulties with questions that relied on basic conceptual understanding of target mathematics topics were addressed by increasing the specificity of the content-focused questions asked in the online tasks. During the 1st year of implementation, we learned a lot about teachers' difficulties with particular mathematical ideas. This knowledge we developed about our own learners informed the design of the questions in Year 2. These became more targeted and often made explicit connections to other related ideas and concepts to foster teachers' conceptual understanding. An example from the ratio and proportion module illustrates these changes. At the beginning of the module, teachers were asked to revisit the concepts of ratio and proportion. In Year 1, two broad questions elicited teachers' knowledge of these topics:

1. Dr. B. [the instructor in the videotaped professional development session] asks participants to write down what they think of when they hear the terms ratio and proportion. Watch this video clip to see him pose this initial question to the participants. In the space below, write what you think of regarding (a) ratio and (b) proportion. You may use definitions, examples, word associations, etc.

2. In the professional development session, Dr. B. asks the participants to list what comes to mind when they hear the term ratio and what comes to mind when they hear the term proportion. Click the following link to watch the portion of the session in which participants share their responses. As you watch the video, click the "quick mark" button to mark each point you find especially interesting. This will create a time code link. Follow these steps [a series of specific instructions follow] to post a response explaining what you found interesting about at least two of the points you marked.
This introductory task was modified in Year 2 to include questions that expanded the definition of ratio to include the related concept of fraction and explicitly asked teachers to identify ratio- and proportion-related topics in the curriculum. Although the first question remained similar to Year 1, the subsequent questions were modified as follows:

2. A participant mentions fraction as something he thinks of when he thinks of the word ratio. Reference is made to fractions several times throughout the professional development session. Is there a difference between fractions and ratios? Use examples to clarify your response.

3. A goal of mathematics instruction is to present mathematics as a coherent and connected set of ideas. With this goal in mind, answer the following questions: In what other areas of the mathematics curriculum do ratios and proportions appear? How are these curriculum areas related to ratio and proportion?
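One illustration of the distinction the revised question 2 probes (my example, not taken from the program materials): in a class with 12 boys and 18 girls, the ratio of boys to girls is a part-to-part comparison, while the fraction of the class that is boys is a part-to-whole comparison built from the same counts.

\[
\text{boys} : \text{girls} = 12 : 18 = 2 : 3,
\qquad
\text{fraction of boys} = \frac{12}{12+18} = \frac{12}{30} = \frac{2}{5}.
\]

Every fraction can be read as a ratio of a part to a whole, but not every ratio is a fraction of a single whole, which is the kind of contrast the question invites teachers to articulate with examples.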
Focus on Common Students’ Misconceptions To assist teachers with the analysis of students’ understanding of mathematical ideas, we added questions specifically targeted at the analysis of common students’ misconceptions. In some cases, we provided samples of students’ thinking or solutions to problems and asked teachers to analyze them and to discuss how specific instructional strategies may help students overcome their misconceptions. In the expression and equation module for example, the following question was included: Below is an explanation given by a student of a step used in solving an equation using addition and subtraction: “if you had a positive number on one side of the equal sign, you switch to the other side, it’s gonna become negative, but it’s the same thing.” What does this student understand or not understand about equality and solving equations? What would a teacher say to the student or ask the student to clarify her thinking?
In other instances, we asked teachers to predict difficulties that students commonly have with specific math problems: Consider the difficulties students may have in solving “7 + 4 = x + 5. What value of x makes the equation true?” What do students need to know about equality to arrive at the correct answer? What are two incorrect answers students might give, and what misconceptions lead to those mistakes?
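A brief worked note on this item (my illustration; the specific wrong answers are hypotheses about common misconceptions, not data from the project): treating the equal sign as a relation between the two sides gives the correct value, whereas reading it as a "write the answer next" signal leads to predictable errors.

\[
7 + 4 = x + 5 \;\;\Longrightarrow\;\; 11 = x + 5 \;\;\Longrightarrow\;\; x = 6 .
\]

A student who interprets the equal sign operationally may answer x = 11 (the sum of 7 and 4), and one who chains all the numbers together may answer x = 16 (7 + 4 + 5).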
Refinement of Facilitators’ Planning and Variation of Professional Development Discourse Structure To address difficulties with both questions focused on content and on student thinking, we also introduced changes in the facilitators’ planning of each professional development session. The researchers and facilitators met during the summer prior to the 2nd year of implementation to identify the main ideas and concepts we wanted teachers to understand at the end of each discussion that followed the individual completion of an online task. This detailed outline of learning outcomes for the participating teachers assisted the facilitators in leading the discussion and in funneling teachers’ comments toward better understanding of the target mathematical concepts and of ways students can be assisted in developing more sophisticated understandings. Finally, an intermediate step was built in for some of the most difficult questions. Teachers were asked to discuss in pairs their individual answers before they shared them with the larger group. This was intended to facilitate understanding by providing opportunities to bounce back ideas with one another before the intervention of the facilitator in the larger group. Increased Guidance in the Analysis of Student Learning Teachers’ difficulties with the analysis of student thinking and learning were addressed by providing increased guidance and more targeted questions that focused on specific classroom instances. In the Year 1 modules, teachers were asked to browse through the lesson video and then assess the effectiveness of the lesson within a single task. The task required them to watch segments that portrayed students working on the main problem and to analyze samples of students’ written work. In Year 2, the lesson was broken down in smaller segments and several tasks guided teachers through the analysis of the entire lesson with more specific questions on students’ reasoning and teacher’s instructional decisions. Year 2 questions assisted teachers in focusing on particular aspects of students’ thinking and in analyzing teachers’ actions in terms of their effects on student learning. The following examples illustrate the main differences between Year 1 and Year 2 questions. In the Year 1 ratio and proportion module, the video-based question read as follows: View the following video segments and verify whether the stated learning goals were achieved. Did the students understand that ratios are multiplicative comparisons? Did the students understand that proportions are equivalent ratios? Did students understand how to set up and solve a
proportion with a variable? Have students learned to use various strategies to solve problems involving proportions? Please cite evidence from the video (by marking specific moments of the video) to answer these questions.
In Year 2, teachers were asked to comment on various phases of the videotaped lesson through a series of specific questions. The following are the first few questions included in the revised version of the module:

Comment on the beginning of the lesson: What information about student understanding did the teacher get by asking . . .
Describe the solutions generated by students.
What does the teacher do while students are discussing their solutions? What role does the teacher take in the presentation of ideas?
How does the classroom discussion promote the development of proportional reasoning?
The teacher has prepared counterexamples to test conjectures she anticipated students would make. How important do you think this portion of the class discussion is to student understanding? Would you have handled it differently? If so, explain what you would have done and why.
In addition, the analysis of samples of student work was expanded to include more examples and an activity that asked teachers to classify the work by the level of sophistication of mathematical reasoning evidenced in the students' solution strategies. This activity gave teachers the opportunity to learn to go beyond the surface when assessing students' performance and understanding.
Summary and Conclusion

In this article, I have described the theoretical framework, research base, structure, and content of a video-based professional development program that we implemented during 2 consecutive years with sixth-grade teachers from five low-performing middle schools. I then presented findings from qualitative analyses of teachers' responses to video-based tasks and of field notes and memos we collected during the 1st year of implementation. These analyses revealed three categories of difficulties teachers encountered with the questions we had designed: (a) difficulty with responding to questions that relied on their basic understanding of target mathematics topics, (b) difficulty with responding to questions that built on knowledge of their students' understanding, and (c) difficulty with the analysis of students' work and reasoning beyond classification into right and wrong answers. These difficulties largely stemmed from our lack of understanding of the nature of the participating teachers'
mathematical knowledge and of their abilities to analyze student mathematical thinking. As professional development designers and facilitators, we overestimated our teachers' abilities to analyze the teaching and learning process through a content lens. Although the field of mathematics education, drawing on Shulman's seminal work on pedagogical content knowledge (Shulman, 1986), has recently developed a sophisticated way to conceptualize the content knowledge that is effective for teaching (see, among others, Ball & Cohen, 1999; Hill, 2007), this knowledge looks quite different from the knowledge about teaching that many teachers in low-performing schools have had opportunities to acquire. In this article, I have documented the gap between a research-driven notion of teacher knowledge and the reality of the knowledge teachers in our sample possessed. An open question is whether this gap, perhaps to a lesser extent, also exists in higher-achieving contexts.

In the second part of the article, I shared our efforts in revising the professional development program to better assist teachers in the acquisition of that knowledge. This is only a small step toward identifying strategies for making content-focused, video-based professional development effective for this subset of the teacher population. To address teachers' difficulties, we made four main modifications to the video-based modules and their implementation during the second year: (a) content-focused questions became more specific, (b) questions focused on common students' misconceptions were introduced, (c) the facilitators' planning notes became more structured and the organization of teachers' interactions was modified to include some work in pairs, and (d) guidance in the analysis of student learning from the videotaped lesson was increased. Through concrete examples of questions posed in the 1st year and of ways those questions were modified for the 2nd year, I hope to have provided useful information to other researchers and professional development providers who believe video to be a useful tool for teacher learning.

The changes we made can be summarized in three principles for designing video-based professional development tasks for teachers: (a) attending to content-specific understanding, (b) scaffolding analysis of student thinking, and (c) modeling a discourse of inquiry and reflection on the teaching and learning process. These general principles can perhaps be used to frame the design of video-based professional development programs for all teachers. Researchers and professional development facilitators working with teachers in low-performing schools should expect their teachers to start at the lower end of the continuum and to need more guidance and more time before they can independently engage in the kind of inquiry and reflection on teaching that these video-based forms of professional development promote.
Assisting these teachers in improving their content and pedagogical content knowledge is not easy. The research summarized in the introduction suggests some reasons why: teachers' attitudes, beliefs, and characteristics may affect their learning from professional development, and the settings in which they work often do not provide the needed support. Despite these challenges, helping these teachers improve their content and pedagogical content knowledge is important for its impact not only on the quality of their mathematics instruction but also on their beliefs about their students' potential. Observing our facilitators teach a lesson to their students was sometimes an eye-opening experience for our teachers, who had never seen their students engage in conceptually rich discussions about mathematics. Content and pedagogical content knowledge allow teachers to recognize in their students' errors the beginnings of mathematical reasoning and great potential for conceptual learning.
References

Acheson, K. A., & Zigler, C. J. (1971). A comparison of two teacher training programs in higher cognitive questioning. San Francisco: Far West Laboratory for Educational Research and Development.
Allen, D. W., & Clark, R. J. (1967). Microteaching: Its rationale. The High School Journal, 51, 75-79.
Anyon, J. (1981). Social class and school knowledge. Curriculum Inquiry, 11(1), 3-42.
Apple, M. W. (2004). Ideology and curriculum. New York: Routledge.
Ball, D. L., & Cohen, D. K. (1999). Developing practice, developing practitioners: Toward a practice-based theory of professional education. In L. Darling-Hammond & G. Sykes (Eds.), Teaching as the learning profession (pp. 3-31). San Francisco: Jossey-Bass.
Bielaczyc, K., & Collins, A. (2007). Design research: Foundational perspectives, critical tensions, and arenas for action. In J. Campione, K. Metz, & A. M. Palincsar (Eds.), Children's learning in laboratory and classroom contexts: Essays in honor of Ann Brown (pp. 89-112). New York: Routledge.
Borko, H. (2007, April). Video as a research tool for studying instructional practice. Symposium conducted at the annual meeting of the American Educational Research Association, Chicago.
Borko, H., Jacobs, J., Eiteljorg, E., & Pittman, M. E. (2008). Videos as a tool for fostering productive discussions in mathematics professional development. Teaching and Teacher Education, 24(2), 417-436.
Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. The Journal of the Learning Sciences, 2, 141-178.
Clarke, D., & Hollingsworth, H. (2000). Seeing is understanding. Journal of Staff Development, 21(4), 40-43.
Darling-Hammond, L., & Sykes, G. (1999). Teaching as the learning profession: Handbook of policy and practice. San Francisco: Jossey-Bass.
Davies, N., & Walker, K. (2005, July). Learning to notice: One aspect of teachers' content knowledge in the numeracy classroom. Paper presented at the 28th annual conference of the Mathematics Education Research Group of Australasia, Melbourne, Australia.
Elmore, R. F. (2002). Building a new structure for school leadership. Washington, DC: Albert Shanker Institute.
Garet, M. S., Porter, A. C., Desimone, L., Birman, B. F., & Yoon, K. S. (2001). What makes professional development effective? Results from a national sample of teachers. American Educational Research Journal, 38(4), 915-945.
Gutstein, E. (2003). Teaching and learning mathematics for social justice in an urban, Latino school. Journal for Research in Mathematics Education, 34(1), 37-73.
Hiebert, J., Gallimore, R., Garnier, H., Givvin, K. B., Hollingsworth, H., Jacobs, J., et al. (2003). Teaching mathematics in seven countries: Results from the TIMSS 1999 Video Study (National Center for Education Statistics No. 2003-013). Washington, DC: U.S. Department of Education.
Hiebert, J., Gallimore, R., & Stigler, J. W. (2002). A knowledge base for the teaching profession: What would it look like and how can we get one? Educational Researcher, 31(5), 3-15.
Hill, H. C., & Ball, D. L. (2004). Learning mathematics for teaching: Results from California's Mathematics Professional Development Institutes. Journal for Research in Mathematics Education, 35(5), 330-351.
Hill, H. C., Schilling, S. G., & Ball, D. L. (2004). Developing measures of teachers' mathematics knowledge for teaching. The Elementary School Journal, 105(1), 11-30.
Hill, H. C. (2007). Mathematical knowledge of middle school teachers: Implications for the No Child Left Behind policy initiative. Educational Evaluation and Policy Analysis, 29(2), 95-114.
Jacob, V., Clement Lamb, L., Philipp, R., Schappelle, B., & Burke, A. (2007, April). Professional noticing by elementary school teachers of mathematics. Paper presented at the American Educational Research Association annual meeting, Chicago.
Kennedy, M. (1999). Form and substance in mathematics and science professional development. National Institute for Science Education Brief, 3(2). Madison: University of Wisconsin.
Koellner, K., Jacobs, J., Borko, H., Schneider, C., Pittman, M. E., Eiteljorg, E., et al. (2007). The problem solving cycle: A model to support the development of teachers' professional knowledge. Mathematical Thinking and Learning, 9(3), 273-303.
Lampert, M., & Ball, D. L. (1998). Teaching, multimedia, and mathematics: Investigations of real practice. Journal of Mathematics Teacher Education, 2(3), 311-319.
Lankford, H., Loeb, S., & Wyckoff, J. (2002). Teacher sorting and the plight of urban schools: A descriptive analysis. Educational Evaluation and Policy Analysis, 24, 37-62.
Loeb, S., Darling-Hammond, L., & Luczak, J. (2005). How teaching conditions predict teacher turnover in California schools. Peabody Journal of Education, 80(3), 44-70.
LeFevre, D. M. (2004). Designing for teacher learning: Video-based curriculum design. In J. Brophy (Ed.), Advances in research on teaching: Using video in teacher education (Vol. 10, pp. 235-258). Oxford, UK: Elsevier.
Lewis, C., & Tsuchida, I. (1997). Planned educational change in Japan: The shift to student-centered elementary science. Journal of Education Policy, 12(5), 313-331.
Limbacher, P. C. (1971, February). A study of the effects of microteaching experiences upon the classroom behavior of social studies student teachers. Paper presented at the American Educational Research Association annual meeting, New York.
Oakes, J., Joseph, R., & Muir, K. (2003). Access and achievement in mathematics and science: Inequalities that endure and change. In J. A. Banks & C. A. M. Banks (Eds.), Handbook of research on multicultural education (pp. 69-90). San Francisco: Jossey-Bass.
Santagata, R., Gallimore, R., & Stigler, J. W. (2005). The use of videos for teacher education and professional development: Past experiences and future directions. In C. Vrasidas & G. V. Glass (Eds.), Current perspectives on applied information technologies: Preparing teachers to teach with technology (Vol. 2, pp. 151-167). Greenwich, CT: Information Age Publishing.
Santagata, R., Kersting, N., Givvin, K., & Stigler, J. W. Rich problems as a lever for change: An experimental study of the effects of a professional development program on students' mathematics learning. Manuscript in preparation.
Santagata, R., Zannoni, C., & Stigler, J. (2007). The role of lesson analysis in pre-service teacher education: An empirical investigation of teacher learning from a virtual video-based field experience. Journal of Mathematics Teacher Education, 10(2), 123-140.
Seago, N., Mumme, J., & Branca, N. (2004). Learning and teaching linear functions: Video cases for mathematics professional development. Portsmouth, NH: Heinemann.
Sherin, M. G. (2004). New perspectives on the role of video in teacher education. In J. Brophy (Ed.), Advances in research on teaching: Using video in teacher education (Vol. 10, pp. 1-27). Oxford, UK: Elsevier.
Sherin, M. G., & Han, S. Y. (2004). Teacher learning in the context of a video club. Teaching and Teacher Education, 20, 163-183.
Sherin, M. G., & van Es, E. A. (2005). Using video to support teachers' ability to notice classroom interactions. Journal of Technology and Teacher Education, 13(3), 475-491.
Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15(2), 4-14.
Spencer, J. (2006). Balancing the equation: African American students' opportunities to learn mathematics with understanding in two central city middle schools. Unpublished doctoral dissertation, University of California, Los Angeles.
Stigler, J. W., & Hiebert, J. (1999). The teaching gap: Best ideas from the world's teachers for improving education in the classroom. New York: Free Press.
van Es, E. A., & Sherin, M. G. (2002). Learning to notice: Scaffolding new teachers' interpretations of classroom interactions. Journal of Technology and Teacher Education, 10(4), 571-596.
van Es, E. A., & Sherin, M. G. (2008). Mathematics teachers' "learning to notice" in the context of a video club. Teaching and Teacher Education, 24(2), 244-276.
Ward, B. E. (1970). A survey of microteaching in NCATE-accredited secondary education programs (Research and Development Memorandum). Stanford, CA: Stanford Center for Research and Development in Teaching.
Whitehurst, G. J. (2002, October). Evidence-based education. Paper presented at the Student Achievement and School Accountability Conference, Washington, DC.
Rossella Santagata, PhD, is an assistant professor in the Department of Education at the University of California, Irvine. Her research focuses on the design and study of professional development experiences for mathematics teachers at both pre-service and in-service levels. Specifically, she is interested in investigating ways in which digital video and multimedia technologies can be used to engage teachers in learning about effective practices.