AllThingsPLC Magazine

Winter 2018

SEEING THE WHOLE TREE



Features

Collective Commitments, by Jacqueline Heller (p. 10): The backbone in the anatomy of a lesson.

Learning Champion (p. 18): Matt Devan asks, "What's your championship?"

Data: More Than a Number, by Chris Jakicic (p. 24): Finding a deeper meaning in the jumble.

Heritage Middle School, by Scott Carr (p. 32): A look into a community of dedicated learners.


To o l s & R e s o u rc e s fo r I n s p i ra t i o n a n d E xce l l e n ce

First thing

4

Finding consensus inside the noise

ICYMI

7

Short bits you might have missed

FAQs about PLCs

9

Combating the anxiety of sharing data

Data Quest

16

Turning data into information

Words Matter

23

Essential standards vs. learning targets

PLC Clinic

39

Collaborative teams working through the struggle

Skill Shop

40

What is it we expect our students to learn?

The Recommender

43

Leadership resources to PLC by

Classic R&D

44

Teacher and student behavior in classrooms

Contemporary R&D

46

Improving classroom instruction

Why I Love PLCs Interdependency is crucial

48


AllThingsPLC Magazine

SOLUTION TREE
CEO: Jeffrey C. Jones
President: Edmund M. Ackerman

SOLUTION TREE PRESS
President & Publisher: Douglas M. Rife
Art Director: Rian Anderson
Page Designers: Abigail Bowen, Laura Cox, Rian Anderson

AllThingsPLC (ISSN 2476-2571 [print], 2476-258X [online]) is published four times a year by Solution Tree Press, 555 North Morton Street, Bloomington, IN 47404. 800.733.6786 (toll free) / 812.336.7700. Fax: 812.336.7790. Email: info@SolutionTree.com. SolutionTree.com

POSTMASTER: Send address changes to Solution Tree, 555 North Morton Street, Bloomington, IN 47404.

Copyright © 2017 by Solution Tree Press

First Thing
Finding Consensus Inside the Noise

In the fall 2017 issue of the AllThingsPLC Magazine, Becky DuFour used this column to challenge us, as professionals, "to answer why before what and how." Becky's comments reminded me of a conversation I had with a Texas school district assistant superintendent as the 2017–18 school year was about to begin. She asked me to focus my comments to her staff on reasons why to pursue the PLC life. I knew one of my reasons why was an intense desire to reduce the noisiness of our professional work.

Not surprisingly, the education of students is considered one of several "noisy" professions, according to Princeton University's Daniel Kahneman and his colleagues. They state, "The problem is that humans are unreliable decision makers; their judgments are strongly influenced by irrelevant factors, such as their current mood, the time since their last meal, and the weather. The unavoidable conclusion is that professionals often make decisions that deviate significantly from those of their peers, from their own prior decisions, and from rules they themselves claim to follow." They further suggest we conduct a noise audit for our teams by identifying differences and inconsistencies in our daily practices, diagnosing why we have so much noise (variant judgment) on issues of student learning, and then acting to harmonize the noise.

Are there any areas of wide variance your collaborative team needs to work on this week or in this unit? Any places of potential student inequity you might be causing by not discussing, sharing, and reaching consensus on certain routines with all members of your team? Think about your routines for homework and makeup work, for assessments and the scoring of student work, for student engagement and lesson design, for understanding standards and the cognitive demand or rigor of student work, and for the nature of your interventions—just to name a few. The decision-making list is endless in our profession. What are those areas of your professional work decisions and judgments that need to be revealed to your team members, and how might you approach your team to reduce the inequities caused by your team noise as you focus on a greater transparency with one another and move toward a more magnified impact on student learning?

From a practical point of view, reducing your noise starts by making sure your team members understand an operational definition of consensus: everyone's voice on the team is heard, and the will of the group prevails. I would add to this definition, however, that the voice of each individual needs to be an informed voice and not just an opinion. As I told my faculty teams many times over the years, your voice needs to support practices that have evidence bigger than your opinion and support teacher team actions known to increase student learning.

Our choice to reduce the noisy nature of our work should always be an informed judgment and decision. As my wise colleague Aaron Hansen once told me, "There is a sense of power and freedom that comes when members of your team achieve student learning goals they couldn't have reached on their own." Reduce your team noise today. Experience your increased power on student learning tomorrow.

—Timothy Kanold

Reference
Kahneman, D., Rosenfield, A. M., Gandhi, L., & Blaser, T. (2016). Noise: How to overcome the high, hidden cost of inconsistent decision making. Harvard Business Review, 94(10), 38–46. Retrieved from https://hbr.org/2016/10/noise


Educators are raving about Timothy Kanold's newest book, HEART!

"This book is a tremendous book study and tool for the PLC toolbox." —Scott Carr, middle-level reconfiguration coordinator, Liberty Public Schools, Missouri

"HEART! is elegantly courageous. HEART! transitions the reader from book, to journal, to professional diary. Clever." —Bill Barnes, director of curricular programs, Howard County Public School System, Maryland

"HEART! is Tim's best work ever." —Matthew R. Larson, K–12 mathematics curriculum specialist, Lincoln Public Schools, Nebraska; president, National Council of Teachers of Mathematics

"It will reshape a toxic culture and help teachers and leaders to look inward at their impact on that culture." —Sarah Schuhl, consultant specializing in PLCs, assessment, school improvement, and mathematics

"Reading the book felt as though I was engaging the author in a casual but engrossing classroom discussion." —Suzi Mast, director of K–12 mathematics standards, Arizona Department of Education

"Tim's rawness and authenticity tugs at the emotions of the reader." —Nathan Lang, presenter, speaker, and writer

LEARN MORE: SolutionTree.com/HEART


ONLINE COURSES

Professional learning at your fingertips: get online PD from authors you know and trust. With coursework available 24/7, our self-paced online courses are a great way to learn from leading authorities in the field of education, stay on top of changing standards, and develop in-depth understanding of techniques that make a lasting difference in the classroom.

Through the PLC at Work™ Process Today course, you’ll explore how educators are applying research-based strategies and protocols to transform their schools into high-performing PLCs.

Earn graduate credit or CEUs while gaining instant access to dynamic PD

ENROLL TODAY

SolutionTree.com/Courses

Solution Tree


ICYMI

Students in Handcuffs
A video showing an eight-year-old boy handcuffed above the elbows went viral following its release by the American Civil Liberties Union. A school resource officer in Kentucky restrained two children (ages eight and nine) in such a manner because the handcuffs would have slipped off their wrists. A federal judge ruled that such a method of restraint was "unreasonable and constituted excessive force as a matter of law."
Read more: "Federal Judge Rules Handcuffing Little Kids Above Their Elbows Is Unconstitutional," by Ryan J. Reilly. Huffington Post. www.huffingtonpost.com/entry/handcuffs-little-kids-unconstitutional_us_59e127fce4b0a52aca1809ad?section=us_education

Comfort Dogs in the Classroom
New York City's Department of Education has decided to expand a program that sends therapy dogs into schools to help ease anxieties and agitations and provide support for a number of issues from bereavement to social and emotional learning. The program was piloted last year and considered a success with both students and staff. "New York City's Comfort Dog program is based on the Mutt-i-grees curriculum, which was developed by Yale researchers in partnership with North Shore Animal League America and aims to integrate rescue dogs into classroom lessons on empathy, resilience and conflict resolution."
Read more: "New York City Students Need Comfort? In Trots Petey the Shih Tzu," by Gabriella Borter. Reuters. www.reuters.com/article/us-new-york-education/new-york-city-students-need-comfort-in-trots-petey-the-shih-tzu-idUSKBN1AR0V4

To Kill a Mockingbird Removed From Curriculum
Once again, Harper Lee's To Kill a Mockingbird has been banned, this time from eighth-grade classrooms in Biloxi, Mississippi. The reason stated is that some of the language "makes people uncomfortable." Despite the book's usage of the N-word, many teachers are using the book in their classrooms specifically to spark critical thinking and lively discussions. Where do you stand on the issue?
Read more: "Despite 'Discomfort,' Many Teachers Still Teach To Kill a Mockingbird. Here's Why," by Madeline Will. Education Week. http://blogs.edweek.org/teachers/teaching_now/2017/10/amid_debates_about_to_kill_a_mockingbirds_place_in_classrooms_teachers.html

A Digital Divide Documentary
The digital divide continues to widen despite the goal of getting all schools online by 2020. "Nearly one in four school districts still does not have sufficient bandwidth to meet the digital learning needs of students. And even before bandwidth, plenty of schools don't have the laptops or tablets that students need to get online." Filmmaker Rory Kennedy explores this issue in the documentary Without a Net: The Digital Divide in America.
Read more: "New Documentary Explores the Digital Divide," by Hechinger Report. U.S. News & World Report. www.usnews.com/news/education-news/articles/2017-09-19/new-documentary-explores-the-digital-divide



Data: More Than a Number
Chris Jakicic

I can remember early in my teaching career getting back achievement test data for my students once each year and challenging my memory about how to use stanines, percentiles, and scaled scores to interpret my students' scores. I wanted to be able to use the scores to improve my teaching, but I can honestly say that while I enjoyed the analysis activity, very little changed in my classroom as a result. I used this information to know whether individual students were learning at grade level, but not much else was revealed about their learning. Fortunately, now when I work with teachers regarding assessment, I find that they are able to use their data in a much different way. They can identify specifically what students have or have not yet learned, and they are able to detect misconceptions or misunderstandings about individual student learning. A lot has changed about the kind of data we collect for our students in order to make this happen.

What Changed?

The first thing that changed was understanding what our mission or purpose is. Schools and districts that operate as PLCs understand that the focus must be on whether students have learned, not on whether the content was taught. We recognize that not every student learns the same way or at the same rate and accept that it's our responsibility to do whatever it takes to ensure that all students learn at high levels. Collaboratively identifying essential standards and writing common formative assessments around these standards are the practices that teams use to successfully teach all students.

The first of these practices—identifying essential standards—is linked to the first critical question answered by collaborative teams that identify as part of a PLC: What do we want students to know and do? (DuFour, DuFour, Eaker, Many, & Mattos, 2016). When teams agree on their essential standards, they are creating a guaranteed and viable curriculum. A guaranteed curriculum ensures that all students will learn a common set of standards. The term viable means that teams will be able to teach, assess, and provide response time for these standards. Teams typically do not have the time to accomplish all of these steps on all of their standards in one school year.

Consider third-grade English language arts (ELA) teachers who teach the Common Core standards. They have 90 standards to teach in a school year with approximately 180 days—about two days per standard. Most of us recognize that this is an impossible task if we expect all students to learn these standards. While most teachers recognize they have too much to teach, some find it hard to make a purposeful decision to identify and treat a subset of standards differently. Teams make sure that all standards have a place in their pacing guides but allow more instructional time for the essential standards—enough so that they can teach, assess, and respond to the assessment. They still teach all of the standards.


However, when given the following scenario, it becomes clearer how this works. Consider a kindergarten team that is asked to teach the following two concepts in their ELA program (NGA & CCSSO, 2010):

1. With prompting and support, name the author and illustrator of a story and define the role of each in telling a story.
2. Blend and segment onsets and rimes of single-syllable spoken words (being able to hear the sounds that letters make in words).

When asked which of the two is essential, kindergarten teachers easily see the second standard as the most important one. They still teach the first; in fact, they likely teach it multiple times as they are reading stories aloud to students. The difference for collaborative teams in a PLC is that they have agreed to create a common formative assessment for and use intervention time to reteach the second standard (an essential standard) to make sure students are proficient with it. However, the team won't assess or respond to the first standard.

Because the teachers agree collectively to this work, the students are similarly prepared as they move to the next grade or course, and teachers don't feel they have to spend as much time on review the following year. When teachers have clearly identified what all students must know in their grade level or course, and when they know what all students will know as they enter their grade level or course, they find that they actually have the time they need for ensuring learning.

The second thing that significantly changed the way we use data was understanding the purposes of different kinds of assessment data and making sure we use the right kind of assessment for the right purpose. In a game-changing article, Black and Wiliam (1998) published their research on the effect of using formative assessment in classrooms to respond during the teaching process when some students hadn't yet learned. Their research showed that student achievement increased significantly (0.9 standard deviation) when teachers used formative assessment to guide their instruction. Formative assessments are short and frequent, occur during the instructional process, and are written to provide diagnostic data about student learning. Teams build in time during the instructional unit to respond to students who need additional time and support on essential learning and to provide enrichment to students who already know the essentials.

What Kind of Data Is Most Effective?

Early in my teaching career, all of the data I had about student learning was a number: a percentile, a stanine, a standard score, or even a percent on a test. A report card grade was provided based on the average of all of these numbers. Data came primarily from paper-and-pencil tests (no computer testing at that time!). Our focus was on summative assessment, which told us what students had learned at the end of the unit, quarter, semester, or year. It defined how students were achieving against what they were expected to achieve.

The teams I work with today recognize that data is much more than a number. Some teams gather observational data as students are working (for example, the student follows the agreed-on rules for discussions). Other teams rely on student work and use a rubric to determine whether students are beyond proficient, proficient, partially proficient, or not proficient. Teams examine constructed-response answers to learn more about what students are thinking when approaching questions or problems. Numbers are still used but not exclusively.

The biggest change is that summative assessment is used for the purpose of measuring student learning and formative assessment is used to diagnose what students need next. In a balanced assessment system, teams have both summative and formative data to make decisions. They recognize that summative data provides information about the effectiveness of programs, pacing, and curriculum. It also allows teachers to measure the overall learning of students and examine how their learning changes over time. In the case of benchmark assessments, items may include some concepts that have not yet been taught to the students taking the test.

Formative data is used to diagnose more precisely what individual students need to help them learn more. It is often used daily in classrooms after an essential target has been taught and before the teacher moves on. Collaborative teams also use common formative assessments to guide their work. These short, frequent assessments are written around essential standards, take place after these concepts have been taught, are given at the same time for all classes, and are used to identify students who need time and support to master essential concepts and students who can benefit from enrichment. In addition to providing information about student learning, common formative assessments provide information to help teams identify successful teaching strategies.

Consider this as well: when teams design quality assessments, they are able to learn which students need help but also what kind of help they need. Examine Wiliam's (2011) definition of formative assessment:

An assessment functions formatively to the extent that evidence about student achievement is elicited, interpreted, and used by teachers, learners, or their peers to make decisions about next steps in instruction that are likely to be better, or better founded, than the decisions they would have made in the absence of evidence. (p. 43, emphasis added)



In this definition, Wiliam gets to the heart of the purpose of formative assessment—that is, to use additional instruction that is different from the original instructional strategy based on what we know from the formative assessment. If all a teacher does is use formative assessment to identify students who need help without providing that additional help, or if the response is the same as the original instruction, the assessment is a waste of time. If the team can only guess at why a student didn't learn, the formative assessment hasn't really done what it's intended to do.

If we want to be diagnostic and know how to best respond to student learning, the data has to be as focused as possible. This is why we ask teams to unwrap or unpack their standards into learning targets before they teach and assess. Teams examine key words in their standards (typically verbs and noun phrases) to identify all of the specific skills and concepts that must be taught for the students to be proficient on a standard. While they are doing this work, they also spend some time discussing the anticipated outcome of the learning target, often using a taxonomy, such as Depth of Knowledge (DOK), to characterize the rigor of the target.

Consider this second-grade standard: "Students identify the main topic of a multi-paragraph text as well as the focus of specific paragraphs within the text" (NGA & CCSSO, 2010). There are two different learning targets that must be taught: (1) identify the main topic of a multi-paragraph text, and (2) identify the focus of specific paragraphs within the text. The collaborative team discusses these targets and concludes that they are both at a DOK 2 level. They recognize that students must be able to read a multi-paragraph piece of text and determine what topic is being discussed. They must also be able to identify the main idea for each of the paragraphs in the text. A student might be able to demonstrate proficiency on both of those targets, one of those targets, or neither of those targets.

Designing Quality Common Formative Assessments

If our goal is to get back information (data) from our assessments that tells us more than just what percent correct the student scored, a team must create an assessment plan before they start writing questions. They begin this process when they are unwrapping the standard and discussing what proficiency will look like for the learning targets and assigning a rigor level (for example, DOK 2) to each target. They discuss the type of item they will use and how many items they will write. Some teams like to use multiple-choice questions for their assessments because they are easy to score and help prepare students for future high-stakes tests. However, if we want to be able to know not just whether the student has learned a target but also what misunderstanding or misconception a student has about the target, a constructed-response question can provide much more information (data) about the target.

Consider the first target from our example. The team will supply the students with a short multi-paragraph text that is new to them—that is, one that hasn't already been discussed in class. The first question might be: "What is the main topic of this piece of text? What information did you use to get this answer?" If the student is able to tell you the correct topic but not provide the support used for the answer, it's possible the student just guessed the answer. The same might be true if the team used a multiple-choice question instead—the student could have guessed the correct answer. But if the students have to explain the clues from the text they used to know what the main topic is, we know that guessing didn't prompt the correct answer.

I remember sitting with a second-grade team that was analyzing a common formative assessment about these targets. They noticed that the majority of students who answered the question incorrectly answered it by using the first sentence of the text, which was not the main topic. Once the team noticed this, they were confident they could easily help students overcome this misunderstanding by providing examples where the first sentence was the main topic and examples where the first sentence wasn't the main topic. They also discussed how, when they were teaching students to write multiparagraph essays, they would be careful not to imply that the topic sentence is always the first sentence. When a team plans the assessment before they start writing questions, they ensure that every question on the assessment is linked to one of the essential learning targets they are assessing.

Using a Protocol for Data Analysis

When analyzing the results of a common formative assessment, high-performing teams use a protocol to keep the discussion focused on the data (information) provided, as well as to make sure they are tackling all important issues, even those that might be uncomfortable for them to discuss. The Protocol for Using Common Formative Assessment Data is intended to help teams analyze the results of an assessment in the time a team typically has for common planning. As you review this protocol, you'll notice that in step 2, the team reviews which questions are linked to each of the learning targets being assessed. Additionally, they discuss what proficiency looks like for the target. For example, let's say that they decided to use four multiple-choice questions to determine

Protocol for Using Common Formative Assessment Data

1. Set the Stage (2 Min)
• Establish the purpose of the meeting.
• Review norms (focusing on data norms).

2. Review the Focus of the Assessment (2 Min)
• Identify the essential learning targets we assessed and which questions we designed to assess each of them.
• Review the expectations for proficiency (for example, two out of three correct on a multiple-choice assessment, or a level 3 on the rubric).
• Discuss any questions we had when we scored student work.

3. Discuss the Data (5 Min)
• Be careful to do this step one essential learning target at a time.
• For each target, identify which students need help.
• Once we've identified the students who need help, regroup them by specific need (for example, students who made a calculation error versus students who chose the wrong solution pathway).

4. Determine Student Misconceptions and Errors (10 Min)
• For each target, identify how many students will need additional time and support.
• Each teacher should share his or her original instructional strategy so that we can see if one strategy worked better for certain students.
• Make sure that all team members have the same understanding of what this will look like.

5. Determine Instructional Strategies (15 Min)
• Each team member must participate in this discussion.
• For each target and for each mistake or misconception, develop a plan to help students move ahead on their learning of that target. If necessary, go back to best practice information about how to teach the concept or about what strategies work best for struggling students. Consult instructional coaches or specialists if necessary.
• Decide whether we will develop small groups for reteaching or if we will use a reengagement lesson with the whole class.

6. Develop Monitoring System (10 Min)
• Develop the items that we will use to monitor whether students met the learning target after this response. This will provide information about which students still need help on this essential target.
• This reassessment may be done orally or may be a version of the original assessment.



EVERY 26 SECONDS, A STUDENT DROPS OUT OF SCHOOL
Creating a learning-centered, collaborative culture is more urgent than ever.
Ideal for team meetings and whole-school professional development.

Timebomb addresses the urgency of reducing dropout rates and preparing students for a better future.

START NOW

SolutionTree.com/Timebomb



Discussion Questions


Use this convenient tear-out card to go over and reinforce the topics discussed in this issue with the members of your team.

Collective Commitments: The Backbone in the Anatomy of a Lesson (p. 10)
1. What are your collective commitments? Are they truly commitments, or are they merely words in a binder on a shelf?
2. In what ways do (or should) those collective commitments drive your work?
3. How can you (as an individual and/or a team) inspire others to embrace the collective commitments?
4. In what ways is celebration a part of your PLC? Can celebration be tied to collective commitments?

Data: More Than a Number (p. 24)
1. Discuss with a partner or your team members what "data is more than a number" means to you.
2. What protocols do you currently have in place to review and use student achievement data? In what way(s) could these protocols be improved upon?
3. How do you use formative assessment data in your classroom? Does your team create common formative assessments? Why, or why not?
4. In your next team meeting, use the Protocol for Using Common Formative Assessment Data. In what ways is this protocol beneficial?

Heritage Middle School: A Community of Learners (p. 32)
1. What is the PLC origin story of your school? In what ways has the story evolved? In what direction would you like to see it go?
2. What are your thoughts about Heritage Middle School's vision statement, A Community of Learners? Does this describe your school?
3. What obstacles have you faced as a PLC, and how did you overcome them?

SolutionTree.com/PLCmag


AllThingsPLC Magazine
Strategies & Stories to Fuel Your Journey
Each issue includes inspiration, fixes, tools, and more. A must-have for emerging and veteran PLCs.
Print and digital versions available.


Subscribe Online


SolutionTree.com/ATPLCMagazine

or Call 800.733.6786

