Designing Effective Assessments


“Designing Effective Assessments provides practical information, advice, and activities for assessing the links between teaching strategies and learner outcomes—both affective and cognitive. . . . It is a valuable resource for self-study or collaborative peer learning.”

—Diane Hren, Head of Educational Programs at ACS International Schools, Cobham, England

“This book empowers classroom educators to enter the challenging world of assessment. . . . With compelling research and easy-to-use checklists, the authors provide a bridge from theory to practice. . . . Every teacher in every grade level and subject will benefit from the clear, thoughtful, and practical guidelines in this book.”

—Douglas Reeves, Founding Partner, Creative Leadership Solutions, Boston, Massachusetts

Assessment is a critical component of effective teaching and learning. Designing Effective Assessments presents K–12 educators with ten key, research-based tools to create quality assessments that provide valuable data. Authors James H. Stronge, Leslie W. Grant, and Xianxuan Xu offer clear directions for incorporating these tools into daily classroom practice to make effective use of assessment data. With quality assessment processes in place, teachers at all grade levels can accurately measure student mastery and shape instruction to increase achievement. Readers will:

• Study research that supports each of the assessment design tools described in this book
• Explore the benefits of involving students in the assessment process
• Learn how to align grading policies and practices to ensure they are valid and reliable
• Examine how standards-based grading and reporting communicate student learning better than traditional assessment practices
• Use reproducible handouts to create effective assessment and feedback practices

Visit go.SolutionTree.com/assessment to download the free reproducibles in this book.

James H. Stronge • Leslie W. Grant • Xianxuan Xu

SolutionTree.com



Copyright © 2017 by Solution Tree Press

Materials appearing here are copyrighted. With one exception, all rights are reserved. Readers may reproduce only those pages marked “Reproducible.” Otherwise, no part of this book may be reproduced or transmitted in any form or by any means (electronic, photocopying, recording, or otherwise) without prior written permission of the publisher.

555 North Morton Street
Bloomington, IN 47404
800.733.6786 (toll free) / 812.336.7700
FAX: 812.336.7790
email: info@SolutionTree.com
SolutionTree.com

Visit go.SolutionTree.com/assessment to download the free reproducibles in this book.

Printed in the United States of America
21 20 19 18 17    1 2 3 4 5

Library of Congress Cataloging-in-Publication Data

Names: Stronge, James H., author. | Grant, Leslie W., 1968- author. | Xu, Xianxuan, author.
Title: Designing effective assessments / authors, James H. Stronge, Leslie W. Grant, and Xianxuan Xu.
Description: Bloomington, IN : Solution Tree Press, 2017. | Includes bibliographical references and index.
Identifiers: LCCN 2016048520 | ISBN 9781936763702 (perfect bound)
Subjects: LCSH: Educational tests and measurements. | Students--Evaluation.
Classification: LCC LB3051 .S8829 2017 | DDC 371.26--dc23
LC record available at https://lccn.loc.gov/2016048520

Solution Tree
Jeffrey C. Jones, CEO
Edmund M. Ackerman, President

Solution Tree Press
President and Publisher: Douglas M. Rife
Editorial Director: Sarah Payne-Mills
Managing Production Editor: Caroline Weiss
Senior Production Editor: Todd Brakke
Senior Editor: Amy Rubenstein
Copy Editor: Ashante K. Thomas
Proofreader: Evie Madsen
Editorial Assistants: Jessi Finn and Kendra Slayton



Table of Contents

Reproducible pages are in italics.

About the Authors . . . ix

Introduction . . . 1
  An Overview of the Book . . . 2
  Summary: So Where Do We Go From Here? . . . 3

Chapter 1  Enhancing the Validity and Reliability of Assessments . . . 5
  What Research Says About Validity and Reliability of Assessment . . . 6
  From Research to Practice . . . 7
  Summary . . . 15
  Reducing Errors in Measuring Student Performance . . . 17
  Examining Quality of Assessment Items . . . 18
  Comparing Item Types . . . 20

Chapter 2  Measuring Student Attitudes, Dispositions, and Engagement Using Affective Assessment . . . 25
  What Research Says About Affective Assessment . . . 26
  From Research to Practice . . . 26
  Summary . . . 29
  Assessing Student Academic Self-Efficacy . . . 31
  Assessing Student Academic Motivation and Perseverance . . . 32
  Assessing Student Learning Styles . . . 33
  Template for Choosing or Designing an Affective Assessment Instrument . . . 36

Chapter 3  Assessing Student Criterion-Referenced Learning Using Performance-Based Assessment . . . 37
  What Research Says About Performance Assessment . . . 38
  From Research to Practice . . . 39
  Summary . . . 44
  Self-Assessment . . . 45
  Planning Module . . . 46
  Authenticity . . . 48

Chapter 4  Documenting Student Progress Through Portfolios . . . 49
  What Research Says About Student Portfolios . . . 49
  From Research to Practice . . . 52
  Summary . . . 53
  Portfolio Planning for Teachers . . . 54
  Portfolio Planning for Students . . . 55
  Teacher Self-Assessment of Student Portfolios . . . 56

Chapter 5  Creating Rubrics for Student Feedback . . . 57
  What Research Says About Rubrics for Student Feedback . . . 57
  From Research to Practice . . . 58
  Summary . . . 63
  Teacher Self-Assessment Rubric . . . 64
  Template for Creating a Rubric . . . 65
  Create Rubrics With Students . . . 66

Chapter 6  Building Practical Grading Practices . . . 67
  What Research Says About Grading . . . 67
  From Research to Practice . . . 68
  Summary . . . 74
  Frame Your Thinking About Grading Policies . . . 78
  Teacher Self-Assessment Survey . . . 79
  Standards-Based Grading . . . 80

Chapter 7  Building Valid and Reliable Grading Practices . . . 81
  What Research Says About Validity and Reliability Issues in Grading Policies and Practices . . . 82
  From Research to Practice . . . 83
  Recommendations From Measurement Experts to Improve Grading Policies and Practices . . . 87
  Summary . . . 89
  Self-Assessment of Valid Grading Practices . . . 90
  Conversation Guide on Valid Grading Practices . . . 91
  Aligning Assessments to Learning Goals for Valid Grading Practices . . . 92

Chapter 8  Improving Communication Through Standards-Based Grading . . . 93
  What Research Says About Standards-Based Grading . . . 96
  From Research to Practice . . . 97
  Summary . . . 100
  Student Reflection Questions for Individual Assessments . . . 101
  Student Reflection on Body of Evidence . . . 102
  Selection of Assessments for Report Card Summary Judgment . . . 103

Chapter 9  Understanding and Using Standardized Assessment Data . . . 105
  What Research Says About Standardized Assessment Data . . . 105
  From Research to Practice . . . 107
  Summary . . . 109
  Data Matters . . . 111
  Data-Analysis Activity . . . 112
  Teacher Self-Assessment . . . 116

Chapter 10  Teaching Test-Taking Skills . . . 119
  What Research Says About Test-Taking Skills . . . 119
  From Research to Practice . . . 120
  Summary . . . 123
  Measures to Allay Test Anxiety: Teacher Self-Assessment . . . 124
  Test-Preparation Practices: Critique . . . 125
  Steps for Solving Common Test Items . . . 126

References and Resources . . . 129
Index . . . 139


About the Authors

James H. Stronge, PhD, is president of Stronge and Associates Educational Consulting, an educational consulting company that focuses on teacher and leader effectiveness with projects internationally and in many U.S. states. Additionally, he is the Heritage Professor of Education, a distinguished professorship in the Educational Policy, Planning, and Leadership program at the College of William and Mary in Williamsburg, Virginia. Dr. Stronge’s research interests include policy and practice related to teacher effectiveness, teacher and administrator evaluation, and teacher selection. He has worked with state departments of education, school districts, and U.S. and international education organizations to design and implement evaluation and hiring systems for teachers, administrators, and support personnel. He completed work on new teacher and principal evaluation systems for American international schools in conjunction with the Association of American Schools in South America and supported by the U.S. Department of State. Dr. Stronge has made more than 350 presentations at regional, national, and international conferences and has conducted workshops for education organizations extensively throughout the United States and internationally. Among his research projects are international comparative studies of national award-winning teachers in the United States and China and influences of economic and societal trends on student academic performance in countries globally. Dr. Stronge has authored, coauthored, or edited twenty-seven books and approximately two hundred articles, chapters, and technical reports. His 1994 book Educating Homeless Children and Adolescents: Evaluating Policy and Practice received the Outstanding Academic Book Award from the American Library Association.

Dr. Stronge is a founding member of the board of directors for the Consortium for Research on Educational Accountability and Teacher Evaluation (CREATE). In 2011, he was honored with the Frank E. Flora Lamp of Knowledge Award, presented by the Virginia Association of Secondary School Principals for “bringing honor to the profession” and his “record of outstanding contributions.” He was selected as the 2012 national recipient of the Millman Award from CREATE in recognition of his work in the field of teacher and administrator evaluation.

Leslie W. Grant, PhD, is associate professor of educational leadership in the Educational Policy, Planning, and Leadership program at the College of William and Mary in Williamsburg, Virginia.

Dr. Grant’s research interests include assessment literacy for educators and international comparisons of teacher beliefs and practices. In addition, she consults regularly with school districts, state education agencies, and international schools in areas related to assessment design and development, strategic planning, and evaluating programs for effectiveness. She is the corecipient of Division H of the American Educational Research Association’s Outstanding Publications Award for her work on improving assessment practices with in-service teachers and educational leaders. She has coauthored seven books and numerous articles and technical reports on these topics and presents at state, national, and international conferences. Dr. Grant is the past president of the Consortium for Research on Educational Assessment and Teaching Effectiveness, having served on the board of directors. In 2015, she was elected to the board of directors for the Association for Supervision and Curriculum Development. Dr. Grant began her career in education as a middle school social studies teacher, serving as instructional leader in the social studies. Additionally, she worked for the California Testing Bureau developing state-customized assessments across the United States.

Xianxuan Xu, PhD, is a senior research associate at Stronge and Associates Educational Consulting. Dr. Xu received her doctorate from the College of William and Mary’s Educational Policy, Planning, and Leadership program.

Her research interests are teacher effectiveness, professional development, and teacher and principal evaluation. She is also particularly interested in researching the relationship between culture and educational issues such as teaching, learning, and leadership. She has presented research findings at various U.S. conferences, including the American Educational Research Association, University Council for Educational Administration, and National Evaluation Institute. She is also a contributing author to Principal Evaluation: Standards, Rubrics, and Tools for Effective Performance and West Meets East: Best Practices From Expert Teachers in the U.S. and China.

Visit www.strongeandassociates.com to learn more about Dr. Stronge, Dr. Grant, and Dr. Xu’s work.

To book James H. Stronge, Leslie W. Grant, or Xianxuan Xu for professional development, please contact pd@SolutionTree.com.


Introduction

The practice of assessing student learning is essential for effective teaching and learning. Indeed, when quality assessment is occurring, instruction and assessment become almost inseparable. Thus, assessment isn’t an add-on, nor is it something teachers do every Friday after a week’s lesson study. Rather, high-quality assessment provides teachers with valuable information regarding the extent to which students have attained the intended learning outcomes, and it informs teachers’ instructional decision making (what to teach and how to teach). Further, assessment can be conducted for summative purposes, for instance, evaluating student performance at the end of an instructional unit by comparing it against some standard, benchmark, or one another. Assessment can also involve formative practices, such as monitoring student progress, making instructional decisions based on the progress, and providing meaningful ongoing feedback.

Ships constantly need real-time data about their location, working conditions, and weather and ocean conditions in order to navigate. Having accurate, available data is vital to crews. Likewise, effective assessment is a cornerstone of almost every endeavor in quality teaching and learning in our classrooms. Assessment data allow teachers to know the latitude and longitude of their students in terms of the curriculum goals, and to navigate better toward benchmarked concepts and skills. In fact, since the turn of the century we have witnessed the development and deployment of a new generation of assessments designed to measure students’ mastery of complex content and a wider range of skills necessary for college and career readiness in the 21st century. Read any list of 21st century standards and we can find references to the development of competencies and skills in cognitive domains, such as problem solving and critical thinking, as well as in noncognitive domains, including collaboration, perseverance, and task orientation. This shift of focus in what we prepare our children and youth for calls for a matching transformation in assessment, because in many cases assessment not only reflects instruction but also drives classroom instruction. In increasing numbers, schools are beginning to use portfolios, exhibitions, oral presentations, and in-depth investigations to gauge and propel students’ complex learning, which conventional standardized assessments cannot readily measure.

We see assessment in the 21st century classroom—in the areas of assessing knowledge, skills, and dispositions in domains of learning other than the cognitive domain—as more balanced than it was during the 1990s and early 2000s; using a wider variety of assessment tools, guides, and techniques; reconceptualizing the assessment cycle; and involving students in the assessment of their own learning and the learning of their peers. As David T. Conley and Linda Darling-Hammond (2013) note, assessment can and should be seen as a system that provides rich experiences and ways for students to show what they know, can do, and value.

Although the ultimate goal in a quality curriculum-assessment-instruction alignment is the use of assessment to guide instruction and learning, there must be a prerequisite step. First, we must answer the question, “How do we design proper assessments?” In this book, we primarily focus on this prerequisite step, addressing how to assess student learning, and secondarily on how to use assessment for student learning.

The question of how to assess student learning hinges on the importance of designing effective assessments. If we assess learning using poorly designed and poorly implemented assessments, we can all but guarantee that the ensuing result will be useless information. There is an old adage often applied to computer applications: garbage in, garbage out. Before we can truly make quality use of assessment information, we must first have quality assessments. Well-designed assessments can provide diagnostic information regarding students’ mental readiness for the new learning and also generate formative and summative information needed to monitor student progress and adjust instruction. Rigorous assessments can also keep students motivated and hold them accountable for their own learning. In addition, quality assessments can help both teachers and students stay focused on what really matters in the subject and compel students to retain and transfer what they have learned.

An Overview of the Book

Based on a review of research on design and application of assessment in the classroom, we’ve organized the chapters in Designing Effective Assessments around ten key assessment principles and supporting design handouts and guides.

1. Enhancing the validity and reliability of assessments
2. Measuring student attitudes, dispositions, and engagement using affective assessment
3. Assessing student criterion-referenced learning using performance-based assessment
4. Documenting student progress through portfolios
5. Creating rubrics for student feedback
6. Building practical grading practices
7. Building valid and reliable grading practices
8. Improving communication through standards-based grading
9. Understanding and using standardized assessment data
10. Teaching test-taking skills

We designed this book to present each assessment of learning principle in its own chapter. It is important to note that, although we refer to each item as a tool, each assessment principle is as varied in its construction and focus as teachers’ and students’ needs when designing assessments. Some are tools in the literal sense of the word, while others act as activities and guides for you to use in constructing assessments. These tools are adaptable to multiple grade levels and classroom environments. Each chapter includes explicit strategies teachers can employ in the everyday life of an effective classroom. To make the featured assessment tools relevant and useful, each chapter includes the following sections.

• An introduction to the assessment tools
• What research says about the assessment tools
• How to move from research to practice in using the assessment tools
• A chapter summary that lists and describes the assessment handouts we provide

To end each chapter, we include selected handouts to help teachers use these assessment tools immediately. (Visit go.SolutionTree.com/assessment to download free reproducible versions of these handouts.) Our intent is for teachers and school leaders to take the tools they find useful right off the page and put them into practice as seamlessly as possible. Most handouts are addressed directly to teachers in tone and style. At the same time, we trust instructional leaders, coaches, and principals will find it easy to adapt them for their own use, either to guide instructional conversations with teachers or to help teachers reflect on more effective assessments as they adopt and design valid, reliable, worthy, fair, and useful assessments for their classrooms. It is our hope that teachers can use many of the featured tools to self-assess and reflect. Administrators can use the tools to evaluate teacher assessment practices from the formative perspective. As table I.1 summarizes, we support three specific groups of educators in the important work of delivering effective teaching through the effective use of assessment tools.

Table I.1: Goals for Each Audience

Audience: Teachers improving practice
Goals: Self-reflection; guided study; teacher-directed growth

Audience: Teachers teaching teachers
Goals: Mentor tips; instructional coaching tips; peer networks

Audience: Leaders supporting teachers
Goals: Directed growth; supervisor support for teachers; coordinated curriculum


Summary: So Where Do We Go From Here?

In addition to quality and engaging instruction, the best teachers have the know-how to apply a range of assessment tools to understand student learning, to understand their own effectiveness, to communicate and interact with students around learning goals and outcomes, and to support student success. We know from research that quality assessment is fundamental to good teaching. And having the proper knowledge of how to select, design, and apply good assessment tools is paramount to being able to meaningfully and fruitfully use assessment data.

Our goal for this book is to reinforce and help make designing effective assessments a fundamental part of an effective teacher’s skill set. Assessment skill isn’t the endgame of teaching and learning, but it certainly is an essential component of every effective teacher’s regular routine. It is our hope that this book motivates you to broaden your assessment of learning skills and connects quality assessment with quality teaching and learning. We hope you find this guide on assessment of learning to be practical, solidly researched, and easy to use. Now, let’s put these assessment tools to use in your school or classroom.


Chapter 3

Assessing Student Criterion-Referenced Learning Using Performance-Based Assessment

Although there are many ways to categorize student assessments, the two most prominently recognized types are norm referenced and criterion referenced. The primary purpose of norm-referenced assessment, which we cover in more detail later in this chapter, is to compare each student with respect to the achievement of others in broad areas of knowledge and skills. Norm-referenced assessment often has test items varying in difficulty so that they discriminate high and low achievers. (We look at norm-referenced assessment again in chapter 9, page 105, in regard to standardized testing, which is the popular form of norm-referenced assessment.) The topic of this chapter—performance assessment—is more closely associated with the other type, criterion-referenced assessment. It is worthwhile to notice that criterion-referenced assessment can also include standardized tests. The Advanced Placement exam and the National Assessment of Educational Progress are both standardized and criterion-referenced forms of assessment. However, most of the time criterion-referenced assessment takes the form of low-stakes tests used in the classroom to measure the mastery of content and skills of individual students, identify areas in need of remediation, and inform instruction modification. As long as the primary purpose of an assessment is to determine if students have achieved specific learning standards, rather than ranking students into a bell curve to create percentile scores, you can safely classify the assessment as criterion referenced.

Because the primary purpose of criterion-referenced assessment is to determine whether each student has achieved specific skills or knowledge outlined in a designated curriculum, it is not about comparing one student against another, but rather about comparing each individual with a preidentified standard of expected achievement or performance. Educational psychologist Robert Glaser (1963) coined the term criterion referenced when he explained that “measures which assess student achievement in terms of a criterion standard provide information as to the degree of competence attained by a particular student which is independent of reference to the performance of others” (p. 519). Performance assessments, such as essays, reports, oral presentations, and other kinds of projects, serve as the foundational strategy for criterion-referenced assessment. Although it has meaning within the context of criteria (that is, criterion-referenced assessment), performance assessment is a general term describing assessments that require students to demonstrate skill and knowledge by producing a formal product or performance (Sivakumaran, Holland, Heyning, Wishart, & Flowers-Gibson, 2011).


What Research Says About Performance Assessment

Conventional multiple-choice or other close-ended test items are useful, but not entirely sufficient to measure valuable student learning outcomes such as analytical abilities, problem-solving skills, and creativity. Performance assessment, in forms such as essays, projects, portfolios, reports, debates, role playing, and oral presentations, where students have to demonstrate their learning through products, is more effective in assessing higher-order skills. Assessing these complex skills often requires the assessors to make a certain degree of inference based on the collected student learning data. Indeed, performance assessment can draw out students’ underlying abilities, and when teachers appropriately design and execute the assessment, the inferences can be valid (Lai, 2011). Also, students engaged in performance assessment sustain their attention over longer periods of time and master content better because the assessment itself is an authentic and meaningful learning experience (Erwin, 2015). Judith T. M. Gulikers, Liesbeth Kester, Paul A. Kirschner, and Theo J. Bastiaens (2008) find that when students perceive assessment tasks to be authentic, they are more likely to use deeper learning approaches and achieve better learning outcomes. They further define authentic assessment as requiring students to use the same competencies or combinations of knowledge, skills, and attitudes that they need for addressing real-life demands.

Practical Attributes of Performance Assessment

Performance assessment focuses on the accomplishments or outcomes that students must demonstrate when they finish a certain amount of learning. Additionally, performance assessment has many advantages over more conventional assessment, such as multiple-choice tests (Popham, 2014; Tanner, 2001; Waugh & Gronlund, 2013). For instance, consider the following attributes of performance assessment.

• Performance assessment offers clear specifications to students about what knowledge, skills, and attitudes they are expected to demonstrate. Since performance assessment tends to be task specific, it is easier for teachers to communicate precisely about what the students must do in order to meet the standard. For example, teachers often share or build rubrics with students when conducting performance assessment to clarify criteria of quality.

• Performance assessment is not only an assessment of learning but also assessment for learning. In other words, the assessment itself is a learning experience and not merely a testing session. For instance, when students work on an essay, they can receive feedback on the outlines, drafts, and other preliminary products before you assign the final grade. Assessment is no longer a onetime event, but a process where students can benefit from feedback and have extra opportunities to work toward the goal.

• Performance assessment involves assessment tasks that are authentic and meaningful beyond the classroom. It is conducted not just for data-gathering purposes but also as a learning experience that helps prepare learners to succeed beyond the classroom. For instance, when teachers use a laboratory report to evaluate a student’s achievement in physics, the skills measured in that report—hypothesis making, data collection, data analysis, sound reasoning, and clear explanations—are applicable and transferable to other situations, or even to problems the student will encounter in his or her future career.

• Teachers can more easily adapt performance assessment to accommodate student needs, especially when delivering differentiated instruction. Teachers have the latitude to modify learning activities—and assessments—from student to student according to the differences in students’ levels of prior knowledge, motivation, and interests. For instance, for research projects, teachers can provide struggling students with more materials, clues, prompts, or scaffolding so that they can succeed, while, for students who are more advanced, the teacher can delegate full autonomy.


Performance Assessment and Constructivist Learning

Performance assessment is heavily influenced by a constructivist approach to learning. Constructivism focuses on reflective and active construction of knowledge. As criticism of selected-response tests increases, the shift in assessment practices to constructivist learner-centered and performance-based assessment is timely (Henson, 2015). Using the ideas of constructivist learning as a perspective to examine assessment, Theresa A. Butori (2012) identifies a number of best practices for designing performance assessment.

• Define the outcomes that would demonstrate competency.
• Perform a complete cognitive analysis of the assessment.
• Align assessment methods with instructional context, techniques, and goals.
• Ensure the assessment task is a direct measure of the outcome.
• Make the task clear, precise, and measurable.
• Design performance assessments that require application and other higher-order thinking skills.
• Make tasks contextual or relevant to the real world.
• State criteria for evaluation explicitly.
• Use work examples to clarify criteria.
• Consider learner characteristics, prior knowledge, and skills for the task.
• Analyze targeted learner needs.
• Offer learners choices or allow personalized solutions.
• Provide learners with descriptive feedback after the performance.

Using constructivist and practical approaches to more effectively gauge student learning is essential in designing effective performance assessments. Let’s take a look at how to implement these approaches in practice.

From Research to Practice

Performance assessment frequently examines how well students’ learning is in alignment with a particular criterion. For instance, the criteria for a laboratory report may include introduction, materials, methods and procedures, data analysis, and conclusions. When using performance assessment as a form of criterion-referenced assessment, educators are little concerned with how learners compare to one another. Rather than ranking students in their abilities to perform selected tasks (the normative approach), they care more about whether the students can create equations that describe relationships of variables, or cite textual evidence to support inferences drawn from the text. Usually, it is important for students to master criteria because they represent accomplishments that have independent value in the learning disciplines. As Tanner (2001) points out, the key, and also probably the most difficult part, to designing performance assessment is establishing which particular outcome reflects the criterion and determining how well the student must perform in order to be judged to have an adequate command of the criterion. The technique of using a table of specifications (see Validity, page 7) is useful to dissect the criterion and ensure alignment between teaching and student mastery.

Assessment questions for performance tasks often have the following characteristics (Gareis & Grant, 2008).

• Focusing on the application (rather than memorization) of knowledge
• Using real-life situations
• Grading with a rubric

In performance assessments, students apply their knowledge to a real-life situation. Again, because students are often graded on the quality of their responses, teachers grade the assessment with a rubric. Examples of performance assessment include, but are not limited to, the following.

• Role playing, such as debates or skits, models or simulations
• Student performances or products, such as writing, portfolios, posters, or oral presentations


• Student-designed and student-conducted labs

Each of these examples is inclined toward assessing higher-order thinking skills and assessing students as individuals rather than relative to each other. Let us take a look at how you can design your own performance-based assessments and how you can apply them to higher-cognitive-level learning.

Designing Performance Assessments for Classroom Use

Many teachers tend to equate any constructed-response assessment (for instance, open-ended short answers and essays) with performance assessment. However, there are three characteristics essential to designing performance assessment (Newmann & Archbald, 1992; Popham, 2014): (1) multiple criteria-based standards, (2) prespecified indicators of quality, and (3) judgmental appraisal. In performance assessment, students’ performance is typically evaluated using more than one criterion. Criterion-based standards provide points of reference to evaluate the students’ performances. For instance, the inquiry rubric in figure 3.1 uses an array of five steps to calibrate each student’s individual performance and an array of four steps for his or her group. In addition, behavior-anchored descriptions accompany each evaluative standard in order to explicate the different levels of performance one might expect. Conventional holistic scoring that creates a letter grade to sum up a student’s performance tends to be generic and glosses over the fine differences across students. Using multiple indicators of quality can help teachers dissect students’ discrete skills and abilities and have a more refined understanding regarding how several individual components contribute to each learner’s performance. The example we use here comes from an eighth-grade science class. In this rubric, the teacher, Mr. Nunes, engaged students in an inquiry project to research the relationship between sea turtles and the bay where the school is located. As you can see, this rubric provides direct assessment of the students’ actual performance skills. The teacher reflects:

I used this rubric with my eighth-grade students for the inquiry project. I like that it is written in student-friendly language, and I think that it spells out how to successfully do well on the assignment. If a student scores all 3s, which is considered proficient, the score will be 86 percent. However, there are also options for a student to go above and beyond. This rubric worked well. Students understood it and were able to make reasonable choices about how much to contribute and how to work together. I also really like the idea of giving both individual and group grades, which I think supports the ideas of accountability, collaboration, and cooperation. Some of the students did have questions about what constituted “thought-provoking,” and as this was the first time we did the project, I didn’t really have any examples to show them. I will be able to make refinements for next year, however, and I have pictures of projects from this year to show as examples. I think the students appreciate that I not only give them a grade but also respond with feedback. I was able to use this rubric to determine areas of strength and weakness for each class. For instance, I noticed that understanding graphic organizers seemed to be a weakness across the board, which tells me that I need to include more lessons on that particular topic. (L. Nunes, personal communication, June 8, 2014)

These are just some of the ways you can design a rubric to evaluate student performance in the classroom, but you can take this process further, including using your assessments to gauge higher-cognitive-level learning.
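A quick aside on the arithmetic behind Mr. Nunes’s reflection: the 86 percent figure follows directly from the rubric’s point scale (Emerging = 1, Developing = 2, Practicing = 3, Excelling = 3.5, across five individual steps). Here is a minimal sketch of that conversion, assuming only the scale and step count shown in figure 3.1; the code and the function name are our illustration, not part of the book’s materials.

```python
# Per-step scale from the inquiry rubric in figure 3.1:
# Emerging = 1, Developing = 2, Practicing = 3, Excelling = 3.5 points.
SCALE_MAX = 3.5  # highest score available per step ("Excelling")

def rubric_percent(step_scores):
    """Convert per-step rubric scores into a percentage of points possible."""
    return 100 * sum(step_scores) / (SCALE_MAX * len(step_scores))

# "If a student scores all 3s, which is considered proficient,
# the score will be 86 percent": 15 of 17.5 possible points.
print(round(rubric_percent([3, 3, 3, 3, 3])))  # -> 86
```

The same conversion produces the grade lines printed in figure 3.1 below.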

Applying Performance Assessment to Higher-Cognitive-Level Learning

It is worth noting that performance tasks also work well for higher-order thinking standards, such as those written at the Analyze, Evaluate, or Create levels from the revised Bloom’s taxonomy. To illustrate, look at the following rubric that an eighth-grade health teacher, Ms. Washburn, collected (see figure 3.2, page 43; J. Washburn, personal communication, May 28, 2014). Ms. Washburn developed a unit on evaluating the relationship between health-risk behaviors and the onset of health problems during the adolescent years. She specifically focused on the health risks associated with feelings of immortality in order to help students better understand potential long-range consequences of their current actions. The assessment for this unit was a project in which students interviewed someone from a different generation, such as a parent, grandparent, or other adult, to identify activities considered risky when that person was a teen. Students then worked together in groups of two to four to write, produce, and edit a three-minute podcast segment that compared and contrasted high-risk activities or behaviors that were prevalent twenty to fifty years ago to those of today. The teacher used an assessment rubric with students on their persuasive presentations. She reviewed the rubric with students before they started their presentations and let them use it while creating their presentations.


Figure 3.1: Inquiry project rubric.

Student: Ariel

The following steps help determine the student’s individual grade.

Step 1: Determine current knowledge status: What do I know?
• Emerging (One point): I contributed one fact that we already knew about a local ecosystem, but was missing either how I know or how to further validate the claim.
• Developing (Two points): I contributed one fact that we already know about a local ecosystem, how I know it, and a way to further validate the claim.
• Practicing (Three points): I contributed two facts that we already know about a local ecosystem, how I know it, and a way to further validate the claim.
• Excelling (Three and a half points): I contributed three or more facts that we already know about a local ecosystem, how I know it, and a way to further validate the claim.

Step 2: Answer the research question: What do I need to know?
• Emerging: I contributed one thing we wanted or needed to know, but was missing a promising resource, applicable search terms (if appropriate), or all of these.
• Developing: I contributed one thing we wanted or needed to know, a promising resource, and applicable search terms (if appropriate).
• Practicing: I contributed two things we wanted or needed to know, a promising resource, and applicable search terms (if appropriate).
• Excelling: I contributed three or more things we wanted or needed to know, a promising resource, and applicable search terms (if appropriate).

Step 3: Investigate the known.
• Emerging: I contributed one investigative report form with at least two things learned from the resource, but was missing crucial information (such as why the information was relevant or the citation).
• Developing: I contributed one completed investigative report form with at least two things learned from the resource.
• Practicing: I contributed two completed investigative report forms with at least two things learned from each resource.
• Excelling: I contributed three or more completed investigative report forms with at least two things learned from each resource.

Step 4: Share findings in a product.
• Emerging: I contributed at least one element to the final product.
• Developing: I contributed at least one element to the final product and helped with at least one other.
• Practicing: I contributed at least two elements to the final product and helped with at least one other.
• Excelling: I contributed at least two elements to the final product, helped with at least one other, and helped to refine the final product as a whole.

Step 5: Reflect on the work.
• Emerging: My reflection considers one of the following: major challenges, how the group overcame the major challenges, my personal greatest contribution, and advice for the future.
• Developing: My reflection considers two or three of the following: major challenges, how the group overcame the major challenges, my personal greatest contribution, and advice for the future.
• Practicing: My reflection considers all of the following: major challenges, how the group overcame the major challenges, my personal greatest contribution, and advice for the future—but in a perfunctory manner.
• Excelling: My reflection considers all of the following: major challenges, how the group overcame the major challenges, my personal greatest contribution, and advice for the future—in an insightful, thought-provoking manner.

The following steps help determine the student’s group grade.

Step 1: Answer the research question.
• Emerging (One point): We have a primary research question.
• Developing (Two points): We have a primary research question that is researchable.
• Practicing (Three points): We have a primary research question that is thought-provoking and researchable.
• Excelling (Three and a half points): We have a primary research question that is thought-provoking, researchable, and novel.

Step 2: Investigate the known.
• Emerging: We have at least one sub-question per person.
• Developing: We have at least one researchable sub-question per person.
• Practicing: We have at least one researchable and relevant sub-question per person.
• Excelling: We have at least one relevant, thought-provoking, and researchable question per person.

Step 3: Organize findings.
• Emerging: We completed a graphic organizer.
• Developing: Our graphic organizer reflects major findings or key message, but not both.
• Practicing: Our graphic organizer reflects the key message and major findings.
• Excelling: Our graphic organizer reflects the key message and major findings, and the format of the organizer supports the key message.

Step 4: Share findings in a product.
• Emerging: Our product displays our key message and findings.
• Developing: Our product displays our key message and major findings in a way that considers the key audience and contains no more than four to eight spelling or grammar errors.
• Practicing: Our product displays our key message and major findings in a way that considers the key audience, is visually appealing, and contains no more than three spelling or grammar errors.
• Excelling: Our product displays our key message and major findings in a way that considers the key audience, is visually appealing, contains no more than three spelling or grammar errors, and the format deliberately supports the key message in a thought-provoking way.

Individual: 15.5/17.5
Group: 11/14
Total: 26.5/31.5
Grade: 84 percent (B)

Feedback: Your group completed a strong final product on the role of the sea turtle in our bay, the human impacts on the sea turtles, and potential ramifications of extinction. Your contributions definitely played a part in that. I enjoyed watching your group move through the process and felt like you each helped strengthen the work of the others. The way that everyone pulled together to review and refine the final project certainly helped to make it stronger. The graphic organizer needed to convey your key message, but you included that in your final product.
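As a quick check of the printed totals (this worked arithmetic is ours, not the book’s): the maximum is $5 \times 3.5 = 17.5$ individual points plus $4 \times 3.5 = 14$ group points, so Ariel’s grade works out to

$$\frac{15.5 + 11}{17.5 + 14} = \frac{26.5}{31.5} \approx 0.841 \approx 84 \text{ percent.}$$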


Figure 3.2: Persuasive presentation rubric for healthy, safe habits.

Group: Therese, Malaki, Ruth

Organization
• Four points: The introduction is inviting, states the goal or thesis, and provides an overview of the issue. Information is presented in a logical order and maintains the audience’s interest. The conclusion strongly states a personal opinion.
• Three points: The introduction includes the goal or thesis and provides an overview of the issue. Information is presented in a logical order but does not always maintain the audience’s interest. A conclusion states a personal opinion.
• Two points: The introduction includes the main goal or thesis. Most information is presented in a logical order. A conclusion is included, but it does not clearly state a personal opinion.
• One point: There is no clear introduction, structure, or conclusion.

Goal
• Four points: There is one goal or thesis that strongly and clearly states a personal opinion and identifies the issue.
• Three points: There is one goal or thesis that states a personal opinion and identifies the issue.
• Two points: A personal opinion is not clearly stated. There is little reference to the issue.
• One point: The personal opinion is not easily understood. There is little or no reference to the issue.

Reasons and support
• Four points: Three or more excellent reasons are stated with good support. It is evident that a lot of thought and research was put into this assignment.
• Three points: Three or more reasons are stated, but the arguments are somewhat weak in places.
• Two points: Two reasons are made but with weak arguments.
• One point: Arguments are weak or missing. Fewer than two reasons are made.

Attention to audience
• Four points: Argument demonstrates a clear understanding of the potential audience and anticipates counterarguments.
• Three points: Argument demonstrates a clear understanding of the potential audience.
• Two points: Argument demonstrates some understanding of the potential audience.
• One point: Argument does not seem to target any particular audience.

Visuals
• Four points: Visuals are appealing, highly relevant, and add support to the argument.
• Three points: Visuals are appealing and add support to the argument.
• Two points: Visuals are related to the topic.
• One point: Visuals are not directly related to the topic.

Delivery
• Four points: Delivery is fluent, with an engaging flow of speech.
• Three points: Delivery is fluent.
• Two points: Delivery lacks some fluency.
• One point: Delivery is not fluent.

Content
• Four points: All information and facts presented in the project are correct, properly cited, and highly relevant to the point being made.
• Three points: All information and facts presented in the project are correct. Most are properly cited and relevant to the point being made.
• Two points: Information and facts in the project show some signs of misconceptions or are not particularly relevant for the point being made. Citations may be incomplete or missing.
• One point: The presentation contains few or no information or facts to back up the argument.

Grade: 90 percent (A–)

Comments: Great job, guys! Your persuasive commercial really presented the dangers of texting and driving in a way that I think middle schoolers will love. The visuals you chose were highly relevant, and you all sounded like professionals on your voice-overs! Make sure to properly cite all your facts, and try not to list too many in a row (the middle section was not quite as engaging as your beginning and ending). Overall, though, I think you might have a career in advertising or television production! We’ll definitely be playing this on the morning announcements.

Source: International Research Association and National Council of Teachers of English, 2013.


This sample assessment illustrates the three important attributes of performance assessment we discussed in this chapter.

1. It is criterion based, and the criteria include organization, goal, reasons and support, attention to audience, visuals, delivery, and content.
2. It has multiple scale points to reflect progression of performance.
3. It has performance descriptions that are varied and distinctive enough to facilitate reliable evaluation.

Ultimately, performance assessment taps into the teacher’s knowledge about the students and his or her ability to judge students’ performance. It encourages the teacher to exercise the very subjectivity that standardized tests strive to avoid. In addition, to ensure the subjective judgment is consistent and reliable, especially when differentiated instruction and assessment are implemented, the criteria used should be concrete and descriptive enough that the teacher would make the same inferences about the performance if the assessment were given again.

Summary

Close-ended questions fall short in assessing most of the advocated deeper learning and 21st century skills that involve students processing transferable knowledge and solving new problems. Compared with conventional assessments, performance assessments have a higher capacity to assess higher levels of students’ cognitive learning, and they frequently can target students’ abilities to apply skills and knowledge to real-world problems. Therefore, performance assessment is projected to play an increasingly important role in the future of classroom assessment. We have developed three reproducible handouts on the following topics to help teachers put performance assessments into practice in the classroom.

1. The handout “Self-Assessment” (page 45) captures a number of characteristics of performance assessment that make it different from conventional classroom assessment. Tanner (2001) offers a critical view of traditional assessment in that ill-designed classroom assessment provides poor content coverage, with the skills that need assessment the most being left out. In addition, such assessment is artificial and sterile because it is removed from the situations for which the students should be prepared. This form of conventional assessment tends to include, disproportionately, multiple-choice questions and emphasize lower cognitive skills. Quality performance assessment aims to add more diversity to assessment and to focus on higher-order learning. The “Self-Assessment” handout provides a self-assessment survey for teachers to assess their design of performance assessment. Instructional leaders can also adapt the tool for instructional supervision purposes.

2. The handout “Planning Module” (page 46) provides a planning module to guide teachers through the process of designing performance assessments. It helps teachers answer both what to assess and how to assess it.

3. An essential attribute of performance assessment is authenticity. Authenticity means that the process and product of the assessment mirror real-world tasks beyond school learning. We adapted the handout “Authenticity” (page 48) from the components or dimensions of authenticity various researchers proposed (Frey, Schmitt, & Allen, 2012; Gulikers, Bastiaens, & Kirschner, 2004). It is designed to help teachers understand what valid authentic assessments look like, and it can serve as a guideline for establishing connections among assessment, learning, and issues encountered in the real world.
