The Toolbox Collection | Vol. 3: Assessment of Student Learning


TOOLBOX COLLECTION

VOLUME 3: ASSESSMENT OF STUDENT LEARNING

Editor: Brad Garner



CONTENTS

Introduction
Ensuring Thoughtful Assessment: Strategies That Make the Grade
Creating Meaningful Assignments
Making the Grade: Get the Most from Course-Based Assessments
Assessment à la Carte
Deadlines and Due Dates
The Rubric: A Tool for Authentic Assessment
A Collection of Assessment Tools and Strategies
»» No-Fault Quiz
»» Hand-In Dates
»» Exit Cards
»» Problem-Based Learning
»» Alternative Ways to Display Knowledge
»» Drabble
»» Mind Maps
»» Oral Discourse
References

Published by:
National Resource Center for The First-Year Experience and Students in Transition
University of South Carolina
1728 College Street, Columbia, SC 29208
www.sc.edu/fye

The First-Year Experience® is a service mark of the University of South Carolina. A license may be granted upon written request to use the term “The First-Year Experience.” This license is not transferable without written approval of the University of South Carolina.

Production Staff for the National Resource Center:
Brad Garner, Founding Editor
Todd Money, Editor
Stephanie L. McFerrin, Graphic Artist
Tracy L. Skipper, Assistant Director for Publications




INTRODUCTION

In our 21st century culture, higher education has taken quite a beating. Many say it’s not worth the expense and that students would be better off simply enrolling in the “School of Life” or its graduate equivalent, “The School of Hard Knocks.” These kinds of observations are not new. In 1934, Stanford professor Walter Crosby Eells observed:

They say that our universities are aimless institutions that have prostituted themselves to every public whim, serving as everything from a reformatory to an amusement park; they are only service stations for the general public; they are a bargain-counter system presided over by quacks; they are places where pebbles are polished and diamonds dimmed … The ultimate values of college education are best summarized in the well-known fact that with a Harvard diploma and a dime one can get a cup of coffee anywhere. The colleges are shamelessly robbing men of priceless years; in a half-century the degeneration of the American college will be complete. (pp. 187, 189)

Eells went on to take issue with these “picturesque exaggerations,” as he described them. It would appear that higher education has been, and will always be, under the microscope of public scrutiny. For higher education professionals, it would be quite natural to take offense at negative comments about the perceived quality of the field in general and, more specifically, the quality of instruction at the course level. Obviously, none of us can make the changes necessary to fix all of the prevailing concerns about the quality and value of a college degree. Each of us can, however, ensure that the courses we personally teach are offered at an optimal level that responds to the learning needs of our students. As an example, consider the following questions as they relate to your own actions as a faculty member:

Sometimes we can’t quite put our finger on something important because we’ve got all of our fingers wrapped around a bunch of other things that are not important. -Craig D. Lounsbrough, counselor/writer

»» What is important for students to learn and demonstrate in the courses that you teach?
»» How do you know whether they are actually mastering the prescribed course content?



»» How well does your teaching contribute to this process?
»» Are there things you could do better (e.g., more of some things, less of others) to advance the quality and quantity of their learning?
»» How, and on what basis, will you answer these questions?

The place to begin seriously considering these queries is with the final one: “How, and on what basis, will you answer these questions?” Quite often, faculty members answer quite glibly and without any supporting evidence. This is not a criticism; rather, it is an observation about how decisions and generalizations are often shared as the truth in the absence of data in every aspect of culture (i.e., “truthiness,” in the words of Stephen Colbert). It is at this point, however, that we can begin to consider the assessment of student learning and its derived data as a guide to crafting verifiable statements of fact about our students, their learning, and the manner in which teaching and learning practices contribute to or impede their attainment of course learning outcomes. Suskie (2009) provided some basic principles to assist faculty in the design and use of student-based assessments:

»» The best assessments are those whose results are used to improve teaching and learning and inform planning and budgeting decisions.
»» The greater the variety of assessment evidence, the more confidently you can make inferences about student learning.
»» Students should have multiple opportunities to develop and achieve key learning goals.
»» It is unfair to place full responsibility for a key program or institutional goal on one faculty member or one course.
»» Assessment is a perpetual work in progress. (p. 36)

These words of wisdom can guide our thinking as we proceed through this volume of The Toolbox Collection.
The chosen content aims to help faculty reflect on the assessment strategies available in their courses and determine the extent to which student performance should inform their choice of instructional strategies and learning experiences. Additionally, a list of sample assessment strategies is included to encourage alternate ways of thinking about how student learning can be verified and documented.




ENSURING THOUGHTFUL ASSESSMENT: STRATEGIES THAT MAKE THE GRADE

The assessment of student learning is pivotal to course design and delivery. Chosen assessment strategies must always be seen as more than a pathway to calculating final course grades. Good grades do not always equal good learning (Jennings, Lovett, Cuba, Swingle, & Lindkvist, 2013). In reality, although course-based assessments can lead to a grade calculation, they should always be viewed as primary evidence that students have accomplished identified learning outcomes. Faculty can add value and power to assessment by informing and engaging students through a transparent process in which expectations for learning are known and shared.

LINK ASSESSMENTS TO LEARNING OUTCOMES

In every course, faculty should intentionally connect learning outcomes, assessment practices, and experiences (Wiggins & McTighe, 2005). Many institutions have used rigorous academic approval processes to thoroughly vet general learning outcomes. Within these established parameters, however, instructors generally have tremendous latitude in designing their assessment strategies and course-related learning experiences. Although the popularity of the backward-design approach Wiggins and McTighe (2005) proposed is growing (i.e., generate learning outcomes, link assessments to identified outcomes, create learning experiences), faculty often focus primarily on creating engaging classroom experiences, typically leaving assessment strategies as afterthoughts. To elevate student assessment in the planning process, analyzing current assessment strategies in relation to the course’s learning outcomes is a good starting point. These questions can provide guidance:

»» Does a direct correlation exist between the critically important variables of learning outcomes and assessment strategies?
»» What types of assessment strategies can be included in courses to strengthen the evidence of student learning?

DEFINE EXPECTATIONS

Clearly described expectations for course performance (i.e., how performance will be measured, what defines the criteria for excellence) are one of the greatest gifts that instructors can give students. To ensure students understand assessment standards for course requirements, instructors should:

»» make students aware of the assessment tasks that will evidence their learning (e.g., descriptions, expectations, due dates) at the beginning of the semester;
»» designate class time to help students make the connection between assessments and course outcomes; and
»» give students a rubric that defines excellent performance for authentic assessment tasks (e.g., presentations, projects, group work, written products).



The Toolbox Collection • October 2018

THINK ABOUT DIFFERENT MODES OF ASSESSMENT

Assessment can come in many flavors and be customized to meet a variety of instructional needs. In the book Schools That Learn, contributor Bena Kallick (2000) provides the following illustration of the three types of knowledge that learners must acquire:

Imagine that you have a teenage son who is old enough to get a driver’s license—and you are a little nervous about it. You drive him to the licensing agency to take the multiple-choice written test on state driving laws. When he returns with a big grin to tell you that he scored well, you are pleased and relieved. At least he knows the shape of a stop sign, the speed limit in a school zone, and the need to yield to pedestrians. He has proven his mastery of formal knowledge: He knows (or knows where to find) the academic, explicit, codified facts that any expert would need at his or her fingertips. But are you ready to turn him loose with an automobile? Probably not. …

Eventually, after further hours of instruction behind the wheel, he passes the full-performance driving test. He proudly brings home his provisional driver’s license. He’s demonstrated applicable knowledge: the ability to transfer knowledge into action, even in situations that are less than routine. Under a variety of conditions, he has the proficiency he needs to produce results. You congratulate him, and he immediately asks for the keys to the car.

What do you do now? The tests—both the written and the performance test—are inadequate in themselves. All they show is that he knows how to pass the tests. … Formal tests, even good ones, are not enough to assess learning authentically. Before your son can drive the car (or at least mine) alone, he must also show signs of longitudinal knowledge: the basic capability for acting effectively over time in a way that leads to ongoing improvement, effectiveness, and innovation. (pp. 186-187)


Description of a grade: An inadequate report of an inaccurate judgment by a biased and variable judge of the extent to which a student has attained an undefined level of mastery of an unknown proportion of an indefinite amount of material. -Dressel (1983, p. 12)



Kallick goes on to categorize these forms of knowledge as demonstrated through teaching and learning:

»» Formal knowledge: “Knowing (or knowing where to find) the academic, explicit, codified facts that any expert would need … ” (p. 186)
»» Applicable knowledge: “The ability to transfer knowledge into action, even in situations that are less than routine … ” (p. 187)
»» Longitudinal knowledge: “The basic capability for acting effectively over time, in a way that leads to ongoing improvement, effectiveness and innovation … ” (p. 187)

Think for a minute about how and where these varied forms of knowledge (and the manner in which they are assessed) could be included in your courses.

VARY TECHNIQUES

Providing students with a range of assessment options capitalizes on their learning styles and strengths. When considering learning outcomes and related assessments, instructors should think creatively about how students can demonstrate their learning, mastery of course content, and new skills:

»» Do students have varied outlets (e.g., examinations, written work, creative endeavors) that offer different strategies for assessment to communicate their knowledge and expertise?
»» Do assessment techniques tap into a hierarchical range of skills (e.g., the elements of Bloom’s taxonomy)?

PROVIDE FEEDBACK

Excellent assessment techniques lose their power if students do not receive prompt, specific feedback. Just as students are accountable for submitting assignments on time, faculty should return graded work as quickly as possible, considering these questions:

»» What is a reasonable amount of time in which to grade and return student work?
»» Do students need specific feedback on submitted assignments (i.e., comments that will assist them in their learning or enhance future performance)?

LEARN FROM ASSESSMENTS

Instructors can use course-based assessments to inform and improve teaching by reviewing student work at the end of every semester. This includes:

»» analyzing student assessment data to determine consistent areas of deficient performance (e.g., areas that may need additional emphasis or a different form of assessment);
»» finding gaps in the assessment model that need strengthening (e.g., areas where assessments appear incomplete or inadequate); and
»» examining assessment results to determine particular topics or areas of concentration that need greater or lesser emphasis.

Assessment is more than assigning a grade. To make it vital and relevant, instructors should review current techniques, advise students of expectations, and analyze the data that emerge from students’ work.




GOOD GRIEF, ANOTHER PAPER? THE FIVE STAGES OF GRADING

Presented with a weekend of grading term papers, quizzes, or lab reports, instructors might experience something akin to Elisabeth Kübler-Ross’s five stages of grief. This takeoff on Kübler-Ross’s model (Kübler-Ross, n.d.) offers a humorous take on the often arduous task of grading:

»» Denial—At this stage, the instructor is unwilling to acknowledge the size of the task ahead.
»» Anger—Anger usually begins once the instructor starts grading. Finding repeated errors in material covered in class can lead to disillusionment.
»» Bargaining—In this stage, the instructor makes an earnest attempt to buckle down and grade but negotiates a reward for completing a certain amount of work (e.g., taking a TV break after grading five papers; having a piece of candy for every page graded).
»» Depression—At some point in a marking weekend, the instructor will come to realize that, in spite of good intentions, the papers won’t be marked in time for the next class.
»» Acceptance/resignation—Finally, the instructor comes to terms with the reality that the papers must be graded and, having finished them, gets primed to begin the process all over again when the next major assignment comes in.

This article was originally published in September 2014.




CREATING MEANINGFUL ASSIGNMENTS

Course assignments, in their various forms, can have a tremendous impact on student learning. Assignments perform a formative function by providing information on how well students are grasping course content and a summative function by serving as evidence of achieving the identified learning outcomes. Robert E. Stake captured the difference between these two aspects of designing assignments as meaningful evidence of student learning: “When the cook tastes the soup, that’s formative; when the guests taste the soup, that’s summative” (Patton, 2008, p. 171). If we are serious about student learning and the ability to document the extent to which it occurs, the creation of meaningful assignments becomes a key element in course design.

Two of the five benchmarks for effective educational practice that the National Survey of Student Engagement (NSSE) identified as “some of the more powerful contributors to learning and personal development” (NSSE, n.d., p. 1) are linked directly to course assignments:

»» Level of academic challenge—Challenging intellectual and creative work is central to student learning and collegiate quality. Colleges and universities promote high levels of student achievement by emphasizing the importance of academic effort and setting high expectations for student performance.
»» Active and collaborative learning—Students learn more when they are intensively involved in their education and are asked to think about and apply what they are learning in different settings. Collaborating with others in solving problems or mastering difficult material prepares students to deal with the messy, unscripted problems they will encounter daily, both during and after college. (NSSE, n.d., p. 1)

When thinking about coursework, there are several key considerations for creating meaningful assignments that promote and extend student learning.
These suggestions apply to individual student assignments as well as group projects:

»» Avoid “busy work”—Students often complain about assignments that seemingly have little connection to course learning outcomes, are almost mindless in the level of required effort, or are graded with a perfunctory checkmark indicating completion as opposed to constructive or evaluative feedback. Keep the three Rs—rigorous, relevant, and reflective—in mind when creating course assignments. These elements are the essence of assignments that embody a high level of academic challenge.
»» Connect the dots—Intentionally share connections between required assignments and course learning outcomes. One way to strengthen the value of an assignment is to intentionally draw direct connections and parallels to these outcomes, which also represent the expectations for student performance. This can be done in the course syllabus and in classroom discussions (e.g., “This assignment is connected with the following outcomes that we wish to accomplish in this class … ”). Establishing this link should become routine when framing and presenting course assignments.



»» Do as I say and as I have done—To acquire a better understanding of the challenges that students may face in their coursework, Kalchman (2011) recommended faculty actually complete an assignment before including it on the syllabus. This practice also offers added insight into the level of academic challenge, as well as the clarity (or lack thereof) of the directions given for the assignment, along with the associated grading rubric.
»» Provide transparent assessment criteria—Students want and expect specific information on how their work will be graded. Rubrics are an effective way to define levels of excellence and rigor by presenting the grading criteria and scoring weights for the different variables in an assignment (e.g., organization, grammar and spelling, use of reference materials, critical thinking). (See “The Rubric: A Tool for Authentic Assessment.”)
»» Assign a variety of tasks—Assignments come in many flavors (e.g., quizzes and examinations, writing projects, presentations, live performances, portfolios). Each of these varied formats has strengths and weaknesses in terms of what is measured and the level of effectiveness. Using the course objectives as a starting point, instructors need to plan how their students can provide evidence for mastering a learning outcome and the best way to measure that evidence. For instance, if basic knowledge about the course topic (e.g., vocabulary, key principles, people and places) is an essential learning outcome, then an objective quiz may be the most effective assignment and evaluation. If, however, critical thinking and analysis are desired outcomes, then a research paper, essay, or presentation might better assess students’ progress toward the goal.
»» Use authentic audiences—Quite often, the only people who actually see a written assignment are the student and the instructor.
Expanding this group to include authentic audiences (i.e., naturally occurring external individuals or groups with interest and expertise related to the developed product) can provide students with feedback from others in the discipline. Examples of assignments that can be submitted to an authentic audience, and which can have meaningful, real-world applications


Good design is making something intelligible and memorable. Great design is making something memorable and meaningful. -Dieter Rams, German industrial designer



for students in their community and professional lives beyond college, include developing a grant application; submitting a proposal for a conference presentation; writing a letter to the editor of a journal, magazine, or newspaper; and working with a faculty member to craft a journal article.

»» Require assignments early and often in the course—Multiple assignments scheduled over the span of a semester provide a range of data points, increasing the validity of the assessment process and better demonstrating student learning and achievement of desired outcomes. If the assignments are to be effective, however, instructors need to consistently offer students constructive feedback on completed work. This way, students can monitor their progress and improve performance on subsequent assignments.

Carefully and intentionally developed assignments, as well as the effort put into grading them and giving helpful feedback, offer a means to directly enhance the learning and personal development of students and provide evidence of that growth. Creating meaningful assignments should be part of the skill set in every instructor’s toolbox of effective educational practices.

This article was originally published in July 2015.




MAKING THE GRADE: GET MAXIMUM BENEFIT FROM COURSE-BASED ASSESSMENTS

The past several years have seen increasing interest in higher education related to the assessment of student learning. Tom Angelo (1995) defined assessment as an ongoing process aimed at understanding and improving student learning. It involves

»» making our expectations explicit to the public;
»» setting appropriate criteria and high standards for learning quality;
»» systematically gathering, analyzing, and interpreting evidence to determine how well performance matches those expectations and standards; and
»» using the resulting information to document, explain, and improve performance. (p. 7)

These criteria traditionally have been used to define universitywide assessment practices that are often tied to accreditation. Such principles also have significant implications for faculty as they plan and deliver the assessment of student learning at the course level. Using Angelo’s definitions as a framework, the following sets of questions can help faculty think through this process and possibly help students become more successful learners.

MAKING EXPECTATIONS KNOWN

The goal: Consciously and consistently remind students of the identified learning outcomes for your course and their connection to chosen assessments.

»» At the beginning of each semester, as part of the course introduction, do you review course learning outcomes?

I think it’s very important to have a feedback loop, where you’re constantly thinking about what you’ve done and how you could be doing it better.

-Elon Musk, CEO, SpaceX

»» As your courses begin, are students fully aware of all assessments and their relative contributions to the overall course grade?



»» Does the course syllabus fully outline due dates and scheduled assessments?
»» As the semester proceeds and you share new content or upcoming assessments, do you make a deliberate connection to course learning outcomes?

SETTING HIGH EXPECTATIONS FOR STUDENT LEARNING

The goal: Communicate your belief that students can be successful in your classes; also, delineate what success looks like. Kuh (2003) observed:

Students typically don’t exceed their own expectations, particularly with regard to academic work. But students will go beyond what they think they can do under certain conditions, one of which is that their teachers expect, challenge, and support them to do so. (p. 28)

»» For authentic assessments (e.g., presentations, product creation), do you provide students with rubrics that define excellent performance?
»» Do you share (with permission) the work of previous students that rises to the level of excellence? (Catapano, n.d.)
»» Do you provide students with scaffolding information (e.g., study guides, sample question formats) to help them prepare for quizzes and examinations?

SYSTEMATICALLY GATHERING AND INTERPRETING STUDENT PERFORMANCE DATA

The goal: Use student performance data (e.g., areas of observed difficulty, item analyses of quizzes and tests) as diagnostic tools to identify areas of instruction that could be realigned to increase understanding and performance.

»» Do you analyze student performance on course-related assessments to identify areas where modified forms of instruction may be warranted (e.g., putting more emphasis on or changing the way you teach certain topics)?
»» Do you consistently modify your course design and delivery as a pathway to assist students in their learning?

USING RESULTS TO DOCUMENT, EXPLAIN, AND IMPROVE PERFORMANCE

The goal: Give prompt, meaningful feedback to students to help them improve performance. Kuh (2003) suggests: “Students read and write when we demand it. And in concert with other effective practices—prompt feedback, for example—they learn more” (p. 28).

»» Are grades on objective assessments (e.g., quizzes, tests) quickly posted to your learning management system (e.g., within 72 hours)?
»» Do authentic assessment results go beyond a letter or number grade to include constructive comments encouraging improved performance?
»» Do you debrief quizzes and examinations to improve student understanding of course content? (Winkelmes, 2013)

Solid assessment practices are a critically important aspect of teaching and learning. Faculty play a key role in assuring that assessments are chosen, and results used, to assist students in their learning while also serving as a guide for course- or program-level modifications.

This article was originally published in May 2015.




ASSESSMENT à la CARTE

Imagine you are extremely hungry and have just been given the opportunity to partake of a meal at one of your favorite restaurants. The food at this eatery is extraordinary! You decide to throw caution to the wind and ignore everyday concerns about calories, fat content, and carbohydrates. As the waiter approaches, you struggle with what to order because you really enjoy several items on the menu. You hesitate, then describe your dilemma to the waiter. Much to your surprise, he invites you to simply pick and choose from a variety of meal options based upon your own personal preferences. Let the meal begin!

Consider this illustration in relation to course design and student learning. As teachers, we all strive to provide instructional opportunities that will maximize the extent to which our students gain new information, understanding, skills, and concepts. Quite often, however, course syllabi reveal a one-size-fits-all mentality. Although we know each of our students learns differently and brings varied levels of competence and skill to the classroom, everyone is given identical assignments and tasks over the course of a semester. There is an alternative: teaching à la carte! In this approach to course design:

»» Individual differences are acknowledged.
»» The demonstration of learning can occur in a variety of ways.
»» Students have the opportunity to select their own learning activities from a menu of choices.

The implementation of teaching à la carte requires three easy steps.

STEP ONE: POINT OUT THE BASICS Identify those basic learning activities and course requirements that you believe all students should complete. Examples might include reading the assigned text, attending class, engaging in classroom discussions, or participating in tests, quizzes, and examinations.

STEP TWO: CREATE A MENU Make a “menu” of additional learning activities that students can choose from as a way of demonstrating their learning and applying the information they are gaining through their reading and participation in class. A sample listing of potential menu items is provided below.

STEP THREE: GIVE POINTS

Assign point values to the various required and optional experiences that will comprise your assessment system. For example, based on a 1,000-point system, students might be given the following alternatives:

»» Readings in textbook—100 points
»» Class attendance—100 points
»» Quizzes—100 points
»» Midterm examination—100 points
»» Final examination—100 points
»» Performance contract (i.e., items totaling 500 points that students select from the menu)



Under this proposed arrangement, students can select several activities totaling 500 points (or more). At the end of the semester, the total number of points that students accrue (from required and selected items) will determine their final grade in the course.

SAMPLE MENU ITEMS

The following are potential items that could be included in a learning menu, along with some ideas regarding point values (based on a 1,000-point scale). Each of these items has been field-tested. As an instructor, you will need to judge the relevance of these activities to your own discipline or the degree to which they should be modified.

»» Field interviews (200 points)—Interview at least three professionals currently employed in a position related to your academic discipline. Prepare a summary of your interviews, synthesizing the data obtained and generating relevant conclusions and observations.
»» Research paper (200 points)—Write a research paper on a course-related topic pre-approved by the instructor. Your paper should be five or more pages in length (word-processed, 12-point font, double-spaced, 1-inch margins on top, bottom, and sides). Include a reference page citing at least six references from the professional literature (with an emphasis on articles appearing in refereed journals). A rubric will be provided to specify guidelines and grading expectations.
»» Video reviews (200 points)—Watch eight videos from an approved list related to the content of this course. Provide a written review of each video using the format provided.

Learning is a treasure that will follow its owner everywhere. -Chinese Proverb

»» Supplementary fiction (200 points)—Choose a book from the supplementary reading list. After reading, write a three-page essay containing the following components: (a) the basic thesis of the book, (b) the section of the book that had the greatest impact on you as a person, (c) applications and connections to your life, and (d) implications of this book for your vocation/career.
»» Job shadowing (200 points)—Shadow a professional in the career area or field you would like to enter. Write about your experiences and insights gained in a journal.


www.sc.edu/fye/toolbox


The Toolbox Collection • October 2018

»» Resource notebook (150 points)—Develop a resource notebook of materials that will be useful to you. The notebook should have at least 100 pages of content selected from a variety of sources. Organize these resources with topical dividers.
»» PowerPoint (150 points)—Take some aspect of course content and develop a PowerPoint presentation that illustrates an important principle or concept. The presentation can include no more than 10 pictures and 25 words. Be creative. Refer to the rubric provided as a guide to slide and presentation development.
»» Video (200 points)—Prepare a video of three minutes or less exploring one of the key principles discussed in the course.

An à la carte approach to assessment can give your students a greater level of choice and control over how they demonstrate their mastery of course learning outcomes.

This article was originally published in November 2004.





DEADLINES AND DUE DATES

In this fast-paced world, there are always tasks to be completed and deadlines to be met. These realities are a constant part of daily life. If one of the purposes of college is to prepare students for the demands of the workplace, then to what extent are institutions exposing students to the rigor of meeting deadlines and working within a world of perpetual due dates? Murray (2008) has argued that colleges are not effectively helping students with this real-world transition—that rather than creating policies and practices that challenge students to demonstrate personal responsibility, colleges tend to coddle them. This chapter examines the ways that developing and enforcing course deadlines can contribute to student growth and increased levels of responsibility.

LIFELONG COMPANIONS: DEADLINES AND EXCUSES

Given human nature, it is likely that for every deadline that has ever been set, at least one excuse (if not more) has been offered as to why it was unavoidably missed. Schwartz (1986) identified the top five categories of student excuses for the late submission of assignments: (a) death of a grandparent, (b) an accident involving a friend or relative, (c) automotive problems, (d) animal or pet trauma, and (e) the unfortunate results of crime victimization. The questions that arise for faculty when students use these excuses are:
»» Is the student telling the truth?
»» What is the published class policy on submission of late assignments?
»» Should I make an exception?
»» Am I willing to enforce the stated policy in spite of the presented excuse?

LIFELONG ENEMIES: DEADLINES AND PROCRASTINATION

It is arguably a common human trait to stall the completion of assigned tasks in the shadow of a looming deadline. Zarick and Stonebreaker (2009) cite the example of the well-known annual deadline for completing federal income tax forms: April 15. Everyone knows the deadline will arrive on or close to this date each year. Yet thousands wait until the evening before to complete that onerous task and then race to the post office just in the nick of time. According to Zarick and Stonebreaker, possible reasons for this prevailing pattern of procrastination include (a) uncertainty (i.e., not being sure what is expected), (b) aversion (i.e., people are less likely to start something they really do not want to do), (c) fear (i.e., the perceived inability to successfully complete the task), and (d) poor planning (i.e., time runs out in the midst of other demands). Waiting to complete a task that is not understood or is disliked, feared, or unplanned often results in poor performance and negative consequences.

CREATING A POSITIVE LEARNING EXPERIENCE (WITH DEADLINES ATTACHED)

To successfully incorporate firm assignment deadlines into a course, remember the following guidelines:
»» Create a schedule for the semester that includes assignment deadlines. If deadlines are to become a part of course expectations, the syllabus must include a detailed schedule of the dates, times, and topics for all planned learning experiences, including assignments.
»» Be specific about deadlines. A syllabus-based articulation of deadlines should include the date, time, and manner of submission (e.g., through the course management system, via email, at the beginning of a scheduled class).
»» Make clear the consequences for late or missing assignments. Students should be made aware, in advance and in the syllabus, of the consequences for submitting assignments after the deadline



(e.g., for each 24-hour period an assignment is late, 25% of the available points will be deducted).
»» Refrain from adding assignments to the schedule. A faculty behavior that often draws the greatest number of complaints from students is unexpectedly adding new assignments not listed on the syllabus. All work for the class should be clearly stated in the syllabus prior to the beginning of the semester.
»» Avoid changing the schedule over the course of the semester. Sometimes unexpected and disruptive events occur during a term (e.g., weather-related problems, faculty cancellations because of illness). To resolve any delays caused by these unforeseen occurrences, initiate a discussion with the class as to how things will proceed from that date forward to the end of the semester. Make sure to announce a clear and precise resolution of these matters as related to due dates.
»» Provide reminders of due dates for upcoming assignments. To help keep students and the class as a whole on track, consider providing routine reminders of upcoming course-related events (e.g., “Next week, we will be discussing …”; “Your blog entries should be entered by Friday at 5 p.m.”).
»» Be consistent about deadline policies. Consistency is key when administering a deadline policy. Make every effort to apply deadline rules in a manner that is fair, equitable, and reasonable. On occasion, a situation may require an exception (e.g., a documented illness); however, as a general rule, the policy should be maintained and enforced as written.

If it weren’t for the last minute, nothing would get done. -Rita Mae Brown, American writer

»» Periodically review deadline policies and their consequences. Take time to routinely review the rules in place, assess whether they are working effectively, and consider ways they can be communicated more clearly or administered more efficiently and fairly.
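A deadline policy like the one quoted earlier (25% of the available points deducted for each 24-hour period an assignment is late) can be stated precisely as a small function. A sketch, assuming a partial period counts as a full one; the function name and that reading are illustrative, not the article's:

```python
import math

def late_penalty_score(earned_points, available_points, hours_late):
    """Deduct 25% of the available points for each started 24-hour
    period an assignment is late; the score never drops below zero."""
    periods = math.ceil(hours_late / 24) if hours_late > 0 else 0
    deduction = 0.25 * available_points * periods
    return max(earned_points - deduction, 0)

# An assignment worth 100 points, earning 90 but submitted 30 hours
# late: two 24-hour periods have started, so 50 points are deducted.
print(late_penalty_score(90, 100, 30))  # -> 40.0
```

Spelling the rule out this way also surfaces the ambiguities a syllabus should resolve in advance, such as whether a one-hour delay counts as a full 24-hour period.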

This article was originally published in September 2011.





THE RUBRIC: A TOOL FOR AUTHENTIC ASSESSMENT

There is a growing trend in higher education toward relying on authentic assessments (e.g., presentations, exhibits/displays, portfolios, blogs/wikis, written responses) as a means to assess student learning. As opposed to more traditional objective assessments (e.g., true/false, multiple choice, fill-in-the-blank), authentic assessments typically require students to move beyond simply remembering and understanding to higher levels of thinking: applying, analyzing, evaluating, and creating. Wergin (1988) summarized the important connection between assessment and the ways that students learn:

If we have learned anything from educational research over the last 50 years, it is that students learn according to how they are tested. If we test students for factual recall, then they will memorize a set of facts. If we test them for their ability to analyze relationships, then they will begin to learn to think critically. If we assess how well they can apply classroom material to concrete problems, they will learn to do that. But despite the general agreement that classroom assessment procedures have a powerful influence over learning, testing is the bane of most faculty members’ lives. (p. 5)

One of the challenges faced by faculty moving in the direction of authentic assessments is designing a strategy to consistently evaluate student performance in relation to identified learning outcomes. An excellent tool for this purpose is the rubric. According to Montgomery (2002), the rubric provides a consistent mechanism that allows the faculty member to ask several important questions:
»» What are the parameters of a quality process or product?
»» Are the expectations for excellence clear to the students and the instructor?
»» What have the students learned after completing a task?
Rather than simply labeling a student’s work as “well done” or “lacking in overall quality,” for example, the rubric allows the instructor to efficiently and succinctly identify those aspects of the assigned task that were completed with excellence and those that failed to meet the established criteria. Most rubrics are arranged as a matrix, with a series of dimensions or variables listed on the vertical axis (e.g., organization, quality of references, grammar/mechanics, creativity) and a scale of values or descriptors listed on the horizontal axis (e.g., excellent, good, fair, poor). In the corresponding boxes within the matrix (i.e., where the two axes meet), descriptions or examples are included, both for the student completing the assignment and for the instructor evaluating it. The following scenario illustrates applying the rubric to the assessment of a common classroom task.
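The matrix arrangement described above maps naturally onto a simple data structure. The dimensions, descriptors, and point values in this sketch are illustrative only, not taken from the article:

```python
# A rubric as a matrix: dimensions down the rows, a shared scale of
# descriptors across the columns (4 = Excellent ... 1 = Poor).
RATING_SCALE = ["Excellent", "Good", "Fair", "Poor"]

rubric = {
    "Organization": ["Logical, easy to follow", "Mostly logical",
                     "Some lapses in flow", "Hard to follow"],
    "Quality of references": ["6+ refereed sources", "4-5 sources",
                              "2-3 sources", "0-1 sources"],
    "Grammar/mechanics": ["No errors", "A few minor errors",
                          "Several errors", "Frequent errors"],
}

def score(ratings):
    """Convert per-dimension ratings into a point total."""
    points = {label: 4 - i for i, label in enumerate(RATING_SCALE)}
    return sum(points[ratings[dim]] for dim in rubric)

print(score({"Organization": "Excellent",
             "Quality of references": "Good",
             "Grammar/mechanics": "Fair"}))  # -> 9
```

Laying the rubric out as data makes the evaluation mechanical and transparent: the same descriptors the student sees drive the instructor's score.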

THE SCENARIO: You have assigned your students the task of creating a digital presentation that summarizes their investigation of a topic related to course content. To communicate your expectations for this project, you provide the following rubric:





Figure 1. Sample rubric for a digital presentation. Created using the rubric generator at http://rubistar.4teachers.org

These criteria are reviewed with the students and discussed in class before they embark on the assigned project. A question-and-answer time provides needed clarification for students. Once the projects have been completed, the same rubric, marked with the results of the instructor’s evaluation, can be returned to the students for their reference and understanding. As a follow-up, the instructor can offer to meet with students to review the rationale for their performance rating.

GETTING STARTED

Several excellent websites contain sample designs related to a wide variety of project assignment areas, including written reports, digital presentations, group participation, musical and dramatic performances, and artworks (e.g., http://www.teach-nology.com/web_tools/rubrics/ and http://rubistar.4teachers.org/). The examples can easily be adapted for specific classroom uses, and the sites provide online tools to help faculty identify and choose their own rubric variables and descriptive criteria. In addition, bringing students into the discussion and rubric development

Students should be attempting to complete tasks that are visible and meaningful. ... A performance is in a sense autonomous; it is an accomplishment. ... Studying for a test is an activity, but not a performance. -Tagg (2003, p. 155)



process can engage them more fully in their learning at the beginning of a project. During a class session before an assignment, students can spend time identifying the most important variables to consider during the assessment process. This gives students a real sense of ownership of the process and the outcomes. According to Chun (2010), one of the primary purposes of assessment is to determine the degree to which chosen instructional strategies have the desired effect on student performance (i.e., if students are not mastering identified learning outcomes, then the chosen pedagogical techniques may need to be altered or abandoned). Thinking in this way requires cogent and meaningful assessment techniques that directly link teaching and learning. Properly constructed and validated, rubrics can serve that purpose and act as a powerful tool for strengthening your awareness of student learning and the power of your teaching.

This article was originally published in September 2010.





A COLLECTION OF ASSESSMENT TOOLS AND STRATEGIES

Here are some ideas to get you thinking about new and different ways to assess student learning in your courses.

NO-FAULT QUIZ

As reported by Sporer (2001), students often ask, “Can I get extra credit?” One way to provide this opportunity in a meaningful way is the No-Fault Quiz. Students are given a 5- to 15-question quiz (e.g., multiple choice, true/false, fill-in-the-blank) covering the previous week’s content. Points gained on the No-Fault Quiz count as extra credit, and points that are “missed” do not count against the student. Answers to the questions are provided immediately after the quiz. Student performance is recorded, and students are exposed to quiz questions that can later serve as a source for review.

HAND-IN DATES

Typically, course syllabi specify the assignments that students are expected to complete during the semester and the dates each of those assignments is due. An alternative is the Hand-In Date strategy:
»» At the beginning of the semester, students receive a list of the assigned projects, papers, and presentations to be completed.
»» Instead of providing specific due dates for each assignment, the instructor provides a series of “completion dates” (e.g., Completion Date No. 1, Completion Date No. 2).
»» Students choose the order in which they complete the assigned tasks for the semester. They must submit one completed assignment on each of the designated completion dates.

This strategy allows students to decide which assignments they will complete first and which will require the greatest amount of time. Additionally, it provides an opportunity to front-load assignments and prevents the common practice of requiring a massive amount of completed work during the final two weeks of the semester.

EXIT CARDS

Davies and Wavering (1999) describe a procedure for encouraging students to engage in ongoing reflection about their learning. Exit cards provide a strategy for students to process what they are learning and apply that information to their chosen discipline. On a weekly basis, students are asked to complete a 5x8 card that contains three questions:
»» What?
»» So what?
»» Now what?



The “What?” question focuses on the content presented and learned during a particular class session. The “So what?” question is designed to elicit a summary of the main points discussed and reviewed during the week. The “Now what?” question requires students to relate that content to their lives, learning, and future roles and responsibilities.

PROBLEM-BASED LEARNING

Problem-based learning (PBL) is a strategy receiving an increasing amount of attention in higher education. Its origins are often attributed to Howard Barrows of McMaster University in Canada, who developed the technique as a way of teaching medical students (Rheem, 1998). Duch, Groh, and Allen (2001) describe the PBL process:

In the problem-based learning approach, complex, real-world problems are used to motivate students to identify and research the concepts and principles they need to know to work through these problems. Students work in small learning teams, bringing together collective skill at acquiring, communicating, and integrating information. (p. 6)

PBL provides an excellent mechanism for helping first-year students gain and practice skills that will serve them well throughout their higher education experience and beyond. In particular, PBL helps students develop critical thinking, improve verbal and written communication techniques, learn to work as members of a team, seek solutions to real-world problems, and contextualize the acquisition of new knowledge and skills. Typical phases of the PBL process include:
1. presenting the problem or scenario,
2. clarifying the questions involved in the problem,
3. generating possible hypotheses,
4. gathering information related to the problem—typically done with team members investigating various components of the issue,
5. synthesizing the derived data, and
6. creating a final response and action plan.

Beyond these basics, the construction and execution of PBL scenarios can go in a number of directions based on course-related learning outcomes. The presented scenarios can originate from actual facts (e.g., current news reports, data, historical documents, audio/video clips) or imagined scenarios (e.g., fictional charts, reports, research based on real events).
Through group discussions and individually written responses, the scenarios should require students to grapple with conflicting data and points of view, their personal values and ethical dilemmas, and interdisciplinary perspectives on real-world issues and concerns. In addition to discussions and reflection papers, PBL activities often result in the creation of one or more final products (e.g., presentations, memoranda, reports, songs, poems). PBL can be implemented in a classroom setting or within an online learning structure; in large or small classes; and over a semester, a block of time within a course, or a single class session. Finally, PBL provides a great venue for creating interdisciplinary learning opportunities. A list of sample topics and relevant disciplines is included in Figure 2.





Figure 2. Sample PBL scenarios and suggested disciplines.

ALTERNATIVE WAYS TO DISPLAY KNOWLEDGE

It is quite common for faculty to assign writing projects in the form of lengthy, referenced research papers. There is value to be gained from doing the necessary research, synthesizing the derived information, and then summarizing conclusions and findings in a formatted paper. As an alternative, however, or even as a supplement, it is helpful to have students condense their main findings into a different format. The following examples (Pecha Kucha, wikis, and Pinterest) force students to think differently (e.g., moving content they have summarized in writing into a more visually oriented venue).

Conversation is a meeting of minds with different memories and habits. When minds meet, they don’t just exchange facts: They transform them. … Conversation doesn’t just reshuffle the cards: It creates new cards. -Theodore Zeldin, English philosopher (2000, p. 14)

PECHA KUCHA

Pecha Kucha (pronounced peh-chak-cha), which means “chit chat” in Japanese, is an electronic presentation format composed of exactly 20 PowerPoint slides with exactly 20 seconds of narrative per slide, for a total running time of 6 minutes and 40 seconds. These presentations can be performed live or recorded




using available computer software (e.g., iMovie). Given a topic to explore and present, students must consider the facts and concepts to include (or omit) and then create a visual and auditory display within the time constraints of the Pecha Kucha format. Pecha Kuchas are fun to create and also require planning and a mastery of the topic. You can find samples at www.pechakucha.org.

WIKIS

A wiki is a website that allows users to edit content actively and collaboratively. One option for creating wikis is Wikipedia (http://en.wikipedia.org/wiki/Creating_a_new_page), whose sponsors provide specific directions for creating a page on a new topic. These directions are easy to follow and allow instructors to limit access to only students enrolled in their classes. Instructors could ask students to create a wiki and collaboratively and asynchronously build a body of knowledge (i.e., words, pictures, quotations, external sources) on an assigned topic. At the end of the semester, students would be evaluated on the quality and organization of their article. As an additional component, teams could develop their own articles and provide constructive criticism for other groups. This interactive dialogue should increase student involvement and, ultimately, the quality of their work.

PINTEREST

Pinterest (www.pinterest.com) is a website that lets students collect and organize pictures, video and audio clips, and articles around an assigned topic. Omnicore (2018) reports that Pinterest currently attracts almost 175 million active users and is growing faster than Facebook and Twitter. Students could develop a Pinterest display around a course-related topic and then share and pool the acquired information to review and process key concepts. Within the context of a course, students could be assigned a related topic and charged with gathering the information and resources necessary to teach their classmates the most salient content and principles. The accumulated bank of boards then becomes a databank for learning and review for the entire class.

DRABBLE

Looking for a new way to challenge students in their writing, critical thinking, creativity, and connections with course content? Have them create a Drabble, a unique, succinct, and fun writing exercise, as a classroom activity or assignment. These very short novels help students choose their words wisely and think carefully about what they want to say. The Drabble comes from an unlikely source: the British comedy troupe Monty Python and its 1971 publication, Big Red Book (Chapman et al., 1971), which describes this word game as a competition in which the winner creates a novel using 100 or fewer words. In 1987, Steve Moss, then editor of the New Times, an independent newspaper in San Luis Obispo, California, took the art form to a new level, or, more accurately, a new low. Under Moss’s leadership, the newspaper began sponsoring annual Drabble contests but changed the original Monty Python format, reducing the allowable number of words to 55. Other rules include the following:
»» The novel must include a setting.
»» The Drabble should feature one or more characters.
»» The story must include conflict.
»» The conflict must be resolved. (Drabble, 2018)

Here are two Drabbles, the first using the longer format and the second the shorter:
»» Whodunnit?—Only five of us made it to the escape pod. We peered through the viewports at the cataclysmic destruction of our spacecraft as we spun away into the Deep Black. It had been a deliberate, traitorous act—some mole undetected by the crew—and we knew that the saboteur had to be on board. We exchanged suspicious glances. I studied the others closely. The second technician




was looking particularly nervous—fidgeting and sweat-soaked. The navigator didn’t meet anyone’s eyes for long. I sat back, thinking hard. Which one of them would be the first to guess it had been me? (“52 Drabbles,” n.d.)
»» Love on the Net—Sarah sat at the table, awaiting her perfect match, pink carnation held at the ready. When Jason arrived, their eyes immediately met across the crowded café. As he sauntered over to her, his smile quickly faded. “You told me you looked like George Clooney,” she said. “You told me you were a woman!” he retorted. (“WriteWords,” n.d.)

Effective Drabble writing requires students to consider carefully the plot of their stories and choose words that fit within the constraints of the 55- or 100-word format—great skills for any writer to develop. Drabbles also can be valuable learning experiences across academic disciplines and courses. The following examples demonstrate the range of activities that Drabbles can inspire in the classroom:
»» Students can create a Drabble on a specific discipline- or content-related concept in any class connected to specific course outcomes, themes, or subjects (e.g., mitochondria, Mark Twain, quadratic equations, existentialism).
»» The shortened format can help students outline the main points they want to communicate in another writing assignment, allowing them to brainstorm ideas and making the Drabble a preview of that work.
»» Students can post their work on the class’s learning management system and then critique their peers’ novels in a Drabble gallery.
»» To add an extra step, students can illustrate their Drabbles with one photograph.
»» To make the assignment more creative and incorporate technology, students can transfer their Drabbles into a comic book format or a movie, using iMovie, for example.
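Checking a draft against the 55- or 100-word limit is a one-line computation. A sketch, assuming a simple whitespace-based word count (a contest's official rules may count hyphenated or compound words differently):

```python
def is_valid_drabble(text, limit=55):
    """Check a story against the 55-word New Times format; pass
    limit=100 for the original Monty Python rules."""
    return len(text.split()) <= limit

story = "Sarah sat at the table, awaiting her perfect match."
print(is_valid_drabble(story))  # -> True
```

Students could run their drafts through a check like this before posting to the class Drabble gallery.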

MIND MAPS

Learning is a fascinating process. Throughout our lives, and in many ways, we learn and master various combinations of facts, concepts, and principles. A great deal of learning occurs in a linear fashion (e.g., formulas, computations, sequential lists of facts, rules of grammar). This type of learning can be depicted graphically with outlines, flow charts, and tables, illustrating the one-dimensional relationship between pairs of informational variables. One of the aims of higher education, however, is to promote learning that requires deeper, more critical thought, including the ability to analyze relationships and make multiple connections between facts and concepts. A mind map (aka concept map, cognitive map) is a tool that organizes words, thoughts, ideas, tasks, activities, and more in the form of a diagram,

… start[ing] with a key or main idea in the center with subtopics [arranged] radially around the main idea. The subtopics group and cluster similar ideas, and they branch out to lower-level topics, guiding you to wherever your thought processes lead you. (Arthur, 2012, p. 9)

The origin of the mind map has been traced back to the Phoenician philosopher Porphyry of Tyros, who created graphic depictions of Aristotle’s Organon. The modern rendition, however, is attributed to author and educational consultant Tony Buzan (1974), who compared the mind map with the map of a city:

The centre of your Mind Map is like the centre of the city. It represents your most important idea. The main roads leading from the centre represent the main thoughts in your thinking process; the secondary roads represent your secondary thoughts, and so on. Special images or shapes can represent sites of interest or particularly interesting ideas. (Buzan, 2012, p. 6)




Tsinakos and Balafoutis (2009), summarizing current research on mind maps, suggested the following process for map creation:
»» Begin with a blank (preferably large) sheet of paper.
»» Place the main topic of the map in the center of the page. The topic should be depicted in a large, colorful manner and accompanied by a graphic design or picture.
»» Draw radiating, curved lines (branches) from the main topic to represent key ideas. Branches should be drawn in bright colors, identified with one-word labels, and, if the content is sequential, numbered in a clockwise manner.
»» Add sub-branches as necessary to clarify and define the topic.
»» Use arrows, geometric figures, punctuation marks, symbols, and pictures to prioritize the importance of the content.

You can integrate mind maps into the process of teaching and learning in first-year courses in several ways:
»» For effective pedagogy, it is helpful to provide students with intentional strategies to summarize and synthesize content presented in textbooks, classroom lectures, or demonstrations. For example, within small groups, students could create mind maps for a key topic or concept under investigation. These could be shared in class or posted in the discussion forum housed in the campus learning management system (LMS). A collection of mind maps can also help students prepare for examinations.
»» One of the skills that best prepares students during their college journey, as well as after graduation, is the ability to communicate in writing. Good writing is preceded by good planning. Using a mind map helps students organize their thoughts and conceptualize the components of written assignments. After creating their initial map drafts, students can either participate in a peer review process or submit their maps to the instructor for suggestions.
Experiencing the benefits of good planning can foster the habit of using mind mapping as part of the writing process.
»» With the growing popularity of mind maps as a strategy for learning, free online tools have been created to assist in this process. One such website is Coggle (https://www.coggle.it/), where students work from a blank white screen using a palette of colors and design tools that can turn anyone into an expert mind mapper in a matter of minutes.

As an example of the mind map process, Figure 3 depicts some of the key theories of teaching in higher education. Various aspects of the presented theories are color-coded to illustrate the degree to which they overlap and complement one another. Creation of this mind map might start with a student conversation about which theories to include, followed by a discussion about how the components intersect (resulting in the color-coding process). Finally, the resulting mind map could be shared with other groups of students for review and suggestions.
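Because a mind map radiates from a central idea through branches to sub-branches, it is structurally a tree. A minimal sketch as nested data, with illustrative topics only (not the content of Figure 3):

```python
# A mind map as a nested dictionary: the central idea maps to its
# main branches, and each branch to its sub-branches.
mind_map = {
    "Theories of teaching": {
        "Behaviorism": ["reinforcement", "practice"],
        "Constructivism": ["prior knowledge", "active learning"],
        "Social learning": ["modeling", "collaboration"],
    }
}

def branches(mmap):
    """Return the central idea and its first-level branches."""
    (center, subtopics), = mmap.items()
    return center, sorted(subtopics)

print(branches(mind_map))
```

Representing a map this way is one route to the digital tools mentioned above, which store the same radial structure behind their drawing canvases.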





Figure 3. Mind map of key theories of teaching and learning in higher education.

ORAL DISCOURSE

Many techniques can be used to assess student learning in higher education. The exact menu of assessment tools chosen by instructors depends largely on the knowledge, skills, and dispositions found in a course’s learning outcomes. Singh (2011) suggested that, although assessment practices tend to focus on written products (e.g., examinations, research papers), the boundaries could reasonably be extended to include oral discourse. These structured, one-on-one conversations with instructors, when thoughtfully conceived and implemented, allow students to talk about what they have learned, demonstrate an understanding of their knowledge and class objectives, and apply course content in new ways. Joughlin (1998), in a review of the literature on assessment practices related to oral discourse, identified four areas to guide instructors in evaluating student performance: (a) knowledge and understanding (e.g., recall of basic concepts, facts, and principles), (b) applied problem-solving ability (e.g., applying the knowledge base in derived scenarios), (c) interpersonal competence (e.g., communication abilities), and (d) intrapersonal qualities (e.g., self-awareness, poise). Oral discourse can be a primary mode of summative assessment, focusing on the outcome of instructional goals, or it can be linked with written assessment strategies in a formative function, allowing the instructor to monitor how well students are meeting objectives as the course progresses. For example, faculty members might engage students in an oral discourse, then later ask them to elaborate further in a written assignment, including external references and resources, on one of the questions or topics that emerged during the conversation. Combining oral and written assessment reinforces students’ understanding of course content and accomplishment of learning outcomes.
Adopting oral discourse as an assessment tool will require instructors to do some intentional planning, but the energy spent will pay dividends later in the quality of the experience for students and the ease with which their performance can be evaluated. For example, an instructor teaching a course in which Goldilocks and the Three Bears is a primary text could use an oral discourse as a strategy to assess course learning objectives.

To begin, the instructor creates a list of questions that will guide the conversation with students; this step is critically important. Heritage and Heritage (2013) suggested that initial questions and answers stimulate further response and elaboration (i.e., initiation, response, feedback, response, feedback). This interactive process facilitates an organic, self-sustaining conversation rather than a simple question-and-answer session. These questions (i.e., prompts) should be designed to assess student learning at varied levels of thinking (e.g., remembering, applying, analyzing, creating). For example, when focusing on the Goldilocks story, the instructor, using Joughlin’s (1998) model, might begin the conversation with one or more of the following prompts, which range from basic knowledge and understanding to higher-order application and analysis:

» What are the three main events in the plot of this story?
» Describe the personality and characteristics of Goldilocks.
» What kinds of observations can you make about Goldilocks and her character or personality?
» Create a scenario where this story occurs in a 21st-century setting.
» What are some of the lessons intended from this story?
» What elements of this story explain its longstanding popularity?

When planning an oral discourse, instructors also should create a rubric (Figure 4) to guide the assessment of students’ performance during their one-on-one dialogue. Students would receive the rubric before the conversation to gain a sense of how the dialogue will progress and how they will be evaluated.

Figure 4. Rubric for assessing student–instructor dialogue. This article contains elements from past issues of The Toolbox.

Note to readers: Additional assessment tools that primarily rely on the use of technology will be included in Volume 4 of the Toolbox Collection: Digital Learning, scheduled for release in December 2018.


The Toolbox Collection • October 2018

REFERENCES

52 Drabbles. (n.d.). Retrieved from http://52drabbles.blogspot.com/
Angelo, T. (1995). Reassessing (and defining) assessment. The AAHE Bulletin, 48(2), 7-9.
Arthur, K. (2012). Mind maps: Improve memory, concentration, communication, organization, creativity, and time management [Kindle edition]. Retrieved from http://www.amazon.com/
Buzan, T. (1974). Use your head. London, England: BBC Books.
Buzan, T. (2012). The ultimate book of mind maps. New York, NY: Harper Thorsons.
Catapano, J. (n.d.). Teaching strategies that utilize student work. Retrieved September 15, 2017, from http://www.teachhub.com/teaching-strategies-utilize-student-work
Chapman, C., Python, M., Palin, M., Idle, E., Jones, T., Cleese, J., & Gilliam, T. (1971). Big red book. London, England: Methuen.
Chun, M. (2010). Taking teaching to (performance) task: Linking pedagogical and assessment practices. Change, 42(2), 22-29.
Davies, M., & Wavering, M. (1999, Fall). Alternative assessment: New directions in teaching and learning. Contemporary Education, 71(1), 39.
Drabble. (2018, July 14). Retrieved from https://en.wikipedia.org/wiki/Drabble
Dressel, P. (1983, December). Grades: One more tilt at the windmill. Bulletin, 12.
Duch, B. J., Groh, S. E., & Allen, D. E. (2001). Why problem-based learning? A case study of institutional change in undergraduate education. In B. J. Duch, S. E. Groh, & D. E. Allen (Eds.), The power of problem-based learning (pp. 3-12). Sterling, VA: Stylus.
Eells, W. C. (1934). Criticisms of higher education. Journal of Higher Education, 5(4), 187-189.
Heritage, M., & Heritage, J. (2013). Teacher questioning: The epicenter of instruction and assessment. Applied Measurement in Education, 26(3), 176-190.
Jennings, N., Lovett, S., Cuba, L., Swingle, J., & Lindkvist, H. (2013). “What would make this a successful year for you?” How students define success in college. Liberal Education, 99(2), 40-47.
Joughlin, G. (1998). Dimensions of oral assessment. Assessment and Evaluation in Higher Education, 23(4), 367-378.
Kalchman, M. (2011). Do as I say and as I’ve done: Assignment accountability for college educators. College Teaching, 59(1), 40-44.
Kallick, B. (2000). Assessment as learning: Are we assessing what we need to know? In P. Senge, N. Cambron-McCabe, T. Lucas, B. Smith, J. Dutton, & A. Kleiner (Eds.), Schools that learn: A fifth discipline fieldbook for educators, parents, and everyone who cares about education (pp. 186-195). New York, NY: Doubleday.
Kübler-Ross, E. (n.d.). The five stages of grading. The Chronicle of Higher Education. Retrieved from https://www.chronicle.com/forums/index.php?topic=72766.0;wap2
Kuh, G. (2003). What we’re learning about student engagement from NSSE: Benchmarks for effective educational practices. Change, 35(2), 24-32.
Montgomery, K. (2002). Authentic tasks and rubrics: Going beyond traditional assessments in college teaching. College Teaching, 50(1), 34-39.
Murray, C. (2008, August). College daze. Forbes, 182(3), 32.
National Survey of Student Engagement (NSSE). (n.d.). Benchmarks of effective educational practice. Retrieved from http://nsse.indiana.edu/pdf/nsse_benchmarks.pdf
Omnicore. (2018). Pinterest by the numbers: Stats, demographics & fun facts. Retrieved from https://www.omnicoreagency.com/pinterest-statistics/
Patton, M. Q. (2008). Utilization-focused evaluation (4th ed.). Thousand Oaks, CA: SAGE.
Rheem, J. (1998). Problem-based learning: An introduction. National Teaching and Learning Forum, 8(1), 1-4.
Schwartz, M. (1986). An experimental investigation of bad karma and its relationship to the grades of college students: Schwartz’s F.A.K.E.R. Syndrome. Journal of Polymorphous Perversity, 3, 9-12.
Singh, P. (2011). Oral assessment: Preparing learners for discourse in communities of practice. Systemic Practice and Action Research, 24, 247-259.
Sporer, R. (2001, Spring). The no-fault quiz. College Teaching, 49(2), 61.
Suskie, L. (2009). Assessing student learning: A common sense guide. Hoboken, NJ: Wiley.
Tagg, J. (2003). The learning paradigm college. Bolton, MA: Anker.
Tsinakos, A. A., & Balafoutis, T. (2009). A comparative study on mind mapping tools. Turkish Online Journal of Distance Education, 10(3), 55-72.
Wergin, J. F. (1988). Basic issues and principles in classroom assessment. In J. H. McMillan (Ed.), Assessing students’ learning (pp. 5-17). San Francisco, CA: Jossey-Bass.
Wiggins, G., & McTighe, J. (2005). Understanding by design. New York, NY: Pearson.
Winkelmes, M. (2013, Spring). Transparency in teaching: Faculty share data and improve students’ learning. Liberal Education, 99(2). Retrieved July 5, 2018, from https://www.aacu.org/publications-research/periodicals/transparencyteaching-faculty-share-data-and-improve-students
WriteWords. (n.d.). Retrieved from http://www.writewords.org.uk/forum/75_8979.asp
Zarick, L. M., & Stonebreaker, R. (2009). I’ll do it tomorrow: The logic of procrastination. College Teaching, 57(4), 211-215.
Zeldin, T. (2000). Conversation: How talk can change our lives. Mahwah, NJ: HiddenSpring.


