Assessing Student Learning
Student Learning Working Group 2
Student Affairs | Northwestern University | Fall Quarter 2014
Table of Contents

INTRODUCTION
  Introduction
  Learning Outcomes for Workshop Series

WEEK 1: OVERVIEW OF STUDENT LEARNING ASSESSMENT
  Learning Outcomes for Week 1
  Student Affairs Student Learning Outcomes
  The Layered Approach
  The Assessment Cycle
  Distinguishing Between Learning Outcomes, Program Goals/Evaluation, and Other Outcomes
  Assignment
  Handouts

WEEK 2: WRITING MEASURABLE LEARNING OUTCOMES
  Learning Outcomes for Week 2
  How to Start
  Format for Writing Learning Outcomes
  Characteristics of Effective Learning Outcomes
  Selecting the Right Verb: Bloom's Taxonomy
  Examples of Student Learning Outcomes
  Thinking About Your Teaching Strategy
  Assignment
  Handouts

WEEK 3: GETTING ACQUAINTED WITH ASSESSMENT STRATEGIES
  Learning Outcomes for Week 3
  Formative and Summative Assessment
  Direct and Indirect Measures
  Quantitative and Qualitative Methods
  Assignment

WEEK 4: ANALYZING AND INTERPRETING QUANTITATIVE DATA
  Learning Outcomes for Week 4
  Review
  Quantitative Method: Traditional Tests
  Quantitative Method: Surveys
  Quantitative Method: Existing Scales
  Analyzing Quantitative Data
  Handouts

WEEK 5: ANALYZING AND INTERPRETING QUALITATIVE DATA
  Learning Outcomes for Week 5
  Review
  Qualitative Method: Open-Ended Survey Questions
  Qualitative Method: Before-and-After Reflection
  Qualitative Method: Journals
  Qualitative Method: Interviews or Focus Groups
  Analyzing Qualitative Data

WEEK 6: RUBRICS
  Learning Outcomes for Week 6
  Definition of a Rubric
  When Rubrics Are Used
  Types of Rubrics
  Steps for Creating Rubrics
  Resources
  Handouts

WEEK 7: PRESENTING AND USING DATA AND FINDINGS IN MEANINGFUL WAYS
  Learning Outcomes for Week 7
  Presenting Data and Findings
  Using Data and Findings to Improve the Teaching/Learning Experience
  Guidelines for Posters

REFERENCES
Introduction
INTRODUCTION
This seven-week workshop series will guide student affairs staff step-by-step through the student learning assessment cycle/process. At the end of the workshop series, each participant will have developed and be ready to implement an assessment project at the program or activity level in his/her department in 2014 – 2015. The results of these projects will be shared with the Division at the second annual poster gallery session on Tuesday, June 23, 2015.
LEARNING OUTCOMES FOR WORKSHOP SERIES
Student Affairs staff who engage in these workshops will [or will be able to] . . .
1. Craft and implement a sound student learning assessment project at the program/activity level that "flows" from their department learning outcomes.
2. Write measurable student learning outcomes for a program or activity in their department.
3. Describe and implement a meaningful assessment strategy that measures student learning in a program or activity in their department.
4. Choose creative, interactive activities that engage students in the learning process.
5. With plenty of assistance from Student Affairs Assessment, analyze and interpret the quantitative or qualitative data collected as a part of their learning assessment strategy.
6. Illustrate and explain their student learning assessment project and their findings to other members of the Division of Student Affairs.
7. Embrace the student learning assessment process.
Week 1: Overview of Student Learning Assessment
LEARNING OUTCOMES FOR WEEK 1
At the end of Week 1, Student Affairs staff who participate in the Student Learning Assessment Workshops will [or will be able to] . . .
1. Sketch an assessment cycle.
2. List and briefly describe each of the four division learning outcomes.
3. Distinguish between a division, department, program, and activity-level learning outcome.
4. Distinguish between learning outcomes, program goals/evaluation, and other outcomes.
5. Write a brief description of the program/activity they are going to assess in 2014 – 2015.
6. Broadly describe what they want students to learn through the program/activity they are going to assess in 2014 – 2015.
STUDENT AFFAIRS STUDENT LEARNING OUTCOMES
At Northwestern University, we believe that student learning happens throughout and across the college experience. The following four broad, division-wide student learning domains and related student learning outcome statements define/describe the co-curricular learning that takes place through the programs, activities, and services offered by Student Affairs. These learning outcomes also consider and reflect the missions and strategic plans of the University and Student Affairs.
Personal Development
Students who engage in Student Affairs programs, activities, and services will develop an integrated sense of personal identity, a positive sense of self, and a personal code of ethics.
Interpersonal Competence
Students who engage in Student Affairs programs, activities, and services will develop healthy, respectful, and collaborative relationships with others.
Social Responsibility
Students who engage in Student Affairs programs, activities, and services will demonstrate an understanding of and commitment to social justice and apply that knowledge to create safe, healthy, equitable, and thriving communities.
Cognitive and Practical Skills
Students who engage in Student Affairs programs, activities, and services will acquire and use cognitive and practical skills that will enable them to live healthy, productive, and purposeful lives.

See Student Learning Outcomes, Division of Student Affairs, Northwestern University.
THE LAYERED APPROACH
In 2010, Peggy Maki proposed a "layered" approach to assessing student learning at the college/university level. The SLWG1 (Student Learning Working Group 1) considered this approach when thinking about how the Division learning outcomes relate to the department learning outcomes and, in turn, how the program/activity learning outcomes relate to the department outcomes.
[Figure: Maki's (2010) layered model shown alongside the Student Affairs adaptation. Maki's layers run from General Education/Institution-Wide Outcomes, to Program- or Department-Level Outcomes (including programs and services in the co-curriculum), to Courses, Modules, Educational Experiences, and Educational Opportunities. The Student Affairs layers run from Divisional Learning Outcomes, to Departmental Learning Outcomes, to Program Learning Outcomes, to Activity Learning Outcomes.]
Figure 3.1: Integration and Distribution of Campus-Wide Outcomes and Program- or Department-Level Outcomes Into the Curriculum and Co-Curriculum (Maki, P. L., Assessing for Learning: Building a Sustainable Commitment Across the Institution, 2nd edition, Stylus Publishing, p. 114).
THE ASSESSMENT CYCLE
In 2010, the Northwestern Assessment/Accreditation Council described a student learning assessment framework "intended to provide support for units that are at the beginning, intermediate and advanced stages of assessment efforts" (http://www.northwestern.edu/provost/initiatives/assessment/Framework.pdf). Included in this framework is an illustration of the assessment cycle.
Step 1: Define/redefine learning objectives
Step 2: Select/design criteria, measures, instruments
Step 3: Gather evidence
Step 4: Analyze and evaluate evidence (learning outcomes)
Step 5: Identify gaps between what was intended and what was achieved
Step 6: Make decisions; implement change
In her 2009 book, Assessing Student Learning, Linda Suskie described another assessment cycle. This one takes into account the importance of crafting learning opportunities/activities that allow students to learn what we hope they will (p. 4).
1. Establish Learning Goals
2. Provide Learning Opportunities
3. Assess Student Learning
4. Use the Results
DISTINGUISHING BETWEEN STUDENT LEARNING OUTCOMES, PROGRAM GOALS/EVALUATION, AND OTHER OUTCOME MEASURES
The assessment of student learning outcomes is different from conducting a program evaluation or collecting data related to other desirable outcomes, like graduation rates. Program evaluation—whether students were satisfied with the program, the time of the program, the location, and/or the presenters, and the like—and other outcome measures—number of attendees, attendance, characteristics of the participants, grades, GPA, graduation rates, etc.—are important, too, but they should not be confused with learning outcomes.
Learning outcomes describe what students should know, be able to do, and/or be able to demonstrate, as a result of engaging in a learning experience. Learning outcomes describe how students will be different because of a learning experience.
Consider the following statements. Which are learning outcomes, and which are program goals/evaluation or other outcomes? (Mark each as Learning Outcome, Program Goal/Evaluation, or Other Outcome.)
1. Students will attend at least three evening programs sponsored by RAs.
2. Students will be able to identify the most appropriate University policy or procedure that pertains to their concern.
3. Students will be able to use the STAR Method to describe relevant experiences in a way that reflects knowledge of the job/internship position description and employer.
4. New students will participate in campus-wide events that are designed to promote a sense of community.
5. Students will report higher levels of confidence in their ability to identify high-risk drinking behaviors.
6. Students will be able to identify the characteristics of healthy and unhealthy relationships.
7. Through the enhancement of its website, the Center for Leadership and Service will increase student access to service opportunities, leadership retreats, and educational programming.
8. Eighty percent (80%) or more of the students will report being very satisfied with their experience in CAPS.
9. Students will be able to use the creative problem solving method when analyzing a case study related to leading in a student organization.
10. Students will report that living on campus enhanced their ability to meet new people.
11. Students will rate the guest speaker on sexual assault at least 4 on a 5-point scale.
12. Participants will be familiar with the structure of fraternity and sorority life at Northwestern and beyond.
Richard Keeling suggests that we should collect these different kinds of data on different instruments, or at different times, or in different ways. He also recommends not collecting self-assessment of learning outcomes immediately after asking students to rate or evaluate programs. He writes, “Salience effects may materially undermine the quality of the results” (Keeling & Associates, LLC, 2010).
ASSIGNMENT
Write a brief description of the program or activity you are focusing on in 2014 – 2015. Then broadly describe what you want students to learn through this program/activity. Just create a list. Don't worry about the wording of these learning outcomes at this point. In general, what knowledge, skills, and attitudes do you want students to acquire as a result of participating in your program/activity?
HANDOUTS
• Student Affairs Strategic Plan
• Summary of Learning Outcomes from the Literature
• Abstracts – Poster Gallery Session 2014 – Assessing Student Learning
• Booklet: Student Learning Outcomes, Division of Student Affairs
• 2014 – 2015 Assessment Projects
• Workbook: Assessing Student Learning, Division of Student Affairs
Week 2: Writing Measurable Learning Outcomes
LEARNING OUTCOMES FOR WEEK 2
At the end of Week 2, Student Affairs staff who participate in the Student Learning Assessment Workshops will [or will be able to] . . .
1. Write two or more measurable student learning outcomes for the program/activity they are going to assess in 2014 – 2015.
2. Illustrate how their program/activity-level learning outcomes "flow" from the department and divisional learning outcomes.
3. Brainstorm creative, interactive teaching/learning activities that will permit students to learn—and practice—what they hope students will.
HOW TO START
When determining what you want students to learn through your program/activity, it might be helpful to use the "knowing-being-doing" model. Komives (2013) suggests that these three components are interrelated—the knowledge students possess can influence their ways of thinking, which can influence their actions. Another way to view this holistic approach is through the framework of knowledge, skills, and attitudes, or head, heart, and practice. Palmer (1998) uses the phrase "head, heart, and practice" to describe the paradoxes in teaching and what happens when we keep the head (knowing and intellect) separated from the heart (being) and even further separated from practice (doing). Palmer argues that we need a synthesis of all three components in the teaching process. So, when thinking about what you want students to learn from your program/activity, consider these questions.
1. What do you want the student to know (knowledge) as a result of their participation in this program/activity?
2. What do you want the student to be able to do (skills) as a result of their participation in this program/activity?
3. How might students' attitudes and/or values (be) be influenced by this program/activity? (See the note below.)
4. How will students be able to demonstrate what they learned?
5. How does this program/activity, and how do these learning outcomes, fit within the Division's student learning outcomes and your department's learning outcomes?
Note: It is relatively easy to figure out how to assess knowledge and skills. Assessing attitudes and values is, admittedly, more difficult and less precise. In fact, some of these "being" learning outcomes are not just difficult, they are virtually impossible to assess—because they are not teachable. One way to figure out if a learning outcome is assessable is to ask, (1) "How would I help students learn this?" and (2) "What evidence could I gather that they had learned this?"
FORMAT FOR WRITING LEARNING OUTCOMES
There are many methods described in the literature for how to write measurable learning outcomes. The A-B-C-D method is one of the most popular. A variation of the A-B-C-D method is the C-B-C method.

The A-B-C-D Method
A – Audience
C – Condition
B – Behavior
D – Degree
Example: Students who participate in the leadership workshop will be able to list three of the five leadership criteria stated in Kouzes' and Posner's The Leadership Challenge.

The C-B-C Method
C – Condition
B – Behavior
C – Criterion
Example: Given a problem with two unknowns, students will be able to describe how to solve the problem in step-by-step fashion.
Keeling & Associates describe a similar, but slightly different, approach (Keeling & Associates, LLC, 2010). The Keeling format requires that we think about the assessment method as we are writing our activity/program learning outcomes. An example follows below.

Keeling & Associates Method of Writing Learning Outcomes
• AUDIENCE/POPULATION: Persons, First-Year Students, Graduating Seniors, Students . . .
• who
• LEARNING EXPERIENCE: engage in / engage with / participate in / complete (to a degree) a function, activity, program, or X # of counseling sessions
• will [or will be able to] . . .
• INTENDED LEARNING: an action verb (identify, list, describe, explain, draw, write) to a degree (at least #, with X% accuracy, more/fewer than, all, sometime in the future)
• EVIDENCE: as demonstrated by an interview, survey, observed behavior, blog, portfolio, or performance
• WHY: as required by or for state systems, an accreditation body, professional standards, program effectiveness, or benchmarking
Example: Sophomores who participate in at least three personal counseling sessions will be able to describe at least two strategies for managing stress in college as demonstrated by reflective journal entries the student will share with a counselor.
CHARACTERISTICS OF EFFECTIVE LEARNING OUTCOMES
What are the characteristics of effective learning outcomes? Learning outcomes should be S.M.A.R.T.

SMART Outcomes
S – Specific: Describes exactly what the learner will know or be able to do.
M – Measurable: Specifies the trait, ability, behavior, or habit of mind to be assessed; data can be collected to measure what you want the learner to know or be able to do.
A – Achievable: Attainable for the participants within the scheduled time and specified conditions.
R – Relevant: Aligned with the Division and departmental learning outcomes; they measure something useful and meaningful.
T – Time-framed: Realistic to achieve within the time and resources available.
SELECTING THE RIGHT VERB: BLOOM'S TAXONOMY
The selection of an action verb for each outcome is crucial. The verb not only reflects the knowledge, skills, and attitudes we want students to learn, it also suggests the assessment method. Avoid "fuzzy" words like understand, appreciate, believe, know, and learn. These verbs frequently denote knowledge or behaviors not easily observed or measured. For example, if students "understand" a concept, they should be able to demonstrate that understanding by "describing" or "explaining" the concept, or "identifying" the key elements of the concept, etc. In 1956, Bloom developed a taxonomy of educational objectives which has stood the test of time. His taxonomy is a hierarchical sequence; therefore, being able to "analyze," for example (level 4), assumes that the learner can already perform at the lower levels (knowledge, comprehension, application) of the taxonomy. Also, the level of learning you are measuring at the program/activity level should seldom exceed the level of learning described in your department learning outcomes. Below is an illustration of how to measure learning at each level of Bloom's Taxonomy using some of the verbs he classified in each. This is not an exhaustive list.

Bloom's Taxonomy levels: Knowledge; Comprehension/Understanding; Application of New Knowledge/Skills; Analyze; Evaluate; Create

Verb – Measurement Strategy
• List – Present a list
• Classify – Sort a random list into appropriate groups
• Describe – Write or orally describe a concept
• Identify – Choose an appropriate answer in a multiple-choice test
• Apply – Use knowledge to accomplish a task
• Illustrate – Use drawing to explain or show a process
• Dramatize – Use role-playing to illustrate a concept
• Categorize – Place items in appropriate general groups based on similarities
• Plan – Write/describe a procedure to accomplish a goal before beginning it
• Evaluate – Describe the relative merits of something based on criteria
• Argue – Describe reasons and present evidence for a point of view
• Write – Present something in writing, e.g., a senior thesis
EXAMPLES OF STUDENT LEARNING OUTCOMES
Below are several examples of student learning outcome statements.
• Students who participate in the Red Watch Band Training Program will be able to correctly identify myths from facts about alcohol overdose.
• In the reinstatement interview, students who took a medical leave will be able to describe at least two help-seeking and/or positive coping skills that they used to manage a recent stressful situation.
• Students will be able to use the STAR Method to describe relevant experiences in a way that reflects knowledge of the job/internship position description and employer.
• Students who complete the Cultural Diversity workshop will demonstrate the ability to analyze and respond to arguments about racial discrimination.
• Residence hall staff will be able to assist roommates in resolving conflicts by helping them negotiate roommate agreements.
THINKING ABOUT YOUR TEACHING STRATEGY
There is a dynamic equilibrium among teaching strategies, learning outcomes, and assessment. Selecting teaching and learning activities that are likely to ensure that the learning outcomes are achieved is essential. Ask yourself: how and when will students have the opportunity to learn [outcome] in the program/activity I am focusing on in 2014 – 2015? See the "Teaching in Blooms" handout for more information.
ASSIGNMENT
Review the broad outcome statements you listed last week when you wrote the description of the program/activity you are focusing on in 2014 – 2015. Now turn them into learning outcome statements using the following template.
1. Begin with the stem, "Students who engage in [or participate in or complete] . . ."
2. Insert the name of your program/activity.
3. Continue with "will [or will be able to] . . ."
4. Insert an action verb—one that suggests how students will demonstrate they have learned the knowledge, skills, and attitudes you desire. Use Bloom's Taxonomy for ideas. Use only one verb per learning outcome. Note, once again: avoid "fuzzy" verbs like understand, appreciate, believe, know, be familiar with, and learn.
5. Describe what you want students to know, be able to do, and the attitudes you hope to influence through your program/activity—although keep in mind that attitudes are hard to measure. While doing this, start thinking about how you would gather evidence that the students had learned this—your preliminary assessment strategy.
Next, illustrate how the learning outcomes for your program or activity "flow" from your department learning outcomes and how they, in turn, "flow" from the Divisional learning outcomes. An example of a "flow chart" from the 2013 – 2014 SLWG1 can be found below.
Divisional LO: Personal Development
Students who engage in Student Affairs programs, activities and services will develop an integrated sense of personal identity, a positive sense of self, and a personal code of ethics.

Divisional LO: Social Responsibility
Students who engage in Student Affairs programs, activities and services will demonstrate an understanding of and commitment to social justice and apply that knowledge to create safe, healthy, equitable, and thriving communities.

Religious Life Department LO
Students who engage in programs, activities and services provided by Religious Life will [or will be able to] articulate a personal belief system that sustains one in daily life.

Religious Life Department LO
Students who engage in programs, activities and services provided by Religious Life will [or will be able to] reflect on experiences that challenged assumptions and transformed thinking about issues related to social justice.

Spring Break Friendship Mission Trip Program/Activity
Students who participate in the Spring Break Friendship Mission Trip will [or will be able to] . . .
1. Identify three factors of increased globalization impacting the country of El Salvador.
2. Articulate how their core values have been challenged and/or transformed through this immersion experience.
3. Reflect on what they learned about themselves and their personal response when immersed in an unfamiliar environment, especially in regard to developing a personal identity and code of ethics.
4. Report increased knowledge of the historical, political, social, economic, and religious aspects of the community in 22 de Abril, San Salvador, and El Salvador.
5. Report an increased willingness to make changes in their daily life as a result of participating in this trip.
HANDOUTS
• Action Words for Bloom's/Anderson & Krathwohl's Revised Taxonomy
• Teaching in Blooms
• Mapping Example: Residential College New Officer Conference
• Outcomes – Teaching Strategies – Assessment Strategy Chart
Week 3: Getting Acquainted with Assessment Strategies
LEARNING OUTCOMES FOR WEEK 3
At the end of Week 3, Student Affairs staff who participate in the Student Learning Assessment Workshops will [or will be able to] . . .
1. Define formative and summative assessment.
2. Distinguish between direct and indirect assessment methods.
3. List several quantitative and qualitative assessment strategies.
4. Propose an assessment strategy for the program/activity in which they are assessing student learning in 2014 – 2015.
FORMATIVE AND SUMMATIVE ASSESSMENT
Formative assessment techniques monitor student learning during the learning process. The feedback gathered is used to identify areas where students are struggling so that instructors can adjust their teaching and students can adjust their studying. These are low-stakes assessments that happen early and often in the program/activity. Black and Wiliam (1998) describe formative assessment like this: "All those activities undertaken by teachers and/or by their students, which provide information to be used as feedback to modify the teaching and learning activities in which they are engaged."

Summative assessment techniques evaluate student learning at the end of the program/activity. Summative assessment techniques measure the extent to which students have achieved the desired learning outcomes.

Following are examples of formative and summative assessment techniques developed by the University of Texas (http://ctl.utexas.edu/teaching/assessment/planning/methods) and Campus Labs.

Formative Assessment Techniques
• Written reflections, e.g., "1-Minute Papers" or "Muddiest Points"
  o What was the most important thing you learned today?
  o What was the most confusing topic today?
  o What important questions remain unanswered?
• Polls/Surveys
  o Classroom response systems (Clickers or Campus Labs' Student Response System)
• Checks for Understanding
  o Pausing every few minutes to see whether students are following along with the learning activity
• In-Class Activities
  o Having students work in pairs or small groups while instructors roam the classroom as students work, helping those who get stuck and guiding those who are headed in the wrong direction
• Pop Quizzes
• End of Class Reflection
• One Sentence Summary
• Concept Map
• Others?

Summative Assessment Techniques
• Final Exam
• Final Paper
• Portfolios
• Case Study or Analysis
• Exit Interview
• Model, Simulation, or Illustration
• Debate or Discussion
• Journal or Log
• Poster, Display, or Exhibit
• Presentation, Demonstration, or Slide Show
• Reflection on what and how one has learned
• Website
• Game
• Intervention
• Proposal for and justification of a solution to a problem
• Survey
• Video or Audio Recording
• Senior Recital
• Others?
DIRECT AND INDIRECT MEASURES
There are two types of outcome measures: direct measures and indirect measures. Each serves an important function in the assessment of student learning, and when used together, they provide a rich perspective on student learning by providing direct evidence and context to understand student performance.

Direct assessment measures require students to demonstrate or display their knowledge, behavior, or thought processes. These are methods for assessing actual samples of student work to provide evidence of student performance relative to the learning outcomes. Indirect assessment measures ask students to reflect on their knowledge, behaviors, or thought processes rather than demonstrate them. Indirect measures allow you to collect secondary information on student learning.

Consider the following survey questions. Which is a direct measure of student learning? Which is indirect?

OPTION 1: INDIRECT
Please rate your level of agreement with the following: I can name all of the sections that should be included when I create my resume.
• Strongly agree
• Moderately agree
• Neither agree nor disagree
• Moderately disagree
• Strongly disagree

OPTION 2: DIRECT
List three sections that should be included on your resume.

Following are examples of direct and indirect assessment methods.

Direct Measures
• Capstone projects (with rubric)
• Performances (with rubric)
• Student publications or conference presentations (with rubric)
• Employer or supervisor rating of student performance
• Some types of open-ended questions on surveys
• Explicit self-reflections on what students have learned, including journals (with rubric)
• Case studies (with rubric)
• Portfolios (with rubric)
• Mock interview (with rubric)
• Tests and examinations of knowledge
• Pre-test/post-test evaluation (score gains)
• Team/group projects and presentations (with rubric)
• Writing samples (with rubric)
• Poster presentation (with rubric)
• Certification exams, licensure exams

Indirect Measures
• Course grades
• Some types of open-ended questions on surveys
• Focus groups
• Locally developed or national surveys of student perceptions or self-reports of activities
• Surveys asking students' perceptions of learning
• Job placement statistics
• Graduation and retention rates
When assessing student learning, direct methods are far more desirable than indirect methods. However, not all learning can be measured in a direct way. For example, a desired outcome of a program/activity may be to increase students' confidence in their ability to recognize the signs of alcohol overdose, which is difficult to assess directly. An indirect assessment is useful in that it can be used to measure certain implicit qualities of student learning, such as values, perceptions, and attitudes, from a variety of perspectives. However, in the absence of direct evidence, assumptions must be made about how well perceptions match the reality of actual achievement of student learning. It is important to remember that all assessment methods have their limitations and contain some bias. A strong learning assessment strategy at the program/activity level would utilize both direct and indirect assessments if at all possible. This use of multiple assessment methods provides converging evidence of student learning. Indirect methods provide a valuable supplement to direct methods.
QUANTITATIVE AND QUALITATIVE METHODS
Quantitative methods assess learning by collecting and analyzing numeric data and using statistical techniques in the analysis process. Learning outcomes that describe knowledge and comprehension (Bloom's Taxonomy) can most readily be assessed using quantitative data. Qualitative methods "rely on descriptions rather than numbers" (Palomba & Banta, 1999). Qualitative methods are most useful when surveys or other types of data collection methods may not paint the "whole picture." Qualitative methods can be used to test knowledge, but they are most useful when measuring outcomes that describe learning at the comprehension, application, analysis, evaluation, and creation levels of Bloom's Taxonomy.

Suppose you want to measure what students learned as a result of attending an event that focused on diversity and inclusion.

OPTION 1: QUANTITATIVE METHOD/SURVEY QUESTION
Attending [event] enhanced my understanding of diversity and inclusion.
• Strongly agree
• Moderately agree
• Neither agree nor disagree
• Moderately disagree
• Strongly disagree

OPTION 2: QUALITATIVE METHOD/FOCUS GROUP QUESTION
How was your understanding of diversity and inclusion challenged by attending [event]?
Following are examples of quantitative and qualitative methods.

Quantitative Methods
• Test scores
• Surveys, including self-reports of learning and forced-choice items measuring knowledge
• Pre/Post Tests, including "quasi" pre/post tests
• Rubrics (if assigning numbers to levels)

Qualitative Methods
• Focus groups
• Interviews
• Open-ended questions on surveys
• Portfolios
• Journals
• Reflective essays
• Photo journaling
• Rubrics (if descriptive)
• Performances
• Participant observations
• Case studies
ASSIGNMENT
In Week 1 you wrote a brief description of the program or activity you are focusing on in 2014 – 2015. In Week 2 you drafted the learning outcomes for that program/activity. At each stage, we recommended you keep in the back of your mind how you will provide opportunities for students to learn what you hope they will (teaching strategies) and how you will collect evidence that the learning occurred (assessment). Now you are ready to create your learning assessment strategy. Keep in mind that a strong learning assessment strategy at the program/activity level utilizes formative and summative techniques, direct and indirect measures, and quantitative and qualitative methods, if at all possible. This use of multiple assessment methods provides converging evidence of student learning. They complement one another.
Learning Outcome | Proposed Assessment Strategy
Week 4: Analyzing and Interpreting Quantitative Data
LEARNING OUTCOMES FOR WEEK 4
At the end of Week 4, Student Affairs staff who participate in the Student Learning Assessment Workshops will [or will be able to] . . .
1. Draft a simple survey to measure some aspect of student learning.
2. Describe the difference between nominal (categorical), ordinal, and scale data.
3. Access Baseline/Campus Labs.
4. Use frequencies, percentages, and mean scores to describe a small set of data.
5. Compare mean scores using simple tests of significance.
REVIEW
Remember, using quantitative methods to assess learning involves collecting and analyzing numeric data and using simple—and sometimes, more sophisticated—statistical techniques in the analysis process. Learning outcomes that describe knowledge and comprehension (Bloom's Taxonomy) can most readily be assessed using quantitative data.
QUANTITATIVE METHOD: TRADITIONAL TESTS
Tests—multiple choice and other objective items—while less popular than they used to be, still have a place in the assessment toolbox. They are often used to assess fundamental knowledge and understanding. But be careful: writing good, clear test items that do not suggest the "right" answer is not easy. Northwestern students are test-wise, and even when they don't know the answer, there is a chance they will do well on the test through guessing. The types of questions that work well on tests include multiple choice, interpretive exercises, true-false items, matching items, and completion or fill-in-the-blank.

Suppose you were asked to take a test or quiz about what you have learned in the Assessing Student Learning Workshops so far. Here are two sample questions. Both can be analyzed using numbers and/or percentages.

Examples of Traditional Test Questions
Below is a list of ways to assess student learning. Please check whether each is a direct or an indirect measure.
• Focus groups
• Tests and examinations of knowledge
• Pre-test/post-test evaluation (score gains)
• Surveys asking students' perceptions of learning
• Supervisor rating of student performance
• Simulation
• Poster presentations
Please complete the following sentences, describing when to use these assessment techniques.
Formative assessment techniques assess learning ______________________________.
Summative assessment techniques assess learning _____________________________.
QUANTITATIVE METHOD: SURVEYS
Surveys are commonly used to assess student learning, directly and indirectly. Surveys can be a combination of structured questions, rating scales, open-ended questions that ask students to recall and/or explain factual information, or prompts for reflection.

Structured Questions
Structured questions are questions that offer students a closed set of responses from which to choose. Structured questions usually take the form of a test question on the survey (see the traditional test questions above). You can report the number or percentage of students that answered each question correctly, or the number/percentage of students who answered 80% (or some predetermined percentage) correctly.

Example of Structured Survey Question
Which of the following are symptoms of alcohol overdose? (Check all that apply)
• Confusion
• Vomiting
• Violent behavior
• Passed out
• Cannot be roused or awakened
• Seizures
• Argumentative
• Slow breathing
• Nausea
• Irregular breathing
• Cool and clammy skin
• Blue or pale skin, lips, or nail beds
Rating Questions/Scales
Likert (pronounced Lick-ert) scales are the most familiar, and they are often used to see how students rate their knowledge, skills, values, and attitudes related to learning on a scale from 1 to 4, 1 to 5, and sometimes 1 to 7. Rating questions usually gather indirect evidence of learning and so are not the best method. But sometimes they are the only method available, especially when there are a large number of students involved in the learning opportunity.

Example of Rating Question/Scale
To what extent has your experience at Northwestern contributed to your knowledge, skills, and personal development in the following areas?
Very Much (4) / Quite a Bit (3) / Some (2) / Very Little (1)
• Constructively resolving interpersonal conflicts: 28% / 35% / 28% / 9%
• Career- or work-related knowledge and skills: 30% / 37% / 25% / 7%
• Relating to people of different races, nations, and religions: 30% / 36% / 26% / 9%
• Understanding yourself: abilities, interests, limitations, personality: 51% / 36% / 11% / 2%
• Leadership skills: 40% / 34% / 21% / 5%
Here is another example of the use of a rating scale that was designed to measure learning. It is, however, a very indirect measure.

An Exit Survey for Student Affairs Staff Who Complete SLWG2
To what extent did SLWG2 enhance your ability to do the following?
Response scale: Completely (5), Considerably (4), Moderately (3), Slightly (2), Not at all (1)
• Craft and implement a sound student learning assessment project at the program/activity level
• Write a measurable learning outcome
• Implement a meaningful assessment strategy that measures student learning in a program/activity
• Analyze and interpret quantitative data
• Analyze and interpret qualitative data
• Explain the findings of an assessment project to others
• Embrace the student learning assessment process
Below are a number of response scales suggested by Campus Labs. More can be found on the Campus Labs website (Documents, then Rating Scales). When using these scales, Campus Labs recommends presenting the scale from the most positive to the least positive (left to right), and giving the most positive response the highest value. For example, on a five-point agreement scale, "strongly agree" = 5, etc.
Possible Response Scales (Source: Campus Labs)
• Agreement: Strongly agree, Moderately agree, Neither agree nor disagree, Moderately disagree, Strongly disagree (another version removes the "moderately" qualifier and/or uses "neutral")
• Comparison: Much X, Slightly X, About the same, Slightly (opposite of X), Much (opposite of X)
• Ease: Very easy, Moderately easy, Neither easy nor difficult, Moderately difficult, Very difficult
• Extent (5 point): A great deal (Completely, if appropriate), Considerably, Moderately, Slightly, Not at all
• Extent (4 point): Significantly, Moderately, Slightly, Not at all
• Frequency (no set time): Always, Often, Occasionally, Rarely, Never
• Frequency (general): Daily, Weekly, Monthly, Once a semester, Once a year, Never
• Frequency (extended): More than once a week, Once a week, Once a month, Once a semester, Once a year, Less than once a year, Never
• Helpfulness: Extremely helpful, Very helpful, Moderately helpful, Slightly helpful, Not at all helpful
• Importance: Extremely important, Very important, Moderately important, Slightly important, Not at all important
• Likelihood: Very likely, Moderately likely, Neither likely nor unlikely, Moderately unlikely, Very unlikely
• Numeric Scales: Less than #, About the same, More than #
• Probability: Definitely would, Probably would, Probably wouldn't, Definitely wouldn't
• Proficiency: Beginner, Developing, Competent, Advanced, Expert (typical for rubrics)
Determining the appropriate rating scale is the easy part. Writing the "stem" takes a lot more thought. The stem must carefully describe the knowledge or understanding you are seeking. Following are a few guidelines for writing good stems.
1. Keep the stem simple and focused.
2. Ask only one question per question (avoid double-barreled items).
3. Use language everyone can understand.
4. Use precise qualifiers and time referents.
5. Watch out for leading questions or stems.
6. Avoid double negatives.
7. Only ask a question that you will use.
Student responses on rating questions/scales are almost always reported in the aggregate. They can be reported in several ways: the percentage of students who responded "strongly agree" or some other point on the scale, the mean score (the average) of the scale, and so on.

Open-Ended Questions
Open-ended questions can also be used to assess student learning, and they are more direct measures than rating scales. Instead of asking students the degree to which they learned something, open-ended questions allow you to ask students to demonstrate what they learned by explaining a concept or listing the steps in how to do something.
Consider the following. The rating question is easy to analyze, but the open-ended question allows students to demonstrate what they know and can still be measured quantitatively. Open-ended questions are more time-consuming and difficult to analyze, but provide far more depth.

Rating Question/Scale: I understand the steps involved in the Norris event planning process.
• Strongly agree [Code = 5]
• Moderately agree [Code = 4]
• Neither agree nor disagree [Code = 3]
• Moderately disagree [Code = 2]
• Strongly disagree [Code = 1]

Open-Ended Question: List the steps involved in the Norris event planning process.
Guidelines for Writing Good Survey Questions
In summary, here are some guidelines for writing good survey questions.
1. Keep questions short and concise.
2. Each question should be clearly stated so that there is no misunderstanding about what is being asked.
3. The best way to ensure your questions are well worded is to have other people review and take your survey before you distribute it.
4. Ask questions in a neutral way, i.e., so that you are not leading respondents toward a particular answer. Northwestern students are smart!
5. Ask only one question at a time.
6. Begin your survey with a simple, but interesting question. Do not begin with a sensitive question.
QUANTITATIVE METHOD: EXISTING SCALES
Existing scales, developed by someone else, that have been found to be reliable and valid measures of one or more constructs are often used in pre/post assessment strategies. They are quantitative in nature. For example, the EQ-i, or Emotional Quotient Inventory, purports to measure emotional intelligence and provides a total score and scores on five subscales. Another example is the 10-item Connor-Davidson Resilience Scale (CD-RISC 10), which was designed to measure resilience. And the Socially Responsible Leadership Scale (SRLS) is an instrument based on the Social Change Model of leadership development. The SRLS is used for research, assessment, and education to measure and identify leadership capacities. Sometimes existing scales are only accessible through a third-party vendor and there is a cost per respondent. Other times scales can be embedded in a locally developed survey. Regardless, seeking the author's approval to use an existing scale is required.
ANALYZING QUANTITATIVE DATA
The most important consideration in the assessment process is to link the data you collect to the learning outcomes you have identified so you can determine how to improve student learning.
Reviewing the Data
The first step in analyzing data is to review the data visually. Are there outliers and possible mistakes? Can you see a pattern or trend in the data? For example, depending on your test results or survey data, it may be clear that all students who completed a certain workshop had difficulty with a particular outcome.

Levels of Measurement
There are three types of data that need to be considered: nominal/categorical, ordinal, and interval/scaled.

Nominal/Categorical Data. Nominal, sometimes called categorical, data refers to categorically discrete data such as class standing, school/college, gender, or racial/ethnic background. Frequencies and percentages of observations falling into each category are appropriate summary statistics. Statistics such as means would not be appropriate.

Ordinal Data. Ordinal data have a natural ordering: the ranking of favorite sports, the order of people's places in a line, the order of runners finishing a race, or, more often, the choices on a rating scale from 1 to 5. With ordinal data you cannot state with certainty whether the intervals between each value are equal. For example, we often use rating scales (Likert questions). On a 7-point scale, the difference between a 6 and a 7 is not necessarily the same as the difference between a 4 and a 5. Ordinal data are most often reported as frequencies and percentages.

Interval/Scaled Data. With interval data—or scaled variables—the distances between the data points are defined in fixed and equal terms. Examples of scaled variables include temperature or GPAs. With this kind of data, mean scores are appropriate. Note that Likert scales are not interval or scaled variables, but they are often treated that way.

Ways to Analyze Quantitative Data
For our purposes today, we will discuss four basic ways to analyze quantitative data: frequencies, percentages, aggregates, and averages (mean scores), along with simple tests of significance. Of course, there are far more sophisticated analyses of data that can be done, and we can talk about doing those in some cases with your data, but for our purposes, let's consider just these. This section is from Suskie's 2009 book, Assessing Student Learning (San Francisco: Jossey-Bass). When analyzing your data, Student Affairs Assessment will be right there to help you.

Frequencies. Frequencies or tallies are straightforward counts of how many students selected or earned each rating or chose each option. If you are using a test question, you can tally how many students answered each test question correctly. If the test is multiple choice, you can tally how many students chose each option of each test question.

Percentages. Percentages are easier to understand and more meaningful than raw numbers. Few people will understand the implications that 125 students passed a test (frequency or tally); more will care that only 23 percent did. Percentages make it easier to compare groups of different sizes or groups from different backgrounds, class standings, etc. You might want to compare the students in your workshop this year with the students in your workshop last year. Percentages help you easily view such differences.
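If item responses are exported to a spreadsheet or CSV file, frequencies and percentages can be tallied in a few lines of code. The following is a minimal sketch in Python using the pandas library; the file name, column name, and data are hypothetical placeholders for illustration only, not part of the Campus Labs/Baseline tools described in this workbook.

```python
import pandas as pd

# Hypothetical export: one row per student, one column per survey or test item.
responses = pd.read_csv("workshop_survey.csv")

# Frequencies (tallies): how many students chose each option on one item.
counts = responses["q1_resume_sections"].value_counts()
print(counts)

# Percentages: the same tallies as a share of all respondents, which makes
# this year's workshop easier to compare with last year's.
percentages = responses["q1_resume_sections"].value_counts(normalize=True) * 100
print(percentages.round(1))
```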
Week 4: Analyzing and Interpreting Quantitative Data Aggregates. Sometimes it’s helpful to aggregate frequencies/tallies into overall scores or sub scores. Whether and how you do this depends on the purpose of the test, survey, or rubric and how its components align with your learning outcomes. An overall score, for example, on a rubric or test can give you a sense of students’ overall learning. Sometimes several items on a test, survey, or rubric address a common learning goal. In these cases, it can be helpful to aggregate those item results into an overall sub score. Averages/Mean Scores. Averages or mean scores summarize the central tendency of assessment results. This form of analysis of data is appropriate for scaled data. If your scores are ordered, a more appropriate statistic is the median: the midpoint of all results when they are listed from highest to lowest. Medians are also a good choice when a few very high or very low results distort the mean. If your results are categorical, the appropriate statistic is the mode: the most frequent result. Tests of Significance. Statistical tests are used to determine whether a relationship between two, or more, variables is statistically significant. Without this, we might make decisions based on observation that there is a difference when there really isn’t. Three statistical tests are the most common: (1) Chi-square test for nominal/categorical data, (2) t-tests for interval data (comparing two mean scores), and (3) ANOVA (comparing means scores in a scale variable across three or more categories of another variable. Tests of significance are generally used to compare results on pre and post-tests. If you are using a pre/post-test strategy, we will talk about how to statistically determine if the difference is significant. SPSS makes a distinction between “substantive” versus “statistical significance. “A critical distinction to make in survey analysis is whether a relationship is statistically significant versus bring substantively, or practically, significant . . . You should not let statistical significance bet he overriding determinant in deciding whether a relationship or pattern you have discovered is interesting or important” (Survey Analysis Using PASW Statistics, 2010, Chicago: SPSS Inc., p. 10-9).
HANDOUTS
Tips for Creating Good Test Questions
Week 5: Analyzing and Interpreting Qualitative Data
LEARNING OUTCOMES FOR WEEK 5
At the end of Week 5, Student Affairs staff who participate in the Student Learning Assessment Workshops will [or will be able to] . . .
1. Write one or two open-ended survey questions that measure some aspect of student learning.
2. Write a prompt for a before-and-after reflection and/or journal entry.
3. Draft a simple focus group interview protocol.
4. Identify themes from a sample focus group transcript.
REVIEW
Qualitative methods "rely on descriptions rather than numbers" (Palomba & Banta, 1999). Qualitative methods are most useful when surveys or other types of data collection methods may not paint the "whole picture." Qualitative methods can be used to measure knowledge, but they are most useful when measuring outcomes that describe learning at the comprehension, application, analysis, evaluation, and creation levels of Bloom's Taxonomy.
QUALITATIVE METHOD: OPEN-ENDED SURVEY QUESTIONS
Open-ended survey questions can gather direct evidence of what students know as a result of participating in various programs/activities. Consider the example from Week 4.

Rating Question/Scale: I understand the steps involved in the Norris event planning process.
• Strongly agree [Code = 5]
• Moderately agree [Code = 4]
• Neither agree nor disagree [Code = 3]
• Moderately disagree [Code = 2]
• Strongly disagree [Code = 1]

Open-Ended Question: List the steps involved in the Norris event planning process.
Asking students to list the steps involved in the Norris event planning process—a qualitative question—can be transformed into quantitative data in several ways: reporting the percentage of respondents that were able to list all the steps, or the percentage that got three of the five correct, etc. Note, however, the caution expressed by Suskie (2009): "Short questions and prompts can be incorporated into surveys as well as posed on their own. If they are a part of a survey, keep in mind that they are usually not very popular with survey recipients because they lengthen the time required to complete the survey and require more mental energy to respond. Often survey respondents leave them blank. These kinds of open-ended items are therefore best used sparingly in surveys" (p. 190). And coding open-ended comments on surveys can be more time-consuming as well.
QUALITATIVE METHOD: BEFORE-AND-AFTER REFLECTION
Asking students to reflect at both the beginning and end of a program/activity and comparing their responses can give a sense of their learning. The prompts you provide students, however, become very important. The wording of the prompts should reflect the learning outcomes you desire. Below is an example provided by Suskie (2009) that illustrates student definitions of leadership before and after participating in a leadership development program. Unfortunately, she does not provide the prompts given to students.

Student Definitions of Leadership Before and After Participating in a Leadership Development Program

Initial definition: The ability to give and take orders and being able to take charge of a large group of people.
Later definition: I have learned that leadership is not a one-man/woman show. To be a good leader, you must have the respect from your committee, and you must be able to communicate.

Initial definition: The presence of a strong, task-oriented, and social-oriented yet compromising force or person.
Later definition: Leadership isn't as easy as it looks! Leadership takes a lot of hard work, and there are ups and downs to the position of a leader. My definition has changed to include the importance of diverse people.

Initial definition: Leadership is a responsibility or a skill/trait one possesses that makes a person stand out above everyone else.
Later definition: Leadership is a collective process. You need leaders and followers to make an event or organization successful.

Initial definition: Leadership involves taking control in an organizational way. A leader must know how to dictate responsibility as well as work with others to achieve a goal.
Later definition: Leadership has a lot to do with confidence. Most of the confidence lies within yourself, but you also have to have confidence in the people you're working with. My definition of leadership has changed in the sense that I feel like it is more delegating and following up with your delegations than actually taking a lot of work upon yourself.

Initial definition: Leadership is an important element of life that can only be fulfilled by individuals possessing the motivation, insight, and communication skills to fulfill the mission, goals, and objectives of an organization.
Later definition: Leadership is ever changing. Now I think leadership is defined by the people being led.
Source: Adapted with permission from responses to prompts by Tess Shier, coordinator for campus programs, Office of Student Activities, Towson University.
Creating a good prompt begins with your learning outcomes: what do you want students to know or be able to do as a result of participating in the program/activity you are assessing? What questions can you ask the students that will permit them to describe where they were before engaging in the program/activity and how they have changed or how they are applying what they have learned? Often pre/post reflections are analyzed using a rubric. We will discuss rubrics in Week 6.
QUALITATIVE METHOD: JOURNALS
Journals are documents into which students make repeated entries during a program/activity. They can be used for reflection, to record behaviors, and to describe what students are learning and how they are using it. Effective journals require clear learning outcomes and clear instructions or prompts. Students should understand exactly what they should write in the journals and how often. Journals can be time-consuming to read and evaluate. Once again, the evaluation process often begins by developing a rubric that describes where students might be at the beginning of a program/activity and their growth/learning trajectory.
QUALITATIVE METHOD: INDIVIDUAL INTERVIEWS OR FOCUS GROUPS
Interviews usually consist of open-ended questions asked by an interviewer. Focus groups are in-person interviews of small groups of students. Both can be used to ask participants to reflect on themselves and their experiences and also to collect information on their behaviors. Because interview and focus group responses can be wide ranging and generally do not yield numerical results, qualitative summaries are typically used to look for response patterns and themes.

General Principles. Several general principles should be observed when developing interview/focus group questions.
1. Make sure you thoroughly understand the learning you are measuring.
2. Questions should be ordered from the more general to the more specific.
3. Questions should be ordered by their relative importance to the learning outcomes. Thus, the questions of greatest importance should be placed early, near the top of the protocol, while those of lesser importance should be placed near the end.
4. Most interview or focus group protocols consist of fewer than a dozen questions.
5. Although it is wise to begin with an explanation of the program/activity in which you are measuring learning, avoid getting involved in a long explanation.
6. The beginning of the interview sets the tone and the agenda. Start by asking an introductory question that is easy to answer. For example, "Tell me a little bit about why you decided to sign up for this program."
7. Good interview/focus group questions take time to develop. It would not be unusual to go through three or four drafts before you arrive at the final set. If at all possible, pilot test your questions. Ask your colleagues to look at your questions.
8. Make sure you ask questions related to your learning outcomes. Don't ask a lot of extra questions.
Characteristics of Good Interview/Focus Group Questions. Consider the following characteristics of good interview/focus group questions.
1. Avoid closed-ended questions that are dichotomous (e.g., questions that can be answered simply "yes" or "no"). They provide little information and stifle discussion.
2. Open-ended questions allow the respondent to determine the nature of the answer.
3. Ask "what" instead of "why" unless there is a specific reason for asking why.
4. Ask direct questions about an event rather than asking respondents to "remember." If you do ask a question about something in the past, develop your question in such a way as to bring the respondent back into the original environment. For example, you might ask, "For the next few minutes, think back to your first leadership experience. How would you have defined 'leadership' then?" Then follow up with a question about "now": "In the last six weeks, we've explored a lot of aspects of leadership. How would you define 'leadership' now?" or "In the last six weeks, we've explored a lot of aspects of leadership. How has your definition of leadership changed, if at all?"
5. Questions should be phrased simply and in language that the respondent understands. Long, complex, multipart questions are not only difficult to understand but also difficult to answer.
6. Questions should be designed to determine how respondents structure the world, not how they respond to the researcher's view of the world.
7. Ask "balanced" questions. For example, if you ask what class a student likes best, also ask which class he/she liked least, or ask, "What are the pros and cons of XXXX?"
8. Be aware that the way questions are worded may place respondents in embarrassing or defensive situations. For example, instead of asking respondents, "Why aren't you involved in any volunteer opportunities on campus?" the same question might be asked as "What prevents you from getting involved in volunteer opportunities on campus?"
9. Ask for concrete details, examples, and stories.
ANALYZING QUALITATIVE DATA
Analyzing qualitative data generally follows these steps:
1. Collecting the raw data
2. Organizing and preparing the data for analysis
3. Reading through all the data (Creswell describes it as "dwelling with the data")
4. Coding the data (e.g., into themes, descriptions, or "short phrases, ideas, concepts or hunches that occur to you" (Creswell, 2002))
5. Inter-relating the themes/descriptions
6. Interpreting the meaning of the themes/descriptions
Here is a visual model of the coding process in qualitative research developed by Creswell (2002).
A Visual Model of the Coding Process in Qualitative Research (Source: Creswell, 2002)
1. Initially read through the text data (many pages of text).
2. Divide the text into segments of information (many segments of text).
3. Label the segments of information with codes (30-40 codes).
4. Reduce overlap and redundancy of codes (codes reduced to 20).
5. Collapse codes into themes (codes reduced to 5-7 themes).
Coding strategies may include any or all of the following:
1. Use comments in Word (highlight text and "add comment") using the reviewing tools.
2. Print the text and code it by hand with colored highlighters or pens.
3. Use software programs (e.g., NVivo).
4. Cut and paste text onto index cards or sticky notes, then sort and group the cards/notes by hand.
5. Identify quotes or phrases that you want to be sure to capture in your final report.
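Whichever strategy you use, steps 4 and 5 of the coding process come down to counting codes so that overlapping ones can be collapsed into themes. Here is a minimal Python sketch of that tally; the codes are invented for illustration and only loosely echo the transcript example that follows.

```python
# Minimal sketch (hypothetical codes): tally coded segments to spot codes worth collapsing.
from collections import Counter

# One code per coded segment, produced by hand, in Word comments, or in NVivo.
coded_segments = [
    "limited pre-college diversity", "limited pre-college diversity",
    "positive residence hall interactions", "exposure to new religions",
    "exposure to new cultures", "positive residence hall interactions",
    "influence of courses",
]

code_counts = Counter(coded_segments)
for code, count in code_counts.most_common():
    print(f"{count:2d}  {code}")
# Near-duplicate or frequently co-occurring codes are candidates to merge into a theme.
```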
Below is an example using text from an interview done in 2003 that explored the development of intercultural sensitivity (Bennett, 1993) among undergraduates at Northwestern University. This is just a portion of the entire transcript, but you can see how the coding began.
Interviewer: Tell me about your life before coming to Northwestern: your family, your neighborhood, and your high school. Student: Okay. I live right outside of Boulder, Colorado, in a community of about 70,000 people. It’s not a very diverse community. For example, in my high school we had maybe 20 African American kids in a school of like, 1,600. But we were more diverse in Latino. It’s maybe like 10% Latino. As far as religion, it’s not very diverse. Pretty much everyone is Protestant or non-practicing. I didn’t know any Jewish or Buddhist or Hindu people in my high school. My parents are married and I have one younger sister. She is two years younger. Our neighborhood--I think that we’re pretty much middle class, but no one where I’m from is wealthy. There’s no upper, upper class there, so for the area, my family had more money than some of my friends. But compared to a lot of kids here at Northwestern, I’m definitely middle class. That’s kind of my background.
Interviewer: What exposure to or experiences did you have with people who were different from you as you were growing up? Student: I definitely had very little experience with people who are different from me. All through growing up in all my classes, there was one African American student who went through all of elementary school with me. And he was everyone’s best friend and we all really liked him. It was a positive experience for all of us. But it was only one person. So I was friends with him. I didn’t really have any friends that were of a different religion than me. Most of my friends were about the same socioeconomically. There were a few Asian people. I wasn’t particularly good friends with any other ethnic group. Most of my friends were just like me. So I didn’t have very much experience with diversity of any kind really.
Interviewer: What was your first year living experience like? Where did you live? Was your roommate or were your hallmates from similar or different backgrounds from you? What did you learn about yourself and others your first year? And are you still in contact with your freshman roommate and hallmates? Student: I lived in Bobb McCulloch. My roommate was very similar to me. She was--she even looked a lot like me. We had similar religious backgrounds, similar families—just one younger sibling, our parents still married. One of the things I mostly remember is having a lot of interaction with Jewish people that I had just never had before. There’s just no one where I’m from--I can’t say no one, but none of my good friends were Jewish. I just remember kind of realizing I didn’t even know--I had never even realized that there were--I don’t know exactly know how to put it exactly--that there were a lot of Jewish people. And then another thing that surprised me about being in the dorm was I had a friend who was an international student from Hong Kong. We become pretty good friends my freshman year. Just like hearing her talk about Singapore and this happened and then we were there and then we went there. Those were just experiences that I never had before. I remember having a lot of positive interactions with people who were different from me. I learned a lot, too, from them--from those who were similar to me and from those who were different. I just remember a wide variety. I liked it--learning about people and other cultures. And it inspired me to take some classes here that I think have been the most positive influence on me.
[Codes noted in the margin of this excerpt: "Impact of residence hall" and "Classes"]
HANDOUTS
Tips on Conducting Focus Groups
Campus Climate Focus Group Questions
Project Wildcat Journal Prompts
Week 6: Rubrics
LEARNING OUTCOMES FOR WEEK 6
At the end of Week 6, Student Affairs staff who participate in the Student Learning Assessment Workshops will [or will be able to] . . .
1. Draft a simple rubric to measure some aspect of student learning.
2. Use a rubric to evaluate SLWG1 posters.
DEFINITION OF A RUBRIC
Suskie (2009) suggests that a rubric is a scoring guide or "a list or chart that articulates the criteria and standards of achievement to be used to evaluate work." Similarly, Levy (2012) describes a rubric as "a set of criteria specifying the characteristics of an outcome and the levels of achievement for each characteristic." How does a learning outcome differ from a rubric? A learning outcome describes what students will know or be able to do as a result of participation in a program/activity. A rubric describes the expected properties of that demonstration (the criteria) and the possible levels of achievement/performance (the standards). Rubrics enable the collection of both qualitative and quantitative data.
WHEN RUBRICS ARE USED Rubrics are a useful tool for assessing student learning, especially when the assessment strategy includes an opportunity for students to demonstrate or “perform” what they have learned. This includes presentations, role plays, assignments that require teamwork, or performances. Rubrics are also useful when assessing learning through reflection papers, portfolios, or journals.
TYPES OF RUBRICS
Suskie (2009) described five kinds of rubrics: checklist rubrics, rating scale rubrics, descriptive rubrics, holistic scoring guides, and structured observation guides.

Checklists
A checklist rubric is a simple list indicating the presence of the things you're looking for in, for example, a risk management plan or a student learning assessment poster. A checklist rubric will not provide any detail about the quality with which criteria are met, only whether each criterion is present or not.

Example of a Checklist Rubric for a Website (Suskie, 2009, p. 139)
Criteria for a well-designed website (check if present):
- The purpose of the site is obvious
- The site's structure is clear and intuitive
- Titles are meaningful
- Each page loads quickly
- Graphics and multimedia help convey the site's main points
- The design is clean, uncluttered, and engaging
- Spelling, punctuation, and grammar are correct
- Contact information for the author or sponsor is given
- The date each page was last updated is provided
Imagine the kinds of results you would get from this checklist if it were aggregated across a group with whom you were working. If a large percentage of the group failed to have a structure that is clear and intuitive (e.g., 70% of the "standards" are not present), then you, the trainer, may want to revise how you are teaching about websites, focusing on what students are not "getting." On the other hand, if all criteria are met by the majority of students, then that is, indeed, a great finding. Note that with a checklist rubric you can confirm that certain criteria are met, but it does not indicate the level of sophistication or quality.

Rating Scale Rubric
"A rating scale rubric is a checklist with a rating scale added to show the degree to which the things you are looking for are present" (Suskie, 2009, p. 138). The main drawback with rating scales is that the meaning of the numeric ratings can be vague. Without descriptors for the ratings, raters must make a judgment based on their perception of the meanings of the terms. For the same presentation, one rater might think a student rated "good" and another rater might feel the same student was "marginal."
Example of a Rating Scale Rubric for a Theater Presentation on Risk Management (Suskie, 2009, p. 140)
Rating scale: Strongly Agree (1), Agree (2), Disagree (3), Strongly Disagree (4)
Criteria:
- Clearly described the program/activity
- Described the process used to determine risk sources and categories
- Accurately identified the risk management issues that could arise
- Demonstrated evidence that the group had carefully considered the risk management issues
- Presented a plausible plan for dealing with risk management issues
- Demonstrated that there are adequate resources for dealing with potential risk management issues
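To echo the earlier point about aggregating rubric results across a group, here is a minimal Python sketch that summarizes rating-scale results per criterion. The ratings and the subset of criteria shown are hypothetical, not actual workshop data; the scale follows the example above, where 1 = Strongly Agree and 4 = Strongly Disagree.

```python
# Minimal sketch (hypothetical ratings): aggregate rating-scale rubric results across a group.
from statistics import mean

# ratings[criterion] = one rating (1-4, lower is better) per presentation or group rated
ratings = {
    "Clearly described the program/activity": [1, 2, 1, 2, 1],
    "Accurately identified the risk management issues": [2, 3, 2, 4, 3],
    "Presented a plausible plan for dealing with risk": [1, 2, 2, 3, 2],
}

for criterion, scores in ratings.items():
    pct_agree = sum(1 for s in scores if s <= 2) / len(scores)  # rated Agree or Strongly Agree
    print(f"{criterion}: mean = {mean(scores):.1f}, agreed or better = {pct_agree:.0%}")
```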
Descriptive Rubric
"Descriptive rubrics replace the checkboxes of rating scale rubrics with brief descriptions of the performances that merit each possible rating" (Suskie, 2009, p. 142). Descriptive rubrics are increasingly popular because they explicitly describe the dimensions of what you are measuring and the degrees to which students may be learning. But coming up with the descriptions for each level is not easy. Campus Labs recommends that, when building a descriptive rubric, you start from the outside (e.g., "beginner" and "advanced") and work your way in.
Example of a Descriptive Rubric for a Student Conduct Administrative Hearing
(The rows are the dimensions; the cells contain the descriptions for each level of the scale: 1 - Beginning, 2 - Developing, 3 - Accomplished, 4 - Advanced.)

Understanding Impact of Behavior
1 - Beginning: Unable to articulate how their decisions contributed to the violation or citation, or denies that they did.
2 - Developing: Articulates a vague understanding of how their decisions contributed to the violation or citation, but makes excuses for behavior.
3 - Accomplished: Articulates some understanding of how their decisions contributed to the violation or citation, but lacks detail.
4 - Advanced: Articulates a clear and detailed understanding of how their decisions contributed to the violation or citation.

Connection to Personal Values
1 - Beginning: Unable to articulate their personal values at all, or unable to articulate the conflict between the incident and their personal values.
2 - Developing: Can articulate values and articulates some understanding of the conflict between the incident and their personal values, but the connection is vague.
3 - Accomplished: Articulates an understanding of the conflict between the incident and their personal values, but lacks depth.
4 - Advanced: Integrates what they learned from the incident to affirm or develop their personal values.

Decision Making
1 - Beginning: Does not explain how they could have prevented the situation through use of good decision-making skills, or does not think they could have prevented it.
2 - Developing: Agrees that they could have prevented the situation through use of good decision-making skills, but cannot explain how.
3 - Accomplished: Explains how they could have prevented the situation through use of good decision-making skills, but lacking some detail or important elements.
4 - Advanced: Clearly explains, in detail, how they could have prevented the situation with better decision-making skills.

Effect on Community
1 - Beginning: Cannot articulate if or how their behavior affected the community, or does not think it did.
2 - Developing: Agrees that their behavior affected the community but cannot explain how.
3 - Accomplished: Is able to state that their behavior affected the community, but lacks a clear understanding of how.
4 - Advanced: Is able to state that their behavior did affect the community and has a clear understanding of how.

Plan for Future Behavior
1 - Beginning: Cannot articulate plans for how they will change their future behavior.
2 - Developing: References changing future behavior, but cannot explain what that means to them.
3 - Accomplished: Articulates some behavior changes, but lacks detail in explanation.
4 - Advanced: Clearly articulates specific changes in future behaviors.

TOTAL:
Holistic Scoring Guide
Holistic rating scales use a short narrative of characteristics to award a single score based on an overall impression of a student's performance on a task. A drawback to using holistic rating scales is that they do not identify specific areas of strength and weakness and are therefore less useful for focusing your improvement efforts.

Example of a Holistic Rubric for Assessing a Student Essay (Source: Allen, 2004)
Inadequate: The essay has at least one serious weakness. It may be unfocused, underdeveloped, or rambling. Problems with the use of language seriously interfere with the reader's ability to discern what is being communicated.

Developing Competence: The essay may be somewhat unfocused, underdeveloped, or rambling, but it does have some coherence. Problems with the use of language occasionally interfere with the reader's ability to discern what is being communicated.

Acceptable: The essay is generally focused and contains some development of ideas, but the discussion may be simplistic or repetitive. The language lacks syntactic complexity and may contain occasional grammatical errors, but the reader is able to understand what is being communicated.

Sophisticated: The essay is focused and clearly organized; it shows depth of development. The language is precise and shows syntactic variety. Ideas are communicated clearly to the reader.
Structured Observation Guides
A structured observation guide is a rubric without a rating scale. Structured observation guides are subjective and qualitative, but nonetheless direct and valid, assessments of student learning. Here are the instructions Suskie (2009) gives for creating a structured observation guide:
1. Take informal notes the next time you evaluate students' learning in your program/activity. How did you know they had learned what you had hoped they would? What evidence was there? Did you notice any common patterns or themes when students learned what you hoped they would?
2. Those factors, patterns, or themes represent your implicit goals and expectations for your students.
A Structured Observation Guide for a One-Act Play
(Each dimension is followed by space for notes.)
- Pace and rhythm
- Characterizations
- Stage presence and business
- Stagecraft: costume, lighting, set, and sound designs
- Creative vision and risk taking
- "Sparkle" and audience engagement
- Total integrated production effect
STEPS FOR CREATING RUBRICS
1. Identify learning outcomes - what do you want students to know or be able to do?
2. Look for models. Rubrics are becoming increasingly popular. See the resources below.
3. List the dimensions, or the things you are looking for - what do you want students to accomplish? Focus on the most significant skills. These will become your dimensions.
4. Develop the rating scale. Label each level with names, not just numbers.
5. If you are developing a descriptive rubric, fill in the boxes. Some recommend using an even number of scale points to avoid the tendency to score in the middle.
6. Test it out. Make sure the standards and descriptions are appropriate.
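Once you have chosen dimensions and labeled levels (steps 3 and 4), it can be handy to capture the rubric in a small data structure for scoring. Here is a minimal, hypothetical sketch in Python; it is not an official template, and the dimension names are borrowed from the conduct-hearing example only for illustration.

```python
# Minimal sketch (hypothetical): a rubric as a data structure, plus a simple total score.
RUBRIC = {
    "dimensions": ["Understanding Impact of Behavior", "Decision Making", "Effect on Community"],
    "levels": {1: "Beginning", 2: "Developing", 3: "Accomplished", 4: "Advanced"},
}

def total_score(student_ratings: dict) -> int:
    """Sum the level chosen for each dimension into a total rubric score."""
    return sum(student_ratings[d] for d in RUBRIC["dimensions"])

# One student's ratings (hypothetical)
ratings = {"Understanding Impact of Behavior": 3, "Decision Making": 2, "Effect on Community": 4}
print("Total:", total_score(ratings), "of", max(RUBRIC["levels"]) * len(RUBRIC["dimensions"]))
```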
RESOURCES
Here are a few resources that provide examples of rubrics.
- AAC&U VALUE Rubrics - 16 rubrics including Civic Engagement, Ethical Reasoning, Global Learning, Teamwork, and others (http://www.aacu.org/value)
- NILOA's Rubric Resource Page - examples and additional readings (http://www.learningoutcomeassessment.org/Rubrics.htm#Description)
- Student Learner Outcomes and Rubrics - Texas A&M (http://sllo.tamu.edu/)
- Rubric Bank - University of Hawai'i, Manoa (http://manoa.hawaii.edu/assessment/resources/rubricbank.htm)
HANDOUTS
ACPA Rubrics for Professional Development
AAC&U Civic Engagement Rubric
Week 7: Presenting and Using Data and Findings in Meaningful Ways
LEARNING OUTCOMES FOR WEEK 7
At the end of Week 7, Student Affairs staff who participate in the Student Learning Assessment Workshops will [or will be able to] . . .
1. Describe at least three ways to display quantitative data.
2. Describe at least two ways to display/report qualitative data.
3. Describe any gaps between what was intended and what students learned in the program/activity and how they will use the data to improve the learning experience.
4. Create a poster describing their student learning assessment results for the 2015 Student Affairs Poster Gallery session in June.
PRESENTING DATA AND FINDINGS
Data from your learning assessment projects can be presented in multiple ways. Members of SLWG2 will each present the data and findings from their learning assessment projects during a poster gallery session next summer. But there are other ways to present data and findings: a written report, a verbal report, a presentation to your staff, a staff development presentation, or a conference program.

Displaying Quantitative Data
Quantitative data is generally presented using charts, graphs, and/or tables. These can easily be created in Word, Excel, or PowerPoint. When considering how to present your quantitative findings, keep the audience in mind. A beautiful chart or graph that is difficult to understand may not adequately tell the story of your learning assessment project. Consider the following example from the Sustained Dialogue assessment project in 2013. At a glance, we can see that there is a difference between the pretest and posttest percentages.

[Bar chart: "The Percentage of Respondents Who 'Strongly Agreed' with Six of the Statements that Make Up the Controversy with Civility Value, Socially Responsible Leadership Scale, Pretest and Posttest." The six statements are: I am open to others' ideas; Creativity can come from conflict; I value differences in others; Hearing differences in opinion enriches my thinking; Greater harmony can come out of disagreements; I respect opinions other than my own. Pretest and posttest percentages are shown side by side for each statement.]
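For those who would rather script a chart than build it in Excel or PowerPoint, here is a minimal matplotlib sketch of a pretest/posttest bar chart like the one described above. The statement labels are shortened and the percentages are placeholders, not the actual Sustained Dialogue results.

```python
# Minimal sketch (invented numbers): grouped pretest/posttest bar chart with matplotlib.
import matplotlib.pyplot as plt
import numpy as np

statements = ["Open to others' ideas", "Creativity from conflict", "Value differences"]
pretest = [30, 35, 16]    # % who strongly agreed (hypothetical)
posttest = [53, 49, 28]   # % who strongly agreed (hypothetical)

x = np.arange(len(statements))
width = 0.35
plt.bar(x - width / 2, pretest, width, label="Pretest")
plt.bar(x + width / 2, posttest, width, label="Posttest")
plt.xticks(x, statements, rotation=15)
plt.ylabel("% Strongly Agree")
plt.ylim(0, 100)
plt.legend()
plt.tight_layout()
plt.savefig("pre_post_chart.png")  # hypothetical output file name
```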
Mean scores can also be displayed using charts and graphs. Again, this is easily done in Word, Excel, or PowerPoint.
Sometimes a table is the best way to illustrate your findings effectively. In the example below, the story you might want to tell is that half (51%) of the students reported that Northwestern contributed "very much" to their understanding of themselves. This percentage is higher than any of the other areas.

To what extent has your experience at Northwestern contributed to your knowledge, skills, and personal development in the following areas?
Response options (left to right): Very Much (4), Quite a Bit (3), Some (2), Very Little (1)
- Constructively resolving interpersonal conflicts: 28%, 35%, 28%, 9%
- Career- or work-related knowledge and skills: 30%, 37%, 25%, 7%
- Relating to people of different races, nations, and religions: 30%, 36%, 26%, 9%
- Understanding yourself (abilities, interests, limitations, personality): 51%, 36%, 11%, 2%
- Leadership skills: 40%, 34%, 21%, 5%
Another way to display quantitative data is to draw a model. Here is how staff from University Career Services illustrated their findings related to their Mock Interview learning assessment project.
Mock Interview Learning Assessment Project
87% of participants reported reading the materials provided prior to the mock interviews.

Model 1: Performance. Did readers perform better in the mock interview? Measured as a simple average of the rubric scores on each component, controlling for prior knowledge. [Path model: Prior Knowledge and Read Materials → Mock Interview Performance (Rubric Scores); p = .878]

Model 2: Confidence. Did readers see a larger confidence increase between the pre- and post-mock interview? Measured as a simple average of the confidence scores on both the pre-test and post-test, controlling for prior knowledge and performance. [Path model: Prior Knowledge, Read Materials, and Mock Interview Performance Score → Confidence; p = .003]
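Models like these can be tested with a regression. Here is a minimal sketch using statsmodels on synthetic data; the variable names (read_materials, prior_knowledge, performance) are invented for illustration, and this is not the actual University Career Services analysis.

```python
# Minimal sketch (synthetic data): an OLS regression in the spirit of Model 1.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 100
df = pd.DataFrame({
    "read_materials": rng.integers(0, 2, n),   # 1 = read the materials, 0 = did not
    "prior_knowledge": rng.normal(3, 1, n),    # hypothetical 1-5 self-rating
})
df["performance"] = 2 + 0.3 * df["prior_knowledge"] + rng.normal(0, 1, n)  # rubric score

model = smf.ols("performance ~ read_materials + prior_knowledge", data=df).fit()
print(model.summary())  # the p-value on read_materials answers the Model 1 question
```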
Displaying Qualitative Data
Qualitative data may be displayed by:
- Listing the themes that emerged from the data
- Selecting key quotes or exemplars
- Building tables or matrices
- Using diagrams to visually display theories or models that emerged from a qualitative study
Displaying direct quotes, short stories, or excerpts from interviews, focus groups, open-ended comments, self-reflections, and journals is a widely used method for describing themes. Using direct quotes is important because it allows the audience to examine the data collected and analyzed, to understand the findings of the analysis, and to evaluate the plausibility, credibility, or face validity of the claims and learning outcomes. Knowing how and when to include quoted text in a report is difficult and takes practice. A few guidelines may be useful:
- Choose excerpts that clearly support the claim you make in your findings. The best excerpts should be selected to support your interpretation and findings.
- Include enough text in the quoted segment so that your reader will understand what the participant was saying and meaning, but not so much text that there is extraneous or non-relevant material in the quoted passage.
- If relevant to your learning assessment project, include information that identifies the person quoted. For example, if you collected data from freshmen and seniors, it might be helpful to know which quotes were from freshmen and which were from seniors.
- When integrating quotes into a report, it is important that the authors guide the way readers attend to the quote. This is done by including written material before and after quotes that presents the researchers' understanding or interpretation of the quote.
- Keep in mind that the use of quotations and the articulation of findings interweave, or go hand in hand.
(Adapted from Qualitative Research Guidelines Project, Robert Wood Johnson Foundation, http://www.qualres.org/index.html) Here is an example from the Religious Life learning assessment project in 2013 – 2014.
WHAT did they tell us?
"This trip reinforced my [values] but also encouraged me to push them further in my own life."

Increase of Knowledge
"Looking back, almost every day here we have partaken in some sort of education regarding the history and politics of modern El Salvador."
"I don't think I've ever actually participated in so much history/politics 'immersion' as we've gone through. Even our presentation at UCM...could not inform me about the true historical and political realities here..."
"Being in El Salvador now even is the closest I've been to any war…This forces me to consider my own ignorance about similar situations around the world."

Personal Development & Social Responsibility
"Can we truly call ourselves Christian and claim to love and care for others if we ignore the problems of El Salvador?"
"What are the ways I am called to show God's face to others in the community? And what constitutes my community - Evanston? Chicago? The US? The world?"
Another way to display qualitative data is to create a word cloud. Wordle is one way to do this (http://www.wordle.net/). The clouds give greater prominence to words that appear more frequently in the source text, thus illustrating themes. Below is an example of a word cloud created from all the open-ended comments of staff who reported having high morale on the 2013 Student Affairs Staff Survey.
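If you would rather build a word cloud in code than with Wordle, here is a minimal sketch using the third-party Python wordcloud package; the comment text is invented, not the actual staff survey responses.

```python
# Minimal sketch (hypothetical comments): generate a word cloud image with the wordcloud package.
from wordcloud import WordCloud

comments = (
    "supportive supervisor great colleagues meaningful work flexible schedule "
    "students students learning professional development collaboration"
)

cloud = WordCloud(width=800, height=400, background_color="white").generate(comments)
cloud.to_file("morale_word_cloud.png")  # more frequent words appear larger in the image
```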
USING DATA AND FINDINGS TO IMPROVE THE TEACHING/LEARNING EXPERIENCE
The last two steps in the University's assessment cycle (see p. 7) are about using the data to improve the teaching/learning process.

Step 5: Identify gaps between what was intended and what was achieved.
Step 6: Make decisions; implement changes.

Improving the learning experience might include "tweaking" how you teach what you want students to learn in your program/activity. Maybe the learning outcomes you developed for your program/activity do not accurately describe the learning that is occurring. Or maybe, if students are not learning what you intended, the program/activity should be revised or abandoned. Finally, Bresciani (http://www.bchs.edu/files/u41/ic_and_Co-Curricular_Program_ReviewJohnston.pdf) reminds us of two additional things to keep in mind when thinking about the results of your learning assessment project.
1. Don't personalize the results, e.g., view them as failures.
2. Results should always be used to inform program discussions, decisions, and recommendations (e.g., course improvements, availability to the public, strategic planning, budget allocations and reallocations, etc.).
GUIDELINES FOR POSTERS
The 2015 Student Affairs Poster Gallery Session is Tuesday, June 23, 2015. Below are guidelines for creating posters and the deadline for getting PowerPoint slides to Student Affairs Assessment.
1. Posters should be created in PowerPoint. The standard size should be 40” wide x 36” high. To adjust the size of the PowerPoint slide, follow these instructions (a scripted alternative is sketched after this list):
a. Open PowerPoint.
b. Click the “Design” tab.
c. Click “Page Setup.”
d. Change the width setting to 40 and the height setting to 36.
e. Leave the orientation on landscape.
f. Click “OK” and go back to the PowerPoint slide.
2. You can use whatever font you want, but do not use a font size smaller than 28. Anything smaller than 28 will be difficult for the viewer to read. Pictures might make your poster more interesting.
3. All posters must have the following common elements:
- The title of the learning assessment project
- The Division learning outcomes to which this program/activity “maps”
- The learning outcomes for the program/activity (Students who participate in/engage in [program/activity] will or will be able to . . .)
- A description of your assessment strategy
- A description of your teaching strategy/strategies or curriculum
- A description of your major findings (quantitative and qualitative) - display your data
- A description of how you are going to use the findings to improve your program/activity
4. Posters must be submitted in PowerPoint or as a PDF no later than June 16.
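As an optional alternative to the manual Page Setup steps in item 1, here is a minimal sketch using the third-party python-pptx package to set the 40" x 36" poster slide size; the output file name is hypothetical.

```python
# Minimal sketch: set the poster slide size programmatically with python-pptx.
from pptx import Presentation
from pptx.util import Inches

prs = Presentation()
prs.slide_width = Inches(40)    # matches the required poster width
prs.slide_height = Inches(36)   # matches the required poster height
prs.save("learning_assessment_poster.pptx")  # hypothetical file name
```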
References
REFERENCES
Allen, M.J. (2004). Assessing academic programs in higher education. Bolton, MA: Anker Publishing Company, Inc.
Bennett, M.J. (1993). Towards ethnorelativism: A developmental model of intercultural sensitivity. In R.M. Paige (Ed.), Education for the intercultural experience (2nd ed., pp. 1-51). Yarmouth, ME: Intercultural Press.
Bresciani, M.J. (2006). Outcomes based co-curricular program review. (http://www.bchs.edu/files/u41/ic_and_Co-Curricular_Program_Review-Johnston.pdf)
Creswell, J.W. (2002). Research design: Qualitative, quantitative, and mixed methods approaches (2nd ed.). Thousand Oaks, CA: Sage Publications, Inc.
Keeling & Associates, LLC. (2010). Format for writing student learning outcomes. Handout.
Levy, J.D. (2012). Using rubrics in student affairs: A direct assessment of learning. Campus Labs webinar.
Maki, P.L. (2010). Assessing for learning: Building a sustainable commitment across the institution. Sterling, VA: Stylus Publishing, LLC.
Palomba, C.A., & Banta, T.W. (1999). Assessment essentials. San Francisco: Jossey-Bass.
Survey Analysis Using PASW Statistics (2010). Chicago: SPSS Inc.
Suskie, L. (2009). Assessing student learning: A common sense guide. San Francisco: Jossey-Bass.