Assessment in Action
A Publication of the Rutgers University Division of Student Affairs
The Napkin Board: Rutgers Dining Services’ Low-Tech Assessment Tool
Winter 2013 Vol. 1 Issue 2
Editorial Staff
Scott Beesley* Chris Brittain Andrew Campbell Charles Kuski* Patrick Love Patricia Rivas* Cheyenne Seonia
Design
Scott Beesley
Photography
Scott Beesley Abbas Moosavi
Special Contributors
Lisa Endersby
Student Experience Advisor University of Ontario Institute of Technology
Gavin Henning
Higher Ed. Assessment Expert and Professor at New England College and Salem State University

Michelle Yurecko
Assistant Dean of Academic Assessment, The College of St. Elizabeth

Questions? Please Contact: Student Affairs Marketing, Media, & Communications, 115 College Ave, New Brunswick, NJ 08901. Phone: (732) 932-7949. Email: aia@rci.rutgers.edu
* Students in the Rutgers College Student Affairs Master’s Program

Welcome to Assessment in Action

Happy New Year and welcome to the second issue of Assessment in Action. This publication is but one piece of an emergent divisional program of assessment. In addition to AiA, we are working on a modular assessment training program that can be accessed by anyone or any unit in the division. However, the most significant addition to our assessment efforts is happening this month when Marylee Demeter joins us as a full-time Coordinator of Student Affairs Assessment and Research. This marks the first time we will have a full-time expert to assist with assessment activities throughout the division. Marylee has an M.E. in Educational Psychology and an M.Ed. in Measurement, Statistics, and Program Evaluation from the Graduate School of Education here at Rutgers. She has worked for the Center for Applied Psychology, Middlesex County College, The Graduate School of Education, and the Educational Testing Service. We look forward to her starting her work with us, and you will be hearing from her directly as we move forward with coordinating and supporting divisional assessment efforts. Marylee’s email address is Demeter@oldqueens.rutgers.edu. We welcome all feedback about this publication and also invite anyone interested in writing for it to contact us. Best wishes for a successful start to the spring semester.

Patrick Love
Associate Vice President for Student Affairs
Assessment in Action
In This Issue

MAAP Under the Microscope, by Cheyenne Seonia (page 04)
Net Promoter Scale, by Patrick Love (page 06)
Why I Hate Mission Statements, by Michele Yurecko (page 08)
Synergistic Assessment Methods, by Gavin Henning (page 10)
Interview with John Schuh, by Charles Kuski (page 15)
Telling Stories: A Data Sharing Pilot, by Lisa Endersby (page 18)
Cover Story
The Napkin Board: Rutgers Dining Services’ Low-Tech Assessment Tool, by Patty Rivas (page 10)

Joe Charette (pictured above) believes that assessment doesn’t have to be messy. In this article, Joe discusses how suggestions pinned to a cork board help Rutgers Dining Services assess what they are doing.
MAAP
Under the Microscope
By Cheyenne Seonia
When I first met Ji Lee, Director of the Asian American Cultural Center (AACC), it was amidst preparations for their annual open house. The center buzzed as students decorated the walls with warmth and completed last-minute tasks to help make the new academic year bright and prosperous. Despite the sense of urgency throughout the center, Ji was able to find a few moments to sit down with me and talk “assessment.” The AACC just completed an assessment using the new Mission, Alignment, Assessment and Planning (MAAP) program, a visual assessment matrix developed at Rutgers University. It is designed to help departments throughout Rutgers to clarify departmental contributions, enhance communication and collaboration, all the while aligning department goals with university goals. Essentially, it enables individual units of Rutgers University to work together and become unified.

“MAAP is a tool, like a microscope, it allows you to zoom in and focus on the small stuff, and zoom out to the big picture.” - Ji Lee

Assessment continues to grow as a significant part of the culture on campus, but the term is often met with a groan and a sigh of exhaustion by professionals. But for Ji, assessment does not carry the same anxiety as a midterm or final. Assessment in her case signifies the ability to take a snapshot of her department’s functions and locate places where growth is possible. Ji explained that when the AACC was first developed, the department was exceptionally small. With just a few staff members, assessment did not seem pertinent to their mission. However, as the department grew she recognized the need to start thinking about assessment and worked to get it on everyone’s radar. “Assessment is absolutely critical,” said Ji as she turned to MAAP to help her department flourish and function more efficiently. She explained that MAAP had allowed the AACC to adjust their “assessment lens” and realign their values, while strengthening their sense of community at Rutgers.

Ji noted MAAP advocated self-reflection for the AACC. The ability to look closely inside the department, as well as at how the department functions on a larger university scale, gave Ji and fellow AACC staff members a greater understanding of their work and achievements. “It allowed our department to not only step into a micro level but also to step back on the macro level and see that we are doing some good.”

On the micro level, Lee’s department looked closely at programs and events to understand how they were affecting students. “What resonated strongly with the MAAP matrix was that it gave us a chance to go back and allocate our resources a bit differently; maybe more of this type of program versus that type of program,” explained Ji. The ability to reflect on their department enabled clearer communication and more meaningful involvement with students.

On the macro level, MAAP was a way for AACC to take pride in all the work they were doing. Said Ji: “There’s a lot of pride in what we do here at the cultural center, but then to find out the work that you’re doing is being justified because it fits in with this larger mission, that’s a big win.”

The AACC began using the MAAP matrix like a microscope, zooming inside their department exploring individual goals, and zooming out, broadening their view and ultimately connecting with the university as a whole. It is this macro scope that creates a stronger sense of community within the AACC as well as within Rutgers University.

Keeping with the science theme, Ji further explained how MAAP provided clear identification of external and internal assessment needs by thinking of Rutgers as the human body. The human body has multiple functions and systems functioning as parts of a whole. This is analogous to the many departments and units within the University. “It’s important for all systems to be in place in order to have a smooth operating body,” explained Ji. Assessment, and specifically MAAP, allows for the evaluation of these systems and their functions in relation to a larger mission. In this sense, it is important for each part of the body to work on an individual level as well as on a collective level. “When the body needs to heal, just slapping a band-aid on the wound won’t cut it. There needs to be a more holistic approach in order for some institutional change to really take place. On the exterior, the wound may appear to be healed, but in reality it is because all systems on the interior level are working together properly. MAAP allows us to look at the work we do in a more holistic way,” she continued.

Just like focusing the microscope, Ji and her fellow department members were able to place the work they do within a larger frame, achieving greater successes and making more meaningful strides. “Folks can have a greater appreciation for the work that they do. MAAP definitely increases morale and essentially gets us to do our work better.”

As I walked back out onto the main floor of the AACC after our interview, I noticed everyone’s passion and drive as they prepared to welcome new students. The energy flooded the space and I could tell the AACC’s experience with MAAP truly gave them a deeper sense of pride and community. Ji’s scientific metaphors for understanding assessment ultimately transformed the idea of assessment into a crucial practice that is fun and motivating rather than dreary and exhausting. It’s the reward of positive growth that makes the necessity for assessment clear. Just walk into the AACC and see for yourself.

[Photo: Ji Lee, Director of the Asian American Cultural Center]

Cheyenne Seonia is an undergraduate intern with the Rutgers Student Affairs Marketing, Media and Communications department. She will be graduating in the Spring of 2013 with a degree in English.
Net Promoter Scale
By Patrick Love
It is always my intention to provide a program or conference feedback on my experience. But then I get the survey – 4 pages, 5 pages, 6 pages, or more, and with questions about many topics I just don’t care about. Sometimes I don’t finish those surveys. There’s got to be a better way, and I believe I’ve found it. It is a form of assessment that has been adopted by Apple, Amazon, Allianz, P&G, Costco, Vonage, Intuit, Google, American Express, Verizon, General Electric, Phillips, Overstock.com, T-Mobile, eBay, Jet Blue Airways, Symantec, TD Bank, and the Vanguard Group. And it is only TWO questions.

All of these companies have adopted the Net Promoter Scale (NPS) methodology to assess customer satisfaction and loyalty and to drive organizational improvement. The NPS was developed by Frederick F. Reichheld and Satmetrix Systems, Inc. NPS measures customer loyalty and satisfaction, and can be directly linked to organizational growth and profitability.

[Figure: a 0–10 rating scale, with 0–6 labeled Detractors, 7–8 Neutral, and 9–10 Promoters. Caption: The Net Promoter Scale uses two questions. 1) How likely are you to recommend [company/product/experience] to a colleague or friend? (0 - 10) 2) Why did you give us this score?]

The appeal of the NPS is its simplicity. Derived from a single Likert item, it is easy to understand, administer, evaluate, and explain. NPS assessment programs administer a two-item survey consisting of the Net Promoter Likert item and a second, open-ended question: 1) How likely are you to recommend [company/product/experience] to a colleague or friend? (0 - 10) AND 2) Why did you give us this score?

Respondents who respond with a 9 or 10 to the first item are classified as Promoters, and those who respond with 7 or 8 are classified as Neutrals. Respondents who answer 6 or lower are classified as Detractors. The Net Promoter Score is calculated by subtracting the percentage of Detractors from the percentage of Promoters:

NPS = % Promoters - % Detractors

The absolute range is from -100 to +100. What constitutes a good score varies depending on the industry; for example, good hotels expect to be in the 70 range, while 15-20 is considered good for airlines. Typically, corporations calculate the NPS using repeated, random samples, in order to monitor consumer satisfaction over time and gauge the impact of policies and programs. However, one-time administrations are also done frequently.

Responses to the open-ended question are what are most important to the individual answering the question, not the company administering the survey. Also, the qualitative data can be analyzed and linked to the score pattern of Promoters, Detractors, and Neutrals, in order to create a deeper understanding of participant satisfaction (and dissatisfaction). These analyses can generate actionable items and concrete programmatic changes grounded in robust participant response data. The survey is designed to be neither anonymous nor confidential, because organizations can follow up with both Promoters (e.g., to recruit further participation) and Detractors (e.g., to see if problems are correctable).

To date, there are no published sources documenting use of the NPS in higher education. However, while at Pace University, I successfully implemented a Net Promoter program in order to monitor the effectiveness of the Office for Student Success. The results helped to shape our services to students. I also used it to evaluate the 2011 ACPA Convention. Imagine evaluating an entire convention in two questions. The process gave ACPA actionable data to improve future conventions. Try it and let us know your experience!
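The scoring rule described above is simple enough to express in a few lines of code. Here is a minimal sketch in Python; the function name and list-of-ratings input are illustrative choices, not part of any official NPS tool:

```python
def net_promoter_score(ratings):
    """Compute NPS from 0-10 ratings.

    Promoters score 9-10, Detractors 0-6; the NPS is the percentage
    of Promoters minus the percentage of Detractors (-100 to +100).
    """
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Example: 3 Promoters, 2 Neutrals, 2 Detractors out of 7 responses
score = net_promoter_score([10, 9, 9, 8, 7, 6, 3])
```

Note that Neutrals (7–8) count toward the total number of respondents but contribute to neither percentage, which is why an all-Neutral sample yields a score of exactly zero.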
Patrick Love is the Associate Vice President for Student Affairs at Rutgers University. His areas of scholarship include assessment, organizational culture, leadership and management issues in student affairs, applying theory to practice, spiritual development, and GLBT issues.
Why I HATE Mission Statements
By: Michele Yurecko

I have a confession to make. I hate Mission Statements. There. I said it, and it’s too late to take it back. Given that I am an assessment professional, my aversion to Mission Statements may surprise you, particularly those members of the administration and evaluation crowd. In the workplace, most of us accept assessment models that place the Mission Statement at the core of institutional or program evaluation. Mission Statements are supposed to define our institutional identity and determine success, but are they really up to the job?

“A mission statement is defined as ‘a long awkward sentence that demonstrates management’s inability to think clearly.’ All good companies have one.” - Scott Adams, creator of Dilbert

A Mission Statement is a guiding statement of purpose. It is the big idea that characterizes the essential nature, values and work of an organization. Although the Mission Statement serves the important function of declaring an organization’s purpose and core values, it is important to recognize that Mission Statements come with limitations, limitations that, unfortunately, are rarely discussed. The Mission Statement is viewed as the heart and soul of an organization. We praise “mission driven companies” and “mission focused leadership.” However, the Mission Statement is merely one component in the complex process of assessment, and too much focus on that one component can narrow our vision and draw our attention from other salient factors that have enormous impact on organizational success.

The first problem I have with Mission Statements is that they are too darn aspirational. They describe what an organization intends to be, without necessarily taking into account what the organization actually is. If you want to know what an organization wants to be when it grows up, read the Mission Statement. If you want to know what an organization actually is, you’re going to have to dig a bit deeper than that. The “boots on the ground” realities, which deeply impact the day-to-day conduct of an organization, can have tremendous influence on the identity of an organization and the success of its efforts. However, the lofty Mission Statement frequently glosses over, or even ignores, these powerful factors.

In their aspirational smugness, Mission Statements are always looking forward; they never look backward. This belies an interesting contradiction with evidence-based practice. We frequently see Mission Statements touted and revered by the same individuals who cry for evidence-based decision-making and evaluation. Yet whereas the Mission Statement continually looks ahead, in an evidence-based culture, we frequently look back, using lessons from the past to shape practice in the future. Mission Statements articulate the endpoint, without capturing the dynamic, developmental nature of organizations.

“A Mission Statement is a dense slab of words that a large organization produces when it needs to establish that its workers are not just sitting around downloading Internet porn.” - Dave Barry, Humorist

In addition to being aspirational, Mission Statements are also overwhelmingly affirmational. They are like tiny little cheerleaders, root-root-rooting your organization to success with positive statements. Mission statements describe what an organization intends to achieve, what legacy it will leave, how it is going to shine. Mission Statements rarely discuss challenges, limitations, or anything that might cast the organization in a bad light. In effect, Mission Statements eliminate an organization’s challenges and limitations from the declaration of its identity. Yet, in reality, those limitations and challenges are key features that reveal hard-won successes and heroic efforts. With its positive and perky language, the Mission Statement can be reduced from a means of institutional renewal to a tool for self-promotion. Rather than foster genuine reflection, evaluation, and change, the Mission Statement can become a means to promote and market an organization with hand-picked evidence and assessment results. While this practice may enhance public perception, it denies an opportunity for real renewal. As pointed out by Robert Stake and his colleagues, “It is difficult to fix weaknesses in an atmosphere of self-promotion.”

Ethical and effective assessment practice must occupy the real estate between what an institution intends to be and what it actually is; between what the Mission states, and the story told by the “boots on the ground.” Assessment cannot be limited by the language of the Mission Statement, but must look beyond the aspiration and affirmation to construct a fuller picture of organizational identity. Effective assessment must take into account an organization’s mundane challenges and limitations, as well as its lofty Mission. In constructing this more comprehensive approach, the assessment professional can better serve the honorable goal of organizational improvement and renewal, and avoid the temptation of self-promotion.

Would we all be better off ditching these aspirational, affirmational, self-promoting Mission Statements? Should we just abandon the window dressing, and get back to work? Maybe. In spite of the fact that Mission Statements drive me crazy, I’m still not ready to vote them off the island entirely, but they must be placed in “timeout.” When attempting to evaluate organizational effectiveness, it’s always good to practice skepticism and deconstructionism, particularly with regard to the Mission Statement.

I’ll let you in on a little secret. When I started my most recent position as Assistant Dean of Academic Assessment, I didn’t write a mission statement for my office. This wasn’t an oversight or even an act of rebellion, but more of an experiment. I shelved the Mission Statement and instead composed a statement of Assessment Philosophy, one that I hope is powerful and flexible enough to support changing goals and an expanding role on campus. Do I feel a little rudderless without a Mission? No. I feel free and excited about where the “boots on the ground” will take me.

Michele Yurecko is the Assistant Dean of Academic Assessment, The College of St. Elizabeth.
The Napkin Board: Rutgers Dining Services’ Low-Tech Assessment Tool
By Patty Rivas

as·sess·ment /əˈsesmənt/: a review by students pinned onto a board? Wait, that doesn’t sound right. Actually, it’s exactly right. The “napkin board” has been utilized by Rutgers Dining Services for over 20 years.
I had the pleasure of interviewing Joe Charette, Executive Director of Dining Services. We discussed various assessment tools Dining Services utilizes, but the napkin board is what stood out to me. The napkin board began when students started writing feedback on napkins and taping them on the wall at dining halls. They would address each dining hall with its own moniker. For example, when writing a note in Brower Commons, they would begin with, “Dear Captain Commons.” Little did these students know, this napkin board would become a major catalyst for a lot of change within Dining Services. Students are able to leave notes at each dining hall. When they check back, there is an index card with an answer waiting on the student’s napkin. Each dining hall’s unit manager answers the napkins. Unit managers see questions such as “Why do you have Captain Crunch but not Captain Crunch with berries?” or “You have Skippy peanut butter, but I’m more of a Jiff guy.” Joe says they are able to accommodate requests such as these if “it doesn’t cost us any extra.” However, major changes have been
implemented as well. One of those is continuous service. Dining halls used to close in between meals in order to set up for the next shift. Many students wrote that if they had class at odd hours, they were never able to eat in the dining hall. Now, each dining hall stays open throughout the day. Other major changes have included take-out services, the sushi line at Brower Commons, and making Greek yogurt available. “It is the best feedback system that we have,” states Joe. Dining Services has decided to take the napkin board concept and go digital. They now have a Facebook page for each dining hall, a Twitter account, an iPhone app, and a computer kiosk at each dining hall. All encourage students to give their instant feedback. They plan on expanding to other social media outlets, such as Foursquare, in order to have more areas where students can give their opinions. “We are able to get information out there. Students aren’t reading flyers, ads in the Targum, or sitting down at desktop computers anymore. They get their information from their phones. We want to relate to students in the way that they want
During Hurricane Sandy, many students left thank you notes to dining hall staff members who were serving food during the storm. The one from Sasha reads, “On behalf of the familes in Marvin, Russell, and Johnon apartments. Thank you. You have no idea.”
to get their information and also hear what they say to each other,” says Joe. Every year, thousands of new students come to campus, and every year, there is something different about them. Lately, Joe has noticed that students want more local and sustainable foods, hormone-free foods, and organic food options. Dining Services has tried to accommodate this by being sensitive to allergies and providing various foods for special populations. Because every class may have different opinions, “the challenge is finding out what’s the most popular thing today,” says Joe. Dining Services is constantly assessing the student population, whether it’s through their napkin board or online services. Of course, Dining Services also utilizes traditional assessment methods as well. They use an outside company, Food Insights, to conduct bi-annual qualitative and quantitative surveys. These surveys are given to faculty and staff, alumni, and students. Various changes have been, or will be, made due to these surveys, such as the addition of a diner and a Starbucks on Livingston campus (so you can thank Dining Services for those!). They also use a system called “Satis Track” to assign surveys to specific groups of students. The students are sent two surveys throughout the semester, with the incentive of a $25 RU Express credit. Both of these assessment tools, as well as social media, will be used to garner feedback once several new dining areas open on Livingston campus, such as those mentioned above, and a “green grocer.” Surveys will ask students things like “Are the hours meeting your needs?” or “Is there something you need to leave campus to buy that you can’t find in the green grocer?” Each new unit will also have its
own Facebook page and napkin board. When asked if he thinks the use of social media will mean less student interaction for him and his dining crew, Joe responds, “Not at all. Social media gives people who don’t feel comfortable or don’t have the time to have a face-to-face interaction a chance to be heard. People who want to speak with you face-to-face will seek you out.” Dining Services is using traditional assessment methods, but also has various innovative concepts that create a well-rounded strategy. Joe believes other units are capable of doing the same.

“We make students understand that they are the purpose of our work, not an interruption to our work.” - Joe Charette

He advises units to make it known that they are seeking feedback from students. “Students think, ‘This place is huge, who’s going to listen to me?’” says Joe. “Make it known that what they feel is something you want to hear. You have to encourage that.” His tip is to start with pulling together a random sampling of people you know and ask them to recommend students who are interested and willing to speak with you. “Make students understand that they are the purpose of our work, not an interruption to our work,” he says.

Lastly, Joe emphasizes the importance of having fun. Dining Services holds an Iron Chef competition where students are invited to make dishes out of any of the 5,000 items in the dining hall. The dishes are brought to a panel of judges who analyze for plate presentation, taste, and creativity. This event prevents students from getting bored of the food, and shows them the many options they have at each dining hall. Also, other students eating at the dining halls see them making these dishes, and then start getting information and feedback from their peers. “This is a more fun way of getting this information out there than a newsletter. It also gives us feedback as to what students would create if they could make the menu,” says Joe. It’s certainly not traditional assessment, but those fun and creative initiatives are what have helped Dining Services make it known that they care about what students want.
Patty Rivas is a second year graduate student in the Rutgers Ed.M. College Student Affairs Program. She is currently an intern at Rutgers Office for Violence Prevention and Victim Assistance.
Synergistic Assessment Methods By Gavin Henning
When is assessment not just assessment? When assessment is also educational.

The notion that assessment can also be education is confusing to many folks, especially if they view assessment as an activity that is completed at the end of an interaction such as a program, service, or meeting. Synergistic assessment is assessment that provides data to help understand if goals were achieved and where improvements can be made, and that helps foster learning. Thinking in this synergistic way requires a shift in our mindset. Student affairs educators need to be actively learner-centered. If educators are learner-centered, then assessments will be focused on what students are receiving from an experience, not what they are doing as educators, programmers, or administrators. The result is that intentions will be focused on fostering learning in multiple ways. Some methods are more adept at implementing this type of synergistic assessment. In 1993, Thomas Angelo and Patricia Cross published an informative book entitled Classroom Assessment Techniques. This book was meant to be an educational assessment guide for college teachers, but there are benefits for student affairs educators. Most of the 50 techniques in this book can be used in student affairs, and virtually
all of these assessment methods identified in the book foster learning. Let me highlight just a few of these instructive methods so you can see the synergy of partnering assessment with learning. A useful classroom assessment technique is the 1-minute paper. There are many ways to adapt this technique for a variety of learning experiences. The 1-minute paper really is the Swiss Army knife of assessment methods. After an interaction, which could be a program, a meeting with a student, a student organization meeting, or even a conduct hearing, the educator takes a few moments to ask the student to respond to a question using an index card. The question is usually some form of “What was your major take-away from this activity?” In addition to allowing the educator to learn if the student took away from the activity what she was hoping, this method also creates what can be called a “reflection trap.” The 1-minute paper creates a space for the student to reflect on what he/she learned. The reflection trap is the contemplative oasis in a desert of overstimulation. It forces the student to reflect on their learning when there are few opportunities to do that. Another benefit of the synergistic approach of the 1-minute paper is its ability to assess learning at all levels of Bloom’s revised cognitive taxonomy (below).
• What are five ways identified in the readings that can help you reduce stress? (Remembering)
• Based on the floor meeting, what do you believe are at least three reasons we don’t allow alcohol in the residence halls? (Understanding)
• How can you use what you learned in these activities in your student organization? (Application)
• After participating in the ropes course, describe the keys to success for the group. (Analyzing)
• As you reflect on this past year as I have served as your organization’s advisor, in what areas have I been most effective and in what areas can I improve? (Evaluating)
• Based on our conversation regarding potential careers, what would be four steps in your action plan? (Creating)
The “muddiest point” is a classroom assessment technique that is the opposite side of the 1-minute paper coin. Where the 1-minute paper helps both student and educator understand what the student learned, the muddiest point helps both understand what the student doesn’t know or is struggling to learn. The muddiest point can be used in conjunction with the 1-minute paper exercise by having students use the back side of the index card. This method also serves as a reflection trap, providing students an opportunity to reflect on what they are confused about from the learning activity or what they would like to know more about. Questions are usually a variation of “What are you still confused about?” or “What would you like to know more about?” This helps the educator know what he/she needs to do a better job explaining next time. If there is an opportunity to follow up with the student or the group, the educator can also resolve the confusion. Directed paraphrasing is another classroom assessment technique that fosters student learning. This method challenges learners to paraphrase a technical statement or passage into language that is more colloquial. This is a fabulous technique for RAs. One of the most challenging tasks for RAs is
describing the alcohol policy to students on their floor during the first meeting in fall. This is an often-scary experience. Directed paraphrasing can be a great teaching tool. During a 1-on-1 meeting between hall director and resident assistant before the first floor meeting, the hall director can ask the RA to paraphrase how he/she would describe the alcohol policy. This method allows the hall director to know if the RA can accurately summarize the policy in student language. It also provides the RA the opportunity to consider how she would like to paraphrase the policy and to practice this statement in a supportive environment. This recitation can help ease the discomfort of the RA, allowing for effective communication between RA and student. There are many more assessment techniques that foster learning in addition to helping one understand if a learning activity or interaction achieved its espoused goals, or that identify ways to improve implementation of the activity ahead of time. Synergistic assessment begins with a learner-centered individual who sees assessment as more than just an activity and is continuously seeking new and innovative ways to help students learn. I challenge you to be that synergistic assessor. In doing so, you will become a better educator.
Gavin Henning has nearly 20 years of experience in student affairs and over 10 years’ experience in student affairs assessment and planning. He teaches student affairs assessment and research in the master’s programs at New England College and Salem State University. Gavin’s scholarship has been published in professional journals, and he has been an invited speaker at regional and national conferences.
Write for AiA! We’re looking for contributors from RU and beyond.
It’s not about publish or perish. It’s about making the culture of assessment visible on campus. It’s about making assessment part of every program. It’s about breaking the stigma of assessment as busy work. Would you like to learn more about assessment by writing about it? If you would like to find out more about the opportunity to write for AiA, please send us an email: aia@rci.rutgers.edu
Assessment in Action
Interview with Assessment Expert
John Schuh By Charles Kuski
Assessment ought to be rewarded but I think in our contemporary environment, this ought to be thought of as a routine aspect of student affairs work. Programs that aren’t engaged in routine assessment are simply incomplete. - John Schuh
Dr. John Schuh was gracious enough to sit down and talk with me about how the subject of assessment has changed and evolved since he started writing the books that are now used as standards in the field. While Dr. Schuh recently “retired” as a Distinguished Professor of Educational Leadership and Policy Studies at Iowa State University, he still serves as an Editorial Consultant for Jossey-Bass, identifying and advising authors in the higher education and student affairs areas. He admits that he looks at his work schedule and questions whether or not he’s really retired. Dr. Schuh comes from an impressive residence life background and served as the Associate Vice President for Student Affairs at Wichita State University. In addition, he held administrative positions at the University of Kansas, Indiana State University, Indiana University Bloomington, and Arizona State University.
Even though Dr. Schuh has authored and coauthored multiple books on assessment in student affairs, he still describes his introduction to assessment as “falling into it.” He never took a formal assessment course while earning his master’s in counseling at Arizona State University; his first lesson in assessment came in the form of a 1976 Western Interstate Commission for Higher Education (WICHE) seminar by Dr. Ursula Delworth while he was working for the Residence Life Office at ASU. WICHE offered a simple promise: they would teach a team of “student personnel workers” (this was 1976, after all) an ecosystem assessment model in exchange for their agreement to actively use the model at their institution. Dr. Schuh contrasts this story with the emphasis on effective assessment initiatives that we see in student affairs today. When he attended that workshop, he said, assessment with the sophistication we know today just wasn’t conducted. There was no professional development for assessment, no convention or conference workshops on the topic, and certainly no CAS standards. Even in 1996, when Dr. Schuh and Dr. Lee Upcraft published Assessment in Student Affairs, assessment was broadly defined as “keeping track of services,
programs, and facilities, and whether or not such offerings have the desired impact.” Dr. Schuh contends that this now-sixteen-year-old definition really only captures an accountability dimension. In practice today, that accountability dimension certainly still exists, but the focus now is on assessment and evaluation of what students have learned, how they have learned, and how they have changed and developed.
Dr. Schuh believes this could be due to a paradigm shift in student affairs as well as a changing external environment. Today, student affairs professionals measure departmental and divisional effectiveness more in terms of student learning and development than simply student satisfaction. Looking beyond the field, Dr. Schuh highlights a change in environment, especially with regard to governmental interests in higher education. There is more pressure on practitioners today to demonstrate effectiveness to a college’s stakeholders. Not engaging in routine assessment can leave an institution vulnerable to scrutiny from that external environment. Dr. Schuh asks, “How do you know if you’re doing a good job or not? It’s got to be more than, ‘we know what we’re doing, take our word for it.’” With regard to the changing environment, Dr. Schuh recounts simply, “that isn’t the way it’s always been.” As a broad statement, the luxury of labeling student affairs as a “faith-based profession” may be well behind us.
There’s a mindset created when you take on a project that you need to completely close the loop and have a means for measuring the extent to which a specific initiative, project, or pursuit has been successful. And that’s not bad—not by a longshot. - John Schuh
Dr. Schuh sees both pros and cons to the shift to an assessment focus. While assessment adds a dimension of pressure and stress to creating and improving programs and initiatives, it also suggests that there exists a mindset at the birth of a
project that we should completely close the loop. That is, we should have a means for measuring the extent to which a specific project has been successful. Dr. Schuh very clearly asserts, “it ought to be thought of as a routine aspect of student affairs work. Programs that aren’t engaged in routine assessment are simply incomplete.”
For a practitioner, engaging in routine assessment seems more easily said than done. When Dr. Schuh started publishing books on the subject, much of the resistance to assessment came from those who believed assessment was overwhelmingly about statistics; someone with no background in statistics couldn’t possibly assess effectively. Dr. Schuh advocates keeping assessment as simple as possible. As student affairs moves from a sole focus on assessment for accountability to a focus on the assessment of student development and learning, the not-so-statistically-literate become more fluent.
As a result of successful assessment initiatives, Dr. Schuh explained, universities are able to assert their effectiveness and influence more boldly than thirty years ago. For example, at the University of South Carolina, there is an initiative to see if students in a certain first-year transition program are more likely to be retained and graduate on time. At The Ohio State University, there is a push to conceptualize and improve the sophomore-year experience and to redevelop a sophomore residence area. Dr. Schuh described these and other initiatives with excitement; universities around the country are engaging in assessment to create positive, sweeping changes to their students’ experience.
As Dr. Schuh gently reminded me at the beginning of the interview, by the time I was born in 1989, he had already been working full-time in the field for 19 years. Of course, he didn’t say that to be rude, but rather to underscore the vast change that has occurred in our field over time. During that time, he’s seen student affairs go from one ecosystem model to complex assessment and evaluation models spanning years and touching upon everything from the minute details of a student’s experience to the entirety of a student’s developmental process in college. It remains clear that assessment and evaluation are simply about how best to serve our students. I’m thankful that Dr. Schuh accidentally fell into assessment and brought it to the forefront of our profession. Be sure to look for Dr. Schuh’s monograph on this topic in the New Directions series, published by Jossey-Bass, due to arrive in the spring of 2013.
Assessment used to be just keeping track of how many students participated in a particular program or how many patient visits there were to the student health service. I think we’re past that and getting much more into what students have learned and how experiences are making a difference in their lives. - John Schuh
Charles Kuski is a second year graduate student in the Rutgers Ed.M. College Student Affairs Program. He is currently a Hall Director at Rutgers University.
Telling Stories:
A Data Sharing Pilot
By Lisa Endersby
Time and time again, I heard the phrase “Data rich, but information poor.” Say the word assessment in almost any student affairs office, and the conversation immediately turns to the data – piles of paper surveys, percentage scores, and student testimonials for marketing brochures. The data is everywhere and anywhere we look, but what happens to it after the report is written?
To add to this poverty of information, much of our data collection is done in silos. We hunt and gather data much like an animal might prepare for winter hibernation. We shore up our data resources, our sights set on the ‘one day’ it will be useful. Particularly in larger departments, these stores get divided between different areas, each holding onto data that could celebrate and inform our work.
In an attempt to share our rich data resources across the department, I recently piloted a data sharing initiative at our staff retreat. Knowing that data and assessment are still tough words for some in our office, I titled the project ‘Storytelling at the Student Experience Center Retreat’. With a charge to tell their piece of our students’ story, my colleagues worked to create posters that would serve as a visual representation of the data they had collected. The retreat and initiative were timely; we had just come out of a busy summer of orientation and transition programs, along with the launch or re-launch of new and improved programming. With all of this activity came a flurry of assessment and report writing. Reports and presentations were shared with directors, higher-level committees, and some students, but never, surprisingly, with each other.
Thanks in no small part to a talented and patient Multimedia Developer, almost every colleague and unit created a poster to share numbers and testimonials. From infographics to pie charts, to Wordles and event photos, each poster told a story. I kept reinforcing the need to tell our students’ stories – we each have a piece of the puzzle, a chapter of the larger novel, but we haven’t been able to put those pieces or pages together.
After a brief introduction, I handed over the finished products to their creators so
they could hang them around the room. Over lunch, I couldn’t help but swell with pride watching my colleagues examine these works of art. Using a one-minute paper, I asked colleagues to reflect on one new thing that they learned after viewing the posters. Many reflections spoke of surprise and awe over our facts and figures, noting “how highly regarded our programs [are] by survey respondents.” They discovered “how amazing our assessment results and feedback [have] been” and “how a poster can tell a story.” Colleagues who had little or no involvement with our orientation programs saw “how these events really connect students to campus.” Perhaps most importantly, my colleagues saw “that [our] students are engaged and hopeful about the future.”
This is only the beginning of their story. In creating these posters, we see now that the students are truly the authors of their own experience – we are the agents of change, the publishers and bookbinders who will help navigate the plot twists and bring their stories to life.
Lisa Endersby has taken on the title of ‘Advocate for Awesome.’ Her work in higher education spans defining and chasing student success in leadership development, career services, community engagement and, her most recent love, assessment. Lisa has presented and facilitated at numerous local and national conferences. She is an avid user of social media; follow her on Twitter (@lmendersby) to keep the conversation going.
Rutgers University does not discriminate on the basis of race, color, national origin, sex, sexual orientation, gender identity or expression, disability, age or any other category covered by law in its programs, activities, or employment matters. The following people have been designated to handle inquiries regarding the non-discrimination policies:
Judy Ryan, Title IX Coordinator for Students & ADA/Section 504 Compliance Officer
Office of the Vice President for Student Affairs
83 Somerset Street, Suite 101, CAC
p.
848-932-8576 ryan@oldqueens.rutgers.edu
Jayne M. Grandes, Director Office of Employment Equity, University Human Resources 57 US Highway 1, ASB II, Cook Campus p. 848-932-3980 grandes@rutgers.edu
For further information on notice of non-discrimination, visit http://wdcrobcolp01.ed.gov/CFAPPS/OCR/contactus.cfm for the address and phone number of the Office for Civil Rights that serves your area, or call 1-800-421-3481.
Save The Date!
Rutgers Student Affairs Awards for Excellence
May 22nd, 2013, 3-5 p.m.