Assessment in Action
A Publication of the Rutgers University Division of Student Affairs
Rutgers Recreation: Using Social Media Assessment Tools
Spring 2013, Vol. 1, Issue 3
Editorial Staff
Scott Beesley*, Andrew Campbell, Charles Kuski*, Patrick Love, Tammy Mulrooney*, Dustin Ramsdell*, Patricia Rivas*, Cheyenne Seonia, David Speiser*
Design
Scott Beesley, Dustin Ramsdell
Photography
Scott Beesley, Abbas Moosavi, Rutgers Recreation
Special Contributors
David Eubanks
David holds a doctorate in mathematics from Southern Illinois University and has worked in academia for more than 20 years as a faculty member. He is currently at Eckerd College as Associate Dean of Faculty for Institutional Research & Assessment. David also blogs on the topic of assessment at highered.blogspot.com.
Questions?
Please Contact:
Student Affairs Marketing, Media, & Communications
115 College Ave, New Brunswick, NJ 08901
Phone: (732) 932-7949
Email: aia@rci.rutgers.edu
Visit us online! www.issuu.com/assessment_in_action

* Students in the College Student Affairs Master’s Program at Rutgers University
Welcome to Assessment in Action

Dear Colleagues,

We are coming to the end of another academic year. It is a good time for reflecting on the experiences of the past year to aid in our future growth. It is a time when such reflection can be an exercise in assessment. As David Speiser, in his article “No-Sweat Assessment,” tells us, “All assessment really requires is intent: even when you don’t know exactly what you’re looking for, if you look hard enough, you’ll find something.”

So, I encourage you to reflect on your work of the past year and assess it through that reflection. What follows are some questions to guide that assessment reflection. You do not have to use them all. Pick a few that you find meaningful or suspect will lead to a rich reflection experience. Write down the answers to the questions you choose and use them in the summer ahead to guide your planning.

• What were you aiming to achieve when the year started? Did your aims change during the year? If so, how?
• To what degree do you believe you achieved your aims? How do you know?
• With what parts of your job did you struggle this year? How might you address those as you head into a new year?
• What did you learn this year? How did you apply that knowledge to your job?
• What did you discover that you need to know or know more about?
• In what ways did you contribute to the growth and development of students?
• What competencies did you learn or improve this year?

There can be many more questions to aid in reflective assessment, but these are a good starting point. I invite you to share your reflections with me. I am curious to see some of the growth and development that has occurred in the division this past year. Good luck with closing and best wishes for a relaxing and/or productive summer!

Sincerely,
Patrick Love
Love@oldqueens.rutgers.edu
Follow me on Twitter: @pglove33
In This Issue

No-Sweat Assessment, by David Speiser
Best of the Blogs: Networking 2.0 for Assessment Professionals, by David Eubanks
Knowing Your Students, by Charles Kuski
Cover Story: Using Social Media Tools, by Patty Rivas and Kevin O’Connell
Are you tweeting? Using Facebook? Engaging students with social media platforms? Patty Rivas and Kevin O’Connell discuss why it is important to include social media in your department and how you can use it for assessment.
The Annoying Survey, by Scott Beesley
Assessment Reflection, by MaryLee Demeter
When Used Properly, Data’s Pretty Cool, by Tammy Mulrooney
No-Sweat Assessment By David Speiser
There we sat, John and I, wrapping up our one-on-one in half our scheduled time. I figured I could no longer run from the experiment. I had to talk about it: “Post-Program Evaluation: Trends in What We’ve Learned.”

Originally, this “Special Topic” in my miniature lecture series during individual meetings was meant to provide teachable moments with my large staff. I thought it would be my chance to teach them lessons on programming or related subjects. For my first round of one-on-ones, my presentation was a refresher on the model of program development that I created at the beginning of the year. It was a short spiel that I felt relatively comfortable reiterating. But this time, I wanted to try something new, and get staff members to look at the work they’d done in a new way.

After every program, staff members turn in a Post-Program Form, an evaluation of how the program went. The form asks basic questions, such as how many people attended, how much the program cost, what they learned, what they thought residents learned, and what they would do differently. Admittedly, I had not done much with the forms in the past—I mostly wanted the numbers of cost and attendance—but this semester, I felt there was more to be gained from this stack of self-assessments. So, without a set plan in mind, I put it on the agenda: “Post-Program Evaluation: Trends in What We’ve Learned.” It sounded official, but I wasn’t really sure what it would entail. I assumed I would figure it out.

Then there I was, having to figure it out. Sweating. I went into my binder, pulled out John’s four Post-Program Forms from last semester, handed him two, and held up two for myself. The first program was an alcohol education luau, the second was a Halloween candy-apple-making program, the third was a passive program dispensing information on study tips, and the fourth was a holiday diversity party.

“So what I’d like for us to do is look at your four Post-Program Forms and see if we can pull out any trends in your programming. I thought we would just run through them piece by piece and compare. So for example, let’s look at the first question. What were some of the successes of the programs?” He looked down and I looked down. “Well, I had a good turnout,” he said. “I have that too,” I said. “So generally, your programs are well attended. What else?” “The residents enjoyed it.” “These programs too. Great. What about, what could be improved?” “For my candy apple program, it could have been less messy.” “Okay, so teasing that out, maybe the program could have been a little more prepared? I have better advertising as an improvement too. So let’s move on to the next one: what did you and your residents learn?” “Hm. I don’t know.” “Let’s not think of these as your programs. Instead, think of yourself as a third-party observer to these programs. We know these programs are well attended and enjoyable, and could be a little bit more prepared and better advertised. But what other themes emerge?”

He looked stumped. I felt stumped. I felt like I was supposed to have an answer. The conversation felt rushed and I wanted to feel like I was leading it somewhere. I looked down at the papers and it hit me: “You know what I see?” “What?” “Responsibility. In all of your programs, you’re teaching your residents about responsibility. I know it may seem like a stretch, but look: you have a program about responsible drinking, about being a responsible student, responsibility to others in the form of cultural awareness, and even responsibility to yourself in enjoying and treating yourself.” “Wow. I didn’t think of it like that.”

I couldn’t believe it. I actually found a common thread among four seemingly random programs. And it made sense: I was meeting with one of the most responsible staff members I have. I felt an affirmation coming from this evidence that programs are ultimately a reflection of the programmer. As our time drew to a close, I said, “I think you can use this knowledge of yourself to further guide you in your programming, and elsewhere, evaluating whether a program or anything else fits in with your value of responsibility.”

Not all of those conversations—and I did end up having them all—ended up with one-word themes to encapsulate each staff member’s programs, though some did, and those were bonuses. For all of them though, based on this first attempt, I strove to find two common strengths and two areas needing improvement, and from there, encouraged them to think about how to incorporate the positive and prevent the negative in planning their upcoming programs. My experiment was a success, and it proved something to me: assessment doesn’t have to be structured, numerical, or almost anything that I formerly associated with the word. All assessment really requires is intent: even when you don’t know exactly what you’re looking for, if you look hard enough, you’ll find something. So don’t sweat it. It’s just assessment.
David Speiser is a first year graduate student in the Rutgers Ed.M. College Student Affairs Program. He is currently an Assistant Residence Life Educator. Follow him on Twitter @SpeisUpYourLife.
Best of the Blogs
Networking 2.0 for Assessment Professionals
By David Eubanks
We are starting a new segment in AiA called “Best of the Blogs.” Each issue will feature a new assessment blog that we found interesting. This issue’s selection comes from David Eubanks’ February 5, 2013 post, which can be found at http://highered.blogspot.com/. David holds a doctorate in mathematics from Southern Illinois University and is currently at Eckerd College as Associate Dean of Faculty for Institutional Research & Assessment.
That assessment has grown as a profession is obvious from the size and number of conferences devoted to the topic, and there is a thriving email list at ASSESS-L (http://lsv.uky.edu/archives/assess.html) where practitioners and theoreticians can hold asynchronous public conversations. There are, however, limitations to this approach, and the purpose of this post is to speculate on more modern professional social networking that might benefit the profession. I just turned 50, so my first response to any new idea is “Why is this important? I don’t have much time left, you know.” So let’s start with
Why?
1. To find out what other people think about something related to assessment.
2. To connect with others who have similar assessment interests.
3. To disseminate information, such as job listings, conference announcements, or research findings.
4. To help establish a portfolio of professional activity.

One of the things on my personal wish list is a repository for learning outcomes plans and reports that could be seen and commented on by others. I think this transparency would reduce the variability in (e.g.) accreditation reviews of same. This leads to
How?
Below I’ll describe some of the models that I have come across. There are surely others.
Email Lists

This is currently done. There’s a searchable archive, but it’s not tagged with meta-data to make browsing and searching easier. My purely subjective ratings by motivations 1-4 listed above are:
1. Email is great for finding out what others think, but the relative merit of any one response to a question is not easy to ascertain from the responses; there’s a silent majority. Conversations are only threaded by subject line.
2. Connecting with others is easy enough, but searching their post history to look at the subjects is not.
3. Disseminating information is a strength of the email list, until it becomes spam-like.
4. Participation on email lists is probably not something you can put on your resume.

StackOverflow Model

Stackoverflow.com, mathoverflow.net, and many other similar sites now exist to serve as a meeting place for professionals of different stripes. Comments:
1. The ‘overflow’ model excels at the Q&A give and take; this is its strong suit. Users post questions with meta-data to categorize them. Others can comment on the question or post a solution. All of these (question posts, comments, and solutions) can be voted up or down, and the original poster (or OP) can select a solution as the best, at which point it gets a check mark as ‘best answer.’
2. User profiles are quite detailed, with graphs of activity and reputation scores. These are easily associated with meta-data tags, so it’s nearly ideal for finding others with similar interests.
3. Each site has a culture and stated rules about what should and should not be posted. For example, ‘soft questions’ like “how much caffeine do you consume while writing code?” are generally frowned on, as job ads would be at most sites. But this is all adjustable. Like Reddit (and email for that matter), it requires some moderation, but users providing up or down votes provide most of the filtering.
4. The reputation system built into the overflow model is plausibly usable as an indicator of professional activity. For example, see the page for Joel David Hamkins at MathOverflow.net, a site for working mathematicians.

Other possibilities include using Facebook or LinkedIn or Google+ or Academia.edu or ResearchGate.net as a base platform. These all have the vulnerability of being beholden to a corporate interest, however.
Reddit-Style Discussion Board
Reddit.com is a segmented combination of news aggregator and discussion board, with threaded comments and a voting system to allow a consensus to emerge. It’s easy to create a ‘sub-reddit’ on the site itself, or one can use the open-source platform to start from scratch. Comments related to the motivations:
1. One can write “self-posts” that are like public text messages of reasonable length, to invite others’ opinions, OR post a hyperlink to something interesting on the internet. It’s very flexible as a general-purpose way to share ideas and create threaded conversations. Voting is a low threshold to involvement, and so there’s more participation.
2. One can easily see someone else’s post history, but these are not tagged with meta-data. There is a ‘friend’ feature to follow people of interest, and private messaging within the system is possible.
3. Reddit is a ‘pull’ rather than ‘push’ communication, meaning you have to actually go look at the site to see posts, as compared to emails, which arrive in your in-box whether you want them to or not. This is probably preferred by some and not by others. Many assessment professionals are probably too busy to go surf the internet during the day. There are RSS feeds, however.
4. Reddit has a reputation system built in, and active (and popular) users accumulate ‘karma’. But the site is not set up to be a meeting place for professionals, and it would have to be off-sited and rethemed to change the perception of it as a place for teens to post memes.
More Connections
In addition to posting learning outcomes ideas/plans/reports/findings for public review, a well-designed professional networking site could seamlessly overlap with conference presentations, so that individual sessions could have a backchannel on the site as well. Twitter can accomplish this through hashtags, but these are limited and not combined into an easy-to-find place. There are also possibilities for crowd-sourcing problems using collaboration sites, but this goes beyond the present scope.
• Online webinars on assessment
• Online RA training and assessment
• Online professional development workshops
• Hundreds of catalogued websites and listservs
• The internet resource for student affairs professionals
For more information, contact: Stuart Brown, StudentAffairs.com, Stuart@StudentAffairs.com
Knowing Your Students By Charles Kuski
As a Graduate Hall Director of a traditional first-year residence hall, I have always felt that one of the great things about my position is working with my residents through their whole first-year experience, as opposed to observing brief snapshots of their experience. During my first year in the position, I felt as though there was something I didn’t grasp about my students. I realized my problem was that my experience with my first-year residents started in
September when they moved into the building. This meant that I was late to the game.

My summer internship was with the Student Life Office of New Student Orientation and Family Programs, working with four professionals, four graduate interns, and 47 orientation leaders to plan twenty orientation sessions throughout the summer. Coming from the smaller Rowan University, I don’t think I was prepared for how massive this undertaking would become when I accepted the internship. More importantly, I didn’t realize that executing orientation sessions in the summer would flow seamlessly into overseeing a first-year building. Working for New Student Orientation in the summer and transitioning back to being a first-year Hall Director allowed me to understand a larger first-year experience.

I wanted to try to find out the kinds of experiences and preconceptions my residents were entering college with. My building doesn’t have any living-learning communities or special-interest populations; it houses only first-year students. I sent out a quick survey to each student after their first week of classes. The survey was eight short, simple questions designed to get some demographic information from my residents. With just eight demographic questions, I was able to find out a wealth of information about my new residents. One question that became increasingly important was: “Have you met with your academic adviser yet? (Yes, in person / Yes, over the phone / No, not yet)” Knowing the number of students who had not begun working with their academic advisers helped the Resident Assistants and me to better tailor our programming and community development efforts to increase the focus on academics in the hall.

I’m glad I realized the importance of demographic data early in the academic year. While demographic data are just bits of information on their own, collecting them from a specific population and adding context creates information that provides an accurate starting point, making assessment and evaluation at a later time richer and more contextual.

Charles Kuski is a second year graduate student in the Rutgers Ed.M. College Student Affairs Program. He is currently a Hall Director at Rutgers University. Follow him on Twitter @CLKuski.
Using Social Media Tools
By Patty Rivas and Kevin O’Connell

Assessment is commonly seen as surveys, quantitative and qualitative studies, and focus groups, but there is a whole other world out there that needs to be explored: real-time assessment happening on Facebook, Twitter, and Instagram.
These aren’t just social media tools anymore; they are the tools those in higher education need to start using in order to have a more well-rounded assessment strategy. As a Marketing & Social Media Graduate Assistant with Rutgers Recreation, I work alongside the Assistant Director of Marketing and Social Media, Kevin O’Connell, and together we have seen firsthand how important social media can be in assessment and student satisfaction. In fact, we
have trained colleagues at Rutgers University and the University of Maryland on creating engaging social media strategies to reach their target student populations. Social media has allowed the Recreation and Community Development department to get instant feedback on programs, facilities, services, and more. Instead of waiting to conduct an end-of-the-year survey, we are getting instant feedback daily.

For example, Kevin and I have trained our social media team to use Tweetdeck to monitor daily searches on Twitter for hashtags relevant to our unit. These hashtags can be anything, such as #Rutgers, #Werblin and #Cagym, or hashtags trending for an upcoming event such as #ColorMeRU. We are able to see what students are saying about us in real time. If students are tweeting about how great one of our events is, we know that it is a program we should continue in the future. If someone tweets about how a treadmill is broken, we are able to get someone
to fix it right away (and respond to their tweet to let them know we saw it). Sometimes it’s positive, sometimes it’s negative, but we always respond to feedback. Twitter allows us to engage with students in a forum where they spend most of their day. Also, students see how active Recreation is on social media, and know they can engage with us online to get information.
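This kind of listening can also be scripted. As a rough illustration only (not part of the Rutgers Recreation workflow), the short Python sketch below tallies mentions of a few tracked hashtags from a plain-text export of tweets, one tweet per line; the file name and hashtag list are hypothetical placeholders.

    # Minimal sketch: tally tracked hashtag mentions from an exported file
    # of tweet text, one tweet per line. File name and hashtags are
    # hypothetical examples, not a real feed or API.
    import re
    from collections import Counter

    TRACKED = {"#rutgers", "#werblin", "#cagym", "#colormeru"}

    def tally_hashtags(path):
        """Count how often each tracked hashtag appears in the export."""
        counts = Counter()
        with open(path, encoding="utf-8") as f:
            for line in f:
                for tag in re.findall(r"#\w+", line.lower()):
                    if tag in TRACKED:
                        counts[tag] += 1
        return counts

    if __name__ == "__main__":
        for tag, n in tally_hashtags("tweets_export.txt").most_common():
            print(tag, n)

Run regularly, a tally like this gives a quick quantitative read on which programs and facilities students are talking about, to sit alongside the individual replies staff send in real time.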
All assessment strategies will have occasional negative feedback, and some units may be concerned about this. However, we believe that the more negative feedback we receive, the better we can serve our students. Social media allows departments to gather valuable data, and the best part is, departments can respond to and alleviate a situation before it gets out of control. If someone writes a complaint on the Recreation Facebook
wall, we now have a chance to respond and help them, either by addressing their concern, offering them some sort of solution to their problem, or directing them to the appropriate department contact. While we do engage with students who have issues with our department, there’s nothing better than getting positive feedback and spreading that message.
While Facebook doesn’t have a search option similar to a hashtag search, it is also a must-have for functional areas. Facebook has allowed Recreation to be visible to students and to encourage their feedback. Not many students are willing to pick up the phone to call somewhere for information. Recreation makes it easier for them by having an online presence. On a weekly basis, students message us on our main Recreation Facebook page as well as our five subpages, asking how to register for a program, asking about the hours of our recreation centers, or wanting more information on employment in our department. Our social media team has a rule: every message or comment should be acknowledged within 48 hours.

Facebook and Twitter have helped the Recreation department gauge interest in key events throughout the year. In January, we posted a teaser picture for the “Color Me RU 5k” along with a caption, “The Color Run is coming to RU!” Immediately, it received hundreds of likes, shares, retweets and questions about registration. Our marketing department was able to share that information with our race director, and instead of projecting 300-500 runners, we were able to accommodate 1,000 runners. That post told us that this event would be huge, and we could expect it to reach capacity quickly. We have had to do little promotional work for this event, because students are
already so excited for it! We now know that this is a program students want, and we can use this feedback for future events.

Because we have created an engaging and interactive community through our social media platforms, when we released a more formal assessment survey to gauge the quality of our facilities, our fan base instantly gave us the feedback we were looking for. The key was keeping the survey short, creating engaging social media posts asking users to take it, and being honest with our community, telling them, “we just want to know what you think of us.”

Although social media is sometimes viewed as an informal means of communication, it’s a free and powerful assessment tool, and it can be adopted by anyone. Some units may say they don’t have time to do it, and to that we say: hire a student! Social media has helped Recreation to assess our unit and programs, but also to engage with students and get them excited about the Rutgers Recreation and Community Development department. As soon as a department invests time into social media, it becomes even more accessible to students. Your unit can do the same. #justdoit
About the Authors
Patty Rivas is a second year graduate student in the Rutgers Ed.M. College Student Affairs Program. She is currently an intern at the Rutgers Office for Violence Prevention and Victim Assistance. Follow her on Twitter @PattyRivas13.
Kevin O’Connell is the Assistant Director of Marketing and Social Media for Rutgers Recreation and Community Development. Follow him on Twitter @koco83.
AiA is Online! www.issuu.com/assessment_in_action
The Annoying Survey By Scott Beesley
During one of my internships in graduate school, I had the pleasure of doing data entry for an end-of-the-year survey. No sarcasm here—data entry is really cool stuff. It is when all of the hard work of survey design and administering the survey starts to come together. As I typed in the data, trends began to emerge from the hundreds of sheets of paper. I discovered that students want programs in the middle of the week, that the students were happy with financial aid services, and that it looked like library services could use some work. The data kept pouring in, and then I found it: the partly filled-out survey. The first few questions were filled out and then the survey came to an abrupt end. I shrugged it off and put it in a separate “incomplete” pile. I continued to code the data until I came
across what I believed was the most pointless waste of paper sitting on my desk. I entitled this survey, “the annoying survey.” Every single box for every question in the “annoying survey” had been marked. The survey culminated with a question that simply asked: “Please Provide Any Additional Comments.” The response? “These are annoying.” “Wow,” I thought to myself. “This person went out of their way to make my job more difficult and provide zero feedback.” I put the survey aside because I was so blown away by the lack of helpfulness of this student. But here’s the thing. I don’t remember a single result from that survey. I couldn’t tell you what functional area performed the best or what times that student wants to participate in activities on campus. The only result I remember is the
phrase “these are annoying.” Little did this student know, he or she provided me with vital information. Here’s how: this survey was administered at the end of the semester. It was spring; the students had been taking finals, completing papers, and being subjected to numerous surveys ranging from the performance of the student centers to course evaluations. It appeared that dreaded survey fatigue was in full effect for this student, and they went out of their way to let me know. I thought that if this student went out of their way to communicate that “these are annoying,” how many students took the lazy route and just filled in the first circles they saw so they could simply complete the survey, or couldn’t be bothered to fill it out at all? What I came to consider was that maybe the end of the year is not the best time to subject students to a survey if we want accurate data on how the students are feeling. Maybe
a survey wasn’t even the best method of data collection at all!

Another thing to consider about surveys is that they are mass-response systems. They tend to eliminate individual voices and move away from the specific and individual towards the general. Students may be annoyed with surveys because they think that their individual opinion doesn’t really matter—it is the overall impression of the mass response that matters. By rebelling against the structure of a survey, this student created their own voice, and it’s this individual voice that caught my attention.

Unfortunately, by the time I had this epiphany on the “annoying survey,” my internship had come to an end. I was never able to implement changes to this survey at the institution, but I now know in the future to keep an eye out for this type of data. The lesson I learned was to save every survey, even if it is incomplete or filled out with outlandish statements. Although it might not be the data that the survey intended to capture from the students, there is a good chance there is some hidden feedback that will be very useful for future implementation.

This is the “Annoying Survey.” Every box has been filled in, including a final suggestion.

Have you had your own “annoying survey?” Tell us about it! Email us at aia@rci.rutgers.edu.

Scott Beesley is a second year graduate student in the Rutgers Ed.M. College Student Affairs Program. He is currently an intern at Student Affairs Marketing, Media, & Communications.
Write for AiA! We’re looking for contributors from RU and beyond!
It’s not about publish or perish. It’s about making the culture of assessment visible on campus. It’s about making assessment part of every program. It’s about breaking the stigma of assessment as busy work. Would you like to learn more about assessment by writing about it? If you would like to find out more about the opportunity to write for AiA, please send us an email: aia@rci.rutgers.edu
Assessment Reflection By MaryLee Demeter

I had eight weeks to turn things around in my first Educational Psychology course. It was the eve of spring break, my students had just completed their first multiple-choice exam, and they didn’t perform as well as I anticipated. Rather than blame my students for “not studying enough,” I spent part of my spring break wondering what went wrong, and devised a plan to address the problem.

When we returned to class, I initiated a discussion with my students to determine why they did so poorly on their midterm: a multiple-choice, open-book exam. Boy, did they have a lot of complaining to do! “Professor Demeter, I have never heard of the word ‘moratorium’!” “Professor Demeter, you never even talked about cognitive apprenticeship in class!” My students’ feedback was extremely valuable in making effective changes to support their learning. I realized I wasn’t always using the “Educational Psychology jargon” and I was asking the impossible: that students master every concept in the book, regardless of whether the material was discussed in class. I also learned my exam lacked validity; I set out to measure knowledge
of various concepts discussed in the book, assuming a random selection of test items would accurately assess student learning. In reality, my students were overwhelmed with the amount of information, and it became clear they were not reading the textbook; they were relying only on their notes when it came time to study for the exam.

As a result of our discussion, I intentionally used Educational Psychology terms as part of the regular classroom dialogue, which encouraged students to use the terms as well. Instead of displaying concepts on PPTs and engaging in discussion for two and a half hours, I allotted a significant amount of class time to allow students to work together on group projects, and I met with each group to provide feedback to ensure their success. I also provided incentives for reading the material by offering formative quizzes so students could receive feedback and reflect on their learning without jeopardizing their grades. I carefully selected items for the final, and revised them to mirror the way the material was presented in class. The result? Final exam grades went up, and my students were much happier with the way class progressed in those last eight weeks!

That first semester was rough and my evaluations were less than stellar, but the experience taught me how important assessment is for student success. I have to admit, the negative feedback was intimidating, but rather than take it personally I used it as an opportunity to improve my teaching. Even more importantly, it taught me that assessing student learning of course material was not enough; it is also crucial to engage in ongoing assessment and reflection of my own teaching practices to effectively meet my students’ needs. After my first semester of teaching, I decided to administer my own course evaluations at the end of each semester, collecting quantitative information through Likert scales, as well as
qualitative information collected through open-ended questions. Over time, my courses improved, as reflected by increasingly positive evaluations. I asked students to share what they liked and didn’t like, and made appropriate changes to my syllabus so I could provide the resources and activities that really motivate them to learn. For example, one semester I required students to listen to NPR’s Education podcast and comment each week on our course website as part of their grade. When I read my evaluations at the end of that semester, I realized that not only did they hate the requirement for a number of reasons, it wasn’t adding value to the course. The very next semester it was off my syllabus (although I did continue to encourage students to subscribe to the podcast for professional growth). Ultimately, continuous assessment of my teaching practices proved to be critical to both my professional growth and my students’ success. My students also appreciated that I took the time to
listen to them and offer options to suit their learning needs, resulting in a more positive and
collaborative learning environment. When I left my adjunct position at MCC to join Student Affairs here at Rutgers, many students were sad to see me leave; they were planning to take future courses with me! I realized I ultimately owed my reputation to ongoing assessment! Without it, I had no legitimate way of knowing if what I was doing was effective or well-received. I always tell my students, “this is your class; please let me know how I can help!” Engaging in effective assessment practices allows them to do just that, and in turn helps me grow professionally. I wouldn’t be able to improve my teaching effectively without it!

As Student Affairs professionals, we are obligated to provide responsive and high-quality service to our students. When faced with adversity, rather than speculate, it is our duty to utilize our assessment skills to accurately determine the cause of the problem, along with a viable solution. Much like I approached the problem in my first Ed Psych course, we need to devise a solid plan to obtain the feedback necessary to make effective changes that support learning and success. Ongoing assessment is also crucial to our professional development, so we can know that what we are doing is effective and embraced. Those affected will also thank us for listening and offering alternatives that matter to them.

MaryLee Demeter is the Coordinator of Assessment and Research for the Division of Student Affairs at Rutgers University. She has been working in the field since launching her career at ETS in 2001, earning master’s degrees in Educational Measurement, Statistics and Program Evaluation (2005) and Educational Psychology (2009), both from the Graduate School of Education at Rutgers. Ms. Demeter has also worked for Academic Support Services for Student-Athletes, the Office of Disability Services, GSAPP, and the GSE at Rutgers. In addition, she served as adjunct professor of Psychology and Education at Middlesex County College from 2009 to 2012, while providing consulting services for a variety of assessment and research projects. Her passion lies in collaborating with colleagues on assessment initiatives to improve programs and services for the benefit of our students.
When Used Properly, Data’s Pretty Cool
By Tammy Mulrooney
“We want a quiet living option,” the data said. “It would be good for studying,” the data said. However, when it was time to sign up for that living option during the Housing and Residence Life Lottery at Rutgers, students did not live up to what the data from the annual EBI survey suggested. When I spoke to fellow res-lifers Debbie Francisco, Coordinator of Assessment, and Michael Miragliotta, Assistant Director for Marketing, Assessment, and University Relations, they shared that this phenomenon was not unusual.

I met with Debbie and Michael to speak about the uses of data drawn from the EBI survey given at Rutgers each academic year. While they were able to speak to the benefits of EBI and the reasons why the university surveys the student population, they made clear that this assessment stands as a picture of what is happening at our university.
Discover the Campus Labs® Platform
A more comprehensive approach to institutional effectiveness

Join us for one of our upcoming webinars:
Perspectives on Program Review: May 8
Building an Institutional Culture of Assessment: May 15
Creating Templates in Compliance Assist Planning: May 23
Writing Assessment Reports: May 24

To view our full list of webinars, visit www.campuslabs.com/support/training
Debbie and Michael provided that example to illustrate the notion that the data cannot be the only force by which Rutgers makes concrete decisions about the population of students it serves. Debbie stated, “It is a snapshot of how these people answered these questions on this day.” She made it clear that if those variables changed, there would be a distinct change within the answers received this year. Thus, Rutgers Housing and Residence Life chose to place the quiet living option on the availability list for the housing lottery system. If there were enough students who opted to live in the “Quiet Hall,” the university would provide it. Surprisingly, although the EBI results indicated that students wanted that option, the response during the lottery process showed that students actually did not want to live in a designated quiet area.

The notion that we should use data to influence, rather than dictate, our practice really resonated with me. I work as a Hall Director in an apartment-style community. I struggled with understanding why my students would not come to my staff’s programs. My residents would inform me and my Apartment Assistants that they wanted to come to our events, but when the time came, attendance was low. So, I attempted a varied set of techniques to understand why this was occurring. My staff and I brainstormed, we had small sessions with residents to figure out why there was poor attendance, and we tried different days and times of the week. Unfortunately, we did not see any change in results.

Then the EBI data arrived. Anticipating that the departmental EBI presentation would be a “snooze fest,” I got comfortable in my seat. However, I quickly found myself engaged and realizing possible explanations for my programming dilemma. While reviewing the data, which covered the entire apartment area, I saw that students were happy with what they were experiencing. They were pleased with staff members and our ability to respond to a crisis, and they were happy with how we provided for them in terms of programming efforts and resources. Needless to say, I could not wait to inform my staff.

When my Apartment Assistants came back to campus for the spring semester, we spoke about how we could use these scores to help guide our programming. One item in the data showed that the apartment residents were having trouble with time
management and problem-solving skills. We used this to our benefit and talked about ways we could alter our programming to respond to this need. During discussions I was informed that our Hall Government was planning to hold a study skills workshop once a month. Part of this initiative includes working on time management skills. I work in an area with mostly juniors and seniors who, as Debbie postulated, may be rediscovering their need to manage their time effectively. As such, many may be realizing that they do not possess the skills they originally thought they had. However, like when Rutgers offered the possibility of the “Quiet Hall,” I found that my students were not always receptive to programming that addressed their stated needs. After attempting this new type of study session, we found that while attendance increased, students only arrived for short periods of time. After
a trial run of the study workshop, it was clear that we needed to develop short, intentional programs that would give students the skills they were looking to acquire or improve.

After this experience I feel much less overwhelmed when confronted with data. Speaking with Debbie and Michael helped me to realize that, while the data is extremely important, it needs to be used to guide, not dictate, decisions. My takeaway is that making small changes in programming efforts to help students feel more confident in their skills may influence their overall satisfaction with Rutgers. EBI is a great tool and I am very happy to have a better understanding of it. Similar to the notion of using theory to influence and guide practice, I now feel much more comfortable with using data to help guide my practice as well.
Tammy Mulrooney is a first year graduate student in the Rutgers Ed.M. College Student Affairs Program. She is currently a Hall Director at Rutgers University.
DILBERT ©2010 Scott Adams. Used By permission of UNIVERSAL UCLICK. All rights reserved.