Workshop #4: Data Analysis & Visualization
Mustang Assessment 101: Assessment support all year round!
The Goal: Do most of your planning/creating/troubleshooting in these workshops.
What’s happening today?
Review: Signature inquiry whats; assessment hows
Data analysis: Approaches for quantitative data; Approaches for qualitative data; Available tools
Data visualization: Key principles; Matching visualizations to needs; Available tools
Review
Signature inquiry basics
Signature inquiries (previously, “signature assessments”) are:
Focused assessment projects
Distinct from other types of less formal (e.g., “quick-check”) assessments
“Deep dive” inquiries into one critical problem of practice over the course of the academic year
Signature inquiry → Anthology Planning report on outcomes
Review
Signature inquiry basics
4A-i: Departmental Goal Number.
4A-ii: Departmental Goal (Full Text). ABCD!
4B-i: Measure(s): Description. Data and methods used to assess (e.g., student survey)
4B-ii: Measure(s): Direct or Indirect?
Direct = observable (e.g., knowledge test, minutes of wait time)
Indirect = participant reported (e.g., sentiment survey, focus group, interview)
4B-iii: Measure(s): Target(s). e.g., positive sense of belonging increases from 70% to 85%
4C-i: Findings: Target Status. Met / Partially Met / Not Met / No data collected
4C-ii: Findings: Detail. e.g., increased to 82%
When possible, disaggregate by demographics, location
4C-iii: Findings: Interpretation
4D-i: Action Plan: Steps. Now we know; what will we do?
4D-ii: Action Plan: Primary Stakeholders. Who’s involved in the Action Plan?
4D-iii: Action Plan: Products. What are we doing to implement the Action Plan?
4E-i: (Future) Status Update. Fully implemented / In progress / Not started
4E-ii: (Future) Status Update: Detailed.
Review
Assessment basics
How do we undertake an assessment project, anyway?
DECIDE
1. Develop a purpose statement
2. Identify stakeholders
3. Examine existing data
PLAN
4. Determine what data to collect
5. Decide from where/whom data should be collected
6. Determine time frame for project
7. Develop data collection tools
ACT
8. Collect data
9. Analyze data
USE
10. Share results
11. "Close the loop" by using assessment data
Review
Assessment basics
ABCD
A. Audience: To whom does the outcome pertain?
B. Behavior: What should the audience know/be able to do?
C. Condition: Under what conditions will learning occur?
D. Degree: How much will be accomplished (i.e., what level is “success”)? This is the target you aim to meet.
Review
Assessment basics
Audience / Behavior / Condition / Degree
Student Affairs Division staff who participate in Mustang Assessment 101 will demonstrate increased awareness of various forms of assessment by utilizing a different assessment approach from last year (25% of departments completing a signature inquiry).
Data Analysis
Approaches for quantitative data
Basic approaches incorporate only one data point (e.g., a count of how many students participate in various programs), potentially collected over multiple periods of time or among various groups of participants:
Trend analysis
Disaggregation
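Not part of the workshop materials, but as a minimal sketch of what these basic approaches can look like in one of the tools listed later (Python with pandas), assuming a hypothetical participation.csv export with columns term, program, and student_group:

```python
# Minimal sketch: trend analysis and disaggregation with pandas.
# The file name and columns (term, program, student_group) are hypothetical.
import pandas as pd

df = pd.read_csv("participation.csv")

# Trend analysis: participation counts for each term
trend = df.groupby("term").size()

# Disaggregation: the same counts broken out by student group
by_group = df.groupby(["term", "student_group"]).size().unstack(fill_value=0)

print(trend)
print(by_group)
```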
Somewhat complex approaches involve more than one data point and result in findings about the relationship between them (e.g., how participation in a program corresponds with students' graduation and retention):
Correlational analysis
Highly complex approaches involve multiple data points and sophisticated statistical calculations which, alongside generalizable samples, can result in findings about cause and effect (e.g., that participation in a program causes higher graduation rates), predictors (e.g., that program participants are more likely to graduate), and other strong causal claims:
Means testing
Propensity score matching
Regression
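A hedged sketch of these more complex approaches, again in Python (pandas, scipy, statsmodels), with hypothetical columns participated (0/1), gpa, and retained (0/1); propensity score matching is not shown, since it typically layers a matching step on top of a model like the one below:

```python
# Minimal sketch: correlational analysis, means testing, and regression.
# Column names are hypothetical; causal language still requires careful
# design and a generalizable sample, as noted above.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

df = pd.read_csv("outcomes.csv")  # columns: participated, gpa, retained

# Correlational analysis: does participation correspond with retention?
print(df[["participated", "retained"]].corr())

# Means testing: compare participant vs. non-participant GPA (Welch t-test)
gpa_yes = df.loc[df["participated"] == 1, "gpa"]
gpa_no = df.loc[df["participated"] == 0, "gpa"]
print(stats.ttest_ind(gpa_yes, gpa_no, equal_var=False))

# Regression: retention modeled on participation, controlling for GPA
model = smf.logit("retained ~ participated + gpa", data=df).fit()
print(model.summary())
```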
Data Analysis
Approaches for qualitative data
In general, qualitative data analysis involves:
1. Transcribing textual data (interviews, focus groups) to documents
2. Reading the documents closely
3. Coding key sections of the documents for central ideas, topics, themes, quotes, or other “chunks” of information
4. Gathering codes into broader categories
5. Describing the categories and their meaning for your assessment project
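To make steps 3 and 4 concrete, here is a minimal sketch (not an official workshop example) of tallying codes and rolling them up into categories; the excerpts, codes, and category names are all hypothetical:

```python
# Minimal sketch: counting codes applied to transcript excerpts and
# grouping them into broader categories. All content is hypothetical.
from collections import Counter

coded_chunks = [
    ("I finally felt like I belonged after orientation", "belonging"),
    ("The advisor explained the process clearly", "staff support"),
    ("My mentor checked in every week", "staff support"),
    ("I made friends through the program", "belonging"),
]

categories = {
    "belonging": "Community",
    "staff support": "Institutional support",
}

code_counts = Counter(code for _, code in coded_chunks)
category_counts = Counter(categories[code] for _, code in coded_chunks)

print(code_counts)      # how often each code appears
print(category_counts)  # how the codes roll up into categories
```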
Data Analysis Available tools
Native system tools
Microsoft Excel: Sum, %, Filters
Qualtrics: Crosstabs iQ, Text iQ
PowerBI / Tableau
SPSS / STATA / R / Python
Rev / OtterAI
NVivo, Dedoose
Sneaky issue: Cleaning your data!
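On that sneaky issue, a minimal sketch of light cleaning in pandas before any analysis; the file and column names (survey_export.csv, participated, belonging_score) are hypothetical placeholders:

```python
# Minimal sketch: cleaning a survey export before analysis.
# File and column names are hypothetical.
import pandas as pd

df = pd.read_csv("survey_export.csv")

# Standardize messy category labels (" Yes", "YES", "yes " -> "yes")
df["participated"] = df["participated"].str.strip().str.lower()

# Coerce numeric fields that arrived as text; bad values become NaN
df["belonging_score"] = pd.to_numeric(df["belonging_score"], errors="coerce")

# Drop exact duplicate rows (e.g., double form submissions)
df = df.drop_duplicates()

# Drop rows missing the fields the analysis depends on
df = df.dropna(subset=["participated", "belonging_score"])
```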
Data Visualization Key principles
In line with thinking of data visualization as a tool of communication, NCES (2017, pp. 19-28) provides four key principles "that serve as the foundation for helping viewers more readily understand information":
1. Show the data. Utilize labels and contextual information as necessary to enable an accurate interpretation of what's being shown.
2. Reduce the clutter. Always center the "take-home message" of a visualization and keep needless complexity in check. Don't try to do too much with any one data visualization.
3. Integrate text and images. Display visualizations with meaningful titles and, when necessary, introductory language or call-outs.
4. Portray data meaning accurately and ethically. Clearly convey which data is included versus not (e.g., "student survey respondents" as opposed to "SMU students"), pay attention to scale, and understand that the majority of assessment does not contribute to causal claims (e.g., "our program causes students to have a higher sense of belonging").
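As one illustration (not an NCES example) of how these principles might look in practice with matplotlib, using made-up numbers:

```python
# Minimal sketch: a bar chart that shows the data, reduces clutter,
# integrates text, and portrays meaning accurately. Data are made up.
import matplotlib.pyplot as plt

groups = ["First-year", "Sophomore", "Junior", "Senior"]
belonging = [72, 78, 81, 85]  # hypothetical % agreeing "I feel I belong"

fig, ax = plt.subplots()
bars = ax.bar(groups, belonging)

# Show the data: label each bar directly instead of forcing a lookup
ax.bar_label(bars, fmt="%d%%")

# Integrate text and images: lead with the take-home message, and be clear
# about who is represented (survey respondents, not all students)
ax.set_title("Sense of belonging rises with class year (survey respondents)")
ax.set_ylabel("Percent agreeing")

# Portray meaning accurately: keep a full 0-100 percent scale
ax.set_ylim(0, 100)

# Reduce the clutter: drop box lines the chart doesn't need
ax.spines[["top", "right"]].set_visible(False)

plt.show()
```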
Data Visualization
Matching visualizations to needs
Trend analysis
Disaggregation
Correlational analysis
Means testing
Propensity score matching
Regression
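The pairings below are common conventions rather than the workshop's official mapping, sketched with matplotlib and made-up data: trend analysis with a line chart, disaggregation with grouped bars, and correlational analysis with a scatter plot (means testing and regression results are often shown as bars with error bars or a scatter with a fitted line).

```python
# Minimal sketch: common chart types for three of the analyses above.
# All data shown are made up for illustration.
import matplotlib.pyplot as plt
import numpy as np

fig, (ax1, ax2, ax3) = plt.subplots(1, 3, figsize=(12, 3.5))

# Trend analysis -> line chart over terms
ax1.plot(["F22", "S23", "F23", "S24"], [140, 155, 150, 170], marker="o")
ax1.set_title("Trend: participation by term")

# Disaggregation -> grouped bars by student group
x = np.arange(2)
ax2.bar(x - 0.2, [60, 75], width=0.4, label="First-gen")
ax2.bar(x + 0.2, [80, 95], width=0.4, label="Continuing-gen")
ax2.set_xticks(x, ["2023", "2024"])
ax2.set_title("Disaggregation: by group")
ax2.legend()

# Correlational analysis -> scatter plot of two measures
rng = np.random.default_rng(0)
hours = rng.uniform(0, 10, 40)
ax3.scatter(hours, 2.5 + 0.12 * hours + rng.normal(0, 0.3, 40))
ax3.set_title("Correlation: program hours vs. GPA")

plt.tight_layout()
plt.show()
```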
Data Visualization Available tools
Microsoft Excel
Qualtrics
PowerBI / Tableau
Word cloud generators (e.g., MonkeyLearn WordCloud Generator)
CONTACT ME
Dr. Kim Nelson Pryor
Director for Student Affairs Assessment & Analytics
sa-assess@smu.edu | (214) 768-6306 | PAB 317
Or request an assessment consultation