Community College of Beaver County Institutional Effectiveness Handbook
2014-2015
Table of Contents

1. Institutional Effectiveness
   What is Institutional Effectiveness (IE)?
   Institutional Effectiveness and Assessment
   Institutional Effectiveness, Assessment, and Resource Allocation
   Institutional Effectiveness and CCBC’s Vision, Mission, Values, and Goals
   Institutional Effectiveness and Accreditation
   Institutional Effectiveness at CCBC
   FIGURE 1: Institutional Effectiveness Cycle and Process Narrative
   Creating a Culture of Institutional Effectiveness
   Review and Revision of the Institutional Effectiveness Handbook
   Support for Institutional Effectiveness
   Resources

2. Strategic Planning
   Strategic Planning vs. Institutional Effectiveness
   Strategic Planning at CCBC
   FIGURE 2: Strategic Planning Cycle and Process Narrative
   FIGURE 3: 2014-2019 Key Performance Indicators
   Key Performance Indicators (KPI)
   Resources

3. Student Learning Outcomes Assessment
   What are Student Learning Outcomes (SLOs)?
   Student Learning Outcomes Assessment at CCBC
   The Role of Faculty in the Assessment of SLOs
   FIGURE 4: SLO Assessment Cycle and Process Narrative
   More Information on Collecting and Aggregating SLO Data
   Resources

4. General Education Outcomes Assessment
   What is General Education?
   General Education at CCBC
   Assessing General Education
   FIGURE 5: General Education Outcomes and Mastery Matrices
   FIGURE 6: General Education Assessment Cycle and Process Narrative
   Resources

5. College Prep Outcomes Assessment
   What is College Prep?
   The College Prep Program at CCBC
   Assessing College Prep
   Resources

6. Service Department Outcomes Assessment
   What are Service Departments?
   FIGURE 7: Service Department Outcomes Assessment
   Assessing Service Departments at CCBC
   FIGURE 8: Service Department Outcomes Assessment Cycle and Process Narrative
   Resources

7. List of Figures

8. Glossary of Assessment Terms

9. Appendices
   A: Institutional Effectiveness Council Charter
   B: Faculty Fellowship Description/Application
   C: Institutional Effectiveness Council Application Form
   D: SLO/SDO Guidelines and Rubrics for Creating Assessment Plans
   E: Institutional Effectiveness Council Forms for SLO/SDO Approval
   F: SLO Data Collection Forms
   G: TracDat Data Entry Screenshots
   H: Example TracDat 4-Column Report
   I: SLO/SDO Checklists for Data Discussions
   J: General Education Student Learning Outcomes: Program and Assessment Overview
   K: Using the Blackboard Outcomes Assessment Site
   L: SLO/SDO Guidelines for Reporting Results, Actions, and Follow-Up
   M: 5-Year Key Performance Indicators and KPI Report Template
   N: Initiatives and Events Assessment Template

10. Works Cited
COMMUNITY COLLEGE OF BEAVER COUNTY
Institutional Effectiveness
Institutional Effectiveness

What is Institutional Effectiveness (IE)?
Institutional effectiveness (IE) is the measure of how well an institution is achieving its overall goals. It answers the question “how well are we collectively doing what we say we are doing?” (Characteristics 54). At CCBC, institutional effectiveness ensures the continuous improvement of academic and non-academic units through outcomes assessment.
Institutional Effectiveness and Assessment
To determine an institution’s level of effectiveness, assessment is necessary. Assessment answers the question “Are our efforts bringing forth the desired results?” (Addison 4). A successful assessment process facilitates institutional improvement and increases an institution’s overall effectiveness in both academic and non-academic areas.

Institutional Effectiveness, Assessment, and Resource Allocation
Resource allocation supports “the development and change necessary to improve and to maintain institutional quality” (Characteristics 4), and assessment is the ongoing process that ensures the “effective and efficient use of the institution’s resources” (Characteristics 9). At CCBC, resource allocation is an important component of all planning and assessment cycles: strategic, student learning, and service department.
Institutional Effectiveness and CCBC’s Vision, Mission, Values, and Goals The vision, mission, and values of the college are the foundation for CCBC’s institutional goals, which are the fundamental criteria used to assess the college’s effectiveness.
Our Goal - Student Success: Create a learning community by supporting student success through educational programs provided in diverse and accessible formats.
Our Goal - Community and Economic Development: Partner with businesses, organizations, educational institutions, and governmental agencies to enhance economic opportunities for the region.
Our Goal - Organizational Development: Create a culture that expects openness, collaboration, and mutual respect and embraces innovation and professional development.
Our Goal - Resources: Develop and allocate resources which sustain the institution and encourage its growth and development.
To access CCBC’s complete Vision, Mission, Values, and Goals (VMVG) statement, visit http://www.ccbc.edu/VisionAndMission.
Institutional Effectiveness and Accreditation The ability to demonstrate an institution’s effectiveness is critically important to the Middle States Commission on Higher Education’s (MSCHE) accreditation process. Standards 1-7 of Characteristics of Excellence in Higher Education are framed within the context of institutional effectiveness and Standards 8-14 gauge institutional effectiveness in the academic arena. Standard 7: Institutional Assessment emphasizes MSCHE expectations regarding institutional effectiveness:
“The institution has developed and implemented an assessment process that evaluates its overall effectiveness in achieving its mission and goals and its compliance with accreditation standards.” (Characteristics 25)
Institutional Effectiveness at CCBC The Institutional Effectiveness Process at CCBC is designed to permeate every facet of the college. CCBC’s current Institutional Effectiveness Process includes three key elements: strategic planning, student learning outcomes assessment, and service-department outcomes assessment (see Figure 1).
FIGURE 1: INSTITUTIONAL EFFECTIVENESS CYCLE

College-Level Strategic Planning and Assessment (C-Oval; 5-Year Plan Updated Annually):
C-1: Review and Update Strategic Objectives (President and VPs, May/June)
C-2: Review Goals/Strategic Objectives (Board of Trustees, July)
C-3: Communicate to Programs and Departments; Deliver Plan (July/August)
C-4: Use of Results; Findings, Actions (May/August)

Program and Department Level Planning and Outcome Assessment (P/D-Oval):
P/D-1: Outcomes Assessment Plan (Outcome Leads, August/September)
P/D-2: Implement Plan (Academic/Fiscal Year)
P/D-3: Assess Plan (Outcome Leads, May, August/September)
P/D-4: Use of Results; Findings, Actions (May/August)
INSTITUTIONAL EFFECTIVENESS PROCESS NARRATIVE

IE PROCESS OVALS:
C-Oval (College Oval): The top oval, or C-Oval, represents college-level strategic planning and assessment.
P/D-Oval (Program/Department Oval): The bottom oval, or P/D-Oval, represents program and department level planning and outcomes assessment.
IE PROCESS: C-OVAL:

C-1 (Review and Update Strategic Objectives): The college president will review and update his/her presidential goals following the establishment of the Board of Trustees’ annual priorities, and the college’s senior administrators will review and update their strategic objectives from May through June.

C-2 (Review Goals/Strategic Objectives): The college’s Board of Trustees will review presidential goals and senior administrators’ strategic objectives in July.

C-3 (Communicate to Programs and Departments): In July/August, senior administrators will communicate the Board’s priorities, president’s goals, and their own strategic objectives to service departments and academic divisions. Department and division supervisors will create action plans for their areas.

C-4 and P/D-4 (Use of Results): In May/August-September, college administrators gather and review the results, actions, and needs determined by service department and student learning outcomes assessments, and the cycle recommences (C-1).

IE PROCESS: P/D-OVAL:

P/D-1 (Outcomes Assessment Plan): Program and service department outcome leads develop, revise, and/or review the outcomes and measures associated with their program/s or area/s in August/September.

P/D-2 (Implement Plan): Student learning and service department outcomes assessment plans are implemented accordingly.

P/D-3 (Assess Plan): Outcome leads gather, aggregate, and evaluate assessment plan data. Data should be evaluated by May, with some program outcome assessments extended to August/September.

P/D-4 (Use of Results): In May for service departments, or May, August, or September for some programs, outcome leads discuss the results of student learning/service department outcomes assessments with immediate supervisors and use assessment results to create actions and establish needs for the upcoming academic year. The cycle recommences (P/D-1).
Creating a Culture of Institutional Effectiveness
To ensure assessment at CCBC occurs continually and permeates all activities and areas of the college, initiatives and events at all levels of the institution and for all purposes are to be assessed. Assessment planning should occur in conjunction with event/initiative planning and be established before an event/initiative is rolled out to campus. The Initiatives and Events Assessment Template (see APPENDIX N), which can be accessed from the Outcomes Assessment course on Blackboard, provides a useful template for the assessment of such activities.
Final assessment action plans (outcomes, means of assessment, criteria), results, actions, and follow-up should be entered into TracDat by the employee responsible for the event/initiative. For further guidance regarding the Initiatives and Events Assessment Template or TracDat data entry, contact the Faculty Fellow for Planning, Assessment, and Improvement.

Support for Institutional Effectiveness
To ensure CCBC’s institutional effectiveness process is sustainable, the college has identified and/or created the following groups and positions to facilitate outcomes assessment and overall institutional effectiveness.

Institutional Effectiveness Council
The Institutional Effectiveness Council (IEC), which replaces the college’s previous Planning and Assessment Councils, is responsible for the oversight of CCBC’s planning and assessment activities (see APPENDIX A: Institutional Effectiveness Council Charter).
Executive Director of Institutional Research and Engagement
The Executive Director of Institutional Research and Engagement helps to inform decision making at CCBC through the collection, analysis, and reporting of data.
Data Mining Coordinator
The college’s data mining coordinator is responsible for CCBC’s data warehouse and reporting environments. The coordinator supports institutional effectiveness measures involving key performance indicators (KPI), benchmarking, and mandatory reporting requirements.
Faculty Fellow for Planning, Assessment, and Improvement
The Faculty Fellow program recognizes the importance of faculty involvement in and knowledge of outcomes assessment, planning, and institutional effectiveness. The program creates an opportunity for faculty members, on a rotating two-year basis, to dedicate time to the college’s planning, assessment, and improvement activities. The overall purpose of the Faculty Fellow program is to advance and sustain institutional effectiveness at Community College of Beaver County (see APPENDIX B: Faculty Fellowship Description/Application).
Review and Revision of the Institutional Effectiveness Handbook
To ensure CCBC’s Institutional Effectiveness Handbook remains useful and accurate, it is to be carefully reviewed and updated yearly by the Faculty Fellow for Planning, Assessment, and Improvement, with assistance from the Institutional Effectiveness Council and final approval by the President’s Executive Council.
Resources
For further information regarding institutional effectiveness, please consult the following resources.
1. A Road Map for Improvement of Student Learning and Support Services Through Assessment. James O. Nichols and Karen W. Nichols. CCBC Library Resource Center, Institutional Effectiveness Collection.
2. Assessing Student Learning and Institutional Effectiveness: Understanding Middle States Expectations. Middle States Commission on Higher Education. http://www.msche.org/publications/Assessment_Expectations051222081842.pdf
3. Characteristics of Excellence in Higher Education: Requirements of Affiliation and Standards for Accreditation. Middle States Commission on Higher Education. https://www.msche.org/publications/CHX-2011-WEB.pdf
4. Five Dimensions of Quality: A Common Sense Guide to Accreditation and Accountability. Linda Suskie. CCBC Library Resource Center, Institutional Effectiveness Collection.
5. Internet Resources for Higher Education Outcomes Assessment. Ephraim Schechter, Office of Institutional Research and Planning, NCSU. http://www2.acs.ncsu.edu/UPA/archives/assmt/resource.htm
6. Student Learning Assessment: Options and Resources. Middle States Commission on Higher Education. http://www.msche.org/publications/SLA_Book_0808080728085320.pdf
7. “Suggested Readings on Assessing Institutional Effectiveness.” Linda Suskie, Middle States Commission on Higher Education. http://www.msche.org/publications/Bibliography-Institutional-Effectiveness.pdf
COMMUNITY COLLEGE OF BEAVER COUNTY
Strategic Planning
STRATEGIC PLANNING

Strategic Planning vs. Institutional Effectiveness
Strategic planning focuses on the actions taken to implement the institutional mission and meet institutional goals, while institutional effectiveness focuses on the results of those actions to determine how well the institution’s mission and goals are being fulfilled (Addison 7).

Strategic Planning at CCBC
CCBC’s strategic plan establishes the college’s overall direction and serves as the foundation for annual goal planning at all levels of the institution. The college’s current strategic plan can be accessed from CCBC’s website: http://www.ccbc.edu/strategicplan.

At CCBC, strategic planning functions according to an established cycle that emphasizes a cascading process of goal-setting and embedded assessment to ensure the college is focused on institutional priorities and that performance is measured and evaluated on the basis of concrete outcomes. FIGURE 2 illustrates CCBC’s strategic planning cycle. The narrative following the figure explains the cycle.
FIGURE 2: STRATEGIC PLANNING CYCLE

1. College-Wide Strategic Plan/KPI Developed (BoT, President, IEC)
2. KPI Report Prepared (IR Director, June)
3. KPI Report Reviewed and Budget Linked (BoT, July)
4. Presidential Goals Developed (President, July)
5. Strategic Objectives Developed (Senior Administrators, August)
6. Action Plans Developed (Department/Division Supervisors, August/September)
7. Action Plan Assessments Collected/Annual Goals Report Prepared (IEC, April/May)
STRATEGIC PLANNING PROCESS NARRATIVE

STRATEGIC PLANNING PROCESS BOX 1: College-Wide Strategic Plan/KPI Developed
Development of the 5-Year Strategic Plan: The Board of Trustees (BoT) undertakes a comprehensive review of the college’s current strategic plan every five years to revise and update the college’s Vision, Mission, Values, and Goals as appropriate. In compliance with MSCHE Standard 1: Mission and Goals, this process includes “the involvement of the institution’s community” as well as the broadest segments of its internal and external constituencies. The exact form the 5-year review takes within these parameters is determined by the Institutional Effectiveness Council (IEC).

To prepare for and conduct the 5-year review, the IEC must 1) develop a year-long timeframe (running from January to January) for the review process, 2) determine the process that will be used to gather the information necessary to appropriately revise and update the college’s strategic plan (i.e., compression planning, process management, etc.), and 3) select individuals to facilitate the 5-year review process. Based on the information gathered during the five-year review process, the IEC composes a draft 5-Year Strategic Plan by October of the review year. The draft is then ready for review and approval by the Board of Trustees at their annual January retreat.

Development of College-Wide KPIs: Upon approval of the 5-Year Strategic Plan, the IEC will create a list of Key Performance Indicators (KPI) consistent with the institutional goals as stated in the plan. The president, in consultation with a select group of faculty, staff, and administrators, will review CCBC, statewide, and national trends to establish five-year benchmarks for each KPI. Once these benchmarks have been set by the president, the Board of Trustees will review and approve them.

STRATEGIC PLANNING PROCESS BOX 2: KPI Report Prepared
In June of each year, the Executive Director of Institutional Research and Engagement will prepare a KPI Report by reviewing the college’s data for the previous year and establishing whether each key performance indicator was “achieved.” Past years’ trends for each KPI will also be included if they are available.

STRATEGIC PLANNING PROCESS BOX 3: KPI Report Reviewed and Budget Linked
The KPI Report will be submitted to the Board of Trustees (BoT) for review, discussion, and action in July. Using information from the report, the BoT will create a list of priorities directly aligned with the college’s institutional goals. If resources need to be allocated or reallocated to support and sustain the BoT’s priorities, they are then linked to the college’s budgetary process. Following the Board’s review of the KPI Report, it will be distributed to the campus community and made available as a public document.

STRATEGIC PLANNING PROCESS BOX 4: Presidential Goals Developed
Following the creation of the Board’s priorities, in July the president will create a list of presidential goals that are directly informed by the Board’s priorities. These goals will serve as presidential performance goals.

STRATEGIC PLANNING PROCESS BOX 5: Strategic Objectives Developed
After the president creates his/her goals, s/he will work with senior administrators to establish strategic objectives for their areas. All strategic objectives are to be directly informed by presidential goals. Each senior administrator’s strategic objectives will serve as the administrator’s performance goals.
STRATEGIC PLANNING PROCESS BOX 6: Action Plans Developed
After developing their own strategic objectives, senior administrators will work with appropriate department/division supervisors to establish action items for their areas. All action items are to be directly informed by the senior administrator’s strategic objectives. Action plans will be developed for each action item. All action plans will establish appropriate assessments, identify necessary budgetary needs, and address the timeframe and logistics associated with the action items. Each department/division supervisor’s action items will serve as his/her performance goals.

STRATEGIC PLANNING PROCESS BOX 7: Action Plan Assessments Collected/Annual Goals Report Prepared
In April/May, the IEC will collect action plan assessments and use the data to create an Annual Goals Report to be reviewed and acted on by the college’s administration, and then the cycle recommences.
FIGURE 3: 2014-2019 KEY PERFORMANCE INDICATORS

Goal: Student Success
1. Successfully Complete Developmental Education
2. Successfully Complete Initial Gateway English Course
3. Successfully Complete Courses with a Grade of “C” or Better
4. Fall-to-Spring Retention Rates
5. Fall-to-Fall Retention Rates
6. Attain Credential
7. AAS/AS Graduate Employment and Continuing Education Rate
8. License Examination Pass Rate
9. Academic Challenge
10. Student Effort
11. Career Counseling Services Use
12. Career Goal Clarification
13. Support for Students

Goal: Community and Economic Development
14. Enrollment in Noncredit Workforce Development Courses
15. Contract Training Clients
16. Contract Training Student Headcount
17. Lifelong Learning Course Enrollments
18. College-Sponsored Community Events
19. Community Groups Using College Facilities
20. College Position in Community

Goal: Organizational Development
21. Cultural Awareness
22. Employee Perception of College Commitment to Diversity
23. Employee Job Satisfaction
24. Active and Collaborative Learning
25. Student-Faculty Interaction

Goal: Resources
26. Annual Unduplicated Headcount
27. FTE Enrollment
28. Annual Credit Count
29. Tuition and Fees
30. High School Graduate Enrollment Rate
31. Met Need of Pell Recipients
32. Met Need of Non-Pell Recipients
33. Student Perception of Institutional Financial Support
34. Support for Learners
35. Enrollment per Section
36. Teaching by Full-Time Faculty
37. Expenditures per FTE Student
38. Expenditure on Instruction and Academic Support
39. Employee Turnover Rate
40. College Support for Innovation
Key Performance Indicators (KPI)
CCBC utilizes 40 key performance indicators, or KPI, to assess the college’s overall institutional effectiveness in relation to its institutional goals of Student Success, Organizational Development, Community and Economic Development, and Resources. Both the creation/revision of KPIs every five years and the annual production of KPI reports are part of the college’s strategic planning process, as illustrated in FIGURE 2. CCBC’s current key performance indicators are listed and described in FIGURE 3.
Resources
For further information regarding strategic planning, please consult the following resources.
1. A Road Map for Improvement of Student Learning and Support Services Through Assessment. James O. Nichols and Karen W. Nichols. CCBC Library Resource Center, Institutional Effectiveness Collection.
2. Assessing Student Learning and Institutional Effectiveness: Understanding Middle States Expectations. Middle States Commission on Higher Education. http://www.msche.org/publications/Assessment_Expectations051222081842.pdf
3. Characteristics of Excellence in Higher Education: Requirements of Affiliation and Standards for Accreditation. Middle States Commission on Higher Education. https://www.msche.org/publications/CHX-2011-WEB.pdf
4. Five Dimensions of Quality: A Common Sense Guide to Accreditation and Accountability. Linda Suskie. CCBC Library Resource Center, Institutional Effectiveness Collection.
5. Strategic Planning: A Guide for Higher Education Institutions. Centre for Higher Education Transformation; Fred M. Hayward, Daniel J. Ncayiyana, and Jacqueline E. Johnson. http://chet.org.za/books/strategic-planning
6. Strategic Planning in Higher Education: A Guide for Leaders. Center for Organizational Development and Leadership, Rutgers. http://ddq.nu.edu.sa/files/35.pdf
7. “Suggested Readings on Assessing Institutional Effectiveness.” Linda Suskie, Middle States Commission on Higher Education. http://www.msche.org/publications/Bibliography-Institutional-Effectiveness.pdf
COMMUNITY COLLEGE OF BEAVER COUNTY
Student Learning Outcomes Assessment
STUDENT LEARNING OUTCOMES ASSESSMENT

What are Student Learning Outcomes (SLOs)?
The Middle States Commission on Higher Education (MSCHE) defines student learning outcomes, or SLOs, as “the knowledge, skills, and competencies that students are expected to exhibit upon successful completion of a course, academic program, co-curricular program, general education requirement, or other specific set of experiences” (Characteristics 63).
Student Learning Outcomes Assessment at CCBC
The use of SLO assessment results ensures institutional programs and resources at CCBC are organized and coordinated to achieve institutional and program-level goals, which contributes to the college’s overall level of institutional effectiveness.
The Role of Faculty in the Assessment of SLOs
Faculty play an indispensable role in the assessment of student learning outcomes. Faculty are responsible for the creation of clearly articulated, observable outcomes and resulting action plans as well as the careful collection and aggregation of data associated with those plans. In addition, faculty are responsible for sharing assessment plan results with appropriate stakeholders and ensuring actions informed by assessment results are created and followed. Because of the central role faculty play in SLO assessment, a clear understanding of CCBC’s SLO assessment process is necessary. This handbook, the Institutional Effectiveness Council, and the Faculty Fellow program were designed to help guide faculty through the SLO assessment process and should be consulted whenever clarification is needed regarding student learning outcomes assessment. FIGURE 4 illustrates CCBC’s student learning outcomes assessment cycle. The narrative following the figure explains the cycle.
FIGURE 4: SLO ASSESSMENT CYCLE

KEY: A = Activity; D = Document; R = Responsible Party; T = Timeline; IEC = Institutional Effectiveness Council; SLO = Student Learning Outcome; OL = Outcome Lead

SLO Assessment Process: To be completed annually for ALL institutional programs, EXCEPT General Education and College Prep.

BOX A
A: SLOs Revised/Created
D: SLO/SDO Guidelines and Rubrics
R: Outcome Lead (OL)
T: August/September

BOX B
A: SLOs Formally Reviewed
D: IEC Forms
R: Outcome Lead (OL)/IEC
T: August-September

BOX C
A: Data Collected
D: Data Collection Form/Blackboard Collection Site/Other
R: Outcome Lead (OL)
T: November-December (fall)/April-May (spring)

BOX D
A: Data Aggregated
D: TracDat
R: Outcome Lead (OL)
T: May

BOX E
A: Actions Documented
D: TracDat
R: Outcome Lead (OL)
T: April-May/August-September

BOX F
A: Data Discussed/Budget Linked
D: TracDat 4-Column Report/SLO Checklist/Budgetary Forms
R: All Program OLs/Immediate Supervisor
T: April-May/August-September

BOX G
A: Follow-Up Documented
D: TracDat
R: Outcome Lead (OL)
T: Throughout Assessment Cycle
SLO ASSESSMENT PROCESS NARRATIVE

SLO ASSESSMENT PROCESS: BOX A

Activity (SLOs Revised/Created): Lead faculty members for individual program outcomes, or outcome leads (OLs), will receive an email reminding them to review, revise, and/or create new outcomes for the upcoming academic year as appropriate. NOTE: SLOs should only be modified at the five-year program review point unless substantial change has occurred in the program area necessitating the addition and/or revision of SLOs before the five-year program review.

Document (SLO/SDO Guidelines and Rubrics): Student learning outcomes created by outcome leads (OLs) should adhere to the guidelines for successful SLOs established in the SLO/SDO Guidelines and Rubrics document (see APPENDIX D), which is available from the Outcomes Assessment course on Blackboard.

Responsible Party (Outcome Lead): Outcome leads, or OLs, are faculty members responsible for reviewing/revising/creating specific program outcomes within programs. Generally, CCBC faculty are not assigned to assess entire programs. Rather, faculty or OLs are assigned to certain outcomes within a program or programs.

Timeline (August/September): SLOs will be reviewed, revised, and/or created no later than September to ensure appropriate outcomes are in place for the academic year.

SLO ASSESSMENT PROCESS: BOX B

Activity (SLOs Formally Reviewed): If an OL decides to revise and/or create new outcomes for the upcoming academic year, s/he must submit revised and/or new outcomes for formal review and approval to the Institutional Effectiveness Council (IEC).

Document (IEC Forms): To submit outcomes for formal IEC review, the OL must complete the appropriate IEC forms, which are available from the Outcomes Assessment course on Blackboard (see also APPENDIX E).

Responsible Party (Outcome Lead/IEC): The outcome lead is responsible for submitting appropriate documentation regarding revised and/or newly created student learning outcomes to the IEC.

Timeline (August-September): To be approved and active for the current academic year, revised and/or newly created outcomes must be submitted to the IEC no later than the Council’s last meeting in September.
SLO ASSESSMENT PROCESS: BOX C

Activity (Data Collected): The OL should collect assessment information from appropriate courses according to his/her established assessment plan.

Document (Data Collection Form/Blackboard Collection Site/Other): The OL should document all data collected in an electronic format so that it can later be uploaded to TracDat. The college-created data collection form may be used if an OL so chooses, or the OL may request a Blackboard collection site be created and added to the Outcomes Assessment Site through Blackboard. Survey requests should be directed to the Faculty Fellow for Planning, Assessment, and Improvement. However, any method of collection is acceptable as long as it is accurate, thorough, and electronic. The Data Collection Form can be found in the Outcomes Assessment course on Blackboard (see also APPENDIX F). The form also contains a memo to alert other instructors, if need be, about the data the OL will be collecting.

Responsible Party (Outcome Lead): The outcome lead is responsible for collecting and documenting data appropriate to the assessment of his/her assigned student learning outcomes according to his/her established assessment plan. NOTE: Please review “More Information on Collecting and Aggregating SLO Data,” which follows below, for more information regarding how faculty may choose to collect assessment data.

Timeline (Nov-Dec (fall)/April-May (spring)): Assessment data should be collected during both the fall and spring semesters. During the fall semester, OLs should collect data no later than November/December, and during the spring semester data should be collected by OLs no later than April/May.
SLO ASSESSMENT PROCESS: BOX D Activity (Data Aggregated): After collecting and documenting appropriate assessment data using the data collection form or another electronic means of collection, OLs should aggregate the data and enter it into TracDat. NOTE: While data collection occurs during both the fall and spring semesters, aggregation of data and TracDat data entry take place during the spring semester only and account for data from the entire academic year. Document (TracDat): During this phase of the assessment cycle, an email reminder will be sent to all OLs alerting them that assessment data should be entered into TracDat for each outcome they are responsible for assessing. NOTE: See APPENDIX G or Blackboard Outcomes Assessment Site for a series of screenshots detailing how to enter assessment data into TracDat. Responsible Party (Outcome Lead): The outcome lead is responsible for aggregating and entering appropriate assessment data into TracDat during the spring semester each academic year. Timeline (May): All assessment data should be aggregated and entered into TracDat by the end of May. It is recommended assessments be aggregated and entered into TracDat as early as possible to allow time for discussion of the data with immediate supervisors before the end of the semester.
SLO ASSESSMENT PROCESS: BOX E Activity (Actions Documented): Each outcome lead should add actions and/or update previous actions in TracDat based on current SLO data. APPENDIX L: SLO/SDO Guidelines for Reporting Results, Actions, and Follow-Up provides more information regarding how to compose successful action statements. Document (TracDat): Each outcome lead (OL) should access TracDat to add actions and/or update the status of previous actions based on current SLO data and data discussions (see APPENDIX G: TracDat Data Entry Screenshots). Responsible Party (Outcome Lead): OLs are responsible for adding new actions. Timeline (April-May/August-September): Actions should be added to TracDat in April/May or August/September.
SLO ASSESSMENT PROCESS: BOX F
Activity (Data Discussed/Budget Linked): Following the collection and aggregation of assessment data, immediate supervisors should schedule data discussions with all outcome leads for each program in their area. During these discussions, established actions will be discussed and resource needs will be addressed.
Document (TracDat 4-Column Report/SLO Checklist/Budgetary Forms): Following the completion of data entry through TracDat, each program OL should create a TracDat 4-column report (see APPENDIX H). APPENDIX G explains how to create a 4-column report. A copy of the report should be provided to the immediate supervisor and used to discuss the program with the supervisor and other program OLs. In addition to the TracDat 4-column report, the SLO Checklist (see APPENDIX I), which can be accessed from the Outcomes Assessment course on Blackboard, should be discussed and completed by OLs and the immediate supervisor during their data discussion. If necessary, the supervisor should also complete appropriate budgetary forms (New Initiative form, for example) to address specific program needs.
Responsible Party (All Program Outcome Leads/Immediate Supervisor): Each program OL is responsible for providing a TracDat 4-column report to his/her immediate supervisor and completing, in conjunction with the supervisor and other program OLs, the SLO Checklist. OLs should retain copies of the completed SLO Checklist and TracDat 4-column reports for all outcomes they are responsible for assessing. These documents should be referenced as follow-up is added to TracDat throughout the assessment cycle. The immediate supervisor is responsible for scheduling data discussions with all program OLs, retaining copies of SLO Checklists and TracDat 4-column reports for each CCBC program under his/her direction, and forwarding completed SLO Checklists to the IEC. The supervisor is also responsible for moving forward any budgetary needs deemed necessary following data discussions.
Timeline (April-May/August-September): Ideally, data discussions should take place between April and May; however, in certain cases this may be impossible because of the timing of assessment measures (final paper, final exam, etc.). In such cases, program OLs should wait to meet as a group with the immediate supervisor until the beginning of the fall semester (August/September).
SLO ASSESSMENT PROCESS: BOX G Activity (Follow-Up Documented): This step of the SLO assessment process is necessary and required, but occurs periodically throughout the assessment process. At a minimum, follow-up should be entered into TracDat at the mid-year and end-of-year points. Document (TracDat): At mid- and end-of-year points, the OL should access TracDat to document follow-up regarding established actions (see APPENDIX G: TracDat Data Entry Screenshots and APPENDIX L: SLO/SDO Guidelines for Reporting Results, Actions, and Follow-Up). Responsible Party (Outcome Lead): The OL is responsible for entering follow-up in relation to established actions. Timeline (Mid-Year/End-of-Year): Follow-up information is a required part of the SLO assessment process and should be documented as appropriate or, at minimum, at the mid-year (December) and end-of-year (May) points.
More Information on Collecting and Aggregating SLO Data As outcome leads, faculty are responsible for collecting and aggregating the data needed to accurately assess the student learning outcomes they have established according to the assessment plans they have likewise developed. Depending upon the student learning outcomes and assessment plan a faculty member or outcome lead (OL) has developed, it may be necessary to collect assessment data from other faculty members, both full- and part-time. If that is the case, faculty members may pursue one of two options: 1) identify and contact appropriate faculty for assessment information on their own or 2) request a Blackboard collection site for the automated and anonymous collection and aggregation of appropriate assessment data. Requests for a Blackboard collection site should be submitted to the Faculty Fellow for Planning, Assessment, and Improvement.
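Aggregation in this context simply means rolling section-level results up to the program level before entry into TracDat. As a purely illustrative sketch (not part of CCBC's official process; the section counts and rating labels below are hypothetical examples patterned on the handbook's mastery matrices), an OL combining ratings from several course sections into program-level percentages might compute:

```python
from collections import Counter

# Hypothetical mastery-matrix ratings reported by three course sections.
# Labels follow the handbook's matrices: Mastery, Progressing, Low/No Mastery.
sections = [
    {"Mastery": 12, "Progressing": 7, "Low/No Mastery": 3},
    {"Mastery": 9,  "Progressing": 10, "Low/No Mastery": 5},
    {"Mastery": 15, "Progressing": 4,  "Low/No Mastery": 2},
]

# Combine per-section counts into one program-level total.
totals = Counter()
for section in sections:
    totals.update(section)

# Express each rating category as a percentage of all assessed students.
n = sum(totals.values())
percentages = {cat: round(100 * count / n, 1) for cat, count in totals.items()}
print(percentages)
# prints {'Mastery': 53.7, 'Progressing': 31.3, 'Low/No Mastery': 14.9}
```

The same roll-up can of course be done by hand or in a spreadsheet; the point is only that each outcome's aggregated figure should reflect every section assessed during the academic year, not a single section.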
Resources For more information regarding student learning outcomes assessment, please consult the following resources.
1. A Road Map for Improvement of Student Learning and Support Services Through Assessment. James O. Nichols and Karen W. Nichols. CCBC Library Resource Center, Institutional Effectiveness Collection.
2. Assessing Student Learning and Institutional Effectiveness: Understanding Middle States Expectations. Middle States Commission on Higher Education. http://www.msche.org/publications/Assessment_Expectations051222081842.pdf
3. Characteristics of Excellence in Higher Education: Requirements of Affiliation and Standards for Accreditation. Middle States Commission on Higher Education. https://www.msche.org/publications/CHX-2011-WEB.pdf
4. "Examples of Evidence of Student Learning." Assessing Student Learning: A Commonsense Guide, Linda Suskie. http://www.msche.org/publications/examples-of-evidence-of-student-learning.pdf
5. Five Dimensions of Quality: A Common Sense Guide to Accreditation and Accountability. Linda Suskie. CCBC Library Resource Center, Institutional Effectiveness Collection.
6. National Institute for Learning Outcomes Assessment. http://www.learningoutcomeassessment.org/publications.html
7. "Regional Accreditation and Student Learning: Principles for Good Practice." Council of Regional Accrediting Commissions. http://www.msche.org/publications/Regnlsl050208135331.pdf
8. "Seminal Readings on Assessing Student Learning." Middle States Commission on Higher Education. http://www.msche.org/publications/Bibliography---seminal.pdf
9. Student Learning Assessment: Options and Resources. Middle States Commission on Higher Education. http://www.msche.org/publications/SLA_Book_0808080728085320.pdf
COMMUNITY COLLEGE OF BEAVER COUNTY
general education learning outcomes assessment
GENERAL EDUCATION LEARNING OUTCOMES ASSESSMENT What is General Education? General education is an important component of undergraduate study that contributes to students’ “essential knowledge, cognitive abilities, understanding of values and ethics…and draws students into new areas of intellectual experience, expanding cultural and global awareness and sensitivity, and preparing them to make enlightened judgments outside as well as within their academic specialty” (Characteristics of Excellence 47).
General Education at CCBC Community College of Beaver County has identified five general education competencies: Communication Proficiency, Information Literacy, Scientific and Quantitative Reasoning, Cultural Literacy, and Technology Literacy. The college's general education requirements are publicly defined in the academic catalog: http://www.ccbc.edu/CourseSchedulesAndSearch.
Assessing General Education CCBC’s general education competencies are assessed at both the gen. ed. and major-specific levels through embedded course assignments referred to as General Education Competency (GEC) assignments, which are established in master syllabi. Mastery matrices attached to GEC assignments are used to directly assess established student learning outcomes associated with each of CCBC’s general education competencies, as illustrated in FIGURE 5: General Education Outcomes and Mastery Matrices. APPENDIX J provides further information about CCBC’s general education program, including how mastery matrices should be used and timelines for collegewide general education assessment. General education outcomes are assessed according to an established cycle as illustrated in FIGURE 6. The narrative following FIGURE 6 explains the General Education assessment cycle.
FIGURE 5 GENERAL EDUCATION OUTCOMES AND MASTERY MATRICES REQUIREMENT 1: COMMUNICATION PROFICIENCY OUTCOME #1 Demonstrate clear and skillful communication methods appropriate to different occasions, audiences, and purposes. MASTERY MATRIX #1 Mastery Student consistently demonstrates clear and skillful communication methods appropriate to occasion, audience, and purpose.
Progressing Student generally demonstrates clear and skillful communication methods appropriate to occasion, audience, and purpose.
Low/No Mastery Student does not consistently or generally demonstrate clear and skillful communication methods appropriate to occasion, audience, and purpose.
REQUIREMENT 2: INFORMATION LITERACY OUTCOME #2
Access, evaluate, and appropriately utilize information from credible sources.
MASTERY MATRIX #2 Mastery Student consistently accesses, evaluates, and appropriately utilizes information from credible sources.
Progressing Student generally accesses, evaluates, and appropriately utilizes information from credible sources.
Low/No Mastery Student does not access, evaluate, and appropriately utilize information from credible sources.
REQUIREMENT 3: SCIENTIFIC AND QUANTITATIVE REASONING OUTCOME #3 Select and apply appropriate problem-solving techniques to reach a conclusion (hypothesis, decision, interpretation, etc.). MASTERY MATRIX #3 Mastery Student consistently selects and applies appropriate problem-solving techniques to reach a conclusion.
Progressing Student generally selects and applies appropriate problem-solving techniques to reach a conclusion.
Low/No Mastery Student does not consistently or generally select and apply appropriate problem-solving techniques to reach a conclusion.
REQUIREMENT 4: CULTURAL LITERACY OUTCOME #4 Demonstrate an understanding and appreciation of the broad diversity of the human experience. MASTERY MATRIX #4 Mastery Student consistently demonstrates an understanding and appreciation of the broad diversity of the human experience.
Progressing Student generally demonstrates an understanding and appreciation of the broad diversity of the human experience.
Low/No Mastery Student does not consistently or generally demonstrate an understanding and appreciation of the broad diversity of the human experience.
REQUIREMENT 5: TECHNOLOGY LITERACY OUTCOME #5 Utilize appropriate technology to access, build, and share knowledge in an effective manner. MASTERY MATRIX #5 Mastery Student consistently utilizes appropriate technology to access, build, and share knowledge in an effective manner.
Progressing Student generally utilizes appropriate technology to access, build, and share knowledge in an effective manner.
Low/No Mastery Student does not consistently or generally utilize appropriate technology to access, build, and share knowledge in an effective manner.
FIGURE 6
GENERAL EDUCATION ASSESSMENT CYCLE AND PROCESS NARRATIVE
COMMUNITY COLLEGE OF BEAVER COUNTY
General Education Assessment Process: To be completed annually for the General Education Program.

KEY: A = Activity; D = Document; R = Responsible Party; T = Timeline; IEC = Institutional Effectiveness Council; SLO = Student Learning Outcome; OL = Outcome Lead

BOX A
A: SLOs Revised/Created
D: SLO/SDO Guidelines and Rubrics
R: Faculty Fellow
T: August/September

BOX B
A: SLOs Formally Reviewed
D: IEC Forms
R: Faculty Fellow/IEC
T: August-September

BOX C
A: SLOs Communicated/Implemented
D: Data Collection Memo
R: Faculty Fellow
T: October (fall)/February (spring)

BOX D
A: Data Collected
D: Data Collection Form/Blackboard Collection Site/Other
R: Faculty Fellow/Teaching Faculty
T: November-December (fall)/April-May (spring)

BOX E
A: Data Aggregated
D: TracDat
R: Faculty Fellow
T: April-May

BOX F
A: Data Discussed/Budget Linked
D: TracDat 4-Column Report/SLO Checklist/Budget Forms
R: Faculty Fellow/Immediate Supervisor
T: April-May/August-September

BOX G
A: Actions Documented
D: TracDat
R: Faculty Fellow
T: April-May/August-September

BOX H
A: Follow-Up Documented
D: TracDat
R: Faculty Fellow
T: Mid-Year/End-of-Year
GENERAL EDUCATION OUTCOMES ASSESSMENT PROCESS NARRATIVE

GEN. ED. ASSESSMENT PROCESS: BOX A
Activity (SLOs Revised/Created): Student learning outcomes for CCBC's General Education Program will be reviewed/revised/created as appropriate each academic year. NOTE: Outcomes should only be revised at the five-year program review point, unless significant change in the program area necessitates an immediate revision.
Document (SLO/SDO Guidelines and Rubrics): Student learning outcomes for the General Education Program should adhere to the guidelines for successful SLOs as established in the SLO/SDO Guidelines and Rubrics documents, which are available from the Outcomes Assessment course on Blackboard (see also APPENDIX D).
Responsible Party (Faculty Fellow): The Faculty Fellow for Planning, Assessment, and Improvement serves as the Outcome Lead (OL) for the General Education Program.
Timeline (August/September): SLOs will be reviewed, revised, and/or created no later than September to ensure appropriate outcomes are in place for the upcoming academic year.

GEN. ED. ASSESSMENT PROCESS: BOX B
Activity (SLOs Formally Reviewed): If the faculty fellow decides to revise and/or create new outcomes for the upcoming academic year, the outcomes must be formally reviewed and approved by the Institutional Effectiveness Council (IEC).
Document (IEC Forms): To submit outcomes for formal review to the IEC, the faculty fellow must complete the appropriate IEC forms, which are available from the Outcomes Assessment course on Blackboard (see also APPENDIX E).
Responsible Party (Faculty Fellow/IEC): The faculty fellow will be responsible for submitting appropriate documentation regarding revised and/or newly created student learning outcomes to the IEC.
Timeline (August-September): To be approved and active for the current academic year, revised and/or newly created outcomes must be submitted to the IEC no later than the Council's last meeting in September.

GEN. ED. ASSESSMENT PROCESS: BOX C
Activity (SLOs Communicated/Implemented): Following formal approval by the IEC, in cases of revised and/or newly created SLOs, or following review of SLOs with no changes by the faculty fellow, the fellow should formally contact all instructors teaching courses from which assessment data will be collected using the Data Collection Memo, which can be accessed from the Outcomes Assessment course on Blackboard (see also APPENDIX F: SLO Data Collection Forms).
Document (Data Collection Memo): APPENDIX F: SLO Data Collection Forms includes a standardized memo that may be used by the faculty fellow to inform other faculty (both full- and part-time) about the method/s, criteria, and means of collection that will be used to assess the student learning outcomes attached to CCBC's General Education Program. If the data collection memo is not used, the faculty fellow must develop his/her own method of communicating assessment logistics to other faculty members teaching within the General Education Program.
Responsible Party (Faculty Fellow): The Faculty Fellow for Planning, Assessment, and Improvement should prepare and send data collection emails to appropriate full- and part-time faculty members.
Timeline (October (fall)/February (spring)): Since assessment data should be collected during both fall and spring semesters, data collection memos should be sent near the middle of those semesters, or October and February, respectively. NOTE: Data gathered during the academic year is only required to be aggregated and entered into TracDat during the spring semester.
GEN. ED. ASSESSMENT PROCESS: BOX D
Activity (Data Collected): Those faculty, both full- and part-time, teaching within CCBC's General Education Program and previously contacted by the Faculty Fellow should upload appropriate assessment data to the Outcomes Assessment Site on Blackboard.
Document (Data Collection Form/Blackboard Collection Site/Other): All general education data is to be collected through the Outcomes Assessment Site on Blackboard; however, the Faculty Fellow may choose to collect data using alternative means. The college-created data collection form may be used if an OL so chooses. Any method of collection is acceptable as long as it is accurate, thorough, and electronic. The Data Collection Form can be found in the Outcomes Assessment course on Blackboard.
Responsible Party (Faculty Fellow/Teaching Faculty): OLs and the faculty fellow are responsible for ensuring an appropriate data collection method is put into place. All faculty teaching within CCBC's General Education Program will be responsible for uploading appropriate assessment data to the Outcomes Assessment Site on Blackboard or reporting data as requested.
Timeline (Nov-Dec (fall)/April-May (spring)): Assessment data should be collected during both the fall and spring semesters. During the fall semester, data should be uploaded to the Outcomes Assessment Site from November-December; during the spring semester, data should be uploaded from April-May. Specific assessment data deadlines should be determined by the Faculty Fellow.

GEN. ED. ASSESSMENT PROCESS: BOX E
Activity (Data Aggregated): After receiving assessment data from all course sections via the Outcomes Assessment course on Blackboard, the data should be aggregated and entered into TracDat.
Document (TracDat): During this phase of the assessment cycle, the Faculty Fellow will enter data in TracDat.
Responsible Party (Faculty Fellow): The Faculty Fellow for Planning, Assessment, and Improvement is responsible for aggregating and entering assessment data into TracDat.
Timeline (April-May): All assessment data should be aggregated and entered into TracDat between April and May. It is recommended data be aggregated and entered into TracDat as early as possible to allow ample time for data discussions with the immediate supervisor.

GEN. ED. ASSESSMENT PROCESS: BOX F
Activity (Data Discussed/Budget Linked): The faculty fellow should meet with his/her immediate supervisor to discuss outcomes assessment data and any budgetary needs.
Document (TracDat 4-Column Report/SLO Checklist/Budgetary Forms): Following TracDat data entry, the Faculty Fellow should create a TracDat 4-column report (see APPENDIX H). APPENDIX G explains how to create a 4-column report. A copy of the report should be provided to the immediate supervisor and used to discuss the program with the supervisor. In addition to the TracDat 4-column report, the SLO Checklist (see APPENDIX I), which can be accessed from the Outcomes Assessment course on Blackboard, should be discussed and completed by the Faculty Fellow and the immediate supervisor during their data discussion. If necessary, the supervisor should also complete appropriate budgetary forms (New Initiative form, for example) to address specific program needs.
Responsible Party (Faculty Fellow/ Immediate Supervisor): The Faculty Fellow is responsible for providing a TracDat 4-column report to his/her immediate supervisor and completing, in conjunction with the supervisor, the SLO Checklist. The Faculty Fellow should retain copies of the completed SLO Checklist and TracDat 4-column report for all general education outcomes. These documents should be referenced as follow-up is added to TracDat throughout the assessment cycle. The immediate supervisor is responsible for scheduling a data discussion with the faculty fellow, retaining copies of SLO Checklists and TracDat 4-column reports for each CCBC program under his/her direction, and forwarding completed SLO Checklists to the IEC. The supervisor is also responsible for moving forward any budgetary needs deemed necessary following data discussions. Timeline (April-May/August-September): Ideally, data discussions should take place between April and May; however, in certain cases this may be impossible because of the timing of assessment measures (final paper, final exam, etc.). In such cases, the Faculty Fellow and immediate supervisor should wait to meet until the beginning of the fall semester (August/ September).
GEN. ED. ASSESSMENT PROCESS: BOX H Activity (Follow-Up Documented): This step of the general education assessment process is necessary and required, but occurs periodically throughout the assessment cycle. At a minimum, follow-up should be entered into TracDat at the mid-year and end-of-year points. Document (TracDat): At mid- and end-of-year points, the Faculty Fellow should access TracDat to document follow-up regarding established actions (see APPENDIX G: TracDat Data Entry Screenshots and APPENDIX L: SLO/SDO Guidelines for Reporting Results, Actions, and Follow-Up). Responsible Party (Faculty Fellow): The Faculty Fellow is responsible for entering follow-up in relation to established actions. Timeline (Mid-Year/End-of-Year): Follow-up information is a required part of the assessment process and should be documented as appropriate or, at minimum, at the mid-year (December) and end-of-year (May) points.
Resources For more information regarding the assessment of general education, please consult the following resources:
1. A Road Map for Improvement of Student Learning and Support Services Through Assessment. James O. Nichols and Karen W. Nichols. CCBC Library Resource Center, Institutional Effectiveness Collection.
2. Characteristics of Excellence in Higher Education: Requirements of Affiliation and Standards for Accreditation. Middle States Commission on Higher Education. https://www.msche.org/publications/CHX-2011-WEB.pdf
3. Developing Research and Communication Skills: Guidelines for Information Literacy in the Curriculum. Middle States Commission on Higher Education. https://www.msche.org/publications/Developing-Skills080111151714.pdf
4. "General Education Competency Grid." Middle States Commission on Higher Education. http://www.msche.org/publications_view.
COMMUNITY COLLEGE OF BEAVER COUNTY
COLLEGE PREP LEARNING OUTCOMES ASSESSMENT
What is College Prep? Traditionally called "developmental" or "remedial" courses, CCBC's college prep courses prepare students for college-level study. Not all college prep courses at CCBC are remedial in nature, however; some are instead required to better acclimate first-time students to the post-secondary environment.
The College Prep Program at CCBC Community College of Beaver County identifies four areas of study often deemed necessary to prepare students for college-level work: college success, reading, writing, and math. To better serve students taking college prep courses at CCBC, in 2014 the institution began assessing all college prep areas as one program, much like the assessment of its General Education program.
Assessing College Prep The College Prep program is assessed according to the Student Learning Outcomes Assessment Cycle (see FIGURE 4).
Resources
1. A Road Map for Improvement of Student Learning and Support Services Through Assessment. James O. Nichols and Karen W. Nichols. CCBC Library Resource Center, Institutional Effectiveness Collection.
2. Assessing Developmental Assessment in Community Colleges: A Review of the Literature. Katherine L. Hughes and Judith Scott-Clayton. Community College Research Center, Teachers College: Columbia University. http://www.postsecondaryresearch.org/conference/PDF/NCPR_Panel%202_HughesClaytonPaper.pdf
3. Characteristics of Excellence in Higher Education: Requirements of Affiliation and Standards for Accreditation. Middle States Commission on Higher Education. https://www.msche.org/publications/CHX-2011-WEB.pdf
4. Connecting Curriculum, Assessment, and Treatment in Developmental Education. Community College Virtual Symposium, U.S. Department of Education: Office of Vocational and Adult Education. https://www2.ed.gov/about/offices/list/ovae/pi/cclo/brief-3-reinventing-dev-ed.pdf
5. "Culture of Evidence and Inquiry." Achieving the Dream. http://achievingthedream.org/focusareas/culture-of-evidence-inquiry
6. National Center for Developmental Education. Appalachian State University. http://ncde.appstate.edu/publications/what-works-research-based-best-practices-developmental-education
7. NADE: National Association for Developmental Education. http://www.nade.net/index.html
COMMUNITY COLLEGE OF BEAVER COUNTY
Service Department outcomes assessment
SERVICE DEPARTMENT OUTCOMES ASSESSMENT
What are Service Departments?
CCBC defines service departments as any non-academic area of the college that provides services to the institution's employees and/or students. Currently, CCBC recognizes seventeen such departments.
FIGURE 7
SERVICE DEPARTMENTS
COMMUNITY COLLEGE OF BEAVER COUNTY

Academic Support & Enrollment: Enrollment Management, Counseling, Career Center, Library, Tutorial/Learning Lab & Supportive Services, Student Activities, GED Program
Finance & Operations: Finance, Financial Aid, Physical Plant
Community Relations & Development: Communications, Development
Human Resources: Human Resources
Information Technologies: Information Technologies
Workforce & Continuing Education: Workforce Continuing Education
Assessing Service Departments at CCBC Service departments at CCBC have established service department outcomes (SDOs) that are assessed annually according to an established cycle. Like student learning outcomes (SLOs), SDOs are tracked using employee-created assessment methods and are measured against employee-generated benchmarks. All SDO assessment data is stored in TracDat. The SDO assessment cycle is illustrated in FIGURE 8: SDO Assessment Cycle and explained in the accompanying narrative.
FIGURE 8
SDO ASSESSMENT CYCLE AND PROCESS NARRATIVE
COMMUNITY COLLEGE OF BEAVER COUNTY
SDO assessments must be completed annually for all service departments.

KEY: A = Activity; D = Document; R = Responsible Party; T = Timeline; IEC = Institutional Effectiveness Council; SDO = Service Department Outcome; OL = Outcome Lead

BOX A
A: SDOs Revised/Created
D: SLO/SDO Guidelines and Rubrics
R: Outcome Lead (OL)
T: July/August

BOX B
A: SDOs Formally Reviewed
D: IEC Forms
R: Outcome Lead (OL)/IEC
T: August-September

BOX C
A: SDOs Communicated/Implemented
D: Data Collection Memo
R: Outcome Lead (OL)
T: September-October

BOX D
A: Data Collected
D: Data Collection Form
R: Outcome Lead (OL)
T: March-April

BOX E
A: Data Assessed/Reported
D: TracDat
R: Outcome Lead (OL)
T: April-May

BOX F
A: Actions Documented
D: TracDat
R: Outcome Lead (OL)
T: April-May

BOX G
A: Data Discussed/Budget Linked
D: TracDat 4-Column Report/SDO Checklist/Budgetary Forms
R: Outcome Lead/Immediate Supervisor
T: April-May

BOX H
A: Follow-Up Documented
D: TracDat
R: Outcome Lead
T: Mid-Year/End-of-Year
SDO ASSESSMENT PROCESS NARRATIVE

SDO ASSESSMENT PROCESS: BOX A
Activity (SDOs Revised/Created): The lead staff member for each department outcome, or outcome lead (OL), will receive an email prompting him/her to review, revise, and/or create outcomes for the upcoming academic year as appropriate. NOTE: SDOs should only be modified at the five-year point unless substantial changes have occurred in the area necessitating the addition and/or revision of SDOs before the five-year mark.
Document (SLO/SDO Guidelines and Rubrics): Service department outcomes created and/or revised by OLs should adhere to the guidelines for successful SDOs established in the SLO/SDO Guidelines and Rubrics document, which is available from the Outcomes Assessment course on Blackboard (see also APPENDIX D).
Responsible Party (Outcome Lead): OLs are employees responsible for specific department outcomes.
Timeline (July/August): SDOs should be reviewed, revised, and/or created no later than August to ensure appropriate outcomes are in place for the academic year.

SDO ASSESSMENT PROCESS: BOX B
Activity (SDOs Formally Reviewed): If the lead staff member decides to revise and/or create new outcomes for the upcoming academic year, the outcomes must be formally reviewed and approved by the Institutional Effectiveness Council (IEC).
Document (IEC Forms): To submit outcomes for formal review to the IEC, the lead staff member must complete the appropriate IEC forms, which are available from the Outcomes Assessment course on Blackboard.
Responsible Party (Outcome Lead/IEC): The lead staff member will be responsible for submitting appropriate documentation regarding revised and/or newly created service department outcomes to the IEC.
Timeline (August-September): To be approved and active for the current academic year, revised and/or newly created SDOs must be submitted to the IEC no later than the Council's last meeting in September.

SDO ASSESSMENT PROCESS: BOX C
Activity (SDOs Communicated/Implemented): If necessary, OLs should contact all employees from whom they will need to collect data to appropriately assess established outcomes.
Document (Data Collection Memo): APPENDIX F: SLO Data Collection Forms contains a standardized memo that may be adapted by staff OLs to inform other staff members (both full- and part-time) about the method/s, criteria, and means of collection that will be used to assess established outcomes.
Responsible Party (Outcome Lead): Staff OLs should complete the data collection memo, if necessary, and send it to appropriate full- and part-time staff members.
Timeline (September-October): Staff OLs should contact other staff members as necessary no later than October.

SDO ASSESSMENT PROCESS: BOX D
Activity (Data Collected): Staff OLs should collect appropriate assessment data.
Document (Data Collection Form): The data collection form (see APPENDIX F), or other electronic, staff-created form, should be used to capture assessment data.
Responsible Party (Outcome Lead): The OL should ensure appropriate data is collected.
Timeline (March-April): All assessment data should be collected no later than April.
SDO ASSESSMENT PROCESS: BOX E
Activity (Data Assessed/Reported): After receiving the appropriate assessment data, it should be aggregated and entered into TracDat.
Document (TracDat): During this phase of the assessment cycle, staff OLs will receive an email reminder regarding TracDat data entry.
Responsible Party (Outcome Lead): The OL is responsible for aggregating and entering data into TracDat.
Timeline (April-May): All assessment data should be aggregated and entered into TracDat by the end of May.

SDO ASSESSMENT PROCESS: BOX F
Activity (Actions Documented): Each outcome lead should add actions and/or update previous actions in TracDat based on current SDO data. APPENDIX L: SLO/SDO Guidelines for Reporting Results, Actions, and Follow-Up provides more information regarding how to compose successful action statements.
Document (TracDat): Each OL should access TracDat to add actions and/or update the status of previous actions based on current SDO data and data discussions (see APPENDIX G: TracDat Data Entry Screenshots).
Responsible Party (Outcome Lead): OLs are responsible for adding new actions and/or updating previous actions.
Timeline (April-May): Actions should be added to TracDat in April/May.

SDO ASSESSMENT PROCESS: BOX G
Activity (Data Discussed/Budget Linked): The OLs for each service department should meet as a group with their immediate supervisor to discuss actions and needs associated with each service department, based on SDO assessment results, and to determine and gain approval for new/updated actions and/or budgetary requests.
Document (TracDat 4-Column Report/SDO Checklist/Budgetary Forms): Following the completion of data entry in TracDat, a TracDat 4-column report (see APPENDIX H) should be created and provided to the supervisor by each staff OL. APPENDIX G explains how a 4-column report is created in TracDat. The 4-column report should be used to discuss the program with the supervisor and other staff OLs. In addition to the TracDat report, the SLO/SDO Checklist for Data Discussions (see APPENDIX I), which can be accessed from the Outcomes Assessment course on Blackboard, should be discussed and completed by OLs and the immediate supervisor during their data discussion. If necessary, the supervisor should also complete appropriate budgetary forms (the New Initiative form, for example) to address specific department needs.
Responsible Party (Outcome Lead/Immediate Supervisor): Each program OL is responsible for providing a TracDat 4-column report to his/her immediate supervisor and completing, in conjunction with the supervisor and other program OLs, the Data Discussion Checklist. OLs should retain copies of the completed SDO Checklist and TracDat 4-column reports for all outcomes for reference and to ensure a more streamlined and effective SDO assessment process as follow-up is added to TracDat throughout the assessment cycle. The immediate supervisor is responsible for scheduling data discussions with all department OLs, retaining copies of SDO Checklists and TracDat 4-column reports for each CCBC department under his/her direction, and forwarding completed SDO Checklists to the IEC. The supervisor is also responsible for moving forward any budgetary needs deemed necessary following data discussions.
Timeline (April-May): Data discussions should take place between April and May.

SDO ASSESSMENT PROCESS: BOX H
Activity (Follow-Up Documented): This step of the SDO assessment process is necessary and required, but occurs periodically throughout the assessment cycle. At a minimum, follow-up should be entered into TracDat at the mid-year and end-of-year points.
Document (TracDat): At the mid- and end-of-year points, OLs should access TracDat to document follow-up regarding established actions (see APPENDIX G: TracDat Data Entry Screenshots and APPENDIX L: SLO/SDO Guidelines for Reporting Results, Actions, and Follow-Up).
Responsible Party (Outcome Lead): The OL is responsible for entering follow-up in relation to established actions.
Timeline (Mid-Year/End-of-Year): Follow-up information is a required part of the assessment process and should be documented as appropriate or, at minimum, at the mid-year (December) and end-of-year (May) points.
Resources
For further information regarding the assessment of service departments, please consult the following resources:
1. A Road Map for Improvement of Student Learning and Support Services Through Assessment. James O. Nichols and Karen W. Nichols. CCBC Library Resource Center, Institutional Effectiveness Collection.
2. Administrative Assessment: A Guide to Planning, Implementing, and Reporting Division-Wide Evaluation and Assessment. Office of Institutional Effectiveness, Marymount University. http://www.marymount.edu/Media/Website%20Resources/documents/offices/ie/AdminAssessHandbook.pdf
3. "Assessment and Evaluation." NASPA: Student Affairs Administrators in Higher Education. https://www.naspa.org/focus-areas/assessment-and-evaluation
4. CAS Professional Standards for Higher Education. Council for the Advancement of Standards in Higher Education. http://standards.cas.edu/getpdf.cfm?PDF=E868395C-F784-2293-129ED7842334B22A
5. Characteristics of Excellence in Higher Education: Requirements of Affiliation and Standards for Accreditation. Middle States Commission on Higher Education. https://www.msche.org/publications/CHX-2011-WEB.pdf
6. Five Dimensions of Quality: A Common Sense Guide to Accreditation and Accountability. Linda Suskie. CCBC Library Resource Center, Institutional Effectiveness Collection.
7. Internet Resources for Higher Education Outcomes Assessment. Office of Institutional Research and Planning, North Carolina State University. http://www2.acs.ncsu.edu/UPA/archives/assmt/resource.htm#hbooks
LIST OF FIGURES
FIGURE 1: Institutional Effectiveness Cycle and Process Narrative
FIGURE 2: Strategic Planning Cycle and Process Narrative
FIGURE 3: 2014-2019 Key Performance Indicators
FIGURE 4: SLO Assessment Cycle and Process Narrative
FIGURE 5: General Education Outcomes and Mastery Matrices
FIGURE 6: General Education Assessment Cycle and Process Narrative
FIGURE 7: Service Departments
FIGURE 8: Service Department Outcomes Assessment Cycle and Process Narrative
GLOSSARY OF ASSESSMENT TERMS
a
ACTION ITEMS
Strategic actions developed by department/division supervisors in concert with senior administrators that are informed by the senior administrator's strategic objectives. Action items serve as each department/division supervisor's performance goals.

ACTION PLAN
a) The plan created by all outcome leads (faculty and staff) that establishes outcomes, assessment methods, and benchmarks for specific programs/departments. b) The plans created by department/division supervisors in concert with senior administrators that establish the logistics associated with each supervisor's action items.
ANNUAL GOALS REPORT A document prepared by the Institutional Effectiveness Council (IEC) that reports and analyzes the results of department/division supervisors’ action plans (see also ACTION PLAN, definition b).
ASSESSMENT The continuous and cyclical process through which ongoing institutional improvement—at all levels—is achieved.
ASSESSMENT METHOD The measure attached to a stated student learning or service department outcome that indicates how an outcome will be assessed (e.g., exams, essays, licensure exams, certifications, gap analyses, turn-around times). See also DIRECT and INDIRECT MEASURES.
b
BOARD PRIORITIES
The set of goals CCBC's Board of Trustees establishes each year in response to the college's KPI and Annual Goals Report.

BUDGET FORMS
Forms used to establish, discuss, and determine budgetary needs and allocation. Budget forms can be obtained from the Office of the Controller.
c
CRITERIA
The assessment methods or measures and benchmark attached to a learning/department outcome.

d

DATA COLLECTION FORM
An electronic form that may be used by outcome leads (OLs) to collect outcomes assessment data (see also DATA COLLECTION MEMO and APPENDIX F: SLO Data Collection Forms).

DATA COLLECTION MEMO
A form letter used by outcome leads (OLs) to communicate program/department outcomes. The data collection memo can be found in the Outcomes Assessment course on Blackboard (see also DATA COLLECTION FORM and APPENDIX F: SLO Data Collection Forms).
DATA DISCUSSIONS Discussions held between outcome leads (OLs) and immediate supervisors following the collection and aggregation of outcome data through TracDat.
DIRECT MEASURE A typically quantitative measure of outcome achievement (e.g., exams, rubrics, audits, financial ratios).
f
FACULTY FELLOW FOR PLANNING, ASSESSMENT, AND IMPROVEMENT
Faculty member appointed by the President to serve a two-year term as lead faculty for planning, assessment, and improvement (see also APPENDIX B: Faculty Fellow Description/Application).

FOLLOW-UP
Information entered into TracDat at the mid- and end-of-year points of the assessment cycle by outcome leads (OLs) regarding established actions.
g
GENERAL EDUCATION The knowledge, skills, and abilities that are of use in all programs and prepare students for immediate academic, personal, and professional endeavors as well as for a life of learning. CCBC recognizes five general education competencies: Communication Proficiency, Information Literacy, Scientific and Quantitative Reasoning, Technology Literacy, and Cultural Literacy.
GENERAL EDUCATION COMPETENCY ASSIGNMENT Also known as GEC assignments (or portfolio assignments), General Education Competency assignments are common assignments established in the master syllabus for all CCBC courses that include a general education competency. GEC assignments are graded using a common rubric and assessed using a mastery matrix or matrices.
GENERAL EDUCATION COMPETENCY ASSIGNMENT RUBRIC
The rubric included in the master syllabus for all courses that meet a general education requirement. The GEC rubric should be used to grade General Education Competency assignments.
GENERAL EDUCATION MASTERY MATRICES
Rubrics used to measure student achievement of the general education outcomes attached to CCBC's five general education requirements: Communication Proficiency, Information Literacy, Scientific and Quantitative Reasoning, Technology Literacy, and Cultural Literacy. General education mastery matrices are included in the master syllabus for all courses that meet a general education requirement. They are not used to grade student assignments, but rather to assess students' level of mastery regarding specific general education requirements and outcomes (see also APPENDIX J: General Education Student Learning Outcomes: Program and Assessment Overview).
GENERAL EDUCATION OUTCOMES The learning outcomes attached to CCBC’s General Education program. There is one learning outcome for each general education requirement: Communication Proficiency, Information Literacy, Scientific and Quantitative Reasoning, Technology Literacy, and Cultural Literacy (see also APPENDIX J: General Education Student Learning Outcomes: Program and Assessment Overview).
GENERAL EDUCATION REQUIREMENTS The five requirements that define CCBC’s General Education program and define the knowledge, skills, and abilities the institution views as vital to all majors and a life of learning. CCBC’s general education requirements are Communication Proficiency, Information Literacy, Scientific and Quantitative Reasoning, Technology Literacy, and Cultural Literacy.
i
IEC
Acronym for the Institutional Effectiveness Council.
IEC FORMS Forms used to submit revised and/or newly created student learning or service department outcomes to the Institutional Effectiveness Council for formal approval (see also APPENDIX E: Institutional Effectiveness Council Forms for SLO/SDO Approval).
INDIRECT MEASURE A typically qualitative measure that should be used to supplement direct measures associated with student learning or service department outcomes. Indirect measures may take such forms as surveys, focus group data, and interviews.
INSTITUTIONAL EFFECTIVENESS The measure of how well an institution is achieving its overall goals. It answers the question “[H]ow well are we collectively doing what we say we are doing?” (Characteristics of Excellence 54). At CCBC, institutional effectiveness ensures the continuous improvement of academic and non-academic units through outcomes assessment.
INSTITUTIONAL EFFECTIVENESS COUNCIL (IEC) The institutional council charged with oversight of the college’s planning, assessment, and improvement efforts (see also APPENDIX A: Institutional Effectiveness Council Charter).
INSTITUTIONAL EFFECTIVENESS PLAN The college’s plan for continuous improvement through planning and outcomes assessment as outlined in the Institutional Effectiveness Handbook.
INSTITUTIONAL GOALS
The overarching goals established in the college’s strategic plan that identify the institution’s overall aim and aspirations.
k
KEY PERFORMANCE INDICATORS (KPI) Established measures designed to assess elements vital to the institution's overall success, such as enrollment, retention, and graduation rates (see also APPENDIX M: 5-Year Key Performance Indicators and KPI Report Template).
KPI
The acronym used to refer to the college's key performance indicators.

o
OBJECTIVE The term typically used to refer to learning goals at the course level that are established in master syllabi. Objectives are course-specific and should not be confused with outcomes, which are program-specific.
OL
The acronym used to refer to outcome leads.
OUTCOME
The term typically used to refer to goals at the program/department level that are established and tracked by faculty and staff and reported in TracDat. Student learning outcomes are program specific and represent broader learning goals than course-level objectives.
OUTCOME LEAD
The faculty or staff member assigned to a student learning or service department outcome. Outcome leads are often referred to as OLs.
p
PRESIDENTIAL GOALS
Goals created by the college president that are informed by the Board's Priorities and serve as the president's performance goals.

r

RUBRIC
A tool used to quantitatively and directly assess student learning. A rubric often takes the form of a matrix, with established criteria, scores, and descriptions in various rows and columns.

s

SERVICE DEPARTMENT OUTCOME
Outcomes developed by non-academic service departments to gauge quality of service.
SDO
The acronym used to refer to service department outcomes.

SLO
The acronym used to refer to student learning outcomes.
SLO/SDO GUIDELINES AND RUBRICS Guidelines used by faculty and staff to revise and/or create outcomes for programs and departments, respectively. The Institutional Effectiveness Council also uses the guidelines to evaluate outcomes submitted by faculty and staff for formal approval.
STRATEGIC OBJECTIVES The objectives developed by senior administrators in concert with the president that are directly informed by presidential goals and serve as each senior administrator’s performance goals.
STRATEGIC PLAN
The overarching, long-term plan that directs goal-setting and resource allocation at all levels of the institution.
STRATEGIC PLANNING
The process utilized to create the institution’s strategic plan every five years as well as the annual processes used to sustain the strategic plan (see also FIGURE 2: Strategic Planning Cycle and Process Narrative).
STUDENT LEARNING OUTCOME (SLO) Program-level outcomes developed by faculty members to gauge student learning.
t
TRACDAT
CCBC’s electronic data management system.
TRACDAT 4-COLUMN REPORT A report generated using TracDat that organizes all current outcomes assessment data in a four-column matrix. The 4-column report is used during data discussions with immediate supervisors.
APPENDICES
APPENDIX A: INSTITUTIONAL EFFECTIVENESS COUNCIL CHARTER
COMMUNITY COLLEGE OF BEAVER COUNTY
Institutional Effectiveness Council Charter

PURPOSE:
The CCBC Institutional Effectiveness Council (IEC) replaces the existing Planning Council and Institutional Assessment Council, both of which are dissolved. The Institutional Effectiveness Council is created through the Office of the President to function as part of the College governance system. It is charged with oversight of the planning and assessment activities of the College.

RESPONSIBILITIES AND FUNCTIONS:
The Institutional Effectiveness Council is designated by the President to have the following responsibilities and functions:
• Integrate the Vision, Mission, Values, and Goals of the College into the organization;
• Ensure that the College's strategic plan maintains its viability and integrity;
• Engage employees at all levels of the college in assessment and planning activities;
• Serve as facilitators for the development of planning and assessment activities;
• Monitor key performance indicators aligned with the Vision, Mission, Values and Goals;
• Provide reports to the Board of Trustees and broader College community;
• Maintain a focus on planning and assessment measures that promote student success.

MEMBERSHIP:
The Institutional Effectiveness Council (IEC) shall be comprised of 18 members chosen by the president through an application process:
• Five (5) full-time faculty: One (1) from each academic division (Aviation, Business & Technologies, Nursing & Allied Health, and Liberal Arts & Sciences), and one (1) additional twelve-month faculty member.
• Two (2) part-time faculty: One (1) from the academic programs and one (1) from continuing education/workforce development.
• Four (4) staff members.
• Four (4) administrators/confidentials (including one (1) vice president and one (1) division director).
In addition, the following positions will have permanent membership on the IEC but may not hold any office:
• AVP for Assessment (or a successor position)
• Director of Institutional Research (or a successor position)
• Data Miner (or a successor position)
OFFICERS:
At the first meeting of each year, the following officers shall be chosen to serve for one (1) year:
• A chair whose responsibilities include the scheduling and convening of meetings, setting agendas, and establishing subcommittees, working groups, etc. as needed.
• A vice chair who will assist the chair and serve in that capacity in the absence of the chair.
• A secretary whose responsibilities include keeping minutes and distributing materials to the membership as requested by the chair.

TERMS:
Terms shall begin the first day of the academic year. All members are eligible to serve one (1) three (3) year term; members may reapply after a lapse of one year. Upon the convening of the first meeting, lots will be drawn to set an equal number of initial one (1), two (2), and three (3) year terms that will then convert to a standard three (3) year term upon expiration.

MEETINGS:
The IEC membership will set a year-long meeting schedule at its first meeting but must meet a minimum of two (2) times each during the fall and spring academic semesters.
In the case of a member’s resignation or removal, a replacement will be appointed by the President to serve the remainder of that term.
REMOVAL: If a member’s employee status changes such that he/she no longer represents his/her current group, that member will cease to be a member of the IEC. Any member who misses two (2) consecutive meetings will be removed from the IEC.
RECORDS: The IEC secretary will archive all Council documents including minutes in Outlook Public Folders (or a successor site).
APPENDIX B: FACULTY FELLOWSHIP DESCRIPTION/APPLICATION
FELLOWSHIP DESCRIPTION Faculty Fellow for Planning, Assessment, and Improvement GENERAL PURPOSE OF FELLOWSHIP The Faculty Fellow for Planning, Assessment, and Improvement will advance and sustain institutional effectiveness at Community College of Beaver County. The Fellow reports directly to the President.
ESSENTIAL DUTIES AND RESPONSIBILITIES
The Faculty Fellow for Planning, Assessment, and Improvement will be expected to:
• Advance a culture of evidence at CCBC through planning, assessment, and improvement;
• Cultivate leadership to create, maintain, and support planning, assessment, and improvement at the program, departmental, and institutional levels;
• Support the planning, collection, dissemination, and use of assessment and assessment results for institutional improvement;
• Serve on the Institutional Effectiveness Council;
• Meet with the President to discuss successes, challenges, and goals;
• Attend workshops and training sessions designed to increase knowledge of institutional effectiveness.

KNOWLEDGE, SKILLS, AND ABILITIES
The Faculty Fellow for Planning, Assessment, and Improvement should possess the following knowledge, skills, and abilities:
• An understanding of CCBC's culture and current planning, assessment, and improvement efforts;
• An ability to effectively work with faculty, staff, and administrators to advance the institution's culture of evidence;
• A knowledge of planning, assessment, and improvement processes;
• An aptitude for the organization, composition, and presentation of information;
• A capacity to learn new skills essential to advancing planning, assessment, and improvement at CCBC;
• A familiarity with accreditation standards and expectations.

APPOINTMENT OF FACULTY FELLOW
The Faculty Fellow for Planning, Assessment, and Improvement is appointed by the college President and serves a two-year term.

TERMS OF SERVICE
The position of Faculty Fellow for Planning, Assessment, and Improvement is included in the CCBC Society of the Faculty bargaining unit. During the academic year, the Faculty Fellow will receive full release from his/her required course load and will also be reimbursed for a double overload. Additionally, the Fellow will be expected to hold the equivalent of 12 summer hours per week, which will be spread over 2-5 business days. A faculty member holding this position will be entitled to all holidays observed by faculty during the time of his or her service. At least 50% of required summer hours must be held on campus. The Fellow will be reimbursed for summer hours according to current overload rates equivalent to 12 credits. Any activities, procedures, and recommendations generated by this position shall be consistent with the CCBC-SOF Collective Bargaining Agreement and will not supersede or replace the procedures and language stated in that Agreement. No faculty member shall be obligated to serve in this position.
Faculty Fellow for Planning, Assessment, and Improvement Application Form
NAME:_________________________________________________________________ TITLE:_________________________________________________________________ TENURED:
YES_____ NO_____
SUPERVISOR’S SIGNATURE:____________________________________________________________ By signing this document, the immediate supervisor acknowledges the applicant is approved to be released from his/her contracted duties for the term of appointment.
APPLICATION DIRECTIONS: PART I: Compose a letter of intent that addresses the following three areas: • REASON FOR APPLYING: Please describe why you are interested in becoming CCBC’s Faculty Fellow for Planning, Assessment, and Improvement. • PREVIOUS EXPERIENCE: Please describe prior work and/or educational experiences that have prepared you for appointment to the Fellowship program. • CONTRIBUTION TO THE COLLEGE: Please describe how you believe you will contribute to the ongoing work of the college in advancing a culture of planning, assessment, and overall improvement. PART II: Submit current curriculum vitae. PART III: Submit two letters of recommendation from the following individuals: • Immediate supervisor • Faculty colleague APPLICATION SUBMISSION: The completed application form and all other required materials should be submitted to the Office of the President by Friday, November 15, 2014 for appointment consideration. The Faculty Fellow for Planning, Assessment, and Improvement will begin this assignment in spring 2015.
Thank you for your interest in advancing CCBC's commitment to planning, assessment, and overall institutional improvement!
APPENDIX C: INSTITUTIONAL EFFECTIVENESS COUNCIL APPLICATION FORM
Institutional Effectiveness Council Application Form NAME:_________________________________________________________________
TITLE:_________________________________________________________________
CHECK ONE:
□ FULL-TIME FACULTY
□ ADJUNCT FACULTY
□ STAFF
□ ADMINISTRATOR/CONFIDENTIAL
REASON FOR APPLYING: Please describe your reason for applying to the Institutional Effectiveness Council in 2-4 sentences.
PREVIOUS EXPERIENCE: Please describe any previous work or educational experience regarding institutional effectiveness, strategic planning, and/or assessment in 2-4 sentences.
CONTRIBUTION TO THE COUNCIL: Please describe how you believe you will contribute to the work of the Council in 2-4 sentences.
□ I have read and understand the IEC charter and if selected as a member will uphold the responsibilities and functions of the Council as well as my personal responsibilities as a Council member.
SIGNATURE:_____________________________________________
DATE:___________________________________________________
APPENDIX D: SLO/SDO GUIDELINES AND RUBRICS FOR CREATING ASSESSMENT PLANS
SLO/SDO GUIDELINES AND RUBRICS
For Creating/Revising Student Learning and Service Department Outcomes Assessment Plans

PART ONE: CREATE OUTCOME
Characteristics of Successful Outcomes
1. Observable: Can you identify evidence that the outcome was or was not achieved?
2. Measurable: Can you measure the degree to which the outcome was met or not met?
3. Aligned: Is the outcome aligned with other institutional goal areas such as general education, the College's strategic plan, and/or the vision, mission, values, and goals of the College?
4. Number: Do 3-5 outcomes clearly identify the most important foci of the program or department to be assessed?
Anatomy of Successful Outcomes
1. W/WB/WBAT Phrase: Outcomes generally begin with a "will" (W), "will be" (WB), or "will be able to" (WBAT) phrase. For instance, outcomes created for academic programs may begin with a phrase like "Students will be able to," and outcomes for service departments may begin with a phrase like "Help desk requests will be…".
2. Measurable, Action Verb: Choose an action verb that results in an overt behavior that can be measured. Using Bloom's Taxonomy of Learning helps ensure you create outcomes that range from lower to higher order skills.
BLOOM'S LEVELS AND ACTION VERBS
1. Knowledge: arrange, define, describe, duplicate, label, list, memorize, name, order, recognize, reproduce
2. Comprehension: classify, describe, discuss, explain, express, identify, indicate, locate, recognize, report, restate, review, select, translate
3. Application: apply, choose, demonstrate, dramatize, employ, illustrate, interpret, operate, practice, schedule, sketch, solve, use
4. Analysis: analyze, appraise, calculate, categorize, compare, contrast, criticize, differentiate, discriminate, distinguish, examine, experiment, question, test
5. Synthesis: arrange, assemble, collect, compose, construct, create, design, develop, formulate, manage, organize, plan, prepare, propose
6. Evaluation: appraise, argue, assess, attach, choose, compare, defend, estimate, judge, predict, rate, score, select, support, value, evaluate
3. Stated Outcome: Limit the stated outcome to one area of content. Avoid "ands" and "ors." Stated outcomes for student learning should reflect knowledge or skills gained by the student over the scope of the entire academic program, not just a specific course. Outcomes for service departments should reflect overarching goals aimed at overall area improvement.
Examples of Successful Student Learning and Service Department Outcomes
Student Learning Outcome Examples:
• Students in the Social Work program will be able to identify the basic psychological concepts, theories, and developments related to human behavior.
• Students in the Criminal Justice program will be able to employ effective written and verbal communications skills required for police operations, investigations, and public safety.
• Students in the Early Childhood Education program will be able to create curriculum using a variety of instructional strategies which provide for meaningful and challenging learning experiences and alternative models and methodologies.

Service Department Outcome Examples:
• Financial aid personnel will monitor the 3-year aggregate student loan default rate for CCBC students.
• Financial aid personnel will improve financial aid notification time.
• Help desk requests will be processed more quickly.
• Enrollment Management staff will increase the number of admissions interviews for incoming freshmen.
PART TWO: IDENTIFY ASSESSMENT METHODS
Establish Measures

1. Direct Measures: At least one direct measure must be identified per outcome. Two or more is considered best practice.
Description of Direct Measures: Tangible, observable, direct examination of student learning or gauge of the effectiveness of a service; typically quantitative.

Examples of Direct Measures:
Student Learning: Exams (standardized, certification, licensure, pre-/post-), rubrics, field experience supervisor/instructor ratings*, portfolios*, presentations*, projects*, performances*, research papers*, analyses of discussion threads*
*If scored with rubrics
Service Departments: Turn-around times, completion of service, productivity/service/incident counts, compliance (government), gap analyses, financial ratios, audits, inspections, tracking (e.g. website hits), lack of service (e.g. computer server downtime)
2. Indirect Measures: Indirect measures should be used to supplement direct methods of assessing the established outcome.
Description of Indirect Measures: Opinions, feelings, satisfaction, attitudes, habits of mind, reports of learning, inputs (faculty qualifications/library holdings), outputs (graduation rates, employment placement rates); typically qualitative.

Examples of Indirect Measures:
Student Learning: Surveys (CCSSE, SENSE), interviews, focus groups, course evaluations, course grades, student "rates" (graduation, employment, transfer), honors, awards, scholarships
Service Departments: Satisfaction surveys, interviews, focus groups, opinions, feelings, attitudes
Characteristics of Successful Assessment Methods
1. Valid: Does the method measure the stated outcome? Does it hit the target?
2. Reliable: How consistent is the measure? Are you confident of the results? Are the results useful and reasonably accurate?
3. Embedded/Authentic: Is the measure already part of the course or "normal course of business"? Use authentic methods when possible instead of "add-ons."

Data Collection Logistics
College-wide processes for the collection of outcomes assessment data are established in the Institutional Effectiveness Handbook; however, embedded in those processes are areas that require individual faculty, staff, and administrators to plan the logistics of data collection, aggregation, and follow-up reporting for their assigned outcomes.

Periodically Review Established Assessment Methods
At a minimum, assessment methods should be reviewed at the five-year mark to determine if currently used methods are sufficient, need to be revised, or if new methods must be developed.

Examples of Appropriate Outcome Assessment Methods
Student Learning Outcome Example
PROGRAM: Culinary Arts

OUTCOME: Create and serve a variety of cuisines typically found in a food service facility in a team environment.
ASSESSMENT METHOD: HMR 103-112 Culinary Lab Final Projects (Instructor Evaluation for Technique and Teamwork)

OUTCOME: Identify and utilize proper food and beverage preparation and service practices to meet industry standards for safety, cleanliness, and sanitation.
ASSESSMENT METHODS: HMR 103-112 Culinary Lab Final Projects (Instructor Evaluation for Sanitation, Mise en Place, and Equipment); ServSafe Exam
Service Department Outcome Example
DEPARTMENT: Financial Aid

OUTCOME: Monitor 3-year aggregate student loan default rate.
ASSESSMENT METHOD: Federal 3-Year Cohort Default Rate

OUTCOME: Improve financial aid notification response time.
ASSESSMENT METHOD: Average number of days until the award letter is mailed for completed financial aid applications.
PART THREE: SET BENCHMARKS

Characteristics of Successful Benchmarks
1. Defines Criteria for Success: Benchmarks establish how good is good enough, define success, and determine rigor.
2. Justifiable: Benchmarks should be realistic and attainable, but also "a stretch."
3. Defensible: Benchmarks should be based on something (pre-established measure, trends, etc.).
4. Specific: Benchmarks should clearly identify which outcome/s will be measured and directly identify how the chosen assessment method will be used for benchmarking purposes.

Anatomy of a Benchmark
1. Benchmark: Statement establishing the successful attainment of a specific outcome. EXAMPLE: Students will earn a 3 out of 5 or better for all rubric criteria.
2. Target: A percentage included with the benchmark to demonstrate a wider attainment of the benchmark. EXAMPLE: 80% of students will earn a 3 out of 5 or better for all rubric criteria.

Types of Benchmarks
1. Standards-Based: Benchmarks based around established competencies/criteria (i.e. rubric defining criteria for a writing assignment or exam testing learned competencies).
2. Peer/Norm-Based: Benchmarks based around published instruments, national averages, or peer institutions (i.e. IPEDS, certification/licensure requirements, professional organizations).
3. Value-Added: Benchmarks that measure growth, change, improvement (i.e. pre/post-test).
4. Longitudinal: Benchmarks that measure change over time and/or compare groups at different points in time (i.e. measures of credit enrollment over a 2-year time period, freshman scores vs. graduate scores).
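A benchmark with a target, such as "80% of students will earn a 3 out of 5 or better," reduces to a simple calculation once rubric scores are in hand. The sketch below is purely illustrative; the scores, cutoff, and target are hypothetical, not CCBC data:

```python
# Hypothetical rubric scores (1-5 scale) for one course section.
scores = [5, 3, 4, 2, 5, 4, 3, 3, 5, 2, 4, 4]

CUTOFF = 3      # benchmark: minimum acceptable rubric score
TARGET = 0.80   # target: share of students who must reach the cutoff

# Count students at or above the cutoff, then compare the rate to the target.
meeting = sum(1 for s in scores if s >= CUTOFF)
rate = meeting / len(scores)
status = "MET" if rate >= TARGET else "NOT MET"

print(f"{meeting}/{len(scores)} students ({rate:.0%}) scored {CUTOFF}+; benchmark {status}")
```

With these sample scores, 10 of 12 students (83%) reach the cutoff, so the benchmark is met. The same pattern applies to any standards-based benchmark: compare the observed attainment rate against the stated target.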
Examples of Successful Benchmarks

Student Learning Outcome Example
PROGRAM: Culinary Arts

OUTCOME: Create and serve a variety of cuisines typically found in a food service facility in a team environment.
ASSESSMENT METHOD: HMR 103-112 Culinary Lab Final Projects (Instructor Evaluation for Technique and Teamwork)
BENCHMARK: 80% of students will score a 3 out of 5 or better for Technique and Teamwork on the Instructor Evaluation.

OUTCOME: Identify and utilize proper food and beverage preparation and service practices to meet industry standards for safety, cleanliness, and sanitation.
ASSESSMENT METHOD: HMR 103-112 Culinary Lab Final Projects (Instructor Evaluation for Sanitation, Mise en Place, and Equipment)
BENCHMARK: 85% of students will score a 4 out of 5 or better for Sanitation, Mise en Place, and Equipment on the Instructor Evaluation.
Service Department Outcome Example
DEPARTMENT: Financial Aid

OUTCOME: Monitor 3-year aggregate student loan default rate.
ASSESSMENT METHOD: Federal 3-Year Cohort Default Rate
BENCHMARK: CCBC students will maintain a loan default rate 3% lower than the national average.

OUTCOME: Improve financial aid notification response time.
ASSESSMENT METHOD: Average number of days until the award letter is mailed for completed financial aid applications.
BENCHMARK: Financial aid notification letters will be processed 2 days sooner than the previous average.
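The two Financial Aid benchmarks can be expressed as simple comparisons. The sketch below uses hypothetical figures and reads "3% lower" as three percentage points below the national average; adjust if a relative reduction is intended:

```python
# Hypothetical figures (not actual CCBC or federal data).
national_default_rate = 11.3   # national 3-year cohort default rate (%)
ccbc_default_rate = 7.9        # CCBC 3-year cohort default rate (%)

prior_avg_days = 14.0          # previous average days to mail award letters
current_avg_days = 11.5        # current average days

# Benchmark 1: CCBC rate at least 3 percentage points below the national average.
default_benchmark_met = ccbc_default_rate <= national_default_rate - 3.0

# Benchmark 2: letters processed at least 2 days sooner than the prior average.
notification_benchmark_met = current_avg_days <= prior_avg_days - 2.0

print("Default-rate benchmark met:", default_benchmark_met)
print("Notification benchmark met:", notification_benchmark_met)
```

Writing the comparison out this way also forces the benchmark to be specific: each one names the measure, the comparison point, and the margin required for success.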
*Adapted from Dr. Janine Vienna’s “Assessment Shop Work” presentation. Faculty Convocation. August 2014
Review Rubric for Academic Programs

CRITERIA: Outcomes (3-5 or more program outcomes identified which adequately address the most important focus of the program; outcomes stated in measurable and observable terms of student achievement; outcomes aligned with general education competencies as applicable)
Highly Developed: 5 (or more) program outcomes identified which adequately address the most important focus of the program. All outcomes stated in measurable and observable terms. All outcomes aligned with general education competencies as applicable.
Developed: 3-4 program outcomes identified which adequately address the most important focus of the program. Some outcomes stated in measurable and observable terms. Some outcomes aligned with general education competencies as applicable.
Developing: 1-2 program outcomes identified which adequately address the most important focus of the program. No outcomes stated in measurable and observable terms. No outcomes aligned with general education competencies as applicable.

CRITERIA: Assessment Methods (assessment methods use direct measures; multiple assessment methods strengthen findings, and secondary measures may be indirect; methods identified are valid and reliable to assess program outcomes)
Highly Developed: All methods identified are valid and reliable to assess program outcomes. All assessment methods use at least one direct measure. More than one assessment method is used for all program outcomes.
Developed: Some methods identified are valid and reliable to assess program outcomes. Some assessment methods use at least one direct measure. More than one assessment method is used for most program outcomes.
Developing: No methods identified are valid and reliable to assess program outcomes. No assessment methods use at least one direct measure. Only one assessment method is used for all program outcomes.

CRITERIA: Benchmarks (success criteria; benchmarks identify specific performance criteria and appear to be sufficiently rigorous and defensible)
Highly Developed: All benchmarks identify specific performance criteria. All benchmarks appear to be sufficiently rigorous and defensible.
Developed: Some benchmarks identify specific performance criteria. Some benchmarks appear to be sufficiently rigorous and defensible.
Developing: No benchmarks identify specific performance criteria. No benchmarks appear to be sufficiently rigorous and defensible.

CRITERIA: Logistics (logistics appear to be well-planned and organized)
Highly Developed: All logistics plans identify responsible person(s), when and how data will be collected, and data storage.
Developed: Most logistics plans identify responsible person(s), when and how data will be collected, and data storage.
Developing: No logistics plans identify responsible person(s), when and how data will be collected, and data storage.

Feedback: (space for reviewer comments)
Review Rubric for Service Departments

CRITERIA: Department Outcomes (outcomes appropriate to the unit's services, processes, and functions; outcomes stated in measurable language; outcome(s) appropriately linked to strategic plan)
Highly Developed: All outcomes appropriate to the unit's services, processes, and functions. All outcomes stated in measurable language. All outcomes appropriately linked to strategic plan.
Developed: One or two outcomes appropriate to the unit's services, processes, and functions. Some outcomes stated in measurable and non-measurable language. Some outcomes appropriately linked to strategic plan.
Developing: No outcomes appropriate to the unit's services, processes, and functions. No outcomes stated in measurable language. No outcomes appropriately linked to the strategic plan.

CRITERIA: Assessment Methods (assessment methods use direct measures; multiple assessment methods strengthen findings; methods appear to be valid and reliable to assess the outcomes)
Highly Developed: All assessment methods use direct measures. All outcomes have more than one assessment method. All assessment methods appear to be valid and reliable to accurately measure the outcomes.
Developed: Some assessment methods use direct measures. Some outcomes have more than one assessment method. Some assessment methods appear to be valid and reliable to accurately measure the outcomes.
Developing: No assessment methods use direct measures. No outcomes have more than one assessment method. No assessment methods appear to be valid and reliable to accurately measure the outcomes.

CRITERIA: Benchmarks (success criteria; benchmarks identify specific performance criteria and appear to be sufficiently rigorous and defensible)
Highly Developed: All benchmarks identify specific performance criteria. All benchmarks appear to be sufficiently rigorous and defensible.
Developed: Some benchmarks identify specific performance criteria. Some benchmarks appear to be sufficiently rigorous and defensible.
Developing: No benchmarks identify specific performance criteria. No benchmarks appear to be sufficiently rigorous and defensible.

CRITERIA: Logistics (logistics appear to be well-planned and organized)
Highly Developed: All logistics plans identify responsible person(s), when and how data will be collected, and data storage.
Developed: Most logistics plans identify responsible person(s), when and how data will be collected, and data storage.
Developing: No logistics plans identify responsible person(s), when and how data will be collected, and data storage, or no logistics plans.

Comments: (space for reviewer comments)
APPENDIX E: INSTITUTIONAL EFFECTIVENESS COUNCIL FORMS FOR SLO/SDO APPROVAL

INSTITUTIONAL EFFECTIVENESS COUNCIL Forms for Student Learning/Service Department Outcome Approval

DIRECTIONS: Student learning and service department outcomes should only be revised at the program-review point or every five years, unless significant changes have occurred in the program/area necessitating earlier outcome revision. All new and/or revised student learning and service department outcomes must be approved by both the immediate supervisor and the Institutional Effectiveness Council. To receive IEC approval, the following forms must be completed and submitted to the IEC at least one week prior to its last meeting in September. Forms will be returned to outcome leads and supervisors within five business days after they are approved or not approved by the IEC. Outcome leads, supervisors, and the IEC should retain copies of the forms for record-keeping purposes.

RESOURCES: The SLO/SDO Guidelines and Rubrics, which are located in the Outcomes Assessment course on Blackboard, are helpful resources when creating/revising student learning and service department outcomes. If additional assistance is needed as outcomes are prepared for approval, please contact the Faculty Fellow for Planning, Assessment, and Improvement. Please return this cover page, with appropriate signatures, when submitting revised and/or newly established outcomes to the IEC.

REQUIRED SIGNATURES:

OUTCOME LEAD/S:________________________________________________________________
We/I acknowledge the proposed outcome/s address areas identified through prior assessment processes and/or appropriate research that will better enable the program/department to meet the needs of those it serves and the overall Vision, Mission, Values, and Goals of the college.

IMMEDIATE SUPERVISOR:_________________________________________________________
I approve the content and scope of the proposed outcome/s and recommend them for IEC approval.

FOR IEC USE ONLY:
_____Approved _____Not Approved
If not approved, provide reason/s below:

IEC Chair Signature____________________________ Date_______________
Assessment Plan For: _________________________________ (Name of academic program or service department)

Outcomes
1. _____________________________________________ (Responsible Person: _____________________)
Related to General Education Outcome? Y / N    If yes, which one(s)? _____________________
Related to Strategic Plan Goal/Objective? Y / N    If yes, which one(s)? _____________________

Assessment Methods
1. _____________________________________________    Logistics: _____________________
2. _____________________________________________    Logistics: _____________________

Benchmarks (Criteria for Success)
_____________________________________________
(The same fields repeat for Outcomes 2 through 6: outcome statement, related General Education Outcome and Strategic Plan Goal/Objective, two assessment methods each with logistics, and benchmarks.)
APPENDIX F: SLO DATA COLLECTION FORMS
TO: Full- and Part-Time Faculty Teaching _____________________ (Name of course here)
FROM:
SUBJECT: Student Learning Outcomes Assessment
DATE:

This memo regards the collection of assessment data for the _____________________ program. (Description of data to be collected here, i.e. specific test questions, assignment grade, rubric criteria, etc.) Please submit appropriate data to ____________________ no later than ___________________ using ______________________ (Describe means of submission here). If you have any questions, please feel free to contact me at _________________________. You may also stop by my office at your convenience. Thank you in advance for your cooperation with this effort. As always, your continued contribution to CCBC is invaluable and much appreciated.
DATA COLLECTION FORM

COURSE NAME AND SECTION NUMBER | NUMBER OF STUDENTS ENROLLED IN CLASS AT TIME OF ASSIGNMENT | (TITLE OF APPROPRIATE DATA) i.e. final exam scores, competency assignment grades, etc.
Example: WRIT 101-03 | 23 | 70, 65, 99, 97, 88, 89, 87, 56, 97, 96, 77, 79, 83, 82, 59, 79, 89, 99, 98, 66, 69, 100, 76
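Once data collection forms are returned, the reported scores can be aggregated and compared against the outcome's benchmark. A minimal sketch using the 23 example scores from the WRIT 101-03 form; the 70-point cutoff is hypothetical, so substitute whatever benchmark was set for the outcome:

```python
# The 23 example scores from the WRIT 101-03 data collection form.
scores = [70, 65, 99, 97, 88, 89, 87, 56, 97, 96, 77, 79,
          83, 82, 59, 79, 89, 99, 98, 66, 69, 100, 76]

CUTOFF = 70  # hypothetical passing score; use the outcome's actual benchmark

# Count scores at or above the cutoff and express the result as a percentage.
at_or_above = sum(1 for s in scores if s >= CUTOFF)
pct = 100 * at_or_above / len(scores)

print(f"{at_or_above} of {len(scores)} students ({pct:.1f}%) scored {CUTOFF} or better")
```

Here 18 of 23 students (78.3%) meet the hypothetical cutoff. A spreadsheet works equally well; the point is that the raw scores on the form are all that is needed to produce the summary figure reported in the result statement.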
APPENDIX G: TRACDAT DATA ENTRY SCREENSHOTS
USING TRACDAT DIRECTIONS AND SCREENSHOTS
EXAMPLE: John Smith, Outcome #1: Critical Thinking
STEP 1:
Access TracDat from tracdat.ccbc.edu
STEP 2:
Log in to TracDat
STEP 3:
Select the Appropriate Program/Unit
STEP 4:
Click Plan
STEP 5:
Edit/Add New Outcome
STEP 6:
Edit/Add Means of Assessment
STEP 7:
Access Results Area
STEP 8:
Select Appropriate Outcome for Results Entry
STEP 9:
Enter Results and Link to Actions
STEP 10:
Add Actions and Resources
STEP 11:
Add Follow-Up
STEP 12:
Create a TracDat Report
APPENDIX H: EXAMPLE TRACDAT 4-COLUMN REPORT
APPENDIX I: SLO/SDO CHECKLISTS FOR DATA DISCUSSION
SLO/SDO CHECKLIST For Data Discussions

DIRECTIONS: The SLO/SDO Checklist should be completed during data discussions with the immediate supervisor. Data discussions should be scheduled after all SLO/SDO data for a specific program or department has been collected and reported through TracDat by all outcome leads (OLs) associated with the program/department. Outcome leads should bring a TracDat 4-column report to the data discussion meeting for all the outcomes they are responsible for within the program/department. Complete and original SLO/SDO Checklists should be sent to the Institutional Effectiveness Council. A copy of the checklist and required attachments should be kept by all outcome leads as well as the immediate supervisor.
PROGRAM/DEPARTMENT NAME:___________________________________________________________
OUTCOME LEADS & ASSIGNED OUTCOME:
1.______________________________________________
2.______________________________________________ 3.______________________________________________
4.______________________________________________ 5.______________________________________________
DATE OF DATA DISCUSSION:___________________________________________
TRACDAT 4-COLUMN REPORTS PROVIDED: ______Yes ______No
Please attach copies of each outcome lead’s (OL’s) 4-column TracDat report.
NOTES:
RESOURCES NEEDED: ______Yes ______No If “yes,” please check one: _____Funds Available from Current Budget Line _____Additional Funds Needed/New Initiative NOTES:
DISCUSSION SUMMARY: Please provide a brief narrative regarding the highlights of the data discussion.

IMMEDIATE SUPERVISOR'S SIGNATURE:_______________________________________________
I acknowledge all outcome leads responsible for reporting assessment data for the ________________________ program/department have met with me to discuss program needs and assessment results for the 20_____/20_____ academic year.

OUTCOME LEADS' SIGNATURES:
We/I acknowledge that student learning/service department assessment data was thoroughly and appropriately discussed, including necessary resource needs, with the immediate supervisor for the _________________________ program/department for the 20_____/20_____ academic year.
1._________________________________________________
2._________________________________________________
3._________________________________________________
4._________________________________________________
5._________________________________________________
APPENDIX J: GENERAL EDUCATION STUDENT LEARNING OUTCOMES PROGRAM AND ASSESSMENT OVERVIEW
COMMUNITY COLLEGE OF BEAVER COUNTY GENERAL EDUCATION STUDENT LEARNING OUTCOMES PROGRAM AND ASSESSMENT OVERVIEW REQUIREMENT 1: COMMUNICATION PROFICIENCY
Definition: Communication Proficiency requires the skilled presentation of ideas through appropriate media and in a manner suitable to the audience. In addition to WRIT 101: English Composition, courses with the following prefixes are among those that require demonstration of Communication Proficiency: COMM, FINE, FILM, FREN, MUSI, SPAN, THEA, WRIT 201.
Outcomes and Mastery Matrix:

OUTCOME #1: Demonstrate clear and skillful communication methods appropriate to different occasions, audiences, and purposes.

MASTERY MATRIX #1
Mastery: Student consistently demonstrates clear and skillful communication methods appropriate to occasion, audience, and purpose.
Progressing: Student generally demonstrates clear and skillful communication methods appropriate to occasion, audience, and purpose.
Low/No Mastery: Student does not consistently or generally demonstrate clear and skillful communication methods appropriate to occasion, audience, and purpose.
Assessment Timeline:

CORE COURSE/S: WRIT 101 (all sections assessed every academic year)

PROGRAM-SPECIFIC COURSES (all courses within the area requiring a General Education Competency Assignment with a Communication Proficiency focus assessed every fourth academic year on a rotating basis):
2015: Group 1 - Nursing and Allied Health
2016: Group 2 - Aviation Sciences
2017: Group 3 - Business and Technologies
2018: Group 4 - Liberal Arts and Sciences
REQUIREMENT 2: INFORMATION LITERACY
Definition: Information Literacy recognizes the need to integrate authoritative resources with an existing knowledge base. In addition to LITR 210: Concepts of Literature or WRIT 103: Writing for Business and Technology (depending on your major), courses with the following prefixes are among those that require demonstration of Information Literacy: JOUR, LITR, PHIL.
Outcomes and Matrix:

OUTCOME #2: Access, evaluate, and appropriately utilize information from credible sources.

MASTERY MATRIX #2
Mastery: Student consistently accesses, evaluates, and appropriately utilizes information from credible sources.
Progressing: Student generally accesses, evaluates, and appropriately utilizes information from credible sources.
Low/No Mastery: Student does not access, evaluate, and appropriately utilize information from credible sources.
Assessment Timeline:

CORE COURSE/S: LITR 210/WRIT 103 (all sections assessed each academic year)

PROGRAM-SPECIFIC COURSES (all courses within the area requiring a General Education Competency Assignment with an Information Literacy focus assessed every fourth academic year on a rotating basis):
2015: Group 1 - Nursing and Allied Health
2016: Group 2 - Aviation Sciences
2017: Group 3 - Business and Technologies
2018: Group 4 - Liberal Arts and Sciences
REQUIREMENT 3: SCIENTIFIC AND QUANTITATIVE REASONING
Definition: Scientific and Quantitative Reasoning employs empirical and mathematical processes and scientific methods in order to arrive at conclusions and make decisions. Courses with the following prefixes are among those that require demonstration of Scientific and Quantitative Reasoning: BIOL, CHEM, MATH, NANO, PHYS.
Outcomes and Matrix:

OUTCOME #3: Select and apply appropriate problem-solving techniques to reach a conclusion (hypothesis, decision, interpretation, etc.).

MASTERY MATRIX #3
Mastery: Student consistently selects and applies appropriate problem-solving techniques to reach a conclusion.
Progressing: Student generally selects and applies appropriate problem-solving techniques to reach a conclusion.
Low/No Mastery: Student does not consistently or generally select and apply appropriate problem-solving techniques to reach a conclusion.
Assessment Timeline:

CORE COURSE/S: BIOL 100, BIOL 101, BIOL 201, MATH 130, PHYS 101, PHYS 105 (all sections assessed each academic year)

PROGRAM-SPECIFIC COURSES (all courses within the area requiring a General Education Competency Assignment with a Scientific and Quantitative Reasoning focus assessed every fourth academic year on a rotating basis):
2015: Group 1 - Nursing and Allied Health
2016: Group 2 - Aviation Sciences
2017: Group 3 - Business and Technologies
2018: Group 4 - Liberal Arts and Sciences
REQUIREMENT 4: CULTURAL LITERACY
Definition: Cultural Literacy delineates the patterns of individual and group dynamics that provide structure to society on both individual and global levels. Courses with the following prefixes are among those that require demonstration of Cultural Literacy: ANTH, GEO, HIST, POLS, PSYC, SOCI, BUSM 255 or ECON 255.
Outcomes and Matrix:

OUTCOME #4: Demonstrate an understanding and appreciation of the broad diversity of the human experience.

MASTERY MATRIX #4
Mastery: Student consistently demonstrates an understanding and appreciation of the broad diversity of the human experience.
Progressing: Student generally demonstrates an understanding and appreciation of the broad diversity of the human experience.
Low/No Mastery: Student does not consistently or generally demonstrate an understanding and appreciation of the broad diversity of the human experience.
Assessment Timeline:

CORE COURSE/S: PSYC 101, PSYC 106 (all sections assessed each academic year)

PROGRAM-SPECIFIC COURSES (all courses within the area requiring a General Education Competency Assignment with a Cultural Literacy focus assessed every fourth academic year on a rotating basis):
2015: Group 1 - Nursing and Allied Health
2016: Group 2 - Aviation Sciences
2017: Group 3 - Business and Technologies
2018: Group 4 - Liberal Arts and Sciences
REQUIREMENT 5: TECHNOLOGY LITERACY
Definition: Technology Literacy enhances the acquisition of knowledge, the ability to communicate, and productivity. In addition to CIST 100: Introduction to Information Technology, courses with the following prefixes are among those that require demonstration of Technology Literacy: CISF, CISN, CIST, CISW, OFFT, VISC.
Outcomes and Matrix:

OUTCOME #5: Utilize appropriate technology to access, build, and share knowledge in an effective manner.

MASTERY MATRIX #5
Mastery: Student consistently utilizes appropriate technology to access, build, and share knowledge in an effective manner.
Progressing: Student generally utilizes appropriate technology to access, build, and share knowledge in an effective manner.
Low/No Mastery: Student does not consistently or generally utilize appropriate technology to access, build, and share knowledge in an effective manner.
Assessment Timeline:

CORE COURSE/S: CIST 100 (all sections assessed each academic year)

PROGRAM-SPECIFIC COURSES (all courses within the area requiring a General Education Competency Assignment with a Technology Literacy focus assessed every fourth academic year on a rotating basis):
2015: Group 1 - Nursing and Allied Health
2016: Group 2 - Business and Technologies
2017: Group 3 - Aviation Sciences
2018: Group 4 - Liberal Arts and Sciences
USING THE MASTERY MATRICES

Purpose and Scope of the Matrices: Each General Education Mastery Matrix directly assesses a specific student learning outcome (SLO) defined in CCBC's General Education Program and is intended to be used for the assessment of general education only. The matrices in no way infringe upon a faculty member's right to evaluate course assignments in a way s/he deems appropriate; rather, each matrix, while a part of all General Education Competency (GEC) Assignment rubrics, should be recognized as a measure distinct from those criteria used to determine a student's score on the GEC assignment as a course-specific requirement.

Uniformity in Scoring Using the Matrices: To ensure a reliable level of uniformity in scoring using the General Education Mastery Matrices, faculty members should refer to the following guidelines. It is important to remember the percentages established in the guidelines do not correspond with a student's overall grade on the GEC assignment, but are rather general guidelines to help faculty more uniformly identify levels of mastery, progression, and low/no mastery across course sections that may be taught by various faculty members.

MASTERY: A student who displays mastery of a certain outcome/general education competency could be said to demonstrate the skill and/or ability identified by the outcome at the 80-100% level.

PROGRESSING: A student who demonstrates s/he is progressing in his/her mastery of a certain outcome/general education competency could be said to display the skill and/or ability identified by the outcome at the 70-79% level.

LOW/NO MASTERY: A student who displays low/no mastery of a certain outcome/general education competency could be said to demonstrate the skill and/or ability identified by the outcome at the 69% or below level.

Mastery Matrices and General Education Competency Assignments
At CCBC, General Education Competency (GEC) Assignments are identified and included in the master syllabi for all courses that include a general education component. Each GEC assignment should be defined and used across sections. GEC assignments should also include a faculty-approved rubric, which all instructors should use to determine a student's course grade for the assignment. Beginning in the fall of 2014, all GEC assignment rubrics should be amended to include the General Education Mastery Matrix or Matrices appropriate to the GEC assignment category identified in the master syllabus. Matrices to be appended to previously established GEC rubrics will be sent to faculty members by the lead faculty member charged with assessing CCBC's General Education Program. Immediate supervisors will be responsible for ensuring all master syllabi containing a GEC assignment are updated to include the appropriate GEC Mastery Matrix/Matrices.
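The three scoring bands above reduce to simple percentage thresholds, which can be expressed as a small helper function. This is an illustrative sketch only; the function name and sample percentages are hypothetical:

```python
def mastery_level(pct: float) -> str:
    """Map a GEC-assignment percentage to the handbook's mastery bands."""
    if pct >= 80:
        return "Mastery"          # 80-100%
    if pct >= 70:
        return "Progressing"      # 70-79%
    return "Low/No Mastery"       # 69% or below

# Sample percentages, one per band.
for pct in (91, 74, 62):
    print(f"{pct}% -> {mastery_level(pct)}")
```

Keep in mind these bands describe attainment of the general education outcome, not the student's grade on the assignment itself.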
Cut and Paste Matrices For ease of the faculty member as well as consistency in formatting, presentation, and reporting, the following tables, which include General Education Requirement title, defined outcomes for said requirement, and defined criteria by which to score each outcome, should be appended to already established GEC assignment rubrics within master syllabi using the cut and paste function.
General Education Requirement #1: Communication Proficiency Outcome #1: Demonstrate clear and skillful communication methods appropriate to different occasions, audiences, and purposes
Mastery Student consistently demonstrates clear and skillful communication methods appropriate to occasion, audience, and purpose.
Progressing Student generally demonstrates clear and skillful communication methods appropriate to occasion, audience, and purpose.
Low/No Mastery Student does not consistently or generally demonstrate clear and skillful communication methods appropriate to occasion, audience, and purpose.
General Education Requirement #2: Information Literacy Outcome #2: Access, evaluate, and appropriately utilize information from credible sources.
Mastery Student consistently accesses, evaluates, and appropriately utilizes information from credible sources.
Progressing Student generally accesses, evaluates, and appropriately utilizes information from credible sources.
Low/No Mastery Student does not access, evaluate, and appropriately utilize information from credible sources.
General Education Requirement #3: Scientific and Quantitative Reasoning Outcome #3: Select and apply appropriate problem-solving techniques to reach a conclusion (hypothesis, decision, interpretation, etc.).
Mastery Student consistently selects and applies appropriate problem-solving techniques to reach a conclusion.
Progressing Student generally selects and applies appropriate problem-solving techniques to reach a conclusion.
Low/No Mastery Student does not consistently or generally select and apply appropriate problem-solving techniques to reach a conclusion.
General Education Requirement #4: Cultural Literacy Outcome #4: Demonstrate an understanding and appreciation of the broad diversity of the human experience.

Mastery Student consistently demonstrates an understanding and appreciation of the broad diversity of the human experience.

Progressing Student generally demonstrates an understanding and appreciation of the broad diversity of the human experience.

Low/No Mastery Student does not consistently or generally demonstrate an understanding and appreciation of the broad diversity of the human experience.

General Education Requirement #5: Technology Literacy Outcome #5: Utilize appropriate technology to access, build, and share knowledge in an effective manner.

Mastery Student consistently utilizes appropriate technology to access, build, and share knowledge in an effective manner.

Progressing Student generally utilizes appropriate technology to access, build, and share knowledge in an effective manner.

Low/No Mastery Student does not consistently or generally utilize appropriate technology to access, build, and share knowledge in an effective manner.
Example GEC Assignment Rubric with Appended Mastery Matrix
APPENDIX K: USING THE BLACKBOARD OUTCOMES ASSESSMENT SITE
USING THE OUTCOMES ASSESSMENT SITE
DIRECTIONS AND SCREENSHOTS

STEP 1:
Access Blackboard
STEP 2: Log in to Blackboard
STEP 3:
Access the Outcomes Assessment Course
STEP 4: Watch the “How To” Video Announcement
STEP 5:
Find Documents OR Upload Assessment Data
APPENDIX L: SLO/SDO GUIDELINES FOR REPORTING RESULTS, ACTIONS, AND FOLLOW-UP
SLO/SDO GUIDELNES For Reporting Results, Actions, and Follow-Up PART ONE: RESULTS Characteristics of Successful Results and Result Statements
1.5. 2.6.
Clear: Successful result statements establish what the numbers mean. Concrete: Successful result statements are supported by data evidence.
Anatomy of Successful Results and Result Statements
1. Data Evidence: The evidence used to establish whether a benchmark is met, not met, or inconclusive should always be included when reporting outcomes assessment results. Since CCBC uses an electronic data storage system, TracDat, data evidence must be in an electronic form (e.g., an Excel sheet or a Microsoft Word table).
2. Data Description: A brief statement of statistics related to the benchmark. It is often helpful to begin data descriptions with a phrase such as “The results indicate/demonstrate/show/reveal...,” “The results exceed/meet/do not meet the benchmark for...,” or “Based on the results of the [assessment method]...”.
3. Data Analysis: A brief analysis of the assessment results. Data analyses can focus on long-range trends, year-over-year information, highs/lows, strengths/weaknesses, external occurrences (snow days, change of instructor), etc.
4. Data Discussion/Dissemination: Discussion and dissemination of assessment data and results ensure stakeholders are informed of and have input regarding results. Depending on the scope of the established outcomes, it may be important to discuss and/or disseminate assessment data/results to faculty teaching in the program, members of an advisory committee, supervisors, clinical site supervisors, internship supervisors, primary employers, etc. A brief statement of how data was discussed/disseminated should be included in the result statement.
5. Result Conclusion: Establish whether the outcome was met, not met, or inconclusive.
Examples of Successful Student Learning and Service Department Result Statements

Student Learning Result Statement Example:
Description: The 2012-2013 first-attempt NCLEX-RN pass rate for CCBC was 88.89%, compared to 86.22% and 84.29% for the PA and national pass rates; the benchmark was met.
Analysis: This is the third consecutive year for which the CCBC first-time pass rate exceeded both the PA and national pass rates. CCBC experienced a slight decline in the pass rate from 2011-2012; however, the decline was also reflected at the PA and national levels.
Discussion: Nursing faculty discussed the results and believe the quality of instruction, clinicals, and course assignments continues to contribute to positive NCLEX-RN results.
Service Department Result Statement Example:
Description: The results indicated that for all three rubric criteria (education, experience, and professional presentation), 80% or more of the students created resumes that met quality standards; the benchmark was met.
Analysis: The highest criterion meeting quality standards was experience (92%) and the lowest was professional presentation (81%). For education, 85% of the students met the quality standard.
Discussion: The results were discussed with faculty in the CTE programs to inform them of student strengths/weaknesses with the development of their students’ resumes.
PART TWO: ACTIONS

What is an Action Plan?
Action plans are 1) the plans put in place to further improve a program or service based on positive assessment results (meeting/exceeding the established outcome benchmark), or 2) the plans put in place to improve a program or service because established benchmarks were not met.

Anatomy of an Action Plan
1. Action Statement: An action plan statement establishes the action to be taken in a specific, concise manner. There should be only one action per action plan.
2. Resource Identification: When planning what action should be taken to improve a program/service, consider whether resources are needed to accomplish the action. Linking planning, budgeting, and assessment is a critical component of the assessment cycle.
3. Implementation/Timeline/Responsible Party: Considering and planning the logistics of the action plan is also necessary. Who will do what, and when, are important questions to ask as the action plan is developed.
Examples of Successful/Unsuccessful Student Learning and Service Department Action Plans
SUCCESSFUL Student Learning Action Plan Example: Implement a peer-reviewed role-play assignment in NURS 130 for students to practice therapeutic communication techniques and create evaluation criteria to recognize successful techniques. The role-play assignment will be developed during Summer 2014 and added to the NURS 130 syllabus and grading criteria. In Fall 2014 the role-play activity will be implemented in the course. All instructors teaching NURS 130 will be responsible for understanding and implementing the assignment. No additional resources are needed for this action.
UNSUCCESSFUL Student Learning Action Plan Example:
Review Fisdap results with the program advisory committee.
SUCCESSFUL Service Department Action Plan Example: Send a direct mail postcard to all credit students to notify them of the day/time of the job fair and the employers who will be in attendance to increase student participation at the fair. The postcards will be developed by the office of enrollment management and the College’s marketing department at least three weeks before the job fair. The postcards will be mailed two weeks prior to the fair. Resources for this project will come from the enrollment management budget and will total approximately $1000.
UNSUCCESSFUL Service Department Action Plan Example Get more people to help with processing transcripts and determining transfer equivalencies.
Action Plan Considerations When creating an action plan, consider many possible avenues for improvement to ensure a valuable action is pursued.
Student Learning Action Plan Considerations: Teaching Methods/Tasks, Outcomes, Curriculum, Assessment Strategies, Learner/Student, Support/Resources

Service Department Action Plan Considerations: Services, Process Management, Outcomes, Assessment Strategies, Benchmarks, Support/Resources
PART THREE: FOLLOW-UP
Why is Follow-Up Important?
Follow-up is important because it 1) demonstrates action plans are being implemented, 2) describes the progress being made towards improvement, and 3) ensures assessment cycles are complete and functional.
Characteristics of Successful Follow-Up
1. Describes Progress: The primary function of follow-up reporting is to describe the progress being made towards improvement through implementation of the established action plan. When describing the progress of an action plan, it is important to discuss the effectiveness of the action plan and examine any new data associated with the plan.
2. Occurs Regularly: Follow-up reporting should occur at both the mid-year and end-of-year points to ensure the effectiveness of the action plan is being thoroughly examined and appropriate information is available as new action plans are established for the following assessment cycle.
*Adapted from Dr. Janine Vienna’s “Assessment Next Steps” presentation, Professional Development Day, October 2014.
APPENDIX M: 5-Year Key Performance Indicators (KPI) Report Template
GOAL 1: STUDENT SUCCESS Create a learning community by supporting student success through educational programs provided in diverse and accessible formats. 1. Successfully complete developmental education Percent of enrollees in PREP English, Reading, and Mathematics earning grades of C or above, reported separately for each discipline, fall terms, as reported for Achieving the Dream
Benchmark | Fall 2012 | Fall 2013 | Fall 2014 | Fall 2015 | Fall 2016
English = √= Achieved Reading = √= Achieved Math = √= Achieved
2. Successfully complete the initial gateway English course Percent of enrollees in gateway English earning grades of C or above, fall terms, as reported for Achieving the Dream
Benchmark | Fall 2012 | Fall 2013 | Fall 2014 | Fall 2015 | Fall 2016
√= Achieved
3. Successfully complete courses with a grade of “C” or better Percent of enrollees in all courses earning grades of C or above, fall terms, as reported for Achieving the Dream
Benchmark | Fall 2012 | Fall 2013 | Fall 2014 | Fall 2015 | Fall 2016
√= Achieved
4. Fall-to-Fall Retention Rates Percent of FTIC fall cohort enrolling in the subsequent fall term, reported separately for full-time and part-time students, as reported for Achieving the Dream
Benchmark | Fall 2012 | Fall 2013 | Fall 2014 | Fall 2015 | Fall 2016
Full-time = √= Achieved Part-time = √= Achieved
5. Attain a Credential Percent of FTIC fall cohort who attain a degree, diploma, or certificate within four years, as reported for Achieving the Dream
Benchmark | Fall 2008 | Fall 2009 | Fall 2010 | Fall 2011 | Fall 2012
√= Achieved
6. AAS/AS Graduate Employment and Continuing Education Rate Percent of AAS and AS graduates reporting employment in a job related to their CCBC program or transfer to a four-year institution within a year of graduation, as reported by Career Services
Benchmark | Class of 2012 | Class of 2013 | Class of 2014 | Class of 2015 | Class of 2016
√= Achieved
7. License Examination Pass Rates Percent of first-time testers meeting or exceeding national mean as reported by the National Council of State Boards of Nursing
Benchmark | 2012 | 2013 | 2014 | 2015 | 2016
LPN = √= Achieved RN = √= Achieved RadTech = √= Achieved
8. Academic Challenge CCSSE composite benchmark score of survey items; standardized national sample mean = 50
Benchmark | 2012 | 2013 | 2014 | 2015 | 2016
50.0 √= Achieved
9. Student Effort CCSSE composite benchmark score of survey items; standardized national sample mean = 50
Benchmark | 2012 | 2013 | 2014 | 2015 | 2016
50.0 √= Achieved
10. Career Counseling Services Use Percent of students saying they used the college’s career counseling services as reported by CCSSE
Benchmark | 2012 | 2013 | 2014 | 2015 | 2016
√= Achieved CCSSE national
11. Career Goal Clarification Percent of students saying college helped to clarify career goals as reported by CCSSE
Benchmark | 2012 | 2013 | 2014 | 2015 | 2016
√= Achieved CCSSE national
12. Supports For Students Rate for question “This institution treats students as its top priority” in annual Noel-Levitz survey
Benchmark | 2015 | 2016 | 2017 | 2018 | 2019
√= Achieved CCSSE national
GOAL 2: COMMUNITY AND ECONOMIC DEVELOPMENT Partner with businesses, organizations, educational institutions, and government agencies to enhance economic opportunities for the region. 13. Enrollment in Noncredit Workforce Development Courses Course enrollments in Workforce and Continuing Education noncredit open-enrollment workforce development and contract training courses for the fiscal year, as reported by the Dean of Workforce and Continuing Education
Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√= Achieved
14. Contract Training Clients Number of business, government, and non-profit organizational units contracting with the college for customized training and services for the fiscal year, as reported by the Dean of Workforce and Continuing Education
Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√= Achieved
15. Contract Training Student Headcount Annual unduplicated headcount of workforce development students served through contract training arrangements for the fiscal year, as reported by the Dean of Workforce and Continuing Education
Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√= Achieved
16. Lifelong Learning Course Enrollments Number of course enrollments in lifelong learning courses offered by Workforce and Continuing Education for the fiscal year, as reported by the Dean of Workforce and Continuing Education
Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√= Achieved
17. College-sponsored Community Events Number of college-sponsored cultural and educational events open to the public held at the college each year
Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√= Achieved
18. Community Groups Using College Facilities Unduplicated count of Beaver County organizations using college facilities each year
Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√= Achieved
19. College position in the community Rate for question “This institution is well-respected in the community” in annual Noel-Levitz survey
Benchmark | 2015 | 2016 | 2017 | 2018 | 2019
√= Achieved CCSSE national
GOAL 3: ORGANIZATION DEVELOPMENT Create a culture that expects openness, collaboration, and mutual respect and embraces innovation and professional development. 20. Cultural Awareness Percent of students saying college contributed to understanding of people of other racial/ethnic backgrounds as reported by CCSSE
Benchmark | 2012 | 2013 | 2014 | 2015 | 2016
√= Achieved CCSSE national
21. Employee Perception of College Commitment to Diversity Total votes made by employees to the question “Increase the diversity of racial and ethnic groups represented among the student body” in the annual Noel-Levitz survey
Benchmark | 2015 | 2016 | 2017 | 2018 | 2019
√= Achieved CCSSE national
22. Employee Job Satisfaction Rate of overall satisfaction with employment as reported by the Noel-Levitz survey
Benchmark | 2015 | 2016 | 2017 | 2018 | 2019
√= Achieved CCSSE national
23. Active and Collaborative Learning CCSSE composite benchmark score of survey items; standardized national sample mean = 50
Benchmark | 2012 | 2013 | 2014 | 2015 | 2016
50.0 √= Achieved
24. Student-Faculty Interaction CCSSE composite benchmark score of survey items; standardized national sample mean = 50
Benchmark | 2012 | 2013 | 2014 | 2015 | 2016
50.0 √= Achieved
GOAL 4: RESOURCES Develop and allocate resources which sustain the institution and encourage its growth and development. 25. Annual Unduplicated Headcount Annual total unduplicated headcount for students enrolled in credit and noncredit courses, reported by fiscal year
Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√= Achieved
26. FTE Enrollment Annual total FTE student enrollment in credit courses, reported by fiscal year
Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√= Achieved
27. Annual Credit Count Annual total count of credits taken by degree-seeking students, reported by fiscal year
Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√= Achieved
28. Tuition and Fees Change in sponsoring, credit student tuition and fees
Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√= Achieved
29. High School Graduate Enrollment Rate Percent of Beaver County public high school graduates attending CCBC in the fall semester following their high school graduation
Benchmark | Fall 2012 | Fall 2013 | Fall 2014 | Fall 2015 | Fall 2016
√= Achieved
30. Met Need of Pell Recipients Percent of Pell Grant recipients who have 50 percent or more of their total need met by grants and scholarships
Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√= Achieved Pell Recipients
31. Met Need of Non-Pell Recipients Percent of students with financial need, who did not receive Pell Grants, who have 50 percent or more of their total need met by grants and scholarships
Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√= Achieved Students
32. Student Perception of Institutional Financial Support Percent of students saying college provides financial support needed as reported by CCSSE
Benchmark | 2012 | 2013 | 2014 | 2015 | 2016
√= Achieved
33. Support for Learners CCSSE composite benchmark score of survey items; standardized national sample mean = 50
Benchmark | 2012 | 2013 | 2014 | 2015 | 2016
50.0 √= Achieved
34. Enrollment per Section Mean credit course class size in the Divisions of Liberal Arts & Sciences and Business & Technologies, excluding self-paced, internship, and independent study courses
Benchmark | Fall 2012 | Fall 2013 | Fall 2014 | Fall 2015 | Fall 2016
√= Achieved
35. Teaching by Full-Time Faculty Percent of total teaching load hours taught by full-time faculty
Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√= Achieved
36. Expenditures per FTE Student Unrestricted current fund operating expenditures per FTE student
Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√= Achieved
37. Expenditure on Instruction and Academic Support Percent of total educational and general operating expenditures expended on instruction and academic support as reported by IPEDS categories
Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√= Achieved
38. Employee Turnover Rate Percent of employees leaving the institution annually, excluding retirees, as compared to the National Community College Benchmark Project national median
Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√= Achieved
39. College Support for Innovation Rate for question “This institution makes sufficient budgetary resources available to achieve important objectives” in annual Noel-Levitz survey
Benchmark | 2015 | 2016 | 2017 | 2018 | 2019
√= Achieved CCSSE national
APPENDIX N: Initiatives and Events Assessment Template
INITIATIVES AND EVENTS ASSESSMENT TEMPLATE
COMMUNITY COLLEGE OF BEAVER COUNTY

Program or Event:
Responsible Person:

Outcomes
Outcome:

Means of Assessment/Criteria
Assessment Methods:
Criteria:

Results

Action Taken and Follow-Up
WORKS CITED

Addison, Jennifer D. Institutional Effectiveness: A Practical Guide for Planning and Assessing Effectiveness. 5th ed. Virginia Highlands Community College, 2013-2014. Web.

Characteristics of Excellence in Higher Education: Requirements of Affiliation and Standards for Accreditation. 12th ed. Middle States Commission on Higher Education, 2006. Print.

Claglett, Craig. Institutional Effectiveness Assessment Report. Carroll County Community College, December 2011. Web.

General Education Assessment Plan. Johnson County Community College. Web. http://www.jccc.edu/files/pdf/assessment/general%20-ed-assessment-plan.pdf

“Understanding Middle States Expectations for Assessing Student Learning and Institutional Effectiveness.” Student Learning Assessment: Options and Resources. 2nd ed. MSCHE, 2007. Print.

Vienna, Janine. “Assessment Next Steps for Academic Programs: Professional Development Shop-Work.” Professional Development Day, Community College of Beaver County. Learning Resources Center, Monaca, PA. 7 October 2014. Presentation.

Vienna, Janine. “Assessment Next Steps for Administrative Units: Professional Development Shop-Work.” Professional Development Day, Community College of Beaver County. Learning Resources Center, Monaca, PA. 7 October 2014. Presentation.

Vienna, Janine. “How to Set Benchmarks.” Faculty Convocation, Community College of Beaver County. Learning Resources Center, Monaca, PA. 22 August 2014. Presentation.

Vienna, Janine. “Identifying Assessment Methods.” Faculty Convocation, Community College of Beaver County. Learning Resources Center, Monaca, PA. 22 August 2014. Presentation.

Vienna, Janine. “Writing Measurable Academic Outcomes.” Faculty Convocation, Community College of Beaver County. Learning Resources Center, Monaca, PA. 22 August 2014. Presentation.