
CAEP Evidence Packet #4.1.b Graduate Impact Case Study
Missouri Baptist University

Executive summary: The purpose of this case study, conducted by the Missouri Baptist University (MBU) educator preparation program (EPP) during 2016-17, was to determine the level of PK-12 student impact of a group of MBU program completers during their 1st, 2nd, or 3rd year of teaching. With those data, the EPP hoped to verify that the EPP's completers are making a positive impact on PK-12 student learning, as well as learn more about program strengths and weaknesses. This study attempted to answer the following questions:
1. What is the level of impact that Missouri Baptist University program completers are having on their PK-12 students?
2. Based on student impact and teacher effectiveness data, what are the areas of teaching strength and weakness for the program completers?
3. How might MBU make changes to the EPP's initial certification programs to improve the level of program completer impact on their PK-12 students?

Data collected and analyzed from 6 recent graduates of MBU's EPP have been reviewed and indicate that the new teachers are relatively successful across almost all MoSPE Teacher Standards and quality indicators (QIs) that were evaluated. The data shared with the EPP and analyzed in this case study clearly demonstrate that MBU program completers have a satisfactory level of impact on their PK-12 students. The data collected were in a variety of formats; nonetheless, the review and qualitative text analysis revealed an adequate or higher level of impact on PK-12 students. The collected completer data tell a story about our completers' ability to effectively apply the professional knowledge, skills, and dispositions that our program was designed to achieve. There was only one quality indicator that appeared in more than one participant's areas of strength: QI 1.2 - The teacher cognitively engages students in the content. Participants 2 and 4 were strong in this area. Interestingly, all other areas of strength were singular for each participant. This might indicate the challenge of comparing completers who work in different school districts that use a variety of evaluation systems and/or schools and districts that select varied quality indicators as areas of focus for any given year. Similarly, this case study did not reveal consistent weaknesses across the 6 participants. Quite the opposite: there was not one weak quality indicator repeated for more than one participant. However, many themes emerged from each individual case that will guide our EPP as we move forward to provide a better preparation experience for our candidates so that they can be as effective as possible at impacting student growth in their future classrooms.

Instrument: Case study


Data presented: Data presented in body of case study

Data summary, explanation, and analysis: Data summarized, explained, and analyzed in body of case study.

Actions to be taken:

Program Recommendation #1: MBU EPP faculty should audit the curriculum to make sure course objectives are sufficiently addressing monitoring and setting student goals. Based on survey feedback, completers need to better know how to conduct goal setting activities for their students.

Program Recommendation #2: MBU EPP faculty should examine curriculum in EDRD 423/523 Teaching Reading in the Content Area to ensure the syllabus contains objectives and activities that provide candidates practice in analyzing student reading levels using various reading tests, including STAR, DRA, etc.

Program Recommendation #3: MBU EPP faculty should discuss ways in which to collect PK-12 student impact data from completers on an ongoing basis. Perhaps the EPP could create a Spartan Impact Research Team that meets regularly to collect and analyze completer data, present findings at faculty and advisory council meetings, and suggest program improvements to the faculty based on the results of the data.


FLOW DIAGRAM FOR ANALYZING ASSESSMENT DATA PROCESS

Assessment: Spartan Teacher Impact Research (STIR) Case Study

Source of Data: Student growth data and/or teacher effectiveness data provided by new teachers.

Data Collection and Validation: Collected annually.

Data Analysis (illustrated in the sketch below):
1. Compared against district, school, and/or state data if provided.
2. Qualitative text analysis looking for trends.

Action Taken: The case study revealed three actions to be taken:
1. Review curriculum to ensure candidates are learning how to teach students goal setting.
2. Review curriculum to ensure candidates are learning how to read and analyze reading level data.
3. Program impact data needs to be collected by a division research team.

Results: The case study revealed that our graduates who shared impact data are making a positive impact on student growth and are evaluated in the average or above range for teacher effectiveness.

(The red arrows in the original diagram indicate that this process repeats each cycle.)
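To make the "Data Analysis" step above concrete, the following is a minimal illustrative sketch, not part of the EPP's actual tooling, of how formative observation scores might be averaged by quality indicator and compared against a district mean to flag relative strengths and possible growth areas. All names and numbers in the sketch are hypothetical placeholders.

```python
# Minimal sketch (hypothetical data): average observation scores by MoSPE
# quality indicator and compare each mean against a district mean to flag
# relative strengths and weaknesses, mirroring the Data Analysis step above.
from collections import defaultdict

# Hypothetical formative observation scores (0-7 NEE-style scale):
# (quality_indicator, date, score)
observations = [
    ("1.2", "2016-08-22", 5.0), ("1.2", "2016-10-27", 6.0),
    ("4.1", "2016-08-22", 4.0), ("4.1", "2016-10-27", 4.5),
    ("5.3", "2016-08-22", 6.0), ("5.3", "2016-10-27", 6.5),
]

# Hypothetical district means for the same quality indicators.
district_means = {"1.2": 5.2, "4.1": 4.8, "5.3": 5.5}

scores_by_qi = defaultdict(list)
for qi, _date, score in observations:
    scores_by_qi[qi].append(score)

for qi, scores in sorted(scores_by_qi.items()):
    completer_mean = sum(scores) / len(scores)
    diff = completer_mean - district_means[qi]
    label = "strength" if diff >= 0 else "possible growth area"
    print(f"QI {qi}: completer mean {completer_mean:.1f} "
          f"vs. district {district_means[qi]:.1f} -> {label}")
```

The qualitative principal comments would still be read for themes by hand; the sketch only covers the numeric comparison step.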



Spartan Teacher Impact Research Study

Case Study: An Analysis of Missouri Baptist University Education Division Spartan Teacher Impact Research

Melanie Bishop
Dean of MBU Education Division

Summer 2017


An Analysis of Missouri Baptist University Education Division Completer Impact on PK-12 Student Learning and Teaching Effectiveness

Educator preparation programs (EPPs) must produce teachers who contribute to student learning and provide evidence of expected levels of student growth and development, in addition to other measures of teaching effectiveness. Many state departments of education collect value-added measures and/or student growth percentiles from the PK-12 school districts they serve, and some of those state departments share those data with EPPs. However, Missouri's Department of Elementary and Secondary Education (DESE) currently does not provide EPPs with student growth data. Consequently, the faculty and administration in the Education division at Missouri Baptist University (MBU) have implemented a plan for data collection and analysis to provide evidence that our graduates are making a positive impact on student learning and are effectively applying the professional knowledge, skills, and dispositions that our program was designed to achieve.

One section of the aforementioned plan is an annual case study conducted by the unit leadership on a cross-section of program completers who have willingly shared student growth and teacher effectiveness data. The purpose of this case study is to determine the level of impact a group of Missouri Baptist University initial certification program completers in their first, second, or third year of teaching are having on their PK-12 students, and to use those results to verify that MBU's initial certification program completers are making a positive impact on student learning and to reveal program strengths and weaknesses. The case study is being conducted to answer the following questions:
1. What is the level of impact that Missouri Baptist University program completers are having on their PK-12 students?
2. Based on student impact and teacher effectiveness data, what are the areas of strength and weakness for the program completers?
3. How might MBU make changes to the initial certification program to improve the level of program completer impact on their PK-12 students?


Background on MBU's Evidence of Program Impact

Unfortunately, DESE does not currently share any PK-12 data with EPPs; consequently, MBU is implementing a Program Impact Phase-In Plan (Appendix A) to show evidence that program completers contribute to an expected level of student-learning growth. In summer 2016, the unit began soliciting feedback and data from recent graduates for analysis purposes to determine the level of impact they have had on their PK-12 students. The Spartan Teacher Impact Research Study was originally designed to include an annual focus group of recent graduates to learn their perceptions of the program and to request their student impact data and principal feedback on teaching effectiveness.

Based on contact information previously collected while they were MBU students, recent graduates from three previous years were emailed to determine availability and willingness to participate in program impact data collection. A cross-section of graduates representing a variety of certification areas was invited to main campus to participate in a focus group and brainstorming session to determine ways in which they might share student impact data with MBU and also to solicit feedback from the graduates regarding their perception of their preparation program. Although numerous emails and phone calls were made, only one of the recent graduates was available to attend a summer focus group. The event was cancelled a few weeks later, and it was determined that a change of strategy was needed to collect program impact data.

After much consideration, the unit decided to change strategy from a sufficiently sized focus group to a case study consisting of data collected from a smaller group of completers. Those graduates who replied to the focus group invitation with an interest in helping us in the future were contacted individually by email and then by phone to discuss ways in which they might be able to support MBU's efforts to collect their student impact data. Six program completers expressed interest in voluntarily providing student impact data and participating in the case study, and they began emailing a variety of program impact data to be analyzed and included in the case study.


Data Gathering and Analysis Techniques

Six program completers willingly shared student growth and teacher effectiveness data for our research purposes. Table 1 below notes information about the six completers and the data provided.

| Participant/Principal Identifier | Certification Area | Campus Attended while MBU Student | School where Employed | *Free/Reduced % | Grade Level/Subject | Years Experience | Evaluation System used by District | Data Provided |
|---|---|---|---|---|---|---|---|---|
| 1/A | Middle School ELA | Franklin County | Washington Middle | 29.7 | 7th ELA | 2 | NEE | 1. Formative Evaluations |
| 2/B | Secondary Social Studies | Jefferson Community College | Fox High | 32 | Social Studies 9-12 | 3 | NEE | 1. Formative Evaluation 2. Summative Evaluation 3. Unit of Instruction |
| 3/C | Elementary | Main | Jury Elementary | 76 | 3rd | 3 | District-Created | 1. Formative Evaluation |
| 4/D | Secondary Business Ed | Main | McCluer South-Berkeley High | 66.2 | Career/Tech Ed 9-12 | 3 | NEE | 1. Formative Evaluations 2. Student Survey Data |
| 5/E | Elementary | Main | Duello Elementary | 26.6 | Kdg | 1 | Marzano iObservation | 1. ELA Reading Levels |
| 6/F | Middle School ELA | Troy/Wentzville | Rockwood Valley Middle | 9.2 | 7th ELA | 1 | District-Created | 1. Student Work Sample 2. Student Growth Data 3. Formative Evaluation 4. Summative Evaluation |

Table 1 – Case Study Participant and Data Information
*Note: Free/Reduced Lunch percentages from Quick Facts, http://mcds.dese.mo.gov/quickfacts/

The program completers listed above shared a variety of program impact data such as formative and summative classroom observation assessments completed by building administrators, pre/post student growth data, and student survey data. Because these completers work in different school districts that use a variety of student growth and evaluation assessments, the data shared were in different formats. The MBU Education division looks forward to the future when DESE shares student impact data, but until then, we will continue to reach out to program completers each year, collect impact data,


and conduct a case study which will help our division assess our strengths and weaknesses based on these data.

Teacher Evaluation Models

DESE created a state-wide teacher evaluation system called the Missouri Educator Evaluation System (MEES) and required school districts to use the MEES, a similar system, or a district-created system, as long as it included the MEES components. Of the six case study participants, two were evaluated with a district-created teacher evaluation system, three were evaluated with the Network for Educator Effectiveness (NEE), and one completer was evaluated with Robert Marzano's iObservation evaluation system.

According to the Network for Educator Effectiveness (NEE) (n.d.), the NEE is a "comprehensive educator assessment system designed by experts on professional development and assessment within the University of Missouri's College of Education." See examples of results from an NEE evaluation in Appendices B and C. The NEE system records multiple measures of teacher effectiveness and allows administrators to monitor progress using four data sources: classroom observations, units of instruction, teacher-provided professional development plans, and student surveys. According to iObservation (n.d.), the Marzano evaluation product is an instructional evaluation system that is part of a suite of products connecting teacher growth to student achievement; iObservation manages and reports longitudinal data from classroom walkthroughs, teacher evaluations, and teacher observations.

The district-created evaluation systems use different tools, require different numbers of observations, and collect a variety of student surveys and impact data; however, all of the evaluation systems used in this case study are similar in that they all evaluate teacher performance based on the nine Missouri Standards for Professional Educators (MoSPE). Some of the systems include a select number of standards and quality indicators (QIs) identified as school and district quality indicators of focus. Some of the district-created systems include evaluations of all of the standards and quality indicators.

Each participant's distinctly different pieces of evidence were reviewed and analyzed based on the type of data provided. The qualitative data included in the formative and summative evaluations were reviewed for common themes. Specific sections of the evaluations related to evidence of student growth


were noted in the data provided for all six of the case study participants. This qualitative data analysis, in combination with growth data provided by four participants, will be explained in detail and pictured graphically in the following sections. The evaluation and student growth data were compared to determine participants' strengths and weaknesses, and the collective data were reviewed to determine suggestions for program improvements.

Participant 1 Level of Impact on Student Growth

Participant 1 has two years of teaching experience in a middle school ELA/Communication Arts classroom. The school is in a rural district with 29.7% of students receiving free/reduced lunches (Quick Facts, n.d.). She attended classes at the Franklin County Regional Learning Center. Student growth data were not provided; however, she was evaluated by her principal (Appendix B), and two of the four evaluated standards/quality indicators (QIs) are related to student learning: cognitive engagement, and problem solving and critical thinking. Participant 1 scored means of 5 and 4.3, respectively, on a scale of 0 to 7 on QI 1.2 and QI 4.1, indicating that, according to Principal A, she is impacting student learning. The teacher showed growth over time in her ability to engage students in the content and lead students to problem-solving and critical thinking.


[Figure 1 chart: Participant 1 formative observation scores (0-7 scale) from 8/22/2016, 9/27/2016, 10/27/2016, and 1/17/2017, plus the mean, for QI 1.2 (cognitively engages students in the content), QI 4.1 (uses instructional strategies that lead students to problem-solving and critical thinking), QI 5.3B (establishes secure teacher-student relationships), and QI 7.4 (monitors the effect of instruction on the whole class and individual learning).]

Figure 1. Participant 1 – formative observations conducted by principal.

Area(s) of Strength

This participant's area of strength is QI 5.3 – her ability to establish secure teacher-student relationships. According to Principal A on the 8/22/16 formative, "Reinforcement was given to individual students on their goals and (Participant 1) monitored students as they were writing." On 10/27/16, Principal A noted, "Use of humor and setting expectations for how they treat each other. Students were willing to share out their thoughts and willing to take a risk." On 1/17/17, Principal A wrote, "…Students have built a relationship with each other and Participant 1 and are willing to ask questions and work when appropriate."

Area(s) of Weakness

Although Participant 1 scored well in all categories, her lowest scores were in QI 4.1 – The teacher uses instructional strategies that lead students to problem-solving and critical thinking. On 9/27/16, Principal A stated, "Students were doing basic recall of alphabetical order and vocabulary appropriate for this class. This was an appropriate use of time and part of the curriculum." Perhaps the administrator expected Participant 1 to push the students to higher-order critical thinking levels.


Participant 2 Level of Impact on Student Growth

Participant 2 is in her third year of teaching secondary social studies. The high school where she works is located in the suburbs of St. Louis, and the school has a 32% FRL (Quick Facts, n.d.). She attended MBU classes that met on the Jefferson Community College campus. She provided her 15/16 summative classroom observations and her 16/17 formative observations to date, completed by Principal B. She also submitted an evaluation of a unit of instruction. In addition to the participant's scores, this school district also included the mean for each quality indicator for the participant's school, district, and a comparative group. Quality Indicators 1.2, 4.1, and 7.4 are directly related to student growth and development, and Participant 2's evaluations indicate that she is making a strong level of impact on student growth. Principal B noted on his summative evaluation on 2/22/16, "Her professionalism, hard work, and quality instructional practice (promoting and assessing student learning) makes her a true asset to our school. 94% of students met learning targets on Unit of Instruction."

[Figure 2 chart: Participant 2's 15/16 summative evaluation mean scores (0-7 scale) compared with the means for Fox Senior High, the Fox C-6 District, and a comparative group, for QI 1.2 (cognitively engages students in the content), 3.1 (implements curriculum standards), 4.1 (uses instructional strategies that lead students to problem-solving and critical thinking), 5.2 (manages time, space, transitions, and activities), 7.2 (uses assessment data to improve learning), and 7.4 (monitors the effect of instruction on the whole class and individual learning).]

Figure 2. Participant 2 – summative evaluation 15/16 conducted by principal compared to others in her district.


[Figure 3 chart: Participant 2's 16/17 formative evaluation mean scores (0-7 scale) compared with the means for Fox Senior High, the Fox C-6 District, and a comparative group, for QI 1.2, 4.1, 5.2, and 7.4.]

Figure 3. Participant 2 – formative evaluations 16/17 conducted by principal compared to others in district.

Area(s) of Strength

This teacher's areas of strength are QI 1.2 and 4.1. She clearly exceeds the school, district, and comparison group means in all areas, but she is particularly strong in engaging students in the content and using instructional strategies that lead students to problem-solving and critical thinking. She also scored very high on managing time, space, transitions, and activities, which is typically a weakness for new teachers.

Area(s) of Weakness

This teacher scored well in all categories, but her lowest score was in QI 7.4 – The teacher monitors the effect of instruction on the whole class and individual learning. There was no narrative provided on the 16/17 formative evaluation, so it is unclear how the teacher can improve in this area.


Participant 3 Level of Impact on Student Growth

Participant 3 is in her third year of teaching elementary students. She teaches at a school with 76% FRL in an urban district on the eastern side of St. Louis County (Quick Facts, n.d.). She attended classes on MBU's main campus. This graduate did not provide any student growth data, so the formative evaluation was used to determine her level of impact on student growth. According to the evaluation rubric (Appendix C), Participant 3 used a variety of informal and formal assessments aligned with student goals. According to Principal C, she reflected on the use of assessment models and approaches that were aligned with learning goals and used student input when appropriate. Participant 3 established individualized learning goals for students based on an analysis of data. As noted in quality indicator 7.2b, she used formative assessment data to determine whether instruction resulted in student learning or growth. According to the narrative provided by Principal C, Participant 3 is "using data to plan, collaborate, evaluate and set long-term goals for her students and their learning." Principal C also provided feedback regarding QI 3.1 and curriculum implementation:

Participant 3 began the lesson with a very strong and relevant scaffolding review which included prior learning and instructional targets, and their connections to the new learning target and objective. Tremendous effort and collaboration was used to plan and design the lesson based on prior formative assessment data gathered by the teacher. This proved worthwhile and led to approximately 25% of the students attaining the level of proficiency required for skill mastery, with the remaining 75% of students on the bubble of mastering the skill, as determined by her formative assessments.


[Figure 4 chart: Participant 3 formative evaluation scores from 9/28/16 for QI 3.1 (designs learning experiences that align to curriculum standards), 6.1 (models verbal and nonverbal communication that is correct, effective, and professional), 6.2 (demonstrates sensitivity to culture, gender, intellectual, and physical differences), 7.1 (effectively uses multiple assessment modes and approaches to assess student learning), 7.2 (uses assessment data to improve student learning), and 7.3 (involves students in self-assessment strategies).]

Figure 4. Participant 3 – formative evaluation 9/28/16 conducted by principal.

Area(s) of Strength

This participant's strongest areas, according to this one formative evaluation, are QI 3.1 and 7.1. Quality Indicator 7.1 is The teacher effectively uses multiple assessment modes and approaches to assess student learning. In the principal's written comments for standard 3.1, she noted that Participant 3 presented a very well-planned lesson. The lesson was designed around "MO Grade Level Expectations for 3rd grade; mapped curricula and grade level standards; Aimsweb and eValuate data, and other professional development meetings which focused on the math curriculum." Principal C wrote that this teacher did an exemplary job with her questioning to ensure students were thinking critically. She stated that the teacher conducted a rich math discourse and that she demonstrated math confidence. It was also noted that her classroom management and productivity from students spoke volumes to her lesson quality and the positive relationships she had with all of her students. According to the principal, this was "a very strong and purposeful lesson! Excellent work!"


Area(s) of Weakness

Participant 3 was encouraged to push the lesson further, especially for the 25% of students who needed it because of their early skill mastery. This comment implies that this participant can do more to engage students who have already mastered the lesson objective.

Participant 4 Level of Impact on Student Growth

Participant 4 is in his third year of teaching secondary career and technical education. He teaches in an urban school with 66.2% FRL (Quick Facts, n.d.). This participant attended MBU's main campus. He did not provide student growth data, but many conclusions can be drawn from the many formative evaluations and student survey data (Appendix D) provided. His mean score of 5.8 (scale of 0-7) on QI 1.2 (The teacher cognitively engages students in the content) implies that students' growth is being positively impacted by this teacher. Participant 4 scored consistently high in this category. The students in this teacher's classes gave him the second highest score when asked a question aligned to QI 2.2 (Teacher sets and monitors student goals), which indicates Participant 4 is having a positive impact on student growth, according to his students. Students also rated him highest on QI 2.3 – Teacher provides learning opportunities that are adapted to diverse learners and support the intellectual, social and personal development of all students.


[Figure 5 chart: Participant 4's 15/16 formative principal evaluation scores (0-7 scale) from observations on 9/16/15, 9/21/15, 10/27/15, 11/4/15, 1/11/16, 1/26/16, 2/9/16, and 2/18/16, plus the mean, for QI 1.2 (cognitively engages students in the content), 2.2 (sets and monitors student goals), 5.2 (manages time, space, transitions, and activities), 5.2B (uses effective discipline that promotes self-control), 5.3 (uses strategies that promote kindness and social competence among students in the classroom community), 5.3B (establishes secure teacher-student relationships), and 7.4 (monitors the effect of instruction on the whole class and individual learning).]

Figure 5. Participant 4 – formative evaluations and mean score 15/16 conducted by principal.


[Figure 6 chart: Participant 4's fall 2016 formative principal evaluation scores (0-7 scale) from observations on 8/18/2016, 10/6/2016, 10/12/2016, and 11/21/2016, plus the mean, for QI 1.2 (cognitively engages students in the content), 5.2 (manages time, space, transitions, and activities), 5.2B (uses effective discipline that promotes self-control), 6.1 (uses effective verbal and nonverbal communication), and 7.4 (monitors the effect of instruction on the whole class and individual learning).]

Figure 6. Participant 4 – formative evaluations and mean score fall 2016 conducted by principal.

Page 18 of 56


[Figure 7 chart: Participant 4 student survey questions with mean scores (0-3 scale) aligned to MoSPE standards 1 and 2 (QI 1.1-2.5), covering items such as "This teacher expects us to think a lot and concentrate in this class," "This teacher tells us the goals for each lesson," "This teacher shows us or gives examples of what we are supposed to learn or do," "This teacher helps us become better learners," and "This teacher helps us treat people who are different with respect."]

Figure 7. Participant 4 – student survey scores aligned to MoSPE standards 1 and 2.


[Figure 8 chart: Participant 4 student survey questions with mean scores (0-3 scale) aligned to MoSPE standards 4-7 (QI 4.1-7.4), covering items such as "This teacher uses lots of different things to help us learn (such as readings, maps, or objects)," "This teacher makes us explain our answers," "This teacher knows me and cares about me," "This teacher encourages us to be kind and help each other," and "This teacher teaches us how to judge our own progress in this class."]

Figure 8. Participant 4 – student survey scores aligned to MoSPE standards 4, 5, 6 and 7.



Area(s) of Strength

The data indicate that Participant 4 has many strengths. QI 1.2 (The teacher cognitively engages students in the content) is one of his strengths. Principal D noted that all students were engaged during her observation. QI 5.2 (The teacher manages time, space, transitions, and activities) was Participant 4's greatest area of improvement from 15/16 to 16/17, as he went from a mean of 5.3 to 7.0. It is important to note, however, that there was a different evaluator during the evaluation that produced this increased mean score. Participant 4 went down slightly in his mean score on QI 5.2b – The teacher uses effective discipline that promotes self-control. However, the narrative included, "Teacher addressed an off-task behavior by conferencing briefly with the student. Teacher addressed students with dignity and respect; hence the same is reciprocated." She also discussed that students appeared to automatically get into the routine of the class. On a different date, Principal D wrote, "When students did not get in groups initially, teacher gave the students 60 seconds to get into their groups. Teacher maintained a positive and firm tone that worked effectively with the students, as they began to self-correct their behavior without further directions from the teacher." She also noted that the teacher returned questions with questions so as not to spoon-feed the answers. Participant 4 also had a high mean on QI 5.3 – The teacher uses strategies that promote kindness and social competence among students. The students in Participant 4's classes rated him highest on QI 2.2 – Teacher sets and monitors student goals; QI 2.3 – Teacher provides learning opportunities that are adapted to diverse learners and support the intellectual, social and personal development of all students; QI 5.3b – The teacher establishes secure student/teacher relationships; and QI 4.2b – The teacher uses instructional resources appropriately to enhance student learning.

Area(s) of Weakness

According to formative evaluations, Participant 4 could improve in QI 2.2, 5.2b, and 5.3b; however, it is important to note that these quality indicators were only evaluated one time. In 15/16, the district evaluated QI 2.2 – The teacher sets and monitors student goals. Although Participant 4 had a mean score of 5.0 on a scale of 0-7, this was his lowest category, which implies there is room for growth in this


area. The students scored Participant 4 lowest on QI 1.5, 4.2, and 7.3. Participant 4 can improve in the area of involving students in self-assessment strategies.

Participant 5 Level of Impact on Student Growth

Participant 5 is finishing her first year of teaching kindergarten students in a suburban district with 26.6% FRL (Quick Facts, n.d.). She attended classes at the Troy/Wentzville Regional Learning Center. The district tracks ELA reading levels for students, and Participant 5 provided a copy of her students' 2016/2017 ELA levels. The results of the data (Appendix E) indicate that she impacted students' reading levels in the following ways. In August/September the students were given the Developmental Reading Assessment (DRA) benchmark test; on the Continuum of Performance (COP), the students scored between 2 and 3, and the benchmark for kindergarten students on this assessment is A-3. A comparison of the DRA scores and the Fountas & Pinnell (F&P) assessment was also provided. The Fountas & Pinnell (Efficacy Studies, Research Base and White Papers on Validity and Reliability, n.d.) is another benchmark assessment, but for K-6; kindergarten levels are A-D and 1st grade levels are E-J. As of March 4th, some students scored above grade level according to the F&P, and no student scored below grade level. The 17 remaining students had a reading score on grade level, at the B or C benchmark levels on the F&P. These data reflect that Participant 5 is making a positive impact on student achievement with this group of students. That all kindergarten students in the class are scoring at or above grade level in March is excellent. In October, November, January, and February there were a few students scoring below grade level, so those students have improved more than expected, now scoring B or even C on the benchmark assessment.
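To illustrate the kind of tallying behind the monthly above/at/below counts summarized in Figure 9, the following is a minimal hypothetical sketch; the benchmark cut points and student levels in the code are illustrative assumptions, not the district's actual data.

```python
# Minimal illustrative sketch (hypothetical data): classify kindergarten F&P
# reading levels as below, on, or above grade level and tally them by month,
# the same kind of count shown in Figure 9.

# Assumption for this sketch: kindergarten "on grade level" spans levels A-D,
# so levels before A count as below and levels after D count as above.
ORDERED_LEVELS = ["Pre-A", "A", "B", "C", "D", "E", "F"]
ON_GRADE = {"A", "B", "C", "D"}

def classify(level: str) -> str:
    if level in ON_GRADE:
        return "on"
    return "below" if ORDERED_LEVELS.index(level) < ORDERED_LEVELS.index("A") else "above"

# Hypothetical class snapshots: month -> list of student F&P levels.
snapshots = {
    "January": ["Pre-A", "A", "A", "B", "B", "C"],
    "March":   ["B", "B", "C", "C", "D", "E"],
}

for month, levels in snapshots.items():
    counts = {"below": 0, "on": 0, "above": 0}
    for level in levels:
        counts[classify(level)] += 1
    print(month, counts)
```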


[Figure 9 chart: number of Participant 5's students reading above, at, or below grade level at baseline and in October, November, December, January, February, and March of 16/17.]

Figure 9. Participant 5 – number of students reading above, at, or below grade level 16/17.

Area(s) of Strength

Participant 5 provided her students' reading levels, which indicate that the teacher has strength in the area of teaching her students how to read.

Area(s) of Weakness

This participant did not provide any formative or summative principal evaluations. It is difficult to infer weak areas from this limited data.

Participant 6 Level of Impact on Student Growth

Participant 6 attended MBU's main campus while a student and has just finished her first year of teaching middle school Language Arts in an affluent suburban school district; her middle school has 9.2% FRL (Quick Facts, n.d.). According to the student growth data noted below, Participant 6 had a positive level of impact on the reading levels of the students in her classroom. She provided a cross-section of


student data indicating that the teacher positively impacts students of various genders and ability levels. Students in her building take a Star Reading test three times each year. According to the assessment's publisher (Guide more reading growth in less time, n.d.), the Star Reading program measures students' understanding of multiple reading skills across a variety of domains: foundational skills, reading informational text, reading literature, and language. The female, male, and challenge students all improved in growth percentile, grade equivalent, and estimated oral reading fluency.

The evaluation used in this district includes evaluations of all nine standards and most quality indicators. According to formative and summative evaluations, Participant 6 scored highly on the following quality indicator related to student growth: QI 2.3 – The teacher designs differentiated lessons. Although creating differentiated lessons does not always equate to an increase in student achievement, Participant 6's student growth data supports the connection, as she planned and implemented lessons to meet the needs of diverse learners. Principal F noted the following comments which support her level of impact on student growth: "Students were observed using a handout about transition words as they wrote. They also actively used their outlines to develop well-constructed body paragraphs. Students are demonstrating the ability to identify relevant, robust evidence to support their claims. In addition, they are demonstrating the ability to select organizational structures that work best for them indicating a greater level of maturity in and ownership for their writing." Principal F noted on the summative evaluation that the teacher "has developed in her students an understanding of the value of resources to support their reading and writing."

Participant 6 consistently stretches her students to demonstrate higher-order thinking skills. The twist in the character analysis papers related to The Outsiders and her constant challenge to "prove it" for every answer or claim are strong examples of how Participant 6 formally and informally requires critical thinking from her students. Participant 6's evaluations also indicate effective use of assessments that lead to student learning. Principal F noted, "Participant 6 has consistently demonstrated the ability to give specific, substantive information about the progress of her students. Her classroom instruction reflects the strengths and needs evidenced in her formal and informal assessment data. Participant 6 has also demonstrated the ability to analyze an assessment to determine if it is indeed measuring what she set out to measure. She has


eliminated or revised assessments when appropriate. Participant 6 brought forward several samples of student work that demonstrated significant growth across each of the essential curricular standards."
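As a hedged illustration of how the per-student growth shown in Figures 10-12 might be computed from three Star Reading administrations, the sketch below uses hypothetical grade-equivalent scores; the student labels follow the figure legends, but the numbers are placeholders, not Participant 6's actual data.

```python
# Minimal sketch (hypothetical data): compute each student's change in Star
# Reading grade-equivalent score from the fall baseline to the spring test,
# the kind of comparison summarized in Figures 10-12.

# Hypothetical grade-equivalent scores keyed by test date (fall, winter, spring).
grade_equivalents = {
    "Female A":         {"2016-08-26": 6.4, "2016-12-16": 6.8, "2017-04-06": 7.1},
    "Male C":           {"2016-08-26": 5.9, "2016-12-16": 6.2, "2017-04-06": 6.6},
    "Male Challenge E":  {"2016-08-26": 5.1, "2016-12-16": 5.6, "2017-04-06": 6.0},
}

for student, scores in grade_equivalents.items():
    dates = sorted(scores)                 # ISO dates sort chronologically
    baseline, spring = scores[dates[0]], scores[dates[-1]]
    print(f"{student}: {baseline} -> {spring} "
          f"(growth of {spring - baseline:+.1f} grade levels)")
```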

[Figure 10 chart: growth percentile values from Star Reading tests on 8/26/16, 12/16/16, and 4/6/17 for students Female A, Female B, Male C, Male D, Male Challenge E, and Female Challenge F.]

Figure 10. Participant 6 – growth percentile of students' reading levels from STAR assessment.


[Figure 11 chart: grade-equivalent values from Star Reading tests on 8/26/16, 12/16/16, and 4/6/17 for students Female A, Female B, Male C, Male D, Male Challenge E, and Female Challenge F.]

Figure 11. Participant 6 – grade equivalent of students' reading levels from STAR assessment.


[Figure 12 chart: estimated oral reading fluency/Lexile values from Star Reading tests on 8/26/16, 12/16/16, and 4/6/17 for students Female A, Female B, Male C, Male D, Male Challenge E, and Female Challenge F.]

Figure 12. Participant 6 – estimated oral reading fluency lexile of students' reading levels from STAR assessment.

[Figure 13 chart: Participant 6's 16/17 formative (3/3/17) and summative observation scores (1-7 scale) for MoSPE quality indicators 1.1 through 4.3 (e.g., content knowledge and academic language, student engagement in subject matter, student goals, differentiated lesson designs, implementation of curriculum standards, lessons for diverse learners, and instructional strategies leading to student engagement in problem-solving and critical thinking). Most scores were 4 (Developing), with a few 5s (Proficient).]

Figure 13. Participant 6 – formative and summative evaluations 16/17 conducted by principal aligned to standards 1, 2, 3, and 4.


[Figure 14 chart: Participant 6's 16/17 formative (3/3/17) and summative observation scores for MoSPE quality indicators 5.1 through 9.3 (e.g., classroom management techniques; management of time, space, transitions, and activities; verbal and nonverbal communication; sensitivity to culture, gender, intellectual and physical differences; effective use of assessments; assessment data to improve learning; student-led assessment strategies; communication of student progress; professional learning; collaborating to meet student needs; and cooperative partnerships in support of student learning). Scores were mostly 4s with some 5s.]

Figure 14. Participant 6 – formative and summative evaluations 16/17 conducted by principal aligned to standards 5, 6, 7, 8 and 9.



Area(s) of Strength

For a first-year teacher, Participant 6 has many strengths, as she scored in the Developing-4 category in most areas. This district's evaluation categories are: Emerging-1, Emerging-2, Developing-3, Developing-4, Proficient-5, Proficient-6, and Distinguished-7. Her highest score of Proficient-5 was in QI 2.3 – Differentiated lesson design. In the summative evaluation, Principal F noted, "Participant 6 takes care to check on his (a struggling student's) progress/understanding frequently providing specific next steps he should complete before the next progress check. She understands that while this is not the need of every student in her class, it is what he needs so she works with him in this fashion. Participant 6's students respect her and trust her to help them build the skills they need to be successful because they see the care and effort she puts into her instruction and feedback. She invests time in knowing her students beyond their work engaging them in conversations with ease. She is able to embed students' interests, experiences, and aspirations into her instruction and also uses them as leverage points making learning meaningful to her students."

Participant 6 also scored highest on QI 5.3 – The teacher creates a learning environment (classroom culture) that encourages active engagement in learning. Principal F noted in her evaluation that Participant 6 "uses classroom meetings to build a positive classroom community. This practice has contributed to a classroom environment in which students feel safe to call out the problems they see in any scale or to celebrate the things that are going well. Because students' voices are validated, they are using them more and more to express their needs."

Another one of this teacher's strengths is QI 6.2 – Sensitivity to culture, gender, intellectual and physical differences. Principal F noted that Participant 6 has supported two ELL students new to this country this year. Both students have demonstrated significant growth under the leadership of Participant 6. The safe and nurturing environment of her classroom has given these students the space they needed to adjust, and now their true abilities are beginning to clearly shine. Participant 6 holds a deep commitment to building positive communication skills in her students, so she is very intentional in modeling communication skills that lend themselves to fostering relationships by helping others feel heard and understood. The principal's summative final comments (Part 6 principal summative) included,


"Participant 6 demonstrates an appropriate level of performance for a first year teacher in each of the 9 standards. She has shown great initiative in the standard of effective communication through her implementation of class meeting practices and professionalism as she has sought learning opportunities tied to the components most integral to her approach to teaching.�

Area(s) of Weakness

Participant 6 can continue growing in all standards and quality indicators, as she has not reached Distinguished in any category. However, Principal F noted that, at this early point in her career, she can focus on training to prepare for Read 180, as the school will be implementing this program in the 2017/2018 school year.

Executive Summary & Program Recommendations

The purpose of this case study was to determine the level of impact a group of MBU program completers in their first, second, or third year of teaching are having on their PK-12 students, and to use those results to verify that MBU's initial certification program completers are making a positive impact on student learning and to reveal program strengths and weaknesses. This study attempted to answer the following questions:
1. What is the level of impact that Missouri Baptist University program completers are having on their PK-12 students?
2. Based on student impact and teacher effectiveness data, what are the areas of strength and weakness for the program completers?
3. How might MBU make changes to the initial certification programs to improve the level of program completer impact on their PK-12 students?

The data collected and analyzed from six recent graduates of MBU's EPP have been reviewed and indicate that the new teachers are relatively successful in almost all standards and quality indicators that were evaluated. The data shared with the University and analyzed in this case study clearly demonstrate that MBU program completers have a satisfactory level of impact on their PK-12 students. The


data collected were in a variety of formats; nonetheless, the data were reviewed, and qualitative text analysis revealed an adequate or higher level of impact on PK-12 students. The completer data tell a story about our completers' ability to effectively apply the professional knowledge, skills, and dispositions that our program was designed to achieve. There was only one quality indicator that appeared in more than one participant's areas of strength: QI 1.2 – The teacher cognitively engages students in the content. Participants 2 and 4 were strong in this area. Interestingly, all other areas of strength were singular for each participant. This phenomenon might indicate the challenge of comparing completers who work in different school districts that use a variety of evaluation systems and/or schools and districts that select various quality indicators as areas of focus for any given year. Similarly, this case study did not reveal consistent weaknesses across the six participants. Quite the opposite: there was not one weak quality indicator repeated for more than one participant. However, many themes emerged from each individual case that will guide our division as we move forward to provide a better preparation experience for our students so that they can be as effective as possible at impacting student growth in their future classrooms.

Program Recommendation #1: The MBU Education division faculty should audit the curriculum to make sure course objectives are addressing monitoring and setting student goals. Based on evaluation feedback, completers need to know how to conduct goal setting activities for their students.

Program Recommendation #2: The MBU Education division faculty should examine curriculum in EDRD 423/523 Teaching Reading in the Content Area to ensure the syllabus contains objectives and activities that allow future teachers to analyze student reading levels from various reading tests, such as STAR, DRA, etc.

Program Recommendation #3: The MBU Education division faculty should discuss ways in which to collect impact data on an ongoing basis. Perhaps the division could create a Spartan Impact Research Team that meets regularly to collect and analyze completer data, present findings at faculty and advisory council meetings, and suggest program improvements to the faculty based on the results of the data.


Conclusion

This case study has had a significant impact on the Education division at MBU, as it is our first major step toward utilizing completer data to determine our program's effectiveness. As accountability measures keep increasing in the field of educator preparation, we hope that state departments of education will support EPPs that need to analyze completer data for program effectiveness. Requesting data from recent graduates did provide valid and reliable qualitative data for a small case study for the EPP to examine; however, if state departments of education would share student impact data with EPPs, we could draw generalizable conclusions that could generate more powerful and sustainable program changes.

References

Efficacy Studies, Research Base and White Papers on Validity and Reliability. (n.d.). Retrieved June 05, 2017, from http://www.fountasandpinnell.com/research/

Guide more reading growth in less time. (n.d.). Retrieved June 05, 2017, from http://www.renaissance.com/products/assessment/star-360/star-reading-skills/

iObservation. (n.d.). Retrieved June 05, 2017, from http://www.iobservation.com/

Network for Educator Effectiveness (NEE). (n.d.). Retrieved June 05, 2017, from https://nee.missouri.edu/

Quick Facts. (n.d.). Retrieved June 05, 2017, from http://mcds.dese.mo.gov/quickfacts/SitePages/DistrictInfo.aspx


Appendix A
Spartan Teacher Impact Research Phase-In Plan (Summer 2016 through Summer 2018, with status noted for each evidence piece)

CAEP Component 4.1 Impact on P-12 Student Learning
- DESE Administered Principal Survey: 14/15 & 15/16 DESE Principal Survey Question 39b, comparison of MBU to state averages (Complete); 16/17 DESE Principal Survey Question 39b, comparison of MBU to state averages (will be analyzed in early summer 2018).
- MBU Case Study: 16/17 MBU Case Study, collect student growth and teacher effectiveness data from 1st, 2nd, and 3rd year teachers (Complete); 16/17 Case Study - Spartan Teacher Impact Research Study (Complete); 17/18 MBU Case Study data collection (data will be collected Fall 17/Spring 18); 17/18 Case Study - Spartan Teacher Impact Research Study (will be analyzed and written in early summer 2018).
- MBU Administered Principal Survey: 16/17 MBU Principal Survey of Case Study Principals (Complete); 17/18 MBU Principal Survey of Case Study Principals (will be analyzed in early summer 2018).

CAEP Component 4.2 Indicators of Teaching Effectiveness
- DESE Administered Principal Survey: 14/15 & 15/16 DESE Principal Survey questions 1-39 and 41, comparison of MBU to state averages (Complete); 16/17 DESE Principal Survey questions 1-39 and 41 (will be analyzed in early summer 2018).
- MBU Case Study: 16/17 MBU Case Study data collection (Complete); 16/17 Case Study - Spartan Teacher Impact Research Study (Complete); 17/18 MBU Case Study data collection (Fall 17/Spring 18); 17/18 Case Study (will be analyzed and written in early summer 2018).
- MBU Administered Principal Survey: 16/17 MBU Principal Survey of Case Study Principals (Complete); 17/18 MBU Principal Survey of Case Study Principals (will be analyzed in early summer 2018).

CAEP Component 4.3 Satisfaction of Employers
- DESE Administered Principal Survey: 14/15 & 15/16 DESE Principal Survey questions 1-39 and 41, comparison of MBU to state averages (Complete); 16/17 DESE Principal Survey questions 1-39 and 41 (will be analyzed and written in early summer 2018).
- MBU Case Study: 16/17 MBU Case Study data collection (Complete); 16/17 Case Study - Spartan Teacher Impact Research Study (Complete); 17/18 MBU Case Study data collection (Fall 17/Spring 18); 17/18 Case Study (will be analyzed and written in early summer 2018).
- MBU Administered Principal Survey: 16/17 MBU Principal Survey of Case Study Principals (Complete); 17/18 MBU Principal Survey of Case Study Principals (will be analyzed in early summer 2018).

CAEP Component 4.4 Satisfaction of Completers
- DESE Administered 1st, 2nd, 3rd Year Teacher Survey: 14/15 & 15/16 DESE Principal Survey questions 1-50 & 52, comparison of MBU to state averages (Complete); 16/17 DESE Principal Survey questions 1-50 & 52 (will be analyzed and written in early summer 2018).
- MBU Focus Group: 15/16 Focus Group - invited 1st, 2nd, and 3rd year teachers to campus for a focus group (only 1 student available; event canceled; decided to collect data through case study); 16/17 Focus Group - invited 1st, 2nd, and 3rd year teachers (event scheduled 7/25/17); 17/18 Focus Group - invite 1st, 2nd, and 3rd year teachers (will take place July 2018).
- MBU Administered 1st, 2nd, 3rd Year Teacher Survey: 16/17 MBU New Teacher Survey of Case Study Participants (Complete); 17/18 MBU New Teacher Survey of Case Study Participants (will be analyzed in early summer 2018).



Appendix B
Participant 1 – Formative Evaluations



Appendix C
Participant 3 – Formative Evaluation



Appendix D
Participant 4 – Student Survey Details



Appendix E
Participant 5 – Student Growth Data – Reading Levels
