

Running head: CONTINUING TO ASSESS TECHNOLOGY

Continuing to Assess Technology in Math Classroom for State Assessments

Jeremy Hendrix
Brett Tracy

Copyright Jeremy Hendrix, Brett Tracy 2009


Abstract

Every Arizona student must pass the state-mandated Arizona Instrument to Measure Standards (AIMS) test before graduating from high school. Ensuring that weaker students pass this test can seem impossible because of their lack of involvement in their own learning, their teachers' lack of caring, or the failure to conduct a proper review before the test. A central problem in preparing for the AIMS test is providing immediate feedback on practice tests in the classroom setting. This feedback is vital to teachers and students alike because it makes them aware of deficiencies as students prepare to take the AIMS test. A dull and uninviting environment keeps student participation and engagement in lessons below desirable levels. Students need to engage with material in a way that they find stimulating and entertaining if they are to succeed. To assist in preparation for AIMS, interactive technology was implemented in a math classroom in an Arizona high school for a two-year period to determine whether a specific combination of technologies would affect student success on the AIMS assessment. Results over the investigated period showed that when students received immediate feedback and remediation while preparing for the AIMS test, they were more engaged, which led to higher assessment scores and ratings.


Introduction

While teachers prepare students to take state-mandated tests, both teachers and students are usually unaware of how well the students are acquiring the skills and knowledge needed to pass these tests. The passage of time during which no feedback occurs can be detrimental to students' development and progress while preparing for state-mandated testing. Once it is discovered that students have fallen behind, the time needed to go back and review until mastery occurs delays both the teacher's and the students' assessment of the students' knowledge of the material. The teacher can become frustrated at having to review material again and feel pressured to get back on a previously set timeline meant to cover all of the concepts required for state-mandated testing. The students, in addition to having their development and progress stifled, can become frustrated with the material and stop learning.

With the development of new interactive technology, student engagement, participation, and performance can be monitored and assessed quickly. These technology tools, such as interactive whiteboards (SmartTech's Smartboard) and student response systems (eInstruction's Classroom Performance System), allow teachers to create dynamic lessons and assess student learning immediately. These new tools encourage students to become engaged as active participants in their own learning, and allow them to maintain a comfort level with the material that is being presented.

There have been several research studies conducted on student engagement and participation in education over the past several years. Such research has demonstrated a direct correlation between student success and the amount of time students are actively engaged (Wu & Huang, 2007). However, research studies focused on outcomes when technology is infused into the course have occurred primarily at the post-secondary level. At the conclusion of a previous study on the topic (Tracy & Hendrix, 2008), the researchers recommended a study in which

one teacher collects all of the data [in order to] take the element of different teachers and their styles out of the equation. By incorporating this modification, the data should solidify the idea that technology tools which focus on engagement and participation will have an impact on student assessment. (Tracy & Hendrix, 2008)

The purpose of this study is to incorporate this recommendation in a high school math class to determine whether interactive technology affects student learning at the secondary level and successfully prepares students for state-mandated testing.


Context

The community where the study took place is located in central Arizona in the southwestern United States. In 2002, the State of Arizona mandated the Arizona Instrument to Measure Standards (AIMS) as a requirement to ensure that its student population met a satisfactory performance level in order to graduate with a high school diploma (Arizona Revised Statutes, Section 15, Article 701). The study was conducted in a school district comprising nine traditional high schools and one alternative high school. The district serves approximately 15,000 students with 900 teachers and administrators. The school district's mission statement is to "Empower All Students for the Choices and Challenges of the Twenty-first Century." In the 2008 Arizona LEARNS school profiles, seven of the high schools in this district were designated "Excelling," the highest possible ranking. The remaining two traditional schools received "Highly Performing" labels, the second-highest ranking.

The school where the study took place has approximately 1,700 students on its campus and was classified as "Excelling." It was recently re-classified as a Title I school, in which 60% of the students qualify for free and reduced lunch. The school's mission statement is "Every student can learn." There were approximately 450 sophomore students, the focal group of the study, in the school in each of the last two years. The mathematics department, the academic area on which the study focused, consists of twelve highly qualified mathematics teachers, five of whom teach sophomore-level mathematics. The AIMS review materials used during the study were used by all five teachers.



During the first year of research, 54 students participated in the study, compared with 515 students who took the AIMS math test. In the second year of research, 56 students participated in the study, compared with 460 students who took the AIMS math test. In the study, the two "experimental" groups were compared with each other as well as with the rest of the student population, the "control group," during the appropriate academic years. The technologies available to the sophomore teachers are largely the same: classrooms have a computer at the teacher's station, a projector, and an interactive whiteboard. The one exception is the Student Response System, which was used only by the students in the experimental groups. The Student Response System manufactured by eInstruction is a classroom set of handheld devices that allows students to communicate acquired concepts to their teachers in real time. Teachers can assess these acquired skills in a test format or during a classroom lecture. This interactive technology promotes student engagement, participation, and knowledge acquisition. Currently, eInstruction's Classroom Performance System is used by over 4 million students in more than 100,000 K-12 and higher education settings (einstruction.com, 2009).


Theoretical Framework

Change is constantly occurring in all areas of life. School districts and individual schools are no exception, and there have been many changes in school districts, schools, and classrooms. One of the more prominent changes has focused on technology availability and integration. In many areas, money for technology integration has not been distributed equally among states, school districts, schools, or classrooms (Snyder & Prinsloo, 2007). Because of this disparity of technology within schools and classrooms, the pressing issues of what technology to purchase, how it is implemented, and how it can change instruction for students and teachers have begun to come to the fore. In 2002, the United States federal government passed the No Child Left Behind Act (Snyder & Prinsloo, 2007), which includes (Title II, Section D) a stated goal of assisting all students to cross the "digital divide" to ensure that they become technologically literate (Snyder & Prinsloo, 2007). How can school districts and schools help students become technologically literate if they cannot purchase and implement technology in their classrooms? These events place the successful use and implementation of technology in schools and classrooms squarely on the shoulders of administrators, teachers, and instructors. Teachers and instructors will benefit from using technology in their classrooms if their students are motivated and use technology in class.

Great care must be taken when adopting new technologies into classrooms because they can interfere with the learning process (Hashemzadeh & Wilson, 2007). In order to avoid this interference, some classroom materials and philosophies would have to be abandoned in favor of a total redesign of the course (Hedberg, 2006). These new courses would have to extend beyond the current implementation, moving beyond merely passing information to students and instead making students more interactive with the information (Hedberg, 2006).


Los Angeles-based educator and 1-to-1 computing advocate Gary Stager believes that when educators become aware of new ways for students to learn, they also realize that many of the traditional ways we expect students to learn are ineffective (Wambach, 2006). Christensen's idea of disruptive innovations (Hedberg, 2006) describes how these changes with technology can occur:

A disruptive innovation or technology is one that eventually takes over the existing dominant technology in the market, despite the fact that the disruptive technology is both radically different from the leading technology and often initially performs less successfully than the leading technology according to existing measures of performance; over time, the functionality or the attributes of the new way of doing things replace the other technologies.

This idea of disruptive innovation can be extended to the notion that, in order to achieve specific desired effects, many tools, rather than one, may need to be employed: the teacher uses these technical resources to create a range of interactive activities that motivate students to commit more time and energy to learning (Hedberg, 2006).

There have been many research studies conducted on student engagement, participation, and performance in education over the past several years. Research that has focused specifically on student participation and engagement has shown a direct correlation between student success and the time students are actively engaged (Wu & Huang, 2007). However, research studies that investigated student participation, engagement, and performance with technology infused into the course have occurred primarily at the post-secondary level. These studies have focused only on the implementation of a single form of technology, instead of integrating multiple technologies into the course. In order to determine whether technology can help improve student engagement, participation, and ultimately performance, all of the technologies need to be investigated as a single form.



One of the systems researched to investigate student engagement, participation, and performance is the remote device that students use to answer questions and provide personal thoughts to the teacher about information being presented during class activities and presentations. These systems, commonly referred to as student response systems, have begun to be implemented in educational settings with varying results. Pemberton, Borrego, and Cohen (2006) noted that, with the use of student response systems at Texas Tech University, final grades were not significantly different between similar groups, but the value of the learning and teaching in the psychology classes that used the student response system was much higher than in the psychology classes that did not use this technology. In addition, a study conducted by Texas Instruments and the University of Texas demonstrated that the ability to respond to questions anonymously created a non-threatening environment in which students felt they were equals, leading to greater participation and engagement in class activities and discussions (Davis, 2003).

Interactive whiteboards are a second form of technology that has been researched to determine its effects on student engagement, participation, and performance. Interactive whiteboards bring about a different form of student interaction with the teacher and with other students. Haldane (2007) noted that although teachers at the beginning of her study did not feel interactive whiteboards would affect their planning and teaching of lessons, by the end of the study they reported that the interactive whiteboard had significantly changed their preparation and teaching practices. She continued, "It is the user of the board who chooses whether or not to take full advantage of the digital whiteboard's interactive potential. The digital board simply provides an opportunity for interactivity to occur" (p. 259). It is this interactivity that allows questions and suggestions to be posed by students, helping to ensure that everyone in the classroom has grasped the concept and can further their own knowledge base. In addition, interactive whiteboards help keep the momentum of the class consistent. Teachers can simply tap an icon to bring up previous or new material to review at any time during class activities. This is in contrast to having to make sure the information is consistently available on the static whiteboard, which may be erased at any time.

Lastly, many forms of technology are still relatively new to education (Moallem, Kermani, & Chen, 2005), which is especially the case for online testing systems. Paper-and-pencil tests traditionally do not show whether students are acquiring the needed knowledge and skills through the instructional methods that their teachers are currently practicing (Woodfield & Lewis, 2003). Because of the time it takes to score paper-and-pencil tests, many important pieces of information are lost by the time the class moves on. If the information were immediately available, teachers could look at test scores quickly and decide whether students had met expectations and new topics could be explored, or whether concepts needed to be re-taught using different methods before moving on, so that the information makes sense at that moment (Woodfield & Lewis, 2003). The ability to provide instantaneous feedback through online testing allows teachers and students to quickly assess their own strengths and weaknesses in their teaching and learning (Moallem, Kermani, & Chen, 2005). This form of testing has proven to be more motivating than traditional test forms (Woodfield & Lewis, 2003). Teachers who have implemented this form of testing have additionally reported that student interest and attention increase when students use the computer to take class tests (Woodfield & Lewis, 2003). In most cases, teachers have reported that purchased online tests are appropriately challenging to all students, measure individual student performance, provide data that can be compared and analyzed for specific purposes, and closely engage students and teachers in the educational process (Woodfield & Lewis, 2003).

Instead of looking at each of the technical systems listed above individually, if a teacher were to look at them as a whole, Christensen's idea of disruptive innovation would allow for an investigation of their impact on student motivation, participation, and performance. In order for successful tasks to occur with multiple forms of technology, the following properties need to be in place within the classroom and supported by the teacher: 1) the tasks to be completed are complex enough to engage students in upper-level thinking and performance skills; 2) they exemplify authentic work and learning; 3) they are open enough to encourage different learning approaches, but constrained enough that they do not get out of hand; and 4) records and information can be collected and calculated quickly for assessment (Collins, Hawkins, & Frederiksen, 1993). By integrating interactive whiteboards, anonymous response systems, and immediate feedback systems, students can accomplish these tasks while also interacting with the information individually or as a group, practicing self-assessments, taking assessments, and receiving feedback immediately (Collins, Hawkins, & Frederiksen, 1993). All of these pieces together can help facilitate non-dominating conversations, debates, and arguments so that ideas and knowledge can be shaped or reshaped, allowing students to own the information and ensuring success in the future (Masters & Oberprieler, 2004). To confirm that learning, participation, and motivation are occurring, it is suggested that the work discussed be tied to a classroom assignment; this serves as encouragement for students to participate more in class (Masters & Oberprieler, 2004).

With teachers integrating multiple tools to increase participation, motivation, and performance in their classrooms, they now have vast amounts of data about their students at a moment's notice. Teachers who use all of these tools can track how students are interacting and performing in class (Morris, Finnegan, & Wu, 2005). With this information in hand, teachers can then direct students toward areas where refinement and review need to occur (Morris, Finnegan, & Wu, 2005). This immediate directional change by the teacher demonstrates to the students that the teacher is invested in their success in class. Knowing the teacher is invested in them, the students begin to develop an excitement for learning, and their confidence continues to grow so that they are prepared for success on the next homework assignment, quiz, or test, and in the future. Teachers can have an enormous impact on their students by holding them accountable for the material that is being presented.


Method

The district has implemented various forms of technology in classrooms to better prepare students for the AIMS test, including interactive whiteboards and LCD projectors. Despite these technologies, student success on the AIMS math assessment has not improved; in fact, the pass rate has remained relatively unchanged over the last five years, with only moderate increases and decreases. There have been no formal comparison studies to determine whether these technologies have made an impact on students' AIMS test scores, and the technologies have been implemented sparingly rather than used as a whole.

The high school district in which the study school is located has implemented a three-week AIMS preparation module into the curriculum of all sophomore math classes, in conjunction with the technology it had previously implemented. This preparation came about because too many students had scored low on prior AIMS tests. The experimental investigation coincided with the delivery of the AIMS preparation module, which lasts three weeks and includes a pre-test, a post-test, and four review packets. The pre-test and post-test, although proctored by the teachers in their classrooms, are graded and analyzed by the district, and the scores are reported back to the teachers only after a delay. The four review packets consist of multiple-choice questions designed to simulate the AIMS math assessment. Their intended purpose is for students to review questions that have a format, scope and sequence, and level of difficulty similar to the actual AIMS math assessment.

Based on a previous study (Tracy & Hendrix, 2008), the best technology build-out was to incorporate all of the technologies previously implemented by the district and add the Student Response System. With the addition of this interactive technology, teachers and students were able to communicate directly about which concepts had been mastered in preparation for the AIMS math assessment. During the study, the Student Response System was used in testing mode to show the students and the teacher instantly which concepts from the district-created review module had been mastered. From this information the teacher was able to immediately review only the concepts that were not at a mastery level, so that students could continue to progress through the material in a timely fashion. Compared with other teachers, who had to grade the review packets by hand and return them the next day, the Student Response System accomplished this time-intensive task immediately. In addition, the Student Response System provided immediate analysis of student progress, helping the teacher prepare students to focus only on weaker concepts, whereas the teachers who graded by hand were at a disadvantage because they had to compile this information on their own.

In the two study groups, students spent approximately 20 to 30 minutes of the allotted class time working on small portions of the review packets and entering their answers using the Student Response System. The remaining 20 to 30 minutes were spent covering the questions that students had missed. With the Student Response System, the teacher knew exactly which students had missed which questions. This allowed the teacher to direct attention toward the students who needed remediation and away from the students who had already mastered these concepts. During this remediation time the teacher would call on a student who had missed a particular question to work the problem on the interactive whiteboard, and would then poll the class using the Student Response System so the student could get peer feedback on whether the problem had been done correctly. If the problem was done correctly, the teacher would move on to the next problem. If the problem was done incorrectly, another student was randomly selected to rework the same problem on the interactive whiteboard and fix any mistakes. The class was then polled again to confirm mastery of that particular problem. The student originally called to the interactive whiteboard was then questioned to make sure that they understood the mistake they had made and why the correction was needed.

Once an entire review packet was completed, the teacher would post the grades of all students for the class to see, with students' names displayed next to their scores. The teacher made sure to tell the class that every student is there to learn and that nobody is to be made fun of for their score. After posting the overall scores for the review packet, the teacher posted the individual question report, which also displayed students' names along with their previous answers to each question. The teacher then went over the packet again question by question, filling in any gaps that the students may still have had on these problems. This process was repeated for the four review packets as well as the pre- and post-tests.
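The study does not document the Student Response System's internal reporting, but the remediation step described above boils down to a per-question tally of which students answered incorrectly. The Python sketch below illustrates that aggregation under our own assumptions; the mastery threshold, data layout, and function names are invented for illustration and are not eInstruction's actual software or API.

```python
# Illustrative sketch only: the eInstruction system performs this aggregation
# internally. The mastery threshold, answer format, and function names below
# are assumptions made for this example, not the vendor's software.

from collections import defaultdict

MASTERY_THRESHOLD = 0.80  # assumed share of correct answers that counts as "mastered"

def summarize_packet(responses, answer_key):
    """responses: {student: {question_id: chosen_answer}}; answer_key: {question_id: correct_answer}.
    Returns the questions that still need re-teaching, mapped to the students who missed them."""
    missed_by = defaultdict(list)
    for student, answers in responses.items():
        for question, correct in answer_key.items():
            if answers.get(question) != correct:
                missed_by[question].append(student)

    class_size = len(responses)
    needs_review = {}
    for question, students in missed_by.items():
        correct_rate = 1 - len(students) / class_size
        if correct_rate < MASTERY_THRESHOLD:
            needs_review[question] = students
    return needs_review

# Hypothetical mini-example: three students, four review-packet questions.
key = {1: "B", 2: "D", 3: "A", 4: "C"}
answers = {
    "Student 1": {1: "B", 2: "D", 3: "A", 4: "A"},
    "Student 2": {1: "B", 2: "C", 3: "A", 4: "C"},
    "Student 3": {1: "B", 2: "C", 3: "A", 4: "B"},
}
for question, students in sorted(summarize_packet(answers, key).items()):
    print(f"Question {question}: reteach (missed by {', '.join(students)})")
# Question 2: reteach (missed by Student 2, Student 3)
# Question 4: reteach (missed by Student 1, Student 3)
```

In classroom terms, this corresponds to the teacher skipping questions the class has already mastered and calling to the whiteboard only students listed under the remaining questions.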


Findings

It was hypothesized that implementing technology in the reviews for state-mandated testing would keep students engaged and participating, and that their scores would increase by at least 10% over the three-week period. The goal of this study is to show that students improve faster in preparation for state-mandated testing when technology supports the review sessions than when it does not. The use of several technology tools acting as one will not only increase student engagement and participation, but will also increase performance on the state-mandated AIMS math assessment. The students will receive immediate feedback on their learning so they can ask questions for clarification and refinement. All students will achieve at least the "meets" rating on their AIMS assessment, and thirty percent of the students will receive an "exceeding" rating.

The study centered on a group of 54 students (the Year 1 group) and a group of 56 students (the Year 2 group) who took a district pre- and post-assessment. The study measured growth over the three-week periods in which the district review module was implemented. Minimal growth was defined as growth of zero to ten percent, whereas substantial growth was defined as any growth over 10%. Ultimate success is measured by each student's AIMS rating: the number of students who achieve a "meets" or "exceeds" rating on the AIMS assessment determines the overall success of the project. After the state reports the AIMS results back to the district, the researchers can determine whether the intervention was a success.
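As a concrete illustration of this growth measure, the short sketch below (in Python, with two sample rows taken from Appendix A) computes and classifies growth exactly as defined above; the function names are ours, not part of the study.

```python
# Minimal sketch of the growth measure used in the study: growth is the change
# in percent score from the district pre-test to the district post-test, with
# 0-10 points counted as minimal growth and anything over 10 as substantial.
# The two sample rows come from Appendix A; the function names are ours.

def growth(pre_pct, post_pct):
    """Change in percent score over the three-week review module."""
    return post_pct - pre_pct

def classify(delta):
    """Label growth per the study's definition (0-10 minimal, over 10 substantial)."""
    return "substantial" if delta > 10 else "minimal"

for pre, post in [(78, 88), (46, 96)]:
    delta = growth(pre, post)
    print(f"pre {pre}% -> post {post}%: +{delta} points, {classify(delta)} growth")
# pre 78% -> post 88%: +10 points, minimal growth
# pre 46% -> post 96%: +50 points, substantial growth
```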

The study participants not only met the desired outcomes that were set forth, they exceeded them. With the substantial-growth goal set at 10%, only one of the 54 students in study group #1 (Appendices A and B) and three of the 56 students in study group #2 (Appendices C and D) showed less than substantial growth. The highest growth in study group #1 was 50%, while the highest in study group #2 was a 61% increase in scores during the three-week study window. The average growth for all students in study group #1 was 23%, and in study group #2 it was 29.97% (see Appendices E-J for comparison data). This result was unexpected, and it indicates that the use of technology the students are familiar with had a profound impact on the district-reported scores.

The last part of this study was to see whether this interactive technology would ultimately have a profound impact on the AIMS math assessment. While there is no pre-test to show growth on the AIMS math assessment itself, the study group can be compared with the entire school population. Quantitative data analysis demonstrated that the reviews associated with the implemented technologies helped students receive higher scores on the AIMS math assessment than students whose reviews did not use the implemented technology. Of the 54 students who participated in the study during year 1, zero failed or approached the standard on the AIMS assessment; all 54 students (100%) met the minimum passing score. The average score of this group was 747, with 900 being a perfect score. Of the 54 students, 25 received an "exceeding" rating, the highest rating on the assessment. The remaining 29 students received the "meets" rating, the second-highest rating; of these 29 students, 11 missed the "exceeding" rating by one correct response. In all, 46% of the students "exceeded" on the AIMS math assessment. Across the school's entire student population, only 87 students received an "exceeding" rating on their AIMS test, which means that 29% of the school's total exceeding population came directly from the study group (see Appendices K, L, and M for AIMS data).
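The two headline percentages in this paragraph follow directly from the counts reported in the text; the brief worked check below (Python) reproduces them using only those published counts.

```python
# Worked check of the Year 1 percentages reported above, using the counts
# given in the text (not raw student records, which are embargoed).

study_group = 54          # students in study group #1
study_exceeds = 25        # study-group students rated "exceeding"
school_exceeds = 87       # all students at the school rated "exceeding"

pct_exceeds = study_exceeds / study_group * 100         # 46.3%
share_of_school = study_exceeds / school_exceeds * 100   # 28.7%

print(f"{pct_exceeds:.0f}% of the study group exceeded")            # 46%
print(f"{share_of_school:.0f}% of all 'exceeding' ratings school-wide "
      f"came from the study group")                                 # 29%
```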



Of the 56 students who participated in the study during year 2, zero failed or approached the standard on the AIMS assessment; all 56 students (100%) met the minimum passing score. The average score of this group was 756, with 900 being a perfect score. Of the 56 students, 27 received an "exceeding" rating, the highest rating on the assessment. The remaining 29 students received the "meets" rating, the second-highest rating; of these 29 students, 7 missed the "exceeding" rating by one correct response. In all, 48% of the students "exceeded" on the AIMS math assessment. Included in these scores was one student who scored a perfect 900 on the AIMS math assessment. Due to a state embargo on the school and district data, we do not at this time have access to any other students' scores (see Appendices K, L, and M for AIMS data).

When study group #1 is compared with study group #2, there was an average increase of nine points. In study group #2 the average score was six points above the score of 750 required for an "exceeding" rating, whereas in study group #1 the average score was three points below that required score. Qualitative data analysis suggests that this increase in average raw score is due to higher expectations from the teacher at the beginning of the review module, greater familiarity with the technology and the method, a change from a teacher-centered model to a student-centered model, and class buy-in resulting from knowledge of study group #1's results.

The initial growth data alone was striking enough to call this research a success, and once the actual AIMS numbers are added, there is a remarkable difference in how this technology affected the way students can prepare for state-mandated testing. The research shows that the study groups learned the information presented to them and retained it for a longer period of time. The study groups were able to take the information and apply it to a high-stakes test regardless of any predisposed fear of testing. They used the knowledge gained through the technology to achieve something that the researcher had never thought possible.


Implications

The researchers learned many things from this study. First, they learned that regardless of which students you have, if you give them the necessary tools to be successful, they can and will be successful. Educators need to look at which tools students can benefit from the most and try to create a learning environment in which students can achieve greater success. Second, the researchers learned that technology is already all around us, and we must embrace it to help our technology-savvy students learn in the same way they play. Students use technology all the time; as teachers, we do our students a disservice by taking away the one thing that keeps them going.

Although the use of technology, the desired outcomes, and the way the data was obtained never changed, the class structure and instruction were modified over time to meet student needs. In the first year of the research, the researcher followed a teacher-centered model in which he would constantly try to help students using a simple explanation of how to do the problems. In the second year, the researchers let the students explain more of how they did the problems and why they did them that way. Instead of a teacher-driven review session as in year one, the second year became a student-centered review session in which the students provided most of the input to other students. This change in instruction may explain why students performed better earlier in the study window in the second year than in the first.

The student-centered model also enhanced student accountability. The students in the first year of the study were accountable only to themselves, and it takes a special student to remain on task and keep focus for the entire length of the study. From the researchers' perspective, posting student scores without names for all to see created a level of accountability without the humiliation of other students seeing one's name. Doing this raised students' anxiety one step further and gave them a sense of accountability to the other students in the room. It goes back to the old saying that it takes a community to raise a child: in this respect, the students are being allowed to help each other. The researchers foresaw weaker students seeking out stronger students for assistance and clarification in order to improve their own scores. This kind of transparency is generally not well received by the educational community and is often looked down upon; the key is to maintain an open learning environment in which all students can be successful.

In conducting further research using this same type of technology and method, the researchers recommend creating a learning environment where students feel comfortable sharing ideas with the instructor. A positive learning climate must be maintained in order to achieve the same level of success. Classroom management is a key to the successful results obtained in this study: keeping students on task and free from misbehavior is vital to maintaining the engagement level this approach requires. Lastly, being a good teacher and adapting to the variety of students is necessary. The teacher cannot be rigid in their presentation and must be willing to stop and take the time to answer any questions that the students have.


References

Collins, A., Hawkins, J., & Frederiksen, J. (1993/1994). Three different views of students: The role of technology in assessing student performance. Journal of the Learning Sciences, 3(2), 205. Retrieved October 30, 2007, from Academic Search Premier database.

Davis, S. (2003). Observations in classrooms using a network of handheld devices. Journal of Computer Assisted Learning, 19(3), 298.

eInstruction. (2009). Simple solutions. Real results. Retrieved June 23, 2009, from http://www.einstruction.com/

Haldane, M. (2007, September 1). Interactivity and the digital whiteboard: Weaving the fabric of learning. Learning, Media and Technology, 32(3), 257. (ERIC Document Reproduction Service No. EJ772462) Retrieved September 30, 2007, from ERIC database.

Hashemzadeh, N., & Wilson, L. (2007, September). Teaching with the lights out: What do we really know about the impact of technology intensive instruction? College Student Journal, 41(3), 601-612. Retrieved October 30, 2007, from Academic Search Premier database.

Hedberg, J. (2006, July). E-learning futures? Speculations for a time yet to come. Studies in Continuing Education, 28(2), 171-183. Retrieved October 30, 2007, from Academic Search Premier database.

Hodge, S., & Anderson, B. (2007, September 1). Teaching and learning with an interactive whiteboard: A teacher's journey. Learning, Media and Technology, 32(3), 271. (ERIC Document Reproduction Service No. EJ772451) Retrieved September 30, 2007, from ERIC database.

Masters, K., & Oberprieler, G. (2004, May). Encouraging equitable online participation through curriculum articulation. Computers & Education, 42(4), 319. Retrieved October 30, 2007, from Academic Search Premier database.

Moallem, M., Kermani, H., & Chen, S. (2006, January 11). Handheld, wireless computers: Can they improve learning and instruction? Computers in the Schools, 22(3-4), 93. (ERIC Document Reproduction Service No. EJ736525) Retrieved September 30, 2007, from ERIC database.

Morris, L., Finnegan, C., & Wu, S. (2005, July). Tracking student behavior, persistence, and achievement in online courses. Internet & Higher Education, 8(3), 221-231. Retrieved October 30, 2007, from Academic Search Premier database.

Pemberton, J., Borrego, J., & Cohen, L. (2006, January 1). Using interactive computer technology to enhance learning. Teaching of Psychology, 33(2), 145. (ERIC Document Reproduction Service No. EJ736342) Retrieved September 30, 2007, from ERIC database.

Snyder, I., & Prinsloo, M. (2007). Young people's engagement with digital literacies in marginal contexts in a globalised world. Language & Education: An International Journal, 21(3), 171-179. Retrieved October 30, 2007, from Academic Search Premier database.

Tracy, B., & Hendrix, J. (2008). Using technology in the secondary classroom to assess engagement, participation, and performance. Masters of Technology Education action research project, Northern Arizona University.

Wambach, C. (2006, September). From revolutionary to evolutionary: 10 years of 1-to-1 computing. T.H.E. Journal, 33(14), 58-59. Retrieved October 30, 2007, from Academic Search Premier database.

Woodfield, K., & Lewis, J. (2003, January). Getting on board with online testing. T.H.E. Journal, 30(6), 32. Retrieved November 4, 2007, from Academic Search Premier database.

Wu, H., & Huang, Y. (2007, September 1). Ninth-grade student engagement in teacher-centered and student-centered technology-enhanced learning environments. Science Education, 91(5), 727. (ERIC Document Reproduction Service No. EJ772556) Retrieved September 30, 2007, from ERIC database.



Appendix A

Class #1 (Year 1), from pre-test to post-test

Pretest | In Class #1 | Growth #1 | In Class #2 | Growth #2 | Post-Test | Growth Post | Overall Growth
78 | 84 | 6 | 88 | 4 | 88 | 0 | 10
60 | 72 | 12 | 82 | 10 | 90 | 8 | 30
72 | 80 | 8 | 82 | 2 | 82 | 0 | 10
52 | 66 | 14 | 78 | 12 | 88 | 10 | 36
62 | 76 | 14 | 86 | 10 | 88 | 2 | 26
68 | 74 | 6 | 90 | 16 | 92 | 2 | 24
36 | 66 | 30 | 82 | 16 | 86 | 4 | 50
48 | 51 | 3 | 54 | 3 | 72 | 18 | 34
72 | 78 | 6 | 92 | 14 | 96 | 4 | 24
46 | 82 | 36 | 90 | 8 | 96 | 6 | 50
86 | 90 | 4 | 92 | 2 | 98 | 6 | 12
76 | 76 | 0 | 86 | 10 | 96 | 10 | 20
76 | 86 | 10 | 86 | 0 | 96 | 10 | 20
52 | 65 | 13 | 66 | 1 | 80 | 14 | 28
54 | 58 | 4 | 62 | 4 | 67 | 5 | 13
74 | 76 | 2 | 86 | 10 | 88 | 2 | 14
34 | 56 | 22 | 80 | 24 | 82 | 2 | 48
64 | 70 | 6 | 86 | 16 | 90 | 4 | 26
36 | 50 | 14 | 65 | 15 | 78 | 13 | 42
82 | 88 | 6 | 94 | 6 | 98 | 4 | 16
80 | 82 | 2 | 90 | 8 | 94 | 4 | 14
82 | 80 | -2 | 86 | 6 | 94 | 8 | 12
64 | 84 | 20 | 84 | 0 | 90 | 6 | 26
46 | 76 | 30 | 82 | 6 | 84 | 2 | 38
38 | 48 | 10 | 65 | 17 | 76 | 11 | 38
Class average: 61.52% | 72.56% | 11.04% | 81.36% | 8.80% | 87.56% | 6.20% | 26.44%


Appendix B

Class #2 (Year 1), from pre-test to post-test

Pretest | In Class #1 | Growth #1 | In Class #2 | Growth #2 | Post-Test | Growth Post | Overall Growth
64 | 74 | 10 | 86 | 12 | 92 | 6 | 28
60 | 68 | 8 | 73 | 5 | 84 | 11 | 24
52 | 59 | 7 | 66 | 7 | 68 | 2 | 16
70 | 75 | 5 | 80 | 5 | 84 | 4 | 14
76 | 88 | 12 | 88 | 0 | 92 | 4 | 16
62 | 63 | 1 | 72 | 9 | 80 | 8 | 18
54 | 62 | 8 | 92 | 30 | 96 | 4 | 42
40 | 53 | 13 | 54 | 1 | 58 | 4 | 18
54 | 67 | 13 | 78 | 11 | 80 | 2 | 26
60 | 68 | 8 | 71 | 3 | 86 | 15 | 26
74 | 76 | 2 | 82 | 6 | 88 | 6 | 14
80 | 84 | 4 | 90 | 6 | 92 | 2 | 12
46 | 60 | 14 | 73 | 13 | 80 | 7 | 34
82 | 82 | 0 | 94 | 12 | 100 | 6 | 18
46 | 62 | 16 | 69 | 7 | 84 | 15 | 38
52 | 66 | 14 | 73 | 7 | 82 | 9 | 30
86 | 86 | 0 | 94 | 8 | 98 | 4 | 12
64 | 69 | 5 | 76 | 7 | 80 | 4 | 16
74 | 86 | 12 | 86 | 0 | 92 | 6 | 18
78 | 78 | 0 | 80 | 2 | 86 | 6 | 8
74 | 82 | 8 | 86 | 4 | 90 | 4 | 16
Class average: 64.19% | 71.81% | 7.62% | 79.19% | 7.38% | 85.33% | 6.14% | 21.14%


Appendix C

Class #1 (Year 2), from pre-test to post-test

[Table of individual pre-test, in-class test, post-test, and growth scores for the 27 students in this class.]

Class averages:
Pretest | In Class Test #1 | Growth #1 | In Class Test #2 | Growth #2 | Post-Test | Growth Post | Overall Growth
64.19 | 80.08 | 15.89 | 82.74 | 2.66 | 90.74 | 8.00 | 26.56


Appendix D

Class #2 (Year 2), from pre-test to post-test

[Table of individual pre-test, in-class test, post-test, and growth scores for the 29 students in this class.]

Class averages:
Pretest | In Class Test #1 | Growth #1 | In Class Test #2 | Growth #2 | Post-Test | Growth Post | Overall Growth
56.90 | 77.14 | 20.25 | 80.48 | 3.34 | 90.28 | 9.79 | 33.38


Appendix E

[Chart: Growth of math classes when studying for the Arizona state math assessment (percent scores), 1st year. Series: Group 1 - Year 1 and Group 2 - Year 1; categories: Pretest, In-Class Test #1, In-Class Test #2, Post Test.]

Appendix F

Appendix G

[Chart: Growth comparison in math classes when studying for the Arizona state math assessment (raw score), 1st year. Series: Group 1 - Year 1 and Group 2 - Year 1; categories: growth between pre-test and in-class test #1, between in-class test #1 and in-class test #2, between in-class test #2 and post-test, and between pre-test and post-test.]


Appendix H

Appendix I

Appendix J

Appendix K

Appendix L

Appendix M

Appendix N

Appendix O

