Computers & Education 58 (2011) 397–412
Opening The Door: An evaluation of the efficacy of a problem-based learning game

Scott J. Warren a,*, Mary Jo Dondlinger a,b,1, Julie McLeod c,3, Chris Bigenho a

a Department of Learning Technologies, College of Information, University of North Texas, 3940 N. Elm, Rm. G150/G187, Denton, TX 76207, USA
b Richland College, 2800 Abrams Rd., Dallas, TX 75243, USA 2
c Department of Learning Technologies, College of Information, 3940 N. Elm, Rm. G150, Denton, TX 76207, USA
Article info

Article history: Received 11 March 2011; Received in revised form 29 June 2011; Accepted 10 August 2011

Keywords: Blended learning; Course design; Undergraduate; Educational game; Achievement; Problem-based learning

Abstract

As higher education institutions seek to improve undergraduate education, initiatives are underway to target instructional methods, re-examine curricula, and apply innovative technologies to better engage students with content. This article discusses the findings of an exploratory study focused on a course redesign that combined game elements, PBL methods, and 3-D communication tools in an introductory computing course. Some of these findings included an appreciation for how the technology skills gained in the course applied to the world of work, an understanding of the significant role that interpersonal communications play in learning and in career success, a sense of empowerment fostered by access to resources, and an increased willingness to play, explore, and experiment with tools, content, and design processes.

© 2011 Elsevier Ltd. All rights reserved.
1. Introduction

In an effort to enhance the quality of the undergraduate experience in large-enrollment introductory courses, one state in the southwestern United States provided universities with small grants that target initiating and supporting research focused on the redesign of these courses. Goals of the course redesign project included using innovative instructional design methods, small group instruction, and communications technologies to improve student satisfaction, critical thinking, and problem-solving skills in large group instruction (LGI). The specific course targeted by this redesign was an introductory course in basic computer applications used in educational settings. Not only was this a large-enrollment course, but the pre-existing format of the course was also problematic for students because it targeted overly fine-grained learning objectives that have shown little transfer from the narrow context of the course tasks themselves to the course assessments, let alone support skill transference into future school and work contexts. In addition to the university's goals, a major goal of this course redesign was to better prepare undergraduates for their future work beyond the classroom as professionals in the workplace, using technology tools to communicate effectively with peers and future employers.

This article details the analysis leading to the resulting curricular redesign, provides an overview of the key elements of the design, and presents an evaluation of the effectiveness of those design elements in the pilot implementation of the course. Evaluation of the course redesign involved mixed methods, both quantitative and qualitative. Quantitative measures included drop, failure, and withdrawal rates, analysis of posttest achievement scores, and summative course evaluation scores. The qualitative methodology included constant-comparative analysis of student semi-structured interviews. These methods were used to better contextualize the quantitative findings and develop a richer understanding of student experiences in the redesigned course. The results of our evaluation of this redesign, along with the theoretical foundations and description of the redesign, are reported in this article.
Abbreviation: PBL, problem-based learning.
* Corresponding author. Tel.: +1 940 765 2799; fax: +1 940 565 4194.
E-mail addresses: scott.warren@unt.edu (S.J. Warren), mdondlinger@dcccd.edu (M.J. Dondlinger), jkmcleod@unt.edu (J. McLeod), cbigenho.unt@gmail.com (C. Bigenho).
1 Tel.: +1 972 238 6340.
2 Tel.: +1 972 998 0288; fax: +1 940 565 4194.
3 Tel.: +972 998 0288; fax: +940 565 4194.
0360-1315/$ – see front matter © 2011 Elsevier Ltd. All rights reserved. doi:10.1016/j.compedu.2011.08.012
1.1. Background

Prior to redesigning the course, the instructional design team conducted a needs analysis of the existing course to identify structural problems to be addressed by the redesign (Warren & Dondlinger, 2008). This involved document analysis of the course materials, conducted according to methods suggested by Robson (2002), followed by a series of interviews with current and past instructors as well as twelve of the nineteen students in one section of the course. Questions in these interviews were open-ended to allow instructors and students to provide whatever information they felt was relevant.

The original course used an Adobe Flash™-based computer-based instruction (Kulik & Kulik, 1991) program called SAM 2003 (Thomson Course Technology, 2007). Following completion of software modules, students take an examination to show that they have acquired the skills and knowledge provided through the software training tasks. No help is provided during the exam, and the instructor can set a limit to the number of times a student may take it. In a typical week of the existing face-to-face course design, students attended class for 3 h and individually worked on the tutorials, practice, and exams in the SAM 2003 software. The instructor was expected to help students as needed and troubleshoot technical problems with the software; however, no direct instruction through lecture or other means was provided, as the software was expected to do this.

From the instructor and student semi-structured interviews, the design team specifically identified the following challenges with the software and the course:

1. Most learning objectives focused on a narrow set of skills, such as how to open, save, or close a file, and the functions in one program were not linked to others. The creation of a chart in Excel, for example, has applications in Word, Publisher, and PowerPoint; understanding how the programs can be used in a complementary fashion is an important objective that was not addressed in the existing curriculum. Moreover, the ways to perform functions such as opening or saving a file are the same in each program, yet the tutorials took students through each of the steps for these actions with each program in the suite. Students found this boring and repetitive, particularly since most of them were already familiar with these basic actions from previous computer applications courses at the secondary level.

2. Computer-based assessments and instruction were too rigid. There are commonly several ways to perform an action in Microsoft Office, but the program recognized only one. In some instances, the practice exercise asked a student to complete a task in one manner, while the assessment compelled another. Students expressed frustration with this problem throughout the course.

3. Applications of knowledge were decontextualized and lacked authenticity. Students felt that the applications of learning expected in both the training practice and exams did not fit well with their life experiences. Students saw little relevance between course content and their future work. They also found the repetitive drill and practice unengaging.

4. Computer-based instruction provided little interaction with peers and the instructor. Although the software program provided some student-to-system interaction, it did not provide ways for students to interact with each other. Moreover, students felt that the software program essentially relegated the instructor to the role of grade collector and troubleshooter of the software rather than an expert on effectively using the Office suite. Instructors also felt disengaged, and some reported that they sought to integrate elements of the Scholarship of Teaching and Learning (Bass, 1999) or Chickering and Gamson's (1987) Seven Principles for Good Practice in Undergraduate Education to improve student learning and their own teaching, but were restricted by the system.

Further, while the existing course explored the history of the Internet and the basic construction of a web page, it did not introduce students to online productivity and collaboration tools that could be invaluable to them in both their academic and professional careers. Many of these computer literacy skills were also repetitive for students, who were expected to have mastered them by 8th grade according to state standards (Texas Education Agency, 2010). Students and instructors also found that the course did not leverage online tools, such as wikis, weblogs, and social networking sites, that can be used to improve the viability and productivity of Office™ programs.
1.2. Literature review

University requirements dictated that course redesigns include technology and teaching methods that improved student satisfaction and better engaged students in meaningful interactions with peers and instructors. Following the preliminary analysis of issues with the existing course, the design team examined the literature on technology and student satisfaction. While the previous course design also included a technology component, institutional data indicated low student satisfaction as well as high drop, failure, and withdrawal rates. Thus, we felt it important to understand the situational factors in technology integration that lead to increases or decreases in student satisfaction. We also examined problem-based learning (PBL), known for enhancing critical thinking and engagement. Since members of the design team have expertise in the area of games for learning, they also explored how they might combine elements of game design with PBL in an emerging game genre, the alternate reality game (AltRG), to further engage students in compelling and meaningful interaction with course content, peers, the instructor, and their future careers.

1.2.1. Technology and student satisfaction

Using technology to engage students and increase their overall satisfaction with learning is not new. Many studies have confirmed that satisfaction in technology-based learning environments is equal to that of traditional face-to-face classrooms (Schoech, 2000; Spooner, Jordan, Algozzine, & Spooner, 1999; Wernet, Olliges, & Delicath, 2000). Indeed, a recent study by the International Center for Media and the Public Agenda led by Moeller (2010) at the University of Maryland asked students to go 24 h without technology; most reported their experience in terms of psychological withdrawal symptoms, as technology was such a central part of their lives. As far back as 2001, a study by Kubey, Lavin, and Barrows correlated high Internet use with lower academic performance, while a more recent study in 2009 by Karpinski and Duberstein similarly correlated high use of the social network Facebook™ among students with lower academic performance. According to a report by Borreson Caruso and Salaway (2007) for EDUCAUSE, undergraduates spend an average of 18 h a week using technology just for course work, with more than 80% preferring moderate or high use of information technology in their courses.
Nevertheless, not all studies yielded the same results. Rivera and Rice (2002) compared a traditional course to a hybrid and a full-web course and found satisfaction to be substantially lower in the full-web course and highest in the traditional course. Irons, Jung, and Keel (2002) compared an interactive television course with a version that used the web to enhance the course. Satisfaction was lower in the course that required students to use the web. However, it was not the web enhancements to the course that dissatisfied students; it was the lack of access to the Internet and posted materials. Hara and Kling (1999) found that student frustrations were high in a text-based Multi Object Oriented environment (MOO). Nevertheless, it was not the environment that aggravated students; it was confusion over what to do and how to do it once inside it. Thus, ensuring access to materials and providing clear instructions for learning activities are key to satisfying learner needs.

Other studies found interaction to be the key to student satisfaction (Ahern & Repman, 1994; Frederickson, Pickett, Shea, Pelz, & Swan, 2002). Further, Hassenplug and Harnish (1998) found that the mere perception of interaction, "rather than the actual amount, may relate to student satisfaction with the distance learning experience" (p. 599). However, too much interaction can cause learners to feel overwhelmed, resulting in cognitive overload (Hara & Kling, 1999; Mason & Weller, 2001). Nevertheless, an absence of interaction leaves students feeling isolated. Thus, educators using technology for teaching should encourage interaction, yet carefully guide student interaction to maximize learning and curb frustration. These findings correlate with students' frustrations in the course selected for this redesign, as students and instructors both found the lack of interaction to be a major problem in the previous technology-based format.

1.2.2. Media and method

The burst of development in Internet technologies and their application to learning environments spawned the now famous debate between Clark (1994a, 1994b) and Kozma (1991, 1994) regarding instructional media versus methods. Clark maintained that media can never influence learning and that the instructional method is instead responsible. In contrast, Kozma asserted that media may influence certain aspects of cognition and, further, that media and method are dependent upon one another to effectively deliver learning experiences. This particular redesign and evaluation effort does not attempt to evaluate the various media embedded in the instructional design. Instead, it focuses on the instructional methods that shaped the design. While a variety of media were deployed in the redesign, they played a supporting role in providing a situated learning context for student activities and instruction (Anderson, Reder, & Simon, 1996; Brown, Collins, & Duguid, 1989; Lave & Wenger, 1991; Vanderbilt, 1993).

1.2.3. Problem-based learning (PBL)

Course designers chose a problem-based learning approach because PBL has been correlated with improving post-secondary learning experiences by providing authentic contexts for learning (Bonk, Kirkley, Hara, & Dennen, 2001) and compelling students to engage in story-driven, problem-centered tasks (Jonassen, 1999; Jonassen & Hernandez-Serrano, 2002; Warren, in press, 2006b).
Tiwari and Lai (2002) also found that PBL encourages learners to hone a variety of thinking skills: "analyze and synthesize data; develop hypotheses; apply deductive reasoning to a problem situation; draw conclusions after analysis, synthesis and evaluation of new information; synthesize strategies/solutions; and monitor and evaluate their own thinking process" (p. 2). Other authors have met with success when they combined a problem-based learning approach with proven critical thinking strategies to enhance the existing strengths of PBL methods and provide additional scaffolding for learners (DiPasquale, Mason, & Kolkhorst, 2003; Elder & Paul, 2002; Keller, 2002; Willis, 2002). One technique has been to require student self-assessment of their own thinking and encourage metacognitive processing (Bransford, Vye, Bateman, Brophy, & Roselli, 2003; Meyerson & Adams, 2003). Another study found that increased metacognition was correlated with the use of PBL and may also stimulate increased transfer of knowledge to new contexts and settings (Lin, Hmelo, & Kinzer, 1999). Furthermore, the reported satisfaction rates of students engaged in some PBL-based courses have been found to be higher than those rates reported by students in traditional learning environments (Ge & Land, 2003). It is therefore reasonable to expect that using such methods in a post-secondary course would lead to comparable results.

Savery and Duffy (1995) have noted that "knowledge evolves through social negotiation and the viability of individual understandings" (p. 2), where the interaction between student and peers as well as student and instructor becomes central to learning. These principles formed the framework for redesign of the course, particularly because interaction was an element that the students and instructors indicated they missed in the previous course format. These principles also provided a cogent set of criteria to include in the instructional design and its evaluation:

1. Anchor all learning activities to a larger task or problem.
2. Support the learner in developing ownership for the overall problem or task.
3. Design an authentic task.
4. Design the task and the learning environment to reflect the complexity of the environment they should be able to function in at the end of learning.
5. Give the learner ownership of the process used to develop a solution.
6. Design the learning environment to support and challenge the learner's thinking.
7. Encourage testing ideas against alternative views and alternative contexts.
8. Provide opportunity for and support reflection on both the content learned and the learning process. (pp. 3–6)
Elder and Paul (2002) and other authors have successfully used each of these principles to design instruction in undergraduate and graduate courses. However, the design team for this course sought to further engage students by leveraging today's technologies, such as digital games, as a means of embedding scaffolds for student learning and providing a coherent narrative within which to place the ill-structured problems that would drive knowledge construction.

1.2.4. Problem-based learning merged with game elements

As part of the process of developing the course, the instructional designers analyzed games to determine where they intersect with problem-based learning principles as a means to support learning and engagement. Researchers such as Dede, Ketelhut, and Ruess (2006) and Squire (2006) point to and experiment with the use of digital games as a means of supporting inquiry-based learning in areas ranging from
social studies to science. A sense of play is present in these learning environments that is thought to better prepare learners for a world in constant change and evolution because it allows them to test ideas and concepts more freely. By play here, we refer back to the Vygotsky-influenced (1978) definition that two of the authors of this study applied in their earlier work on game-based learning environments: "Play…(is an activity that) allows for the mind's exploration of the rules and consequences of engaging with or breaking them" (Warren, Dondlinger, & Barab, 2009, p. 490). Although it was published long after the design and research reported here were conducted, the recent book, A New Culture of Learning: Cultivating the Imagination for a World of Constant Change, affirms these principles. In the book, Thomas and Brown (2011) treat learning as a means of inquiry that is integral to learning through games.

We analyzed the list of social constructivist design elements given by Savery and Duffy (1995) above against elements of game structures identified by Salen and Zimmerman (2004) and Crawford (2003), and noted several overlapping components. Specifically, both games and constructivist learning environments include:

- a strategically constructed conflict or problem
- a context for engagement with the conflict/problem
- goals or objectives for play or learning
- normative rules or conditions governing play or learning
- a quantifiable outcome with a means for assessing success
- scaffolds to support learners when challenges are too great
- cognitive conflict emerging from interaction with a designed problem
Fig. 1 illustrates the parallel components of PBL and game environments. Where these two environments overlap is in situating learning tasks (rather than other types of game goals) in the narrative context as the problem to be resolved and in providing instruction through in-narrative characters known as pedagogical agents.

1.2.5. Alternate reality games

Both the university's timeline and budget prohibited designing and developing a learning game in a stand-alone, immersive world. However, a new genre of game, the alternate reality game (AltRG), provided a welcome alternative. The AltRG distributes game challenges, tasks, and rewards across a variety of media, both digital and real. As described by the International Game Developers Association (Martin & Chatfield, 2006), "Alternate Reality Games take the substance of everyday life and weave it into narratives that layer additional meaning, depth, and interaction upon the real world" (p. 6). CNET staff writer John Borland (2005) depicts them as "an obsession-inspiring genre that blends real-life treasure hunting, interactive storytelling, video games and online community" (para. 4). Some AltRGs, such as ilovebees.com, served as marketing for the release of Microsoft's Halo 2. Others have an educational focus, such as the Hexagon Challenge, which sought to "address decision-making skills, after-action report generation, and adaptation to performance" (Bogost, 2007). While not explicitly educational, others have dealt with social, economic, and environmental justice challenges. For example, Jane McGonigal has masterminded AltRGs as a means to construct what she describes as "collective intelligence." The purpose of her 2007 AltRG World Without Oil (Ernst, 2007) was to "play our way to a set of ideas about how to manage that crisis [a dramatic decrease in oil availability]" (cited in Strickland, 2007, p. 1). McGonigal noted that learners developed strategies for coping with a peak oil crisis and further claimed changes to their real world activities (Strickland, 2007). We thus conjecture that the simulated problem in the AltRG prompted players to create real world applications of knowledge constructed with their peers in a simulated play space.

2. Material and method

2.1. Course redesign

Overlapping problem-based learning and alternate reality game principles were used to redesign the computer applications course. Rather than listen to lectures, complete practice exercises, and take frequent multiple-choice tests, students hone their technology skills by
Fig. 1. Conception of similarities between PBL and game structures.
solving a series of ill-structured problems using the very tools they are expected to learn. These problems are contextualized in a narrative story posed by fictional clients. Students worked on each task or problem in small groups of three to five, using a variety of productivity and communication tools. We describe the full redesign of the course in terms of the principles from Savery and Duffy (1995) listed previously.

2.1.1. Principle 1: Anchor all learning activities to a larger task or problem

This PBL principle aligns with two game design elements: conflict or problem and narrative plotline. In the redesign, the entire course was anchored to and contextualized within an overarching story, which was used to drive course activities. This story, which unfolds gradually to students throughout the duration of the course, is provided to instructors in The Door Codex (the instructor's manual for the course) as follows:

"Frustrated at their inability to influence the World of Man as their worshippers have faded away over time, the ancient Greek deities are now cyber-beings that live entirely as digital impulses on the web. They act independently and are building up their empires once more, this time in the digital ether. Some have been more successful than others at manipulating the systems and virtual worlds that now make up their homes. Since the advent of 3D immersive environments, they have found new faces and new troubles…" From The Door Codex (Warren, 2009).

This underlying narrative anchored a series of seemingly unrelated problem-based tasks to a larger conflict or problem and engaged students in game structures that included puzzles, codes, and ciphers that they must solve, retrieve, or use correctly in order to gain access to learning materials, information, and resources that provide additional scaffolding and narrative support for the problem-based tasks.

2.1.2. Principle 2: Support the learner in developing ownership for the overall problem or task

While The Door AltRG course was designed with the above storyline in mind, that narrative was used to contextualize a series of more immediate and authentic tasks. To support the learner in developing ownership for the overall problem or task, a two-tiered narrative structure provided context and framed course problem-solving activities. The top tier of this narrative engaged students with fictional clients who "hired" student teams to complete problem-based tasks that required solutions using the technology skills students were honing in the course: proficiency with MS Office programs, for example. Designers constructed a series of six PBL scenarios, each involving a fictional character who needed assistance in solving a problem. Students were "hired" by these fictional clients rather than given an assignment by the instructor. Although the instructor role-played as the various clients, answering student inquiries and clarifying specifications for the finished product, course designers aimed to support learner ownership for the problem by leveraging a client/service-provider relationship in lieu of the instructor/student dynamic. Students were introduced to and communicated with the clients through the clients' social networking sites (e.g., Facebook, WordPress blogs) and email accounts. When they emailed the instructor at his email address with questions for the fictional client, he directed them to visit the client's respective website or to email the client directly.
The first problem is introduced by a character named Hester and is presented both as a podcast (audio) and as text. The instructor and the course syllabus refer to this problem as Hester's Task, but references to "Demeter's Tale" provide some clues to students that there is more to the story than Hester is telling. The following are the directions provided by Hester:

Hello again! I'm so glad you agreed to help. Not that I left you with much choice, but never mind. There is much to do! The first thing I need you to do is to become familiar with the Internet and the parts of a computer. Before I can send you along any further, I'll need some evidence that you know what you're talking about. On the Moodle, your professor posted some resources that discuss the basic parts of a computer, the history of the Internet, the role of the Internet, and some other goodies related to ethical behavior on the web and what he calls 'netiquette,' though I don't think he coined the term. What I want you to do with all that is to use a word processing program like Notepad or Word to create a summary of each. You may also want to include an image that you feel represents each major item. I thought about making you take a quiz over it, but decided that you are adults who can pick this up very quickly without some old woman looking over your shoulder. Submit your paper on the Moodle where you see the little hand holding a piece of paper that says 'Ms. Tremede's Task .010.' Keep an eye out for clues. Some of these sites are thick with them, so be sure to click on any stray links you find…

Hester also offers students rewards for locating relevant game information. Further, she notes that additional resources to improve their problem solutions will be revealed if they obtain these rewards, such as a video that students can locate on YouTube™ if they put together a web address correctly (see Fig. 2).

2.1.3. Principle 3: Design authentic tasks

In the top-level narrative tasks, the ill-structured problems that students are asked to solve require them to use all the major components of Microsoft Office™. The problems students face increase in complexity as the course progresses. In one instance, students had to provide directions to an inept gym teacher for how to construct a properly functioning grade book spreadsheet in order to allow him to keep his job at a middle school. In another instance, students developed an improved website for a local nightclub that included appropriate use of basic color theory and space usage. Fig. 3 captures the laments of one client, Iris, and her call for help developing a PowerPoint presentation for an upcoming conference. In essence, each of the clients and characters in the six PBL scenarios had alternate personas hidden beneath their client identities, and all of them were embroiled in an underlying conflict with each other as well as with the unsuspecting student players.
Fig. 2. Two client characters meet in a YouTube™ video to provide game context.
2.1.4. Principle 4: Design the task and the learning environment to reflect the complexity of the environment they should be able to function in at the end of learning

The redesign made use of a hybrid or blended learning format. Eight face-to-face class meetings were dedicated to scaffolding student learning of the computer applications and to facilitating group problem solving. Online resources, support, and collaboration tools were also provided through the free courseware platform Moodle. However, students were encouraged to make use of whatever productivity and communication tools best fit the dynamic of their groups. Emphasis was placed on communicating with peers, in class and online, to develop viable and deliverable solutions, rather than enforcing conformity to a specific version of a designated proprietary software program. The goal of this instruction was to provide students with a general set of skills that would allow them to use any word processor, spreadsheet program, or presentation tool and adapt to new versions readily. It also addressed some of the issues cited previously that frequently accompany online learning and digital collaboration, compelling students to negotiate solutions to issues of accessibility, software compatibility, and file management in their own teams. For example, if one team member could not afford the latest version of MS Office™, the team might use Oracle's Open Office™ or Google Docs™. These broadened objectives were expected to better prepare students for their future world of work.

Moreover, the distributed nature of an AltRG and the hybrid or blended course design compelled learners to confront additional complexities they will likely face at the end of learning. First, while materials related specifically to the course (syllabus, class policies, grading rubrics) were housed in Moodle, many of the resources needed to develop meaningful solutions to course problem-based tasks were distributed elsewhere, compelling students to search for and evaluate the sources of information they located. In each task, one character introduced students to the problem he or she was facing, but that problem often involved a conflict with another character who
Fig. 3. Game character Iris’ website.
viewed the issue from a different perspective. Students needed to explore all sides of the problem and diverging character interests in order to develop appropriate problem solutions. Seeking and evaluating information as well as reconciling multiple perspectives are complexities students will face beyond their college work. Additionally, the hybrid design of the course required students to manage time together in class and devise ways to communicate and collaborate when they were not together, skills and abilities that are increasingly valuable in a global economy.

2.1.5. Principle 5: Give the learner ownership of the process used to develop a solution

Learners were given ownership of the problem-solving process in a number of ways. Although they were introduced to the basic steps of the problem-solving process early in the semester, they were encouraged to fine-tune and adapt those steps to the unique circumstances of each problem-solving task. They also had ownership of dividing up the work, assigning roles, and establishing timelines. As mentioned in Principle 4, students were introduced to a variety of productivity and communication tools and encouraged to make use of whichever tools best fit the needs of their group. A dedicated space in Second Life provided a venue for students' avatars to meet as a group or to consult with the instructor during office hours that were held there. Discussion boards were provided in Moodle. Students were also introduced to wikis, file sharing sites (such as Dropbox), and Google tools, including Google Docs, Gmail, and chat.

2.1.6. Principle 6: Design the learning environment to support and challenge the learner's thinking

Throughout the course, the instructor played the role of both teacher and game Puppet Master, providing soft scaffolding and additional resources for students who might be struggling either to solve the ill-structured problems or to locate game resources. Most importantly, the instructor also acted to challenge learners' thinking as they developed their solutions. The clients that these instructors role-played provided puzzles and uncertainty for students simply by acting as clients; the authenticity and challenge stemmed from having the instructors behave as real clients do. Some characters were "out of office" from time to time; others could not respond to student emails immediately, and some required follow-ups from the students for a response. At times, they were impatient or demanding in their responses and not quite as helpful or direct as students expected them to be, necessitating clarification emails from students. Most of all, their concern was not student success; they simply wanted their tasks done to their specifications. The additional mystery in the second layer was intended to draw students further into the narrative and further challenge them cognitively. However, students were also cognitively challenged by the clients when the clients did not give students what they needed in the way an instructor might be expected to do. Game characters also acted as gatekeepers, judging the quality of student solutions and preventing students from moving to the next problem until the last had been adequately addressed. As students moved through the story at both levels, clues and minor puzzles were revealed.

Table 1
The Door learning tasks, goals, and two-tiered narrative.
Task: Leto's Task
Learning goal: The learner will solve a problem for a client that requires the use of a word processor's primary tools, such as spell check and font type and size selection, and secondary tools, such as collaborative editing tools like comments and track changes.
Problem narrative (Tier 1): Leto and Hunter are considering lawsuits against one another, but an arbitrator told them to seek another solution before going to court. They ask the groups to come up with a solution to their problem, which is that the plans for Leto's new building will block the sun from Hunter's greenhouse. Student solutions need to be presented in a professional-looking letter that can be given to the arbitrator and judge.
Hidden game story narrative (Tier 2): Leto claims that Hunter is an old girlfriend who is bitter over the ending of their relationship; she disapproves of his new lifestyle in the clouds in a place he calls The Asphodel Tower. Leto also lets slip that The Man in the Window is responsible for a group called Und3rw0rld13 that has a MySpace page. If students locate the page, they find www.wikidot.com entries for a Cult of Demeter. There are also additional links to historical pieces about ancient Greeks and their cults.

Task: Ray's Task
Learning goal: The learner will solve a problem that requires restructuring a spreadsheet so that it properly calculates cells and formats color and font, and explaining the process to a novice in a job aid the learner will create.
Problem narrative (Tier 1): Ray Seiss, an eighth grade social studies teacher and football coach, is having trouble with Diana Juneau, the assistant principal. She is looking for an excuse to have him fired by finding problems in his grade spreadsheets over the course of the last year. Students must review his grade book information and clean it up so that Diana can understand what it is he actually did.
Hidden game story narrative (Tier 2): Ray (Zeus) provides his key and important information. Demeter is plotting something big with Charon. He provides information he found about the Cult of Charon: "Their beliefs appear to be tailored somewhat on the book [http://en.wikipedia.org/wiki/American_Gods] by Neil Gaiman. It maintains that there are ancient deities that continue to walk among us."

Task: Athena's Task
Learning goal: The learner will solve a problem that requires taking a poorly constructed digital presentation that is missing information, communicating with clients to get this information, locating other resources to fill out the presentation (including graphics and researched information), and constructing it in a professional manner that engages the viewer.
Problem narrative (Tier 1): Danny hired Iris to develop a professional presentation for him, which he needs for a major international conference. His company is Bronze Armory Limited, and they make body armor for soldiers. Iris has several other contracts and mixed up the content for his presentation with the content given to her for another presentation. Danny's is due in a couple of weeks, and she has three other projects due during the same time. She wants each group to develop the presentation using her content.
Hidden game story narrative (Tier 2): Athena tells players that Hester has been up to something. She directs players to Dr. Charlie Daseh (Hades), a man she detests, and references the und3rw0rld13 group. She provides links to additional information on MySpace pages for Hades and his group. Players are also told that they may get additional information if they can gain friend access to Daseh and und3rw0rld13.

Task: Charlie's Deception
Learning goal: The learner will solve a problem that requires them to use all the skills they have developed with computer applications during the semester as well as learn some basic web design and publishing skills.
Problem narrative (Tier 1): Roman, owner of the fictional Tartarus Club, wants a small website created for his club that includes information about bands, specials, the location, etc. He wants it to look good and use a color scheme that is in keeping with the club's theme. Charlie said he would have it done for him as a favor, but his Und3rw0rld13 crew says it is beneath them, so he wants the class design teams to complete it.
Hidden game story narrative (Tier 2): Gaining entry to und3rw0rld13 is the key, so students have to set up their own account and MySpace page to make it look like they understand technology when the group reviews their MySpace friend request. The blog entries there detail a couple of the group's escapades, including the theft of Hester's program and why. Once this is discovered, students are told to contact Charlie with the evidence and demand its return to Hester. He responds with the program's location.
If students were successful at piecing this information together, they discovered that the clients they worked for were not who they appeared to be on the surface, but instead represented the ancient Greek gods seeking to reclaim followers and power by harnessing the power of the Internet, an additional challenge to their original thinking. As the semester progressed, students engaged in additional problem-based learning activities, each growing more and more ill structured as students learned to cope with more abstraction and honed their problem-solving and critical thinking skills. Table 1 provides an overview of the PBL tasks; for Leto's Task, we provide an example of the hidden narrative learners uncover as they play, and for Ray's Task, we note the particular technology objectives that we targeted with the problem. At the conclusion of the semester, students created an online portfolio of their solutions, highlighting their use of the computer applications tools to solve the problems.

2.1.7. Principle 7: Encourage testing ideas against alternative views and alternative contexts

The two-tiered narrative framing not only provided the means for students to situate their learning in a more meaningful and engaging context (Warren, Stein, Dondlinger, & Barab, 2009), it also encouraged students to interrogate the inconsistencies between the two plotlines and challenged students to rethink their surface-level understandings of what was presented to them by the game characters, peers, and the instructor. Within the course design, students worked in groups as a means to build consensus toward the development of their problem solutions prior to turning in any work for expert review by the instructor and graders. Beyond working within teams to generate defensible solutions, students were also required to submit their solutions for review by other teams so that they would receive alternative views on the quality and defensibility of their solutions as judged by groups with a similar task but a different approach. In addition, they were required to evaluate other teams' work, which provided them with another perspective and possibly another context for understanding the relevance of the ill-structured problem. As a result, students tested their ideas against alternate views in multiple ways: (1) by the individual student in self-evaluation reports regarding their own contribution to the solution and work, (2) by other members of their teams to determine how well each group participant contributed to the overall work, (3) by several members of other, external teams to evaluate the overall product, and (4) by experts in the form of the instructor and graders who role-played the characters and provided feedback to groups and individual students.

2.1.8. Principle 8: Provide opportunity for and support reflection on both the content learned and the learning process

Finally, students were asked to keep a web log (blog) to reflect on their experiences during the course. Blog entries were both guided and unguided depending on the week's activities and student feedback throughout the course. These reflections allowed students to reflect on the content and skills that they learned related to the MS Office products, how they used those products, as well as how they had used them in the past. They also prompted students to examine their own processes for learning, their skill in working in groups, past learning experiences, and how they had changed over the course of the semester. The blogs also allowed instructors and researchers to examine student metacognitive processing activity.
Moreover, with each form of evaluation described above in Principle 7, students were challenged to reflect on their own work, examine their view of the problem in the context of peer groups within the classroom, understand how they and their work were viewed by others within the group, and, finally, consider how their work as part of a group was judged by impartial experts. Each of these evaluations was iterative, and students received perspectives to reflect upon their own contributions and the overall solution throughout the learning process.

2.2. Evaluation methods

Given that this course redesign stemmed from a university initiative aimed at improving retention, satisfaction, and critical thinking, evaluation of the redesign employed multiple methods: qualitative and quantitative. Retention was examined by comparing drop, failure, and withdrawal rates between sections of the existing course design and The Door pilot section. Satisfaction with the redesigned course was evaluated using the university's standard five-item course evaluation and compared to satisfaction rates from sections of the existing course. Technology skills achievement was also measured by comparing pre- and posttest scores from a section of the course using the existing SAM 2003 software only with scores from the section that employed The Door curricula.

In order to examine the role of the problem-based learning and game elements in enhancing satisfaction and critical thinking, a qualitative approach was also deployed. In addition to evaluating the effectiveness of the PBL components of the course, the designers also sought to determine whether learners as gamers were engaged with and motivated by a game designed for a required non-major course. Did the game engage students' intrinsic desire to play by providing adequate extrinsic reward in a game designed for learning? Did students experience a sense of play when engaged with the game? We thought it important not only to understand whether learners engaged in play, but also to determine whether play contributed to learning. In asking these questions, we sought to identify the challenges and successes of this design that might be applied by others who design games for learning.

The qualitative analysis relied on data collected primarily from two sources: weekly student web log (blog) reflections and semi-structured interviews with 11 students and the instructor. Analysis of this data involved a constant-comparative approach to identify emergent themes regarding learner experiences with the problem-based learning and the alternate reality game contexts (Gall, Borg, & Gall, 1996); the rationale for employing these methods follows Denzin and Lincoln's (2003) prescription for a method that "seeks answers to questions that stress how (sic) social experience is created and given meaning" (p. 13). The weblogs provided data regarding students' experiences throughout the course; the semi-structured interviews were conducted at the end of the semester to generate a summative review of the interactions, benefits, and detriments stemming from the implementation of this redesign. The semi-structured interview questions were as follows:

1. Tell me about your experience with the class.
2. Tell me about a time when you were successful in learning in the class.
3. Tell me about a time when you struggled to learn in class.
4. Tell me about a time when you worked well with your peers to complete a class task.
5. Tell me about a time when you struggled to work with your peers to complete a class task.
In addition, interviewers were provided with a series of follow-up questions in case student answers were overly terse or were uninformative for evaluating the construction of the course:

1. Would you want to take a class set up like this again? Why or why not?
2. How do you feel about technology in general in classes? Why?
3. Did taking this class change this for you at all from how you felt at the beginning? If so, how? If not, why not?
4. Will you use any of the resources from this class in the future?
These questions were open-ended and used to prompt student reflection. Each interviewer generated follow-up questions based on student responses. This sometimes led students in different directions based on what was personally important to each student.

3. Results

Although the findings from analysis of the quantitative measures are reported more fully elsewhere (Warren, Dondlinger, Whitworth, & Jones, 2010), the initial implementation of the experimental curriculum yielded mixed but promising results on measures of retention, satisfaction, and achievement, as shown in Table 2.
Table 2
Quantitative results for student retention, satisfaction, and achievement.

                     Comparison (n = 57)   Treatment (n = 32)   Differences
Retention (% DFW)    21.05%                12.50%               8.55%
# of Drops           1                     2
# of Failures        2                     0
# of Withdrawals     9                     2
Satisfaction         3.64                  4.2                  alpha = .05, z(6) = 6.86, p = 1.64
Achievement          M = 78.83             M = 85.96            t = 3.90, crit = 1.67
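As a rough, hedged illustration of the comparisons summarized in Table 2, the sketch below recomputes the drop/failure/withdrawal (DFW) rate difference and the posttest comparison using the two-sample t-test assuming unequal variances (Welch's test) named in the text. The posttest standard deviations are not reported in the article, so the values used here are placeholders, and the added two-proportion z-test for the DFW rates is illustrative only; neither result is intended to reproduce the published statistics.

```python
# Sketch of the Table 2 comparisons. Posttest standard deviations are not
# reported in the article, so sd_comparison and sd_treatment are placeholders;
# the resulting t statistic will not match the published value.
import math

# Drop/failure/withdrawal (DFW) counts reported in Table 2
dfw_comparison, n_comparison = 1 + 2 + 9, 57   # 12 of 57 -> 21.05%
dfw_treatment, n_treatment = 2 + 0 + 2, 32     # 4 of 32  -> 12.50%

p1 = dfw_comparison / n_comparison
p2 = dfw_treatment / n_treatment
print(f"DFW rates: comparison {p1:.2%}, treatment {p2:.2%}, difference {p1 - p2:.2%}")

# Two-proportion z-test on the DFW rates (pooled standard error); Table 2
# reports only the 8.55% difference for this row, so this test is illustrative.
p_pool = (dfw_comparison + dfw_treatment) / (n_comparison + n_treatment)
se_pooled = math.sqrt(p_pool * (1 - p_pool) * (1 / n_comparison + 1 / n_treatment))
print(f"two-proportion z = {(p1 - p2) / se_pooled:.2f}")

# Welch's two-sample t-test (unequal variances) on the posttest means,
# the test named in the text for the achievement comparison.
m_comparison, m_treatment = 78.83, 85.96       # means reported in Table 2
sd_comparison, sd_treatment = 10.0, 8.0        # placeholder standard deviations
se_welch = math.sqrt(sd_comparison**2 / n_comparison + sd_treatment**2 / n_treatment)
t = (m_treatment - m_comparison) / se_welch
df = se_welch**4 / ((sd_comparison**2 / n_comparison)**2 / (n_comparison - 1)
                    + (sd_treatment**2 / n_treatment)**2 / (n_treatment - 1))
print(f"Welch t = {t:.2f}, approx. df = {df:.1f}")
```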
The results indicated an 8.55% difference in the percentage of students who dropped, failed, or withdrew between the comparison course and the treatment. Moreover, satisfaction with the redesigned course was statistically significantly higher than in sections using the existing course design. Finally, student achievement, as measured by posttest in both groups and compared using a two-sample t-test assuming unequal variances, showed greater improvement in the treatment group than in the comparison group. Although student satisfaction was higher in the redesigned course, qualitative data collected through interviews with students tells a slightly different story. Interviews indicated that students gained a number of insights and understandings from the experience. The principal themes identified through analysis of the qualitative data included the following:

1. An appreciation for how the technology skills gained in the course applied to the world of work and would impact their future.
2. An understanding of the significant role that interpersonal communications play in learning and in career success.
3. A sense of empowerment fostered first by access to resources and later by development of the knowledge and skills to become resourceful.
4. An increased willingness to play, explore, and experiment with tools, content, and processes that points to potential lifelong learning.
Students also noted several challenges arising from the course structure in their interviews. The most consistent challenges related to group cooperation, teamwork, and interpersonal communication, which had both positive and negative connotations. The majority of those students interviewed (nine of eleven) reported that they struggled with completing tasks in a group (Warren, Dondlinger, & McLeod, 2008).

4. Discussion

A complete discussion of the codes and categories that grounded the themes identified above is reported elsewhere (Warren & Dondlinger, 2009; Warren, Dondlinger, & McLeod, 2008). Here we offer a discussion focused on each of the eight elements suggested by Savery and Duffy (1995) in order to determine how each of the social constructivist design elements that the instructional designers included in the course contributed to or detracted from the overall results. According to the student interviews, some elements were more successful than others; those tied to social interactions among groups of learners were especially problematic for most students. Each element is discussed in detail below, although those elements that made sense to combine, based on students' evaluation experiences, are discussed together. All student names have been replaced with pseudonyms in order to ensure their anonymity.

4.1. Principle 1: Anchor all learning activities to a larger task or problem

Since the primary applications that formed the core of the course, the Microsoft Office™ suite, are already pervasively used in K-12 education, many of the students in the course already had experience using them. Despite this prior experience, a common theme that emerged from the interviews was the realization among students of how important these technologies would be to their success in the world of work, a realization that did not seem to occur to them in their prior experiences with these technologies. Rhonda stated that her prior learning experiences were neither challenging nor meaningful: "My teacher said, 'Well, do a professional edit in Word,' and I'm like, 'Why?'" This student went on to explain that the context, completing tasks for fictional clients, provided both the
challenge and relevance: "the way he [the instructor] did it where you have a client and tasks, it…just made you really excited to work." This immersive context also made students aware of how important technology would be to their future work. As Beth reflected, "I was pretty efficient working with, you know, computers and things…but until I took this class, honestly, I didn't realize how much you use technology on a day-to-day [basis]."

4.2. Principles 2 & 5: Support the learner in developing ownership for the overall problem or task and give the learner ownership of the process used to develop a solution

Students often mentioned these two elements in the same breath. What was significant was the conflict within students about the instructional methods, which granted students the autonomy for developing problem-solving processes themselves. While several students were impressed with the responsiveness of the instructor, many expressed an initial frustration that he did not always give them finite directions or concrete answers, but rather encouraged them to consult the fictional clients, their teammates, and the resources provided to work out solutions to the ill-structured problems. As Beth expressed it,

Well, the frustration a lot of times was that there wasn't a lot of… I like knowing exactly what is wanted…(a)nd he was trying to introduce us to clients…(but was) not telling us everything…

This redirection, while frustrating, compelled students to make use of various learning resources besides the instructor. Several students asserted that this increased their own resourcefulness and sense of power over their learning. Rob stated, "finding things on the Internet is easier…when you know how to look for them." He lamented not having physical resources from the class, but then concluded that "[now] I know how I can find it on the Internet…so if I can't figure something out, I'm sure I can figure out how to find it, so I guess I can get directions and do it." Will indicated that this empowerment turned out to be the best part of the class:

The information was out there and he gave us the tools to figure that out. So, that was the best part of the class was that if you couldn't get a hold of him, he'd given you the tools to figure it out anyways.

Students became more self-reliant as a result of having ownership of their learning and problem-solving processes.

4.3. Principle 3: Design an authentic task

As discussed in relation to Principle 1, students recognized that their newfound technology skills transferred to the world of work and that the authenticity of the tasks played a role. Rob appreciated this authenticity, stating, "I liked the idea of how the class is set up because [in the real world] the teacher is not going to stand up, [and say] 'Okay see what we are going to do.'" He realized that in the working world, he'd more likely hear the following: "here's your project for the week and get it done. That's how it's going to be." Rob concluded that how the course tasks were designed was "pretty realistic and I like that. I wish more classes would get involved in that." Students also recognized the authenticity of working collaboratively in teams. While the majority of the students in the sample had frustrations with team dynamics, they realized that developing strategies for managing teamwork was a critical and authentic part of the problem-solving process.
As Beth expressed it,

There is going to be that one person in the room of twenty people that you just do not like…but, when it comes to the professional world, if you cannot get over that, you are not going to make it.

The interviewee added that collaboration has its benefits, reflecting that "some people are just better at computers than others or you may be better at writing than others or whatever. You have got to collaborate and work together somehow." Another student noted that depending on others when a grade is at stake was particularly difficult, stating, "When I have to rely on someone else for a grade and I know they are not doing what they are supposed to be doing, it bugs me." However, later in the interview, she noted that learning to work in a team would be vital to her future:

I know that no matter what there's going to be people that I just don't like working with and there will be people throughout my career and life. So, in one end I guess it is a good experience learning it now.
4.4. Principles 4 & 6: Design the task and the learning environment to reflect the complexity of the environment they should be able to function in at the end of learning and design the learning environment to support and challenge the learner's thinking

Managing work in teams was one way that students found themselves dealing with the complexity they should be able to function in at the end of learning. They also noted that this complexity challenged their thinking. Rhonda described working in teams as both challenging and suspenseful:

It was, well I liked the challenging aspect of it. It wasn't that it was hard. But the way he, like I said the way he did it where you have a client and tasks, it just made the suspense… and just made you really excited to work.

She added that although the environment challenged her, she felt supported by the professor:

He would always get back to us, most of the time. Professors, they don't have time to get back to you like that and (the professor) was a busy guy. I was surprised he was getting back to us as quick as he was <snaps fingers>. I was excited about that.

Beth noted that the complexity of the environment, coupled with support from peers and the professor, empowered her to grow comfortable with making errors and then improving:
I mean, just getting comfortable with… (g)etting comfortable with making mistakes. When you are comfortable with making mistakes and then going back and fixing, then you know you are getting there. You may not know how to do it yet, but you are no longer as scared of it.

In contrast, when asked whether they had played or enjoyed the AltRG, many confessed that they initially were too scared to play it, a fear stemming from inexperience with the various technologies that comprised the game. However, by the end of the semester, they noted that they regretted not having engaged further with the play elements. By this time, students stated that they had overcome their fear of new technologies and were intrigued by the mystery that the game presented, as well as the rewards that the game provided. One interviewee indicated, "I think just playing the game would have given me that much more knowledge of the computer. And it might have helped out with maybe some of the tasks." Beth noted a disconnect between the second tier of the game and her course grade:

If getting into the game had been part of your grade, I would have learned more. I would have made myself. I may have been tearing out my hair but… (t)his was my first semester back in college in twelve years.

This student added later that after becoming more familiar with the technology tools used throughout the course, she thought that playing the game "would have been fun" and wished she had risen to the challenge. After overcoming the challenging nature of the environment, Rob indicated a willingness to play with different tools as a result. Explaining that he "had never actually done anything like that [create a web page] before," he expressed some amazement that "it was pretty easy to figure out" after he just "messed around with it a little bit." These reflections point to growth in student willingness to explore, an increasing sense of empowerment, and the desire to seek additional opportunities for meaning-making, characteristics that are often associated with lifelong learners (Cropley & Knapper, 1983).

4.5. Principle 7: Encourage testing ideas against alternative views and alternative contexts

Will cited working in a team environment as a means of examining alternative views and critiquing his own. He found that "(h)aving other people's ideas, not just my own" and "just having other people there" were important to coming up with viable solutions to course tasks. He described the process of testing ideas and noted its value in the following passage:

We have got to design a spreadsheet. Okay, well I kind of know what I'm doing, but what do you think. Oh, that would be a great idea, we'll put that in there or that's a really crappy idea, we'll take that out. So, just having other people's opinions and ideas. Maybe they have a specific knowledge that I don't that adds some extra insight or I can add some extra insight and then of course maybe they are better at grammar than I am.

Rhonda stated that by the end of the semester, she was taking ownership of the learning process as she worked with her peers to develop a solution:

Especially, like with the last task, with the web pages. Is one person just going to come up with ideas? How we going do it? Because see, at first, I just started playing around with it… (a)nd just came up with some stuff. I'm like, 'Oh, wow. This looks nice.' And I showed it everybody and they added their input.
Overall, students noted that they benefited from the inclusion of opportunities to communicate and test their ideas while being exposed to alternative views, despite the challenge of group work.

4.6. Principle 8: Provide opportunity for and support reflection on both the content learned and the learning process

In the weblogs students kept throughout the course and the interviews conducted at the end of the semester, students were provided opportunities to reflect on what they learned and how they learned it. For example, after confessing that she had minimal experience using technology, Beth reported in her conversation with the interviewer that she learned to use technology more confidently:

I was very tentative about all kinds of technology. I mean, I don't listen to music, so I didn't have an iPod. I'd go to those sites and maybe play some Jewel Quest on my computer and then I'd turn it off. I mean that was the extent I was using my computer for and while I'm still doing a lot of that because that's what I enjoy, I now can go into more sites, I know there's more sites out there. I just found a few more things to do with my computer… overall I just feel a lot more comfortable with my computer than I did.

For others, the opportunity for reflection in class carried over into their personal lives. Rhonda noted that, as a result of the class, she had begun documenting reflections on her daily life:

Yes, I really, to be honest, I never blog. I'm like, 'what is a blog?' I've heard the new term, you know, used around. "Blog. What's a blog?" And now, I started one on my MySpace page… I really like that, you pretty much post all your feelings and cares!

She went on to say, reviewing her blogs from the beginning of the semester to the end, that

I was pretty efficient working with, you know, computers and things… but until I took this class, honestly, I didn't realize how much you use technology on a day-to-day… basis. Wow. You really need this for any job you do… it doesn't matter what field.

Through the content learned and the learning process, compelled by having clients and fellow team members with whom she had to work, she discovered "how important it really was checking your email. But I just realized, that if I was in a company, I'd have to email (and) do a spreadsheet for the budget." The opportunity for reflection provided the means for students to process what they were learning and reflect on how that learning transferred to their future lives and work.
5. Implications

As a result of this pilot implementation, we learned a number of lessons about designing and implementing a learning game built on PBL principles. These lessons relate to the problem-based and game narrative tiers, as well as the blend of these two sets of elements. In addition to reporting these lessons in this section, we also provide suggestions for instructional designers and instructors seeking to implement games for learning as a means of inquiry (Thomas & Brown, 2011).

In this pilot iteration of the course design, it may be argued that, due to lack of student participation in the second tier game component, we were not successful at tapping students' intrinsic motivations and engaging them in deep inquiry. Because it was an optional activity for the course, too few students engaged in the fantasy game tier, and they did so too late in the course, to evaluate whether the more explicit game components intrinsically motivated students to explore beyond the required first tier client-based activities. While we can attribute the learning gains in computer literacy skills, observed in posttest scores, to the first tier PBL tasks, the degree to which game play contributed to these gains is inconclusive. Those students who did play the game expressed an increased willingness to explore technology tools beyond the software applications targeted in the course. They also indicated an increased sense of resourcefulness, having gained confidence in their ability to seek answers to their questions from sources other than the instructor and the materials provided in the course. Those who did not play the AltRG noted that the rewards were insufficient and unclear when weighed against the amount of time it might take to work through the puzzles and ciphers in the game. The level of difficulty of game tasks was not commensurate with the rewards of additional resources that could be much more easily located with the most rudimentary Internet search skills.

5.1. Lessons for designers of learning games

In order to move the field forward, it is as important to recognize and learn from our mistakes as it is to report our successes, as was done by Ted Castronova with his Arden game (Baker, 2008) and by two of the authors of this paper with the Anytown game (Warren, Dondlinger, Stein, & Barab, 2009). From the results of this study, we note five lessons that arose from our use of game elements in the course. Some of these tensions may be surmountable by engaging in better analysis of the audience for which the games will be designed. Others emerged specifically because this game was designed from a social constructivist perspective and may not apply to games designed from other epistemic stances (Bernstein, 1983; Hollis, 1994).

1. A game may not be necessary. Market research showing that 72% of all households and 53% of 18–49 year-olds play video games (Entertainment Software Association, 2011) has enticed instructional designers seeking to engage students in content areas in which students tend to underperform, such as science and math. However, the research support for developing and using instructional games is still highly questionable (Hayes, 2005), particularly in post-secondary education. Consequently, designers' time may be unwisely spent creating learning games in situations where students need better scaffolding for the rigors of socially constructing knowledge and solving ill-structured problems.
However, from a design perspective, it may also be that The Door AltRG did not adequately connect both narrative tiers to the learning goals. The first tier of the game, which included explicit, problem-based interactions with clients, also included the game elements that Salen and Zimmerman (2004) described: artificial conflicts, an interactive rule set, and win outcomes. Although transitions to the second tier fantasy narrative were not seamless, interviewed students found the conflict, client specifications (the rule set), and success conditions in the PBL tier to be meaningful and authentic enough to engage and satisfy them. The additional game tier was not necessary to achieve those design goals. The overlap between PBL and game elements that we identified as the rationale for the additional narrative tier may actually have been too great to warrant that very tier, given that PBL shares with games the elements that best engage and motivate students/players. Learning game designers should note that well-designed PBL scenarios may be sufficient to meet their instructional goals and that a fantasy game narrative might not be necessary.

2. Some games do not feel like games. Our original plan was to develop a game in the vein of Quest Atlantis (QA) (Barab et al., 2009), but targeted at college-aged students' computer literacy practices rather than science learning. Further, we intended to use Second Life rather than ActiveWorlds, the building tool used by QA, in order to increase the graphical fidelity and game feel. However, due to budgetary constraints, we opted for the AltRG structure. While this game format likely fit better with the problem-based learning foundation upon which it was built, the AltRG was an emerging game genre not commonly encountered by most of the students who came into the course. When students were asked directly about the game components, interviewers often had to point them out, as students either did not recognize them as being part of a game or had simply not explored them. While the AltRG contained Salen and Zimmerman's (2004) three main game elements, its distributed nature did not mesh well with students' previous experience of video games in the first-person shooter, massively multi-player, and adventure game genres developed in stand-alone, 3-D virtual worlds. Even those students who reported liking to play games found that the structure of the AltRG did not provide the same immersive experience to which they were accustomed. Although the AltRG has become more popular than it was when we introduced it in this course, it appeals to a different set of sensibilities than other game genres. While the AltRG genre we chose for The Door was a viable solution to our budget and development restrictions, it did not give learners the game feel we sought. Designers should consider how different game genres appeal to different student segments prior to starting game creation.

3. Underprepared students. The very nature of a problem-based learning experience requires that students work together in small groups to socially construct knowledge. However, as we have noted, this created conflicts among students who were not prepared for group work, coming from a school culture that values individual learning over group knowledge construction. While students recognized that they would be required to work in groups during their future employment, teamwork was a major challenge for many students and the instructor.
Another challenge was that the tasks bridging the core first tier problem-based learning and second tier fantasy game aspects required abstract and critical thinking skills that many of our students did not bring with them from their prior learning experiences. The politics of accountability and mandated standardized testing have necessitated that the K-12 environments from which students come to college focus on concrete thinking using what Sfard (1998) called the "acquisition model" of learning. In this model, memorizing information is privileged over problem-solving, critical thinking, and creativity. A major thrust of the university's initiative, of which this course redesign was a part, was aimed at addressing this problem. However, because students were not prepared to
engage in the level of abstract thinking required by either the first or second game tiers, the instructor and designers had to focus on providing additional scaffolds in the first tier tasks, which were required components of the course. Providing additional scaffolds to encourage engagement with the second tier is left for later iterations of the course. Our challenge now is to consider whether our design sufficiently provided both the hard technological and soft instructor scaffolds (Brush & Saye, 2001) that students required in order to raise them to the level we sought. Learning game designers should consider the level of thinking required by their design and the degree to which students are prepared to achieve it.

4. Motivating play as a means of inquiry. As noted earlier, our intent in designing the game tier was to tap the intrinsic motivation to play: to explore, seek answers, try different approaches, and increase one's level of skill, the cognitive behaviors that learning theorists have observed gamers exhibit when playing video games for entertainment (Annetta, 2008; Gee, 2003). Nevertheless, several students objected to the game component in the first few weeks of the course. They reported to the instructor that they were nervous about the team-based structure of course activities, and they were especially fearful of playing the second tier of the game, being unaccustomed to some of the technology and unfamiliar with the AltRG genre. As a result, the instructor dropped the requirement that they play the second game tier. Instead, his focus shifted to providing additional soft scaffolding for the PBL tier tasks, including reassurance and guidance regarding resolving team conflicts. The fact that the instructor did not mandate playing the second tier ensured that some students would not. In interviews, students stated that they would have engaged in the fantasy game narrative if it had been required, and some even stated that they wished they had played it. Designers should consider how to balance support for student learning in the form of hard scaffolds and direct instruction with opportunities to take risks and try new approaches without worry of repercussions, such as a lower course grade or embarrassment about proficiency with the technology.

5. Too much game, too little reward. Following our analysis of the existing course design, we had merged or reduced the number of surface-level learning objectives, such as opening, closing, or saving a file, in order to reframe those objectives at an application level, such as creating a spreadsheet or PowerPoint presentation that addresses a specified set of needs. Although most students had been exposed to the software applications in the MS Office Suite, many needed to reacquaint themselves with software functions and the skills needed to execute them. In designing the course around authentic PBL tasks, we sought to create more meaningful activities and thought that, in groups, students would teach and learn these skills from each other using materials provided in the course, including, but not limited to, the computer-based instruction software used in the comparison course. What we had not considered in the analysis phase of this project was students' trepidation about working in teams: relying on others when grades were at stake and possibly fearing team members might perceive them as incompetent or lacking in skill. These fears were compounded by an unfamiliar game genre.
The shroud of mystery and intrigue designed into the game narrative to entice students into further inquiry had the opposite effect, elevating their nervousness and uncertainty. In light of these observations, we may well have designed too much game, or too difficult a game, with too little reward for playing. Removing game play as a course requirement alleviated the fear of failure associated with the game, but it made the rewards for playing, namely the discovery of additional resources to enhance solutions to PBL tasks or extra credit points, insufficient to entice many students to play. Possible strategies to address this include introducing the game later in the course, after students are more familiar with team tasks and technology skills, or making the game narrative and tasks less puzzling early in the course so that students gain confidence in their abilities before advancing to more complicated game challenges. Designers should be mindful of the qualms learners have about their own skills when weighing the level of challenge against the rewards for overcoming that challenge as they design and sequence learning game tasks.
5.2. Questions to consider prior to game development

When deciding whether or not to include game components in a course design, we recommend instructional designers consider the following questions during the analysis and design phases of the process. Because the tools (i.e., game engines, virtual world building tools, etc.) often determine what kinds of development can be done, we limit our recommendations to those tied to pre-development analysis and design. Once these questions are answered, development can commence.

5.2.1. Analysis phase

During the analysis phase of the process, designers of learning games should consider the following questions:
Do students appreciate learning games?
Are students prepared for the complexity of the game?
Are instructors prepared to teach with games?
What if students or instructors are not prepared?
If analysis indicates that students are not open to games or are underprepared for the self-direction and open inquiry required for successful game play, designers' efforts may be better spent developing the pedagogical scaffolds to foster these abilities in a meaningful PBL scenario that engages students in ways similar to learning games. While mystery or fantasy is alluring to some, when a course grade is at stake, the uncertainty of such elements can be an obstacle to learning playfully rather than the enticement it is intended to be. Just as designers must consider the preparedness of students, it is equally important to consider whether instructors are prepared to provide for students' needs. Although the research findings we report here do not address this issue, prior research (Warren, Dondlinger, Stein, et al., 2009) and efforts to scale up The Door since this pilot lead us to include this consideration. If instructors will require significant training to meet student needs effectively, a game might not be feasible.

5.2.2. Design phase

After addressing the questions pertaining to the analysis phase, designers should consider the following before designing their learning game:
Is the game genre appealing to your learner players?
Does the addition of game elements further your instructional goals?
Does the narrative context complement course content?
Is the level of difficulty appropriate to learner skills?
How will you motivate play?
Should you provide alternatives to game play?
When a game genre is unfamiliar or unappealing to learners, the motivational benefits commonly associated with game play are lost. Moreover, if the instructional goals can be met or exceeded through other means (in our case, the PBL components), then the game might be a waste of designers', players', and instructors' time. Whether a game will further the instructional goals, whether the narrative complements those goals, and whether the level of difficulty supports play are all crucial considerations. Designers should also consider whether to motivate play through mandate or instead provide students with options that match their personal preferences, ranging from direct instruction to open inquiry, with PBL and learning games falling somewhere between the opposite ends of this spectrum depending on how they are designed.

6. Conclusion

While the problem-based learning methods combined with game elements yielded mixed results, students appeared to gain many skills needed for the remainder of their university experience. The course also required students to engage in real-world skills they would need in their future work, including self-organization, solving ill-structured problems with indirect guidance, self-evaluation, and self-monitoring of their progress as they sought to meet the needs of clients and team members. Although the problem-based learning component was challenging for some students, they made clear linkages between the problems they were solving in the course and those they would have to solve in the future, finding them meaningful and relevant. Only a few students went beyond the PBL tasks and explored the game narrative that connected them. This was likely because playing the game was optional rather than a course requirement. While additional resources for solving the problem-based tasks were built into the reward structure of the game, students did not realize until late in the semester that these resources could enhance their problem solutions. After discovering this, students reported wishing that they had played the game.

Problems among students working in their groups were paramount and clearly had roots in the design of a course that required high levels of digital communication as a means of completing work, asking questions, coordinating group behaviors, planning, and producing. Teamwork challenges were further exacerbated by the size of some of the groups, which ranged from three to five members depending on student choice. Moreover, a lack of student experience with group management and problem-solving further complicated matters. While many students reported disliking working in groups to complete tasks, they did recognize its necessity in their future careers and acknowledged that excellent interpersonal communication skills are necessary for their future success. In the future, we plan to reduce the number of students in each group to make teams more manageable.

We also noted that the requirements of a social constructivist learning environment and its accompanying ill-structured tasks required students to be largely self-directed. Student reactions and work in the course indicate that many were not prepared to engage in self-directed learning. Their resistance to asking questions of the clients and seeking assistance from peers was a challenge for the instructor, who had to continually redirect students to these resources.
Although we sought to foster self-direction by organizing the course around a series of ill-structured problems and giving students ownership of the problem-solving process, we hypothesize that some students lacked what Schunk (2000) and Zimmerman (1990) have identified as self-regulated learning (SRL) skills: time and behavior management, self-evaluation, organizing, planning, information and assistance seeking, and record keeping. Future iterations of our research will seek to determine whether students have sufficient SRL skills to engage with this form of ill-structured game construct. In addition, we will seek to determine what forms of scaffolds are necessary to support those who are underprepared. In order to ensure sufficient time to address these skills and provide adequate instructor and designer scaffolding, we will likely reduce the amount of content in both tiers of the game. Further, we will experiment with different rewards for engaging with the second tier narrative to determine their impact on the student experience and whether development of such a narrative yields learning gains commensurate with the effort required to develop it.

Overall, the design continues to provide a wealth of data and results related to improving the student experience with experimental and innovative instructional methods. Although the impact of these efforts on retention, satisfaction, and other design goals has yet to be fully understood, the results of the research on this pilot of the course redesign leave the researchers hopeful that an AltRG in a hybrid course setting can be used to clearly frame problem-based learning tasks and that future iterations of the course will meet student needs both within the course and in their future work outside of the university.

Acknowledgements

This research and design work was supported by a University of North Texas Quality Enhancement Plan grant.

References

Ahern, T. C., & Repman, J. (1994). The effects of technology on online education. Research on Computing in Education, 26(4), 537–546.
Anderson, J. R., Reder, L. M., & Simon, H. A. (1996). Situated learning and education. Educational Researcher, 25(4), 5–11.
Annetta, L. (2008). Video games in education: why they should be used and how they are being used. Theory Into Practice, 27, 229–239.
Baker, C. (2008, March 24). Trying to design a truly entertaining game can defeat even a certified genius. Wired, 16.
Barab, S., Scott, B., Siyahhan, S., Goldstone, R., Ingram-Goble, A., Zuiker, S., et al. (2009). Transformational play as a curricular scaffold: using videogames to support science education. Journal of Science Education and Technology.
Bass, R. (1999). The scholarship of teaching and learning: what's the problem? Inventio, 1(1), 1–9.
Bernstein, R. J. (1983). Beyond objectivism and relativism: Science, hermeneutics, and praxis. Philadelphia: University of Pennsylvania Press.
Bogost, I. (2007, September 1). An instructional alternate reality game. Water Cooler Games. Retrieved March 9, 2008, from http://www.watercoolergames.org/archives/000842.shtml.
Bonk, C., Kirkley, J., Hara, N., & Dennen, V. (2001). Finding the instructor in post-secondary online learning: pedagogical, social, managerial and technological locations. In J. Stephenson (Ed.), Teaching and learning online: Pedagogies for new technologies (pp. 76–97). London: Kogan Page.
Borland, J. (2005, February 28). Blurring the line between games and life. CNET News.com. Retrieved February 19, 2008, from http://ecoustics-cnet.com.com/Blurring+the+line+between+games+and+life/2100-1024_3-5590956.html.
Borreson Caruso, J., & Salaway, G. (2007). The ECAR study of undergraduate students and information technology, 2007. Educause Center for Applied Research.
Bransford, J., Vye, N., Bateman, H., Brophy, S., & Roselli, B. (2003). Vanderbilt's AMIGO project: Knowledge of how people learn enters cyberspace. Retrieved 4/20/2004, from http://www.vanth.org/mmedia/vanth0103/vanth0103cd/papers/AmigoWFig.pdf.
Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42.
Brush, T., & Saye, J. (2001). The use of embedded scaffolds with hypermedia-supported student-centered learning. Journal of Educational Multimedia and Hypermedia, 10(4), 333–356.
Chickering, A., & Gamson, Z. (1987). Seven principles for good practice in undergraduate education. American Association for Higher Education (AAHE) Bulletin, 3–7.
Clark, R. E. (1994a). Media and method. Educational Technology Research & Development, 42(3), 7–10.
Clark, R. E. (1994b). Media will never influence learning. Educational Technology Research & Development, 42(2), 21–29.
Crawford, C. (2003). The art of interactive design. San Francisco, CA: No Starch Press.
Cropley, A., & Knapper, C. (1983). Higher education and the promotion of lifelong learning. Studies in Higher Education, 8, 15–21.
Dede, C., Ketelhut, D., & Ruess, K. (2006). Designing for motivation and usability in a museum-based multi-user virtual environment. Retrieved 3/11/2006, from http://www.gse.harvard.edu/~dedech/muvees/documents/AELppr.pdf.
Denzin, N., & Lincoln, Y. (2003). The discipline and practice of qualitative research (2nd ed.). Thousand Oaks, CA: Sage Publications.
DiPasquale, D., Mason, C., & Kolkhorst, F. (2003). Exercise in inquiry: critical thinking in an inquiry-based exercise physiology laboratory course. Journal of College Science Teaching, 32(6), 388–393.
Elder, L., & Paul, R. (2002). Critical thinking: teaching students how to study and learn (Part II). Journal of Developmental Education, 26(2), 34–35.
Entertainment Software Association. (2011). 2010 sales, demographic, and usage data: essential facts about the computer and video game industry. Washington, D.C.: Entertainment Software Association.
Ernst, N. (2007, June 1). World without oil in retrospect. Group Action. Retrieved July 5, 2008, from http://groupaction.blogspot.com/2007/06/worldwithout-oil-in-retrospect.html.
Frederickson, E., Pickett, A., Shea, P., Pelz, W., & Swan, K. (2002). Student satisfaction and perceived learning with on-line courses: principles and examples from the SUNY Learning Network. Journal of Asynchronous Learning Networks, 4(2).
Gall, M. D., Borg, W. R., & Gall, J. P. (1996). Educational research: An introduction (6th ed., Vol. I). White Plains, NY: Longman Publishers.
Ge, X., & Land, S. (2003). Scaffolding students' problem-solving processes in an ill-structured task using question prompts and peer interactions. Educational Technology Research & Development, 51(1), 21–38.
Gee, J. P. (2003). What video games have to teach us about learning and literacy. New York: Palgrave-Macmillan.
Hara, N., & Kling, R. (1999). Students' frustrations with a web-based distance education course. First Monday, 4(12).
Hassenplug, C. A., & Harnish, D. (1998). The nature and importance of interaction in distance education credit classes at technical institutes. Community College Journal of Research and Practice, 22(6), 591–606.
Hollis, M. (1994). The philosophy of social science: an introduction. Cambridge, UK: Cambridge University Press.
International Center for Media and the Public Agenda. (2010). 24 hours: Unplugged. College Park, MD: ICMPA at the University of Maryland.
Irons, L. R., Jung, D. J., & Keel, R. O. (2002). Interactivity in distance learning: the digital divide and student satisfaction. Educational Technology & Society, 5(3).
Jonassen, D. H. (1999). Designing constructivist learning environments. In C. M. Reigeluth (Ed.), Instructional design theories and models: A new paradigm of instructional theory, Vol. 2 (pp. 215–239). Mahwah, NJ: Lawrence Erlbaum Associates.
Jonassen, D. H., & Hernandez-Serrano, J. (2002). Case-based reasoning and instructional design: using stories to support problem solving. Educational Technology Research & Development, 50(2), 65–77.
Karpinski, A., & Duberstein, A. (2009). A description of Facebook use and academic performance among undergraduate and graduate students. Paper presented at the American Educational Research Association Annual Meeting.
Keller, G. S. (2002). Using problem-based and active learning in an interdisciplinary science course for non-science majors. The Journal of General Education, 51(4), 272–281.
Kozma, R. B. (1991). Learning with media. Review of Educational Research, 61(2), 179–211.
Kozma, R. B. (1994). Will media influence learning? Reframing the debate. Educational Technology Research & Development, 42(2), 7–19.
Kubey, R., Lavin, M., & Barrows, J. (2001). Internet use and collegiate academic performance decrements: early findings. Journal of Communication, 51(2), 366–382.
Kulik, C., & Kulik, J. (1991). Effectiveness of computer-based instruction: an updated analysis. Computers in Human Behavior, 7, 75–94.
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge, UK: Cambridge University Press.
Lin, X., Hmelo, C., & Kinzer, C. (1999). Designing technology to support reflection. Educational Technology Research & Development, 47(3), 43–62.
Martin, A., & Chatfield, T. (2006). Alternate reality games white paper - IGDA ARG SIG. Mt. Royal, New Jersey: International Game Developers Association.
Mason, R., & Weller, M. (2001). Factors affecting students' satisfaction on a web course. Ed at a Distance, 15(8).
Meyerson, P., & Adams, S. W. (2003). Designing a project-based learning experience for an introductory psychology course: A quasi-experiment. Chicago, IL: American Educational Research Association.
Rivera, J. C., & Rice, M. L. (2002). A comparison of student outcomes & satisfaction between traditional & web based course offerings. Online Journal of Distance Learning Administration, 5(3).
Robson, C. (2002). Real world research. Malden, MA: Blackwell Publishing.
Salen, K., & Zimmerman, E. (2004). Rules of play: Game design fundamentals. Cambridge, MA: MIT Press.
Savery, J. R., & Duffy, T. M. (1995). Problem based learning: An instructional model and its constructivist framework. Educational Technology, 35, 31–38.
Schoech, D. (2000). Teaching over the Internet: Results of one doctoral course. Research on Social Work Practice, 10, 467–487.
Schunk, D. H. (2000). Learning theories: an educational perspective (3rd ed.). Saddle River, NJ: Merrill.
Sfard, A. (1998). On two metaphors for learning and the dangers of choosing just one. Educational Researcher, 27(2), 4–13.
Spooner, F., Jordan, L., Algozzine, B., & Spooner, M. (1999). Student ratings of instruction in distance learning and on-campus classes. Journal of Educational Research, 92, 132–141.
Squire, K. (2006). From content to context: videogames as designed experience. Educational Researcher, 35(8), 19–29.
Strickland, E. (2007, July 10). Play peak oil before you live it. Salon.com. Retrieved March 8, 2008, from http://www.salon.com/tech/feature/2007/07/10/alternative_reality_games/.
Texas Education Agency. (2010). Texas essential knowledge and skills, technology applications middle school (§126.12). Austin, Texas: State Board of Education TEKS Review Committees, 1–20.
Thomson Course Technology. (2007). Building the bridge to better Microsoft Office instruction: Report on a survey (White paper). Boston, MA: Thomson Course Technology/Cengage.
Tiwari, A., & Lai, P. (2002). Promoting nursing students' critical thinking through problem-based learning. Paper presented at the 2002 Annual International Conference of the Higher Education Research and Development Society of Australasia (HERDSA), Perth, Australia.
Vanderbilt, Cognition and Technology Group at. (1993). Anchored instruction and situated cognition revisited. Educational Technology, 33(3), 52–70.
Warren, S. J. (in press). The impact of a Multi-user Virtual Environment (MUVE) on teacher instructional time, voluntary student writing practice, and student writing achievement. Unpublished dissertation. Bloomington, Indiana: Indiana University-Bloomington.
Warren, S. J. (2006b, April 7–11). Using digital learning environment scaffolds to improve student writing in a PBL-style instructional space. Paper presented at the American Educational Research Association Annual Meeting, San Francisco, CA.
Warren, S. J. (2009). The door codex (3rd ed., Vol. 1). Denton, TX: ThinkTankTwo@UNT.
Warren, S. J., & Dondlinger, M. (2008). Designing games for learning. In R. Ferdig (Ed.), Handbook of research on effective electronic gaming in education. Hershey, PA: Idea Group Reference/IGI Global.
Warren, S. J., & Dondlinger, M. J. (2009). Examining four games for learning: Research-based lessons learned from five years of learning game design and development. Paper presented at the Association for Educational Communications and Technology.
Warren, S. J., Dondlinger, M. J., & McLeod, J. (2008). Power, play and PBL in postsecondary learning: leveraging design models, emerging technologies, and game elements to transform large group instruction. Paper presented at the American Educational Research Association Annual Meeting.
Warren, S. J., Dondlinger, M., Stein, R., & Barab, S. (2009). Educational game as supplemental learning tool: benefits, challenges, and tensions arising from use in an elementary school classroom. Journal of Interactive Learning Research, 20(4), 487–505.
Warren, S. J., Stein, R., Dondlinger, M., & Barab, S. (2009). A look inside a design process: Blending instructional design and game principles to target writing skills. Journal of Educational Computing Research, 40(3), 295–301.
Wernet, S. P., Olliges, R. H., & Delicath, T. A. (2000). Postcourse evaluations of WebCT (Web Course Tools) classes by social work students. Research on Social Work Practice, 10(4), 18.
Willis, A. S. (2002). Problem-based learning in a general psychology course. The Journal of General Education, 51(4), 281–292.
Zimmerman, B. J. (1990). Self-regulated learning and academic achievement: an overview. Educational Psychologist, 25, 3–17.