The Coaching Cycle: A Coaching-by-Gaming Approach in Serious Games
Simulation & Gaming 43(5) 648–672 © 2012 SAGE Publications Reprints and permission: sagepub.com/journalsPermissions.nav DOI: 10.1177/1046878112439442 http://sag.sagepub.com
Anna-Sofia Alklind Taylor1, Per Backlund1, and Lars Niklasson1
Abstract

Military organizations have a long history of using simulations, role-play, and games for training. This also encompasses good practices concerning how instructors utilize games and gaming behavior. Unfortunately, the work of instructors is rarely described explicitly in research relating to serious gaming. Decision makers also tend to be overconfident about the pedagogical power of games and simulations, particularly where the instructor is taken out of the gaming loop. The authors propose a framework, the coaching cycle, that focuses on the roles of instructors, including instructors acting as game players, and argue that learning improves further when instructors take a more active part in all training activities. The coaching cycle integrates theories of experiential learning (where action precedes theory) and deliberate practice (where the trainee’s skill is constantly challenged by a coach). Incorporating a coaching-by-gaming perspective complicates, but also strengthens, the player-centered design approach to game development, because two different types of players must be taken into account: trainees and instructors. Furthermore, the authors argue that the coaching cycle allows for a shift of focus to a more thorough debriefing, because it implies that learning of theoretical material before simulation/game playing is kept to a minimum. This shift will increase the transfer of knowledge.

Keywords

after action review, coaching by gaming, coaching cycle, debriefing, deliberate practice, experiential learning, formative feedback, game-based training, instructor roles, player-centered, puckstering, serious games, summative feedback, teacher player, teacher roles
1 University of Skövde, Sweden
Corresponding Author: Anna-Sofia Alklind Taylor, School of Humanities and Informatics, University of Skövde, P.O. Box 408, SE-541 28 Skövde, Sweden Email: anna-sofia.alklind.taylor@his.se
Downloaded from sag.sagepub.com at UQ Library on October 23, 2016
Simulation/games are rapidly gaining ground within sectors such as learning, training, rehabilitation, and marketing, just to mention a few (Ritterfeld, Cody, & Vorderer, 2009). In vocational training, for example, military and rescue training, simulation/games have been successfully accepted as a low-cost and efficient complement to real-world exercises (Smith, 2006; Thompson, Carroll, & Deaton, 2009). When discussing game-based learning and training, researchers most often focus on the effects of games on the learner. Only a few consider how the introduction of games in a curriculum will affect teaching (Dargue, Smith, Morse, & Frank, 2006; Liu & Wang, 2006; Pivec, 2009; Ruben, 1999; Sandford, Ulicsak, Facer, & Rudd, 2006). We see a trend toward self-directed training using games, because customers want a low-cost solution—preferably Internet based, as that would ease the cost of maintaining central facilities (Dargue et al., 2006; Smith, 2006; Wood, Douglas, & Haugen, 2002). This has serious implications for the instructors’ roles before, during, and after training. For instance, Liu and Wang (2006) found that instructors had difficulty in knowing how to participate in the learning situation during the actual game playing. Others (e.g., de Penning, Kappé, & Boot, 2009; Salas, Rosen, Held, & Weissmuller, 2009; Zachary, Bilazarian, Burns, & Cannon-Bowers, 1997) are working toward systems that automatically measure the performance of the trainees, give appropriate feedback, and collect information to be used for debriefing afterward. As a result, instructors’ tasks are mainly preparing the session, introducing the simulation/game, and conducting debriefing afterward. During the gaming event, however, they mainly observe or, as in the case of distance education, assess data from the simulation (Dargue et al., 2006).
One concern, voiced by instructors, is the loss of control over what trainees actually learn during self-directed training (Linderoth, 2010; Sandford et al., 2006; Thorpe, 2010). However, as we argue, instructors can influence trainees during training through what we call coaching by gaming. By this we mean that the instructor gives the trainee challenges, feedback, and directions by becoming a game player alongside the trainees. This is closely related to role-playing (Crookall, Oxford, & Saunders, 1987), that is, when participants enact, for instance, a crisis situation and instructors take on the roles of commanders, coworkers, civilians, and so on. When brought into the virtual world of a digital game, this entails several new demands, not only on the instructor, but also on the system and the interaction methods used. Examples include improved artificial intelligence (AI) and improved teaching methods. Optimizing the training effect within constraints such as time and costs is a challenge in itself. However, when we include factors such as instructor expertise and level of experience, even more complexity is added, because the system and interaction methods must ease the cognitive workload of the instructor (Salas et al., 2009). The problem then becomes a question of maintaining a high degree of training quality without increasing costs (e.g., instructor salaries and system development costs). This proposition also has pedagogical consequences pertaining to the knowledge-action gap, described by Crookall and Thorngate (2009), specifically in how and when theoretical knowledge should be taught. We agree with the view that action should precede theory, and that successful learning lies in the time and effort put into the actual training
followed by a thorough debriefing. By shifting time spent on theoretical material from initial lectures to debriefing, training effectiveness can be increased without spending more time on training activities. Another important concept is deliberate practice. Deliberate practice is critical for developing superior levels of performance (expertise), that is, skills beyond the merely “acceptable” (Ericsson, 2006). Superior skills are necessary in high-risk professions such as rescue and military service. By coaching instead of instructing, the instructor makes it possible for the trainees to enter a state of deliberate practice, that is, practicing in a specific and purposeful way, constantly stretching themselves beyond their comfort zone, and persistently striving for improvement (Ericsson, 2006). It is not just about putting in hours of practice; it requires concentration, reflection, and useful feedback about one’s performance. To achieve this, the expert-to-be needs a human coach or mentor to guide and push him or her in the right direction (Bransford & Schwartz, 2009; Ericsson, 2006; Ericsson, Prietula, & Cokely, 2007; Gee, 2007). Relating back to game-based training, these tasks could obviously be performed by the game itself, at least for the training of simple skills. For more complex proficiencies, however, such as leadership, decision-making, and communication skills, a human coach would be more adaptive and also more cost-effective than a computer AI (Prensky, 2001; Thorpe, 2010). Even though military organizations have a long tradition of using digital games for training, their practices, from an instructor’s point of view, are rarely explicitly described in literature aimed at a wider game-based learning audience. Therefore, we propose a coaching-by-gaming framework that we term the “coaching cycle.” This entails that the instructor will take on multiple roles.
The coaching cycle integrates common practices from game-based military and rescue training with theories from research on deliberate practice (e.g., Ericsson, 2006), game-based learning (e.g., Garris, Ahlers, & Driskell, 2002), experiential learning (e.g., Kolb, 1984), and debriefing (e.g., Lederman, 1992). It focuses attention on two major phases that we claim are essential for effective game-based training in vocational education: coaching during simulation/gaming and debriefing.
Research Approach This research aims to make explicit the implicit assumptions about instructors’ roles during game-based vocational training. Our goal is not only to formalize good practices, but also to take a critical look at training practices in the light of existing theories in game-based learning and related areas. We believe that the result can benefit the general serious games community in that we view serious gaming not only from the learners’ perspective, but also from the instructors’. We base our empirical findings on two case studies done in collaboration with the Swedish Land Warfare Centre (SLWC) and the Swedish Civil Contingencies Agency (SCCA). Both organizations have an important role in educating professionals who are going to perform tasks that are considered highly complex and dangerous (i.e., future
soldiers and firefighters, respectively). SCCA is currently moving toward flexible distance training and e-learning, and both organizations are forced to undergo expenditure cuts. They use simulations and games to a certain degree in their curricula (SLWC more so than SCCA). However, it is incorrect to view this primarily as a rationalization, because the pedagogical reasons for using simulation/games are sound. Empirical data have been collected through participant and nonparticipant observations during training occasions where simulations/games are used. By participant observation, we mean that the observer also, where possible, participated in the exercise, for example, by taking the place of one of the gunners during virtual tank crew training. Participant observation allows the observer to get first-hand experience of the event, which in turn leads to a deeper understanding of the training situation (Agar, 2008). Observations were also complemented with informal interviews with training personnel, mainly instructors and system operators, to get more information about training procedures and attitudes toward game-based training, and to confirm conclusions drawn from the aforementioned observations (Lincoln & Guba, 1985). A total of 11 hours and 42 minutes of video material was analyzed using the Transana 2 software for transcribing video clips and sound recordings. Not all recordings were transcribed, due to time constraints and to the fact that not all recorded material included speech, or included speech that was of immediate interest for this particular study. Interesting scenes and utterances concerning aspects related to training phases and instructor roles were sorted into categories.
The emerging themes for instructor roles were as follows:

• Lecturer, that is, when the instructor is mainly involved in one-way communication, standing in front of the trainees in a traditional classroom setting.
• Observer, that is, when the instructor observes trainee performance, either directly (peeking over the trainee’s shoulder) or indirectly (looking at screens showing, for instance, simulation data, a 2D or 3D map of the virtual environment, or a first-person view of one of the trainees).
• Scenario author, that is, when the instructor and/or system operator creates content related to the specific training session, for example, writing the mission description, creating the map in which the simulation will take place, and placing entities such as buildings, vehicles, avatars, and NPCs (nonplayable characters). Scenario authoring can take place before or during training (or in between training sessions).
• In-game player or puckster, that is, when the instructor or system operator controls an avatar within the game.
• Live role-player, that is, when the instructor plays a role (e.g., a commanding officer) in the physical environment.
• Technical support, that is, when the instructor or system operator solves a technical problem (e.g., handling bugs appearing during gameplay) that does not require help from technical staff.
• Debriefer, that is, when the instructor gives summative feedback and/or discusses individual or group performances and alternative solutions.
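To summarize the taxonomy, the seven roles can be encoded as a small data model. The sketch below is purely illustrative (the type and field names are ours, not part of the training systems studied), but it captures the observation, elaborated later in the article, that one instructor may pass through several roles within a single session:

```python
from enum import Enum, auto

class InstructorRole(Enum):
    """Illustrative encoding of the seven instructor roles in the observation data."""
    LECTURER = auto()
    OBSERVER = auto()
    SCENARIO_AUTHOR = auto()
    IN_GAME_PLAYER = auto()   # the "puckster" role
    LIVE_ROLE_PLAYER = auto()
    TECHNICAL_SUPPORT = auto()
    DEBRIEFER = auto()

# Because instructors switch seamlessly between roles, a session is best
# logged as a sequence of (time, role) events rather than a single assignment.
session_log = [
    ("09:00", InstructorRole.LECTURER),
    ("10:15", InstructorRole.OBSERVER),
    ("10:40", InstructorRole.IN_GAME_PLAYER),
    ("11:30", InstructorRole.DEBRIEFER),
]
roles_taken = {role for _, role in session_log}
```

The event-log representation mirrors how the video material was coded: scenes were sorted by training phase and by which role the instructor occupied at that moment.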
The primary theoretical sources for this research have been literature within the areas of deliberate practice, pedagogy, and serious games. These findings, coupled with empirical data from our case studies, have been integrated into a model for game-based training.
Simulation and Game-Based Training Skill training is one of the most common uses of simulation/games (also called serious games). While simulations and role-play are widely used as part of a sound training curriculum, games are slowly gaining acceptance as serious training tools among instructors and trainees (Proctor, Lucario, & Wiley, 2008; Smith, 2007). It is relevant here to distinguish between games and simulators. Granted, some games are indeed simulators with game features, but the term simulator may be slightly misleading, as not all games are simulators. In fact, we find significant differences between simulators and games. First, although game elements can be integrated into a simulator, simulators in themselves do not have gaming properties, such as fantasy, rules/goals, sensory stimuli, challenge, mystery, and control (Garris et al., 2002). Second, simulations are seen as representations of (parts of) the real world, while games are systems in their own right (Crookall et al., 1987). In addition, games often contain aspects of fantasy and novelty that go against the rules of the real world (e.g., being able to fly or walk through walls), and these aspects are considered crucial for the “fun factor” of games, as already pointed out by Garris et al. (2002). The SLWC and the SCCA both use simulators as well as games in their training, although simulator-based training is more widely used. Games have only been used as part of the curricula for a short period of time (a few years). We believe that the motivational power of games makes them well suited for this type of training, and the fact that games are being increasingly adopted into training curricula is cause to further investigate the characteristics of game-based training.
Vocational Skill Training: Some Concrete Examples This section primarily serves to provide an organizational context for the proposed framework. The following description of training sessions is based on observations made at the SLWC and the SCCA. Please note that the description is a simplification and an aggregation of the actual training taking place at these two organizations. All training follows curricula stating what the trainees are supposed to learn (e.g., battle training for the commander position in a tank). The actual training consists of theoretical material (e.g., books and lectures), live practice sessions (e.g., driving an actual tank or practicing smoke diving in a physical building), and sessions of simulation training (e.g., battle simulation in a tank simulator or smoke diving in a virtual building). Typically, much of the theoretical part of the education has to be completed before entering the live practice and simulation sessions. Simulation training usually takes place at a central facility consisting of hardware, software, technical
Figure 1. The main stages in simulator training as found in the empirical data
staff, and system operators. In some cases, trainees are allowed to take home a PC version of the software (e.g., VIRTUAL BATTLESPACE 2 VIRTUAL TRAINING KIT [VBS2: VTK], 2008) to practice at home. Figure 1 is a simplified depiction of the main stages in simulator training at the SLWC. Before a training session, the system operator receives specifications for the session (e.g., training objective and what scenario to create). These specifications are usually not very detailed, as the training objective can be renegotiated anytime. Thus, the training scenario must allow for a certain degree of improvisation and flexibility in terms of learning goals and challenges. Usually, the system operator simply creates a map and a 3D environment, while entities such as vehicles and enemy avatars are added later, once training has commenced. This practice of creating half-finished scenarios is enabled through an effective interaction design in which creating new entities in the gaming system is a quick and easy task to perform. Most training sessions start with a lesson in a classroom setting. The trainees are assumed to have read some theoretical material beforehand and thus have knowledge about what they are about to practice. The lesson usually takes about 1 to 3 hours, depending on trainees’ adeptness and familiarity with the particular training setting.
During this time, instructors not only give a theoretical lesson, but also bring up practicalities surrounding the facility (including safety issues). Great care is taken to ensure that the trainees understand the game mechanics (e.g., that they recognize the properties of different terrain) and how the simulation differs from reality. The lesson ends with a briefing on the practice scenario and the objective (e.g., protect a caravan of supply trucks from hostile vehicles). During the simulation, the instructors monitor the progress and may also do some role-playing where necessary. At the SLWC, it is also common to involve one or several system operators, who take care of resource management, placing out enemy entities, and so on. This has the advantage that the instructors can focus on assessment rather than micromanagement of the simulation itself. However, if the training session is not considered too complex, some instructors choose to take on the system operator role themselves. This also depends on whether the instructor has had technical training on how to use the system. At the SCCA, normally only one instructor is present during the sessions, and, thus, she or he has to do all tasks concurrently. As a rule, the training is made up of segments, making each simulation brief, but intense. After certain goals have been reached (or if the trainee has failed to reach the objective), the instructor gives some brief feedback on the performance. Then the simulation is resumed, either by playing the same scenario again or by giving the trainees a new task to perform. Often, the trainees are asked to assess their own performance in smaller groups before getting feedback from the instructor. The training session normally ends with a debriefing (called after action review, AAR, in military contexts) back in the classroom, where some of the performance is replayed (e.g., through recordings of the simulation).
Trainees are asked to explain their behavior and discuss how they could have acted differently. In our experience, the debriefing is not as in-depth as described and recommended by, for example, Lederman (1992). Even though most simulation and game systems used by these organizations include evaluation or AAR functionality, these are not used to their full extent. The main reasons given for this are difficulties in getting an overview of all information logged during training and the effort of preparing the debriefing being too great in relation to the usefulness of the output (compared with just showing a few snapshots of noteworthy events). Usually, the instructors only have a few minutes to prepare the debriefing. One of the training centers at the SCCA tested a slightly different approach during a project about game-based training (Backlund et al., 2009). The general idea is to let the trainees experience some of the strain of the task (breathing apparatus entry) in a simulated environment before entering the classroom for a theoretical lesson that explains basic components and tactics. The lesson is followed by live exercises with reduced complexity (i.e., full turnout gear with covered eyes, but no fire). At this stage, the students are expected to know the task in some detail, hence being ready for volume training in the simulator. The training task is then finalized with live exercises using full turnout gear and real fire. By introducing a component of experiential learning at an early stage of the training program, it was hypothesized that motivation
would rise and that the students would grasp the theories presented in a better way. Indeed, evaluations showed a slight advantage for the group that had a simulator session instead of a theoretical lesson preceding its first live exercise (Backlund et al., 2009). The process developed during the project did not utilize the full potential of game-based training, as it offered only reduced debriefing in conjunction with the simulator sessions and no instructor feedback during the actual sessions, that is, important aspects of the coaching cycle proposed here. However, the simulator training sessions featured distinct game elements such as challenges and scoring. Furthermore, the scoring system was based on the training task, and the system provided feedback on performance. Evaluations showed that the game component served as a motivator for the trainees. One difference between the SLWC and the SCCA is the number of instructors involved in the training session. Within the SLWC, at least two instructors (one teacher and one system operator) are present during the simulation. This means that tasks can be divided between them, for example, by having the system operator act as a “puckster,” that is, controlling a group of AI entities (Colonna-Romano et al., 2009) while the teacher observes trainees’ performance. At the SCCA, however, usually only one instructor is available (for cost-efficiency reasons) to both run the simulation and assess trainee performance at the same time. Organizations such as these also have established lessons-learned processes. A lessons-learned process involves, in short, the acquisition and analysis of experiences from, for example, sessions such as the one described previously. The analysis is described in a lessons-identified report that is then evaluated, implemented, and followed up, resulting in a lessons-learned report (Johnny Gullstrand, major at the SLWC, personal communication, November 28, 2008).
The Power of Game-Based Training So why would game-based training be more powerful than mere simulator training? One argument is that games have the potential to provide a situated learning environment, not only in the physical sense (e.g., moving around in a virtual world), but also emotionally (Gee, 2009). In addition, the input-process-outcome game model presented by Garris et al. (2002) provides an explanation of why games in particular are good as learning tools and environments. According to the model, an instructional system imbued with game characteristics will “trigger a cycle that includes user judgements or reactions such as enjoyment or interest, user behaviors such as greater persistence or time on task, and further system feedback” (p. 445). Thus, a positive effect of this game cycle is greater motivation and persistence in the learner, who spends more time and effort on the task. If we achieve a good balance between instructional content and game characteristics, the result is better learning. The effect is strengthened by the fact that the game cycle is iterative, in contrast to a single-trial learning approach (Garris et al., 2002). Moreover, the input-process-outcome game model calls attention to the fact that learning happens during training and through action, a view anchored in
experiential learning approaches (e.g., Kolb, 1984; Ruben, 1999). This is especially apparent for skill training, but just as important in other types of learning as well (Thatcher, 1990). It is important to state that game-based training does not imply easy or light training (Bente & Breuer, 2009); rather, it is a method to deliver challenges that strain the trainees beyond their current skill level (Gee, 2007). This is closely related to the concept of deliberate practice put forth by Anders Ericsson and his colleagues. According to Ericsson (2006), deliberate practice “presents performers with tasks that are initially outside their current realm of reliable performance, yet can be mastered within hours of practice by concentrating on critical aspects and by gradually refining performance through repetitions after feedback” (p. 694). Key requirements of deliberate practice are demanding (but not impossible) tasks, specific goals for improvement, continuous feedback, and opportunities for repetition. These requirements make simulations and games especially suitable for this kind of activity (Ericsson et al., 2007). Moreover, a good instructor is also important for deliberate practice, to guide and push the learner in the right direction (Bransford & Schwartz, 2009; Ericsson, 2006; Ericsson et al., 2007). Becoming an expert involves learning to monitor oneself and avoiding plateaus in order to continue improving. Here, an instructor, or rather a coach, plays a pivotal role alongside books, videos, school curricula, and computer programs (Bransford & Schwartz, 2009). In our view, the difference between an instructor and a coach is that a coach takes a more active part in the actual training without dictating the “right” way to do things (Bolton, 1999). Thus, coaching complements in-game feedback with advice that steers the trainees toward developing expertise in their profession (as opposed to expertise in winning the game).
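The iterative game cycle of Garris et al. (2002), combined with the deliberate-practice requirement that tasks sit just beyond current reliable performance, can be summarized as a simple loop. The sketch below is our own toy illustration; the one-dimensional skill score and the coach's feedback rule are assumptions for demonstration, not part of any cited model:

```python
def game_cycle(trainee_skill, rounds, coach_feedback):
    """Toy model of the iterative game cycle under deliberate practice.

    Each round, the challenge is set just beyond the trainee's current
    reliable performance; feedback after each attempt gradually refines
    that performance. Skill is a single integer for illustration only.
    """
    history = []
    for _ in range(rounds):
        challenge = trainee_skill + 1               # just beyond current skill
        performance = min(trainee_skill, challenge)  # limited by current skill
        feedback = coach_feedback(performance, challenge)
        trainee_skill += feedback                    # refinement after feedback
        history.append((challenge, performance, trainee_skill))
    return trainee_skill, history

# A coach who rewards closing the gap to the current challenge:
final_skill, history = game_cycle(
    trainee_skill=10,
    rounds=5,
    coach_feedback=lambda perf, goal: 1 if goal - perf <= 1 else 0,
)
```

The point of the loop is the contrast with single-trial learning: because the challenge is recomputed from the trainee's updated skill every round, the trainee is kept continuously outside the comfort zone rather than facing one fixed task.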
One key to successful game-based coaching is appropriate methods of assessment or proof of learning (Bente & Breuer, 2009). Two major types of assessment are summative assessment (measurement after training has occurred) and formative assessment (measurement during training). Summative assessment usually occurs before and/or during debriefing and is essential for grading or similar forms of evaluation. However, according to Bransford and Schwartz (2009), such feedback often comes too late for learning to be optimal. By incorporating formative assessment into the training, trainees can benefit by using the feedback as a learning experience in its own right (Bente & Breuer, 2009; Iuppa & Borst, 2007; Shute, Ventura, Bauer, & Zapata-Rivera, 2009). Formative assessment and feedback are especially useful for the development of metacognitive skills, for example, analyzing and evaluating decision-making processes (Raybourn, 2007). From our experiences at the SLWC and the SCCA, formative assessment is not used extensively, and where it is used, it is applied mainly in an ad hoc manner. Too few instructors are available to monitor all trainees at all times, and the training systems used are crude when it comes to automatically assessing aspects such as communication skills, collaboration, and other social interactions. Summative and formative assessments serve different purposes (Boston, 2002), and both are, thus, useful in game-based training. Furthermore, formative assessment is not only a help for learners, but it can also be beneficial for instructors, as it serves as a guide toward effective coaching (Bransford & Schwartz, 2009). Bente and Breuer
(2009) claim that (serious) games are ideal for formative assessment and in-game feedback, including psychophysiological measurements. These measures enhance metacognitive abilities that improve learning:

Feedback about stress level and learning progress can help the players to identify problematic situations and to evaluate their own performance properly. This can endorse self-assessment in learners, which is a valuable way to improve meta-cognitive skills or even lead to a more realistic self-construal. (Bente & Breuer, 2009, p. 336)

This does not mean, however, that game feedback makes instructors superfluous—only that their professional role, for example, as motivators, is altered and facilitated by game mechanics. Furthermore, a well-known, and often debated, problem in vocational training is that of transfer of knowledge from one situation (training) to another (workplace). As put forth by Billett (1998), knowledge is situated, that is, it is influenced by the circumstances and social factors surrounding the training. To enhance transfer, knowledge not only needs to be embedded in those sociocultural practices where that knowledge will be used in the future, but also disembedded so that associations between similar practices can be made more effectively (Billett, 1998). A simulation/game has the potential to include such sociocultural practices, if designed well. Apart from providing access to authentic activities, enhancing transfer also involves guided learning by experts and promoting reflection through problem solving and critical thinking (Billett, 1998). Debriefing, if done in depth, supports such reflective thinking and is essential for transfer (Crookall & Thorngate, 2009; Lederman, 1992). As described in the previous section, simulation/game training is often preceded by a time period where theoretical material is taught, usually in a classroom setting.
It is, however, possible to minimize this phase, and doing so would actually be beneficial for learning. Crookall and Thorngate (2009) convincingly argue that action precedes knowledge, not the other way around, and that this view affects how simulation/games are designed, used, and debriefed:

If we believe that the primary “transformation” is knowledge into action, then we are likely first to teach content and then run a simulation/game to demonstrate how students can apply that knowledge to some practical situation, followed by a light verification debriefing. On the other hand, if we see the essential thrust of knowledge as flowing from action, then we are more likely to plunge students into a simulation with little pre-teaching, and then ask them to relate their experience to real-world situations in a variety of debriefing tasks—talk, essays, research. (p. 18)

Since we are talking about (practical) skill training with adults, the latter view is more useful and, thus, preteaching can be kept to a minimum, primarily presenting
how to operate the game and the goal of the exercise. A thorough debriefing is especially important in game-based training, because games can elicit unexpected and unwanted behaviors and beliefs that can cause damage to the individuals involved if not dealt with appropriately (Jones, 1998).
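In code terms, the formative/summative distinction discussed in this section amounts to when feedback is delivered relative to the training loop: during play, or deferred to the debriefing. The minimal sketch below is our own illustration, with all names and events invented for the example:

```python
from dataclasses import dataclass, field

@dataclass
class AssessmentLog:
    """Toy illustration: formative observations are fed back immediately
    during training; summative observations are held for the debriefing."""
    formative: list = field(default_factory=list)
    summative: list = field(default_factory=list)

    def observe(self, event, during_training):
        if during_training:
            self.formative.append(event)
            return f"feedback now: {event}"  # delivered immediately to the trainee
        self.summative.append(event)         # deferred until the debriefing
        return None

    def debrief(self):
        # Summative material is presented only at the after action review.
        return list(self.summative)

log = AssessmentLog()
immediate = log.observe("hesitated at doorway", during_training=True)
log.observe("overall task time: 7 min", during_training=False)
```

Separating the two logs also reflects the observation above that both kinds of assessment are useful but serve different purposes: the formative log guides coaching in the moment, while the summative log feeds the debriefing.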
The Many Roles of the Instructor in Game-Based Training As Ruben (1999) points out, the instructor’s role is often overlooked in the literature on game-based learning and training. Most practitioners and researchers agree that pedagogical issues make instructors an essential ingredient even in the most adaptable and complex simulation/game, but their role will have to change (Bente & Breuer, 2009; Prevou & McGurn, 2010). Prensky (2001) lays out a number of “new” roles for instructors in game-based learning, namely, as motivators, content structurers (integrators/reformulators), debriefers, tutors (individualizers, steerers, selectors, adjusters, guides, facilitators), and producers/designers. Some of these are more descriptions of tasks than actual roles, but they nevertheless make the point that instructors will not be left twiddling their thumbs in boredom anytime soon. Most prevalent in the literature is the transformation from instructor to the roles of facilitator and coach, that is, a change from merely relaying instructional content to guiding the learners toward self-reflection (Prevou & McGurn, 2010). Introducing content in real time can, according to Raybourn (2007), influence trainees’ actions and help “the instructor create opportunities for adaptive thinking and the demonstration in communication or leadership skills as the situation dynamically changes and becomes more stressful” (p. 208). As mentioned earlier, all of these tasks or roles can either be distributed among several instructors or taken on by one individual. Fine lines separate the different roles, and we argue that an instructor switches seamlessly between them. Another interesting perspective is the one suggested by Angela Brennecke in her doctoral thesis (Brennecke, 2009): to view not only the learner or trainee as a player, but also the instructor.
This view of the instructor as player has several advantages. First, it stresses the point of not only satisfying the needs of the trainees when designing a simulation/game, but also considering issues of usability, accessibility, and gameplay from the instructor’s perspective. This brings a new dimension to the term “player-centered design” (Brennecke, 2009). Second, instructor performance and buy-in/acceptance (Alexander, Brunyé, Sidman, & Weil, 2005; Chesney, 2006) of game-based training systems would most likely increase. By feeling involved in the gameplay, the instructor would more easily relate to the student situation and not get the (wrong) impression that the simulation/game takes over his or her role as instructor (Brennecke, 2009; Prensky, 2001). According to Brennecke, the instructor can prepare the training session by playing a backstory, that is, the introductory part of the game that gives the player a background for the story and characters in the simulation/game. This is referred to as the Authoring Component. In a case study, Brennecke and Schumann (2009) tested this by implementing a game-based system for crime scene investigation (CSI) training. The
Downloaded from sag.sagepub.com at UQ Library on October 23, 2016
659
Alklind Taylor et al.
idea was to have a cop-and-robber scenario; the instructor would prepare the training session by playing the game as the antagonist, robbing a virtual apartment of valuable things and leaving clues such as foot- and fingerprints. The students were then given the task of investigating the crime scene and trying to “outsmart” the instructor. The system also included a reviewing component that would be used for the debriefing session afterward. We find this notion of instructor as player an interesting one, worth investigating further. Although instructors are known to “co-play” in live role-playing exercises, the concept has not been widely adopted in general practices involving digital serious games. As we have seen in our case studies, however, instructors do, in fact, play with their students, but this practice is not fully documented and seems to have evolved from instructors’ previous experiences rather than from systematic studies (Rubel, 2006). Thus, it is not far-fetched to extend the framework to incorporate the notion of instructor as player into other parts of the training, resulting in a coaching-by-playing perspective. We will discuss this in more detail in the next section. Another important role that we need to address is the role of debriefer. As mentioned earlier, debriefing is essential for transfer and should be an integral part of simulation or game-based training. Through a guided discussion, the debriefer has to make sure that the trainees make relevant reflections about their experiences from simulation/gameplay, leading to a deeper understanding of the situation and the tasks performed (Lederman, 1992).
Coaching Cycle

We have, during our observations at the SLWC and at the SCCA, identified and formalized training practices involving games. This work has recognized good practices that should be preserved and, by extension, leveraged to other game-based training domains. Furthermore, we have also identified a few areas in which improvement is needed (such as minimizing the time spent on initial lessons and allowing for more thorough debriefing). From these observations, as well as the theoretical findings described in the earlier sections, we have derived a coaching cycle model that characterizes game-based training from the instructors’ perspective. The cycle is similar to the macrocycle described by Klabbers (2008) and the guidelines for professional military education described by Prevou and McGurn (2010). The coaching cycle consists of three phases: scenario preparation, gameplay, and debriefing. We have also added a link to lessons-learned processes (which include lessons identified), because these will both affect and be affected by the individual training sessions. Figure 2 depicts the coaching cycle, and we will now describe each phase in more detail.
Scenario Preparation

It is in the preparation phase that the instructor prepares the training session in light of previous sessions, the trainees’ skill level, the purpose of the exercise, and so on
Simulation & Gaming 43(5)
Figure 2. The coaching cycle
(Klabbers, 2008). The most prominent part of this phase, at least from a simulation/game perspective, is the creation of a training scenario (Prevou & McGurn, 2010), which for warfare and rescue training consists of tasks such as selecting maps, setting triggers, and placing enemy units (or hazardous items, fire, etc.), civilian units, and so on. Another important part is deciding how to manage and organize the briefing, so that participants have enough understanding to be able to run the simulation/game (Prevou & McGurn, 2010). The role of the instructor is one of content structurer and even producer/designer, to use the terms suggested by Prensky (2001). At the SLWC, these tasks are usually carried out by the system operator. The preparation phase is closely related to Brennecke’s Authoring Component (described in the previous section), especially if one adopts the idea of the instructor as game player, playing the backstory. Then, engagement becomes as important as usability. By engagement, we mean that the system should have characteristics that elicit reactions such as flow (Csíkszentmihályi, 1990), immersion, and presence (Alexander et al., 2005). To facilitate the instructor’s tasks during this phase, the system needs to be adaptable in terms of different challenges and difficulty levels related to different trainee groups or, in some cases, individuals. Similarly, as mentioned in the section about vocational skill training, training at the SLWC and the SCCA has aspects of improvisation and fluctuating learning goals. For instance, a learning goal might be
renegotiated between instructors in the middle of a training session. Thus, without flexibility, a training system quickly becomes ineffectual. Usability is another important factor, because not all instructors are experts in games and simulation systems. The simulator/game should support the instructor in carrying out the necessary tasks in an effective, efficient, and straightforward manner.
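To illustrate the kind of adaptability we have in mind, a scenario definition might parameterize difficulty per trainee group roughly as follows. This is a minimal sketch under our own assumptions; the format, field names, and values are invented for illustration and are not those of any system at the SLWC or the SCCA:

```python
# Hypothetical scenario definition: the same map and backstory are reused,
# while triggers and opposing units scale with the trainee group's skill level.
SCENARIO = {
    "map": "urban_block_3",
    "backstory": "patrol_ambush",
    "difficulty": {
        "novice":   {"enemy_units": 2, "civilian_units": 5,
                     "triggers": ["ambush"]},
        "advanced": {"enemy_units": 6, "civilian_units": 12,
                     "triggers": ["ambush", "ied", "crowd"]},
    },
}

def prepare_session(scenario, group_level):
    """Instantiate the scenario for a given trainee group's skill level."""
    settings = scenario["difficulty"][group_level]
    return {"map": scenario["map"], **settings}
```

Keeping the difficulty parameters separate from the map and backstory is one way a system could let instructors renegotiate learning goals mid-course without authoring a whole new scenario.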
Gameplay

The activities of the instructor during simulation or play constitute the phase most often ignored by researchers. However, even the most convinced advocates of “instructor-less training” admit that instructors need to be present during training, at least to make sure that the trainees are interpreting the game appropriately (Chatham, 2009; Salas et al., 2009). Other relevant activities can be summarized as progress monitoring, in which the instructor follows the trainees’ progress during play, takes notes for the debriefing session, and gives continuous feedback on training aspects that can be immediately taken into account by the trainee(s). In some cases, the instructor may also be involved in role-play, playing a superior, teammate, civilian, or enemy. The coaching-by-gaming phase relates to the instructor role that Prensky (2001) would call tutor, individualizer, steerer, selector, adjuster, guide, or facilitator, or what Bransford and Schwartz (2009) call coach.

Coaching during play. Feedback is the main source of guidance in game-based training (Salas et al., 2009). From a deliberate practice perspective, it is important for the instructor to give continuous feedback during training to put more pressure on the trainees and guide them in the right direction. Only then will the training be effective (Bransford & Schwartz, 2009). In contrast, Klabbers (2008) claims that facilitators should intervene as little as possible while the game is running. In our view, however, if all is left to system feedback, there is a risk that trainees focus on those skills that they are already fairly good at, and do not practice enough on the skills that need improvement. Some of the “damage” might be undone during debriefing, but then it is again up to the trainee to make sure that he or she trains in a slightly different manner in the next training session (assuming that one is provided).
It is also more difficult to unlearn a pattern of incorrect behaviors, such as decision making, once training has reinforced that pattern (Iuppa & Borst, 2007). Iuppa and Borst (2007) suggest an “instructor-in-the-loop approach” that “may allow for minute interactions with users to keep them on-track, and prevent them from continual repetition of incorrect behaviors” (p. 138). For instance, the instructor might give direct feedback about improper radio communication during the training instead of waiting until debriefing. That way, the trainees can change their behavior immediately. We see a few drawbacks to direct feedback, however. The instructor has to tread a fine line between pushing the trainees too little and too much. Too much strain will eventually lead to loss of concentration, which trainees can only sustain for short periods of time (Chatham, 2009). Also, to avoid an undesirable dependency on the instructor, the trainees should be allowed a gradual
increase in independence as they progress toward advanced skill levels (Iuppa & Borst, 2007). Furthermore, direct feedback risks interrupting the trainees, resulting in loss of immersion. That is why we suggest a “coaching-by-playing” approach, that is, giving feedback within the game context. For example, the instructor might deliberately make the task harder for those who use a wrong method to achieve the goal, or change the availability of resources to promote cooperative behavior (Raybourn, 2007). In one of our observations at the SLWC, one of the teachers noted that a group of soldiers neglected to search the buildings in their area. He then went to the system operator and asked him to take out that entire group. The system operator, now acting as a puckster, created an enemy avatar in one of the virtual buildings and started to shoot at the soldiers’ avatars. This way, the trainees effectively got the message while still immersed in gameplay. By using a puckster, instead of the system’s AI, the instructors also make sure that the trainees are not doing what Gee (2009) calls “gaming the system,” that is, learning how to exploit the underlying rules of the game instead of the methods that will also be useful in real situations. This is especially crucial when training novices (Frank, 2012). One of the system operators at the SLWC justified his puckstering with the fact that the game AI has a different agenda than he does. As an instructor, he has a pedagogical goal when controlling an enemy avatar, a goal that goes beyond winning the game: those trainees who use methods that only work in-game (and not in reality) should be “punished” by being shot down, while those who perform well should be ignored and, thus, have a larger chance of winning the game.
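The pedagogical agenda described by the system operator can be caricatured as a simple decision rule. The sketch below is purely illustrative; all names, flags, and thresholds are our own assumptions, not part of any SLWC system:

```python
# Caricature of a "puckster" policy: unlike a game AI that simply tries to
# win, the instructor-controlled enemy reacts only to behavior that
# conflicts with the learning goal.

def puckster_should_engage(trainee_behavior):
    """Decide whether the instructor-controlled enemy engages a group.

    `trainee_behavior` is a hypothetical dict of observed behaviors; only
    behaviors flagged as exploiting the game (rather than using sound
    real-world methods) trigger an in-game "punishment". Trainees who
    perform well are deliberately left alone.
    """
    exploits = {"skipped_building_search", "gaming_the_system"}
    return any(trainee_behavior.get(flag, False) for flag in exploits)

def game_ai_should_engage(enemy_in_range):
    """A game-winning AI, by contrast, engages whenever it can."""
    return enemy_in_range
```

The point of the contrast is that the two agents share a body (an enemy avatar) but not an agenda: one optimizes for winning, the other for learning.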
As can be seen in Figure 2, the input-process-outcome game model (Garris et al., 2002) has been incorporated in the coaching cycle: the cyclic process of system feedback, user judgment, and user behavior. In our model, we have also added instructor feedback as a component, to put emphasis on the active role of the instructor. Note, however, that the amount of feedback given has to be weighed against the learning goals and the complexity and flexibility of the system and the feedback given therein. If the learning goal is fairly simple and the system is highly adaptable, the instructor can focus mostly on monitoring the progress. If the trainees have to acquire complex skills that the system is not programmed to react to, the instructor has to take a more active part in the training.

Assessment during play. Assessment includes measuring and evaluating the performance of the trainees, individually or in groups, and is essential to be able to give appropriate feedback. Assessment can be done by the instructor, by peers, by the training system, or by any combination of these (Raybourn, 2007). For instance, Raybourn (2009) presents a game-based system for training Special Forces in intercultural competence in which trainees switch between in-game role-playing and being observers/evaluators (alongside instructors). She found that the role of observer/evaluator accelerated reflective learning in that the observers “internalized the concepts and new vocabulary more quickly than others” (Raybourn, 2009, p. 608). For complex behaviors that are difficult or straining to assess by a human instructor, an adaptive system based on AI or human behavior modeling might be preferred (Salas et al., 2009). The instructor can
then spend more time focusing on other tasks such as role-playing, technical support, and so on. However, creating a program that can assess whether a behavior is adaptive and creative might be more expensive and cumbersome than letting the instructors do at least some of the assessment. As expressed by Prensky (2001): While “adaptive” learning games will become more “tutorlike” in the future, sensing more and more of the player’s situation from his or her responses and adjusting accordingly, a human is still best at seeing exactly why a learner may be having difficulty. (p. 352) Salas et al. (2009) claim that the expertise of the instructor is key to successful training, because the task of assessing complex behavior easily becomes overwhelming. Here, a good design can ease some of these tasks, even if the game does not do the actual tutoring. However, being an expert does not make you a good coach (Bransford & Schwartz, 2009). The instructor also needs to reflect upon his or her tasks and to grow into the role of coach or tutor. Bransford and Schwartz (2009) call this bidirectional learning, that is, “learners learn from teachers and teachers learn from learners” (p. 436). Considering all the information that the instructor has to keep track of and act upon, we see a great need for a system that facilitates these tasks and reduces the instructor’s cognitive workload. As previously mentioned, formative assessment can be used to guide the trainee while providing information on the system state and player performance. However, to provide correct and relevant feedback, the variables to be logged have to be carefully selected. This means that they should connect to the training task. Backlund, Engström, Johannesson, and Lebram (2010) operationalized traffic safety variables in a game-based training simulator that made it possible to monitor progress and provide feedback that directed the trainee toward the learning goal.
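The point about selecting only log variables that connect to the training task can be made concrete with a small sketch. The variable names, targets, and tolerances below are invented for illustration (loosely in the spirit of a traffic-safety scenario) and are not taken from Backlund et al. (2010):

```python
# Only variables linked to a learning goal are scored; everything else the
# simulator happens to log is deliberately ignored.
LEARNING_GOAL_VARIABLES = {
    # variable -> (target, tolerance); all values hypothetical
    "speed_kmh": (50.0, 5.0),   # hold roughly the posted speed
    "headway_s": (3.0, 1.0),    # keep a safe following distance
}

def formative_score(sample):
    """Return, per selected variable, whether the trainee is within tolerance.

    `sample` is a hypothetical dict of logged values for one time step;
    unselected variables (e.g. engine rpm) are not scored at all.
    """
    deviations = {}
    for var, (target, tol) in LEARNING_GOAL_VARIABLES.items():
        if var in sample:
            deviations[var] = abs(sample[var] - target) <= tol
    return deviations
```

Such per-variable results could then drive in-game feedback (or stay silent, in the stealth-assessment spirit discussed below) rather than confront the trainee with raw log data.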
Many situations exist in which measurable progress is of great assistance to the trainee, and that type of feedback is prominent in many games. It is also important that the methods used to assess trainee performance are as unobtrusive as possible, because they will otherwise disrupt the trainee’s engagement in the game. This is also known as stealth assessment (Shute et al., 2009). Yet another aspect of system feedback relates to self-efficacy, that is, the idea that a person’s belief in success or failure will actually have an impact on the outcome (Bandura, 1997). Backlund, Engström, Johannesson, Lebram, and Sjödén (2008) developed feedback based on Bandura’s theories of self-efficacy. They found some interesting practical implications pertaining to the design of simulation/games. We see a distinct advantage in designing for self-efficacy, even if it does not result in improved performance at the time of testing. For example, better self-efficacy is likely to raise motivation and hence prolong engagement. Self-efficacy has consequences not only for the performance of the particular task, but also for what activities the trainee chooses to engage in and the persistence invested in them. In other words, a reasonably high level of self-efficacy also
encourages the person to keep on learning and to undertake again the tasks that were not accomplished the first time.
Debriefing

Debriefing, or after action review (AAR), is an essential part of any experience-based learning situation (Lederman, 1992), as it allows the participants to reflect upon and generalize from the training experience (Crookall et al., 1987; Thatcher, 1990). As expressed by Garris et al. (2002): “Debriefing provides a link between what is represented in the simulation/gaming experience and the real world” (p. 454). This involves both reviewing the gamed process and making sense of it to create meaningful knowledge (Klabbers, 2008). For instance, participants can discuss alternative solutions that were not acted out, but that are just as reasonable (Raybourn, 2007). For these reasons, it is important that debriefing occurs as soon as possible after the end of the training session (Johnson & Gonzalez, 2008). Debriefing can take many forms, the simplest being a discussion between the instructor(s) and the trainees. However, as Greenaway (2007) points out, a purely verbal discussion is not the optimal way to debrief. He makes a distinction between effective and dynamic debriefing, where dynamic debriefing “is more than a lively discussion. When a debriefing is truly dynamic, each person is fully engaged in the learning process and has some influence over its direction” (p. 61). Visual aids and other media play a central role as tools for communication in dynamic debriefing, because some things may be difficult to express with words and/or some participants may not be equally reflective and articulate (Greenaway, 2007). Logs of trainee activity and diagnostic tools are also important, especially when participants fail to recall or have different opinions about what happened during a specific event (Johnson & Gonzalez, 2008).
Because the instructor’s tasks during training are quite demanding, it is necessary to take some time between the actual simulation/gaming and the debriefing to reflect upon and review the performance of individuals, and in some cases teams. The system should facilitate summative assessment by having built-in summary functions (tables, charts, etc.) of logged data that enable the instructor to get an overview of the training session and to compare and triangulate data from a specific portion of the session (see, for example, Backlund et al., 2010). In our experience, simulation systems often log an overwhelming amount of data, and, still, the instructor ends up using only a fraction of it. Careful analyses should be performed to determine which data are most (and least) important for different types of training contexts. By doing so and making sure that the “right” data are grouped and attended to, the decision-making process is made easier and more accurate. As training scenarios change over time and new learning goals are set, a flexible system should also allow the instructor to create new ways to view different groups of data. Since the coaching cycle does not include a theoretical knowledge attainment phase, debriefing becomes pivotal to the learning outcome of the training. It is
therefore important to make sure that enough time is spent on this phase and that the instructor is aware of its importance as well as the pitfalls to avoid. Here, the system can contain checklists, exercises, topics for discussion, and so on, to help the instructor to plan and carry out the debriefing session. If possible, the system could include support for automated debriefing, but, as Johnson and Gonzalez (2008) point out, intelligent systems cannot completely replace the instructor.
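The kind of built-in summary function we argue for, one that groups logged events so the debriefer gets an overview without wading through raw logs, might look roughly like the following. The event format is an assumption made purely for illustration:

```python
from collections import defaultdict

def aar_summary(event_log):
    """Group raw simulation events into a per-trainee overview table.

    `event_log` is assumed to be a list of (timestamp, trainee, event_type)
    tuples; the summary counts event types per trainee so the debriefer can
    spot, for example, who never reported over the radio.
    """
    table = defaultdict(lambda: defaultdict(int))
    for _timestamp, trainee, event_type in event_log:
        table[trainee][event_type] += 1
    # Convert to plain dicts for easy display in tables or charts.
    return {trainee: dict(counts) for trainee, counts in table.items()}
```

A table of this shape is also a natural starting point for the comparison and triangulation of data from a specific portion of a session, simply by filtering the log on the timestamp before summarizing.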
Lessons-Learned Process

An important part of vocational training organizations is to implement a “lessons-learned process,” of which training is a fundamental component (Johnny Gullstrand, major at the SLWC, personal communication, November 28, 2008). Here, the individual training sessions are viewed from a wider perspective and generalized from. Individual performance is of minor interest, and the instructor is more concerned with questions such as “Was the training effective in relation to the curricula?” Another concern is previous trainees, now professionals, and how they perform in real-world situations. The lessons-learned process is actually a set of several processes working in parallel, as vocational training often entails several different types of training methods (not just game-based training). Each method consists of several training sessions, intended to train different learning objectives during the course of a longer education. Lessons-learned processes work in two directions: on the one hand, they receive input from recently completed training sessions (and information from organizations employing professionals, for example, the army or fire departments); on the other, they serve as input to future sessions. We have only touched upon the complexity of the lessons-learned processes. It is an area that should be explored further, especially because it has been largely ignored by researchers within the field of game-based training. Adopting game-based training will inevitably have consequences for the overall organization, including the lessons-learned process, and it is important to show what those consequences are.
Discussion

In this article, we have presented a framework for game-based (vocational) training that we term the coaching cycle and that incorporates theories of experiential learning, deliberate practice, and a coaching-by-gaming perspective. In the previous sections, we have shown how the coaching cycle fits with theories of learning, expertise, and gaming, as well as current training practices at the SLWC and SCCA. Even though the coaching cycle closely resembles how training is carried out today, it points out some areas in which game-based training could be improved. Mainly, we have brought attention to the roles of the instructors during game-based training, roles that are either overlooked by serious games researchers or implicitly assumed by practitioners. The framework has several implications for the development of future game-based training systems:
•• It has a clear focus on the instructors (a focus that is rarely seen in simulation/game research today) and the different roles that they take on before, during, and after training. This does not, however, imply that a game should be developed with only the instructors in mind. It is the learning objectives that should guide the development team, together with making the learning experience as engaging and challenging as possible (as argued in the section on simulation and game-based training). The point is to incorporate the instructor into that equation too, making the teaching experience just as engaging and challenging as the learning experience is for the trainees, but with a different set of tasks to perform. In our case studies, we have seen that the instructors exhibit gaming behavior, such as improvisation, role-playing, “puckstering,” and giving unexpected commands. Although this is common within military contexts and in live role-playing exercises, it is rarely explicitly described in instructions for developers of digital serious games. We do think, however, that other training contexts could benefit from incorporating the coaching cycle into their curricula.
•• The coaching cycle takes a stance in favor of the experiential learning paradigm, where action precedes theoretical knowledge. Therefore, only minimal theoretical instruction should occur before the actual gameplay. Instead, debriefing should be extended to incorporate the theoretical content that was previously taught beforehand. Dynamic debriefing then becomes pivotal to learning and transfer. To some degree, the SLWC has already incorporated this in their training practice, but instructors are seldom given enough time to prepare a structured, in-depth, and dynamic debriefing session.
•• Given the differing contexts that the many roles of the instructor bring about, facilitating the tasks for each phase is crucial, as shown in the section on simulation and game-based training.
Central to this question is how performance can be measured and diagnosed automatically, and how these data can be visualized in a useful way to provide nonintrusive system support for both the instructors and the trainees. Today, a lot of data are logged, but only a small portion is actually used during and after training. Hence, a central question is how we can use these data in a more efficient and effective way. Furthermore, automating some of the instructors’ tasks would lessen the strain on them, but also put high demands on the system AI. Consequently, a trade-off must be made between designing a fully automated system and designing a system that retains the flexibility of a human coach, aided by less complicated assessment functionality.
Deliberate Practice

Another idea, which has only recently begun gaining attention from the simulation/game community, is the notion of deliberate practice. Deliberate practice is essential
for the development of expertise, and it has two major themes that are relevant to the coaching cycle:
•• constant challenges in those areas where the expert-to-be needs to improve and
•• the importance of a professional coach, who identifies the trainee’s weak spots and gives feedback accordingly
That challenges increase the effort put into training sessions is well known in games research, but we still see a drive toward “instructor-less training” (Chatham, 2009), a drive that ignores the coaching part. Of course, to some extent, the game AI could act as a coach, but for certain learning objectives, that kind of AI is still far from as versatile and flexible as a human coach. Maybe in the distant future, we will see training systems that completely exclude an instructor or coach, but until then, we might do well to focus on a framework that takes a human coach into consideration. To do this, serious games designers and practitioners must take a critical look at the whole context in which gaming takes place. It is our firm belief that instructors still have an important part to play there, but serious game developers need to add functionality that facilitates coaching to reduce instructors’ workload and, as a consequence, reduce the costs related to employing a large number of instructors. Thus, a main goal for researchers and developers is to design a system that enables one instructor to coach a larger number of trainees without loss in learning outcomes. Here, we see that a system where instructors work in tandem with the kind of peer-to-peer feedback suggested by Raybourn (2009) is a promising approach, alongside further investigations into automatic assessment by AI agents. This article presents an empirically and theoretically motivated framework, the coaching cycle. Future work includes implementing it in an organizational setting. Thus, the next step will be to further test and validate the framework in an in-depth case study.
The aim of our future work is to derive requirements for a more effective game-based training system, in which in-game AI helps human instructors to efficiently coach by gaming. Future work will also include further investigations of the lessons-learned processes and how they can contribute to increased quality of simulation and game-based training.

Acknowledgments

The authors wish to thank our initially blind, then coaching reviewers Rosemary Garris, David Kolb, and Elaine Raybourn for valuable comments on earlier drafts of this article. They would also like to thank the trainees, instructors, and other staff at the Swedish Land Warfare Centre and at the Swedish Civil Contingencies Agency who directly or indirectly participated in this study.
Declaration of Conflicting Interests

The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding

This research was partially sponsored by a research fund from the Swedish Civil Contingencies Agency.
References

Agar, M. H. (2008). The professional stranger: An informal introduction to ethnography (2nd ed.). Bingley, UK: Emerald.
Alexander, A. L., Brunyé, T., Sidman, J., & Weil, S. A. (2005, November). From gaming to training: A review of studies on fidelity, immersion, presence, and buy-in and their effects on transfer in PC-based simulations and games. Paper presented at the DARWARS Training Impact Group. Retrieved from http://www.aptima.com/publications/2005_Alexander_Brunye_Sidman_Weil.pdf
Backlund, P., Engström, H., Gustavsson, M., Johannesson, M., Lebram, M., & Sjörs, E. (2009). SIDH: A game-based architecture for a training simulator. International Journal of Computer Games Technology. doi:10.1155/2009/472672
Backlund, P., Engström, H., Johannesson, M., & Lebram, M. (2010). Games for traffic education: An experimental study of a game-based driving simulator. Simulation & Gaming: An International Journal, 41, 145-169.
Backlund, P., Engström, H., Johannesson, M., Lebram, M., & Sjödén, B. (2008, July 9-11). Designing for self-efficacy in a game based simulator: An experimental study and its implications for serious games design. In Proceedings of the International Conference Visualisation (Viz08), IEEE (pp. 106-113). London, UK.
Bandura, A. (1997). Self-efficacy: The exercise of control. New York, NY: W. H. Freeman.
Bente, G., & Breuer, J. (2009). Making the implicit explicit: Embedded measurement in serious games. In U. Ritterfeld, M. Cody, & P. Vorderer (Eds.), Serious games: Mechanisms and effects (pp. 322-343). New York, NY: Routledge.
Billett, S. (1998). Transfer and social practice. Australian and New Zealand Journal of Vocational Education Research, 6(1), 1-25.
Bolton, M. K. (1999). The role of coaching in student teams: A “just-in-time” approach to learning. Journal of Management Education, 23, 233-250.
Boston, C. (2002). The concept of formative assessment. Practical Assessment, Research & Evaluation, 8(9).
Retrieved from http://PAREonline.net/getvn.asp?v=8&n=9
Bransford, J. D., & Schwartz, D. L. (2009). It takes expertise to make expertise: Some thoughts about why and how and reflections on the themes in chapters 15-18. In K. A. Ericsson (Ed.), Development of professional expertise: Toward measurement of expert performance and design of optimal learning environments (pp. 432-448). New York, NY: Cambridge University Press.
Brennecke, A. (2009). A general framework for digital game-based training systems (Unpublished doctoral thesis). University of Rostock, Rostock. Retrieved from http://rosdok.uni-rostock.de/file/rosdok_derivate_000000003979/Dissertation_Brennecke_2009.pdf
Brennecke, A., & Schumann, H. (2009, June 17-23). A general framework for digital game-based training systems. In Proceedings of IADIS Game and Entertainment Technologies (GET 2009) (pp. 51-58), Algarve, Portugal.
Chatham, R. E. (2009). Toward a second training revolution: Promise and pitfalls of digital experiential learning. In K. A. Ericsson (Ed.), Development of professional expertise: Toward measurement of expert performance and design of optimal learning environments (pp. 215-246). New York, NY: Cambridge University Press.
Chesney, T. (2006). An acceptance model for useful and fun information systems. Human Technology, 2, 225-235.
Colonna-Romano, J., Stacy, W., Weston, M., Roberts, T., Becker, M., Fox, S., . . . Paull, G. (2009, March 31-April 2). Virtual Puckster: Behavior generation for army small team training and mission rehearsal. In Proceedings of the 18th Conference on Behavior Representation in Modeling and Simulation, Curran Associates (pp. 153-154), Sundance, UT.
Crookall, D., Oxford, R., & Saunders, D. (1987). Towards a reconceptualization of simulation: From representation to reality. Simulation/Games for Learning, 17, 147-171.
Crookall, D., & Thorngate, W. (2009). Acting, knowing, learning, simulating, gaming [Editorial]. Simulation & Gaming: An International Journal, 40, 8-26.
Csíkszentmihályi, M. (1990). Flow: The psychology of optimal experience. New York, NY: Harper Perennial.
Dargue, B. W., Smith, B., Morse, K. L., & Frank, G. (2006, October 5-6). Interfacing simulations with training content. In Meeting Proceedings of the NATO RTO Modelling and Simulation Conference, NATO (pp. 1-14), Neuilly-sur-Seine, France. Available from http://www.rto.nato.int/Pubs/rdp.asp?RDP=RTO-MP-MSG-045
de Penning, L., Kappé, B., & Boot, E. (2009, November 30-December 3). Automated performance assessment and adaptive training for training simulators with SimSCORM. In Proceedings of the Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC), NTSA (pp. 1-7), Orlando, FL.
Ericsson, K. A. (2006). The influence of experience and deliberate practice on the development of superior expert performance. In K. A. Ericsson, N. Charness, P. J. Feltovich, & R. R. Hoffman (Eds.), The Cambridge handbook of expertise and expert performance (pp. 685-705). Cambridge, UK: Cambridge University Press.
Ericsson, K. A., Prietula, M. J., & Cokely, E. T. (2007, July-August). The making of an expert. Harvard Business Review, pp. 115-121.
Frank, A. (2012). Gaming the game: A study of the gamer mode in educational wargaming. Simulation & Gaming: An International Journal, 43(1), 118-132.
Garris, R., Ahlers, R., & Driskell, J. E. (2002). Games, motivation, and learning: A research and practice model. Simulation & Gaming: An International Journal, 33, 441-467.
Gee, J. P. (2007). What video games have to teach us about learning and literacy. New York, NY: Palgrave Macmillan.
Gee, J. P. (2009). Deep learning properties of good digital games: How far can they go? In U. Ritterfeld, M. Cody, & P. Vorderer (Eds.), Serious games: Mechanisms and effects (pp. 67-82). New York, NY: Routledge.
Greenaway, R. (2007). Dynamic debriefing. In M. Silberman (Ed.), The handbook of experiential learning (pp. 59-80). San Francisco, CA: Pfeiffer.
Iuppa, N., & Borst, T. (2007). Story and simulations for serious games: Tales from the trenches. New York, NY: Focal Press.
Downloaded from sag.sagepub.com at UQ Library on October 23, 2016
670
Simulation & Gaming 43(5)
Johnson, C., & Gonzalez, A. J. (2008). Automated after action review: State-of-the-art review and trends. Journal of Defense Modeling and Simulation: Applications, Methodology, Technology, 5, 108-121.
Jones, K. (1998). Hidden damage to facilitators and participants. Simulation & Gaming: An International Journal, 29, 165-172.
Klabbers, J. H. G. (2008). The magic circle: Principles of gaming and simulation (2nd ed.). Rotterdam, Netherlands: Sense Publishers.
Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice Hall.
Lederman, L. C. (1992). Debriefing: Toward a systematic assessment of theory and practice. Simulation & Gaming: An International Journal, 23, 145-160.
Lincoln, Y., & Guba, E. G. (1985). Naturalistic inquiry. Newbury Park, CA: SAGE.
Linderoth, J. (2010, August). Why gamers don't learn more: An ecological approach to games as learning environments. Paper presented at the Nordic DiGRA, Stockholm, Sweden.
Liu, J., & Wang, L. (2006). A teacher's tool in game-based learning system: Study and implementation. In Z. Pan, R. Aylett, H. Diener, X. Jin, S. Göbel, & L. Li (Eds.), Technologies for e-learning and digital entertainment (Vol. 3942, pp. 1340-1347). Berlin, Germany: Springer.
Pivec, P. (2009). Game-based learning or game-based teaching? (Report No. 1509). British Educational Communications and Technology Agency (BECTA). Retrieved May 21, 2011, from http://webarchive.nationalarchives.gov.uk/20101102103654/emergingtechnologies.becta.org.uk/index.php?section=etr&rid=14692
Prensky, M. (2001). Digital game-based learning. St. Paul, MN: Paragon House.
Prevou, M., & McGurn, L. (2010, November 29-December 2). Strategies for designing 21st century military education. In Proceedings of the Interservice/Industry Training, Simulation & Education Conference (I/ITSEC), NTSA (pp. 1-11), Orlando, FL.
Proctor, M. D., Lucario, T., & Wiley, C. (2008). Are officers more reticent of games for serious training than enlisted soldiers? Journal of Defense Modeling and Simulation: Applications, Methodology, Technology, 5, 179-196.
Raybourn, E. M. (2007). Applying simulation experience design methods to creating serious game-based adaptive training systems. Interacting With Computers, 19, 206-214.
Raybourn, E. M. (2009). Intercultural competence game that fosters metacognitive agility and reflection. In A. Ozok & P. Zaphiris (Eds.), Online communities and social computing (Lecture Notes in Computer Science, Vol. 5621, pp. 603-612). Berlin, Germany: Springer-Verlag.
Ritterfeld, U., Cody, M., & Vorderer, P. (2009). Introduction. In U. Ritterfeld, M. Cody, & P. Vorderer (Eds.), Serious games: Mechanisms and effects (pp. 3-9). New York, NY: Routledge.
Rubel, R. C. (2006). The epistemology of war gaming. Naval War College Review, 59, 108-128.
Ruben, B. D. (1999). Simulations, games, and experience-based learning: The quest for a new paradigm for teaching and learning. Simulation & Gaming: An International Journal, 30, 498-505.
Alklind Taylor et al.
Salas, E., Rosen, M. A., Held, J. D., & Weissmuller, J. J. (2009). Performance measurement in simulation-based training: A review and best practices. Simulation & Gaming: An International Journal, 40, 328-376.
Sandford, R., Ulicsak, M., Facer, K., & Rudd, T. (2006). Teaching with games: Using commercial off-the-shelf computer games in formal education (Project report). Bristol, UK: Futurelab. Retrieved from http://www2.futurelab.org.uk/resources/documents/project_reports/teaching_with_games/TWG_report.pdf
Shute, V. J., Ventura, M., Bauer, M., & Zapata-Rivera, D. (2009). Melding the power of serious games and embedded assessment to monitor and foster learning: Flow and grow. In U. Ritterfeld, M. Cody, & P. Vorderer (Eds.), Serious games: Mechanisms and effects (pp. 295-321). New York, NY: Routledge.
Smith, R. (2006). Technology disruption in the simulation industry. Journal of Defense Modeling and Simulation: Applications, Methodology, Technology, 3, 3-10.
Smith, R. (2007). The disruptive potential of game technologies: Lessons learned from its impact on the military simulation industry. Research Technology Management, 50, 57-64.
Thatcher, D. C. (1990). Promoting learning through games and simulations. Simulation & Gaming: An International Journal, 21, 262-273.
Thompson, T. N., Carroll, M. B., & Deaton, J. E. (2009). Justification for use of simulation. In D. A. Vincenzi, J. A. Wise, M. Mouloua, & P. A. Hancock (Eds.), Human factors in simulation and training (pp. 39-48). Boca Raton, FL: CRC Press.
Thorpe, J. (2010, November 29-December 2). Trends in modeling, simulation, & gaming: Personal observations about the past thirty years and speculation about the next ten. In Proceedings of the Interservice/Industry Training, Simulation & Education Conference (I/ITSEC), NTSA (pp. 1-53), Orlando, FL.
VIRTUAL BATTLESPACE 2 VIRTUAL TRAINING KIT. (2008). [Developed by Bohemia Interactive.] Nelson Bay, New South Wales, Australia: Bohemia Interactive.
Wood, W., Douglas, D., & Haugen, S. (2002, October 2-5). E-learning in the military: Meeting the challenge. In Proceedings of the IACIS 2002 Conference (pp. 673-679), Fort Lauderdale, FL.
Zachary, W., Bilazarian, P., Burns, J., & Cannon-Bowers, J. A. (1997, December). Advanced embedded training concepts for shipboard systems. In Proceedings of the Interservice/Industry Training, Simulation & Education Conference (I/ITSEC), NTSA (pp. 1-10), Orlando, FL.
Bios

Anna-Sofia Alklind Taylor (MSc, University of Exeter, UK) is currently pursuing her PhD in serious games at the University of Skövde, Sweden. Apart from an enthusiastic interest in gaming, she has a background in cognitive science and human-computer interaction. Her current research focuses on instructor roles in serious gaming. Contact: anna-sofia.alklind.taylor@his.se.

Per Backlund (PhD, Stockholm University, Sweden) has a background in the fields of teaching, cognitive science, and information systems development. His research interests are in serious games, in particular how games and game technology can be used for training and dissemination of information. He is currently managing the InGaMe Lab research group at the University of Skövde. Contact: per.backlund@his.se.

Lars Niklasson (PhD, University of Sheffield, UK) is a keen player of strategy games. His area of interest is artificial intelligence. His current research focuses on algorithms for situation analysis in military, safety, and security applications. Contact: lars.niklasson@his.se.