
Empirical Investigations

The Relationship Between Facilitators' Questions and the Level of Reflection in Postsimulation Debriefing

Sissel Eikeland Husebø, PhD, MSN, RN; Peter Dieckmann, PhD; Hans Rystedt, PhD, RN; Eldar Søreide, MD, PhD; Febe Friberg, PhD, RN

Introduction: Simulation-based education is a learner-active method that may enhance teamwork skills such as leadership and communication. The importance of postsimulation debriefing for promoting reflection is well accepted, but many questions concerning whether and how faculty promote reflection remain largely unanswered in the research literature. The aim of this study was therefore to explore the depth of reflection expressed in facilitators' questions and nursing students' responses during postsimulation debriefings.
Methods: Eighty-one nursing students and 4 facilitators participated. The data were collected in February and March 2008, and the analysis was conducted on 24 video-recorded debriefings of simulated resuscitation teamwork involving nursing students only. Using Gibbs' reflective cycle, we graded the facilitators' questions and the nursing students' responses into stages of reflection and then correlated these.
Results: Facilitators asked mostly evaluative questions and fewest emotional questions, whereas nursing students gave mostly evaluative and analytic responses and fewest emotional responses. The greatest difference between facilitators and nursing students was in the analytic stage: only 23 (20%) of 117 questions asked by the facilitators were analytic, whereas 45 (35%) of 130 student responses were rated as analytic. Nevertheless, the facilitators' descriptive questions also elicited student responses in other stages, such as evaluative and analytic responses.
Conclusion: We found that postsimulation debriefings provide students with the opportunity to reflect on their simulation experience. Still, if the debriefing is to pave the way for student reflection, further work is needed on structuring the debriefing to facilitate deeper reflection. Furthermore, it is important that facilitators consider what kinds of questions they ask to promote reflection. We think future research on debriefing should focus on developing an analytical framework for grading reflective questions. Such research will inform and support facilitators in devising strategies for the promotion of learning through reflection in postsimulation debriefings. (Sim Healthcare 8:135–142, 2013)

Key Words: Debriefing, Simulation, Reflection, Facilitators

Reflection is an essential element in the development of professional competences in the health care sector.1 Participating in a postsimulation debriefing may provide a way for health care professionals to practice reflection and to translate it into practical action.2 The term reflection refers to "those intellectual and affective activities in which individuals engage to explore their experience in order to lead to new understanding and appreciations."3 If the debriefing is going to facilitate deeper reflection, it is important for facilitators and simulation instructors to be aware of how to ask questions that encourage this, thereby optimizing simulation-based learning.2,4,5

From the Department of Health Studies (S.E.H., F.F.), Faculty of Social Sciences, University of Stavanger; Department of Anaesthesiology and Intensive Care (E.S.), Stavanger University Hospital, Stavanger, Norway; Danish Institute for Medical Simulation (P.D.), Capital Region of Denmark, Herlev Hospital, Copenhagen University, Copenhagen, Denmark; and Department of Education, Communication and Learning (H.R.), University of Gothenburg, Gothenburg, Sweden.
Reprints: Sissel Eikeland Husebø, PhD, MSN, RN, Faculty of Social Sciences, Department of Health Studies, University of Stavanger, Stavanger, Norway (e-mail: sissel.i.husebo@uis.no).
The authors declare no conflict of interest.
Copyright © 2013 Society for Simulation in Healthcare
DOI: 10.1097/SIH.0b013e31827cbb5c

Reflection is a key element in learning from experience and helps learners to develop and integrate insights from direct experience into later action.6 According to Flanagan,7 debriefing in simulation-based learning refers to "the purposeful, structured period of reflection, discussion and feedback undertaken by students and teachers usually immediately after a scenario-based simulation exercise involving standardised patients and/or mannequins." However, debriefings can be structured in a number of ways to promote reflection. According to Fanning and Gaba,8 all debriefing models involve reflecting on the experienced event, discussing experiences with others, and learning and modifying behaviors based on the experience. It has been suggested that debriefings structured for reflection begin with open-ended questions and emotional release and include guidance and feedback from the facilitator.9,10 Rudolph et al11 describe "debriefing with good judgment" as an approach to promote reflective practice. The "frames" that underlie the actual actions are invisible but can be discovered through the "advocacy-inquiry" approach, that is, by asking questions about the trainees' underlying reasons for their actions.

According to Rogers12 and Yuen Lie Lim,13 there is agreement among researchers in the field that reflection is deliberate, is stimulated by a problematic situation, involves an examination of personal knowledge, and leads to new insights. However, a precise definition of the way reflection is implemented in learning settings and of the way it is researched is lacking.14,15 Several attempts have been made to clarify reflection by offering conceptual frameworks, for example, those of Gibbs,16 Mezirow,17 Schön,18 and Kolb.19 Gibbs,16 for example, reinterpreted Kolb's19 model of reflection and developed it into a 6-stage reflective cycle. According to Boud et al,3 Kolb19 does not say very much about the process of reflection itself, although the significance of his work may be that he sets the activity of reflection in a context of learning. The same applies to Gibbs.16 There are few studies that investigate the effect of debriefing on reflection. Overstreet20 showed that debriefing provides students with the opportunity to reflect on the simulation experience and to make prospective assumptions as to how they might perform differently next time. Dreifuerst21 investigated reflection strategies during debriefings and demonstrated change in clinical reasoning skills. Dieckmann et al22 showed that the theoretical ideal of how debriefing should be conducted is not always fulfilled in debriefing practice. In this study, we investigate the level of reflection in questions and responses in debriefings. In doing so, we use the definition outlined in Boud et al,3 where the term reflection refers to "those intellectual and affective activities in which individuals engage to explore their experience in order to lead to new understanding and appreciations."

Despite the importance of facilitating reflection in debriefings, questions concerning whether and how faculty promote reflection remain unanswered in the research literature.9,23 Owing to the absence of research documenting reflective questions and answers in health care simulation debriefing, the aim of this study was to explore the depth of reflection expressed in facilitators' questions and in nursing students' responses. Three study questions guided our work: (1) What stage of reflection did the facilitators' questions reveal? (2) What stage of reflection did the students' responses reveal? (3) What is the relationship between the stages of reflection in facilitators' questions and those in students' responses?

CONCEPTUAL FRAMEWORK

Several potential models were explored to see whether they provided a suitable framework for analyzing the depth of reflection in facilitators' questions and students' responses. The reflective cycle of Gibbs16 was used as the conceptual framework in this study (Fig. 1) because the model links reflection with learning.24 Gibbs16 suggests how the experiential learning cycle can be implemented, including reflection on experiences such as simulation. The cycle is thus a circular process comprising 6 stages that encourage students to organize and structure their thinking and learning reflectively (Fig. 1), and it corresponds to a fully structured debriefing with increasing depth of reflection. Gibbs' reflective cycle has been used in previous studies to enhance health care students' learning of knowledge, skills, and attributes in simulation and clinical practice.25–27

FIGURE 1. The reflective cycle of Gibbs16 including example questions and responses from our data.

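To make the grading scheme concrete, the 6 stages can be represented as a simple enumeration. The sketch below is illustrative only; the stage prompts in the comments are Gibbs' canonical cue questions, not utterances from our data.

```python
from enum import IntEnum

class GibbsStage(IntEnum):
    """The 6 stages of Gibbs' reflective cycle, numbered S1 to S6."""
    DESCRIPTION = 1  # "What happened?"
    FEELINGS = 2     # "What were you thinking and feeling?"
    EVALUATION = 3   # "What was good and bad about the experience?"
    ANALYSIS = 4     # "What sense can you make of the situation?"
    CONCLUSION = 5   # "What else could you have done?"
    ACTION_PLAN = 6  # "If it occurred again, what would you do?"

# Grading an utterance then amounts to assigning it one stage, e.g.:
utterance = "What else could you have done?"
grade = GibbsStage.CONCLUSION
print(utterance, "->", f"S{int(grade)}")
```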

METHODS

This study has an explorative and descriptive qualitative design.28 The results are based on 24 video recordings of postsimulation debriefings in nursing education. The data were collected by the first author in February and March 2008. We used a deductive approach to grade reflection, based on the reflective cycle of Gibbs.16 The model was chosen because it is designed for the context of education and simulation and has been used to assist the reflective process in nursing education and nursing practice.29,30 In this study, Gibbs' reflective cycle was applied to classify all facilitators' questions and students' responses regarding leadership in all debriefings. Finally, the relationship between all classified questions and responses was described, apart from those graded differently by the 2 researchers.

Participants

A total of 81 students (72 female and 9 male; mean age, 27 years; range, 22–53 years) studying in the last semester of a 3-year nursing education program participated in the study. The students were divided into 14 groups of 4 to 7 team members each. Four groups were of mixed sex, whereas the remainder were composed of women only. The median age in the 14 groups varied from 23 to 33 years. The student group was comparable with other student groups in Norwegian nursing education programs with respect to age and sex.31 Five female faculty members (aged 34–60 years) were involved as facilitators and simulator operators. The facilitators selected to facilitate the simulation sessions had been teaching Basic Life Support (BLS) and the use of semiautomatic defibrillators for the previous 2 years. All 5 facilitators had participated in a 3-day workshop on educational principles of simulation-based learning, including how to brief and debrief learners, before the simulation.

Ethical Considerations

The study was approved by the Norwegian Social Science Data Services and the University of Stavanger, Norway. Consent forms for participation in the study were signed by the nursing students as well as the faculty staff, and confidentiality was guaranteed. All those who were asked agreed to participate in the study. For the type of research presented here, the regional committee for medical ethics of western Norway declined to consider the application because the study did not involve patients or relatives.

Setting and Scenario

The data were collected during voluntary resuscitation team training given to students at the University of Stavanger in the Stavanger Acute Medicine Foundation for Education and Research (SAFER) simulation center. Earlier in their education program, the students had attended lectures in resuscitation teamwork and completed repeated training in BLS and the use of semiautomatic defibrillators. The students had attended simulation courses in their second year. The students received the learning objectives of the simulation scenario in advance, and these were restated by the facilitator in the scenario briefing immediately before the scenario.

The objectives were (1) optimizing leadership in resuscitation teamwork and (2) putting the BLS algorithm into practice. In this study, the simulated patient was a 71-year-old woman who had an upper femur fracture and had been moved to an out-of-hospital rehabilitation unit without a staff physician present. The patient had a history of angina pectoris and went into cardiac arrest during the scenario. Each group simulated the same scenario twice, with 3 students participating at a time while another 3 took the roles of observers. For each scenario, the students elected 1 group member to be the leader. The facilitator was present in the room observing the scenario. In the second scenario, the observers and active participants exchanged roles. After each simulation scenario, the students took part in a debriefing guided by the facilitator and analyzed leadership, team performance, and execution of the BLS algorithm.

Data Collection

The first author recorded 28 simulations, including briefings, simulation scenarios, and debriefings as parts of the simulation sessions, resulting in 28 hours of videotaped material.32 Video recordings were chosen because they capture interactions in the debriefing setting as they occur naturally, without the disturbance of direct observation, and because they allow repeated viewing and detailed analysis.33

DATA ANALYSIS

All the video recordings of the debriefings were reviewed 2 to 4 times to identify sequences focused on leadership. The material for analysis was defined as sequences in which the facilitator and students talked about the following:

- the roles and tasks of the team leader and team members, and
- their collaboration and communication regarding roles and tasks.

Four of the debriefings were excluded from the analysis. In 2 debriefings, the facilitator asked the first author to take over the debriefing, which may represent a potential bias. The other 2 debriefings did not provide relevant material for the study because the conversation focused only on medical technical issues, that is, how to perform chest compressions and ventilations. All recorded conversations were then transcribed verbatim and read 2 to 6 times to ensure correctness and understanding. A total of 8 hours of material was recorded. Based on the sequences defined previously, the material for analysis consisted of 1 hour 20 minutes of verbal communication. The duration of the 24 debriefings varied from 5.5 to 35 minutes, with a median of 20.5 minutes, whereas the transcribed parts regarding leadership varied from 0.5 to 6.5 minutes, with a median of 3.5 minutes, because the remaining parts of the conversation focused on other issues. Two facilitators each performed 10 debriefings, whereas the other 2 facilitators each performed 2 debriefings. For the purpose of our analysis, we decided to quantify the data.



First, 2 researchers (F.F. and S.E.H.) independently graded the first 20 facilitators' questions and students' responses regarding leadership into the 6 stages of reflection of Gibbs' reflective cycle.16 This served to refine the criteria for grading reflection and to calibrate judgments. The examples of questions and responses given in Figure 2 clarify how we interpreted Gibbs' different stages of reflection. Second, the 2 researchers independently graded all the remaining questions and responses. For the analysis of stages of reflection in questions and answers, 117 questions and 130 responses in 24 debriefings were used (Table 1); the number of responses is greater than the number of questions owing to several responses to one question. Examples of how questions were interpreted into stages of reflection, and of the responses the questions elicited, can be seen in Figure 2. Questions and responses graded differently by the 2 researchers were excluded from further analysis (Appendix 1). Finally, the relationship between questions and responses graded equally by the 2 researchers was identified by sorting and counting all responses in each of the 5 stages of the graded facilitators' questions.

A detailed template with all graded questions and responses in all debriefings made it possible to analyze the relationship between questions and answers across the reflection stages.

Statistics

Rater agreement, defined as the number of agreed assessments (x + y) divided by the number of agreed assessments plus the number of disagreed assessments (z),34 that is, (x + y)/(x + y + z), was calculated for questions and responses separately and together. However, this calculation has at least 2 weaknesses: it takes no account of where the agreement was, and we would expect some agreement between the 2 raters by chance, even if they were guessing. For this reason, we additionally calculated the interrater reliability of the assessment of questions and responses, a coefficient indicating the extent to which the ratings of 2 independent raters are intercorrelated,35 using κ with linear weighting in VassarStats (http://vassarstats.net/). It has been proposed that a κ score of 0.81 to 1.00 indicates very good agreement; 0.41 to 0.80, moderate-to-good agreement; 0.21 to 0.40, fair agreement; and below 0.20, poor agreement.36
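The following sketch illustrates the two measures just described. It is not the computation used in the study (which was done in VassarStats), and the gradings shown are hypothetical; scikit-learn's cohen_kappa_score supports the linear weighting referred to above.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical stage gradings (1-6) of the same items by the two raters.
rater1 = [3, 3, 1, 4, 5, 1, 3, 4, 2, 1]
rater2 = [3, 4, 1, 4, 5, 1, 3, 3, 2, 1]

# Simple rater agreement: (x + y) / (x + y + z).
agreed = sum(a == b for a, b in zip(rater1, rater2))
agreement = agreed / len(rater1)

# Cohen's kappa with linear weighting, which penalizes near-misses
# (e.g., S3 vs. S4) less than distant disagreements (e.g., S1 vs. S5).
kappa = cohen_kappa_score(rater1, rater2, weights="linear")
print(f"agreement = {agreement:.2f}, weighted kappa = {kappa:.2f}")
```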

FIGURE 2. Examples of how facilitators' questions were interpreted into stages of reflection and the responses those questions elicited, described as stages 1 to 5.



TABLE 1. The Number of Facilitators' Questions (n = 117) Graded Into Stages (S) of Reflection. Questions for which there was disagreement between the coders in the graded stage of reflection are not included.


RESULTS

Reliability of the Coding

Rater agreement between the 2 coders for assessment of stages 1 to 5 of Gibbs' reflective cycle was 0.82 for questions (82% = 117/[117 + 25]), 0.80 for responses (80% = 130/[130 + 32]), and 0.81 for questions and responses together (81% = [117 + 130]/[117 + 130 + 58]), indicating a reliable application of the reflective cycle in coding the questions and responses. The κ score for interobserver agreement was 0.77 for questions and 0.79 for responses, indicating good agreement between the 2 coders.

Stages of Reflection in Questions and Responses

Facilitators asked most questions at the evaluative stage (S3) (43 of 117) and fewest at the emotional stage (S2) (4 of 117) (Table 1), whereas students gave most responses at the evaluative stage (S3) (50 of 130) and fewest at the conclusive stage (S5) (1 of 130) (Table 2). None of the questions or responses were rated at the action plan stage (S6). The greatest difference between facilitators and students was in the analytic stage (S4). Only 23 (19.6%) of 117 questions asked by the facilitators were analytic (S4) (Table 1), whereas 45 (35%) of the 130 students' responses were rated as analytic (S4) (Table 2).

The Relationship Between Questions and Responses

To explore the relationship in reflection stages between questions and responses, the relationship between questions and responses that were graded equally by the researchers was analyzed (Table 3). The figure in each cell indicates how many responses in a certain stage followed questions in a certain stage. For example, descriptive questions (S1) elicited not only descriptive responses (S1) but also evaluative (S3) and analytic (S4) responses. Questions in the evaluative stage (S3) were followed by responses in the same stage (27 of 41) as well as in the analytic stage (S4) (10 of 41). Analytic questions (S4) were followed by 17 responses at the same level, whereas conclusive questions (S5) elicited most responses in the analytic stage (S4).
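As a quick arithmetic check, the reported agreement percentages follow directly from the counts of agreed and disagreed codings given above:

```python
# Agreement = agreed / (agreed + disagreed), using the counts as reported.
print(f"questions: {117 / (117 + 25):.2f}")                # 0.82
print(f"responses: {130 / (130 + 32):.2f}")                # 0.80
print(f"together:  {(117 + 130) / (117 + 130 + 58):.2f}")  # 0.81
```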

DISCUSSION

The results of this study demonstrate that there were large variations in the duration of the debriefings (from 5.5 to 35 minutes). Johnson-Russell and Bailey37 suggest that the amount of time allotted for debriefing should be commensurate with the objectives, the learners' level of knowledge and skills, and the complexity of the scenario and should last no less than 30 minutes, whereas Flanagan7 proposes that "the length of time for debriefing should not be less than the time taken for the scenario itself: usually more time is ideal." A major goal of debriefing is to reinforce the objectives of the simulation to ensure that the intended learning occurred. Debriefing should also foster reflective learning.37 Possible explanations for the large variation in the duration of the debriefings could be challenges in the conduct of debriefings and scenarios, possibly owing to the beginner level of the facilitators. The learning outcome of the groups with the shortest debriefings might have been impaired.37

Although a substantial portion of the students' responses to descriptive questions were in the evaluative and analytic stages (S3 and S4), the findings imply that questions in the evaluative and analytic stages (S3 and S4) elicited more responses at these stages than descriptive questions did. The results also showed that although approximately one third of the facilitators' questions were at the descriptive stage, these questions promoted half of the responses in the emotional and conclusive stages (S2 and S5). The results also indicate that questions at a deeper level of reflection, such as "What else could you have done?" (S5), promote less variation in responses. One interpretation of this result might be that the formulation of these questions is more precise than that of the descriptive and evaluative ones. These results demonstrate the complexity of the debriefing and show that descriptive questions may promote more than descriptive responses. Participants might be stimulated to engage in self-reflection by other elements in the setting as well, which would need to be explored further.

Very few of the facilitators' questions in this study were formulated as analytic questions focusing on what sense the participants could make of the situation. According to Moon,38 deep reflection includes a metacognitive stance (ie, critical awareness of one's own processes), "standing back" from the event, exploring motives or reasons for behavior, and taking into account the views and motives of others.

TABLE 2. The Number of Nursing Students' Responses (n = 130) Graded Into Stages (S) of Reflection. Responses for which there was disagreement between the coders in the graded stage of reflection are not included.



TABLE 3. The Relationship Between Facilitators' Questions in the Rows (n = 96) and Nursing Students' Responses in the Columns (n = 110)*

                                     S1.          S2.         S3.          S4.        S5.
                                     Descriptive  Emotional   Evaluative   Analytic   Conclusive
Facilitators' Questions              Responses    Responses   Responses    Responses  Responses   Sum
S1. Descriptive questions (n = 28)       17           1           9            5          1        33
S2. Emotional questions (n = 3)           1           0           1            1          0         3
S3. Evaluative questions (n = 34)         4           0          27           10          0        41
S4. Analytic questions (n = 23)           4           1           3           17          0        25
S5. Conclusive questions (n = 14)         0           0           1            7          0         8
Sum                                      26           2          41           40          1       110

*The number of responses is higher than the number of questions owing to several responses to 1 question.
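A contingency table such as Table 3 can be produced from the paired codings by the sorting and counting described under Data Analysis. The sketch below uses pandas on a few hypothetical (question stage, response stage) pairs, not the study data.

```python
import pandas as pd

# Each record pairs the stage of a facilitator question (1-5) with the
# stage of a student response that followed it; values are hypothetical.
pairs = pd.DataFrame(
    [(1, 1), (1, 3), (1, 4), (3, 3), (3, 4), (4, 4), (5, 4)],
    columns=["question_stage", "response_stage"],
)

# Rows: question stages; columns: response stages; margins give the sums.
table = pd.crosstab(
    pairs["question_stage"], pairs["response_stage"],
    margins=True, margins_name="Sum",
)
print(table)
```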

In line with Moon, the facilitators' questions in the debriefings may have encouraged a relatively superficial form of reflection, and learning that results from superficial reflection is also likely to be superficial.38 The results also revealed that the facilitators posed mostly descriptive and evaluative questions, which may have meant that students did not articulate the reasoning behind their actions. This, in turn, may explain why the students in the present study did not articulate any implications for future actions.11 Consequently, our results also point to the importance of using questioning techniques such as the advocacy and inquiry tool suggested by Rudolph et al.11

There were neither questions nor answers in S6, "Action plan: If it occurred again, what would you do?" This stage involves a personal plan for future actions.16 According to Moon,39 a plan for future actions in the reflective process is more likely to be posited in clinical practice. At the same time, Moon39 points to ambiguity in the literature regarding whether reflection should include a plan for future actions. One explanation of why the facilitators did not ask questions in S6 might be the beginner level of the faculty and that the facilitators were not sufficiently trained in formulating questions at this level. Wildman and Niles40 claim that it requires much time and effort to master the skills needed for promoting reflection. Another reason may be that the training of such skills was not explicitly addressed in the faculty development program in the studied cases.

Implications

The results point to the necessity of including reflective questioning in faculty training to make effective use of simulations. In addition to formal training, novice and beginner facilitators should be guided through a reflective learning process early in their careers by reflective expert facilitators. The results also indicate that it is necessary to work further on structuring the debriefing to facilitate deeper reflection.

Limitations

One limitation of our study is that the research tool applied here has, to our knowledge, not previously been used to grade questions and answers into stages of reflection in postsimulation debriefing.16 To strengthen the reliability of the findings, the coding of questions and responses was conducted by 2 independent researchers.


To strengthen the validity of the research tool, an independent team of raters could have been used to validate the grading. Second, the characteristics of the facilitators should have been included, because the skills of the facilitator are essential for the overall quality of the simulation; these characteristics would have given a clearer idea of the facilitators' reflective skills.8 Third, the nursing students recruited to this study had performed only 1 simulation before the current one. That the situation was quite new to them could imply that few answers corresponded to deeper levels of reflection.39 More experience with simulation would, we hope, lead to deeper reflection among the students. Considering that the sample size was relatively small and that all students and faculty were recruited from a single nursing program in Norway, the results point toward some, although limited, possibilities for generalization to other simulation settings and professions.

CONCLUSIONS

The results of this study reveal that postsimulation debriefings provide students with the opportunity to reflect on their simulation experience. The facilitators mostly asked questions at the descriptive and evaluative stages, whereas three quarters of the students' responses were at the evaluative and analytic stages. Nevertheless, the facilitators' descriptive questions promoted responses not only at the descriptive stage but also at more reflective levels. If the debriefing is to pave the way for student reflection, further work is needed on structuring the debriefing to facilitate deeper reflection. It is therefore important that facilitators and simulation instructors consider what kinds of questions they ask to promote reflection, thereby optimizing the conditions for simulation-based learning. In addition, future research on debriefing should focus on developing an analytical framework for grading reflective questions. Such research will inform and support facilitators in devising strategies for the promotion of learning through reflection in debriefing.

REFERENCES

1. Cantrell MA. The importance of debriefing in clinical simulations. Clin Simul Nurs 2008;4:e19–e23.
2. Rudolph JW, Simon R, Rivard P, Dufresne RL, Raemer D. Debriefing with good judgment: combining rigorous feedback with genuine inquiry. Anesthesiol Clin 2007;25:361–376.



3. Boud D, Walker D, Keogh R. Promoting reflection in learning: a model. In: Boud D, Walker D, Keogh R, eds. Reflection: Turning Experience Into Learning. London, England: Kogan Page; 1985:18–40.
4. Brackenreg J. Issues in reflection and debriefing: how nurse educators structure experiential activities. Nurse Educ Pract 2004;4:264–270.
5. Neill MA, Wotton K. High-fidelity simulation debriefing in nursing education: a literature review. Clin Simul Nurs 2011;7:e161–e168.
6. Boyd EM, Fales AW. Reflective learning: key to learning from experience. J Humanist Psychol 1983;23:99–117.
7. Flanagan B. Debriefing: theory and technique. In: Riley RH, ed. Manual of Simulation in Healthcare. Oxford, England: OUP Oxford; 2008:155–170.
8. Fanning RM, Gaba D. The role of debriefing in simulation-based learning. Simul Healthc 2007;2:115–125.
9. Dreifuerst KT. The essentials of debriefing in simulation learning: a concept analysis. Nurs Educ Perspect 2009;30:109–114.
10. Decker S. Integrating guided reflection into simulated learning experiences. In: Jeffries PR, ed. Simulation in Nursing Education: From Conceptualization to Evaluation. New York, NY: National League for Nursing; 2007:73–85.
11. Rudolph JW, Simon R, Dufresne RL, Raemer DB. There's no such thing as "nonjudgmental" debriefing: a theory and method for debriefing with good judgment. Simul Healthc 2006;1:49–55.
12. Rogers R. Reflection in higher education: a concept analysis. Innovative Higher Educ 2001;26:37–57.
13. Yuen Lie Lim L-A. A comparison of students' reflective thinking across different years in a problem-based learning environment. Instr Sci 2011;39:171–188.
14. Harrison M, Short C, Roberts C. Reflecting on reflective learning: the case of geography, earth and environmental sciences. J Geography Higher Educ 2003;27:133.
15. Atkins S, Murphy K. Reflection: a review of the literature. J Adv Nurs 1993;18:1188–1192.
16. Gibbs G. Learning by Doing: A Guide to Teaching and Learning Methods. London, England: FEU; 1988.
17. Mezirow J. Transformative learning: theory to practice. New Dir Adult Contin Educ 1997;74:5–12.
18. Schön DA. The Reflective Practitioner: How Professionals Think in Action. New York, NY: Basic Books; 1983.
19. Kolb DA. Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice-Hall; 1984.
20. Overstreet ML. The Current Practice of Nursing Clinical Simulation Debriefing: A Multiple Case Study. Knoxville, TN: University of Tennessee; 2009.
21. Dreifuerst KT. Debriefing for Meaningful Learning: Fostering Development of Clinical Reasoning Through Simulation [dissertation]. Bloomington, IN: Indiana University; 2010.
22. Dieckmann P, Friis SM, Lippert A, Østergaard D. The art and science of debriefing in simulation: ideal and practice. Med Teach 2009;31:287–294.
23. Raemer D, Anderson M, Cheng A, Fanning R, Nadkarni V, Savoldelli G. Research regarding debriefing as part of the learning process. Simul Healthc 2011;6:S52–S57.
24. Stanton F, Grant J. Approaches to experiential learning, course delivery and validation in medicine. A background document. Med Educ 1999;33:282–297.
25. Jones I, Alinier G. Introduction of a new reflective framework to enhance students' simulation learning: a preliminary evaluation. Available at: http://hdl.handle.net/2299/6147. Accessed September 15, 2012.
26. Hogg G, Ker J, Stewart F. Over the counter clinical skills for pharmacists. Clin Teach 2011;8:109–113.
27. Williams RM, Sundelin G, Foster-Seargeant E, Norman GR. Assessing the reliability of grading reflective journal writing. J Phys Ther Educ 2000;14:23–26.
28. Polit DF, Beck CT. Essentials of Nursing Research: Appraising Evidence for Nursing Practice. Philadelphia, PA: Wolters Kluwer/Lippincott Williams & Wilkins; 2010.
29. Gnash L. Supervision issues in practice: supporting and advising midwives. Br J Midwifery 2009;17:714–716.
30. Wilding PM. Reflective practice: a learning tool for student nurses. Br J Nurs 2008;17:720–724.
31. Røykenes K, Larsen T. The relationship between nursing students' mathematics ability and their performance in a drug calculation test. Nurse Educ Today 2010;30:697–701.
32. Husebø SE, Rystedt H, Friberg F. Educating for teamwork: nursing students' coordination in simulated cardiac arrest situations. J Adv Nurs 2011;67:2239–2255.
33. Heath C, Hindmarsh J, Luff P. Video in Qualitative Research: Analysing Social Interaction in Everyday Life. Los Angeles, CA: Sage; 2010.
34. Dyrholm Siemensen IM. Et eksplorativt studie af faktorer der påvirker sikkerheten af patient-overgange (An Explorative Study of Factors Influencing Safety in Patient Handovers) [dissertation]. Lyngby, Denmark: Technical University of Denmark; 2011.
35. Polit DF. Data Analysis and Statistics for Nursing Research. Stamford, CT: Appleton & Lange; 1996.
36. Fleiss JL, Levin B, Paik MC. The measurement of interrater agreement. In: Fleiss JL, Levin B, Paik MC, eds. Statistical Methods for Rates and Proportions. Hoboken, NJ: Wiley; 2003:598–626.
37. Johnson-Russell J, Bailey C. Facilitated debriefing. In: Lashley FR, Nehring WM, eds. High-Fidelity Patient Simulation in Nursing Education. Sudbury, MA: Jones and Bartlett Publishers; 2010:369–385.
38. Moon J. Getting the measure of reflection: considering matters of definition and depth. J Radiother Pract 2007;6:191–200.
39. Moon JA. Reflection in Learning and Professional Development. London, England: Kogan Page; 2000.
40. Wildman TM, Niles JA. Reflective teachers: tensions between abstractions and realities. J Teach Educ 1987;38:25–31.



APPENDIX 1. Questions (n = 26) and Responses (n = 32) for Which There Is Disagreement in the Graded Stages of Reflection Between the 2 Raters (R1 and R2). Gradings are given as (R1, R2).

Questions (Q)

Q6 (4, 3): Other things that you think of that might be good?
Q12 (4, 1): It is about teamwork, very good, and you said no?
Q21 (2, 5): So I am thinking about where you could have positioned yourself to get an overview?
Q24 (2, 4): And what is it?
Q39 (4, 1): Have you heard about close loop, no, it has been confirmed now that you gave good commands in one respect.
Q42 (3, 1): I am asking everyone what it is like as a leader, is there a position which makes it easier to lead?
Q43 (3, 1): Somebody said that this was not a good idea, who was it?
Q51: Yes, I support you on this, this is very important, other things?
Q58: Great, can you think of a third thing too?
Q59: Very good, is there something that you observed that was good?
Q63: You got an overview; did you have a reasonable amount of time to get that overview?
Q65: Is there something in the communication that would have helped you?
Q69: Have you anything to add about the leader?
Q71: It went very well, but is there something you are holding back that you want to get feedback on?
Q73: Is there something that you could have done differently?
Q74: You were not sure about who was the leader, what do you think about that, what is important?
Q89: You took care of this, who placed it (AED) first?
Q91: Yes, S and A, do you want to add anything?
Q93: Listen up, is there anyone who wants to say anything else she managed well in this situation?
Q100: Yes, but why did they answer?
Q104: You felt quite early on that you could not manage alone?
Q109: You felt quite early on that you could not manage alone?
Q112: You were helped by the situation … yes, other things?
Q121: Can you tell us something that was okay?
Q122: Clear messages, lovely, something you S. want to say?
Q124: Is there something the 2 of you E. and G. want to say?

Responses (R)

R1 (1, 5): In a situation like this, the leader can give commands to the others.
R3 (4, 3): About the communication, I think they communicated very well.
R4 (4, 3): I could have talked louder and been more.
R8 (4, 3): But I think this was really good.
R14 (3, 4): Because T. was leading and commanding.
R15 (4, 1): No, yes, that is right.
R17: I should have called the doctor earlier and got him to come sooner than half an hour. I should have seen that the situation was more critical, given clearer messages, and done a better job in leading the team.
R38: She checked consciousness, plugged in the automated external defibrillator (AED) and informed the patient.
R41: Fortunately S. called the ambulance, but we could have done it as soon as it happened.
R47: Yes, I could have been better at talking with the patient and those I worked with. I felt I was very unsure about the bag.
R53: Yes I did, now I mean both parts.
R63: She was calm and quick thinking, gave clear messages.
R64: I think she demonstrated good leadership with good control and overview, clear messages to her followers, as a leader should. I think she did that.
R65: I could perhaps have been clearer on what I could do without stepping on the leader's toes. I tried to monitor the blood pressure while you were talking to the patient and trying to monitor the pulse.
R69: I correctly understood what the physician said on the telephone. I did not hear what she said at first, but then I asked once more.
R71: She was calm.
R80: I thought you had a lot of positive points, you say that you hardly found anything positive.
R83: When you came in, I did not explain the situation because I thought when I spoke with the physician, she was clammy, we knew that from the case. Then, it struck me that in a real situation, you would not know; I would have explained, and the observations I did not say, you knew them.
R84: I could have been a bit smarter at giving clear messages, but I am now and perhaps better at communicating or, as you said in there, we should be clear about what we are doing.
R88: She saw everything I did too.
R96: It was confusing about the leadership because I thought that when I came in with the emergency case I was the leader.
R98: You were the leader, yes, I understood it that way, I thought that N. could take over.
R110: When they did the compression, I thought what do I do? Yes, I have to position the AED, but what do I do in the meantime? I was a little unsure.
R111: Yes, I could have been clearer about leadership and used the 2 followers differently and thought about how I could plan in a different way.
R113: I was busy counting heart compressions.
R124: I see the advantage of standing at the back. It has been mentioned, but I see the advantage.
R125: I felt I delegated okay. I am sure I have forgotten something …
R128: Yes, could have been in less of a rush, given clearer messages, got all the information before calling the doctor. But I took that decision because there were not any nitro effects. But I should have had the pulse and blood pressure ready.
R129: I had a good overview, but I do not know if I noticed all that happened.
R144: She was the most updated on the situation and delegated tasks. They see what has to be done, J. took the bag without confirmation. And she counted to 2 and she to 30, so they collaborated.
R147: You were actually calm and not stressed, you were not up here.
R155: I thought I communicated well with the patient and was attentive to her. I think we each had our own area of responsibility.
