Development of a tool to improve performance debriefing and learning: the paediatric Objective Structured Assessment of Debriefing (OSAD) tool

Jane Runnacles,1 Libby Thomas,2 Nick Sevdalis,3 Roger Kneebone,3 Sonal Arora3
▸ Additional material is published online only. To view please visit the journal online (http://dx.doi.org/10.1136/postgradmedj-2012-131676).

1Department of Paediatrics, Royal Free London NHS Foundation Trust, London, UK
2Simulated and Interactive Learning Centre, King's College, London, UK
3Department of Surgery and Cancer, Imperial College, London, UK

Correspondence to Dr Jane Runnacles, Department of Paediatrics, Royal Free London NHS Foundation Trust, Pond Street, London NW3 2QG, UK; [email protected]

Received 4 December 2012; Revised 29 July 2014; Accepted 1 August 2014; Published Online First 8 September 2014

To cite: Runnacles J, Thomas L, Sevdalis N, et al. Postgrad Med J 2014;90:613–621.
ABSTRACT
Background Simulation is an important educational tool to improve medical training and patient safety. Debriefing after simulation is crucial to maximise learning and to translate the lessons learnt to improve real clinical performance, and thus to reduce medical error. Currently there are few tools to improve performance debriefing and learning after simulations of serious paediatric situations.
Purpose The purpose of this study was to develop a tool to guide and assess debriefings after simulations of serious paediatric situations, applying the current evidence base and user-based research.
Study design A literature review and semistructured interviews (performed in 2010) to identify important features of a paediatric simulation debriefing. Emergent theme analysis was used to identify key components of an effective debriefing which could be used as a tool for assessing debriefing effectiveness.
Results The literature review identified 34 relevant studies. Interviews were carried out with 16 paediatricians, both debriefing facilitators and learners. In total, 307 features of a debriefing were identified. These were grouped into eight dimensions representing the key components of a paediatric debriefing: the facilitator's approach, learning environment, engagement of learners, reaction, descriptive reflection, analysis, diagnosis and application. These eight dimensions were used to create a tool, the Objective Structured Assessment of Debriefing (OSAD). Each dimension can be scored on a five-point Likert scale containing descriptions for scores 1, 3 and 5 to serve as anchors and aid scoring.
Conclusions The study identified the important features of a paediatric simulation debriefing, which were developed into the OSAD tool. OSAD offers a structured approach to paediatric simulation debriefing, and is based on evidence from published literature and views of simulation facilitators and learners. OSAD may be used as a guide or assessment tool to improve the quality of debriefing after paediatric simulation.
INTRODUCTION
Simulation is a powerful learning tool which can improve patient safety and reduce the incidence of adverse events.1 It can be used to teach crisis management skills, but can also help the learner develop the key communication, team-working and decision-making abilities required to effectively manage a seriously ill patient.2 These skills are especially important for paediatric emergencies, which are serious and challenging situations that are rarely experienced. It is not appropriate for trainees to learn these skills on real children3; alternative training strategies must be sought. Simulation is one strategy that offers paediatric trainees the opportunity of practised experience within a safe learning environment, without exposing patients to preventable harm.3
The use of simulation in paediatric curricula is increasing; it is an exciting and evolving educational tool with a developing evidence base supporting its use.4 5 It can impact on individual and team performance through learning which is both experiential and immersive.6 Evidence suggests that the greatest benefit of simulation is the ability to provide training with a focus on non-technical skills such as communication and leadership.4 The quality of team behaviour has also been shown to improve following simulation, and this can lead to further reductions in medical errors.7 With studies of adverse events around the world suggesting that it is a failure of non-technical and team skills that leads to patient harm,8 simulation-based training provides an opportunity to address these gaps. There is evidence to confirm that participation in such training improves clinical performance, culture and patient outcomes.1 9
Despite these benefits of simulation, it is crucial that the learning experience within the simulated environment is maximised. Feedback to the learners through a post-scenario performance debriefing is critical in optimising learning after a simulation.10 Debriefing is defined as a social practice during which people purposely interact with each other and the environment to reflect upon a recently shared common experience.11 An effective debriefing is one where the learning opportunity for the learner is maximised, within a psychologically 'safe' environment (ie, an environment where the learner feels they can explore their performance, reflect on it, and freely express views on it). Effective debriefing provides formative feedback to the trainee through reflection on a training experience. It identifies learning needs and translates lessons learned to improve future clinical practice. This is particularly important in paediatric training because simulation scenarios of seriously ill children can be stressful for participants who rarely encounter such situations. Debriefing is thus essential to build confidence, and to identify and explore gaps in performance. Serious paediatric scenarios can be complex, with parent and team interactions, and therefore post-simulation debriefing can provide a 'safe', less emotive setting to reflect on behaviours and openly discuss ways to improve patient safety.
Runnacles J, et al. Postgrad Med J 2014;90:613–621. doi:10.1136/postgradmedj-2012-131676 613
Original article
Downloaded from http://pmj.bmj.com/ on November 19, 2014 - Published by group.bmj.com
Although acknowledged to be one of the most important aspects of simulation-based training,10 there is little guidance on how debriefing should take place in paediatrics.12 There are many different approaches to debriefing,13 14 but few studies provide evidence-based guidelines on the constituents of an effective debrief or indeed methods of assessing the quality of debriefing. Authors experienced in simulation have produced practical points for debriefing,15 but these are based only on their own (albeit expert) beliefs and do not take into consideration the wider literature or end-user opinion.
We took the view that the above represents a gap in current paediatric medical education and training. We felt that a set of evidence-based guidelines could be developed into a tool and used either to measure the quality of debriefings following a paediatric simulation or as a set of best practice guidelines for novice debriefers to study and develop their debriefing skills. Such a tool could have several key implications for current paediatric simulation training and could help to address gaps in research with regard to debriefing practices.12 It could be used as an assessment of the quality of paediatric debriefings (ie, the skills of the facilitator who conducts the debriefing) so as to ensure that faculty provide optimal post-training feedback. It could also be used by faculty for self-evaluation of their own debriefing practice and as a means by which they can reflect upon their performance afterwards. A tool could also be used for more formal training of debriefing facilitators (ie, a 'train-the-trainers' course). Finally, it could be used to compare the relative effectiveness of different debriefing techniques, thereby ensuring that best practices are identified.
The purpose of this study was to develop an evidence-based tool to guide and assess debriefings after a simulation of a situation involving the management of a seriously ill child.
METHODS
We applied a two-part methodology to this study: a review of the evidence base, followed by a qualitative descriptive study in which we collected data prospectively using a semistructured interview approach. Both the literature review and the interview study were performed in 2010 and aimed to identify the important features of a paediatric debriefing as viewed by experts in the field and practising paediatricians. The existing literature was reviewed to address the question of what constitutes an effective debriefing, so as to ensure the tool was evidence-based. The qualitative descriptive study, in which we used an interview method to collect our data from experienced colleagues, was subsequently carried out to elicit the opinions of paediatricians experienced in giving or receiving feedback after simulations involving the management of a seriously ill child. This ensured that a debriefing assessment tool could be developed to directly reflect the needs of practising doctors.
Ethics approval for this study was sought and obtained from the Institute of Education, University of London and the London School of Paediatrics Simulation Committee.
Literature review
The purpose of the literature review was to identify the evidence base on paediatric debriefing. We included medical education, non-medical simulation, psychology, healthcare, education and business publications. We searched the following databases: PubMed, Embase, ERIC, OVID, PsycINFO and Google Scholar using the following keywords: 'debrief*', 'feedback' (linked by 'OR') in combination with the terms 'simulation', 'p(a)ediatric' (linked by 'AND').
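The search logic above can be expressed as a single Boolean string. The sketch below is a hypothetical illustration of how the stated keywords combine (the `build_query` helper is our own construct, and real syntax for PubMed, Embase, etc. differs):

```python
# Hypothetical sketch of the stated search logic: the two debriefing terms
# are linked by OR, and that block is combined by AND with the simulation
# and paediatric terms. Illustration only, not actual database syntax.

def build_query() -> str:
    debrief_block = " OR ".join(['"debrief*"', '"feedback"'])
    context_block = " AND ".join(['"simulation"', '"p(a)ediatric"'])
    return f"({debrief_block}) AND ({context_block})"

print(build_query())
# ("debrief*" OR "feedback") AND ("simulation" AND "p(a)ediatric")
```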
An initial title screen excluded any irrelevant papers. We reviewed all other abstracts to identify relevant papers. The full text of these papers was retrieved for data extraction. Given the small number of retrieved papers (ie, a limited evidence base), we decided not to use a critical appraisal to score and subsequently exclude low-scoring articles, so that as much evidence as we could identify could be used in the tool-development process. In addition, we hand-searched reference lists of included papers and the grey literature, and contacted experts in the field for any additional papers or studies that were not in the public evidence base.
The features of an effective paediatric debriefing were abstracted from the reviewed papers by two clinical reviewers (JR: paediatrician; SA: simulation expert) (see online supplementary appendix for data abstraction table).
Interview study
Participants
We performed a semistructured interview study to identify features of a paediatric debriefing from the expert and user perspective. Eight paediatric registrars and eight consultants from acute paediatric specialties (emergency medicine, intensive care and general paediatrics), working at eight London hospitals, were sampled purposively and interviewed in May 2010. For consultants, the inclusion criterion was that they had facilitated over 100 debriefings of paediatric simulation scenarios (involving the management of a seriously ill child) as an instructor. The inclusion criterion for paediatric registrars (trainees) was that they had been on the receiving end of debriefings after a simulation of a seriously ill child and could therefore comment on the aspects of a debriefing that made it effective from a learner perspective.
Study procedure
We designed a semistructured interview topic guide based on findings from the literature. Input from two independent paediatricians (JR, LT) ensured that the topic guide was relevant and appropriate for paediatrics. We then piloted it with four consultants and four registrars to ensure comprehension and relevance. After we had refined the topic guide to ensure clarity and brevity, a clinical researcher (JR, a paediatrician trained in interviewing) carried out interviews with 16 participants. By this point, thematic saturation had been reached (ie, similar themes on what constitutes a paediatric debriefing were emerging across participants' interviews).
The interview asked paediatricians about their views on the components of effective and ineffective debriefing, as well as strategies for improvement. An example of a question to a paediatric registrar was: 'Can you think about a time when you received a good debrief or effective feedback? Why did you find it effective?', and to a consultant: 'Can you recall when you gave a trainee a good debrief or effective feedback? What was it that made it effective?' (the full interview schedule is available from the corresponding author). The interviews were carried out face to face, at a time and location convenient to the participants within their own hospitals, and lasted approximately 20 min. They were recorded and then transcribed verbatim. Transcripts were cross-checked with the original recordings to ensure accuracy.
Analysis
We extracted and listed features of an effective paediatric debriefing from the papers in the literature review. Emergent theme analysis of the interviews was performed by the interviewer (JR) and a second, independent blinded coder with a background in medical education (LT) to reliably identify views on an effective paediatric debriefing according to the paediatricians interviewed. After coding of the first three interviews, the coding results were calibrated to ensure agreement. Interviews were then coded individually, after each interview from the tenth interview onwards, to confirm thematic saturation.
The features of effective paediatric debriefing as viewed by the paediatricians were listed alongside those identified from the literature review. We tabulated the emergent components of a paediatric debriefing that were consistently identified as important across both the review and the interviews. This was conducted through an iterative process in which two researchers (JR, LT) first independently reviewed the long lists from the review and interviews and grouped them into themes. Any disagreements were resolved by consensus with a third researcher with expertise in surgery and simulation (SA). The final outcome was reviewed for consistency by a senior psychologist with expertise in qualitative methodology and patient safety (NS).
Tool development
The emergent components of an effective paediatric debriefing identified from the literature review and interview study were listed as the main dimensions of the tool. The tool was designed as a table with these dimensions in the first column and a Likert rating scale for each dimension across the rows. On the basis of a range of other extensively validated tools16 and the need for a relatively simple scale for ease of use, a five-point Likert rating scale was chosen (1=minimum, 5=maximum score, total score 40). We wrote descriptions of observable behaviours that could be assessed objectively for scores 1, 3 and 5, based on the findings of the literature review and interviews; that is, we provided anchors for these scores so that an evaluator could allocate them. This was to allow reliable ratings, without extensive training, in a format that would be easy for doctors to use.16
RESULTS
Literature review
The literature review of effective debriefing yielded 32 relevant publications. A further two publications were identified by consulting experts in the field (identified in table 1 with an asterisk),17 18 producing a total of 34 papers which were included in the final analysis (table 1). Twenty-one papers were from the setting of medical simulation, three concerned the use of debriefing in healthcare education, six were from the broader educational literature and four from the business/management literature. The majority of papers were secondary research articles, mostly reviews and expert opinions. Only two randomised trials, two observational studies and one survey were identified, highlighting the paucity of empirical research in this area.
Fifteen papers highlighted the importance of the facilitator's approach to the debriefing, with eight papers confirming the importance of a safe learning environment for conducting a sensitive debriefing. Learner engagement was specifically mentioned in 13 papers, with an emphasis on including more passive members in a group setting. Seven papers described the importance of gauging a learner's reaction to the scenario at the start of a debriefing, while 17 papers focused on a descriptive reflection as a critical component. Regarding analysis of what happened and why, 12 papers discussed the necessity of exploring trainees' reasons for their actions in order to embed deep learning. Fourteen papers described making a diagnosis of performance gaps as a significant component of debriefing. Finally, 13 papers highlighted a discussion on applying lessons learnt to future practice as being critical to closing a debriefing.
Interview study
Eight senior registrars (three male, five female; 3–8 years postgraduate experience) and eight consultants (two male, six female; 12–20 years postgraduate experience) were interviewed. They came from the acute paediatric specialties of emergency medicine, intensive care and acute general paediatrics (n=4, 4 and 8, respectively). After initial coding of the interview transcripts, the codes were grouped thematically. For example, 'assurance, non-blaming, non-threatening, open approach/listening, non-critical but constructive, skilled facilitator, and gentle pace' were all grouped into 'Approach of the facilitator'. Further examples of how the codes were grouped can be found in table 2.
The features of effective debriefing that emerged from these interviews are discussed below and illustrated with verbatim quotes from the interviews (the code letter suffixed to each quotation refers to the participant's level of expertise: registrar (R) or consultant (C)). Within each verbatim quote, the key feature/s highlighted as judged by the coders are shown in parentheses to exemplify how we drew these from the interviews; these features, across interviews, are then presented in detail in table 2.
With regard to the 'facilitator's approach', participants said that debriefing should emphasise positive aspects and provide constructive (not overly critical) feedback:

I think people have to be quite careful about how they deliver feedback (Skilled facilitator). There are always things that can be learnt and sometimes people are either extremely positive and just say you did all of it very well, and then equally that doesn't help you learn, but I think if you're going to give constructive feedback for things people could have done better (Not critical but constructive), I think you have to be very wary about sounding very negative (Non-blaming, non-threatening) (R5).
Having a 'safe environment and learner engagement' was identified by half of the participants, who described how debriefing should involve using open questions and good listening:

when you try and debrief someone you don't know what they're thinking…so I think it should be an open question first, like how did you feel (Appropriate choice of questions) and they'll probably just start talking and talking, and you can actually guide the actual debrief (Learner-centred, allowed for personal reflection) (R2).
Three other interviewees also emphasised the emotional support that debriefing provides and the necessity of eliciting a 'reaction':

in paediatric acute cases, there's always a lot of emotion involved, so it's quite important to make sure that people do leave feeling that their confidence hasn't been completely undermined (Addressed emotions, emotional support) (R6).
A 'detailed reflection' was also identified as crucial to an effective debriefing. For example:

talking through what we actually did at each step (Step-by-step description) is always helpful and encourages good reflective practice (Allowed for personal reflection) (C1).
'Analysis' was another key component of the debrief, including an opportunity for improved insight and awareness. One participant stated:

give time to ask people why they did what they did, in a non-confrontational way (Analysis of event)… because if you don't ask people why they've done things, then you're not going to influence their behaviour later on… (Improved insight/awareness) (C8).
Table 1 Results of the literature review

Authors, Year | Methodology of papers | Components of an effective debriefing
Bishop, 2000 [19] | Expert opinion | Approach; Establishes learning environment; Engagement of learners; Analysis
*Brett-Fleegler et al, 2009 [17] | Expert opinion | Establishes learning environment; Engagement of learners; Reaction; Analysis; Diagnosis
Dieckmann et al, 2009 [11] | Observation study | Approach; Engagement of learners
Dismukes et al, 2006 [20] | Editorial | Approach; Engagement of learners
Domuracki et al, 2009 [21] | Randomised controlled trial | Establishes learning environment
Dreifuerst, 2009 [22] | Case studies | Approach; Engagement of learners; Application
Edelson et al, 2008 [23] | Case control | Descriptive reflection; Analysis; Diagnosis
Fanning and Gaba, 2007 [24] | Literature review | Approach; Reaction; Analysis; Application
Folkman, 2006 [25] | Expert opinion | Diagnosis; Application
Gaba, 2004 [26] | Expert opinion | Descriptive reflection; Analysis; Diagnosis
*Gururaja et al, 2009 [18] | Video-based observational study | Approach; Engagement of learners; Descriptive reflection; Application
Harvard Business School, 2007 [27] | Expert opinion | Establishes learning environment; Diagnosis; Application
Issenberg et al, 1999 [28] | Selective narrative review | Analysis; Descriptive reflection
Issenberg et al, 2005 [10] | Systematic review | Establishes learning environment
Kilbourn, 1990 [29] | Case study | Engagement of learners; Reaction; Descriptive reflection; Analysis
Kyle and Murray, 2008 [30] | Expert opinion | Approach; Diagnosis
Lederman, 1984 [31] | Critical review | Analysis; Diagnosis; Application
Lederman, 1992 [13] | Literature review | Engagement of learners; Descriptive reflection; Analysis; Application
McGaghie et al, 2006 [32] | Review | Descriptive reflection; Application
McGaghie et al, 2010 [33] | Critical review | Engagement of learners
Morgan et al, 2009 [34] | Prospective randomised controlled trial | Descriptive reflection; Diagnosis
Pearson and Smith, 1986 [35] | Expert opinion | Approach; Establishes learning environment; Reaction; Descriptive reflection; Diagnosis
Owen and Follows, 2006 [36] | Expert opinion | Descriptive reflection; Analysis; Application
Petranek, 2000 [37] | Case study | Descriptive reflection

(Table 1 continued below. *Identified by consulting experts in the field.)
The majority of participants raised the importance of feedback on teamwork and other non-technical skills such as communication and leadership. These elements are part of the 'diagnosis' of the scenario outcome:

teamwork is, for me, the thing that I find the debrief is really useful for. I think you can adjust one person's performance but we generally don't resuscitate children individually, we look after them as a team (Feedback on team management, learning points) (C2).
As an example of 'application', which focused on strategies for future improvement, one participant simply stated:

you can come to an agreement about an action plan about what you might do differently (Strategies for future improvement) (R3).
When asked to suggest ways of improving the quality of debriefing or feedback, there was a widely shared view of the importance of developing a culture of feedback and reflective practice in paediatric training:

develop a culture where people find that (Feedback) easy to do and easy to take is really key (C1).

The most commonly identified of these components were descriptive reflection, analysis, diagnosing learning points and application (strategies for future improvement).
A few interviewees suggested that there should be ways of increasing awareness of debriefing, and many felt that debriefing should be formalised in some way or incorporated into portfolios as written reflections. Nearly half of the participants mentioned the importance of training facilitators to be skilful at debriefing.
Synthesis of findings from the literature review and interview study
The eight thematic coding groups from the interview study were cross-referenced with the findings of the literature review. The features that were consistently identified, common both within the evidence base (review) and across end users (interviews), resulted in eight components of an effective debriefing that make up the core dimensions of the final Objective Structured Assessment of Debriefing (OSAD) tool. These components are outlined in table 2, with the relevant studies that mention them alongside examples from the interview study. The themes extracted from the review and interviews are relevant both to one-to-one debriefings and to group/team debriefing scenarios.
Developing the 'OSAD' tool
In designing the OSAD tool, end-usability and ease of quick reference were key. A six-by-nine grid was selected that would fit, with all the data, on to one side of A4 paper. A five-point Likert scale was chosen for 'marking' each debrief, from 1='done very poorly' to 5='done very well'. The grid showed the eight components of an effective debriefing down the left-hand column. The subsequent five parallel columns represent scores 1–5. For each of the eight components, scores 1, 3 and 5 were anchored with specific
Table 1 Continued

Authors, Year | Methodology of papers | Components of an effective debriefing
Porter, 1999 [38] | Case study | Approach; Establishes learning environment; Diagnosis
Rall et al, 2000 [39] | Descriptive survey | Approach; Engagement of learners; Descriptive reflection; Application
Rubin and Campbell, 1997 [40] | Expert opinion | Reaction; Descriptive reflection
Rudolph et al, 2006 [14] | Expert opinion | Approach; Engagement of learners; Descriptive reflection; Diagnosis
Rudolph et al, 2007 [41] | Expert opinion | Approach; Engagement of learners; Descriptive reflection
Rudolph et al, 2008 [42] | Expert opinion | Approach; Establishes learning environment; Analysis; Application
Salas et al, 2008 [15] | Expert opinion | Approach; Diagnosis; Application
Steinwachs, 1992 [43] | Expert opinion | Approach; Engagement of learners; Reaction; Descriptive reflection; Analysis; Application
van de Ridder et al, 2008 [44] | Review | Diagnosis
Westberg, 2001 [45] | Expert opinion | Reaction; Descriptive reflection; Diagnosis
descriptions of what would be expected of the debriefer to achieve that score. For example, in the row pertaining to the component 'Reaction':
▸ 1=No acknowledgment of reactions of learners, or emotional impact of the experience.
▸ 3=Asks the learners about their feelings, but does not fully explore their reaction to the events.
▸ 5=Fully explores reactions of learners to the event, dealing appropriately with learners who are unhappy.
Score anchors were not added for scores 2 and 4, as it was felt some leeway needed to be left in the system for users to exercise their own discretion, so they could further grade debriefings. In scales with such scoring systems, the description of performance for 'done very well' may be used as a guide for best practice; this is what we aimed to achieve with the specific anchors we allocated to scores of 5 for debriefing. To further facilitate implementation, a short manual has been produced to accompany the tool and is available online and from the corresponding author.
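Since each of the eight dimensions is rated from 1 to 5, a completed OSAD assessment yields a total between 8 and 40. The following is a minimal sketch of that arithmetic, assuming a hypothetical `total_osad_score` helper (the dimension names follow the paper; the code itself is not part of the published tool or its manual):

```python
# Sketch of OSAD scoring: eight dimensions, each rated 1-5, total 8-40.
# The dimension names follow the paper; this helper itself is hypothetical.

OSAD_DIMENSIONS = [
    "Approach of the facilitator",
    "Establishing a learning environment",
    "Engagement of the learners",
    "Reaction of the learners",
    "Description of the scenario through reflection",
    "Analysis of events",
    "Diagnosis",
    "Application to future practice",
]

def total_osad_score(ratings: dict) -> int:
    """Sum one 1-5 rating per dimension; the maximum total score is 40."""
    if set(ratings) != set(OSAD_DIMENSIONS):
        raise ValueError("exactly one rating per OSAD dimension is required")
    if any(score not in (1, 2, 3, 4, 5) for score in ratings.values()):
        raise ValueError("each rating must be an integer from 1 to 5")
    return sum(ratings.values())

# A debriefing rated 3 (the mid-range anchor) on every dimension totals 24.
print(total_osad_score({dim: 3 for dim in OSAD_DIMENSIONS}))  # 24
```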
The final eight components of effective debriefing included within the OSAD tool (figure 1) are outlined below.
1. Approach of the facilitator: the manner in which the facilitator conducts the debriefing session, their level of enthusiasm and positivity when appropriate, showing interest in the learners by establishing and maintaining rapport and finishing the session on an upbeat note.
2. Establishing a learning environment: introduction of the simulation/learning session to the learners by clarifying what is expected of them during the debriefing, emphasising ground rules of confidentiality and respect for others, and encouraging the learners to identify their own learning objectives.
3. Engagement of the learners: active involvement of all learners in the debriefing discussions, by asking open questions to explore their thinking and using silence to encourage their input, without the facilitator talking for most of the debriefing, to ensure that deep rather than surface learning occurs.
4. Reaction of the learners: establishing how the simulation/learning session impacted emotionally on the learners.
5. Description of the scenario through reflection: self-reflection on events that occurred in the simulation/learning session in a step-by-step factual manner, clarifying any technical clinical issues at the start, to allow ongoing reflection from all learners throughout the analysis and application phases, linking to previous experiences.
Table 2 Components of effective debriefing identified from literature review and interview study

(The two numeric columns give the number of interview participants who mentioned each component, for registrars and consultants respectively.)

Component | Studies | Registrars | Consultants | Features of effective debriefing from interview study
Approach of the facilitator | Bishop, 2000 [19]; Dieckmann et al, 2009 [11]; Dismukes et al, 2006 [20]; Dreifuerst, 2009 [22]; Fanning and Gaba, 2007 [24]; Gururaja et al, 2009 [18]; Kyle and Murray, 2008 [30]; Pearson and Smith, 1986 [35]; Porter, 1999 [38]; Rall, 2000 [39]; Rudolph et al, 2006 [14]; Rudolph et al, 2007 [41]; Rudolph et al, 2008 [42]; Salas et al, 2008 [15]; Steinwachs, 1992 [43] | 8 | 7 | Assurance, non-blaming, non-threatening; Open approach/listening; Non-critical but constructive; Skilled facilitator; Gentle pace
Establishes learning environment | Bishop, 2000 [19]; Brett-Fleegler et al, 2009 [17]; Domuracki et al, 2009 [21]; Harvard Business School, 2007 [27]; Issenberg et al, 2005 [10]; Pearson and Smith, 1986 [35]; Porter, 1999 [38]; Rudolph et al, 2008 [42] | 8 | 6 | Dedicated time; Structure to the debrief; Choice of environment: quiet/uninterrupted; Correct timing of session; Ground rules at the start
Engagement of learners | Bishop, 2000 [19]; Brett-Fleegler et al, 2009 [17]; Dieckmann et al, 2009 [11]; Dismukes et al, 2006 [20]; Dreifuerst, 2009 [22]; Gururaja et al, 2009 [18]; Kilbourn, 1990 [29]; Lederman, 1992 [13]; McGaghie et al, 2010 [33]; Rall et al, 2000 [39]; Rudolph et al, 2006 [14]; Rudolph et al, 2007 [41]; Steinwachs, 1992 [43] | 6 | 5 | Team approach; Learner-centred; Appropriate choice of questions; Use of silence
Reaction | Brett-Fleegler et al, 2009 [17]; Fanning and Gaba, 2007 [24]; Kilbourn, 1990 [29]; Pearson and Smith, 1986 [35]; Rubin and Campbell, 1997 [40]; Steinwachs, 1992 [43]; Westberg, 2001 [45] | 3 | 6 | Addressed emotions; Emotional support
Descriptive reflection | Edelson et al, 2008 [23]; Gaba, 2004 [26]; Gururaja et al, 2009 [18]; Issenberg et al, 1999 [28]; Kilbourn, 1990 [29]; Lederman, 1992 [13]; McGaghie et al, 2006 [32]; Morgan et al, 2009 [34]; Pearson and Smith, 1986 [35]; Owen and Follows, 2006 [36]; Petranek, 2000 [37]; Rall, 2000 [39]; Rubin and Campbell, 1997 [40]; Rudolph et al, 2006 [14]; Rudolph et al, 2007 [41]; Steinwachs, 1992 [43]; Westberg, 2001 [45] | 8 | 8 | Step-by-step description; Allowed for personal reflection
Analysis | Bishop, 2000 [19]; Brett-Fleegler et al, 2009 [17]; Edelson et al, 2008 [23]; Fanning and Gaba, 2007 [24]; Gaba, 2004 [26]; Issenberg et al, 1999 [28]; Kilbourn, 1990 [29]; Lederman, 1984 [31]; Lederman, 1992 [13]; Owen and Follows, 2006 [36]; Rudolph et al, 2008 [42]; Steinwachs, 1992 [43] | 6 | 7 | Analysis of event; Improved insight/awareness
Diagnosis | Brett-Fleegler et al, 2009 [17]; Edelson et al, 2008 [23]; Folkman, 2006 [25]; Gaba, 2004 [26]; Harvard Business School, 2007 [27]; Kyle and Murray, 2008 [30]; Lederman, 1984 [31]; Morgan et al, 2009 [34]; Pearson and Smith, 1986 [35]; Porter, 1999 [38]; Rudolph et al, 2006 [14]; Salas et al, 2008 [15]; van de Ridder et al, 2008 [44]; Westberg, 2001 [45] | 8 | 8 | Positive feedback; Feedback on team management; Learning points; Feedback on leadership; Feedback on communication
Application | Dreifuerst, 2009 [22]; Fanning and Gaba, 2007 [24]; Folkman, 2006 [25]; Gururaja et al, 2009 [18]; Harvard Business School, 2007 [27]; Lederman, 1984 [31]; Lederman, 1992 [13]; McGaghie et al, 2006 [32]; Owen and Follows, 2006 [36]; Rall, 2000 [39]; Rudolph et al, 2008 [42]; Salas et al, 2008 [15]; Steinwachs, 1992 [43] | 8 | 7 | Strategies for future improvement
Runnacles J, et al. Postgrad Med J 2014;90:613–621. doi:10.1136/postgradmedj-2012-131676
Original article
6. Analysis of events: eliciting the thought processes that drove learners' actions, using specific examples of observable behaviours, to allow learners to make sense of the events of the simulation/learning session.
7. Diagnosis: enabling learners to identify their performance gaps and strategies for improvement, targeting only behaviours that can be changed, and thus providing structured and objective feedback on the simulation/learning session.
8. Application to future practice: a summary of the learning points and strategies for improvement identified by the learners during the debrief, and how these could be applied to change their future clinical practice.

We comment further on the content of these eight dimensions in the Discussion.
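The published OSAD is a paper-based observation form; purely as an illustration of its eight-dimension, five-point Likert structure, the sketch below shows how one observer's ratings might be captured and totalled electronically. The dimension list follows the paper, but the function name, the validation rules and the summing to a single total are our own assumptions, not part of the published tool.

```python
# Hypothetical electronic capture of an OSAD rating sheet: eight dimensions,
# each scored on a five-point Likert scale (the paper tool anchors scores
# 1, 3 and 5 with written descriptions). All names here are illustrative.
OSAD_DIMENSIONS = [
    "Approach of the facilitator",
    "Establishing a learning environment",
    "Engagement of the learners",
    "Reaction",
    "Descriptive reflection",
    "Analysis",
    "Diagnosis",
    "Application to future practice",
]

def score_debriefing(ratings):
    """Validate one observer's ratings and return the total (8 dimensions x 1-5 = 8-40)."""
    if set(ratings) != set(OSAD_DIMENSIONS):
        raise ValueError("ratings must cover exactly the eight OSAD dimensions")
    for dimension, value in ratings.items():
        if not 1 <= value <= 5:
            raise ValueError(f"{dimension}: Likert score must be 1-5, got {value}")
    return sum(ratings.values())

example = {dimension: 4 for dimension in OSAD_DIMENSIONS}
print(score_debriefing(example))  # 32
```

Whether the eight dimension scores should be summed, averaged or reported per dimension is itself an open question for the validation work discussed below.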
DISCUSSION
This study identified eight key components of effective debriefing, which were then used to develop and design an OSAD tool to guide and assess debriefings of simulations of serious paediatric situations. The eight components of effective debriefing included in this tool are: Approach of the facilitator; Establishing a learning environment; Engagement of the learners; Reaction/emotional impact of the learners; Description of the scenario through reflection; Analysis of events; Diagnosis; Application to future practice. The final tool is illustrated in figure 1.
The literature review and the interview study identified similar important features of a paediatric debriefing, and these informed the components of OSAD. Many sources referred to the facilitator's approach, and to the importance of its being non-threatening yet open and constructive. These concepts have parallels with debriefing with 'good judgement'.[14] Interestingly, more registrars than consultants mentioned the importance of an uninterrupted environment with dedicated time for debriefing and ground rules at the start. This suggests that registrars struggle to receive feedback in the conditions of a busy clinical environment, yet recognise its importance.
Most of the interviews, and indeed the available evidence base we reviewed, referred to the importance of descriptive, step-by-step personal reflection on the experience. The importance of reflection is well described in the medical education literature.[46] Reflection is particularly important after paediatric simulation: children can deteriorate rapidly, and scenarios involving seriously ill children are stressful and fast-moving, making it difficult to recall and analyse one's own behaviours while immersed in the simulation. Analysis helps uncover the true reasons for the learners' actions, improving their insight so that constructive feedback can be given. Both the literature and many of the interviews also discussed feedback on technical and 'non-technical' aspects, such as leadership and communication. This matters because communication within the team treating a seriously ill child is particularly important in paediatrics.[47]
The development of the OSAD tool has strengths in its clarity of methods and its evidence base. The methodological approach taken (review of the evidence base, followed by the end-users' perspective) is well established as a means of providing evidence for content validity as well as relevance to the clinical audience for which OSAD is intended.[48] Importantly, opinions were captured from both registrars, who often receive feedback, and consultants, who are experienced in providing it. The fact that saturation of emergent themes was reached in all the enquiry areas provides confidence that the key points have been captured. Importantly, parallel research by our team on performance debriefings in adult surgical settings identified the same features of debriefing,[49] and hence we now have some evidence of the applicability of the OSAD components across paediatric and adult clinical settings.[50] Further testing is required to formally establish the generalisability of these elements of debriefing.
Figure 1 Objective Structured Assessment of Debriefing (OSAD) in paediatrics.

The limitations of this study relate to the paucity of published literature on paediatric debriefing, which makes it difficult to obtain high-quality evidence on what is truly effective. Ideally, we would have carried out formal critical appraisals of the reviewed evidence, so as to base our initial evaluation of the elements of an effective debrief on better-quality evidence; however, this was not possible at the current early stage of development of the evidence base. The interviews may have introduced bias, in that participants could have reconstructed examples and events according to their perception and insight of the subject, and according to how they felt they should portray an opinion to the interviewer, rather than reporting their true thoughts. Furthermore, the sample of paediatricians interviewed may not reflect the views of the paediatric community as a whole. Nonetheless, thematic saturation was achieved, lending credibility to our findings. Further research should seek to ascertain the psychometric properties of OSAD. Although there is evidence to support the validity of its content, its reliability and feasibility of use in a simulated setting remain to be tested. It is also important to establish whether the OSAD tool measures the quality of debriefings consistently. The fact that the tool consists of observable behaviours helps here (ie, the amount of inference required on the part of the assessor is minimised), but inter-rater reliability should be evaluated. Further research should also evaluate the meaningfulness of the scoring system and the way OSAD scores correlate with externally derived criteria; that is, further validation of OSAD ought to be carried out. This should be hypothesis driven; for instance, we could hypothesise that more effective debriefings (as assessed via higher OSAD scores) lead to better transfer of learning from simulation-based training to the clinical setting, or from one training scenario to another.
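Inter-rater reliability of the kind just described is commonly summarised with a weighted Cohen's kappa, which credits near-misses on an ordinal 1–5 scale. The sketch below is illustrative only: the two raters and their scores are invented, the study itself reports no such analysis, and the function is a minimal from-scratch implementation rather than any published OSAD procedure.

```python
# Linearly weighted Cohen's kappa for two raters scoring the same items
# on an ordinal 1..categories scale (here, eight OSAD dimensions, 1-5).

def weighted_kappa(rater_a, rater_b, categories=5):
    n = len(rater_a)
    assert n == len(rater_b) and n > 0
    max_w = categories - 1
    # Observed disagreement, weighted by the distance between the two scores.
    observed = sum(abs(a - b) for a, b in zip(rater_a, rater_b)) / (n * max_w)
    # Chance-expected disagreement from each rater's marginal score frequencies.
    freq_a = [sum(a == c for a in rater_a) / n for c in range(1, categories + 1)]
    freq_b = [sum(b == c for b in rater_b) / n for c in range(1, categories + 1)]
    expected = sum(
        freq_a[i] * freq_b[j] * abs(i - j)
        for i in range(categories) for j in range(categories)
    ) / max_w
    if expected == 0:  # no marginal variation at all: treat as complete agreement
        return 1.0
    return 1 - observed / expected

# Two hypothetical observers rating one debriefing across the eight dimensions.
rater_1 = [4, 3, 5, 4, 4, 3, 4, 5]
rater_2 = [4, 4, 5, 3, 4, 3, 4, 4]
print(round(weighted_kappa(rater_1, rater_2), 2))  # 0.45
```

In practice an established implementation such as scikit-learn's `cohen_kappa_score(..., weights='linear')`, or an intraclass correlation coefficient, would be the usual choice; the hand-rolled version is shown only to make the calculation transparent.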
OSAD provides a model for debriefing that can ensure a level of standardisation across multiple sites, which is particularly pertinent for groups such as the Examining Paediatric Resuscitation Education Using Simulation and Scripting (EXPRESS) network.[5] Although developed for paediatric simulation debriefing, OSAD also has the potential to be used by other specialties, and it may be useful for providing feedback in the real-life clinical environment. As highlighted by the present study, the majority of registrars felt that they did not routinely receive feedback after managing a seriously ill child, leaving them with feelings of uncertainty and unanswered questions. If tested further in a real clinical environment, OSAD has the potential to address this deficit by providing a structured method of feedback to encourage workplace-based learning and to improve the quality of patient care and safety.
CONCLUSION
OSAD was developed from the published evidence base and an interview study as a tool to guide and assess debriefings of simulations of serious paediatric situations. The tool provides a structured approach to debriefing, with initial evidence supporting its validity. Pending further psychometric testing, it may be used to improve the quality of debriefing after paediatric simulation.
POST-SCRIPT
The OSAD tool for paediatrics is currently used by the London School of Paediatrics Simulation Committee, both for faculty development (to guide and assess novice debriefers) and to help simulation centres standardise the structure of the debrief for the regional ST3 specialty training simulation programme (the third year of a 7–8-year residency programme in the UK paediatric training system). Feedback received to date from simulation facilitators in London suggests that the OSAD tool is particularly useful for novice debriefers, as an aide-memoire immediately before and during a debriefing, and as a guide for reflecting on their debriefing practice with a mentor after facilitating. Although feedback has also suggested that the tool can appear rather complicated and 'wordy' on first impression, users have commented on its ease of use once they become familiar with its format and dimensions. Following the presentation of OSAD at national and international conferences, other simulation centres in the UK and overseas (including the Scottish Centre for Simulation and Clinical Human Factors, the University of Miami and Manchester Metropolitan University) report that they are using the instrument to help structure or evaluate their debriefs, or as part of 'train-the-trainers' debriefing courses for any subspecialty.
Main messages
▸ Performance debriefing after simulations of serious paediatric situations is crucial to maximise the learning experience and improve patient safety.
▸ A literature review and an interview study of paediatricians identified the most important features of an effective paediatric debriefing.
▸ This research produced the Objective Structured Assessment of Debriefing (OSAD) tool for paediatrics, a user-informed tool built on the current evidence base.
▸ OSAD may be used to guide and assess debriefings after simulations of serious paediatric situations.
Current research questions
▸ Does using the Objective Structured Assessment of Debriefing (OSAD) in paediatrics improve the quality of debriefings in simulated and clinical settings?
▸ What is the optimal method of using OSAD in paediatrics to train novice facilitators to improve their debriefing techniques?
▸ What are the psychometric properties of OSAD in paediatrics as an assessment tool?
Key references
▸ Issenberg SB, McGaghie W, Petrusa E, et al. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005;27:10–28.
▸ Dieckmann P, Molin Friis S, Lippert A, et al. The art and science of debriefing in simulation: ideal and practice. Med Teach 2009;31:e287–94.
▸ Raemer D, Anderson M, Cheng A, et al. Research regarding debriefing as part of the learning process. Simul Healthc 2011;6:S52–7.
▸ Fanning R, Gaba D. The role of debriefing in simulation-based learning. Simul Healthc 2007;2:115–25.
▸ Rudolph J, Simon R, Dufresne R, et al. There's no such thing as "nonjudgmental" debriefing: a theory and method for debriefing with good judgment. Simul Healthc 2006;1:49–55.
Acknowledgements The authors would like to thank Dr Mehrengise Cooper, the London School of Paediatrics and all participants who agreed to be interviewed for this study.

Collaborators Dr Mehrengise Cooper, London School of Paediatrics Simulation Network.

Contributors All authors listed contributed to the revision and final approval of this article. Study design: JR, SA, NS, RK. Data collection: JR, LT. Data analysis and interpretation: JR, LT, SA, NS. Drafting and revising article: JR, SA, NS, RK, LT. Final approval of version to be published: JR, SA, NS, RK, LT.

Funding The London Deanery Educational Fellowship Programme provided funding for this work. SA and NS are affiliated with the Imperial Patient Safety Translational Research Centre (http://www.cpssq.org), which is funded by the National Institute for Health Research, UK.
Competing interests None.
Ethics approval Institute of Education, University of London.
Provenance and peer review Not commissioned; externally peer reviewed.
REFERENCES
1 Aggarwal R, Mytton OT, Derbrew M, et al. Training and simulation for patient safety. Qual Saf Health Care 2010;19(Suppl 2):i34–43.
2 Undre S, Koutantji M, Sevdalis N, et al. Multidisciplinary crisis simulations: the way forward for training surgical teams. World J Surg 2007;31:1843–53.
3 Eppich W, Adler M, McGaghie W. Emergency and critical care paediatrics: use of medical simulation for training in acute paediatric emergencies. Curr Opin Pediatr 2006;18:266–71.
4 Cheng A, Duff J, Grant E, et al. Simulation in paediatrics: an educational revolution. Paediatr Child Health 2007;12:465–8.
5 Cheng A, Hunt E, Donoghue A, et al. EXPRESS: Examining Pediatric Resuscitation Education Using Simulation and Scripting. The birth of an international pediatric simulation research collaborative: from concept to reality. Simul Healthc 2011;6:34–41.
6 Kneebone R, Nestel D. Learning clinical skills: the place of simulation and feedback. Clin Teach 2005;2:86–90.
7 Shapiro MJ, Morey JC, Small SD, et al. Simulation based teamwork training for emergency department staff: does it improve clinical team performance when added to an existing didactic teamwork curriculum? Qual Saf Health Care 2004;13:417–21.
8 Reason J. Understanding adverse events: human factors. Qual Health Care 1995;4:80–9.
9 Neily J, Mills PD, Young-Xu Y, et al. Association between implementation of a medical team training program and surgical mortality. JAMA 2010;304:1693–700.
10 Issenberg SB, McGaghie W, Petrusa E, et al. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005;27:10–28.
11 Dieckmann P, Molin Friis S, Lippert A, et al. The art and science of debriefing in simulation: ideal and practice. Med Teach 2009;31:e287–94.
12 Raemer D, Anderson M, Cheng A, et al. Research regarding debriefing as part of the learning process. Simul Healthc 2011;6:S52–7.
13 Lederman L. Debriefing: towards a systematic assessment of theory and practice. Simul Gaming 1992;23:145–60.
14 Rudolph J, Simon R, Dufresne R, et al. There's no such thing as "nonjudgmental" debriefing: a theory and method for debriefing with good judgment. Simul Healthc 2006;1:49–55.
15 Salas E, Klein C, King H, et al. Debriefing medical teams: 12 evidence-based best practices and tips. Jt Comm J Qual Patient Saf 2008;34:518–27.
16 Martin J, Regehr G, Reznick H, et al. Objective structured assessment of technical skills (OSATS) for surgical residents. Br J Surg 1997;84:273–8.
17 Brett-Fleegler M, Rudolph J, Eppich W, et al. Debriefing assessment for simulation in healthcare: development and psychometric properties. Simul Healthc 2012;7:288–94.
18 Gururaja R, Yang T, Paige J, et al. Examining the effectiveness of debriefing at the point of care in simulation-based operating room team training. 2009 [online]. http://www.ahrq.gov/downloads/pub/advances2/vol3/Advances-Gururaja_7.pdf
19 Bishop S. The complete feedback skills training book. Farnham: Gower, 2000.
20 Dismukes R, Gaba D, Howard S. So many roads: facilitated debriefing in healthcare. Simul Healthc 2006;1:23–5.
21 Domuracki KJ, Moule CJ, Owen H, et al. Learning on a simulator does transfer to clinical practice. Resuscitation 2009;80:346–9.
22 Dreifuerst K. The essentials of debriefing in simulation learning: a concept analysis. Nurs Educ Perspect 2009;30:109–14.
23 Edelson D, Litzinger B, Arora V, et al. Improving in-hospital cardiac arrest process and outcomes with performance debriefing. Arch Intern Med 2008;168:1063–9.
24 Fanning R, Gaba D. The role of debriefing in simulation-based learning. Simul Healthc 2007;2:115–25.
25 Folkman J. The power of feedback: 35 principles for turning feedback from others into personal and professional change. New Jersey: John Wiley & Sons, 2006.
26 Gaba D. The future vision of simulation in healthcare. Qual Saf Health Care 2004;13:i2–10.
27 Harvard Business School. Giving feedback: expert solutions to everyday challenges. Boston: Harvard Business Press, 2007.
28 Issenberg SB, McGaghie W, Hart I, et al. Simulation technology for health care professional skills training and assessment. JAMA 1999;282:861–6.
29 Kilbourn B. Constructive feedback: learning the art. Virginia: OISE Press, 1990.
30 Kyle R, Murray WB. Clinical simulation: operations, engineering and management. Burlington, MA: Academic Press, 2008.
31 Lederman L. Debriefing: a critical re-examination of the postexperience analytic process with implications for its effective use. Simul Gaming 1984;15:415–31.
32 McGaghie W, Issenberg B, Petrusa E, et al. Effect of practice on standardised learning outcomes in simulation-based medical education. Med Educ 2006;40:792–7.
33 McGaghie W, Issenberg B, Petrusa E, et al. A critical review of simulation based medical education research: 2003–2009. Med Educ 2010;44:50–63.
34 Morgan P, Tarshis J, LeBlanc V, et al. Efficacy of high-fidelity simulation debriefing on the performance of practicing anaesthetists in simulated scenarios. Br J Anaesth 2009;103:531–7.
35 Pearson M, Smith D. Debriefing in experience-based learning. Simul Games Learn 1986;16(2):155–72.
36 Owen H, Follows V. GREAT simulation debriefing. Med Educ 2006;40:488–9.
37 Petranek C. Written debriefings: the next vital step in learning with simulations. Simul Gaming 2000;31:108–18.
38 Porter T. Beyond metaphor: applying a new paradigm of change to experiential debriefing. JEE 1999;22:85–90.
39 Rall M, Manser T, Howard S. Key elements of debriefing for simulator training. Eur J Anaesthesiol 2000;17:515–26.
40 Rubin I, Campbell T. The ABCs of effective feedback: a guide for caring professionals. San Francisco, CA: Jossey-Bass, 1997.
41 Rudolph J, Simon R, Rivard P, et al. Debriefing with good judgment: combining rigorous feedback with genuine inquiry. Anesth Clin 2007;25:361–76.
42 Rudolph J, Simon R, Raemer D, et al. Debriefing as formative assessment: closing performance gaps in medical education. Acad Emerg Med 2008;15:1–7.
43 Steinwachs B. How to facilitate a debriefing. Simul Gaming 1992;23:186–95.
44 van de Ridder J, Stokking K, McGaghie W, et al. What is feedback in clinical education? Med Educ 2008;42:189–97.
45 Westberg J. Fostering reflection and providing feedback: helping others learn from experiences. New York: Springer Publishing Company, 2001.
46 Schön D. Educating the reflective practitioner. San Francisco, CA: Jossey-Bass, 1987.
47 Lambden S, DeMunter C, Dowson A, et al. The Imperial Paediatric Emergency Training Toolkit (IPETT) for use in paediatric emergency training: development and evaluation of feasibility and validity. Resuscitation 2013;84:831–6.
48 Abell N, Springer D, Kamata A. Developing and validating rapid assessment instruments. New York, NY: Oxford University Press, 2009.
49 Ahmed M, Sevdalis N, Nestel D, et al. Identifying best practice guidelines for debriefing in surgery: a tri-continental study. Am J Surg 2012;203:523–9.
50 Arora S, Ahmed M, Paige J, et al. Objective structured assessment of debriefing (OSAD): bringing science to the art of debriefing in surgery. Ann Surg 2012;256:982–8.