
Acute cardiovascular care
The role of simulation-based education in cardiology
  1. Khalid Barakat
  1. Correspondence to Dr Khalid Barakat, Department of Cardiology, Barts Heart Centre, Barts Health NHS Trust, St Bartholomew’s Hospital, London EC1A 7BE, UK; khalid.barakat{at}bartshealth.nhs.uk


Learning objectives

  • To develop an understanding of the potential use of simulation in healthcare.

  • To understand the importance of identifying a trainee’s or learner’s needs before developing simulation-based training.

  • To develop an understanding of the minimal standards for delivering high-quality simulation as set out by the Association for Simulated Practice in Healthcare.

Introduction

Simulation-based education (SBE) is a ‘technique and not a technology’ which aims to ‘replace or amplify real-life experiences with guided experiences that evoke or replicate substantial aspects of the real world in a fully interactive environment’.1 Many simulations within healthcare can achieve a high degree of fidelity, in which participants behave and act as they would in a real-life scenario.1

Despite the paucity of evidence underpinning the traditional apprenticeship model for training,1 most reviews discussing the potential of SBE in healthcare examine the evidence that SBE is either equivalent to or better than this traditional model. The purpose of the current review is to explore some of the wider educational principles underpinning the use of SBE in healthcare and to examine the potential drivers of, and obstacles to, the development of SBE in cardiology training, human factors training and the maintenance of competency. Commonly used terminology in simulation-based education is summarised in table 1.

Apprenticeship training model and SBE: two sides of the same coin

In an ideal training programme, a novice learning a new practical skill would be guided by an expert through the five stages of skill acquisition described by Dreyfus and Dreyfus.2–6 In such a programme, four components would contribute to successful learning:

  • The pace of learning would be determined by the learner.

  • There would be immediate feedback for the trainee on their performance.

  • There would be as little risk to the patient as possible, achieved through careful case selection (simple cases initially, moving to more complex cases with experience).

  • At each stage of training, the needs of the trainee would determine the objectives of the training.

Each of these components of training can be difficult to achieve in a workplace-based learning environment, which can be ‘unregulated, unpredictable, undertheorised, unstructured, with variable training, and learner development’.7 In educational terms this was described by Vygotsky,8 who coined the term ‘the zone of proximal development (ZPD)’, defined as ‘the distance between the actual development level of a learner and that of their potential development level’. The influence of expert guidance on the learner is greatest within this zone, and it is for this reason that a failure to control or manipulate the clinical environment to meet the needs of the learner will contribute to the learner’s perception that training is fragmented. A British Cardiovascular Society working group looking at simulation-based learning identified the four components of idealised training discussed above as best suited to being delivered through SBE.9 Furthermore, the working group also identified the European Working Time Directive, pressures of service provision and workflow, variations in the frequency of certain procedures in some training centres, and patient safety as potential drivers for the increasing use of SBE in cardiology training.

SBE needs to be seen as complementary to and embedded within a medical service delivering high-quality healthcare.1 9 As an example of such a scenario, Kneebone et al 10 described a situation in which trainees move seamlessly between the clinical and simulation environment. In such a setting, the training needs of an individual trainee would determine their learning objectives within a simulated environment, followed by a return to the clinical environment and traditional training, with a subsequent return to the simulated environment in a cyclical pattern. Taking training in pacemaker implantation as an example, the learner’s identified training need might be suturing of the wound. This could be addressed with a specifically designed task-based simulation and then reinforced with training in the real world. Review of training progression would allow one to see if the objectives had been achieved. If they had not, then further simulation-based training could be developed and the cycle repeated. This has the potential advantage of allowing repetitive practice away from the patient and then further training in the ‘real world’ to ensure that any simulated practice is directly applicable to very specific skills.

Deliberate practice

In their observations of musicians at the Berlin Music Academy, Ericsson et al 11 estimated that it takes 10 000 hours (20 hours a week for 10 years) to become an expert musician. Since then they have hypothesised that experts are ‘made and not born’, and that to become an expert in anything from sport to music to medicine one needs to engage in deliberate practice.12 In Ericsson et al’s original study of musicians of the Berlin Philharmonic Orchestra, they noted that the musicians, all experts in their own right, identified individuals within their orchestra whom they deemed ‘masters’.11 The latter were defined by their ability to achieve performance levels for a new piece of music more quickly than their peers, and by a significantly reduced rate of decline in their performance in their 50s, such that they were able to extend their performing careers significantly into their sixth decade. Remarkably, there was no significant difference in the total amount of time that experts and masters spent in practice, but there was a difference in the type of practice that they performed. The masters focused much more on their perceived weaknesses during practice than the experts did. If such deliberate practice can separate highly accomplished musicians in this manner, then perhaps targeted simulated practice might deliver similar results for those performing complex technical procedures within medicine.

The five-stage model of skill acquisition

Understanding the developmental stages that a learner moves through during the acquisition of skills is important when developing needs-based training for an individual. Dreyfus and Dreyfus described a linear progression of stages in the acquisition of skills based on phenomenological research of fighter pilots, chess players and foreign language students.2–6 They described five stages of increasing expertise. The first stage is that of the novice. An individual is taught rules for responding to context-free features or inputs in a binary fashion, analogous to a simple computer program. They give the example of a learner car driver being given the rule to change gear when the speed reaches a certain point. In this context the learner will always change gear at that speed.

The second stage (advanced beginner) is reached with increasing experience. The learner, either independently or through instruction from their mentor, begins to construct ‘maxims’ which allow them to contextualise these inputs and to respond appropriately to these newly recognised aspects. For the learner driver, the revving of the engine becomes an aspect to consider in addition to the speed on the speedometer when deciding the timing of a gear change.

Competency is the third stage of development and potentially the most challenging. During this stage an overwhelming number of contextualised inputs, as well as responses, are recognised. Through experience or formal instruction, the learner begins to devise a hierarchical structure for these multiple inputs, allowing the formation of a plan. This process of prioritising multiple inputs and aspects can lead to anxiety and fear in the trainee, who feels responsible for any plans formulated; in the earlier two stages, a learner can blame any failure on the lack of adequate rules for the situation encountered. Using the car driver as an example, a driver approaching a bend on a camber will need to consider the speed, the revving of the engine and the road conditions (wet or dry) before making a decision on speed on entry to the corner. With increasing experience the learner learns to prioritise these different aspects and to decide which is or are the most important for formulating an appropriate response.

An individual enters the fourth stage (proficient) when recognition of the problem becomes almost intuitive. Once the problem is recognised, the proficient individual still uses rules and analysis to formulate a response. The car driver will immediately recognise the problems of entering a corner in the wet and will then formulate an appropriate strategy for negotiating it.

An individual is deemed an expert when the whole process becomes intuitive and almost instantaneous, drawing on implicit knowledge formed through years of experience. The driver entering a corner will perceive the problems and respond appropriately without conscious thought.

Dreyfus and Dreyfus associate intuitive knowledge with increasing levels of expertise, as others have done before them.13

In the Dreyfus and Dreyfus model, the expert acts in a non-reflective, intuitive manner, which does not fit the construct of the reflective learner or physician that is central to the learning described by Schön.14 However, Dreyfus and Dreyfus note that at the competent stage the learner experiences anxiety and fear because they feel responsible for their analysis and decisions. This is akin to reflection, but it is specifically described only at this level of development.

Although the model describes a very linear developmental progression, each stage of skill acquisition requires experience, which implies recurrent exposure to a wide range of situations, many of which may be similar, so that development occurs in a potentially spiral rather than strictly linear fashion.

Although the Dreyfus and Dreyfus model was based on observations of fighter pilots, the different stages of skill acquisition can be seen as applicable to cardiology trainees progressing through technical skills training. By breaking down the skills required for any given procedure into its constituent parts, one is able to develop a progressive ‘building blocks’ approach to task training. Taking coronary angiography as an example, its constituent parts would include familiarisation with the basic equipment, arterial access and sheath introduction, guidewire and catheter manipulation to the ascending aorta, intubation of the left and right coronary arteries, correct use of the three-way taps on the manifold, controlled injection of dye, and optimum acquisition of images. By focusing serially on each individual constituent element, ensuring competency at each stage before moving to the next, one can allow a trainee to concentrate on very small steps and minimise the feeling of being overwhelmed by the whole. This approach also allows precise learning objectives to be defined, which can then be addressed in real-world training and supplemented with simulation-based learning as described by Kneebone et al.10 Taking the use of the manifold as an example, the sequence from filling the syringe with dye to injecting the coronary artery and returning the system to pressure monitoring is a seven-step process which should ideally be completed without the trainee having to look down at the manifold. Training in this constituent element of coronary angiography is ideal for part-task simulation away from the clinical environment, both to reduce patient risk and to reduce the amount of time required in real-world training to achieve competency in the procedure as a whole.

Delivering high-quality simulation

High-fidelity immersive simulation, which is required to deliver the type of human factors training discussed below, is expensive both in terms of faculty and infrastructure. A review of the educational literature demonstrates that the greatest benefit of simulation comes from the debrief, and that this benefit is far greater than the skills acquisition achieved through task training.15 It is recommended that the time allotted for a debrief be twice, if not three times, the length of the original simulation. A high-quality debrief will allow facilitated reflection by the participants not only on what was done but also on the underlying thought processes (mental frames) underpinning those decisions, good or bad, right or wrong. This allows the greatest learning to be achieved by reinforcing those mental frames that are correct and realigning those that are not.

The Association for Simulated Practice in Healthcare (ASPiH), which ‘is the leading national learned body in the UK that focuses on the use and development of SBE and technology enhanced learning (TEL) for healthcare workforce education as well as training and patient safety improvement’, has published 21 minimum quality standards for simulation-based education and training.16

These standards mandate that the development of simulation-based training is preceded by formal curriculum mapping, undertaken by individuals with expertise in SBE, to define those parts of the syllabus that are best delivered, or perhaps can only be delivered, with SBE.

Our own institution, like others nationally, has recognised the difficulty that trainees have in achieving competency in pericardiocentesis. To address this, we have organised simulation-based training in pericardiocentesis to occur at regular intervals during a trainee’s career, with the opportunity for regular direct observation of procedural skills (DOPS) of simulated pericardiocentesis. This training has been mapped to the 2010 specialty training curriculum for cardiology (amended in 2016) published by the Joint Royal Colleges of Physicians Training Board,17 which states that ‘a trainee needs to be able to carry out pericardiocentesis in the diagnosis and treatment of patients with pericardial disease’. Furthermore, the trainee needs to be able to demonstrate knowledge of the indications for diagnostic and therapeutic pericardiocentesis, demonstrate the ability to perform pericardiocentesis and demonstrate sympathy for the patient’s anxiety. On the basis of these criteria, we have developed three levels of competency in pericardiocentesis:

  • Level 1: the trainee can demonstrate knowledge of the indications for diagnostic and therapeutic pericardiocentesis, and demonstrate an understanding of the technical skills required for performing pericardiocentesis.

  • Level 2: in addition to level 1 competency, the trainee can demonstrate the ability to undertake a pericardiocentesis.

  • Level 3: in addition to level 2 competency, the trainee can demonstrate sympathy for the patient’s anxiety.

On the basis of the defined levels of competency, our training programme for each level is as follows:

  • Level 1: lecture, seminar and tutorial-based teaching, which may include online access to learning material. Assessment can take the form of a case-based discussion, viva or knowledge-based assessment.

  • Level 2: simulation-based training using a part-task trainer. Assessment can take the form of a competency-based assessment with DOPS of a simulated pericardiocentesis using a part-task trainer.

  • Level 3: immersive simulation-based training using hybrid simulation combining a part-task trainer with a simulated patient. Assessment can take the form of a competency-based assessment with DOPS of a simulated pericardiocentesis scenario.
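
The three training levels above constitute, in effect, a small curriculum map linking each competency level to a training modality and an assessment. Purely as an illustrative sketch (the data structure, field names and example trainee record below are assumptions made for illustration and are not part of the published curriculum), such a map could be recorded programmatically, for example to keep track of which simulation-based sessions and DOPS assessments a trainee has yet to complete:

```python
# Illustrative sketch only: a minimal, hypothetical way of recording the
# pericardiocentesis competency map described above.

from dataclasses import dataclass


@dataclass(frozen=True)
class CompetencyLevel:
    level: int        # competency level (1-3)
    training: str     # how the training is delivered
    assessment: str   # how competency is assessed


PERICARDIOCENTESIS_MAP = [
    CompetencyLevel(1,
                    "Lecture, seminar and tutorial teaching (may include online material)",
                    "Case-based discussion, viva or knowledge-based assessment"),
    CompetencyLevel(2,
                    "Simulation-based training using a part-task trainer",
                    "DOPS of a simulated pericardiocentesis on a part-task trainer"),
    CompetencyLevel(3,
                    "Immersive hybrid simulation (part-task trainer plus simulated patient)",
                    "DOPS of a simulated pericardiocentesis scenario"),
]


def outstanding_levels(signed_off):
    """Return the competency levels a trainee has not yet been signed off for."""
    return [c for c in PERICARDIOCENTESIS_MAP if c.level not in signed_off]


if __name__ == "__main__":
    # Hypothetical trainee who has been signed off at level 1 only.
    for c in outstanding_levels({1}):
        print(f"Level {c.level}: {c.training} -> {c.assessment}")
```

Recording the map in a structured form of this kind makes it straightforward, at programme level, to audit which simulation sessions and assessments still need to be scheduled for each trainee.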

The use of simulation beyond education and training of trainees

Human factors and single and multidisciplinary team training

Studies from aviation have demonstrated that incidents are more commonly due to human failures than to technical failures.18 Moreover, for experienced pilots and operators, such human failures are less a result of failures in technical skills and more the result of failings in non-technical skills. The latter, initially termed crew resource management skills by the European aviation industry in the 1990s, are defined as the cognitive, social and personal skills that complement technical skills and contribute to safe and efficient practice.18 The major components of non-technical skills (human factors) as applied to healthcare include the following:

  • Leadership.

  • Communication.

  • Teamwork.

  • Situational awareness.

  • Problem solving.

  • Management of stress and fatigue.

Incidents within healthcare, as in the aviation or nuclear industries, are usually caused by human factors. In recognition of the importance of these human factors, the National Quality Board (NQB) has released a concordat19 which aims to ‘enhance clinical performance through an understanding of the effects of teamwork, tasks, equipment, workspace, culture and organisation on human behaviour and abilities and application of that knowledge in clinical settings’. The NQB aims to achieve this by embedding human factors principles and practice into the systems, culture and processes of the National Health Service. The concordat aims to empower both commissioners and providers to develop training in human factors as a quality indicator, which in time will become an important, perhaps mandatory, consideration when commissioning services. It is this linking of patient safety, commissioning and training in human factors that will potentially have the greatest impact in increasing simulation-based training across all sectors of the healthcare industry.

Maintenance of certification, revalidation and relicensing

Although not mandatory, the American College of Cardiology is using percutaneous coronary intervention (PCI) simulation as part of the process of maintaining certification in interventional cardiology.20 This is based on the results of a study with the Samantha Sim suite,21 which could differentiate between novices and those who were experts or skilled operators in PCI. Interestingly, it did not have the resolution to differentiate between those who were proficient and those who were experts. Demonstrating ongoing competency has become increasingly important in the delivery of healthcare, and it is difficult to see how simulation-based assessment will not become an important part of the assessment of individual cardiologists, and perhaps also cardiology teams, much akin to the processes already in place in the nuclear and airline industries.

Conclusions

The use of simulation in healthcare is increasing rapidly and, driven by concordats such as that of the NQB, will become embedded within all aspects of high-quality, safe and effective healthcare delivery. It is, however, an expensive training modality, and its use needs to be specifically targeted in line with the minimum standards set out by ASPiH. High-quality simulation, if used appropriately and embedded in a complementary fashion within other aspects of training, promises to be effective in the education of trainees in practical skills, in the broader training of teams in non-technical skills and perhaps in the maintenance of competency of teams and individuals.

Table 1

Common terminology used in simulation-based education

Key messages

  • Simulation-based education is a technique not a technology.

  • Learning objectives need to be identified before developing a simulation-based education programme.

  • The greatest learning from simulation-based education is dependent on the quality of feedback given to the learner.

  • The benefit of simulation-based education in reducing patient risk through human factors training is probably greater than its potential benefit in manual skills acquisition.

CME credits for Education in Heart

Education in Heart articles are accredited for CME by various providers. To answer the accompanying multiple choice questions (MCQs) and obtain your credits, click on the ‘Take the Test’ link on the online version of the article. The MCQs are hosted on BMJ Learning. All users must complete a one-time registration on BMJ Learning and subsequently log in on every visit using their username and password to access modules and their CME record. Accreditation is only valid for 2 years from the date of publication. Printable CME certificates are available to users that achieve the minimum pass mark.

References

  1. *1.
  2. 2.
  3. 3.
  4. 4.
  5. 5.
  6. 6.
  7. 7.
  8. 8.
  9. 9.
  10. 10.
  11. 11.
  12. 12.
  13. 13.
  14. 14.
  15. *15.
  16. 16.
  17. 17.
  18. 18.
  19. *19.
  20. 20.
  21. *21.

Footnotes

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests None declared.

  • Patient consent Not required.

  • Provenance and peer review Commissioned; externally peer reviewed.

  • Author note References which include a * are considered to be key references.