Analysis

Overcoming challenges in implementing the WHO Surgical Safety Checklist: lessons learnt from using a checklist training course to facilitate rapid scale up in Madagascar

Abstract

The WHO Surgical Safety Checklist was launched in 2009, and appropriate use reduces mortality, surgical site infections and complications after surgery by up to 50%. Implementation across low-income and middle-income countries has been slow; published evidence is restricted to reports from a few single institutions, and significant challenges to successful implementation have been identified and presented. The Mercy Ships Medical Capacity Building team developed a multidisciplinary 3-day Surgical Safety Checklist training programme designed for rapid wide-scale implementation in all regional referral hospitals in Madagascar. Particular attention was given to addressing previously reported challenges to implementation. We taught 427 participants in 21 hospitals; at 3–4 months postcourse, we collected surveys from 183 participants in 20 hospitals and conducted one focus group per hospital. We used a concurrent embedded approach in this mixed-methods design to evaluate participants’ experiences and behavioural change as a result of the training programme. Quantitative and qualitative data were analysed using descriptive statistics and inductive thematic analysis, respectively. This analysis paper describes our field experiences and aims to report participants’ responses to the training course, identify further challenges to implementation and describe the lessons learnt. Recommendations are given for stakeholders seeking widespread rapid scale up of quality improvement initiatives to promote surgical safety worldwide.

Key questions

What is already known about this topic?

  • Use of the WHO Surgical Safety Checklist dramatically improves patient outcomes in surgery, including reductions of perioperative mortality, postoperative infections and complications.

  • Wide-scale implementation of the checklist has been difficult worldwide, and significant challenges exist in low-income and middle-income countries (LMICs).

What are the new findings?

  • A 3-day multidisciplinary checklist training programme can effectively promote personal and organisational change towards improved patient safety in surgery.

  • Many known challenges to implementation in LMICs can be successfully overcome if acknowledged and addressed.

Recommendations for policy

  • All surgical provision to LMICs should have a multidisciplinary checklist training component which focuses on adaptation to local needs and cultures to overcome known implementation challenges.

  • Collaboration and communication with key Ministry of Health and national medical leadership, beyond formal meetings and letters, is critical to sustainability of these initiatives.

Background

The 2009 WHO Surgical Safety Checklist reduces morbidity and mortality after surgery by up to 47% when used appropriately.1–3 Considering that the 313 million annual surgical procedures have a mortality rate of up to 10% and a disability rate of up to 17%,4 5 large-scale checklist implementation across low-income and middle-income countries (LMICs) has significant potential to save lives. While the checklist has been implemented successfully in several high-income countries (HICs),6–8 wide-scale uptake in LMICs has been slow. Current reports of successful implementation in LMICs are limited to single hospitals following extended and significant investment of both time and resources by a HIC partner.9 There is a pressing need for high-quality, low-resource strategies for large-scale implementation, including culturally sensitive training programmes with particular emphasis on overcoming known challenges to implementation.

The majority of existing literature on checklist implementation in LMICs is limited to single institutions presenting several recurring challenges. The most common challenges identified are design of the checklist; senior clinician resistance (active or passive); general paucity of buy-in owing to poor understanding, training and input; and failure of adaptation to local and personnel practice.6 8 10 Other reported challenges include general lack of supplies, functioning equipment and personnel,11 and a systematic review also highlighted general scepticism of the evidence base.12 Successful implementation has been facilitated by interactive team training sessions and clear, simplified adaptation of the checklist to local context.12

With rapid wide-scale implementation in mind, we developed a multidisciplinary 3-day checklist training model designed to overcome reported challenges to implementation. In this analysis paper, we describe the design of the programme, report participants’ perceptions and reactions at 3–4 months post-training and report participants’ self-reported challenges to checklist implementation. Using a concurrent embedded approach to a mixed-methods research design,13 we aimed to evaluate success of the training programme, understand participants’ responses to the training, identify continuing challenges to implementation and report lessons learnt.

Based on our experiences, we present lessons learnt for others seeking wide-scale, rapid implementation of nationwide surgical safety initiatives in LMICs.

Programme design

Mercy Ships operates the world’s largest civilian hospital ship, offering free surgeries and training across coastal Africa.14 Since 2013, we have been creating and adapting a short-format checklist training course designed to address and overcome challenges to implementation. Our experience in Guinea in 2013 demonstrated better results with multidisciplinary training in a hospital setting rather than teaching individual anaesthetists or surgeons in a classroom and expecting them to return to their local setting and implement change15; a pilot course in the Republic of Congo showed that a short 4-day course led to improved indices of surgical safety sustained at 18 months.16 We also sought more involvement from local leadership (hospital directors and surgeons) and from the Ministry of Health, aiming to use both a ‘bottom-up’ and a ‘top-down’ approach to checklist implementation. From October 2014 to June 2016, Mercy Ships was based in Toamasina, Madagascar. Madagascar has a population of approximately 24 million, a surgical workforce density of 0.78 providers per 100 000 people and an annual surgical volume of 135–191 procedures per 100 000 people.17 Most regional referral hospitals lack reliable electricity, oxygen and paediatric equipment and cannot meet the World Federation of Societies of Anesthesiologists’ (WFSA) minimum standards.18

From December 2014 to March 2015, Mercy Ships undertook a pilot programme of checklist implementation in two regional referral hospitals in Madagascar. The pilot programme allowed us to shorten the course to 3 days, adapt the workshops to Malagasy culture and train two Malagasy physicians in teaching and evaluation methodology. Then, in collaboration with the Ministry of Health and the professional medical society of Madagascar, we designed a wide-scale checklist implementation programme to reach the 21 largest governmental surgical hospitals in Madagascar. The programme took place from September 2015 to May 2016 and included an initial 3-day training course in each hospital; informal telephone follow-up 6 weeks later with members of hospital staff identified during the training as important to sustainable implementation; and an in-person follow-up visit at 3–4 months postcourse to evaluate participant experiences and organisational change and to address ongoing challenges to implementation through surveys and focus groups.

The 21 hospitals chosen by the Ministry of Health and Mercy Ships represented all functioning governmental regional referral hospitals at the time of project implementation. Hospital directors received introductory written material from the Ministry of Health and Mercy Ships explaining the evidence base for checklist use and the course outline. All hospital directors completed a questionnaire regarding their facilities, existing safety practices and routine surgical practice. These data provided a baseline understanding of on-the-ground practice and procedure and helped the team anticipate local contextual challenges to implementation, such as the lack of standardised surgical counting and pulse oximeters (online supplementary appendix 1). Pulse oximeters are recommended by the WFSA as essential monitoring during anaesthesia19 and are considered mandatory in HICs, yet they are mostly absent from operating rooms in LMICs.20 Since a pulse oximeter is the only piece of equipment needed to implement the checklist, we decided to donate sufficient numbers for each operating and recovery room to overcome the challenge that lack of equipment poses to implementation.

Hospital directors were asked to invite participants to the training. Entire surgical teams were asked to be present for the 3-day training course, including surgeons, anaesthesia providers, operating room nurses and any other perioperative staff members (eg, nursing aides and surgical assistants). Hospitals were requested not to schedule non-emergency surgery during the training period. The Ministry of Health and hospital directors did not consider this to pose any significant problem, since elective surgery usually occurred only in the mornings and not every day of the week; capacity was therefore judged sufficient to reschedule elective surgery around the training course. The course outline was adapted in each hospital to accommodate emergency surgeries and the working schedules (such as morning rounds) of each team.

The 3-day training course commenced with an introductory lecture explaining the checklist and presenting the evidence that its use saves lives and reduces complications. Participants then formed small multidisciplinary groups to adapt the checklist to their environment, followed by classroom simulations of the checklist and further local adaptations. Over the 3 days, the simulations progressed from simple case scenarios to complex ones (eg, patients with allergies, major haemorrhage or lost instruments) and from the classroom to the operating room; in several hospitals, the checklist was used during real surgical cases.

Specific attention was given to the following during course design based on pilot experiences:

  1. Resolution of material challenges

    1. The checklist requires the use of pulse oximetry. In hospitals where this critical equipment was lacking, donations were made in sufficient quantities to ensure compliance with this part of the checklist.

  2. Improvement of technical skills

    1. Where specific skills contained in the checklist were lacking, such as counting needles, swabs and instruments, these skills were taught, and facilitative tools such as a laminated ‘counting sheet’ (online supplementary appendix 1) were designed and adapted by local staff.

  3. Attention to multidisciplinary teamwork

    1. Ongoing multidisciplinary feedback and group discussion were encouraged to aid all participants’ understanding of different roles and responsibilities within the operating room team and to encourage teamwork through mutual understanding.

  4. Attention to leadership and sustainability

    1. A dinner out with key regional and hospital leadership was arranged to allow further discussion of the benefits of checklist implementation and secure top-level support for the programme in a small, informal gathering.

    2. A formal closing ceremony with the hospital director and leadership was organised, including certificates for each participant and a formal handover of donated equipment, laminated copies of the adapted checklist and count sheets.

    3. Informal telephone calls to key hospital staff 6 weeks after the course encouraged continued checklist use, identified early challenges to implementation and discussed potential solutions.

Addressing previously identified challenges in course design

Several aspects of the training were developed to specifically address and overcome challenges to implementation identified by prior research on checklist implementation; these can be found in table 1.

Table 1
|
Addressing known challenges in course design

The training team usually consisted of five people: two Malagasy doctors (VAR and HNR), a HIC anaesthetist (LSB, VA or MCW), a surgical nurse (AH) and a project manager (JC or KLC) or medical student (EB). MCW, KLC, JC and AH had experience in checklist implementation and trained the others. For remote hospitals only accessible by a three-seater plane, the team consisted of three people. All the authors except HHA formed part of the training team at different times.

Language considerations

The national languages in Madagascar are Malagasy and French. Lectures were taught in French and summarised in Malagasy. The majority of the training team spoke French. Translation was provided as needed by the Malagasy physician team members who were not professional translators. In most regional hospitals, the physicians and senior administrative leadership spoke French, whereas nurses and health aides often were more comfortable in Malagasy.

Evaluation of the course and self-reported challenges to implementation

The evaluation aimed to assess the success of the training programme, understand participants’ responses to the training, identify challenges to implementation and report lessons learnt.

Evaluation was based on the Kirkpatrick model, which is used across disciplines and measures impact using four levels (table 2).21–23 Level 1 and 2 results were used in ongoing course monitoring to ensure programme beneficence and allowed continual adaptation of the programme for maximum participant enjoyment and learning. This analysis paper presents level 3 and 4 results, analysed using a mixed-methods design, as recommended for complex patient-safety research by Brown et al 24 for its strength in informing in-depth understanding and development of theory while providing evidence of effectiveness from both quantitative and qualitative perspectives. We used the concurrent embedded mixed-methods design13 as the priority was first to establish whether or not behaviour had changed and then to explore the reasons why.

Table 2
|
Kirkpatrick model for evaluating educational courses and our data sources

Three to 4 months after each course, a team of three to four people revisited each hospital to:

  • Conduct one focus group with operating room staff, using a focus group guide (online supplementary appendix 2) to discuss the impact of the checklist and challenges to implementation.

  • Administer an anonymous survey questionnaire using six open-ended questions to measure changes in personal and organisational practice and identify challenges encountered in implementation. The questionnaire also used two closed questions with a 3-point Likert-scale response to determine participants’ individual perceptions of changes in attitude to teamwork, communication, organisation, infection control and safe anaesthesia practice. Paper survey questionnaires, written in Malagasy, were distributed to each participant in attendance immediately following the focus group; responses were translated into English in electronic format by VA and HNR for analysis. The survey questionnaire can be found in online supplementary appendix 3.

Focus groups were carried out by one or more of LSB, KLC, EB or MCW, either in French or in English with Malagasy translation by VA or HNR, depending on the comfort level in French of the participants and interviewers. Any perioperative operating room staff were able to attend, working around their shift and surgical schedules. All focus groups took place in the participants’ hospital and lasted 45–90 min. Responses were not audio-recorded and transcribed owing to budget constraints, but notes were either typed directly into an electronic device or handwritten contemporaneously on paper by a team member. All participants were over 18 years of age and gave uncompensated, voluntary, verbal consent to participate.

Analysis

Participant free-text responses from anonymous surveys and data from focus groups were translated as needed, grouped by category or question in Excel and then manually analysed using thematic analysis25 26 by KLC and LB. Important topics in the survey data were identified and highlighted by one or both researchers and manually coded; codes were then grouped into related themes, which were agreed on by KLC and LB. No analysis software was used. Descriptive statistics, computed using Microsoft Excel and StatPlus, were used to analyse the quantitative Likert-scale responses.
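As a purely illustrative sketch of the descriptive statistics step (the study itself used Microsoft Excel and StatPlus, not code), the tabulation of 3-point Likert-scale responses into counts and percentages could be reproduced as follows; the category labels and example responses shown are hypothetical, not study data.

```python
# Illustrative sketch only: the study used Microsoft Excel and StatPlus.
# Category labels and example responses are hypothetical, not study data.
from collections import Counter

def likert_summary(responses):
    """Return the count and percentage of each Likert category."""
    counts = Counter(responses)
    n = len(responses)
    return {category: {"n": count, "pct": round(100 * count / n, 1)}
            for category, count in counts.items()}

# Hypothetical responses for one survey domain (eg, teamwork).
responses = ["improved", "improved", "no change", "improved", "worse"]
print(likert_summary(responses))
# {'improved': {'n': 3, 'pct': 60.0}, 'no change': {'n': 1, 'pct': 20.0},
#  'worse': {'n': 1, 'pct': 20.0}}
```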

Integration of quantitative and qualitative analyses

We used a concurrent embedded approach to this mixed-methods analysis13: our priority in data collection was a quantitative approach to determine the success of the training programme; secondarily, we aimed to identify challenges to implementation of a nationwide patient safety initiative. These challenges form our qualitative analysis, which supports the quantitative descriptive statistics on behaviour and perception among participants presented below.

Behavioural change (Kirkpatrick levels 3–4)

In a 9-month period (September 2015–May 2016), Mercy Ships trained 427 participants in 21 hospitals. Evaluation through focus groups and surveys of 183 participants (42.9%) occurred at 3–4 months post-training in 20 hospitals. One hospital was not visited owing to a scheduling change caused by a cyclone. The median size of the focus groups was eight participants (range 4–16).

Reported personal behavioural change (Kirkpatrick level 3)

On thematic analysis of reported personal behavioural change as a result of the training programme, three key themes emerged; these are reported in table 3. Descriptive statistics were used to report the percentage of participants describing each behavioural change, supporting the themes that emerged from the qualitative analysis.

Table 3
|
Reported personal behavioural change, grouped by theme

Reported changes to organisational practice and improved patient outcome (Kirkpatrick level 4)

Questionnaire responses (using a 3-point Likert scale) to the impact of training on interactions with colleagues and overall organisational change are shown in table 4. At 3–4 months, teamwork and general organisation were most commonly reported to have improved (both 81%).

Table 4
|
Reported impact of checklist implementation on interactions within the operating room team and organisational practice at 3–4 months post-training (n=183)

The Likert-scale questionnaire responses were supported by the thematic analysis of the free-text questions and focus groups, which also reported a perception of improved teamwork, communication, diligence and patient safety, as seen in the following quotations: “The work is coordinated, we have peace of mind and the patients are safe” (H7) and “The work is much more harmonious” (H8) (see also table 3).

Most sites reported that although they had good working relationships before the course, teamwork and communication still improved. Thematic analysis identified this as occurring through knowledge sharing and a better understanding of shared responsibility, as seen in the following quotation: “A good result of the training is that it got the surgeon communicating with everyone in the operating room to make sure they are good, and that is very good for the patient’s surgery” (H6).

One participant relayed direct patient impact experience: “We had a patient recently who had delivered her first child by caesarean in a different city with a bigger hospital; she didn’t want to be delivering in our small hospital but it was necessary. We went through the checklist before beginning and she commented that they hadn’t done that during her first delivery; she felt very safe in the hands of our small, rural team” (H16).

Thematic analysis identified changes related to diligence with blood transfusion practice. One participant identified this as the most important thing resulting from the training and said: “If we anticipate a blood loss in surgery but don’t have blood ready or a donor, we can’t proceed” (H11). Another described how verifying patient identity during the checklist prevented administration of the wrong blood to the patient (H3).

Continual adaptation of the checklist and counting practices after training was also identified by thematic analysis. Some hospitals put a photocopy of the checklist in each patient’s record, whereas others used the laminated copy on the wall. One hospital transferred the counting sheet from paper to writing on the tiles on the wall; others showed the evaluation team a pile of completed counting sheets that had been photocopied and kept in a verification file.

A total of 74% of respondents felt that the checklist had changed the infection control practices in their hospital (table 4). Several hospitals anecdotally reported a reduction in postoperative infections, believed to be a result of verifying that antibiotic prophylaxis had been administered at the appropriate time: “We’ve always used antibiotics but it helps to verify it’s been given in the right time frame for every patient” (H15).

No hospital reported any detrimental effect of the training.

Challenges to implementation

On thematic analysis of the open-ended survey questions and focus group discussions concerning challenges to implementation, three predominant themes emerged: emergency surgery, lack of personnel and unwillingness to change. Group discussions after the individual surveys focused on these challenges expressed by participants, and together the participants and interviewers explored possible solutions; details are given in table 5.

Table 5
|
Themes describing the challenges to checklist implementation

Our course was designed to overcome prior known challenges to implementation (table 1). This could explain why few of these previously published challenges were reported by our participants. In particular, the multidisciplinary nature of our course using simulations with participants role-playing different members of the team may have helped break down hierarchical challenges through increased understanding of individual roles and responsibilities. Participants were encouraged and empowered to improve patient outcomes immediately, without needing to wait for outside help in the form of supplies, equipment and personnel. Unwillingness to change behaviour over time has been cited previously in both LMICs and HICs11 12 and is difficult to overcome. We had attempted to overcome this by engagement of the Ministry of Health, hospital directors and senior hospital leadership but were unable to directly assess the difference this made.

Limitations

Evaluating the impact of training in LMICs is challenging, and while mixed-methods design and Kirkpatrick evaluation of training programmes have been previously reported,27 our evaluation has limitations. Primarily, the lack of recording and transcription of interviews and the lack of data analysis by gender and profession limit the qualitative depth of our analysis. The preintervention evaluation was limited to survey reports by hospital directors rather than individual participants and thus lacks the same depth as the postintervention evaluation. Most of the data rely on participants’ self-reported feedback and are open to positive and negative responder bias: positive bias creates a falsely good impression to please the evaluators, whereas negative bias hopes to attract further investment by highlighting unmet needs and ongoing problems. Focus groups are a key component of qualitative research in HICs and have been used successfully in LMICs, but in a hierarchical culture they can be open to bias from participants’ fear of speaking up, and group culture can interfere with individual expression.28 Our follow-up rate was only 43% (183/427); the results may be subject to positive responder bias, where only those with something good to say attended the evaluation, whereas those who did not like the training stayed away for fear of confrontation or of having to speak negatively to a non-governmental organisation that had offered free training and donated equipment. Another explanation for the low follow-up rate is that it became clear during evaluation visits that the need for the entire surgical team to attend the evaluation was not well understood or communicated to the participants, and many hospitals assumed only a few participants would be sufficient to report back for the group as a whole.

Despite these limitations, the strength of our approach lay in undertaking a countrywide implementation of the WHO checklist in close collaboration with the Ministry of Health. We designed the course to address known challenges directly and continually sought to understand additional challenges as they emerged. We used a well-known framework, the Kirkpatrick model, to evaluate our course at 3–4 months and to generate lessons learnt concerning nationwide checklist implementation.

Conclusion

This analysis paper describes the development of a 3-day checklist training programme designed to overcome known implementation challenges and with rapid scale up in mind. The evaluation in 20 hospitals, based on the Kirkpatrick model, indicates that the majority of participants were able to institute changes in both personal behaviour and organisational practice which were sustained at 3–4 months. The training programme successfully overcame several previously identified implementation challenges,11 except resistance to change from older staff. Additional challenges prevalent in Madagascar were lack of personnel and difficulties in using the checklist in emergency situations. Based on our experience in Madagascar, we offer the following lessons learnt as recommendations to others attempting widespread checklist implementation.

Lessons learnt

  • National and regional partnership with the Ministry of Health and local partnership with hospital leadership is critical and must move beyond formal office meetings to ongoing discussions; for example, over a meal and/or ongoing contact by telephone.

  • Programme design in collaboration with host country nationals and continual adaptation of teaching materials to the local culture and context are necessary for local buy-in. A one-size-fits-all approach is unlikely to succeed.

  • Systematic nationwide implementation is recommended, as often staff members rotate between centres, and single-hospital implementation may not be sustainable.

  • Follow-up is important, in person if possible or by telephone, to discuss ongoing implementation challenges and to reinforce the importance of checklist use, preventing the initial training from fading from memory.

  • All surgical provision to LMICs should have a multidisciplinary checklist training component including adaptation to local needs and culture.

This project was approved by the Mercy Ships Institutional Review Board. Additionally, Mercy Ships was invited to Madagascar by the president and prime minister and had a signed protocol for the delivery of surgical services and education. The Ministry of Health approved and assisted in the development and evaluation of the educational programme.