
Evaluation of Ebola virus disease surveillance system capability to promptly detect a new outbreak in Liberia
Fulton Quincy Shannon, II1,2, Luke L Bawo2, John A Crump1, Katrina Sharples3, Richard Egan4, Philip C Hill1

1 Centre for International Health, University of Otago, Dunedin, New Zealand
2 Planning Research and Development, Republic of Liberia Ministry of Health, Monrovia, Liberia
3 Mathematics and Statistics, University of Otago, Dunedin, New Zealand
4 Preventive and Social Medicine, University of Otago, Dunedin, New Zealand

Correspondence to Fulton Quincy Shannon, II; fqshannon{at}gmail.com

Abstract

Introduction Liberia was heavily affected by the 2014–2016 Ebola virus disease (EVD) outbreak. With substantial investments in interventions to combat future outbreaks, it is hoped that Liberia is well prepared for a new incursion. We assessed the performance of the current EVD surveillance system in Liberia, focusing on its ability to promptly detect a new EVD outbreak.

Methods We integrated WHO and US Centers for Disease Control and Prevention guidelines for public health surveillance system evaluation and used standardised indicators to measure system performance. We conducted 23 key informant interviews, 150 health facility assessment surveys and a standardised patient (SP) study (19 visits) from January 2020 to January 2021. Data were summarised and a gap analysis conducted.

Results We found that the basic competencies of case detection and reporting necessary for a functional surveillance system were in place. At the higher (national, county and district) levels, we found performance gaps in 2 of 6 indicators relating to surveillance system structure, 3 of 14 indicators related to core functions, 1 of 5 quality indicators and 2 of 8 indicators related to support functions. The health facility assessment found performance gaps in 9 of 10 indicators related to core functions, 5 of 6 indicators related to support functions and 3 of 7 indicators related to quality. The SP simulations revealed large gaps between expected and actual practice in managing a patient warranting investigation for EVD. Major challenges affecting the system’s operations across all levels included limited access to resources to support surveillance activities, persistent stockouts of sample collection materials and attrition of trained staff.

Conclusion The EVD surveillance system in Liberia may fail to promptly detect a new EVD outbreak. Specific improvements are required, and regular evaluations recommended. SP studies could be crucial in evaluating surveillance systems for rarely occurring diseases that are important to detect early.

  • Health systems evaluation
  • Public Health
  • Viral haemorrhagic fevers
  • Epidemiology
  • Descriptive study

Data availability statement

Data are available upon reasonable request. All data (delinked) and tools will be made available based on a reasonable request to the Centre for International Health, University of Otago.


This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.


WHAT IS ALREADY KNOWN ON THIS TOPIC

  • Ebola virus disease (EVD) surveillance systems in sub-Saharan Africa have been challenged with delays in detecting new outbreaks.

  • The most severe manifestation of delayed outbreak detection was the 2014–2016 EVD epidemic in West Africa.

  • Issues with EVD surveillance include limited laboratory capacity, inadequately trained staff, limited information technology resources and poor health-seeking behaviour.

  • Since the 2014–2016 EVD epidemic, multiple organisations have collaborated to improve EVD surveillance, but formal evaluations of their ability to promptly detect a new outbreak are limited.

WHAT THIS STUDY ADDS

  • This comprehensive evaluation of Liberia’s Ebola virus disease (EVD) surveillance system, integrating WHO and US Centers for Disease Control and Prevention frameworks with key informant interviews, a national health facility assessment and a standardised patient (SP) study, found that the basic competencies for case detection and reporting were in place.

  • Performance gaps were identified in surveillance structure, core functions, support functions and quality, particularly at the health facility level, and SP simulations revealed large gaps between expected and actual practice in managing a patient warranting investigation for EVD.

  • Major cross-cutting challenges included limited access to resources for surveillance activities, persistent stockouts of sample collection materials and attrition of trained staff.

HOW THIS STUDY MIGHT AFFECT RESEARCH, PRACTICE OR POLICY

  • The study presents detailed surveillance system-wide performance measures for policy-makers and those running health facilities, to inform the development of specific interventions to close performance gaps.

  • Future evaluations of surveillance for rarely occurring but severe diseases, which need to be detected early, could incorporate standardised patient (SP) studies as part of the assessment process.

Introduction

The 2014–2016 West Africa Ebola virus disease (EVD) epidemic, caused by Ebolavirus Zaïre, was the largest ever recorded.1 Liberia was one of the three most affected countries,2 with 10 675 suspected, probable and confirmed cases and 4810 deaths.3 According to the United Nations Office for the Coordination of Humanitarian Affairs, US$1·07 billion was mobilised in 2014 to support the Liberian response and preparedness interventions through 47 institutions. In addition, US$3·9 million was received towards surveillance, preparedness and response interventions by 2016.4

Liberia’s Integrated Disease Surveillance and Response (IDSR) strategy was revised in 2016 to include specific provisions for EVD, reporting tools and standard operating protocols (SOPs). Healthcare workers at all levels were trained in surveillance and response competencies.5 At health facilities, healthcare workers received training on providing safe and quality services, with specific provisions for EVD surveillance, clinical emergency management, and infection prevention and control (IPC) tools. Furthermore, laboratory diagnostic capacity for priority diseases, including EVD, was improved.5

As a result of these initiatives, it is hoped that Liberia is well placed to detect and respond promptly to a new EVD outbreak,6 fulfilling its obligations under the 2005 International Health Regulations to prevent local and international spread of the disease.7 However, given the number of activities and external agencies involved,8 optimal alignment of all resources and parties leading to achievement of this goal should not be assumed. Therefore, we aimed to assess the performance of Liberia’s EVD surveillance system 5 years after the epidemic, focusing on its ability to detect and respond effectively to a potential new outbreak.

Methods

Setting

This project was linked to a World Bank funded One Health project (EERP: 02/2016 TA; Crossover Diseases: Animal to Human Surveillance) in Liberia. It was a collaboration between researchers from the Ministry of Health, Liberia and the University of Otago, New Zealand (online supplemental file 1).


Liberia is a West African country with a population of approximately 5 million people. It is divided into 15 counties, subdivided into 93 health districts. The Ministry of Health and the National Public Health Institute oversee operations nationwide. Each county has a ‘county health team’, while districts have district health teams. Liberia implements a three-tier health system: the primary healthcare level includes clinics and community health programmes; the secondary level consists of health centres and county hospitals; and the tertiary level consists of two health referral hospitals.9 In Liberia, EVD surveillance is implemented under the IDSR strategy, which categorises EVD as an ‘immediately reportable epidemic-prone disease’. An alert threshold is triggered if one case is suspected. An action or epidemic threshold is activated if the case is confirmed by laboratory testing.5
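
To make the alert and action thresholds described above concrete, the Python sketch below encodes this two-stage IDSR logic. It is an illustrative simplification, not the Ministry of Health’s operational algorithm; the function name and messages are our own.

```python
def evd_threshold_status(suspected_cases: int, lab_confirmed: bool) -> str:
    """Simplified sketch of the IDSR thresholds for EVD (illustrative only).

    Alert threshold: a single suspected case triggers immediate reporting
    and investigation. Action (epidemic) threshold: laboratory confirmation
    of a case triggers outbreak response.
    """
    if lab_confirmed:
        return "action/epidemic threshold reached: activate outbreak response"
    if suspected_cases >= 1:
        return "alert threshold reached: report immediately and investigate"
    return "no threshold reached: continue routine surveillance"


# Example: one suspected case, not yet laboratory confirmed.
print(evd_threshold_status(suspected_cases=1, lab_confirmed=False))
```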

Conceptual framework

We primarily based this evaluation on the WHO framework for evaluating surveillance systems because of its applicability to low-resource countries,10 which implement EVD surveillance through the IDSR strategy.5 WHO developed a framework with a list of indicators in 200411 and an accompanying manual in 2006.12 It includes four components: structure, core functions, support functions and surveillance quality, and has been used to assess specific components of surveillance systems13 and the usefulness of the IDSR strategy.14 The surveillance system structure includes surveillance legislation, surveillance strategies, coordination, networking and partnership. The core functions include case detection and reporting, registration and confirmation, and routine analysis and interpretation of the data captured.13 Other core functions are epidemic preparedness and response, and feedback. The support functions include standards and guidelines, training and supervision, and resources. Surveillance system quality involves the system’s usefulness and attributes (stability, flexibility, simplicity, acceptability, representativeness and completeness).13

In addition, the framework provides a basis for identifying challenges in implementing IDSR.10 15 We used the 2001 update of the US Centers for Disease Control and Prevention (US CDC) guidelines for the evaluation of surveillance systems to assess the surveillance system’s quality component.16 This framework was designed to evaluate surveillance systems that monitor emerging threats from bioterrorism and imminent disease outbreaks.17

Study design and data collection

Using a mixed methods approach, we assessed system performance across the national, county, district and health facility levels (figure 1). Following a desk review to characterise Liberia’s surveillance system and finalise the evaluation design, we employed three main data collection methods: key informant interviews, a health facility survey and standardised patient (SP) visits to health facilities. Data collection was implemented between January 2020 and January 2021.

Figure 1

Flow diagram of Integrated Disease Surveillance and Response (IDSR) implementation at each level, Liberia, 2020–2021. IHR, International Health Regulations.

Key informant interviews

We conducted 23 semistructured interviews with 22 key informants (online supplemental table 1). We purposively selected seven key informants at the national level. Five counties were selected based on the case count of EVD during the 2014–2016 EVD outbreak—classified as either high, medium or low. Surveillance activities during the 2014–2016 epidemic, including data collection and EVD confirmation, were standardised across the country. The top two counties for total case count were selected from the high and medium categories, while the county with the lowest case count was selected from the list of ‘low’ burden counties. We randomly selected two health districts from each of the five counties to interview district surveillance officers. We adapted generic questionnaires from the WHO and US CDC surveillance evaluation frameworks16 18 to develop the study interview guide. Questions were aligned with performance indicators related to each component. We asked key informants questions related to indicators aligned with their specific level of operations (online supplemental table 2). We employed both face-to-face (n=13) and telephone (n=10; due to COVID-19 restrictions) interviews, which were digitally recorded and transcribed. The key informant interviews were conducted between 25 January and 30 April 2020.


Health facility assessment

Using trained data collectors, we administered a survey to surveillance focal persons (SFPs) at 150 health facilities across all 15 counties to provide a nationally representative evaluation19 of the core and support functions and the quality of the surveillance system. We developed the assessment tool using the WHO Service Availability and Readiness Assessment,19 the WHO data quality review toolkit20 and the US CDC tools.16 We excluded specialised health facilities, such as those for tuberculosis and mental health, and randomly selected health facilities from each of three strata: clinics, health centres and hospitals. Additionally, the interviewer conducted a physical walk-through of each health facility to observe key items directly. We implemented the questionnaire in the Census and Survey Processing System (CSPro) software application (US Census Bureau, USA),21 22 and interviewers collected data on Android mobile devices. Forms were checked for completeness and transmitted to the online CSPro database at the end of each day. The health facility assessment was conducted between 1 June and 31 August 2020.
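
As a rough sketch of the sampling and weighting logic described above (stratified random selection of clinics, health centres and hospitals, with stratum weights used later so that estimates reflect the national facility mix), the Python below illustrates one way this could be done. The facility counts, sample allocation and names are hypothetical placeholders, not the study’s actual sampling frame.

```python
import random

# Hypothetical sampling frame: the facility counts per stratum are
# placeholders, not Liberia's actual master facility list.
frame = {
    "clinic":        [f"clinic_{i:03d}" for i in range(700)],
    "health_centre": [f"health_centre_{i:02d}" for i in range(90)],
    "hospital":      [f"hospital_{i:02d}" for i in range(38)],
}

# Illustrative allocation of the sample across strata.
allocation = {"clinic": 119, "health_centre": 19, "hospital": 12}

random.seed(2020)  # fixed seed so the draw is reproducible
sample = {
    stratum: random.sample(facilities, allocation[stratum])
    for stratum, facilities in frame.items()
}

# Stratum weight = facilities in the country / facilities sampled.
weights = {s: len(frame[s]) / allocation[s] for s in frame}

for stratum in frame:
    print(f"{stratum}: sampled {len(sample[stratum])}, weight {weights[stratum]:.2f}")
```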

SP study

We conducted an unannounced SP study during a period of heightened EVD risk (1 December 2020–31 January 2021), around the time of EVD outbreaks in the Democratic Republic of the Congo (1 August 2018–25 June 2020) and Guinea (declared 14 February 2021). Using stratified random sampling, we selected 20 health facilities in Bong and Montserrado counties: 5 hospitals, 5 health centres and 10 clinics. SPs were purposively recruited after engaging communities around the health facilities selected for the study, taking into account their age, educational level, occupation and gender. We developed three clinical scenarios (online supplemental table 3) portraying early symptoms of EVD, consistent with the national criteria for a suspected EVD case.23 We trained SPs by educating them about EVD and using role plays to depict ‘real-life’ manifestations of the clinical symptoms and presentation. We coached them on potential biohazards in health facilities, strategies to avoid infection and approaches to decline invasive medical procedures.24 We evaluated SPs clinically25 as part of the selection process to minimise alternative diagnoses. We conducted postvisit interviews with the clinical staff who screened the SPs and with the SPs themselves; these interviews evaluated healthcare workers’ EVD screening practices during the visits and documented those practices from the SPs’ perspective. Due to the COVID-19 pandemic, we enrolled fewer SPs than planned and assessed half of the sample of health facilities.

Performance measurements

We adapted indicators informed by the WHO and US CDC guidelines. We set indicator targets to represent reasonable performance expectations consistent with Liberia’s IDSR strategy. Regarding the surveillance system’s structure, we assessed indicators linked to surveillance legislation, strategies, coordination, networking and partnership. Regarding core functions, we assessed indicators related to case detection and reporting, registration, confirmation, and analysis and interpretation of data routinely captured by the system. We also measured the level of epidemic preparedness and response at each level of the system. Furthermore, we assessed the provision of, and described mechanisms for, feedback. Regarding support functions, the indicators evaluated covered the existence of standards and procedures, the proportion of trained staff with core competencies and supervision. Additionally, we determined the proportion of surveillance units with evidence of appropriate budgetary allocation. For surveillance system quality, we assessed indicators related to simplicity, acceptability, representativeness, stability, flexibility, data completeness and usefulness for evidence-based decision-making.

Data management and analysis

All quantitative data were imported into Microsoft Excel (Microsoft Corporation, Redmond, Washington, USA) and cleaned to eliminate inconsistencies and correct typographical errors where necessary. Each dataset was stored as a comma-separated values (CSV) file. The transcripts from digital recordings of the key informant interviews were cleaned and stored as separate Microsoft Word (Microsoft Corporation) documents. For quantitative data, we conducted descriptive analyses and summarised the data as frequencies and proportions. All statistical analyses were done using STATA (V.16.1) (StataCorp LLC, Texas, USA). We used weighted percentage scores, which took into account the relative numbers of the different types of facility across Liberia, to measure each indicator. For the SP study, we directly compared health facility assessment findings with their associated results from the field evaluation and described the associations between the system’s expected performance and its actual performance in the field. For the qualitative analysis, we analysed transcripts and interview notes using NVivo V.1.5 (QSR International, Melbourne, Australia) according to the performance indicators. We conducted a gap analysis comparing the observed results to predetermined targets. These targets were based on WHO’s standard benchmarks for surveillance and response indicators, focusing on an overall goal of a high-performing, but not perfect, system and taking the single-disease focus into account (online supplemental table 2).
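
A minimal sketch of how a weighted indicator score and the gap analysis could be computed is shown below. The facility records, stratum weights and the 80% target are hypothetical examples chosen for illustration; they are not values from the study.

```python
# One record per assessed facility: (stratum, indicator_met)
records = [
    ("clinic", True), ("clinic", False), ("clinic", True), ("clinic", True),
    ("health_centre", True), ("health_centre", False),
    ("hospital", False),
]

# Illustrative stratum weights reflecting the national facility mix.
weights = {"clinic": 5.9, "health_centre": 4.7, "hospital": 3.2}

# Weighted percentage of facilities meeting the indicator.
numerator = sum(weights[stratum] for stratum, met in records if met)
denominator = sum(weights[stratum] for stratum, _ in records)
weighted_score = 100 * numerator / denominator

# Gap analysis: shortfall against a predetermined benchmark (illustrative).
target = 80.0
gap = max(0.0, target - weighted_score)

print(f"Weighted score: {weighted_score:.1f}%  target: {target:.0f}%  gap: {gap:.1f} percentage points")
```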

Results

Key informant interviews

With respect to the key informants, 90% (n=20) were male. Their median (range) age was 41·5 (32–60) years. Eleven had completed a Master’s degree in public health or a related field, while five had completed a Bachelor’s degree (one in Public Health). The interviews lasted a median (range) of 24·8 (15–50) min (online supplemental table 1).

Surveillance system structure

At the national, county and district levels, all (100%) of the surveillance units had a copy of the roles and responsibilities of stakeholders, while 89% (n=17) reported having a strategic plan of action. Four intersectoral meetings (one per quarter) and one cross-border collaborative initiative were held at the national level in the year before the assessment (table 1).

Table 1

Performance of surveillance system structure at national, county and district levels, 2020–2021

Core functions

At the national, county and district levels, all (100%) of the surveillance units had the standard case definitions for EVD, and all (100%) of the surveillance managers and officers displayed correct knowledge of them. All of the surveillance officers used the required case-based form, and the one recent suspected EVD case was referred for confirmation within 24 hours. Of the 40 samples from suspected EVD patients at the National Reference Laboratory, 98% (n=39) were tested, and all results were disseminated within 72 hours. Meanwhile, 73% (n=11) of 15 surveillance units reported no stockout of EVD sample collection supplies in the previous 3 months. However, only one unit reported that they had adequate funds (table 2).

Table 2

Performance of core and support functions at national, county and district levels, Liberia, 2020–2021

Support functions

All of the surveillance managers and officers at the national, county and district levels reported having training in IDSR and field epidemiology. Similarly, there were technical guidelines and SOPs relating to EVD surveillance and response, including sample collection, across all surveillance units. However, only 1 of the 12 surveillance units with budgetary functions showed evidence of a budget for implementing surveillance activities related to EVD (table 2).

Surveillance system quality

All of the surveillance units at the national, county and district levels produced weekly and quarterly reports and bulletins (table 2; usefulness). There was one standardised form for reporting suspected EVD cases and two channels for reporting (simplicity). The EVD surveillance system is interoperable with other subsystems (eg, Lassa fever, Marburg, yellow fever) (flexibility). Approximately 85% of stakeholders actively participated in EVD surveillance activities at least 3 months before the assessment (table 3).

Table 3

Performance of surveillance system quality, Liberia, 2020–2021

Health facility assessment

Of 150 assessments, data were adequate for analysis from 149 facilities, representing 18% (149) of the country’s 828 facilities. Of these, 88% (n=119) were clinics, 4% (n=11) were hospitals and 7% (n=19) were health centres. Additionally, 80% (n=117) were government-managed public facilities. More than 60% (n=101) of the 149 SFPs interviewed were nurses, while about 20% (n=27) were midwives, 4 were nurse aides and 7 were trained in other professions.

Core functions

At the health facility level, 84% (n=126) of SFPs displayed correct knowledge of the standard EVD case definitions. Furthermore, 70% (n=104) of 149 health facilities had a stock of case-based EVD reporting forms. In addition, 36% (n=53) of 149 SFPs portrayed correct knowledge of collecting and packaging EVD samples. Approximately 56% (n=85) of 149 health facilities reported no stockout of EVD sample collection kits at least 3 months before the assessment (figure 2A).

Figure 2

Performance of core and support functions and surveillance system quality of Ebola virus disease (EVD) surveillance and response system at health facilities, Liberia, 2020–2021. (A) Core functions. (B) Support functions. (C) Surveillance system quality.

Support functions

Approximately 89% (n=133) of 149 health facilities reported no stockout of case-based reporting forms at least 3 months before the assessment; 71% (n=106) reported having necessary technical guidelines; and 69% (n=103) reported having training in surveillance competencies. In addition, 24% (n=36) of 149 health facilities reported having a functional designated mobile phone for reporting suspected EVD cases (figure 2B).

Surveillance system quality

All SFPs identified one standardised suspected EVD case reporting form and three unambiguous channels for reporting to the higher levels (simplicity) (table 3). All (100%) of the health facilities sent regular reports, including ‘zero reporting’, to the higher levels (representativeness). About 87% (n=129) of SFPs accepted EVD surveillance as their responsibility (acceptability). Only 40% (n=60) of 149 health facilities had a designated phone, access to a mobile cellular network, daily internet service and uninterrupted electricity during operational hours (stability) and only 13% (n=20) showed evidence of an EVD line list meeting expected standards (completeness) (figure 2C).

SP study

In total, 9 SPs made 19 visits to 10 health facilities (hospitals, health centres and clinics). Overall, 60% (n=6) of health facilities were in urban areas and 90% (n=9) were publicly owned or managed. Overall, 56% (n=5) of the 9 SPs were male, and the median (range) age was 20 (5–32) years.

Expected performance in the SP study versus actual performance

The health facility assessment showed that 92% (n=13) and 80% (n=29) of SFPs in Bong and Montserrado counties, respectively, were recently trained in disease surveillance competencies (table 4). All displayed correct knowledge of the standard clinical case definition for a suspected EVD case. In addition, 85% (n=12) and 91% (n=33) had responsive supervision. However, no SP was suspected of being an EVD case. Therefore, the sensitivity was 0%, and the timeliness of the EVD surveillance system could not be estimated. Furthermore, healthcare workers infrequently probed SPs for contact with animals (15%; n=3), similar symptoms in close contacts (15%; n=3) or travel history (10%; n=2) (table 4).
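
For clarity, the sensitivity quoted here is simply the proportion of SP presentations identified as suspected EVD cases; a minimal sketch of the calculation, using the counts reported above, is:

```python
# Sensitivity of EVD case detection in the SP study:
# proportion of SP presentations flagged as suspected EVD cases.
sp_visits = 19            # total SP presentations
flagged_as_suspected = 0  # presentations identified as suspected EVD cases

sensitivity = flagged_as_suspected / sp_visits
print(f"Sensitivity: {sensitivity:.0%}")  # 0%
```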

Table 4

EVD surveillance system performance versus outcomes in practice, SP study, Bong and Montserrado counties, Liberia, 2020–2021

Performance gap analysis

At the national, county and district levels, we recorded performance gaps in 22·2% (n=8) of 36 indicators related to the surveillance system structure, core and support functions and surveillance system quality. The largest performance gaps for surveillance units related to the presence of an existing and adequate budget for emergency response (74·1%), evidence of a budget for surveillance activities (71·7%) and having a copy of the National Public Health Law (69·5%) (figure 3A).

Figure 3

Performance gaps in EVD surveillance and response implementation at national, county, district and health facility levels, Liberia, 2020–2021. (A) Gaps at national, county and district levels. (B) Gaps at health facility level.

We recorded performance gaps in 68·2% (n=15) of 22 indicators assessed at the health facility level in core and support functions and surveillance system quality. The highest performance gaps were documented for data completeness (66·6%), SFP knowledge on the collection and packaging of EVD samples (64·1%), health facilities having a designated functional mobile phone for reporting (55·8%) and evidence of data analysis (45·3%) (figure 3B). The SP study identified a 100% gap in performance for the system’s sensitivity.

Discussion

In this study, we found that the basic competencies for case detection and reporting necessary for a functional EVD surveillance system in Liberia were in place. However, at the higher (national, county and district) levels, we found performance gaps in 2 of 6 indicators relating to surveillance system structure, 3 of 14 indicators related to core functions, 1 of 5 quality indicators and 2 of 8 indicators related to support functions. Similarly, at the health facility level, we found performance gaps in 9 of 10 indicators related to core functions, 5 of 6 indicators related to support functions and 3 of 7 quality indicators. In the field evaluation, there was a large gap between expected and actual practice in managing a patient warranting investigation for EVD. These findings suggest that while Liberia has made substantial progress in EVD surveillance, several areas require improvement if a future EVD outbreak is to be detected promptly.

Concerning specific performance indicator findings, while no other studies have comprehensively evaluated EVD surveillance system performance, several studies have evaluated IDSR performance, within which EVD surveillance resides. Similar to our study, Nagbe et al 5 recorded inconsistencies when assessing the implementation of IDSR in Liberia. They found persistent stockouts of sample collection kits at health facilities and gaps in knowledge of the correct packaging of EVD samples. Resources for surveillance were available at the national, county and district levels but not at the health facilities. Nagbe et al also found that the use of data for decision-making at the subnational levels was lacking.5 26

Similarly, Saleh et al 27 found limited capacity for sample collection due to regular stockouts at health facilities when they assessed the core and support functions of IDSR implementation in Zanzibar, Tanzania.27 They also documented performance inconsistencies between the higher and lower levels of the surveillance system; case detection and reporting were poorest at the health facility level compared with the district and national levels. Support functions such as training opportunities and trained staff, regular supervision, coordination and communication, and logistic support were frequently inadequate at health facilities.27

In contrast to our study, Ilesanmi et al, assessing surveillance system attributes in Tonkolili District, Sierra Leone, found the acceptability of the EVD surveillance system to be poor, although its usefulness was good.28 An evaluation of cholera surveillance in Cameroon found limited or no supervision of health facilities.29 Separate studies assessing IDSR implementation in Ethiopia and Ghana showed fewer health facilities with the standard case definitions of priority diseases30; healthcare workers at health facilities displayed poor knowledge of the case definition of a suspected EVD case, focusing only on bleeding manifestations.14

In the health facility survey, we found that a high proportion of healthcare workers had correct knowledge of the standard case definition of suspected EVD and had been trained in IDSR. Health facilities had the required screening aids for EVD available to healthcare workers. However, these did not translate into practices or behaviours consistent with the competencies present in the health facilities. Our SP detection sensitivity was zero, which implies poor application of the standard case definition to identify the early symptoms of EVD. Healthcare workers rarely probed for EVD risk factors, including contact with wild animals, travel history or close relatives with similar symptoms. In addition, their compliance with IPC precautions was generally poor. Similar to our study, Daniels et al in 2017 found that no possible cases were detected when assessing the quality of care relating to asthma, tuberculosis, childhood diarrhoea and unstable angina in Nairobi, Kenya.24

In contrast, healthcare workers in the simulation exercises related to EVD case detection and response in Liberia31 32 and South Sudan33 showed a good understanding of the spectrum of EVD symptoms, with detection at 100%. However, the simulation exercises involved informing the healthcare workers of the scenarios before implementation, limiting the ability to assess real-life practice. Another study that evaluated hospital interventions in China showed good adherence to IPC standards overall when screening SPs presenting as people living with HIV,34 contrary to our finding. This could possibly be explained by the perceived risks of HIV infection while working in an HIV clinic.

Our study has several strengths. First, integrating the WHO and US CDC frameworks and an indicator-based approach allowed us to measure the whole system’s performance based on expected and achieved outcomes. Second, this evaluation was conducted in the context of IDSR, with EVD surveillance being a part of the integrated surveillance strategy. Hence, our results could be relevant to diseases similar to EVD and settings similar to Liberia. Third, the SP study assessed gaps between what healthcare workers were expected to do from the indicator measurements and what they did in practice.

Our study has some limitations. Purposive sampling may introduce sampling bias, although it is best suited to identify the most knowledgeable individuals. There was a male predominance of the respondents, although this did reflect the gender balance of the employees under study. We did not investigate the costs of operating the surveillance system, but we explored access to financial resources. We did not assess the Community Event-Based Surveillance (CEBS) system, but we confirmed the existence of CEBS in catchment communities of health facilities we assessed. We did not complete the SP study as planned due to circumstances related to the COVID-19 pandemic. Its relatively small sample size may have limited our ability to make precise estimates of system performance against indicators. Finally, being a cross-sectional study, we did not assess changes in system performance over time.

Liberia’s EVD surveillance and response system may not be able to detect and respond to a new EVD outbreak as effectively or promptly as desired. Surveillance systems may especially fail to meet their objectives when one or more system components at each level perform poorly, as we found in this study. In addition to the Liberian Ministry of Health’s implementation of a dissemination plan related to this study’s findings, opportunities for changes to improve the system include enhancing capacity for timely reporting at health facilities, training and retaining healthcare workers at all levels, optimising surveillance competencies, preventing stockout of key sample collection kits and regularising supervision and mentorship at the subnational levels. Further studies could focus on possible variations in level of preparedness per region, system challenges and potential reasons for performance gaps, along with options for filling them. It is important to assure the government and the people of Liberia that a widespread outbreak of this devastating disease will not happen again. Therefore, regular evaluations are advised. These should include using SP studies, which may be important to incorporate in surveillance system evaluations for all infectious diseases that rarely occur but are crucial to detect early to save thousands of lives.

Data availability statement

Data are available upon reasonable request. All data (delinked) and tools will be made available based on a reasonable request to the Centre for International Health, University of Otago.

Ethics statements

Patient consent for publication

Ethics approval

This study involves human participants and was approved by the University of Liberia Institutional Review Board (UL-IRB) (Ref. #: 20-01-196) and by the Human Ethics Committee (Health), University of Otago (Ref. #s: H19/155, health facility assessment; H19/145, assessment at national, county and district levels; H20/001, standardised patient study). Participants gave informed consent to participate in the study before taking part.

Acknowledgments

We acknowledge the data collectors' and study participants' contributions to this study.

References


Footnotes

  • Handling editor Seye Abimbola

  • Contributors The success of this research project is attributed to the hard work and effective coordination of expertise from the research team. FQSII, the corresponding author, led the design and field investigations for the study. He was solely responsible for conducting the key informant interviews. He led the training exercises for hired data collectors involved with the health facility assessments and the standardised patients (SPs) study. He supervised (directly and remotely) the field activities for the health facility assessments and the SP study. In addition, he was responsible for spearheading the data analysis (with support from KS) and report writing and synthesising the reports into this research article. PCH served as a coauthor. He supported the study design, implementation and report writing. JAC served as coauthor. He also assisted in the study design and the writing of reports. KS assisted in the study design, data analysis and report writing. RE specifically supported the qualitative aspect of this paper, including the data analysis and report writing. LLB supervised field implementation and contributed to the report writing. FQSII is the guarantor.

  • Funding This study arose from academic (Doctor of Philosophy) research funded through a University of Otago Scholarship and Foundation Trust.

  • Competing interests None declared.

  • Patient and public involvement Patients and/or the public were involved in the design, or conduct, or reporting, or dissemination plans of this research. Refer to the Methods section for further details.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.