
Debunking highly prevalent health misinformation using audio dramas delivered by WhatsApp: evidence from a randomised controlled trial in Sierra Leone
1. Maike Winters1,
2. Ben Oppenheim2,3,
3. Paul Sengeh4,
4. Nance Webber4,
5. Samuel Abu Pratt4,
6. Bailah Leigh5,
7. Helle Molsted-Alvesson1,
8. Zangin Zeebari6,
9. Carl Johan Sundberg7,
10. Mohamed F Jalloh1,
11. Helena Nordenstedt1
1. 1Department of Global Public Health, Karolinska Institutet, Stockholm, Sweden
2. 2Center on International Cooperation, New York University, New York, New York, USA
3. 3Metabiota, San Francisco, California, USA
4. 4FOCUS1000, Freetown, Sierra Leone
5. 5College of Medicine and Allied Health Sciences, Freetown, Sierra Leone
6. 6Department of Economics, Finance and Statistics, Jönköping International Business School, Jönköping, Sweden
7. 7Department of Physiology and Pharmacology, Karolinska Institutet, Stockholm, Sweden
1. Correspondence to Dr Maike Winters; maike.winters@ki.se

## Abstract

Introduction Infectious disease misinformation is widespread and poses challenges to disease control. There is limited evidence on how to effectively counter health misinformation in a community setting, particularly in low-income regions, and unsettled scientific debate about whether misinformation should be directly discussed and debunked, or implicitly countered by providing scientifically correct information.

Methods The Contagious Misinformation Trial developed and tested interventions designed to counter highly prevalent infectious disease misinformation in Sierra Leone, namely the beliefs that (1) mosquitoes cause typhoid and (2) typhoid co-occurs with malaria. The information intervention for group A (n=246) explicitly discussed misinformation and explained why it was incorrect and then provided the scientifically correct information. The intervention for group B (n=245) only focused on providing correct information, without directly discussing related misinformation. Both interventions were delivered via audio dramas on WhatsApp that incorporated local cultural understandings of typhoid. Participants were randomised 1:1:1 to the intervention groups or the control group (n=245), who received two episodes about breast feeding.

Results At baseline 51% believed that typhoid is caused by mosquitoes and 59% believed that typhoid and malaria always co-occur. The endline survey was completed by 91% of participants. Results from the intention-to-treat, per-protocol and as-treated analyses show that both interventions substantially reduced belief in misinformation compared with the control group. Estimates from these analyses, as well as an exploratory dose–response analysis, suggest that direct debunking may be more effective at countering misinformation. Both interventions improved people’s knowledge and self-reported behaviour around typhoid risk reduction, and yielded self-reported increases in an important preventive method, drinking treated water.

Conclusion These results from a field experiment in a community setting show that highly prevalent health misinformation can be countered, and that direct, detailed debunking may be most effective.

Trial registration number NCT04112680.

• typhoid and paratyphoid fevers
• malaria
• public health
• epidemiology
• randomised controlled trial

## Data availability statement

Data are available in a public, open access repository.

This is an open access article distributed in accordance with the Creative Commons Attribution 4.0 International (CC BY 4.0) license, which permits others to copy, redistribute, remix, transform and build upon this work for any purpose, provided the original work is properly cited, a link to the licence is given, and indication of whether changes were made. See: https://creativecommons.org/licenses/by/4.0/.


### Key questions

#### What is already known?

• Health-related misinformation is highly prevalent and highly damaging.

• Randomised trials to counter real-world misinformation remain rare, with most evidence to date being limited to high-income settings.

#### What are the new findings?

• Two narrative audio dramas were tested via WhatsApp in Freetown, Sierra Leone; the first explicitly mentioned and debunked typhoid-related misinformation, while the second focused only on providing scientifically correct information.

• Both interventions effectively reduced belief in misinformation and improved knowledge and self-reported protective behaviours, but stronger effects were achieved by explicitly citing and debunking misinformation.

#### What do the new findings imply?

• Explicitly addressing why misinformation is wrong via narrative public health messaging may prove effective in countering infodemics.

## Introduction

Misinformation can be as contagious as a virus—sometimes more. And like a virus, misinformation can be fatal. There is strong evidence that misinformation can reduce protective actions, encourage risky behaviours and promote the spread of infectious disease.1 2 The WHO has described the current COVID-19 pandemic as an ‘infodemic’, pointing to the overabundance of (mis)information.3 4 Widespread misinformation has posed significant challenges to the control of the pandemic, introducing (and amplifying) uncertainty about the importance and efficacy of non-pharmaceutical interventions such as masking and social distancing, as well as safety and efficacy of vaccines for SARS-CoV-2.2 3 5 The public health challenges posed by misinformation go far beyond COVID-19. Vaccine hesitancy, driven by online misinformation, has played a role in the recurrence of preventable diseases, notably measles.6–8

The rapid rise in the use of social media has increased the volume and velocity of misinformation, giving the especially virulent narratives a wider reach.9 10 Despite the urgent need for tools to counter health-related misinformation, there is limited evidence on which strategies are efficacious. Meta-analyses studying different strategies for countering misinformation found that detailed counterarguments could be effective, especially when they are delivered by a trusted source and in line with recipients’ worldviews and social norms.11 12 However, this approach does not always yield reductions in belief in misinformation.11 12 This might be explained by the continued influence effect, whereby despite credible alternatives, people still rely on the initial misinformation,13–17 or via a number of cognitive biases through which repeated exposure to information can strengthen its cognitive availability or appeal, raising the risk that corrective messaging inadvertently strengthens belief in misinformation.18–20 Fortunately, evidence thus far shows that these types of unwanted side effects of debunking do not always occur.21 22 However, many studies have methodological limitations and few use a pre-post randomised controlled design.21 An alternative approach to debunking misinformation emphasises providing correct information rather than directly countering misinformation, to avoid spreading the narrative further to people who would otherwise not have come in contact with it and thus increasing their familiarity with the misinformation.23–27

To date, there have been very few experimental studies of interventions to reduce misinformation in non-laboratory settings.13 15 28 Most studies aiming to test debunking strategies against health and non-health-related misinformation have been carried out using survey experiments, or in laboratory experiments on university campuses, with relatively small sample sizes and subjects including young, mostly female college students.11 29 Furthermore, many studies have not been anchored in a real-world context, as the effectiveness of debunking strategies was evaluated by experimentally introducing a piece of misinformation and subsequently countering its content.23 30 In summary, there is limited evidence to date on countering already existing misinformation that is prevalent among the public. In addition, as most studies have been carried out in high-income settings, little is known about debunking strategies in low-income settings that are especially vulnerable to infectious disease outbreaks. Studies that have been performed in low-income settings have mainly looked at various forms of health education to increase knowledge and uptake of protective behaviours, as opposed to specifically testing debunking strategies to target health misinformation.31–34

In Sierra Leone, there is widespread misinformation regarding typhoid, and in particular, widespread belief that typhoid and malaria are closely related.35 Interestingly, people commonly conceptualise typhoid and malaria as a single disease, ‘typhoid-malaria’. The belief structure linking these diseases is complex and varied. Some narratives indicate that malaria weakens the immune system, which in turn leads to typhoid infection; another narrative suggests that ‘typhoid and malaria walk on the same road’ or ‘are friends’, which implies that the diseases have some causal relationship. Finally, some conceptualise typhoid-malaria as a more severe case of malaria, requiring distinct treatment approaches. The notion that typhoid and malaria occur in conjunction is the common denominator across all explanations. The perceived similarity of the two diseases also makes many people believe that typhoid is caused by mosquitoes.

Although typhoid and malaria share symptoms (eg, fever), they are very different diseases: typhoid is caused by bacterial infection, usually transmitted through contaminated food, water and the faecal-oral route. The incidence of typhoid in Sierra Leone is estimated to be low (around 15 000 cases in 2019).36 Malaria is a disease spread by parasite-infected mosquitoes and is much more common than typhoid in Sierra Leone, with more than 3.7 million cases estimated in 2019.36

Typhoid can be diagnosed through blood culture. However, in Sierra Leone only one hospital currently has the necessary equipment, and resource constraints limit the availability of blood culture for clinical diagnosis.37 Instead, the Widal test is commonly used to diagnose typhoid. The Widal test reportedly has low sensitivity, specificity and positive predictive value for typhoid diagnosis,38 39 and may cross-react with malaria antigens, raising the risk of a false-positive result for patients with malaria infections.40 Confirmed coinfection of malaria and typhoid is rarely observed.41–43 However, in Sierra Leone patients are frequently diagnosed in health centres with ‘typhoid-malaria’, often without using a diagnostic test,44 and are then often treated with antibiotics in addition to antimalarials.45 While there are limited data on typhoid diagnosis and related antibiotic usage in Sierra Leone,44 the overdiagnosis of typhoid has likely contributed to the unnecessary use of antibiotics, as well as ensuing antibiotic resistance.46 47 Countering typhoid misinformation could therefore inform and empower citizens to question a typhoid-malaria diagnosis and potentially avoid unnecessary usage of antibiotics.

## Methods

The Contagious Misinformation Trial (CMT) was a prospective, three-arm, superiority randomised controlled trial that took place within the community in Freetown, the capital of Sierra Leone, in 2019. The CMT investigated the efficacy of two debunking strategies to counter misinformation about typhoid by incorporating scientific and risk communication information into four-episode audio dramas (see table 1) delivered via WhatsApp, a widely used instant messaging platform.

Table 1

Core messages of audio dramas by intervention group

The audio dramas targeting intervention group A (the Plausible Alternative group) explicitly mentioned the misinformation and provided a detailed counterargument. The audio dramas applied to intervention group B (the Avoiding Misinformation group) did not directly discuss the misinformation, and instead only focused on providing scientifically correct information. The control group received audio messages on breast feeding, unrelated to typhoid-malaria. We tested the efficacy of the two interventions using a randomised controlled trial of 736 participants that took place in the community. Comparing the two interventions allows us to examine whether explicitly invoking and discussing misinformation yields superior results in terms of reducing belief in misinformation. We tested two main outcomes:

• The belief that typhoid is caused by mosquitoes.

• The belief that typhoid can only co-occur with malaria.

The study was designed to detect a relative reduction of 15% in belief in misinformation between one of the intervention groups and the control group. Based on pilot testing, we assumed a 50% prevalence of belief in misinformation. A sample size of 170 per group was required to provide power of 0.80 for a one-sided Wald test. Because of the clustered sampling strategy, the intracluster correlation (ICC) can potentially reduce the effective sample size compared with the calculated sample size. Based on a previous study, we assumed an ICC of 0.01 and a design effect of 1.2.48 49 The sample size was expanded to 250 per group in order to address ICC and potential attrition. The postattrition sample size of 668 gives a statistical power of approximately 0.97.50
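The power calculation above can be sketched with the standard normal approximation for comparing two proportions. The sketch below treats the 15% reduction as an absolute drop from 50% to 35% (the reading consistent with 170 per group at power 0.80) and applies the design effect by deflating the effective sample size; it is illustrative only, since the authors' exact calculation is not reproduced in the text.

```python
# Illustrative power calculation for a one-sided two-sample test of
# proportions under clustered sampling. Parameters mirror the trial's stated
# assumptions; this is a sketch, not a reproduction of the authors' method.
from math import sqrt, erf

def norm_cdf(z: float) -> float:
    """Standard normal CDF via the error function (stdlib only)."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def power_two_proportions(p0: float, p1: float, n_per_group: int,
                          design_effect: float = 1.0) -> float:
    """Approximate power of a one-sided (alpha = 0.05) test of p0 vs p1.

    The design effect inflates the variance from clustering, which is
    equivalent to shrinking the effective sample size: n_eff = n / DE.
    """
    n_eff = n_per_group / design_effect
    z_alpha = 1.6449  # one-sided 5% critical value
    pbar = (p0 + p1) / 2.0
    se0 = sqrt(2.0 * pbar * (1.0 - pbar) / n_eff)            # SE under H0
    se1 = sqrt(p0 * (1 - p0) / n_eff + p1 * (1 - p1) / n_eff)  # SE under H1
    z = (abs(p0 - p1) - z_alpha * se0) / se1
    return norm_cdf(z)

# 50% belief prevalence in control vs a hypothesised drop to 35%;
# power rises with the per-group sample size.
low = power_two_proportions(0.50, 0.35, 170, design_effect=1.2)
high = power_two_proportions(0.50, 0.35, 250, design_effect=1.2)
```

With 170 per group the sketch lands near the reported 0.80; expanding to 250 per group buys headroom against the assumed clustering and attrition.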

### Recruitment of participants

We selected 21 of the 64 administrative sections in Freetown as trial sites using weighted random sampling without replacement. As these sections vary widely in size (between roughly 600 and 6000 households), each section had a weighted probability of selection proportionate to its size. The weighted random selection was done by a Visual Basic for Applications macro in Microsoft Excel. During the recruitment phase, three teams consisting of four enumerators and one supervisor visited one section per day for 7 days (7–13 October 2019). Each enumerator recruited nine new participants in each section. Eligible participants were adults (18 years and older), living in Freetown, fluent in Krio, in possession of a phone with WhatsApp and with no hearing impairments (more details about the recruitment can be found in the online supplemental material).
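The section-selection step can be illustrated with a small probability-proportional-to-size (PPS) sampler. The trial used a Visual Basic macro in Excel; the Python sketch below is an analogue with invented section IDs and household counts, not the authors' code.

```python
# Sketch of weighted random sampling without replacement: at each draw,
# selection probability is proportional to size among units not yet chosen.
# Section names and household counts are hypothetical.
import random

def pps_sample_without_replacement(units, sizes, k, rng):
    """Draw k units sequentially with probability proportional to size."""
    units = list(units)
    sizes = list(sizes)
    chosen = []
    for _ in range(k):
        # random.choices samples WITH replacement, so we remove each winner
        # from the pool ourselves to get sampling without replacement.
        pick = rng.choices(range(len(units)), weights=sizes, k=1)[0]
        chosen.append(units.pop(pick))
        sizes.pop(pick)
    return chosen

rng = random.Random(2019)  # fixed seed for reproducibility
sections = [f"section_{i:02d}" for i in range(64)]        # hypothetical IDs
households = [rng.randint(600, 6000) for _ in sections]   # sizes in the stated range
selected = pps_sample_without_replacement(sections, households, k=21, rng=rng)
```

Larger sections are more likely to be drawn, so each household has roughly equal probability of its section being a trial site.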

### Intention-to-treat (ITT) analysis

The belief that typhoid is caused by mosquitoes was significantly reduced in intervention group A compared with the control group in the ITT analysis (group A: adjusted OR (AOR) 0.29, 95% CI 0.18 to 0.47, see table 3 and online supplemental figure S1). In intervention group B, the reduction was smaller but also significant (AOR 0.61, 95% CI 0.39 to 0.95, p=0.029).

Table 3

Primary outcomes for intervention group A and group B versus control group

The Plausible Alternative intervention (group A) yielded a larger reduction than the Avoiding Misinformation intervention (group B) (AOR 0.46, 95% CI 0.28 to 0.76), though this result does not reach significance in the crude model (table 4). The belief that typhoid co-occurs with malaria was significantly reduced in both intervention groups in the ITT analysis (group A: AOR 0.29, 95% CI 0.19 to 0.45; group B: AOR 0.55, 95% CI 0.36 to 0.83) (table 3 and online supplemental figure S2). Group A showed a greater reduction than group B in the adjusted model (AOR 0.51, 95% CI 0.33 to 0.81, see table 4), with a weaker effect in the crude model (AOR 0.65, 95% CI 0.43 to 0.98). As a robustness check, we ran the ITT analysis using OLS regression, which yielded similar results (see online supplemental table S3).

Table 4

Primary outcomes for intervention group A versus group B
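The crude odds ratios in tables 3 and 4 compare endline belief across groups. As a minimal illustration of the arithmetic only (the counts are invented, and the adjusted AORs additionally require a logistic regression with covariates), a crude OR and Wald 95% CI can be computed from a 2×2 table:

```python
# Crude odds ratio from a 2x2 table with a Wald CI on the log-odds scale.
# All counts below are hypothetical and do not correspond to trial data.
from math import log, exp, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a, b = believers/non-believers in an intervention group;
    c, d = believers/non-believers in the control group."""
    or_ = (a / b) / (c / d)
    se = sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = exp(log(or_) - z * se)
    hi = exp(log(or_) + z * se)
    return or_, lo, hi

# Hypothetical endline counts: 20/100 believers in an intervention group
# versus 40/100 in control; an OR below 1 indicates reduced belief.
or_, lo, hi = odds_ratio_ci(20, 80, 40, 60)
```

An adjusted OR conditions on baseline covariates via logistic regression, which a single 2×2 table cannot capture; the sketch covers only the crude estimates.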

### Per-protocol analysis

Similarly, both intervention groups had reduced levels of belief in misinformation under the per-protocol analyses (see tables 3 and 4, online supplemental figures S1 and S2 and online supplemental table S3 for OLS models). The belief that typhoid is caused by mosquitoes was lower in both intervention groups compared with the control group (group A: AOR 0.06, 95% CI 0.02 to 0.20; group B: AOR 0.35, 95% CI 0.15 to 0.84); the odds declined more sharply in group A than in group B (AOR 0.15, 95% CI 0.04 to 0.58). Likewise, the belief that typhoid and malaria co-occur was reduced at endline in the intervention groups compared with the control group (group A: AOR 0.06, 95% CI 0.02 to 0.15; group B: AOR 0.15, 95% CI 0.06 to 0.36). There was no statistical difference between group A and group B (AOR 0.32, 95% CI 0.09 to 1.09).

### As-treated analysis

For the as-treated analysis, 30 participants (13%) were reclassified from control to intervention group A, and 10 participants (3 (1%) in group A and 7 (3%) in group B) from intervention to control. Results are robust to different reclassification techniques (see online supplemental table S4). The as-treated analysis confirmed that participants in both intervention groups were significantly less likely to believe in misinformation. The odds of believing that typhoid is caused by mosquitoes were significantly lower in both groups compared with the control group (group A: AOR 0.13, 95% CI 0.07 to 0.25; group B: AOR 0.38, 95% CI 0.21 to 0.68). The belief that typhoid and malaria co-occur showed even stronger associations (group A: AOR 0.12, 95% CI 0.07 to 0.21; group B: AOR 0.27, 95% CI 0.16 to 0.47) (see table 3 and online supplemental table S3 for OLS models).

### Seeding misinformation

It is possible that an informational intervention to mitigate misinformation can instead have the undesired effect of ‘seeding’ it, for example, by exposing people who previously held scientifically correct beliefs to factually incorrect beliefs, which then take hold. We analysed whether the intervention unintentionally seeded misinformation among the participants who held the correct beliefs at baseline. Participants in group A were less likely to believe the misinformation at endline compared with the control group, for both the belief that mosquitoes cause typhoid (AOR 0.35, 95% CI 0.15 to 0.81) and the belief that typhoid and malaria co-occur (AOR 0.39, 95% CI 0.18 to 0.82) (see online supplemental table S5). There was no significant difference between group B and the control group for the mosquito belief (AOR 0.70, 95% CI 0.33 to 1.51) and the belief that typhoid and malaria co-occur (AOR 0.58, 95% CI 0.29 to 1.16). There was no difference between group A and group B on both outcomes (mosquito outcome: AOR 0.41, 95% CI 0.15 to 1.05; malaria co-occurrence outcome: AOR 0.72, 95% CI 0.33 to 1.58).

### Dose–response analysis

The dose–response analysis suggests that the group A ‘Plausible Alternative’ intervention was more effective. Both intervention groups significantly reduced their belief in the misinformation after having listened to at least two episodes, compared with the control group (see online supplemental table S6). Limiting the analysis to only the two intervention groups, we found a significant interaction between intervention group and dose for the belief that typhoid is caused by mosquitoes (AOR 0.62, 95% CI 0.43 to 0.89), showing that with an increased number of episodes, group A performed better than group B (online supplemental table S7). This effect was not observed for the belief that typhoid and malaria co-occur (AOR 0.80, 95% CI 0.58 to 1.11). Furthermore, three episodes of the group A drama reduced the typhoid-mosquito belief significantly more than all four episodes of the group B drama (online supplemental table S8).

### Knowledge outcomes

We scored participants’ knowledge about preventive methods on a scale ranging from −3 to +3. The data suggest that both interventions improved study participants’ knowledge: at endline, 67% of the participants in group A scored 1 or higher versus 66% in group B and 51% in the control group. Ordinal logistic regression showed that the two intervention groups scored significantly higher than the control group (group A: AOR 2.19, 95% CI 1.57 to 3.06; group B: AOR 1.79, 95% CI 1.27 to 2.50), but there was no statistically distinguishable effect between the two intervention groups (online supplemental table S9).

### Behavioural outcomes

Exploratory analyses around behavioural outcomes showed that participants in group A were significantly less likely than the control group to report that they were sleeping under a bednet to prevent typhoid infection (AOR 0.43, 95% CI 0.24 to 0.78). There was no statistically significant association for group B (AOR 0.64, 95% CI 0.36 to 1.12) (online supplemental table S9). Both intervention groups had significantly higher odds of reporting that they were drinking treated water to prevent typhoid infection (group A: AOR 2.78, 95% CI 1.67 to 4.64; group B: AOR 1.77, 95% CI 1.08 to 2.91). There were no statistical differences between intervention groups for either behavioural outcome.

## Discussion

Effectively correcting prevalent public health misinformation is an urgent challenge. The CMT tested two ways of countering prevalent misinformation about typhoid using audio dramas delivered via WhatsApp. Results show that both intervention groups reduced belief in two types of misinformation compared with the control: the belief that typhoid is caused by mosquitoes and the belief that typhoid and malaria co-occur.

Apart from changing the participants’ beliefs in prevalent misinformation, both interventions also positively influenced people’s knowledge and yielded increases in an important protective practice (drinking treated water). It should be noted that this measure was self-reported and might have suffered from social desirability bias. Further studies could gather longitudinal observational data on behavioural risk reduction following (mis)information interventions.

While both interventions reduced belief in misinformation relative to the control group, the Plausible Alternative intervention group, which mentioned misinformation before debunking it, generally showed larger reductions in belief in misinformation than the Avoiding Misinformation intervention (group B). This is consistent with evidence from laboratory-based studies.54 While both interventions contained basic elements of storytelling, the debunking strategy of the Plausible Alternative group incorporated the dramatic element of conflict and debate,55 56 which might have made the content ‘stick’ better. Both interventions contained the same volume of scientifically correct, educational content, but intervention group A ‘invested’ additional story time in debunking misinformation; it is possible that increasing the length and scientific detail of the Avoiding Misinformation intervention could increase its effectiveness. Further research on these topics is warranted. However, the Plausible Alternative intervention did not yield statistically significant improvements relative to the Avoiding Misinformation intervention in knowledge of prevention measures or behavioural outcomes.

Contrary to other trials with health communication interventions,13 14 we found no evidence that the interventions created negative side effects. Despite concerns that specifically mentioning and debunking misinformation might inadvertently spread scientifically incorrect narratives, we found that the Plausible Alternative intervention did not seed misinformation among those who had previously held correct beliefs. This could be because the risk of seeding misinformation is higher when audiences are new to it. However, in our study a large majority of participants (94%) had heard of typhoid-malaria, which may have lowered the risk of seeding the misinformation among those who held the correct beliefs.22 26

A major strength of this study is the study design. As a randomised field experiment, the CMT contributes to a small but growing body of research that tests strategies to counter misinformation in a community rather than laboratory setting or survey experiment. The intervention was designed to approximate a real-world public health communication effort, and therefore may have stronger external validity than survey experiments and other commonly used tools to assess the efficacy of informational interventions.

Like other social media, WhatsApp can enable the spread of misinformation.57 58 At the same time, its wide global reach could be used to deliver effective public health communication campaigns at scale,58 while avoiding some of the challenges inherent to radio and television as information channels (eg, information must be ‘consumed’ at time of broadcast, rather than when convenient for the receiver). Further studies are warranted to test corrective messages at scale. These studies might explore the potential for spillover effects, in particular the extent to which health information and educational messaging is shared with others, whether on or off the specific technology platform used to disseminate the intervention. In the case of the CMT, study participants were explicitly instructed not to share the audio dramas. However, real-world information interventions could be much more impactful on a population level if recipients were encouraged to share them, and future studies could explore whether specific types of content, delivery or instructions can encourage ‘productive’ spillover effects of health promotion messaging.

This study also had several limitations. First, despite our ability to monitor message reception and follow up with study participants, 30% of our participants did not receive or listen to any of the audio episodes. If interventions of this type were implemented on a larger scale and with less intensive oversight, non-adherence could be higher. Further research could explore the effect of additional reminders and ‘nudges’ on listenership. Second, the endline survey was conducted 8 weeks after the baseline. Future studies should assess the long-term ‘stickiness’ of improvements to knowledge and practices via these and other debunking methods. Furthermore, the misinformation we aimed to counter concerned a specific health-related myth that was not subject to politicised debates. Polarised misinformation might be harder to counter, although the evidence on this is inconclusive thus far.59 60 Further experimental work is needed to examine whether the CMT intervention elements would yield similar improvements on polarising misinformation. Similarly, while the misinformation in our study was explicitly debunked in the Plausible Alternative group, it would be of interest to study similar corrective efforts when misinformation consists of implied rather than explicit falsehoods, for instance, through the omission of relevant information.61

## Conclusion

These limitations notwithstanding, we have shown that it is possible to reduce belief in misinformation rapidly, even where such beliefs are widely held and reinforced via the health system. A communications strategy that gives room to explain why misinformation is wrong and then provides scientifically correct information, is in line with existing worldviews, delivered by credible sources and gets repeated exposure has the potential to yield desired results without unintentionally seeding misinformation. This list of attributes may sound demanding. However, the results of this field experiment provide some grounds for optimism that even as misinformation becomes more prevalent, there are effective tools at hand to counter its impact and its spread.


## Ethics statements

### Ethics approval

Ethical permission for this study was granted by the Sierra Leone Ethics and Scientific Review Committee on 30 May 2019 and the Swedish Ethical Review Authority in Stockholm (dnr 2019-04433).

## Acknowledgments

We would like to thank the participants of the Info Na Pawa study for their time and the enumerators for their hard work recruiting participants and administering the surveys. A big thanks to the entire staff of FOCUS1000 in Freetown for making this study possible. We would also like to thank Grant Gordon and Sarah Oh for their feedback on the manuscript.


## Footnotes

• Handling editor Soumitra S Bhuyan

• Contributors MW, BO, PS, MBJ, NW, SAP, BL, HM-A, ZZ, CJS, MFJ and HN contributed to the study design. MW, BO, PS, MBJ, NW, SAP, MFJ and HN contributed to the creation of the intervention. MW, BO, MFJ and HN contributed to the overall management of the study. MW, PS, MBJ, NW, HM-A, ZZ, CJS and SAP contributed to the training of enumerators and overseeing the data collection in Freetown. MW, BO, PS, MBJ, NW, SAP, MFJ and HN contributed to overseeing the administration of the intervention. MW, BO and ZZ contributed to data management and statistical analysis. MW, BO, ZZ and HN contributed to writing the manuscript. MW is responsible for the overall content as guarantor. All authors read and approved the final manuscript.

• Funding Swedish Research Council (2017-05581).

• Competing interests None declared.

• Provenance and peer review Not commissioned; externally peer reviewed.

• Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.
