Collecting data to understand violence against women and children during and after the COVID-19 pandemic is essential to inform violence prevention and response efforts. Although researchers across fields have pivoted to remote rather than in-person data collection, remote research on violence against women, children and young people poses particular challenges. As a group of violence researchers, we reflect on our experiences across eight studies in six countries that we redesigned to include remote data collection methods. We found the following areas were crucial in fulfilling our commitments to participants, researchers, violence prevention and research ethics: (1) designing remote data collection in the context of strong research partnerships; (2) adapting data collection approaches; (3) developing additional safeguarding processes in the context of remote data collection during the pandemic; and (4) providing remote support for researchers. We discuss lessons learnt in each of these areas and across the research design and implementation process, and summarise key considerations for other researchers considering remote data collection on violence.
Data availability statement
Data sharing not applicable as no datasets generated and/or analysed for this study.
This is an open access article distributed in accordance with the Creative Commons Attribution 4.0 International (CC BY 4.0) licence, which permits others to copy, redistribute, remix, transform and build upon this work for any purpose, provided the original work is properly cited, a link to the licence is given, and indication of whether changes were made. See: https://creativecommons.org/licenses/by/4.0/.
- There is limited research on how to conduct remote data collection on violence against women and children, and most research relies on in-person data collection.
- As violence researchers, we reflect on our experiences of redesigning eight studies in six countries to include remote data collection methods during the COVID-19 pandemic.
- Shifting to remote data collection in violence research required adapting to a range of moveable and unpredictable conditions specific to each context.
- To conduct remote data collection in the pandemic, the following areas were crucial in fulfilling our commitments to participants, researchers, violence prevention and research ethics: (1) strong research partnerships; (2) adapted data collection approaches; (3) additional safeguarding processes; and (4) remote support for researchers.
- Committing resources to the additional steps required to protect participant and researcher safety while using remote methods in violence research is essential.
- Remote data collection to directly measure experiences of violence should only be conducted in specific circumstances: when it is possible to ensure safeguarding, and when participants are already engaged in the study and/or strong, well-established research partnerships are in place.
- This study offers lessons learnt and recommendations for whether, and how, to design and conduct remote data collection on violence.
Violence against women and children has become both more prevalent and less reported during the COVID-19 pandemic.1–6 Although researchers across fields have pivoted to remote rather than in-person data collection,7–9 remote research on violence has posed particular challenges. UNICEF, the United Nations Population Fund (UNFPA), the Sexual Violence Research Initiative (SVRI) and others outlined concerns with remote data collection early in the pandemic.2 3 10–13 These include risks to participant safety, under-reporting of violence where participants fear being overheard, and limited safeguarding options amid overburdened health services and poorly functioning violence response services. Instead, experts recommended using secondary or administrative data, collecting retrospective data when safe to do so, or using proxy and indirect measures for violence.2 3 10–13
Interviewing women, children and young people about experiences of violence is often more reliable than using administrative or service data. For example, a meta-analysis with data from over 9 million participants showed that the prevalence of child sexual abuse was 12.7% if self-reported and only 0.4% if reported by health professionals, teachers or child services.14 Analyses of the Violence Against Children Surveys in six countries showed that the self-reported prevalence of physical violence was at least 60% and of sexual violence at least 10%, while formal disclosure of physical and/or sexual violence ranged from 1% to 25%.15 Interviewing children and young people about violence is also central to a commitment to child participation, which emphasises the pivotal role of children in research that concerns their lives.16–18
Since the start of the pandemic, several studies have used remote data collection to interview women, young people and children directly about violence.19 These have included: phone interviews and web-based surveys with children about violence,20–22 phone interviews with women about injuries, safety and conflict in the home and community,23 and the use of list experiments, vignettes and indirect measures to ask about violence.24
Prior to the pandemic, violence research was rarely conducted using remote data collection. A rapid review of remote data collection on violence found only 14 studies, all from high-income countries, of which only two included children.25 There is a need for further research that examines if, and how, violence research can be ethically and effectively done remotely. As a group of violence researchers, we reflect on our experiences across eight studies, some ongoing and some complete, that included remote data collection methods. Please see online supplemental file 1 for our author reflexivity statement. We briefly summarise the studies, reflect on how we designed remote research in line with ethical principles and good practices for violence research developed by WHO, the Centers for Disease Control and Prevention (CDC), UNICEF, UNFPA, and other organisations,26–30 and share our lessons to inform the work of violence researchers who are considering using remote methods.
Studies included and approaches to remote data collection
We draw on eight studies collecting data on violence against women (n=4) and children (n=4) in Brazil, Britain, Kenya, Nepal, Uganda and Zimbabwe during the COVID-19 pandemic. None of these studies included remote data collection in their original design. Table 1 summarises the violence questions included and the approaches to remote data collection, consent and safeguarding. Six of the eight studies were redesigned to use remote methods to interview school-age children, adolescents and young adults, and adult women about their experiences of violence. In four studies, healthcare providers and local stakeholders were interviewed about violence either in addition to, or instead of, interviewing women or children. We include examples from two longitudinal or cohort studies (the Context of Violence in Adolescence Cohort (CoVAC) study in Uganda31 and the Maisha Fiti study in Kenya32); one cross-sectional nationally representative survey, established in the late 1980s and carried out approximately decennially (the British National Surveys of Sexual Attitudes and Lifestyles (Natsal) remote pilot study33); three qualitative studies (the CoVAC qualitative study,34 the Bantwana programme in Uganda35 and the Child-friendly Catholic Schools Study-Zimbabwe (CCSS-Z)); and two mixed methods studies (the Healthcare Responding to Violence and Abuse (HERA) study in Brazil and Nepal). Four studies were linked to violence prevention interventions at the community, school or health facility level.36–38 Phone interviews were the most frequently used method. Other remote methods included video interviews and online questionnaires. In all cases but one, ethics committees approved remote methods: in the HERA study, the national ethics committee in Nepal did not approve qualitative telephone interviews about violence due to safety concerns.
Reflections on conducting remote data collection on violence during COVID-19
Most literature on the ethics of violence research describes the importance of the following principles: (1) strong partnerships and trained researchers who can build rapport with participants, sense distress and protect participants from harm, (2) privacy and safety, (3) strong links to local violence response services and referral organisations for safeguarding, and (4) training, support and debriefing for researchers.26–29 We discuss how we redesigned our research to fulfil each of these ethical principles, summarising key decisions and challenges. We offer case examples in table 2 and lessons learnt in table 3.
Drawing on strong research partnerships
Established relationships were essential for remote violence research: partnerships and teams had been in place for 1–7 years (and the infrastructure for Natsal for 30 years). Research partners had expertise in violence research, appraised the COVID-19 situation, sought approvals, and engaged trusted counsellors and referral networks for safeguarding. Strong research partnerships allowed remote data collection to be conducted by trained interviewers who had prior experience of collecting sensitive data on violence and building rapport with participants, and who, in many cases, were already engaged in the study.
The shift to remote methods was planned and discussed in the context of these pre-existing collaborations. We found that it was possible to ask directly about violence remotely in some cases but not in others. The six studies that did use remote methods to interview women, children and young people about their experiences of violence were either ongoing studies or, in the case of Natsal, had made contact with participants prior to the interview. Moving to remote methods in these studies offered possibilities to increase, or enable, participation. For example, in the Maisha Fiti study, phone interviews reached female sex workers who would otherwise have been unable to participate due to COVID-19 response measures, as well as migration and unstable housing caused by the economic difficulties of the pandemic (table 2). In Natsal, some participants mentioned in follow-up interviews that remote options could even offer more privacy to answer survey questions. In several studies, we felt continuing data collection was part of our ethical commitments to participants and related to improving violence services in a pandemic. In the CoVAC qualitative study, where researchers had been speaking to participants since 2018, halting contact with participants seemed unethical: in fact, young people appreciated that researchers followed up on their circumstances and reached out to them during a challenging time. In the HERA study, continuing data collection was seen as a source of hope and optimism for health providers and researchers, as study activities were central to improving health systems’ responses to violence. In the Natsal study, cognisant of the challenges of remote data collection, researchers designed a pilot before proceeding with large-scale remote data collection, and paused the pilot for 12 months so the team could draw on the expertise of experienced survey methodologists to adapt the study design for remote delivery. When fieldwork began, COVID-19 restrictions allowed interviewers to make initial contact with participants on the doorstep and offer either a face-to-face or remote interview.
In contrast, in two new studies—the CCSS-Z study in Zimbabwe and the Bantwana programme study in Uganda—research teams were concerned about initiating data collection remotely without pre-existing relationships or contact with participants. Concerns also included identifying and safeguarding participants during school closures, building rapport with children remotely, and making virtual sessions engaging for children. These teams either delayed interviews with children or decided not to interview children at all, instead interviewing adult stakeholders, such as teachers or parents, to gather some information about violence during the pandemic. In both studies, it was deemed that these adults could offer critical insights into children’s experiences of violence during pandemic conditions that should not be missed, provided the studies could be adapted to meet ethical requirements. This required substantial changes to the study design, research approach and interview questions. However, remotely recruiting adults to discuss violence was also challenging. In the Bantwana programme study, staff known to adult participants approached them initially before connecting them with the research team. In the CCSS-Z study, the research team did not feel that relationships with school staff and parents were sufficiently established to conduct remote interviews about violence, a concern heightened by the political environment at the time (table 2). The study was further revised, and remote interviews were conducted only with higher level stakeholders, external to the schools, who were accustomed to discussing violence in their work and to working remotely during the pandemic. In both cases, the lack of prior relationships between the research team and adult participants meant that making initial contact through project staff or study partners who were known to, and trusted by, participants was a crucial first step. However, participants may not have been as forthcoming as they would have been with strong prior relationships with the research team. These experiences highlight the nuances of initiating violence research remotely, even when not asking about personal experiences of violence.
Safety and privacy
To enable safety and privacy, study approaches were amended in four primary ways. First, consent and introductory processes were adapted for remote data collection to reduce the risk of retaliatory violence and improve confidentiality. Violence was not mentioned while introducing the study: for example, Maisha Fiti means ‘life is good’ in Swahili, and the CCSS-Z study team initially referred to the study as a ‘Catholic Schools Study’ on the phone, only explaining further verbally once certain they were speaking to the intended participant. For most studies, consent was adapted to be sought verbally and documented either through audio-recording or in interviewers’ paper or electronic records. For Natsal, consent was sought in person at the doorstep of households, where participants were offered a choice between in-person and remote interviews.
Second, we redesigned our interview scheduling. Across the studies, participants faced a range of challenges, such as additional time pressures, heavier workloads, increased caregiving responsibilities or unpredictable working patterns. This meant that completing interviews in one sitting, and keeping interviews confidential and private, could be challenging. We found that data collection had to be highly adaptable and responsive. The Bantwana study halved the interview length to ensure phone interviews were manageable for participants. For both CoVAC studies and Maisha Fiti, we included an initial call to assess safety, explain study procedures and ask participants about their preferred interview timing. We then designed processes for callbacks and offered flexibility around timing so interviews could fit around participants’ daily lives. Planning staffing ahead for this flexibility was necessary.
The reliability of phone or internet connections was also an ongoing challenge. This was a key concern for violence research, given the risk of sudden disruptions to sensitive conversations, or of being unable to reach participants in the case of safety concerns. In several cases we reimbursed participants to cover data used in communication, or researchers called participants, both to prevent participants from incurring costs and to ensure sufficient phone credit for the call. In CCSS-Z, we also found that having more than one line of communication (establishing both online and phone contact) was important in case of network failure. In Natsal, participants were emailed a unique URL to the online questionnaire during the telephone or video interview. At times this was slow to arrive, so the interviewer read the URL aloud. Video and phone calls were used to build rapport, and interviewers remained on the telephone/video call while participants self-completed online questionnaires.
Third, interview processes were adapted to ensure privacy and confidentiality remotely. In line with good practices for research on sensitive topics, quantitative questions were designed with response options (e.g. ‘yes’, ‘no’, ‘few’) so that it was not obvious participants were discussing violence. In the CoVAC qualitative study, topic guides were revised to ask participants open-ended questions, for example about how COVID-19 had affected their families, instead of direct questions about violence; and researchers took extra care around probing, for example on relationships or discipline at home. This approach enabled participants to maintain control over what they shared and allowed researchers to follow up on sensitive topics when in-person interviews became possible. In the CoVAC quantitative study, the phone survey was redesigned to consist of three interviews, with more sensitive questions asked in later interviews.
Finally, researchers were trained to work with the participant to find a private place, to check if participants were alone, to change the subject, to listen out for signs of distress or discomfort, and to answer questions from other household members if they were to take the phone. In the HERA study, researchers would change the subject to reproductive health if the interview was interrupted. In the Natsal survey, an online self-completion questionnaire was used, with a button taking the participant to a neutral news website if privacy was lost. If a participant selected ‘prefer not to answer’ at any violence questions, an option appeared to skip to the end of the section. On completion, each section of questions was locked to prevent anyone viewing previous answers. In the CoVAC survey, ‘interruption’ options were added so researchers could log where they paused and call back later. In Maisha Fiti and the CoVAC survey, ‘safe words’ or ‘safe phrases’ were agreed with participants, so use of a phrase, such as ‘the weather is sunny’, or the name of a participant’s favourite football team or primary school, indicated that they were unable to continue the interview.
A further key consideration was how to maintain safeguarding and strong links to violence response services during the pandemic. Collecting data in person allowed counsellors to accompany participants to health or social services and provide in-person counselling. Collecting data remotely meant we were concerned about: (1) the functioning of health and social services or their ability to receive and support participants and (2) referring and accompanying participants. Study teams worked closely with referral partner organisations and adapted referral protocols in light of these challenges. The Maisha Fiti study adapted referral mechanisms to include telemedicine and telecounselling for participants who needed follow-up care for experiences of violence or mental health support: this was delivered remotely by the study counsellor who the participants had already met in person. The CoVAC study offered all young people phone counselling and referrals, and employed three experienced full-time counselling staff. In preparation, team members called over 50 local organisations to assess their functioning during the pandemic to create a new referral list of organisations. This helped in identifying focal point persons and establishing relationships before participants were referred. Similarly, in Brazil, the HERA team made a new referral list based on the availability of remote or in-person services. Natsal employed an organisation-level Disclosure of Harm Policy: researchers were trained to report incidents where they sensed risk of harm and a Disclosure Board (consisting of senior staff) made further decisions about intervention and further disclosure of participant information.
Support for researchers
Finally, modifications were needed to design safe working environments and support for researchers conducting remote data collection. In addition to the inherent challenges of conducting violence research, all research teams experienced the stress and uncertainty of COVID-19 restrictions, and many researchers also experienced bereavements or tested positive for COVID-19. To protect and respond to researcher well-being, the HERA study developed a researcher distress protocol, and the CoVAC study engaged an external counsellor to provide psychosocial support to researchers. Other strategies used across the studies included locally produced guidance for prioritising safety, daily debriefs, Zoom polls and WhatsApp groups for regular contact and motivation. Some studies provided time off for vaccinations, made additional payments to offset some of the negative economic consequences of the pandemic, or developed daily schedules to fit with caring and childcare responsibilities. For example, the CoVAC study provided a food allowance during remote researcher training so researchers could support their households.
We also created work environments where researchers could make confidential phone calls. The Maisha Fiti study used a study office from where researchers could make calls, with COVID-19 safety measures. In several cases, it was not possible for researchers to work from an office or health facility. Following a lockdown in Uganda, the CoVAC study shifted to home-based data collection and designed additional measures to support researchers and protect data. Study staff visited each researcher at home to discuss home-based data collection, privacy considerations, strategies for data storage and internet connectivity. Lessons from these conversations informed a code of conduct for home-based research, and the purchase of power banks, mobile phones, comfortable headphones and opaque folders for researchers, which were delivered to their homes.
Lessons learnt from remote data collection and ways forward
Table 3 summarises the key lessons we learnt from redesigning violence research during the COVID-19 pandemic. These lessons may be helpful beyond COVID-19 and have implications for research on other sensitive topics more broadly, and for research on topics where participants may disclose violence. We found that shifting to remote methods to conduct violence research requires adapting to a range of moveable and unpredictable conditions specific to the social and political context of each study, the local COVID-19 response, the study design, and the partnership arrangements in place. Committing time and budget to the additional steps required to protect participant and researcher safety39 is essential. These lessons both affirm, and build on, much of the existing guidance.2 3 10–13 25 Namely, remote data collection to directly measure experiences of violence should only be conducted in specific circumstances: when it is possible to ensure safeguarding, and when participants are already engaged in the study and/or strong, well-established research partnerships are in place. In some cases, remote methods allowed us to fulfil our commitments to participants and maintain relationships during the pandemic; in other cases, remote research prevented us from doing so. When it was not possible or safe to directly interview women and children about violence using remote methods, we used other approaches to engage participants and generate evidence on violence, acknowledging the limitations of not interviewing women and children directly. Although remote research should not replace face-to-face research on violence, the two approaches could be combined. We found some participants appreciated the convenience of remote interviews, finding it easier to fit participation alongside other commitments.
There remains much to learn about the ethics, possibilities and limitations of remote data collection methods. In this paper, we offer insights into how research teams selected and used a range of remote methods, and discuss how these approaches were developed and implemented. We are unable to evaluate which remote method was most effective. There is limited evidence on whether remote methods or face-to-face interviews improve disclosure of violence in research; however, evidence suggests that more anonymous data collection methods (e.g. a computer-assisted self-administered interview) could increase disclosure of violence.40 Future research could develop, compare and test approaches to remote data collection in different settings. Further research could explore how study participants experience remote data collection methods, assess the effects of remote methods compared with face-to-face methods on violence reporting, and examine how data collected through remote methods could be used to inform violence prevention and response. Our findings suggest that remote methods may be a way to reach marginalised groups in some contexts, and further research should also explore whether remote methods could be used to reach migrant populations and street-connected young people, paying attention to safety, ethics, privacy, power dynamics and access to phones and the internet. It is important that future research meaningfully engages women, children and young people as co-creators of research on remote methods. Such work could provide improved guidance to researchers and to institutional and national research ethics committees who are also exploring the safety and ethics of these methods.
As Parkes and colleagues note, ‘the researcher often benefits more from the telling than the researched’,34 41 raising important questions that we, as researchers, should ask ourselves as we strive to fulfil our commitments to research ethics, to survivors of violence and to our research teams while continuing to generate evidence about violence against women and children to inform policy, practice and prevention.
Patient consent for publication
Acknowledgements We would like to thank study participants, interviewers and the teams who collected the data in each study. In addition, we would like to thank the following people for their contributions to remote study design and data collection: CoVAC: Louise Knight, Tvisha Nevatia, Ayoub Kakande, Michael Charles Mubiru, Dipak Nakar, Libby Nansubuga Kizito, Hassan Ssonko Sulayiman, Sylivia Nairuba, Bridget Ninsiima, Moureen Nakazibwe, Mariam Waiswa, Rose Kadondi, Patrick Safari, Prossy Bagonza, Nancy Ninsiima, Joshua Amanya, Rosemary Nankya, Mable Irene Namakoye, Grace Nabitaka, Sayyid Bukenya, Lillian Nkamwesiga, Lillian Najjuka, Mastula Nakiboneka, Hassan Muluusi, Dennis Okello, Wilson Kasaijja, Peace Nakayiwa, Timothy Laku, Carol Wabomba and Milly Naluutaya. Maisha Fiti: Helen A Weiss, Rupert Kaul, Elizabeth Rwenji, Evelyn Ombunga, Ibrahim Lwingi, Chrispo Nyabuto, Anne Mahero, Monica Okumu, Zaina Jama, Pauline Ngurukiri, Daisy Oside, Agnes Atieno, Faith Njau, Mary Akinyi, Demitila Gwala, Ruth Kamene, Wendy Watata, Emily Nyariki, the Maisha Fiti Study Champions. Bantwana Programme: Aloysious Nnyombi, Samuel Besigwa, Susan Kazooba, Jennifer Kobusingye, Brian Kyomuhendo, Clare Twineamatsiko, Moriah Bauman, Susan Kajura, Naomi Reich and Elizabeth Tusiime. CCSS-Z: Annah Theresa Nyadombo, Dorcas Mgugu, Sarah Rank, Lenah Gideon, Maggie Magadza, Pauline Chimbiro, Rati Moyo, Deborah Barron, Caroline Trigg, Tendai Nhenga-Chakarisa, Charles Muchemwa Nherera, Blessing Masumba, Sybille Chidyamatare, Rodwell Chiatezvi, Charmaine Chitoyo, Camilla Fabbri, Louise Knight. Natsal: Catherine Mercer, Pam Sonnenberg, Gillian Prior, Nigel Field, Kirstin Mitchell, Chris Bonell and Wendy Macdowall.
Handling editor Seye Abimbola
Contributors AB, ET and KD conceptualised the article with input from TB. AB and ET drafted the outline and developed the first draft. Authors wrote sections of the article and tables relevant to their respective studies. All authors reviewed and provided comments on at least two drafts. AB incorporated author comments and developed new drafts. All authors approved the final version of the article.
Funding The CoVAC study is funded by the UK Medical Research Council (MRC), grant number MR/R002827/1, which supported the conceptualisation, writing and CoVAC case examples in this paper. For the HERA study (AFd'O, PR, LJB), the writing up of the paper was made possible by a National Institute for Health Research (NIHR) grant (17/63/125) using UK aid from the UK Government to support global health research. CCSS-Z is funded by Porticus.
Disclaimer The views expressed in this publication are those of the author(s) and not necessarily those of the NIHR or the UK Government. The Maisha Fiti study is funded by the MRC and the UK Department for International Development (DFID) (MR/R023182/1) under the MRC/DFID Concordat agreement.
Competing interests None declared.
Provenance and peer review Not commissioned; externally peer reviewed.
Author note The reflexivity statement for this paper is linked as online supplemental file 1.
Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.