Open Letter

Challenges and opportunities in evaluating programmes incorporating human-centred design: lessons learnt from the evaluation of Adolescents 360

[version 1; peer review: 2 approved, 1 approved with reservations]
PUBLISHED 24 May 2019

This article is included in the International Conference on Family Planning gateway.

Abstract

Adolescents 360 (A360) is a four-year initiative (2016–2020) to increase 15–19-year-old girls’ use of modern contraception in Nigeria, Ethiopia and Tanzania. The A360 approach is led by human-centred design (HCD), combined with social marketing, developmental neuroscience, public health, sociocultural anthropology and youth engagement ‘lenses’, and aims to create context-specific, youth-driven solutions that respond to the needs of adolescent girls. The A360 external evaluation includes a process evaluation, a quasi-experimental outcome evaluation, and a cost-effectiveness study. We reflect on the evaluation opportunities and challenges associated with measuring the application and impact of this novel HCD-led design approach.
For the process evaluation, participant observation was key to capturing the depth of the fast-paced, highly iterative HCD process and to understanding decision-making within the design process. The evaluation team had to be flexible and align closely with the implementers’ work plan. The HCD process meant that key information, such as intervention components, settings, and eligible populations, was unclear and changed over the course of outcome evaluation and cost-effectiveness protocol development. This made the study design process more time-consuming and resource-intensive. Because considerable time and resources went into the creation of a new design approach, separating one-off “creation” costs from the costs of actually implementing the programme was challenging. Opportunities included the potential to inform programmatic decision-making in real time, helping to ensure that interventions adequately met the contextualised needs of targeted areas.
Robust evaluation of interventions designed using HCD, a promising and increasingly popular approach, is warranted yet challenging. Future HCD-based initiatives should consider a phased evaluation, focusing initially on programme theory refinement and process evaluation and then, once the intervention details are clearer, following with an outcome evaluation and cost-effectiveness analysis. A phased approach would delay the availability of evaluation findings but would allow for a more appropriate and tailored evaluation design.

Keywords

Human-Centered Design, User-centered Design, Design Thinking, Evaluation, Adolescents, Contraception, Africa

Rationale

In the field of public health, human-centred design (HCD), also known as design thinking, is increasingly being used during intervention development, and sometimes implementation, to design innovative solutions to complex problems [1]. HCD is a creative approach to programme design that focuses on building empathy with the population of interest and on generating and iteratively testing many potential solutions to complex problems. The approach has a high tolerance for both ambiguity and failure [1,3]. HCD shares some characteristics with other methods used to design programmes, such as traditional socio-behavioural research, but tends to be more ‘nimble/flexible/iterative’ and less protocol-driven [4].

Intervention evaluation is an essential component of public health research and programming, and detailed guidance on evaluation approaches and methodologies exists [5,6]. However, the iterative and flexible nature of HCD poses some potentially unique challenges for evaluation [1,3,7], and examples of evaluations of HCD public health interventions have so far been limited [7,8]. There are some parallels between evaluations of HCD interventions and evaluations of interventions incorporating adaptive management or quality improvement, where the intervention can also change over time [9–14]. Also, programmatic interventions, as opposed to researcher-led interventions, often involve an element of flexibility in intervention implementation, as implementers strive to develop more context-appropriate interventions and to respond to programmatic targets. In such interventions, the location and timing of implementation are guided primarily by programme targets or local government priorities, and less so by the need to maximise the quality of evidence of impact [15,16]. Despite this overlap with other evaluation scenarios, we have identified some challenges unique to the evaluation of HCD-led interventions.

Purpose of the letter

In this letter we discuss the evaluation opportunities and challenges identified during the ongoing A360 evaluation and make recommendations for future evaluations of programmes designed using HCD.

Adolescents 360

The intervention under evaluation is Adolescents 360 (A360), a multi-country initiative that aims to increase the voluntary use of modern contraceptives among adolescent girls aged 15–19 years. The A360 initiative is led by Population Services International (PSI) in collaboration with the Society for Family Health (SFH) in Nigeria, design partner IDEO.org, and the Centre for the Developing Adolescent at the University of California. A360 is co-funded by the Bill & Melinda Gates Foundation and the Children’s Investment Fund Foundation.

A360 set out to take a transdisciplinary approach to intervention development, in which the different discipline ‘lenses’ work jointly to create innovations that integrate, and move beyond, discipline-specific approaches to address a common problem (Figure 1).


Figure 1. The A360 approach.

A360 aims to create context-specific, youth-driven solutions that respond to the needs of young people. Through an iterative design process, unique solutions were created in each of the A360 countries. In Ethiopia, Smart Start uses financial planning as an entry point to discuss contraception with newly married couples [17]. In southern Nigeria, 9ja Girls provides physical safe spaces (branded spaces in public health clinics) for unmarried girls [18], and in northern Nigeria, Matasa Matan Arewa targets married girls and their husbands with counselling and contraceptive services, using maternal and child health as an entry point [19]. In Tanzania, Kuwa Mjanja (‘be smart’) delivers life and entrepreneurial skills training alongside opt-out counselling sessions and on-site contraceptive service provision to both married and unmarried adolescent girls [20].

External evaluation

The independent external evaluation aims to understand whether the A360 approach leads to improved reproductive health outcomes; to generate evidence about A360’s cost-effectiveness relative to other programming approaches; and to capture how A360 is implemented in different contexts, as well as the experiences of the participating young people and communities. It has four components (Figure 2) [21]:


Figure 2. A360 multi-component external evaluation.

1. Outcome evaluation, which has a quasi-experimental design with before-after cross-sectional surveys in all four settings (Oromia, Ethiopia; Mwanza, Tanzania; Ogun and Nasarawa, Nigeria) and a comparison group in the two Nigerian settings [22] (see the illustrative analysis sketch after this list).

2. Process evaluation, incorporating both traditional qualitative methodologies and youth participatory research [23].

3. Costing study, to understand the drivers of cost, and a more extensive cost-effectiveness study within the outcome evaluation study geographies.

4. Engagement and research uptake.
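To make the quasi-experimental logic of the outcome evaluation concrete, the following is a minimal sketch of one standard way to analyse before-after cross-sectional surveys with a comparison group: a difference-in-differences regression on pooled survey rounds. This is purely illustrative and is not the A360 evaluation team’s analysis plan; the variable names, the toy data, and the use of a simple linear probability model are all assumptions made for the example.

```python
# Hypothetical difference-in-differences sketch for a before-after
# cross-sectional design with a non-randomised comparison group
# (e.g. the two Nigerian settings). Variable names and data are
# illustrative assumptions, not A360's actual analysis plan.
import pandas as pd
import statsmodels.formula.api as smf

# Pooled individual-level records from baseline and endline survey rounds.
df = pd.DataFrame({
    "modern_method_use": [0, 1, 0, 1, 1, 0, 1, 1],  # 1 = uses a modern method
    "post":              [0, 0, 0, 0, 1, 1, 1, 1],  # 0 = baseline, 1 = endline
    "intervention_area": [0, 0, 1, 1, 0, 0, 1, 1],  # 1 = A360 geography
})

# The coefficient on post:intervention_area is the difference-in-differences
# estimate: the change in intervention areas over and above the change seen
# in comparison areas. A linear probability model keeps the toy example
# simple; a real analysis would use survey weights, clustered standard
# errors, and a far larger sample.
result = smf.ols("modern_method_use ~ post * intervention_area", data=df).fit()
print(result.summary())
```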

HCD process and timeline

HCD takes a phased approach to intervention development. In the ‘inspiration phase’ the designers immerse themselves with girls and their influencers to understand the issues. In the ‘ideation phase’ they attempt to make sense of what has been learnt, identify opportunities for design, and conduct iterative cycles of prototyping possible solutions and strategies. The most promising prototypes are then piloted. In the case of A360, implementation at scale involved a further ‘optimisation’ period in which the selected solutions were modified to maximise scalability and affordability. In A360, the inspiration and ideation phases took place over approximately 12 months (September 2016–August 2017), with piloting for a further 3–4 months (September–December 2017). Optimisation and scale-up started towards the end of 2017/early 2018. Among other things, this lengthy intervention design period presented challenges for the evaluation, which we explore in the sections that follow.

Outcome evaluation

The outcome evaluation aimed to measure the impact of A360 on the voluntary use of modern contraceptives among adolescent girls aged 15–19 years in the study geographies. The outcome evaluation was designed while the implementers were in the ‘inspiration phase’, with final study protocols submitted for ethical approval when prototyping was still ongoing and before the intervention had been finalised (Figure 3). This timeline was necessary in order to conduct baseline surveys in the four study settings in 2017, prior to scale-up of the finalised interventions. The implication of the concurrent intervention development and evaluation design was that key pieces of information about the intervention components and the theory of change were unclear and changed over the protocol development period. Detailed information on the intervention was needed to inform the setting for baseline surveys and the kinds of questions to be asked. Because the outcome evaluation study design was finalised before the interventions themselves, there remained a risk that the evaluation would not be optimal.


Figure 3. Timeline of A360 intervention development and implementation and the external evaluation.

A key consideration for the baseline survey design was who would be targeted by the A360 intervention, in terms of both geographical location and demographic characteristics. Early in the intervention design process it became clear that, in some locations, A360 would target only married or only unmarried adolescent girls, though uncertainty remained throughout the early design phases. For example, in northern Nigeria the outcome evaluation focused on married adolescent girls because the initial intention was to target only those who were married; however, 9ja Girls, the intervention designed in southern Nigeria to target unmarried adolescent girls, was eventually also implemented in northern Nigeria. In Ethiopia, initial HCD insights pointed towards a focus on unmarried adolescents in school, but other programming pressures, including donor preferences, led to a later shift in focus to married rural women. The outcome evaluation was designed to target all married 15–19-year-old girls, but the final intervention focused on ‘newly married’ adolescent girls. While some change in target population might be expected with non-HCD programmatic interventions, the HCD process gave the implementers more freedom and confidence to make additional changes. It was also challenging to determine which geographies would receive A360, as the implementers were in the middle of an intense phase of solution prototyping and were not yet thinking about larger-scale implementation.

The level at which the intervention would be implemented was also relevant for the study design: for example, would A360 be implemented at health facilities, in schools, or in the community? Given the uncertainty as to how A360 would be implemented, we opted for a community-based survey, which turned out to be appropriate for the final A360 interventions. However, if A360 had been implemented primarily in health facilities or schools, then evaluating the intervention at the broader community level might not have been the most efficient study design.

The uncertainties in key study design parameters meant that we had to develop multiple study design scenarios, which were repeatedly revised as new information came to light. For example, the intervention was initially to have distinct implementation phases as it was rolled out across Nigeria, and we explored the idea of conducting a stepped-wedge trial. Following further discussions with the implementers, it became clear that there was no certainty that roll-out would be phased, as they did not yet know what the intervention would comprise. In Tanzania, we explored whether a regression discontinuity design might be possible if the intervention were implemented in one or more of the existing demographic surveillance sites; however, given the uncertainty around the nature of the intervention, the implementers were reluctant to commit to implementing in those areas. This prolonged study design process entailed added costs and was at times frustrating for all involved.
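For readers unfamiliar with the first design mentioned above: in a stepped-wedge trial the intervention is rolled out to clusters in a staggered sequence, so every cluster eventually receives it while earlier, untreated periods serve as controls. The sketch below generates a toy roll-out schedule of this kind; the cluster and period counts are invented for illustration and do not reflect any actual A360 plan.

```python
# Toy stepped-wedge schedule: each cluster crosses from control (0) to
# intervention (1) at a different step, and all clusters end up treated.
# Cluster and period counts are illustrative, not A360's actual plan.

def stepped_wedge_schedule(n_clusters: int, n_periods: int) -> list[list[int]]:
    """Return a cluster-by-period 0/1 matrix for a basic stepped wedge."""
    schedule = []
    for cluster in range(n_clusters):
        crossover = cluster + 1  # period at which this cluster starts the intervention
        schedule.append([1 if period >= crossover else 0
                         for period in range(n_periods)])
    return schedule

for row in stepped_wedge_schedule(n_clusters=4, n_periods=5):
    print(row)
# [0, 1, 1, 1, 1]
# [0, 0, 1, 1, 1]
# [0, 0, 0, 1, 1]
# [0, 0, 0, 0, 1]
```

The uncertainty described above meant this structure could not be guaranteed: a stepped wedge is only analysable if the crossover schedule is known and adhered to.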

Process evaluation

The process evaluation was aligned to the HCD-driven phases of the A360 design process and had the primary objective of presenting a descriptive and analytical account of how the implementation of A360 has played out in relation to its Theory of Change [23]. The process evaluation attempted to understand both the intervention development process and intervention implementation. During intervention development, the process evaluation team faced challenges because the fast-paced, highly iterative HCD process meant that the decision-making process, the ‘design energy’, often went undocumented [4]. This challenge was also noted in another evaluation of an HCD design process [8]. As a result, the process evaluation team needed to be flexible in order to align closely with the implementers’ work plan, and methodologies such as direct participant observation were key to capturing the depth of the HCD process.

There was a potential for research fatigue among target community members, as their views were solicited both by the implementers designing the intervention and by the process evaluation team, who were interested in understanding the HCD process from the participants’ point of view. The process evaluation team therefore needed to balance the importance of capturing community members’ views against the risk of research fatigue.

During the design of the process evaluation, the intention was that the findings would feed into and inform the intervention design at key moments. However, there was limited uptake of process evaluation findings by implementers. For example, the process evaluation highlighted the need for the programme to do more to address broader community and social norms, but this finding had limited impact on intervention design. Poor uptake of findings was partly a result of the fast-paced nature of A360 and the resultant demands on the implementers’ time, which did not allow them sufficient opportunity to pause and reflect on the process evaluation findings. Country implementing teams differed in how they engaged with external recommendations: some teams were receptive and willing to listen and adjust, while others were more protective of their ‘solutions’. Designers add value to public health because they take a different approach to addressing an issue; however, in the case of A360, a preference for working from a blank slate, free of the constraints of public health evidence, may have limited the implementers’ ability to respond to the process evaluation findings. Uptake of the process evaluation findings improved when the evaluation team introduced participatory action research activities focused on questions that were important to the country teams (e.g. health care provider attitudes in Ethiopia [24] and Nigeria [25]).

Cost-effectiveness evaluation

The cost-effectiveness study faced challenges similar to those of the outcome evaluation, in terms of complicated and delayed development of the study protocols. Although all costing exercises face some unknowns about the intervention to be costed, the HCD process added a layer of uncertainty. Furthermore, because proponents hypothesised that the design process itself would be the key factor in producing an effective (and cost-effective) intervention, an important challenge was both measuring the total design cost and isolating the cost of HCD specifically, as the HCD activities were tightly intertwined with the other A360 ‘lenses’. In addition, because A360 was creating a new transdisciplinary approach, there was also interest in separating the costs associated with the ‘creation’ of the A360 approach from the costs associated with implementing and scaling A360 in countries. Interviews were held with intervention staff to get a sense of the distribution of activities (and associated costs) across the study ‘lenses’ and between efforts to create versus implement A360. The implementing agencies planned continuous changes or tweaks to the intervention once it was up and running, so more frequent cost data draws were required during the scale-up period to capture how those changes might affect the ‘production process’ and associated costs.
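For context on what such a cost-effectiveness comparison ultimately feeds into, the standard summary measure is the incremental cost-effectiveness ratio (ICER), shown below in its generic textbook form; it is not a result from the A360 study. The creation-versus-implementation question discussed above is, in effect, a question about which costs belong in the A360 cost term.

```latex
% Generic incremental cost-effectiveness ratio (ICER); not an A360 result.
% C = programme cost, E = effect (e.g. additional modern contraceptive users),
% comparing A360 against a comparator programming approach.
\[
  \mathrm{ICER} = \frac{C_{\text{A360}} - C_{\text{comparator}}}
                       {E_{\text{A360}} - E_{\text{comparator}}}
\]
```

Whether one-off ‘creation’ costs are included in \(C_{\text{A360}}\), amortised across future replications, or excluded entirely can substantially change the ratio, which is why the interviews described above sought to apportion staff activity between creating and implementing A360.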

Lessons learnt

Robust evaluation of a new and promising approach such as HCD is warranted yet challenging. Some of the challenges faced are not unique to interventions designed with HCD and will be recognisable to those who have led evaluations of programmatic interventions. In comparison with researcher-led studies, evaluation of programmatic interventions is associated with a reduced level of control and increased uncertainty. A challenge for evaluators is to find ways to deal with this uncertainty while retaining scientific rigour.

We have the following recommendations for others who would like to evaluate programmes incorporating HCD:

1. Implementers should allow adequate time to participate in the process evaluation, and should work with the process evaluation team to ensure that findings are timed to feed into key decisions.

2. The process evaluation team should maximise secondary analysis of data collected by the implementers, and joint data collection could be considered where additional data collection is needed and participant research fatigue is anticipated.

3. Like HCD itself, the process evaluation should be iterative and adaptive. In A360, the process evaluation paused after the pilot phase and our team worked with the A360 implementers and donors to develop evaluation questions that reflected the solutions being developed and an iterated A360 Theory of Change.

4. For the process evaluation, methodologies such as direct observation are instrumental in capturing the fast-paced, highly iterative HCD process and in understanding the ‘design energy’, i.e. how decisions were made at key points in the design process.

5. To avoid a time-consuming and resource-intensive design process, future HCD-based initiatives should consider a phased evaluation approach:

    • Conduct the process evaluation during the HCD inspiration, ideation, and pilot phases.

    • Wait until the implementers have a better understanding of the emerging programme and have finalised the target geographies, target population, and intended outcomes before planning an outcome evaluation and cost-effectiveness study.

    • During the implementation phase, conduct a comprehensive process evaluation that can capture whether, how, and why the intervention changed during implementation.

The advantages of a phased approach need to be balanced against the disadvantages of lengthening the time between implementation and the availability of evaluation findings.

Data availability

No data are associated with this article.
