
Synthesising quantitative and qualitative evidence to inform guidelines on complex interventions: clarifying the purposes, designs and outlining some methods
Jane Noyes,1 Andrew Booth,2 Graham Moore,3 Kate Flemming,4 Özge Tunçalp,5 Elham Shakibazadeh6

1School of Social Sciences, Bangor University, Wales, UK
2School of Health and Related Research (ScHARR), University of Sheffield, Sheffield, UK
3School of Social Sciences, Cardiff University, Wales, UK
4Department of Health Sciences, The University of York, York, UK
5Department of Reproductive Health and Research including UNDP/UNFPA/UNICEF/WHO/World Bank Special Programme of Research, Development and Research Training in Human Reproduction (HRP), World Health Organization, Geneva, Switzerland
6Department of Health Education and Promotion, School of Public Health, Tehran University of Medical Sciences, Tehran, Iran

Correspondence to Professor Jane Noyes; jane.noyes{at}bangor.ac.uk

Abstract

Guideline developers are increasingly dealing with more difficult decisions concerning whether to recommend complex interventions in complex and highly variable health systems. There is greater recognition that both quantitative and qualitative evidence can be combined in a mixed-method synthesis and that this can be helpful in understanding how complexity impacts on interventions in specific contexts. This paper aims to clarify the different purposes, review designs, questions, synthesis methods and opportunities to combine quantitative and qualitative evidence to explore the complexity of complex interventions and health systems. Three case studies of guidelines developed by WHO, which incorporated quantitative and qualitative evidence, are used to illustrate possible uses of mixed-method reviews and evidence. Additional examples of methods that can be used or may have potential for use in a guideline process are outlined. Consideration is given to the opportunities for potential integration of quantitative and qualitative evidence at different stages of the review and guideline process. Encouragement is given to guideline commissioners and developers and review authors to consider including quantitative and qualitative evidence. Recommendations are made concerning the future development of methods to better address questions in systematic reviews and guidelines that adopt a complexity perspective.

  • health systems
  • systematic review
  • qualitative study
  • randomised controlled trial
https://creativecommons.org/licenses/by-nc/3.0/igo/

This is an open access article distributed under the terms of the Creative Commons Attribution IGO License (CC BY-NC 3.0 IGO), which permits use, distribution, and reproduction in any medium, provided the original work is properly cited. In any reproduction of this article there should not be any suggestion that WHO or this article endorse any specific organization or products. The use of the WHO logo is not permitted. This notice should be preserved along with the article’s original URL. Disclaimer: The author is a staff member of the World Health Organization. The author alone is responsible for the views expressed in this publication and they do not necessarily represent the views, decisions or policies of the World Health Organization.


Summary box

  • When combined in a mixed-method synthesis, quantitative and qualitative evidence can potentially contribute to understanding how complex interventions work and for whom, and how the complex health systems into which they are implemented respond and adapt.

  • The different purposes and designs for combining quantitative and qualitative evidence in a mixed-method synthesis for a guideline process are described.

  • Questions that mixed-method syntheses can address, relevant to understanding the complexity of complex interventions and the wider health systems within which they are implemented, are presented.

  • The practical methodological guidance in this paper is intended to help guideline producers and review authors commission and conduct mixed-method syntheses where appropriate.

  • If more mixed-method syntheses are conducted, guideline developers will have greater opportunities to access this evidence to inform decision-making.

Introduction

Recognition has grown that while quantitative methods remain vital, they are usually insufficient to address research questions related to complex health systems.1 Quantitative methods rely on an ability to anticipate what must be measured in advance. Introducing change into a complex health system gives rise to emergent reactions, which cannot be fully predicted in advance and can often only be understood by combining quantitative methods with a more flexible qualitative lens.2 Adopting a more pluralist position opens up a diverse range of research options to the researcher, depending on the research question being investigated.3–5 As a consequence, where a research study sits within the multitude of methods available is driven by the question being asked, rather than by any particular methodological or philosophical stance.6

Publication of guidance on designing complex intervention process evaluations and other works advocating mixed-methods approaches to intervention research have stimulated better quality evidence for synthesis.1 7–13 Methods for synthesising qualitative14 and mixed-method evidence have been developed or are in development. Mixed-method research and review definitions are outlined in box 1.

Box 1

Defining mixed-method research and reviews

Pluye and Hong52 define mixed-methods research as “a research approach in which a researcher integrates (a) qualitative and quantitative research questions, (b) qualitative research methods* and quantitative research designs, (c) techniques for collecting and analyzing qualitative and quantitative evidence, and (d) qualitative findings and quantitative results”. A mixed-method synthesis can integrate quantitative, qualitative and mixed-method evidence or data from primary studies.† Mixed-method primary studies are usually disaggregated into quantitative and qualitative evidence and data for the purposes of synthesis. Thomas and Harden further define three ways in which reviews are mixed.53

  1. The types of studies included and hence the type of findings to be synthesised (ie, qualitative/textual and quantitative/numerical).

  2. The types of synthesis method used (eg, statistical meta-analysis and qualitative synthesis).

  3. The mode of analysis: theory testing AND theory building.

  • *A qualitative study is one that uses qualitative methods of data collection and analysis to produce a narrative understanding of the phenomena of interest. Qualitative methods of data collection may include, for example, interviews, focus groups, observations and analysis of documents.

  • †The Cochrane Qualitative and Implementation Methods group coined the term ‘qualitative evidence synthesis’ to mean that the synthesis could also include qualitative data. For example, qualitative data from case studies, grey literature reports and open-ended questions from surveys. ‘Evidence’ and ‘data’ are used interchangeably in this paper.

This paper is one of a series that aims to explore the implications of complexity for systematic reviews and guideline development, commissioned by WHO. This paper is concerned with the methodological implications of including quantitative and qualitative evidence in mixed-method systematic reviews and guideline development for complex interventions. The guidance was developed through a process of bringing together experts in the field, literature searching and consensus building with end users (guideline developers, clinicians and reviewers). We clarify the different purposes, review designs, questions and synthesis methods that may be applicable to combine quantitative and qualitative evidence to explore the complexity of complex interventions and health systems. Three case studies of WHO guidelines that incorporated quantitative and qualitative evidence are used to illustrate possible uses of mixed-method reviews and mechanisms of integration (table 1, online supplementary files 1–3). Additional examples of methods that can be used or may have potential for use in a guideline process are outlined. Opportunities for potential integration of quantitative and qualitative evidence at different stages of the review and guideline process are presented. Specific considerations when using an evidence to decision framework such as the Developing and Evaluating Communication strategies to support Informed Decisions and practice based on Evidence (DECIDE) framework15 or the new WHO-INTEGRATE evidence to decision framework16 at the review design and evidence to decision stage are outlined. See online supplementary file 4 for an example of a health systems DECIDE framework and Rehfuess et al16 for the new WHO-INTEGRATE framework. Encouragement is given to guideline commissioners and developers and review authors to consider including quantitative and qualitative evidence in guidelines of complex interventions that take a complexity perspective and health systems focus.


Table 1

Designs and methods and their use or applicability in guidelines and systematic reviews taking a complexity perspective

Taking a complexity perspective

The first paper in this series17 outlines aspects of complexity associated with complex interventions and health systems that can potentially be explored by different types of evidence, including synthesis of quantitative and qualitative evidence. Petticrew et al17 distinguish between a complex interventions perspective and a complex systems perspective. A complex interventions perspective defines interventions as having “implicit conceptual boundaries, representing a flexible, but common set of practices, often linked by an explicit or implicit theory about how they work”. A complex systems perspective differs in that “complexity arises from the relationships and interactions between a system’s agents (eg, people, or groups that interact with each other and their environment), and its context. A system perspective conceives the intervention as being part of the system, and emphasises changes and interconnections within the system itself”. Aspects of complexity associated with implementation of complex interventions in health systems that could potentially be addressed with a synthesis of quantitative and qualitative evidence are summarised in table 2. Another paper in the series outlines criteria used in a new evidence to decision framework for making decisions about complex interventions implemented in complex systems, against which the need for quantitative and qualitative evidence can be mapped.16 A further paper18 that explores how context is dealt with in guidelines and reviews taking a complexity perspective also recommends using both quantitative and qualitative evidence to better understand context as a source of complexity. Mixed-method syntheses of quantitative and qualitative evidence can also help with understanding whether there has been theory failure and/or implementation failure. The Cochrane Qualitative and Implementation Methods Group provide additional guidance on exploring implementation and theory failure that can be adapted to address aspects of complexity of complex interventions when implemented in health systems.19

Table 2

Health-system complexity-related questions that a synthesis of quantitative and qualitative evidence could address (derived from Petticrew et al17)

It may not be apparent which aspects of complexity or which elements of the complex intervention or health system can be explored in a guideline process, or whether combining qualitative and quantitative evidence in a mixed-method synthesis will be useful, until the available evidence is scoped and mapped.17 20 A more extensive lead-in phase is typically required to scope the available evidence, engage with stakeholders and refine the review parameters and questions, which can then be mapped against potential review designs and methods of synthesis.20 At the scoping stage, it is also common to decide on a theoretical perspective21 or undertake further work to refine one.22 This is also the stage at which to begin articulating the programme theory of the complex intervention, which may be further developed to refine an understanding of complexity and show how the intervention is implemented in, and impacts on, the wider health system.17 23 24 In practice, this process can be lengthy, iterative and fluid, with multiple revisions to the review scope and often the development and adaptation of a logic model17 as the available evidence becomes known and the potential to incorporate different types of review designs and syntheses of quantitative and qualitative evidence becomes better understood.25 Further questions, propositions or hypotheses may emerge as the reviews progress, and the protocols therefore generally need to be developed iteratively over time rather than fixed a priori.

Following a scoping exercise and definition of key questions, the next step in the guideline development process is to identify existing, or commission new, systematic reviews to locate and summarise the best available evidence in relation to each question. For example, case study 2, ‘Optimising health worker roles for maternal and newborn health through task shifting’, included quantitative reviews that did and did not take an additional complexity perspective, and qualitative evidence syntheses that were able to explain how specific elements of complexity impacted on intervention outcomes within the wider health system. Further understanding of health system complexity was facilitated through the conduct of additional country-level case studies that contributed to an overall understanding of what worked and what happened when lay health worker interventions were implemented. See table 1 and online supplementary file 2.

There are a few existing examples, which we draw on in this paper, but integrating quantitative and qualitative evidence in a mixed-method synthesis remains relatively uncommon in a guideline process. Box 2 sets out key questions that guideline developers and review authors contemplating combining quantitative and qualitative evidence in a mixed-methods design might ask. Subsequent sections provide more information, and signposting to further reading, to help address these key questions.

Box 2

Key questions that guideline developers and review authors contemplating combining quantitative and qualitative evidence in a mixed-methods design might ask

  1. WHY: Why is a mixed-method synthesis being planned? To answer

    Compound questions requiring both quantitative and qualitative evidence?

    Questions requiring mixed-methods studies?

    Separate quantitative and qualitative questions?

  2. WHAT: What type of evidence is likely to be available?

    Separate quantitative and qualitative research studies?

    Related quantitative and qualitative research studies?

    Mixed-methods studies?

    Quantitative unpublished data and/or qualitative unpublished data, eg, narrative survey data?

  3. WHEN: At what point will quantitative and qualitative evidence be integrated?

    Throughout the review?

    Following separate reviews?

    At the question point?

    At the synthesis point?

    At the evidence to recommendations stage?

    Or a combination?

  4. HOW: How easy is it to disaggregate quantitative and qualitative data from mixed-method studies? How will quantitative and qualitative evidence be integrated? Through a:

    Narrative synthesis or summary?

    Quantitising approach, eg, frequency analysis (see the sketch after this box)?

    Qualitising approach, eg, thematic synthesis?

    Tabulation?

    Logic model?

    Conceptual model/framework?

    Matrix?

    Graphical approach?

    Or a combination?

  5. WHICH: Which mixed-method designs, methodologies and methods best fit into a guideline process to inform recommendations?
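To make the quantitising option flagged under ‘HOW’ above more concrete, below is a minimal, purely illustrative Python sketch of a frequency analysis. The studies and themes are hypothetical; the point is only that coded qualitative themes become countable units that can be tabulated alongside quantitative results.

```python
from collections import Counter

# Hypothetical coding of qualitative themes across five primary studies;
# each study is tagged with the barrier/facilitator themes it reported.
study_themes = {
    "Study A": ["staff workload", "community trust"],
    "Study B": ["staff workload", "training gaps"],
    "Study C": ["community trust"],
    "Study D": ["staff workload", "training gaps", "community trust"],
    "Study E": ["training gaps"],
}

# Quantitising: convert the qualitative codes into frequencies (how many
# studies reported each theme) so that they can sit next to quantitative
# outcome data in a mixed-method synthesis.
theme_counts = Counter(theme for themes in study_themes.values() for theme in themes)

for theme, n in theme_counts.most_common():
    print(f"{theme}: reported in {n} of {len(study_themes)} studies")
```

The same tabulation could equally be produced by hand in a spreadsheet; the technique, not the tooling, is what matters.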

Complexity-related questions that a synthesis of quantitative and qualitative evidence can potentially address

Petticrew et al17 define the different aspects of complexity, and examples of complexity-related questions, that can potentially be explored in guidelines and systematic reviews taking a complexity perspective. Relevant aspects of complexity outlined by Petticrew et al17 are summarised in table 2, together with the corresponding questions that could be addressed in a synthesis combining qualitative and quantitative evidence. Importantly, however, these aspects of complexity and their associated concepts of interest have yet to be fully translated into primary health research or systematic reviews. There are few known examples where selected complexity concepts have been used to analyse or reanalyse a primary intervention study. Most notable is Chandler et al,26 who specifically set out to identify and translate a set of relevant complexity theory concepts for application in health systems research, and then reanalysed a trial process evaluation using selected complexity theory concepts to better understand the complex causal pathway in the health system underlying some of the aspects of complexity in table 2.

Rehfuess et al16 also recommend upfront consideration of the WHO-INTEGRATE evidence to decision criteria when planning a guideline and formulating questions. The criteria reflect WHO norms and values and take account of a complexity perspective. The framework can be used by guideline development groups as a menu for deciding which criteria to prioritise, and which study types and synthesis methods can be used to collect evidence for each criterion. Many of the criteria and their related questions can be addressed using a synthesis of quantitative and qualitative evidence: the balance of benefits and harms, human rights and sociocultural acceptability, health equity, societal implications and feasibility (see table 3). Similar aspects of the DECIDE framework15 could also be addressed using a synthesis of qualitative and quantitative evidence.

Table 3

WHO-INTEGRATE evidence to decision framework criteria, example questions and types of studies that could potentially address these questions (derived from Rehfuess et al16)

Questions as anchors or compasses

Questions can serve as an ‘anchor’ by articulating the specific aspects of complexity to be explored (eg, Is successful implementation of the intervention context dependent?).27 Anchor questions such as “How does intervention x impact on socioeconomic inequalities in health behaviour/outcome x” are the kind of health system question that requires a synthesis of both quantitative and qualitative evidence and hence a mixed-method synthesis. Quantitative evidence can quantify the difference in effect, but does not answer the question of how. The ‘how’ question can be partly answered with quantitative and qualitative evidence. For example, quantitative evidence may reveal where socioeconomic status and inequality emerges in the health system (an emergent property) by exploring questions such as “Does patterning emerge during uptake because fewer people from certain groups come into contact with an intervention in the first place?” or “are people from certain backgrounds more likely to drop out, or to maintain effects beyond an intervention differently?” Qualitative evidence may help understand the reasons behind all of these mechanisms. Alternatively, questions can act as ‘compasses’ where a question sets out a starting point from which to explore further and to potentially ask further questions or develop propositions or hypotheses to explore through a complexity perspective (eg, What factors enhance or hinder implementation?).27 Other papers in this series provide further guidance on developing questions for qualitative evidence syntheses and guidance on question formulation.14 28

For anchor and compass questions, additional application of a theory (eg, complexity theory) can help focus evidence synthesis and presentation to explore and explain complexity issues.17 21 Development of a review specific logic model(s) can help to further refine an initial understanding of any complexity-related issues of interest associated with a specific intervention, and if appropriate the health system or section of the health system within which to contextualise the review question and analyse data.17 23–25 Specific tools are available to help clarify context and complex interventions.17 18

Even where guideline developers deem a complexity perspective, and certain criteria within evidence to decision frameworks, relevant and desirable, that perspective can only be pursued if the evidence is available. Careful scoping using knowledge maps or scoping reviews will help inform the development of questions that are answerable with the available evidence.20 If evidence of effect is not available, then a different approach is required to develop questions leading to a more general narrative understanding of what happened when complex interventions were implemented in a health system (as in case study 3—risk communication guideline). This does not mean that the original questions, for which no evidence was found when scoping the literature, were unimportant. An important function of creating a knowledge map is also to identify gaps to inform a future research agenda.

Table 2 and online supplementary files 1–3 outline examples of questions in the three case studies, which were all ‘compass’ questions for the qualitative evidence syntheses.

Types of integration and synthesis designs in mixed-method reviews

The shift towards integration of qualitative and quantitative evidence in primary research has, in recent years, begun to be mirrored within research synthesis.29–31 The natural extension to undertaking quantitative or qualitative reviews has been the development of methods for integrating qualitative and quantitative evidence within reviews, and within the guideline process using evidence to decision-frameworks. Advocating the integration of quantitative and qualitative evidence assumes a complementarity between research methodologies, and a need for both types of evidence to inform policy and practice. Below, we briefly outline the current designs for integrating qualitative and quantitative evidence within a mixed-method review or synthesis.

One of the early approaches to integrating qualitative and quantitative evidence, detailed by Sandelowski et al,32 advocated three basic review designs: segregated, integrated and contingent designs, which have been further developed by Heyvaert et al33 (box 3).

Box 3

Segregated, integrated and contingent designs32 33

Segregated design

A conventional separation between quantitative and qualitative approaches, based on the assumption that they are different entities, can be distinguished from each other and should be treated separately, and that their findings warrant separate analyses and syntheses. Ultimately, the separate synthesis results can themselves be synthesised.

Integrated design

The methodological differences between qualitative and quantitative studies are minimised, as both are viewed as producing findings that can be readily synthesised into one another because they address the same research purposes and questions. Transformation involves either turning qualitative data into quantitative data (quantitising) or quantitative findings into qualitative findings (qualitising) to facilitate their integration.

Contingent design

Takes a cyclical approach to synthesis, with the findings from one synthesis informing the focus of the next synthesis, until all the research objectives have been addressed. Studies are not necessarily grouped and categorised as qualitative or quantitative.

A recent review of more than 400 systematic reviews34 combining quantitative and qualitative evidence identified two main synthesis designs—convergent and sequential. In a convergent design, qualitative and quantitative evidence is collated and analysed in a parallel or complementary manner, whereas in a sequential synthesis, the collation and analysis of quantitative and qualitative evidence takes place in a sequence with one synthesis informing the other (box 4).6 These designs can be seen to build on the work of Sandelowski et al,32 35 particularly in relation to the transformation of data from qualitative to quantitative (and vice versa) and the sequential synthesis design, with a cyclical approach to reviewing that evokes Sandelowski’s contingent design.

Box 4

Convergent and sequential synthesis designs34

Convergent synthesis design

Qualitative and quantitative research is collected and analysed at the same time in a parallel or complementary manner. Integration can occur at three points:

a. Data-based convergent synthesis design

All included studies are analysed using the same methods and the results presented together. As only one synthesis method is used, data transformation occurs (qualitising or quantitising). Usually addresses one review question.

b. Results-based convergent synthesis design

Qualitative and quantitative data are analysed and presented separately but integrated using a further synthesis method, eg, narratively, through tables or matrices, or by reanalysing evidence (a matrix sketch follows this box). The results of both syntheses are combined in a third synthesis. Usually addresses an overall review question with subquestions.

c. Parallel-results convergent synthesis design

Qualitative and quantitative data are analysed and presented separately, with integration occurring in the interpretation of results in the discussion section. Usually addresses two or more complementary review questions.

Sequential synthesis design

A two-phase approach: data collection and analysis of one type of evidence (eg, qualitative) occurs after, and is informed by, the collection and analysis of the other type (eg, quantitative). Usually addresses an overall question with subquestions, with both syntheses complementing each other.
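As a concrete illustration of the matrix option in a results-based convergent synthesis design, the sketch below juxtaposes the outputs of separate quantitative and qualitative syntheses by intervention component, so that a third, combined synthesis can look for agreement, dissonance or gaps. Every component, effect estimate and finding here is invented for illustration and does not come from any of the case studies; the same table could equally be drawn by hand.

```python
# Hypothetical cross-study synthesis matrix: rows are intervention
# components, columns juxtapose the separate quantitative and qualitative
# syntheses so a further combined synthesis can compare them.
matrix = [
    ("Lay health worker home visits", "RR 0.85 (95% CI 0.74 to 0.97)",
     "Acceptable where supervision and referral routes are clear"),
    ("Text message reminders", "RR 0.98 (95% CI 0.90 to 1.07)",
     "Shared phones raised privacy concerns that reduced uptake"),
    ("Community engagement meetings", "No trials identified",
     "Seen as building trust; evidence of effect is a gap"),
]

header = ("Component", "Quantitative synthesis", "Qualitative synthesis")
rows = [header] + matrix

# Align columns for a readable plain-text display of the matrix.
widths = [max(len(row[i]) for row in rows) for i in range(3)]
for row in rows:
    print("  ".join(cell.ljust(width) for cell, width in zip(row, widths)))
```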

The three case studies (table 1, online supplementary files 1–3) illustrate the diverse combination of review designs and synthesis methods that were considered the most appropriate for specific guidelines.

Methods for conducting mixed-method reviews in the context of guidelines for complex interventions

In this section, we draw on examples where specific review designs and methods have been or can be used to explore selected aspects of complexity in guidelines or systematic reviews. We also identify other review methods that could potentially be used to explore aspects of complexity. Of particular note, we could not find any specific examples of systematic methods to synthesise highly diverse research designs as advocated by Petticrew et al17 and summarised in tables 2 and 3. For example, we could not find examples of methods to synthesise qualitative studies, case studies, quantitative longitudinal data, possibly historical data, effectiveness studies providing evidence of differential effects across different contexts, and system modelling studies (eg, agent-based modelling) to explore system adaptivity.

There are different ways that quantitative and qualitative evidence can be integrated into a review and then into a guideline development process. In practice, some methods enable integration of different types of evidence in a single synthesis, while in other methods, the single systematic review may include a series of stand-alone reviews or syntheses that are then combined in a cross-study synthesis. Table 1 provides an overview of the characteristics of different review designs and methods and guidance on their applicability for a guideline process. Designs and methods that have already been used in WHO guideline development are described in part A of the table. Part B outlines a design and method that can be used in a guideline process, and part C covers those that have the potential to integrate quantitative, qualitative and mixed-method evidence in a single review design (such as meta-narrative reviews and Bayesian syntheses), but their application in a guideline context has yet to be demonstrated.

Points of integration when integrating quantitative and qualitative evidence in guideline development

Depending on the review design (see boxes 3 and 4), integration can potentially take place at a review team and design level, and more commonly at several key points of the review or guideline process. The following sections outline potential points of integration and associated practical considerations when integrating quantitative and qualitative evidence in guideline development.

Review team level

In a guideline process, it is common for syntheses of quantitative and qualitative evidence to be conducted separately by different teams and the evidence then integrated. A practical consideration relates to the organisation, composition and expertise of the review teams and their ways of working. If the quantitative and qualitative reviews are being conducted separately and then brought together by the same team members, who are equally comfortable operating within both paradigms, then a consistent approach across both paradigms becomes possible. If, however, a team is split between the quantitative and qualitative reviews, then the strengths of specialisation can be harnessed, for example, in quality assessment or synthesis. Optimally, at least one, if not more, of the team members should be involved in both the quantitative and qualitative reviews to offer the possibility of making connections throughout the review and not simply at pre-agreed junctures. This mirrors O’Cathain’s conclusion that mixed-methods primary research tends to work only when there is a principal investigator who values and is able to oversee integration.9 10 While the above decisions have been articulated in the context of two types of evidence, quantitative and qualitative, they apply equally when considering how to handle studies reporting a mixed-method design, where data are usually disaggregated into quantitative and qualitative for the purposes of synthesis (see case study 3—risk communication in humanitarian disasters).

Question formulation

Clearly specified key question(s), derived from a scoping or consultation exercise, will make it clear whether quantitative and qualitative evidence is required in a guideline development process and which aspects will be addressed by which types of evidence. For the remaining stages of the process, as documented below, a review team faces the choice of whether to handle each type of evidence separately, whether sequentially or in parallel, with a view to joining the two products on completion, or to attempt integration throughout the review process. In each case, the underlying trade-off is between efficiency and comparability on the one hand and sensitivity to the underlying paradigm on the other.

Searching

Once key questions are clearly defined, the guideline development group typically needs to consider whether to conduct a single sensitive search to address all potential subtopics (lumping) or whether to conduct specific searches for each subtopic (splitting).36 A related consideration is whether to search separately for qualitative, quantitative and mixed-method evidence ‘streams’ or whether to conduct a single search and then identify specific study types at the subsequent sifting stage. These two considerations often mean a trade-off between a single search process involving very large numbers of records or a more protracted search process retrieving smaller numbers of records. Both approaches have advantages and choice may depend on the respective availability of resources for searching and sifting.

Screening and selecting studies

Closely related to decisions around searching are considerations relating to screening and selecting studies for inclusion in a systematic review. An important consideration here is whether the review team will screen records for all review types, regardless of their subsequent involvement (‘altruistic sifting’), or specialise in screening for the study type with which they are most familiar. The risk of missing relevant reports might be minimised by whole team screening for empirical reports in the first instance and then coding them for a specific quantitative, qualitative or mixed-methods report at a subsequent stage.

Assessment of methodological limitations in primary studies

Within a guideline process, review teams may be more limited in their choice of instruments to assess methodological limitations of primary studies, as there are mandatory requirements to use the Cochrane risk of bias tool37 to feed into Grading of Recommendations Assessment, Development and Evaluation (GRADE)38 or to select from a small pool of qualitative appraisal instruments in order to apply GRADE-CERQual (Confidence in the Evidence from Reviews of Qualitative Research)39 to assess the overall certainty or confidence in findings. The Cochrane Qualitative and Implementation Methods Group has recently issued guidance on the selection of appraisal instruments and core assessment criteria.40 The Mixed-Methods Appraisal Tool, which is currently undergoing further development, offers a single quality assessment instrument for quantitative, qualitative and mixed-methods studies.41 Other options include using corresponding instruments from within the same ‘stable’, for example, the different Critical Appraisal Skills Programme instruments.42 While using instruments developed by the same team or organisation may achieve a degree of epistemological consonance, the benefits may come more from consistency of approach and reporting than from a shared view of quality. Alternatively, a more paradigm-sensitive approach would involve selecting the best instrument for each respective review while deferring the challenges of later heterogeneity in reporting.

Data extraction

The way in which data and evidence are extracted from primary research studies for review will be influenced by the type of integrated synthesis being undertaken and the review purpose. Initially, decisions need to be made regarding the nature and type of data and evidence that are to be extracted from the included studies. Method-specific reporting guidelines43 44 provide a good template as to what quantitative and qualitative data it is potentially possible to extract from different types of method-specific study reports, although in practice reporting quality varies. Online supplementary file 5 provides a hypothetical example of the different types of studies from which quantitative and qualitative evidence could potentially be extracted for synthesis.

The decisions around what data or evidence to extract will be guided by how ‘integrated’ the mixed-method review will be. For those reviews where the quantitative and qualitative findings of studies are synthesised separately and integrated at the point of findings (eg, segregated or contingent approaches or sequential synthesis design), separate data extraction approaches will likely be used.

Where integration occurs during the process of the review (eg, integrated approach or convergent synthesis design), an integrated approach to data extraction may be considered, depending on the purpose of the review. This may involve the use of a data extraction framework, the choice of which needs to be congruent with the approach to synthesis chosen for the review.40 45 The integrative or theoretical framework may be decided on a priori if a pre-developed theoretical or conceptual framework is available in the literature.27 The development of a framework may alternatively arise from the reading of the included studies, in relation to the purpose of the review, early in the process. The Cochrane Qualitative and Implementation Methods Group provide further guidance on extraction of qualitative data, including use of software.40
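As one possible way of operationalising an integrated, framework-based approach to data extraction, the minimal sketch below defines a single extraction record that captures quantitative and qualitative evidence from one study against shared framework domains. The record structure, field names and values are all hypothetical, intended as an illustration rather than a prescribed template.

```python
from dataclasses import dataclass, field

# Hypothetical integrated extraction record: one record per included study,
# with quantitative and qualitative evidence extracted against the same
# a priori framework domains (the domain names below are invented).
@dataclass
class ExtractionRecord:
    study_id: str
    design: str  # eg "RCT", "interview study", "mixed methods"
    quantitative_outcomes: dict = field(default_factory=dict)  # domain -> result
    qualitative_findings: dict = field(default_factory=dict)   # domain -> finding

record = ExtractionRecord(
    study_id="Study A",
    design="mixed methods",
    quantitative_outcomes={"uptake": "62% vs 48%, intervention vs control"},
    qualitative_findings={"acceptability": "participants valued home visits"},
)
print(record)
```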

Synthesis and integration

Relatively few synthesis methods start off being integrated from the beginning, and these methods have generally been subject to less testing and evaluation particularly in a guideline context (see table 1). A review design that started off being integrated from the beginning may be suitable for some guideline contexts (such as in case study 3—risk communication in humanitarian disasters—where there was little evidence of effect), but in general if there are sufficient trials then a separate systematic review and meta-analysis will be required for a guideline. Other papers in this series offer guidance on methods for synthesising quantitative46 and qualitative evidence14 in reviews that take a complexity perspective. Further guidance on integrating quantitative and qualitative evidence in a systematic review is provided by the Cochrane Qualitative and Implementation Methods Group.19 27 29 40 47

Types of findings produced by specific methods

It is highly likely, unless there are well-designed process evaluations, that the primary studies will not themselves seek to address the complexity-related questions required for a guideline process. In that case, review authors will need to configure the available evidence and transform it through the synthesis process to produce explanations, propositions and hypotheses (ie, findings) that were not obvious at the primary study level. It is important that guideline commissioners, developers and review authors are aware that specific methods are intended to produce a type of finding with a specific purpose (such as developing new theory in the case of meta-ethnography).48 Case study 1 (antenatal care guideline) provides an example of how a meta-ethnography was used to develop a new theory as an end product,48 49 as well as a framework synthesis, which produced descriptive and explanatory findings that were more easily incorporated into the guideline process.27 The definitions in box 5 may be helpful when distinguishing the different types of findings.

Box 5

Different levels of findings

Descriptive findings—qualitative evidence-driven translated descriptive themes that do not move beyond the primary studies.

Explanatory findings—may either be at a descriptive or theoretical level. At the descriptive level, qualitative evidence is used to explain phenomena observed in quantitative results, such as why implementation failed in specific circumstances. At the theoretical level, the transformed and interpreted findings that go beyond the primary studies can be used to explain the descriptive findings. The latter description is generally the accepted definition in the wider qualitative community.

Hypothetical or theoretical findings—qualitative evidence-driven transformed themes (or lines of argument) that go beyond the primary studies. Although similar, Thomas and Harden56 distinguish between the purposes of two types of theoretical findings: analytical themes and the product of meta-ethnographies, third-order interpretations.48

Analytical themes are a product of interrogating descriptive themes by placing the synthesis within an external theoretical framework (such as the review question and subquestions) and are considered more appropriate when a specific review question is being addressed (eg, in a guideline or to inform policy).56

Third-order interpretations come from translating studies into one another while preserving the original context and are more appropriate when a body of literature is being explored in and of itself with broader or emergent review questions.48

Bringing mixed-method evidence together in evidence to decision (EtD) frameworks

A critical element of guideline development is the formulation of recommendations by the Guideline Development Group, and EtD frameworks help to facilitate this process.16 The EtD framework can also be used as a mechanism to integrate and display quantitative and qualitative evidence and findings mapped against the EtD framework domains, with hyperlinks to more detailed evidence summaries from contributing reviews (see table 1). It is commonly the EtD framework that enables the findings of the separate quantitative and qualitative reviews to be brought together in a guideline process. Specific challenges when populating the DECIDE evidence to decision framework15 were noted in case study 3 (risk communication in humanitarian disasters), as there was an absence of intervention effect data and the interventions to communicate public health risks were context specific and varied. These problems would not, however, have been addressed by substituting the new INTEGRATE16 evidence to decision framework for the DECIDE framework. A different type of EtD framework needs to be developed for reviews that do not include sufficient evidence of intervention effect.
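By way of illustration of the EtD framework acting as the point of integration, the sketch below collates findings from separate quantitative and qualitative syntheses against a few framework domains. The domain names loosely follow the WHO-INTEGRATE criteria discussed above; every finding is invented, and in a real guideline each entry would hyperlink to a detailed evidence summary.

```python
# Hypothetical collation of mixed-method review findings against evidence
# to decision (EtD) framework domains. Domain names follow the WHO-INTEGRATE
# criteria cited in the text; all findings below are invented.
etd_domains = {
    "Balance of benefits and harms": [
        ("quantitative", "Meta-analysis: RR 0.85 (95% CI 0.74 to 0.97) for outcome x"),
    ],
    "Human rights and sociocultural acceptability": [
        ("qualitative", "Acceptable where delivered by trusted local staff"),
    ],
    "Feasibility": [
        ("quantitative", "Implementation costs varied widely across settings"),
        ("qualitative", "Supervision capacity was the main reported constraint"),
    ],
}

# Display each domain with its contributing quantitative and qualitative
# findings side by side, as an EtD framework table would.
for domain, findings in etd_domains.items():
    print(domain)
    for evidence_type, finding in findings:
        print(f"  [{evidence_type}] {finding}")
```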

Discussion

Mixed-method review and synthesis methods are generally the least developed of all systematic review methods. It is acknowledged that methods for combining quantitative and qualitative evidence are generally poorly articulated.29 50 There are however some fairly well-established methods for using qualitative evidence to explore aspects of complexity (such as contextual, implementation and outcome complexity), which can be combined with evidence of effect (see sections A and B of table 1).14 There are good examples of systematic reviews that use these methods to combine quantitative and qualitative evidence, and examples of guideline recommendations that were informed by evidence from both quantitative and qualitative reviews (eg, case studies 1–3). With the exception of case study 3 (risk communication), the quantitative and qualitative reviews for these specific guidelines have been conducted separately, and the findings subsequently brought together in an EtD framework to inform recommendations.

Other mixed-method review designs have the potential to contribute to understanding of complex interventions and to explore aspects of wider health systems complexity, but have not been sufficiently developed and tested for this specific purpose, or used in a guideline process (section C of table 1). Some methods, such as meta-narrative reviews, also explore different questions from those usually asked in a guideline process. Methods for processing (eg, quality appraisal) and synthesising the highly diverse evidence suggested in tables 2 and 3, which is required to explore specific aspects of health systems complexity (such as system adaptivity) and to populate some sections of the INTEGRATE EtD framework, remain underdeveloped or in need of development.

In addition to the required methodological development mentioned above, there is no GRADE approach38 for assessing confidence in findings developed from combined quantitative and qualitative evidence. Another paper in this series outlines how to deal with complexity when grading different types of quantitative evidence,51 and the GRADE-CERQual approach for qualitative findings is described elsewhere,39 but both approaches apply to method-specific rather than mixed-method findings. An unofficial adaptation of GRADE was used in the risk communication guideline, which reported mixed-method findings. Nor is there a reporting guideline for mixed-method reviews,47 so for now reports will need to conform to the relevant reporting requirements of the respective method-specific guideline. There is a need to further adapt and test DECIDE,15 WHO-INTEGRATE16 and other types of evidence to decision frameworks to accommodate evidence from mixed-method syntheses that do not set out to determine the statistical effects of interventions, and in circumstances where there are no trials.

When conducting quantitative and qualitative reviews that will subsequently be combined, there are specific considerations for managing and integrating the different types of evidence throughout the review process. We have summarised different options for combining qualitative and quantitative evidence in mixed-method syntheses that guideline developers and systematic reviewers can choose from, as well as outlining the opportunities to integrate evidence at different stages of the review and guideline development process.

Review commissioners, authors and guideline developers generally have less experience of combining qualitative and quantitative evidence in mixed-methods reviews. In particular, there is a relatively small group of reviewers who are skilled at undertaking fully integrated mixed-method reviews. Commissioning additional qualitative and mixed-method reviews creates additional cost, and large complex mixed-method reviews generally take more time to complete. Careful consideration therefore needs to be given to which guidelines would benefit most from additional qualitative and mixed-method syntheses. More training is required to develop capacity, and processes need to be developed for preparing the guideline panel to consider and use mixed-method evidence in its decision-making.

Conclusion

This paper has presented how qualitative and quantitative evidence, combined in mixed-method reviews, can help understand aspects of complex interventions and the systems within which they are implemented. There are further opportunities to use these methods, and to further develop the methods, to look more widely at additional aspects of complexity. There is a range of review designs and synthesis methods to choose from depending on the question being asked or the questions that may emerge during the conduct of the synthesis. Additional methods need to be developed (or existing methods further adapted) in order to synthesise the full range of diverse evidence that is desirable to explore the complexity-related questions when complex interventions are implemented into health systems. We encourage review commissioners and authors, and guideline developers to consider using mixed-methods reviews and synthesis in guidelines and to report on their usefulness in the guideline development process.


Footnotes

  • Handling editor Soumyadeep Bhaumik

  • Contributors JN, AB, GM, KF, ÖT and ES drafted the manuscript. All authors contributed to paper development and writing and agreed the final manuscript. Anayda Portela and Susan Norris from WHO managed the series. Helen Smith was series Editor. We thank all those who provided feedback on various iterations.

  • Funding Funding provided by the World Health Organization Department of Maternal, Newborn, Child and Adolescent Health through grants received from the United States Agency for International Development and the Norwegian Agency for Development Cooperation.

  • Disclaimer ÖT is a staff member of WHO. The author alone is responsible for the views expressed in this publication and they do not necessarily represent the decisions or policies of WHO.

  • Competing interests No financial interests declared. JN, AB and ÖT have an intellectual interest in GRADE CERQual; and JN has an intellectual interest in the iCAT_SR tool.

  • Patient consent Not required.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Data sharing statement No additional data are available.

  • Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.