Original Research

Synthesizing diverse evidence: the use of primary qualitative data analysis methods and logic models in public health reviews
Introduction
Public health policy is increasingly based on summaries of information collated through systematic reviews of the literature.1 Systematic review methods developed by the Cochrane Collaboration2 and the National Institute for Health and Clinical Excellence (NICE)3 have explored questions regarding the effectiveness of clinical interventions, and have consequently given preference to quantitative studies. Public health, however, may pose particular challenges to the conventional systematic review method due to the nature of the evidence available and the complexity of the interventions.4, 5
A systematic review endeavours to use transparent and replicable methods to identify, evaluate and interpret available evidence to address a research question. A review will define inclusion and exclusion criteria, include an examination of study quality, and will often synthesize findings into evidence statements.5, 6 The quality of the evidence included is assessed according to the study design, conduct and analysis.1 Reviewers set the minimum quality standard for evidence that will be considered, based on the conventional hierarchy of design that places experimental studies and, in particular, randomized controlled trials at the top. These study design hierarchies, however, are problematic in areas of research such as public health, with its preponderance of non-trial evidence exploring wider issues such as how interventions work, patients' experiences, or how public health can be improved and health inequalities reduced.7, 8 In addition, many topics lack research of sufficient quality or quantity to support a meaningful systematic review.9
In recognition of these limitations, there has been increasing interest in developing review methods to incorporate diverse types of evidence, including qualitative research.7, 10, 11 Conventional systematic reviews have been criticized on a number of grounds, including: they lack context for social interventions12; they are of limited use to policy makers, practitioners and other groups due to the lack of studies available8; they exclude important work12; and they lack consideration of feasibility and implementation. Widening the types of evidence included in a review may help to overcome these criticisms.
As the potential for different types of evidence to make a contribution to a review has been explored, methods for the synthesis of qualitative research have expanded.13 Approaches such as ‘qualitative meta-synthesis’14 are being increasingly applied in a wide variety of areas.15, 16 Researchers in the area caution, however, that approaches to qualitative synthesis of secondary research need to be further developed to be just as explicit as methods in primary research,9 and that forms of data extraction used for this type of study require further improvement and evaluation.10, 11 Whilst it is argued that the benefit of including diverse study types in a review is to provide context for interventions and explanations for their effects,17 the integration of different types of data in the same review remains a key challenge.17 In some reviews, different types of evidence are given different weighting or are used to answer different sub-questions. Alternatively, it has been suggested that qualitative evidence could be used to refocus the outcome of the quantitative synthesis.18
In addition to these challenges associated with the incorporation of diverse evidence types, public health reviews examine interventions that are often complex. This may be associated with the characteristics of the intervention or study populations, or may be a result of examining multi-factorial outcomes rather than a causal chain between an agent and an outcome that is relatively short and simple.4, 19 There may be long and complex causal pathways that are subject to effect modifications and variation between settings, thus creating considerable challenges for reviews to link public health interventions to outcomes.19
It has been suggested that conceptual models (logic models) could prove useful by providing a structure for exploring these complex relationships between public health practice and outcomes.20 Logic models (also known as impact models) originate from the field of programme evaluation, and are typically diagrams or flow charts that convey relationships between contextual factors, inputs, processes and outcomes.21 It is argued that logic models are valuable in providing a ‘roadmap’ to illustrate influential relationships and components from inputs to outcomes.20, 22 These models have been used widely in the health promotion literature to identify domains underlying best practice.23, 24, 25
The work outlined in this paper aimed to pilot a new approach to systematic review of the evidence, which had the potential to overcome these issues of study design hierarchies, limited available evidence and complex causal pathways. The method was developed with the objective of drawing on acknowledged systematic review processes, yet enabling diverse sources of evidence to be examined and synthesized, to develop an improved understanding of the processes and outcomes underpinning a complex area of public health.
Methods
The approach described in this paper was developed following an earlier phase of work using a conventional systematic review methodology. This review had the purpose of examining evidence relating to interventions to improve employee mental well-being in the workplace. The review identified that there was ‘insufficient evidence’ of organization-wide approaches to promoting mental well-being, and suggested that useful evidence may have been excluded because of the narrow focus of the original review.
Results
A revised logic model (Fig. 3) was built by examining the coded data to identify core elements of the workplace and associations between elements in an iterative process. The review findings further developed and expanded the initial model, suggesting a distinction between elements of work context, work content and individual factors. Examination of the data also highlighted where authors reported that stronger potential associations between causative elements and outcomes may be found.
Conclusions and recommendations
In contrast to systematic reviews that offer evidence statements, or meta-analyses of quantitative data that give pooled effect sizes, the logic framework does not offer ready answers to questions of where best practice is to be found. Work aiming to develop specific guidance may benefit from a narrower focus than that described here. However, the wider focus did provide a method of illuminating complex pathways within public health, which may then be further examined via other methods.
Acknowledgements
The authors would like to thank Simon Pickvance who contributed to the development of the revised framework.
References (35)
- et al. Factors affecting uptake of childhood immunisation: a Bayesian synthesis of qualitative and quantitative evidence. Lancet (2002)
- et al. Systematic evidence to support evidence-based medicine (2003)
- The guidelines manual (2009)
- et al. Guidelines for Systematic Reviews of Health Promotion and Public Health Interventions Taskforce. The challenges of systematically reviewing public health interventions. J Public Health (2004)
- et al. Evidence-based public health practice: improving the quality and quantity of the evidence. J Public Health Med (2002)
- et al. Systematic reviews in health care (2001)
- et al. Synthesising quantitative and qualitative research in evidence-based patient information. J Epidemiol Commun Health (2007)
- et al. An emerging framework for including different types of evidence in systematic reviews for public policy. Evaluation (2005)
- Systematic research synthesis to inform the development of policy and practice in education