Public Health

Volume 124, Issue 2, February 2010, Pages 99-106

Original Research
Synthesizing diverse evidence: the use of primary qualitative data analysis methods and logic models in public health reviews

https://doi.org/10.1016/j.puhe.2010.01.002

Abstract

Objectives

The nature of public health evidence presents challenges for conventional systematic review processes. There is increasing recognition of the need to include a broader range of work, including observational studies and qualitative research, yet methods for combining such diverse sources remain underdeveloped. The objective of this paper is to report the application of a new approach to reviewing evidence in the public health sphere. The method enables a diverse range of evidence types to be synthesized in order to examine potential relationships between a public health environment and outcomes.

Study design

The study drew on previous work by the National Institute for Health and Clinical Excellence on conceptual frameworks. It applied and further extended this work to the synthesis of evidence relating to one particular public health area: the enhancement of employee mental well-being in the workplace.

Methods

The approach utilized thematic analysis techniques from primary research, together with conceptual modelling, to explore potential relationships between factors and outcomes.

Results

The method enabled a logic framework to be built from a diverse document set that illustrates how elements and associations between elements may impact on the well-being of employees.

Conclusions

Whilst potential criticisms of the approach are recognized, it is suggested that logic models can be a useful way of examining the complexity of relationships between factors and outcomes in public health, and of highlighting potential areas for intervention and further research. The use of techniques from primary qualitative research may also be helpful in synthesizing diverse document types.

Introduction

Public health policy is increasingly based on summaries of information collated through systematic reviews of the literature.1 Systematic review methods developed by the Cochrane Collaboration2 and the National Institute for Health and Clinical Excellence (NICE)3 have explored questions regarding the effectiveness of clinical interventions, and have consequently given preference to quantitative studies. Public health, however, may offer particular challenges to the conventional systematic review method due to the nature of the evidence available and the complexity of the interventions.4, 5

A systematic review endeavours to use transparent and replicable methods to identify, evaluate and interpret available evidence to address a research question. A review will define inclusion and exclusion criteria, include an examination of study quality, and will often synthesize findings into evidence statements.5, 6 The quality of the evidence included is assessed according to the study design, conduct and analysis.1 Reviewers set the minimum quality standard for evidence that will be considered, based on the conventional hierarchy of design that places experimental studies and, in particular, randomized controlled trials at the top. These study design hierarchies, however, are problematic in areas of research such as public health, with its preponderance of non-trial evidence exploring wider issues such as how interventions work, patients' experiences, or how public health can be improved and health inequalities reduced.7, 8 In addition to these issues, many areas of study lack research of sufficient quality or quantity on a topic to contribute to a meaningful systematic review.9
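
To make this screening step concrete, the short sketch below (in Python) shows how explicit inclusion criteria combined with a conventional design hierarchy might be applied to candidate studies; the record fields, design labels and quality cut-off are illustrative assumptions rather than criteria taken from any of the reviews discussed here. The example also illustrates the point made above: a threshold set by design rank admits the trial but excludes the qualitative study.

```python
# Minimal sketch of screening against inclusion criteria and a conventional
# design hierarchy. All field names, design labels and the cut-off are
# illustrative assumptions, not taken from the paper.
from dataclasses import dataclass

# Conventional hierarchy: lower rank = "stronger" design.
DESIGN_HIERARCHY = {
    "randomized controlled trial": 1,
    "controlled trial": 2,
    "cohort study": 3,
    "case-control study": 4,
    "cross-sectional survey": 5,
    "qualitative study": 6,
}

@dataclass
class Study:
    title: str
    design: str
    population_relevant: bool
    outcome_reported: bool

def meets_inclusion_criteria(study: Study, minimum_rank: int = 3) -> bool:
    """Apply explicit criteria: relevant population, reported outcome,
    and a design at or above the minimum quality threshold."""
    rank = DESIGN_HIERARCHY.get(study.design, len(DESIGN_HIERARCHY) + 1)
    return study.population_relevant and study.outcome_reported and rank <= minimum_rank

studies = [
    Study("Workplace RCT of stress management", "randomized controlled trial", True, True),
    Study("Interviews on job control", "qualitative study", True, True),
]
included = [s.title for s in studies if meets_inclusion_criteria(s)]
print(included)  # the qualitative study falls below the design threshold
```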

In recognition of these limitations, there has been increasing interest in developing review methods that incorporate diverse types of evidence, including qualitative research.7, 10, 11 Conventional systematic reviews have been criticized on a number of grounds: they fail to provide context for social interventions12; they are of limited use to policy makers, practitioners and other groups because of the lack of studies available8; they exclude important work12; and they give little consideration to feasibility and implementation. Widening the types of evidence included in a review may help to overcome these criticisms.

As the potential for different types of evidence to make a contribution to a review has been explored, methods for the synthesis of qualitative research have expanded.13 Approaches such as ‘qualitative meta-synthesis’14 are being increasingly applied in a wide variety of areas.15, 16 Researchers in the area caution, however, that approaches to qualitative synthesis of secondary research need to be further developed to be just as explicit as methods in primary research,9 and that forms of data extraction used for this type of study require further improvement and evaluation.10, 11 Whilst it is argued that the benefit of including diverse study types in a review is to provide context for interventions and explanations for their effects,17 the integration of different types of data in the same review remains a key challenge.17 In some reviews, different types of evidence are given different weighting or are used to answer different sub-questions. Alternatively, it has been suggested that qualitative evidence could be used to refocus the outcome of the quantitative synthesis.18

In addition to these challenges associated with the incorporation of diverse evidence types, public health reviews examine interventions that are often complex. This may be associated with the characteristics of the intervention or study populations, or may be a result of examining multi-factorial outcomes rather than a causal chain between an agent and an outcome that is relatively short and simple.4, 19 There may be long and complex causal pathways that are subject to effect modifications and variation between settings, thus creating considerable challenges for reviews to link public health interventions to outcomes.19

It has been suggested that conceptual models (logic models) could prove useful by providing a structure for exploring these complex relationships between public health practice and outcomes.20 Logic models (also known as impact models) originate from the field of programme evaluation, and are typically diagrams or flow charts that convey relationships between contextual factors, inputs, processes and outcomes.21 It is argued that logic models are valuable in providing a ‘roadmap’ to illustrate influential relationships and components from inputs to outcomes.20, 22 These models have been used widely in the health promotion literature to identify domains underlying best practice.23, 24, 25
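
Purely as an illustration of the idea, a logic model of this kind can be represented as a simple directed graph in which each element is assigned to a stage (context, input, process, outcome) and edges record hypothesised influences; tracing edges backwards from an outcome then recovers the 'roadmap' of upstream components. The sketch below is a minimal, hypothetical Python rendering of that structure; the elements and links are invented for illustration and are not taken from any published model.

```python
# Hypothetical sketch of a logic model as a directed graph: nodes carry a
# stage label and edges record hypothesised influences between elements.
from collections import defaultdict

class LogicModel:
    def __init__(self):
        self.stage = {}                # element -> stage label
        self.edges = defaultdict(set)  # element -> downstream elements

    def add_element(self, name: str, stage: str) -> None:
        self.stage[name] = stage

    def add_link(self, upstream: str, downstream: str) -> None:
        self.edges[upstream].add(downstream)

    def pathways_to(self, outcome: str) -> set:
        """Return every upstream element with a path to the given outcome,
        i.e. the 'roadmap' of influences the model makes explicit."""
        upstream = {src for src, dests in self.edges.items() if outcome in dests}
        frontier = set(upstream)
        while frontier:
            frontier = {src for src, dests in self.edges.items()
                        if dests & frontier} - upstream
            upstream |= frontier
        return upstream

# Illustrative elements only, not the model developed in this review.
model = LogicModel()
model.add_element("management style", "context")
model.add_element("job control", "process")
model.add_element("employee well-being", "outcome")
model.add_link("management style", "job control")
model.add_link("job control", "employee well-being")
print(model.pathways_to("employee well-being"))  # {'job control', 'management style'}
```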

The work outlined in this paper aimed to pilot a new approach to systematic review of the evidence, which had the potential to overcome these issues of study design hierarchies, limited available evidence and complex causal pathways. The method was developed with the objective of drawing on acknowledged systematic review processes, yet enabling diverse sources of evidence to be examined and synthesized, to develop an improved understanding of the processes and outcomes underpinning a complex area of public health.

Methods

The approach described in this paper was developed following an earlier phase of work using a conventional systematic review methodology. This review had the purpose of examining evidence relating to interventions to improve employee mental well-being in the workplace. The review identified that there was ‘insufficient evidence’ of organization-wide approaches to promoting mental well-being, and suggested that useful evidence may have been excluded because of the narrow focus of the original review.

Results

A revised logic model (Fig. 3) was built by examining the coded data to identify core elements of the workplace and associations between elements in an iterative process. The review findings further developed and expanded the initial model, suggesting a distinction between elements of work context, work content and individual factors. Examination of the data also highlighted where authors reported that stronger potential associations between causative elements and outcomes may be present.
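
As a purely illustrative sketch of one way coded data of this kind might be interrogated (the documents, codes and counts below are hypothetical, and this is not the authors' procedure), tallying the element-to-outcome pairs reported across coded segments offers a simple means of flagging which associations recur often enough to warrant scrutiny as stronger links in the revised model.

```python
# Hypothetical sketch: tally coded associations between workplace elements
# and outcomes across documents. Document identifiers and codes are invented.
from collections import Counter

# Each coded segment: (document id, causative element, reported outcome)
coded_segments = [
    ("doc01", "job control", "mental well-being"),
    ("doc02", "job control", "mental well-being"),
    ("doc02", "workload", "stress"),
    ("doc03", "social support", "mental well-being"),
]

association_counts = Counter((element, outcome) for _, element, outcome in coded_segments)

# Frequently reported associations are candidates for stronger links in the
# revised logic model (frequency is a prompt for scrutiny, not proof of effect).
for (element, outcome), n in association_counts.most_common():
    print(f"{element} -> {outcome}: reported in {n} coded segments")
```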

Conclusions and recommendations

In contrast to systematic reviews that offer evidence statements, or meta-analysis of quantitative data to give pooled effect sizes, the logic framework does not offer ready answers to questions of where best practice is to be found. Work aiming to develop specific guidance may benefit from having a less broad focus than that described here. However, the wider focus did provide a method of illuminating complex pathways within public health, which may then be further examined via other methods.

Acknowledgements

The authors would like to thank Simon Pickvance who contributed to the development of the revised framework.

References (35)

  • K. Roberts et al. Factors affecting uptake of childhood immunisation: a Bayesian synthesis of qualitative and quantitative evidence. Lancet (2002)
  • K. Khan et al. Systematic reviews to support evidence-based medicine (2003)
  • National Institute for Health and Clinical Excellence. The guidelines manual (2009)
  • N. Jackson et al., Guidelines for Systematic Reviews of Health Promotion and Public Health Interventions Taskforce. The challenges of systematically reviewing public health interventions. J Public Health (2004)
  • E. Waters et al. Evidence-based public health practice: improving the quality and quantity of the evidence. J Public Health Med (2002)
  • P. Glasziou et al. Systematic reviews in health care (2001)
  • M. Goldsmith et al. Synthesising quantitative and qualitative research in evidence-based patient information. J Epidemiol Community Health (2007)
  • S. Oliver et al. An emerging framework for including different types of evidence in systematic reviews for public policy. Evaluation (2005)
  • D. Gough. Systematic research synthesis to inform the development of policy and practice in education
  • M. Dixon-Woods et al. Conducting a critical interpretive synthesis of the literature on access to healthcare by vulnerable groups. BMC Med Res Methodol (2006)
  • M. Dixon-Woods et al. Including qualitative research in systematic reviews: opportunities and problems. J Eval Clin Pract (2001)
  • M. Pettigrew et al. Relevance, rigour and systematic reviews
  • M. Dixon-Woods et al. Synthesising qualitative research: a review of published reports. Qual Res (2007)
  • P. Stern et al. Women's health and the self-care paradox: a model to guide self-care readiness – clash between the client and nurse. Health Care Women Int (1986)
  • D. Walsh et al. Meta-synthesis method for qualitative research: a literature review. J Adv Nurs (2005)
  • M. Dixon-Woods et al. Integrative approaches to qualitative and quantitative evidence (2004)