Changing health care provider performance through measurement
Introduction
In developing and transition countries, the quality of health care services is generally low, in large part because adherence to medical protocols is low (Das and Gertler, 2007, Das and Hammer, 2007, Holloway et al., 2013, Leonard and Masatu, 2007, Rowe et al., 2005), and increasing adherence is one of the most effective ways to improve outcomes and prevent childhood deaths (Black et al., 2003, Jones et al., 2003, Rowe et al., 2005). Many types of interventions to improve quality have been tested in developing countries with somewhat positive results, including improved supervision, additional training, and interventions to change the workplace culture (such as institutional and management changes and group-based techniques). Yet few of these techniques have lasting impacts or have been rolled out effectively at scale, and low adherence remains. Importantly, there is growing evidence of a know-do gap (Das and Hammer, 2007, Leonard and Masatu, 2010, Rethans et al., 1991) in which clinicians demonstrate that they know how to adhere to protocol but choose not to do so. Indeed, it is possible that the short-lived benefits of a wide variety of interventions and the demonstrated know-do gap are connected and reflect a basic Hawthorne effect: when faced with the immediate attention and scrutiny inherent in any intervention, clinicians improve adherence, but adherence falls as the attention diminishes.
This paper examines a specific program designed to extend a short-term Hawthorne effect into the medium term by maintaining scrutiny and attention without any training, explicit supervision, institutional reforms, or external rewards. The study encouraged clinicians to adhere to protocol and then returned about two weeks later to see whether quality had improved. To test for the medium-term impact, we designed the intervention with a follow-up about 6 weeks after the original encouragement. As we show in this paper, we observed the opposite of what we expected: quality was only marginally higher in the short term (the two-week window) but significantly higher in the medium term (the 8-week window). Prompted by this surprising finding, we returned almost a year and a half later to visit the same clinicians and found that quality was still higher than at baseline, though slightly lower than at the medium-term visit.
Using these data on the response to the program in the short, medium, and long term, this paper shows that the know-do gap is closed with increased and sustained attention and scrutiny, and that some forms of measurement, by themselves, can lead to long-term improvements in quality.
Methods
The data in this paper come from 4512 patient exit interviews conducted in two different periods and over three samples of clinicians. During a four-hour window on a randomly selected, unannounced date, members of the enumeration team asked all the patients who had visited a particular clinician a series of questions about their consultation, based on the symptoms that they reported. The interviews with patients followed the Retrospective Consultation Review (RCR) instrument, which allows us to
Results
In this section, we first examine the three samples to establish that they are similar at baseline. We then turn to the treated sample observed at least 18 months after they completed the initial study (the treated sample at the follow-up visit) and compare protocol adherence at this point to the baseline and comparison samples. In other words, clinicians treated in Period 1 and observed in Period 2 are compared to untreated clinicians in both Period 1
Discussion
Using a simple intervention in which health workers were told how they were expected to improve, were encouraged to improve, and then received regular visits to measure quality, we show that clinicians react both to direct observation and to the scrutiny implied by having quality repeatedly measured. Clinicians improved protocol adherence immediately when someone entered the room, without any new training, equipment, or incentives, and, as soon as the peer left, they returned to normal—a classic
Acknowledgments
This work was funded by a Maryland Agricultural Extension Station seed grant, a contract from the Human Resources for Health Group of the World Bank (funded in part by the Government of Norway), and the Eunice Kennedy Shriver National Center for Child Health and Human Development grant R24-HD041041 through the Maryland Population Research Center.
This study was reviewed and approved by the Institutional Review Board at the University of Maryland: 08-457.
References (25)
- et al., Effect on maternal and child health services in Rwanda of payment to primary health-care providers for performance: an impact evaluation, Lancet (2011)
- et al., Where and why are 10 million children dying every year?, Lancet (2003)
- Assessing the impact of planned social change, Eval. Program Plan. (1979)
- et al., Money for nothing: the dire straits of medical practice in Delhi, India, J. Dev. Econ. (2007)
- et al., How many child deaths can we prevent this year?, Lancet (2003)
- et al., Outpatient process quality evaluation and the Hawthorne Effect, Soc. Sci. Med. (2006)
- et al., How can we achieve and maintain high-quality performance of health workers in low-resource settings?, Lancet (2005)
- et al., Intrinsic motivations and the non-profit health sector: evidence from Ethiopia, Person. Individ. Differ. (2011)
- et al., Primary health care supervision in developing countries: supervision of health services in developing countries, Trop. Med. Int. Health (2008)
- et al., Generosity and prosocial behavior in health-care provision: evidence from the laboratory and field, J. Hum. Resour. (2016)