Implementation Science
Understanding and addressing complexity in public health programs
Randomized controlled trials and quasi-experimental methods generate critically important evidence about many innovations, such as which drugs or interventions to use when a woman is referred for obstructed labor. However, these methods cannot provide all of the information needed to implement comprehensive public health interventions in real-life settings. Implementation science can help to fill this gap.
Implementation science is the close study of implementation: the process of introducing, institutionalizing, and sustaining policies, programs, and activities in complex settings. By their very nature, experimental methods fail to answer questions that may be decisive in shaping implementation, because they seek to control the context in which interventions are tried. When interventions are actually implemented, however, local political, social, programmatic, and cultural variables, such as the supply chain and the social class of providers and patients, can be significant.
Our Contribution
AMDD is committed to promoting equity and universal coverage. Achieving meaningful progress in these areas requires acknowledging and understanding deep, systemic problems that can undercut the best-laid programs and policies.
For example, when the system for assigning posts and granting transfers to health workers and administrators is inconsistent or non-transparent, clinics can be left without key staff, which in turn foments apathy among workers and distrust among patients. Other problems that might benefit from implementation science include the high out-of-pocket costs that even the poorest must pay to access “free” services, and the disrespect and abuse directed at poor women while they deliver their babies. Such dynamics ultimately shape the fate of evidence-based clinical interventions and globally endorsed “best practices.”
To understand these dynamics, AMDD has undertaken a series of implementation research projects designed to extract broadly applicable lessons.
AMDD Director Lynn Freedman also authored a comment on implementation and aspiration gaps in the Lancet Maternal Health 2016 Series.
Realist Evaluation
An important implementation science tool
Within the field of implementation science, AMDD uses realist evaluation as a tool to understand what works, for whom, in what settings, and why. Realist evaluation applies the logic of theory-driven inquiry that is common in the social sciences. It is premised on the insight that programs, which are efforts to introduce interventions into a service delivery system, ultimately require individual actors, whether mothers, health providers, program managers, or government administrators, to make conscious decisions to change their behavior.
A realist evaluation approach therefore begins by identifying the explicit or implicit theory of change that underlies each intervention. Recognizing that individual behavior is always embedded in a larger context, realist evaluations then use qualitative methods to test and refine these program theories by exploring the complex interaction among context, mechanisms, and outcomes. The evaluations offer lessons about how particular conditions interact with program mechanisms to generate outcomes.
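In the realist evaluation literature, these interactions are often summarized as context-mechanism-outcome (CMO) configurations: in a given context, a program triggers a mechanism, which produces an outcome. As a minimal illustrative sketch, the Python snippet below represents one such configuration as a simple data structure; the field contents are hypothetical and are not drawn from any AMDD evaluation.

```python
from dataclasses import dataclass

@dataclass
class CMOConfiguration:
    """One context-mechanism-outcome hypothesis within a program theory.

    Illustrative only: the field names and the example below are
    hypothetical, not taken from any AMDD evaluation.
    """
    context: str    # the conditions in which the program operates
    mechanism: str  # the reasoning or response the program triggers in actors
    outcome: str    # the observed or expected result

# A hypothetical hypothesis that a realist evaluation might test and refine.
hypothesis = CMOConfiguration(
    context="clinic with chronic staff vacancies and low patient trust",
    mechanism="community health workers build trusting relationships with mothers",
    outcome="more women accept referral for facility-based delivery",
)

print(f"In {hypothesis.context},\nif {hypothesis.mechanism},\nthen {hypothesis.outcome}.")
```

A realist evaluation would iteratively test and refine a set of such hypotheses against qualitative evidence gathered in the field.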
AMDD has used the realist evaluation approach to understand the maternal, newborn, and child health “Manoshi” program, operated by the nongovernmental organization BRAC in urban slums of Bangladesh. The evaluation revealed several core elements that accounted for Manoshi’s success. One of these was BRAC’s intentional creation of “linking social capital,” or “norms of respect and networks of trusting relationships” between people with different levels of power in society. See the Urban Health section for more on this program.
In another realist evaluation, AMDD worked in partnership with the BRAC School of Public Health and the International Centre for Diarrhoeal Disease Research, Bangladesh (ICDDR,B) to conduct a review of UNICEF’s three major rural maternal and newborn health programs in Bangladesh. The research identified key program drivers and bottlenecks, as well as overarching lessons for implementation.
AMDD and partners found that “implementation support” and “implementation assessment” are indispensable yet often ignored health system functions. Implementation support ensures that health practitioners and administrators have access to knowledgeable support teams who can help them make the needed shifts in individual practice and organizational structure within the healthcare setting. Implementation assessment is the continual process of generating and analyzing data from the field so that findings can feed into policymaking and implementation.
AMDD’s experience in implementation science, including the realist evaluations of the BRAC and UNICEF programs, has shown that even when a health program is designed and implemented to focus only on biological health outcomes, its actual functioning in the lives of its stakeholders (clients, providers, and policymakers) is always a far more complex affair. Implementation science can capture some of this complexity, providing insights that can be harnessed to improve the program under study, scale it up, and address similar maternal health challenges in other settings.
Complex Adaptive Systems
Health systems are complex adaptive systems. Among other attributes, this means that relationships among components can be non-linear: a small change in one part of the system can produce a disproportionately large effect elsewhere, while a major investment can be absorbed with little visible result. As a consequence, interventions do not always have their intended effects.
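As a minimal sketch of what non-linearity can mean in practice, the toy model below assumes, purely for illustration, that facility utilization responds to staffing levels through a threshold effect, so that the same intervention (adding two staff) has very different effects depending on the starting context. The function and all parameter values are hypothetical; they are not derived from real data or from AMDD’s research.

```python
import math

def utilization(staff_count: float, threshold: float = 5.0, steepness: float = 1.5) -> float:
    """Toy logistic model: fraction of women using a facility as staffing grows.

    Hypothetical parameters for illustration only; a real health system's
    response is shaped by many interacting social and political factors.
    """
    return 1.0 / (1.0 + math.exp(-steepness * (staff_count - threshold)))

# The same intervention (+2 staff) has very different effects depending on context.
for staff in (1, 4, 8):
    before, after = utilization(staff), utilization(staff + 2)
    print(f"staff {staff} -> {staff + 2}: utilization {before:.2f} -> {after:.2f}")
```

Running this sketch shows the added staff barely moving utilization at very low or very high starting levels, but producing a large jump near the threshold. In a real health system, many such interacting feedback loops operate at once, which is precisely why implementation cannot be read off from an intervention’s design alone.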