The Marriage of Research and Policy

Posted 14th June 2017

The idea that public policy design should be informed by evidence is pretty much universal. But what does this ‘evidence’ consist of, and how should it inform the policy design process? Pose these questions and the picture becomes less clear.

In this post our Senior Partner for the Public Sector, Gulzar Natarajan, considers whether we need to revisit our interpretation of the ‘external validity’ of research, and the role that institutional knowledge has to play in informing policy-making.

Tim Ogden’s newly-released book on the use of Randomized Controlled Trials (RCTs) in development has been getting good reviews. The book is a compilation of interviews with researchers, critics, and non-government consumers of RCTs. One of the contributors, Nancy Birdsall, had this to say:

“My crude theory about the impact of research is not that research feeds into policy directly. Research kind of goes up into the air, into the clouds. If it’s good, if it’s interesting, and perhaps especially if there is some compelling behavioral story behind it, as is often the case with an RCT, it gets circulated and cited. Eventually it filters down to World Bank operational people and donor staff and people in the ministries.”

But my own seventeen years of experience leading program implementation and policy formulation with local, state and central governments in India, across a very broad spectrum of sectors, lead me to go further and add:

“Sometimes, the story catches the attention of an influential policy maker in a developing country by resonating with some of their, often diffusely held, prior beliefs. Or it arrives at a time when, thanks to a fortunate coming together of events, the story fits the government’s narrative and the evidence provides the bureaucratic justification to facilitate the decision. It then gets incorporated into (or becomes part of) a program. Again, sometimes, for a variety of reasons, it succeeds. And the story diffuses slowly…”

The critical point is that the evidence requires either an influential insider champion or a ripe enough moment to take root. And even when this is taken care of, there is the small matter of getting the experiment to succeed in a system where successful implementation is a rarity. Finally, there are the natural uncertainties associated with any diffusion story. As can be imagined, the role of plain good fortune in this narrative is very large.

But it also offers important insights for all of us trying to influence public policy.

The critical role of an insider champion or a ripe moment means that the tipping point is a function of the system’s preferences. In other words, the evidence can scream out as loudly as it can, but it is the demand side that counts.

If we accept this idea, then there are two points worth considering.

  1. Take research exploration as close as possible to the problem itself;
  2. Integrate evidence generation into the process of research exploration.

I explored the first of these points in an earlier post here, in the context of Esther Duflo’s Richard T Ely Lecture. The plea was to start from the problem as perceived by practitioners and, as Esther suggests, to focus on the plumbing challenges. In this post, I’ll address the second point.

The value of tapping into latent institutional knowledge

There’s no doubt that the vast body of experimental research produced over the past two decades has led to some very impressive findings. But its success has also skewed evidence generation heavily towards field experiments.

This overlooks the value of discovering institutional knowledge hidden in communities and public systems. Such knowledge can be revealed through deep-dive problem solving using data analysis, interviews, observation, surveys, focus group discussions, and so on. Rigorous field experiments can then, if required, help resolve those uncertain elements of policy design which are deemed critical to its success.

To take two examples I know well from direct engagement: in the case of India’s employment guarantee works program, perceptive officials knew all along that releasing money only on submission of individual work estimates, or as reimbursement after completion of the work, rather than as advance payments, would reduce corruption. Similarly, the utility of independent third-party audits or inspections of everything from engineering works to skills training to effluent emissions has been widely known for years.

However, both here and with most other similar examples, it is the so-called plumbing of implementation that is far less certain.

There are several benefits to a process like this:

  1. Fundamentally, it helps us to arrive at a robust enough version of the program design, and to identify the still-uncertain program elements. The latter, most often found in the program plumbing, can in turn be validated by the most appropriate method of evidence generation.
  2. A quick discovery of this pre-existing knowledge is likely to shorten the knowledge discovery cycle as well as lower its cost.
  3. Unlike the fragmented knowledge that emerges from restrictive, often single-issue experimental research, a knowledge discovery process is more likely to yield a comprehensive understanding of the problem.
  4. This process of consultation is also likely to expose the plumbing challenges that are associated with implementation. This would in turn nudge the researcher to engage directly with the plumbing issues.
  5. The discovery of latent institutional knowledge through a consultative process keeps stakeholders involved, increasing the likelihood that what emerges is embraced by them – avoiding the disconnect between experimental research and its audience within public systems that so often prevents ready adoption.
  6. This strategy can help develop a practical toolkit for scientific problem-solving that can be applied to reveal latent knowledge across different problems and contexts. This would be less likely to face the external validity challenges faced by field experiments.

Thinking about this in a real-world context, how would the process of arriving at a strategy to sell cheap shampoo sachets to poor people differ dramatically from one for selling mosquito nets or chlorine tablets to the same people? Is the process by which private financial institutions devise a strategy to attract deposits from low-income people any different from one aimed at increasing savings amongst the same people?

For that matter, is there any radical difference between a commercial strategy to prompt positive responses from the market at the “bottom of the pyramid” and one that seeks to get the same target group to respond to similar incentives for their own welfare?

The argument here isn’t that one form of evidence generation should take precedence over another. It is only an argument for starting the process of evidence generation from exploring untapped institutional wisdom through all possible approaches and validating certain elements, if required, with field experiments.

——

Gulzar Natarajan is a Senior Partner at GIF. He is a member of the Indian Administrative Service, the elite tier of the Indian bureaucracy, and was, before joining GIF, working in the Office of the Prime Minister of India. He has previously held leadership positions as municipal commissioner of a city with a population of 1.5 million, as chairman and managing director of an electricity distribution company with 4.5 million consumers, as the head of the district government of Hyderabad District, and later as the vice chairman and managing director of the Infrastructure Corporation of Andhra Pradesh. In a seventeen-year career in the IAS, he has been intimately associated with the broad spectrum of development policy design and program implementation: rural and urban development, health and education, infrastructure creation and public-private partnerships, and regulatory governance and public finance.

Image rights:

Knowledge, by Rachel Sample on Flickr, used under a Creative Commons Attribution Non-Commercial No Derivatives 2.0 licence