
The Power of Evidence: Improving the Effectiveness of Government by Investing in More Rigorous Evaluation

Published online by Cambridge University Press: 26 March 2020

Rachel Glennerster
Abdul Latif Jameel Poverty Action Lab, MIT

Abstract

The current fiscal climate is focusing attention on the need for more efficient government. However, we have remarkably little rigorous information on which strategies are the most cost-effective for achieving common goals like delivering high-quality education in deprived neighbourhoods or reducing carbon emissions. This paper argues that randomised impact evaluations can provide an effective way to generate the information needed to make government more effective. Advances in the theory and practice of running randomised evaluations mean that a wider range of questions can be answered than ever before. Elsewhere in the world, fundamental questions are being answered about how humans behave, and the answers are in turn being used to design new policies which are themselves rigorously tested. By learning from these results, and by conducting more randomised evaluations on issues important to UK policy (both at home and abroad), it will be possible to design more effective policies and do more with less.
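To make the core logic of a randomised evaluation concrete, below is a minimal illustrative sketch (not from the paper) using simulated data. The scenario, sample size, and effect size are all hypothetical assumptions: because treatment is assigned at random, the treatment and control groups are comparable in expectation, so a simple difference in mean outcomes recovers the programme's true effect.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical scenario (illustrative assumption, not from the paper):
# 1,000 pupils, half randomly assigned to a tutoring programme that
# raises test scores by 0.2 standard deviations by construction.
n = 1000
true_effect = 0.2
treated = rng.permutation(np.repeat([True, False], n // 2))
scores = rng.normal(0.0, 1.0, n) + true_effect * treated

# Random assignment makes the groups comparable in expectation, so the
# difference in mean outcomes is an unbiased estimate of the effect.
ate = scores[treated].mean() - scores[~treated].mean()

# Conservative standard error for a difference in means.
se = np.sqrt(scores[treated].var(ddof=1) / treated.sum()
             + scores[~treated].var(ddof=1) / (~treated).sum())

print(f"estimated effect: {ate:.3f}, "
      f"95% CI [{ate - 1.96 * se:.3f}, {ate + 1.96 * se:.3f}]")
```

The confidence interval covers the assumed true effect of 0.2; with non-random assignment, by contrast, any pre-existing difference between the groups would contaminate the estimate.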

Type: Research Articles
Copyright: © 2012 National Institute of Economic and Social Research

