Evaluation in the Practice of Development

35 Pages
Posted: 20 Apr 2016

There are 2 versions of this paper

Date Written: March 1, 2008

Abstract

Knowledge about development effectiveness is constrained by two factors. First, the project staff in governments and international agencies who decide how much to invest in research on specific interventions are often not well informed about the returns to rigorous evaluation and (even when they are) cannot be expected to take full account of the external benefits to others from new knowledge. This leads to under-investment in evaluative research. Second, while standard methods of impact evaluation are useful, they often leave many questions about development effectiveness unanswered. The paper proposes ten steps for making evaluations more relevant to the needs of practitioners. It is argued that more attention needs to be given to identifying policy-relevant questions (including the case for intervention); that a broader approach should be taken to the problems of internal validity; and that the problems of external validity (including scaling up) merit more attention.
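For context on the "standard methods of impact evaluation" and the internal-validity problem the abstract refers to, the textbook potential-outcomes decomposition is a useful reference point; the sketch below is illustrative background, not material from the paper. With outcomes $Y_1$ under the intervention and $Y_0$ without it, and treatment status $T$, the naive comparison of treated and untreated units decomposes as

$$
\underbrace{E[Y_1 \mid T=1] - E[Y_0 \mid T=0]}_{\text{observed difference in means}}
= \underbrace{E[Y_1 - Y_0 \mid T=1]}_{\text{mean impact on the treated}}
+ \underbrace{E[Y_0 \mid T=1] - E[Y_0 \mid T=0]}_{\text{selection bias}}
$$

Randomized assignment (or a credible quasi-experimental design) sets the selection-bias term to zero, which is the sense in which such methods secure internal validity; the paper's argument is that this still leaves open questions of external validity, such as whether the estimated impact carries over to other settings or to scaled-up programs.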

Keywords: Poverty Monitoring & Analysis, Science Education, Scientific Research & Science Parks, Population Policies, Tertiary Education

Suggested Citation

Ravallion, Martin, Evaluation in the Practice of Development (March 1, 2008). World Bank Policy Research Working Paper No. 4547, Available at SSRN: https://ssrn.com/abstract=1103727

Martin Ravallion (Contact Author)

Georgetown University

Washington, DC 20057
United States

