A $3 million book with 8 readers? The impact of donor-driven research

One of aid donors’ less discussed activities is financing research. The Global Development Network (GDN), founded in December 1999, is probably the best-known example: a World Bank-sponsored effort to promote development research by researchers in the developing world, with an annual budget now over $9 million (roughly the same as the entire annual National Science Foundation (NSF) budget for all economics research). In GDN’s own words, it exists to “promote research excellence in developing countries.” It has also attracted contributions from many bilateral aid agencies and from the Gates Foundation. (Meanwhile, scholarship programs for Africans and individuals from other under-represented regions to achieve their own academic success remain chronically under-funded.)

How excellent is GDN research? Excellence is difficult to measure directly, but academia has two common measures of research quality (which affect big things like tenure decisions, as in “publish or perish”): publication in peer-reviewed journals and citations by other publications. Because publications and citations take a while to accumulate, we looked at papers and books produced in the early years of the GDN and traced what happened with subsequent publications and citations.

Surprisingly, GDN was unable to provide us with a list of publications that had resulted from GDN-sponsored research, nor had any of several outside evaluations compiled such a list. So we unleashed Aid Watch’s crack one-woman investigative team, who assembled the record on publications and citations for two types of GDN outputs: (1) the GDN’s first Global Research Project, “Explaining Growth,” and (2) papers that won GDN’s Global Development Awards & Medals Competitions during the three years 2000-2002.

The Explaining Growth project involved over 200 researchers from 2000 to 2005. Its 2002-2003 budget was $3 million. The publications from this project are a direct measure of GDN’s impact, since these publications would not have happened otherwise. The main publication was the 2003 book, also called Explaining Growth, which had received 8 citations in Google Scholar as of June 14, 2009. There were other later books on explaining growth in the Commonwealth of Independent States (6 citations), Latin America (5 citations), South Asia (6 citations), and the Middle East (1 citation) – curiously, there was no book on Africa. The editors of these volumes typically had distinguished academic careers outside of GDN, with many more citations for their personal publications.

There were 51 papers that won Global Development Awards & Medals from 2000-2002, resulting from a competition involving $1 million and about 2,000 researchers. We chose these papers to keep the number of publications manageable to track, and to examine the “cream of the crop.” This procedure is biased towards finding the highest-quality publications. It also establishes only an upper bound on GDN’s impact from this set of papers, since journal publication may have happened anyway. We tracked all 51 papers and found that they resulted in 5 tenure-quality journal publications (publication in a top general-interest journal or field journal). Four of the five were by Latin American economists, a group that had already achieved plenty of academic success before GDN came along.

Why was there so little academic success that seemed to result from GDN efforts? Perhaps a decision taken early on was partly to blame. As a 2004 World Bank evaluation put it:

One model was to promote open competition among various institutions, with funding going to the most qualified institutions that had submitted research proposals. A second model centered on pre-selected institutions within regions, which would serve as regional hubs and nominate members to the board. The second model prevailed.

Translation: they had to decide between competition by merit and research bureaucracy, and they chose bureaucracy. As the GDN results illustrate, academic research cannot be planned by bureaucrats.