Belcher assesses research effectiveness

By: bbelcher

Prof. Brian Belcher discusses a recent conference in Nairobi on the impacts of international agricultural research, where he presented his work on assessing research effectiveness:

I attended an international conference on "Impacts of international agricultural research: Rigorous evidence for policy" in Nairobi, Kenya, July 6–8, 2017. The objectives of the conference were to present and discuss rigorous evidence from recent studies on how, and to what degree, agricultural research has contributed to reduced poverty, improved nutrition and health, and improved natural resource management. There were approximately 200 participants, mainly scientists and research managers from the international research centres of the CGIAR (a major international research consortium on agriculture and natural resource management), along with academic researchers. The conference was also concerned with the methodological challenges of assessing the impacts of research and the political challenges of using evidence in decision making.

There was a strong focus on the impacts of technologies, and many of the studies presented used quantitative, primarily econometric, impact assessment (IA) methods. These IA approaches can be very useful when research knowledge is embedded within a technology, such as an improved management regime or, especially, an improved crop variety. With these kinds of concentrated technological interventions, it is feasible to conduct experimental or quasi-experimental comparisons of yields, income, soil quality, micro-nutrient intake, and other variables of interest, with and without the innovation. This kind of impact assessment has long been considered the “gold standard” in international agricultural research.

However, there is increasing recognition of the limits of this approach. At a technical level, there was a lively debate in a session entitled “Why don’t farmers know which varieties they are growing?” about how to reliably identify which crop varieties have been adopted and are being used in farmers’ fields. More fundamentally, Professor Nancy Cartwright, a philosopher of science, gave a provocative presentation in the opening plenary about the limitations of randomized control trials (RCTs). Her presentation emphasized the difficulties of predicting the social effects of natural science and pointed out that, while RCTs have strong internal validity, they have weak external validity and offer little information about how change happens. To improve understanding, you need context-specific knowledge, mixed methods, mixed teams, and a good way to sum it all up. The conference agenda did reflect a growing appreciation of the importance of social learning and social processes in change. There were, for example, sessions on “Household and intra-household decision-making on technology adoption,” “Improving food security through capacity building in research and information,” and “Gender dimensions of technology scaling.”

I was invited to present a paper in a session on “Tracing out causal pathways for the impact of policy-oriented research”. My presentation, “Assessing whether and how research contributes to change: a theory-based approach”, explained the outcome evaluation approach, built around a theory of change, that we have developed and empirically tested. This qualitative, mixed-methods approach helps address some of the key challenges of assessing research impacts raised at the conference and adds a strong learning element. A recent article describes the method and lessons learned.