Knowledge and evidence for policy and practice matter in any context. But critical scrutiny of the evidence-to-policy process is particularly important in development contexts, where knowledge is often produced or brokered by external actors. Launched today, the edited volume Social Realities of Knowledge for Development illustrates the varied and complex pathways through which research, knowledge or evidence may (or may not) be taken up by policymakers and practitioners.
The collection provides diverse examples of the research-to-policy/practice relationship -- from context-specific action research, to researchers engaging with embedded national policy institutions or global processes. The central message that emerges across different contexts is that social relations, rather than the ‘technical’ aspects of evidence, are the critical factor in influence or uptake. This is not to argue against the value of good evidence: rather, that good evidence alone is generally not enough.
Acknowledging the social context and content of evidence
Such an argument should not surprise many in the large community of knowledge producers, brokers and users operating at the research–policy interface. Shifts in ideas about what type of research or evidence is useful for development have seen externally imposed models and theory-based policy prescriptions challenged by stronger attention to participation and the value of local knowledge, and to co-production processes which engage key stakeholders in knowledge generation.
Paradoxically, however, greater acknowledgement of the social process involved in bridging gaps between knowledge producers and users is now often accompanied by a loss of social content in the forms of knowledge that are most highly valued as evidence. What constitutes good evidence has increasingly been defined by a particular set of claims to scientific rigour; methodological advances have moved the field towards clinical-style trials and quantitative experimental methods (although not without pushback), often accompanied by claims to value-free objectivity but at the expense of attention to messy, contested, complex social realities.
This tension plays out within many development organisations, as demonstrated in this collection, which brings valuable insights from a number of ESRC-DFID funded research projects, and from a wide range of organisations including MSF, Oxfam, Practical Action, the Overseas Development Institute, the African Population and Health Research Centre and Makerere University. Such development organisations and operational agencies increasingly demonstrate a welcome commitment to rigorous evidence and data as a basis for policy and programming. As the chapter on ‘How collaboration, early engagement and collective ownership increase research impact’ by Mike Wessells and colleagues demonstrates with reference to a UNICEF programme, this can have impressive results when the right actors are aligned.
Avoiding a narrow view of evidence as ‘what works’
The risk, however, is that a relatively narrow or instrumental view of evidence as ‘what works’ for programming and for delivering results within a defined time frame is prioritised over other forms of knowledge. Undervalued evidence may include qualitative research findings, or research with less immediate practical application that may nonetheless be relevant for framing and guiding policy choices, or for supporting the scaling up, transferability and institutionalisation of interventions. All are of course necessary and complementary, but may compete for resources and space in the discourse. While acknowledging a growing body of mixed methods and transdisciplinary work that aims to rise above such critiques, the current evidence-based, data-driven, results focus tends towards the narrow ‘what works’ view of evidence.
Research institutions at the intersection with policy or practice, located for example within a large development agency as in the case of UNICEF’s Innocenti Research Centre, are spaces that can play an important role in countering the tendency towards this instrumental view of evidence. Such organisations illustrate the challenge of the ‘embedded’ institution, attempting to balance a degree of research autonomy and independence with the needs and demands of their organisation -- as illustrated also in the case from India by Gita Sen and partners in their article: ‘Translating health research to policy: Breaking through the impermeability’.
The ‘embedded autonomy’ of such centres can be key to keeping alive the critical challenge function of research: bringing in fresh ideas and innovation, exposing blind spots and biases, or moderating pendulum swings in ideas and ideologies that may be driven by internal or external changes. Among development agencies, such centres are few and under threat – whether from tighter budgets or through the erosion of their autonomy – but their position within a trusted agency with country-level presence means that they can play a critical role in the ecosystem of trusted development knowledge actors.
Fostering the knowledge to policy interface
Within such large operational agencies, as in government bureaucracies, the skills and capacities needed to use research and knowledge effectively, to move from data-driven and evidence-based decision-making to using evidence to inform choices, are often limited. Investment in such research and policy analysis capacities – particularly within national institutions in the global South – is a critical element for creating an effective knowledge–policy interface but has been largely neglected by donors.
A shift of funding towards these capacities would recognise that evidence is only one among many inputs to decision-making; that policymakers need to make informed choices and act even when evidence is imperfect or data lacking; and that co-production is not always possible with the actors who can take change forward. Brokers will rarely be neutral, but will bring a particular stance and allegiance, while policymakers will likewise invite research and evidence around particular positions. Above all, as illustrated throughout the collection, relationships of trust create the conditions within which evidence can inform and influence.
This publication is a timely contribution to the growing critique of the more technocratic evidence- and results-based discourse of recent years, reminding us of how and by whom knowledge is constructed as evidence and used to frame and influence particular positions. In this respect, while challenging the dominant narrative of neutral data-driven evidence that drives policy and practice, it illustrates how the construction of knowledge is in itself part of the process of social change.