By David Coleman, Senior Education Advisor at the Australian Department of Foreign Affairs and Trade
I recently took part in a panel discussion titled ‘Achieving impact through research in international development settings’, held as part of the Building Evidence in Education (or BE2) annual meetings. The panel explored how closer links between the demand for evidence by policymakers and its supply by researchers could be forged. Discussions included the types of impact policymakers are seeking from education research and, in turn, the challenges that researchers face in achieving such impact.
It was a good opportunity to preview a document that we at the Australian Department of Foreign Affairs and Trade (DFAT), together with our colleagues at the Australian Council for Educational Research (ACER) and Cardno, have been developing. Dubbed a ‘Super Synthesis’ of the evidence, our ambition has been to boil down the findings from robust syntheses of the ‘what works in education’ literature, and then to synthesise their collective findings into a short, easy-to-use document.
To our relief, the feedback during the panel session and at the reception afterwards was positive, even enthusiastic. Thus reassured, we redoubled our efforts and now our Super Synthesis is done. It is avowedly a practitioner’s tool, consistent with the intent of the Impact Initiative: to increase the uptake and impact of research findings through the development of a compelling and – we hope – user-friendly tool.
We had three starting points. First, there are things that every policymaker and development partner working in the field of education wants to know: what works? What are the best ways to get kids into school, keep them there, and help them learn? Which options have the best evidence of their effectiveness? And what are the costs?
Second, a large number of meta-analyses of ‘what works’ in education for development have been completed in recent years, pointing to the ‘best’ types of investments. David Evans and Anna Popova at the World Bank did a useful analysis of this broader literature. So, there was a rich seam to mine.
And third, Ministry officials and development practitioners are notoriously time poor. So we set about creating a tool for decision makers using this powerful evidence base.
The ‘Super Synthesis’ of the evidence draws from 18 systematic reviews, meta-analyses and comparative reviews of ‘what works’ in education for development. Collectively, these reviews bring together the key findings from more than 700 rigorous studies and their supporting research. By condensing this vast literature into an operational guideline, the Super Synthesis identifies which interventions have the greatest impact on student learning and education participation in developing country contexts.
At the heart of the Super Synthesis is a two-page ‘evidence table’. The table is organised in the following way:
- There are seven ‘Domains’ (e.g. infrastructure, economic incentives and teacher workforce).
- Under each ‘Domain’, a series of ‘Intervention Types’ are identified (e.g. new buildings, cash transfers, HR reforms for teachers). A total of 39 ‘Intervention Types’ are identified.
- For each ‘Intervention Type’, the evidence of impact on student participation and education quality (student learning outcomes) is rated on a four-point scale.
- For each ‘Intervention Type’, the evidence on relative cost is identified by the point of investment (e.g. per school; per student) and rated on a three-point costing scale.
- The ‘evidence table’ is supported by brief information on the methodology, strengths and caveats, a discussion of system-level investments and a complete reference list.
In essence, the Super Synthesis is designed to be a decision maker’s friend. It groups the evidence visually to enable decision makers – national governments, development partners and other stakeholders – to easily assess possible interventions by their level of impact on participation and student learning outcomes, and by their likely associated costs. The Super Synthesis cannot provide all the answers, but it is our hope that it will assist decision makers in asking informed questions about what may work best in a given country context, underpinned by powerful evidence.