Here is the Evaluation Report... so now what do we do?
When we talk about evaluation, we often focus on its approaches and methods, how to undertake an evaluation, and the report. This is all very good and useful, but it is not the whole story. Evaluation is essentially part of a wider process of learning.
In this week’s Voices & Views, Bridget Dillon, an Evaluation Manager at DEVCO, writes a guest piece looking at how evaluation uptake can be improved.
Many agencies have recently renewed their interest in, and focus on, the use of the knowledge produced by evaluations. This, after all, is why we evaluate in the first place. DEVCO commissioned a Study on Uptake in 2014, which generated considerable discussion within the organisation and a consensus about the importance of using evaluation knowledge. Many suggestions were made about ways to do this. In a recent interview, Henriette Geiger, Head of Unit for the Latin America and Caribbean region at DEVCO, said she is 100% signed up to the key findings in the Study. She notes, ‘We spend too much time on doing evaluations and not enough time on uptake; it should be the other way around.’
The EU study, and others recently undertaken by other agencies (e.g. the UN, World Bank and DFID), together with our collective experience, point to the need for a strong learning culture in an organisation, sustained through corporate incentives and continuously and publicly encouraged by senior management. Marcus Cornaro, DDG at DEVCO, wants ‘senior management to systematically quote evaluations in major policy speeches.’ Resources must be allocated to emphasise and facilitate learning. As a step in this direction, DEVCO has recently put in place a Learning and Knowledge Management Strategy and an Evaluation Policy.
There is much that each and every one of us can, and must, do to ensure that knowledge from evaluations is better captured and productively utilised. In the EU, and in the field of international development, this is not academic; we owe it to the targeted recipients of assistance, and to the taxpayers who fund co-operation.
In the following video, colleagues from the European Commission and an evaluation expert share some thoughts on how uptake can be enhanced.
The European Union has adopted the principle ‘Evaluate First!’ to remind colleagues to evaluate experience in the area or issue of their proposed intervention before they design anew. It also refers to building evaluation into the very design of an intervention. Banish any notion you may have that evaluation is something considered only at the end of an intervention – it is part and parcel of its very design. Again, approach and method are important, but they need to be actively shaped by clarity about who the users are, for what purpose the evaluation is being undertaken, and which decision-making processes the overall findings will feed into, and when. Key users need to determine the core information they want to know and the questions on which they want to focus. Their ‘stake’ in the evaluation needs to be claimed at this early stage.
We should use the process of carrying out an evaluation to engage the evaluation stakeholders and key users, so that they have a high level of ownership. Ownership has, time and again, proved to have a strong bearing on the subsequent utilisation of the knowledge from evaluations. Stakeholders’ views should be solicited at key stages along the way – e.g. at inception, when initial findings are presented, and when draft reports are submitted – to help give nuance, depth of perspective and accuracy to the evaluation. The EU strategic evaluation of conflict prevention and peace-building (2011) is a good example. It involved a long preparation process, targeted presentations and discussions during implementation which built greater ownership, and many discussions after the report was published. The evaluation was a key element in informing subsequent guidance on the EU’s role in this area of work.
It is important that evaluation is understood not as an exercise which is ‘done to people’; it is an organised, rigorous process which involves them, their knowledge and their perspectives.
The most underplayed part of the evaluation process is that which follows the completion of the report. Many assumptions are made about the ‘dissemination’ of an evaluation, i.e. whom it will reach and what people will do with the knowledge. We tend to think everyone picks up the same messages, but they do not.
Audiences need to be targeted, and evaluation messages tailored to their interests and presented in ways in which they can ‘receive’ them. Large tomes of information – a common feature of an evaluation – score very badly on ‘message capture’. James McNulty (EU Delegation to Zambia) highlights from his experience that communicating evaluation knowledge in a large, often dense report, written in the passive voice, is a sure-fire way to ensure messages are not heard. We need to craft the messages, or ‘translate’ them, for particular audiences.
Furthermore, just getting the message across is not enough. Messages need to be actively placed, or ‘brokered’, in decision-making fora by key stakeholders. Evaluation managers can be instrumental in getting messages into policy development processes. They can ensure the information is with, and understood by, appropriate people, in good time and in easily absorbed form. ‘Knowledge, Policy and Power in International Development: A Practical Guide’ (Jones et al., ODI, 2012) is a good source of further information on this.
Short summaries should be made available for every evaluation. Short means short: 1–3 pages, no more! Think about it: would you have the time to read more? No. Why then assume that anyone else would? Short and punchy is memorable; it suits policy makers. Longer and detailed suits academics. Which is your audience?
Summarising the key similarities and differences between evaluations which cover similar terrain can be a very effective way of informing policy and practice. Such summaries should be the stock-in-trade of evaluation units.
The bottom line is that sharing knowledge and improving policy and practice cannot be done at arm’s length; it is fundamentally a social business. Talking and sharing face to face, and engaging people through a variety of media – short videos, online chats, group discussions, webinars – are all at our disposal these days.
So there is a lot we can do. It is exciting, it is stimulating, and made much easier by the digital age we live in.
Nonetheless, it is up to each and every one of us to make it happen!
Gone are the days of evaluation tomes collecting dust on the shelves!
This article is published as part of an Evaluation Thematic Week on capacity4dev.eu. For more information, visit the Public Group on Design, Monitoring and Evaluation, where you can view an introduction to the thematic week from Philippe Loop, Head of Unit for Evaluation at DEVCO.
Find out more about the EC and EEAS development policy in the Voices & Views: Evaluation Matters.