The criteria incorporate and build on those from the previous stage (the inception report).
1. Meeting needs
The report clearly and thoroughly expresses the evaluation team's understanding of the following points:
- The requirements set out in the evaluation's regulatory framework, especially as regards accountability.
- The expectations expressed in the terms of reference.
- The requirements expressed by the reference group members.
The report places the evaluation in its context, relating it to the foundations of the relevant development, cooperation or foreign policies and to any other EC or partner-country policy.
The proposed evaluation questions and the associated judgment criteria reflect the identified requirements and the evaluation's intended use, and they are unambiguous.
The report provides initial partial answers to the evaluation questions, and these contributions reflect the identified requirements.
2. Relevant scope
The report delineates the central scope of the evaluation in its temporal, geographic and regulatory dimensions, and justifies the choices made in delineating that scope. It includes an analysis of the major overlaps with related policies and justifies the choice of overlaps examined.
The themes, evaluation questions and judgment criteria reflect:
- The results and intended impacts identified through the reconstruction of the intervention logic.
- The various sectors, themes and instruments.
- The DAC evaluation criteria, without neglecting efficiency and sustainability, or coherence/complementarity and Community added value.
3. Defensible design
The report describes the data collection and analysis method actually applied in the desk phase. It accounts for problems encountered and possible limitations.
The report describes the data collection and analysis method to be applied in the field phase. It shows how this method will allow all the evaluation questions to be addressed appropriately and an overall assessment to be produced. The methodological choices are discussed and justified against alternative options.
The method is feasible within the evaluation context. Risks and limitations are clearly specified, and the report states the risks that would arise if other methodological options were adopted.
4. Reliable data
Sources of qualitative and quantitative data are stated. The evaluation team provides a self-assessment of the reliability of the data, and limitations to its validity are clearly stated.
5. Sound analysis
The report includes a first analysis of the available data with a view to answering the evaluation questions, and derives hypotheses to be tested in the field phase. The reasoning is explicit and well-founded.