
Desk phase - Desk study stage (1b)


This section is structured as follows: each step of the desk phase is described according to the respective role of:

  • The evaluation manager
  • The external evaluation team
In this stage, the evaluation team continues consulting the available documents and interviewing managers at the Commission's head office and in the partner country or countries. This is by no means an exhaustive collection or examination of all available information. On the contrary, the evaluation team keeps its information gathering focused on producing at least partial answers to the evaluation questions.

The tasks during this stage are carried out by team members whose expertise corresponds to the questions addressed.

This stage also allows for finalising the methodological design and developing the tools envisaged for the field phase.



Documentary analysis


Consultation of available documents relating to the whole intervention or to its main components:

  • Relevant documents issued by the Council and the Parliament.
  • Preparatory documents, ex ante evaluation.
  • Programming documents (in addition to the documents already examined in the previous stages).
  • Decisions related to implementation and/or modification.
  • Reviews, audit reports, evaluation reports.

Consultation of documents relating to the intervention context:

  • Context indicators concerning the partner country or countries.
  • Documents on the partner country or countries' development strategy.
  • OECD statistical records on the assistance received by the partner country or countries.
  • Documents relating to interventions by other donors and international institutions in connection with the partner country or countries.
  • Abstracts of evaluation reports from the various donors for the partner country or countries.

Consultation of documents on the actions to be examined in depth. Such documents are identified by means of databases or interviews. For each action under examination, the documents to be gathered deal with:

  • Design and decision.
  • Implementation and monitoring.
  • Evaluation.



Interviewing managers


These interviews aim at:

  • Going deeper into the analysis done in the inception report.
  • Identifying available data, assessing their reliability and having access to them.
  • Identifying elements which may contribute to partial answers to the questions, and the assumptions that remain to be tested during the field phase.

The people met are those who participated in the design of the intervention, contributed to its implementation or are likely to use the evaluation report. They work either at the Commission's head office, in the delegation(s) involved, or within the partner country or countries' government(s) at central level.



Methodological design and development of tools


The methodological design envisaged in the inception report is finalised. The evaluation team refines its approach to each question in a design table including the sub-questions to be addressed. The tools to be developed for the field phase are then listed, with the aim of providing at least one, and preferably several, information sources for answering each sub-question.
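As an illustrative sketch (not part of the methodology itself), the design table's logic can be represented as a simple data structure and checked programmatically: each question is broken into sub-questions, and each sub-question should be covered by at least one, preferably several, information sources. All question texts and source names below are hypothetical examples:

```python
# Hypothetical design table (all names are illustrative): each evaluation
# question is broken into sub-questions, and each sub-question is mapped
# to the information sources expected to contribute to the answer.
design_table = {
    "EQ1: To what extent did the intervention improve access to basic education?": {
        "Evolution of enrolment rates": ["national statistics", "monitoring reports"],
        "Perception of beneficiaries": ["focus group", "questionnaire"],
    },
    "EQ2: How efficiently were resources used?": {
        "Cost per output compared with benchmarks": ["financial reports"],
    },
}

def check_coverage(table):
    """Return sub-questions with no source, and those with only one source
    (answerable, but without the preferred cross-checking)."""
    uncovered, single_source = [], []
    for question, sub_questions in table.items():
        for sub_q, sources in sub_questions.items():
            if not sources:
                uncovered.append((question, sub_q))
            elif len(sources) == 1:
                single_source.append((question, sub_q))
    return uncovered, single_source
```

Running such a check before the field phase makes visible which sub-questions still rest on a single source and would benefit from an additional collection tool.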

The evaluation tools are developed and tested as far as possible. The work may include:

  • Preparation of one or several series of interviews and the corresponding interview guidelines.
  • Terms of reference of one or several focus groups.
  • Preparation and test of a questionnaire and a sampling method.
  • Selection of one or several case studies and development of the associated work plan(s).
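For the questionnaire, one sampling method that could be adopted is a simple random draw from a sampling frame established during the desk phase. A minimal sketch, with hypothetical data, is:

```python
import random

# Hypothetical sampling frame: units (e.g. beneficiaries or project sites)
# identified during the desk phase. All names are illustrative.
sampling_frame = [f"unit_{i:03d}" for i in range(1, 201)]  # 200 units

def draw_sample(frame, size, seed=2024):
    """Simple random sample without replacement; fixing the seed makes the
    draw reproducible, so it can be documented in the desk report."""
    return random.Random(seed).sample(frame, size)

sample = draw_sample(sampling_frame, size=30)
```

Fixing the seed means the same sample can be reproduced and reported, which supports the transparency expected of the data collection method.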

This stage mobilises the national/regional members of the evaluation team. How far it is developed depends on the consultants' availability and skills.



First phase report (desk)


The first phase report (desk) takes up the points dealt with in the inception report and goes into as much detail as necessary. The following elements are added:

  • Final indicators suggested for each criterion.
  • Progress of the collection of data available at the Commission: method used, limitations, biases and risks, pending problems to be solved during the field phase.
  • First analysis of the data in relation to the evaluation questions and partial answers to the questions, with the assumptions (a few per question, if necessary) yet to be tested during the field phase.
  • Presentation of the data allowing for clarification of the global issues, over and above individual evaluation questions, with a view to making an overall assessment.
  • Data collection and analysis strategy for the following steps:

      - Main data collection problems to be solved bearing in mind the data already available.
      - Field phase work plan.
      - Data collection tools to be used and the associated risks or limitations. Envisaged cross-checking. The way in which harmonisation of data collection will be ensured, especially when there are several countries to be visited.

      - The analysis strategy to be applied in the field, as well as the tools to be used (together with limitations and risk analysis).
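Harmonisation of data collection across several countries can be supported by a shared template that every country team fills in identically. A hypothetical sketch of such a consistency check (all field and country names are illustrative):

```python
# Hypothetical shared data-collection template: every country team reports
# the same fields so that results can be compared and aggregated.
TEMPLATE_FIELDS = {"indicator", "value", "source", "collection_date"}

country_records = {
    "Country A": [
        {"indicator": "enrolment_rate", "value": 0.82,
         "source": "ministry statistics", "collection_date": "2024-03-01"},
    ],
    "Country B": [
        {"indicator": "enrolment_rate", "value": 0.76,
         "source": "household survey"},  # collection_date missing
    ],
}

def check_harmonisation(records):
    """Return, per country, the template fields missing from any record."""
    problems = {}
    for country, recs in records.items():
        missing = set()
        for rec in recs:
            missing |= TEMPLATE_FIELDS - rec.keys()
        if missing:
            problems[country] = sorted(missing)
    return problems
```

Such a check, run as country data comes in, flags early the gaps that would otherwise prevent cross-country comparison during the synthesis phase.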

The evaluation manager checks that the contents of the report are correct and that its level of quality is appropriate. He/she sends the report to the reference group members for comments. He/she convenes and chairs a reference group meeting, if necessary. Then he/she summarises the comments received and specifies the amendments that need to be made to the report.

He/she receives and validates the final version of the report and authorises the launching of the field phase.

Specific situation(s)

Regional level evaluation and Global sector / thematic evaluation

The report sets out the way in which data collection harmonisation is to be ensured among the various countries. 

If the engagement of the evaluation team is carried out on the basis of a budget that is limited to the desk phase, then the first phase report (desk) is accompanied by a financial proposal for the field and synthesis phases.


Check lists

Contents of first phase report (desk)



Introduction
  • Origin of the evaluation.
  • Delineation of the evaluation's central scope.
  • Expectations expressed in the terms of reference.
  • Evaluation process.
  • Reminder of the context in which the evaluation is undertaken.

Main text

  • Evaluation questions and explanatory comments on each question.
  • Judgment criterion or criteria relating to each question.
  • Suggested indicator(s) for each criterion.
  • Progress of the gathering of data available at the Commission's head office and at the delegation(s) (and at the partner country's embassy, if relevant): data collection method used, limitations, biases and risks, pending problems to be solved in the field phase.
  • First analysis of information linked to the evaluation questions and first partial answers, remaining assumptions to be tested in the field phase.
  • First analysis of collected data with a view to producing an overall assessment.
  • Strategy of data collection and analysis for the following phases, work plan, data collection tools to be used, test of tools, potential risks and limitations, cross-checking to be undertaken.

Annexes (indicative)

  • Presentation of intervention logic and analysis in the form of a diagram of expected effects, and approach taken to draw up the questions, judgment criteria and indicators.
  • Terms of reference.
  • Informants met.
  • Documents used.
  • Statistical data and context indicators.
  • List of projects and programmes.
  • Acronyms and abbreviations.
  • etc.

Quality of first phase report (desk)

The criteria include and complement those from the previous stage (inception report).

1. Meeting needs

This report expresses clearly and thoroughly the way in which the evaluation team understands the following points:

  • The requirements which appear in the regulatory framework of the evaluation, especially as regards accountability.
  • The expectations expressed in the terms of reference.
  • The requirements expressed by the reference group members.

The report places the evaluation in its context, in connection with the underlying development, cooperation or foreign policies and with any other EC policy or partner-country policy.

The proposed evaluation questions and their relevant judgment criteria reflect the identified requirements and the evaluation's intended use. They do not contain any ambiguity. 

The report provides a first part of the answers to the evaluation questions and these contributions reflect the identified requirements.

2. Relevant scope

The report delineates the evaluation's central scope as regards its temporal, geographic and regulatory dimensions. It justifies the choices made for delineating the scope. It includes an analysis of major overlaps with related policies and justifies the choice of overlaps examined. 

The themes, evaluation questions and judgment criteria reflect:

  • The results and intended impacts identified through the reconstruction of the intervention logic.
  • The various sectors, themes and instruments.
  • DAC's evaluation criteria, without leaving aside efficiency and sustainability, or coherence/complementarity and Community added value.

3. Defensible design

The report describes the data collection and analysis method actually applied in the desk phase. It accounts for problems encountered and possible limitations. 

The report describes the data collection and analysis method to be applied in the field phase. It shows how this method will allow all the evaluation questions to be addressed appropriately, and an overall assessment to be produced. The choices are discussed and defended against other options. 

The method is feasible within the evaluation context. Both the risks and limitations are clearly specified. The report states the risks that would be taken if other methodological options were adopted.

4. Reliable data

Sources of qualitative and quantitative data are stated. The evaluation team provides a self-assessment of the reliability of data. Limitations to validity are clearly stated.

5. Sound analysis

The report includes a first analysis of available data with a view to answering evaluation questions, and deduces assumptions to be tested in the field. The reasoning is explicit and well-founded.

First phase meeting (desk)

This is usually the second reference group meeting. Its agenda typically includes:


  • Reminder of evaluation scope, intervention logic and questions defined in the previous stage.
  • Presentation of first phase report (desk) by the external evaluation team, including an assessment of existing data and first partial answers to evaluation questions on the basis of existing data.
  • Debate on judgment criteria (also called reasoned assessment criteria) and suggested indicators.
  • Debate on the desk phase data collection process.
  • Debate on assumptions to be tested during the field phase.
  • Debate on the field data collection work plan.