Desk phase - Inception stage (1a)


This section is structured as follows:

Each step of the desk phase is described according to the respective roles of:

  • The evaluation manager
  • The external evaluation team



Collecting basic documents


The first analysis of the intervention is done on the basis of official documents only. The Commission's services are not involved at this point.

Analysed documents:

  • Design and programming documents related to the intervention under evaluation.
  • Basic documents of the policies to which the intervention corresponds (development policy, co-operation policy or foreign policy).
  • Relevant documents on the partner country or countries' strategy.



Analysing the rationale of the intervention


Reconstruction of the intervention rationale:

  • Context in which the intervention has been decided upon, opportunities and constraints.
  • Needs to be met, problems to be solved and stakes to be dealt with.
  • Justification of the fact that the needs, problems or stakes cannot be dealt with more effectively within another framework (rationale).



Analysing the intervention logic


Reconstruction of the intervention logic:

  • Political priorities in which the intervention takes place.
  • Objectives, principles and priorities.
  • Categories of implemented activities.
  • Translation of objectives into expected outputs, results or impacts.
  • Presentation of the activities and expected effects as well as cause-and-effect assumptions as understood from analysed documents.
  • Comments on the intervention logic and analysis of its internal coherence.
  • Proposals for reconstructing cause-and-effect assumptions, if necessary.



Delineating the extended scope


Identification of related policies:

  • Conducted by the EC.
  • Conducted by other donors.
  • Conducted by the partner country or countries.



Consulting databases


Consultation of the Commission's databases and collection of the available information for each individual support allocated in the framework of the evaluated intervention:

  • Identification of the support.
  • Budgetary data.
  • Progress of outputs.
  • Names and addresses of potential informants.
  • Ratings attributed through the "result-oriented monitoring" system (ROM).
  • Availability of progress reports and evaluation reports.



Pilot mission


At this point, if the nature of the evaluation is suitable, the evaluation team leader may carry out a series of interviews in the partner countries (or region) in order to:

  • Establish working relationships with the national/regional consultants.
  • Complete the collection of basic documents.
  • Draw up evaluation questions.



Proposing evaluation questions


The evaluation team prepares a first version of the evaluation questions, based on the following:

  • Intervention strategy analysis (mainly relevance and coherence questions).
  • Intervention logic analysis (mainly effectiveness and sustainability questions).
  • Expectations of the persons met.

This site explains how to derive a question from the intervention logic and how to write questions relating to different evaluation criteria. 

At this point, the number of proposed questions may be higher than the maximum number stated in the terms of reference. The suggested questions take the following into consideration:

  • The purpose of the evaluation, as described in the terms of reference.
  • The need to reach an overall assessment of the evaluated intervention.
  • The need not to overlook questions of efficiency and sustainability, which tend to be neglected.

Each question is subject to an explanatory comment dealing with all or some of the following points:

  • Scope of the question.
  • Clarification of terms used.
  • Way of addressing the question, possibly by including sub-questions which the evaluation team might wish to cover.
  • Potential utility of the answer.



Inception meeting


The first step leads to a discussion of the external evaluation team's initial work at the inception meeting of the reference group. The following elements are presented with slides as visual support:

  • Objectives, principles, priorities and stakes.
  • Translation of objectives into expected effects and presentation of the intervention logic in a diagram of expected effects.
  • Analysis of the intervention logic and its internal coherence, proposal for reconstructing missing cause-and-effect assumptions.
  • Proposed list of evaluation questions and explanatory comments on each question.

The manager sends a copy of the presentation to the group members, including the distant associated members. The reference group members have one week to comment on the elements submitted to them. 

The manager sends the comments collected to the evaluation team, which takes them into consideration and finalises the set of questions. The reference group members validate the set of evaluation questions, which become an annex to the terms of reference.

Specific situation(s)

Country level evaluation

The manager contacts the delegation to facilitate a short pilot visit by the evaluation team leader to the partner country, if such a visit was planned when the evaluation team was engaged. The visit may take place immediately after the inception meeting.



Finalising and validating questions

A second version of the set of questions is drawn up in the form of a note, on the basis of the following elements:

  • Comments received during and after the inception meeting.
  • Interviews with a few key people at the Commission head office and/or in the partner country or countries.

At this point, the number of questions must not exceed the maximum number stated in the terms of reference. 

The questions are validated by the reference group members, and then become an annex to the terms of reference.


Inception report


Evaluation team

The evaluation team carries on with the interviews and document analysis, and deepens its initial work on:

  • The context and the intervention rationale.
  • The intervention logic.
  • The other policies of the EC or other donors or the partner country/countries.

Every validated question is developed according to the following points:

  • Question and explanatory comment.
  • Judgement criterion or criteria (also called "reasoned assessment criteria") relating to the question.
  • Possible indicators for each judgement criterion.
  • Method envisaged for answering the question.

This site explains and illustrates the steps to be taken to move on from a question to a judgment criterion or several criteria and then to one or several indicators. 

At this point, the team also refines its method, in particular:

  • Likelihood of being able to answer each question, considering:
    • The sub-questions to be addressed.
    • The availability of documents and expertise.
    • The possibility of collecting data by means of appropriate tools.
  • Overall strategy envisaged for data collection and analysis, making certain that the strategy will make it possible to:
    • Cross-check several types of data to answer each question, based on the triangulation principle.
    • Formulate an overall assessment beyond the answer to each question.
    • Adhere to both the budget and the schedule.
  • Detailed work plan for the collection of data available at the Commission during the following stage.
  • Countries to visit, where the evaluation covers several countries.
  • Draft list of actions to be examined in-depth in the next stages.

A first draft of the inception report is written with the required content and submitted for quality control to an expert who is not part of the evaluation team. 

A final version taking received comments into consideration is prepared.

Evaluation manager 

Throughout the second step, while the evaluation team brings the production of its inception report to a close, the manager draws up an official letter to facilitate the evaluation team's contacts within the Commission and with the other European institutions to be contacted.

The inception report takes up the already validated elements, to which the following is added:

  • Reminder of the context in which the evaluation is undertaken, with particular attention paid to related policies.
  • Reminder of the evaluation questions.
  • The judgement criterion or criteria relating to each question (also called "reasoned assessment criteria").
  • The indicators considered in relation to each criterion.
  • The method and the work plan for collecting existing data.
  • The strategy envisaged for data collection and analysis thereof.
  • In an appendix, the reconstructed diagram of expected effects and the method used to develop the questions, judgement criteria and indicators.

The evaluation manager checks that the contents of the report are complete and that the level of quality is good.

He/she asks the reference group members to comment on the report, either by email or at a meeting if necessary, no later than one week after receipt of the report.

The manager forwards the comments received to the evaluation team and specifies what his/her requests for modifications are. The evaluation team now has one additional week to write the final version of the report. 

The report is formally adopted by an official letter authorising the continuation of the work. 

If the countries to be visited have not been specified in the terms of reference, they should be selected as soon as possible, at the latest at the beginning of the next step.

Specific situation(s)

Regional level evaluation and Global sector or thematic evaluation

If the countries to be visited have not been specified in the terms of reference, the choice of such countries is made as soon as possible; at the latest, at the beginning of the collection of data available at the Commission, and preferably during the preparation of the inception report.


Checklists


Inception meeting

This is usually the first reference group meeting.


  • Presentation of the evaluation's regulatory framework, its context, main users and expected uses.
  • Explanation of the reference group's role, the invited services, the group's operating rules and the involvement of distant members.
  • Presentation of the external evaluation team's first works using slides:

                - Evaluation's central scope.
                - Scope extended to related policies.
                - Intervention logic according to the official documents ("faithful" logic).

                - Questions likely to be addressed by the evaluation and associated judgement criteria.

  • Discussion of proposed questions and judgment criteria by the reference group members. The evaluation questions are then validated by members of the group.
  • Debate on the priority questions.

Inception report

Introduction


  • Origin of the evaluation.
  • Delineation of the evaluation's central scope.
  • Expectations expressed in the terms of reference.
  • Evaluation process.

Main text

  • Objectives, principles, priorities and stakes.
  • Translation of objectives into expected impacts and intervention logic presentation in the form of a diagram of expected effects.
  • Analysis of the intervention logic and of its internal coherence, proposal for bridging gaps in the cause-and-effect assumptions.
  • Evaluation questions and explanatory comments on each question.
  • Judgment criteria relating to each question.
  • Indicators considered for each criterion.
  • Method and work plan for the gathering of data available at the Commission.
  • Strategy for the field data collection and its analysis.

Annexes



  • Documents used.
  • Terms of reference.
  • Acronyms and abbreviations.
  • etc.

Quality of the inception report

1. Meeting needs

The report explains clearly and thoroughly the way in which the evaluation team understands the following points:
  • The requirements featuring in the evaluation's regulatory framework, in particular in terms of accountability.
  • The expectations expressed in the terms of reference.
  • The requirements expressed by the reference group members.

The suggested evaluation questions and their related judgment criteria reflect the requirements identified and the intended use of the evaluation. 

Based on these various points, the report indicates the remaining points to be clarified if necessary.

2. Relevant scope

The report describes the evaluation's central scope and justifies the choices made to delineate it. It specifies the overlaps with the related policies that are going to be examined, and justifies those choices. 
The suggested evaluation questions and their related judgment criteria reflect:
  • The results and impacts identified through the reconstruction of the intervention logic.
  • The various sectors, themes and instruments.
  • DAC's evaluation criteria, bearing in mind efficiency and sustainability, as well as coherence and community added value.

3. Defensible design

The report describes the data collection and analysis method to be applied throughout the desk phase, as well as the data collection and analysis strategy intended for the field phase. 
It shows how this method will be used to appropriately address the whole set of evaluation questions, and then to produce a synthesis for the purposes of an overall assessment. The choices are discussed and defended against other options.


Experience shows that there is a tendency to use certain key terms to define the objectives, write the questions or delineate the scope without making certain that they are understood in the same way by everyone (for example: conducive economic environment, basic education, etc.). Misunderstandings may remain hidden throughout the next phases and surface during the discussion of the final report, with devastating effects on the quality of the evaluation. At this point, it is advisable to ensure that the key terms are accurately defined.


Former Capacity4dev Member
Last update: 7 December 2022
