Desk phase - Desk study stage (1b)


Each step of the desk phase is described according to the respective role of:

  • The evaluation manager
  • The external evaluation team

This stage may vary in length, depending on the number of documents to be analysed.


[External evaluation team]

Documentary analysis

The evaluation team gathers and analyses all available documents (secondary data) that are directly related to the evaluation questions:

  • Management documents, reviews, audits.
  • Studies, research works or evaluations applying to similar projects/programmes in similar contexts.
  • Statistics.
  • Any relevant and reliable document available through the Internet.

This is by no means a review of all available documents. On the contrary, the evaluation team looks only for what helps it answer the evaluation questions.

[External evaluation team]

Interviewing managers

Members of the evaluation team undertake interviews with people who are or have been involved in the design, management and supervision of the project/programme. Interviews cover project/programme management, EC services, and possibly key partners in the country or countries concerned. 

At this stage the evaluation team synthesises its provisional findings into a series of first partial answers to the evaluation questions. Limitations are clearly specified as well as issues still to be covered and assumptions still to be tested during the field phase.

[External evaluation team]

Designing the method

The methodological design envisaged in the inception report is finalised. The evaluation team refines its approach to each question in a design table.

Design tables per question

The first lines of the table recall the text of the question, plus a comment on why the question was asked, and a clarification of the terms used, if necessary. The table then specifies the indicators and the analysis strategy. 

The following lines develop the chain of reasoning through which the evaluation team plans to answer the question. The chain is described through a series of sub-questions to be answered by the evaluation team, for instance in order to:

  • inform on change in relation to the selected indicators
  • assess causes and effects
  • assist in the formulation of value judgements.

Sub-questions are associated with information sources and evaluation tools. 

There are usually several versions of the design table:

  • A preliminary version appended to the inception report
  • Successive draft versions prepared during the desk phase, as the methodological design is progressively optimised
  • A final version attached to the desk report.
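To make the structure concrete, a design table for a single question could be sketched as a plain data structure. This is only an illustrative sketch: the field names and sample content below are hypothetical and are not prescribed by the guidelines.

```python
# Hypothetical sketch of a design table for one evaluation question.
# All field names and sample values are illustrative only.
design_table = {
    "question": "To what extent has the programme improved access to basic services?",
    "comment": "Asked because improved access was a core objective.",
    "terms": {"basic services": "primary health care and primary education"},
    "indicators": ["service coverage rate", "user satisfaction"],
    "analysis_strategy": "before/after comparison, cross-checked with stakeholder views",
    "sub_questions": [
        {
            "text": "How has service coverage changed since the programme started?",
            "sources": ["national statistics", "project monitoring data"],
            "tools": ["documentary analysis", "database extract"],
        },
        {
            "text": "Can the observed changes be attributed to the programme?",
            "sources": ["interviews with managers", "field visits"],
            "tools": ["interviews", "focus group"],
        },
    ],
}

# Each sub-question is tied to at least one information source and one tool.
for sq in design_table["sub_questions"]:
    assert sq["sources"] and sq["tools"]
```

Keeping the table in a structured form makes it easy to check that every sub-question has an information source and a tool attached before the field phase starts.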

[External evaluation team]

Developing tools

The tools to be used in the field phase are developed. Tools range from simple and usual ones like database extracts, documentary analyses, interviews or field visits, to more technical ones like focus groups, modelling, or cost benefit analysis. This site describes a series of tools that are frequently used.

The evaluation toolbox

When designing its work plan, the evaluation team may usefully consult the section of these guidelines devoted to the evaluation toolbox. This guide includes specific explanations, recommendations and examples on how to select and implement evaluation tools. It also proposes a quality assessment grid specific to each tool. 

However, it must be stressed that this guide has been prepared for evaluations at higher levels (country, region, global) and might need some adaptation when used in the context of project/programme evaluation.

The evaluation team relies upon an appropriate mix of tools in order to:

  • Cross-check information sources
  • Make tools reinforce one another
  • Match time and cost constraints.

Each tool is developed through a preparatory stage which covers all or part of the following items:

  • List of sub-questions to be addressed with the tool.
  • Technical specifications for implementing the tool.
  • Foreseeable risks which may compromise or weaken the implementation of the tool and how to deal with them.
  • Mode of reporting within the evaluation team and in the final report.
  • Responsibilities in implementing the tool.
  • Quality criteria and quality control process.
  • Time schedule.
  • Resources allocated.
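The preparatory items above could be tracked in a simple structured record, one per tool. This is only a sketch; none of the field names or example values come from the guidelines.

```python
from dataclasses import dataclass, field

# Illustrative record of the preparatory stage for one evaluation tool.
# All field names and example values are hypothetical.
@dataclass
class ToolPreparation:
    tool: str
    sub_questions: list = field(default_factory=list)   # sub-questions addressed with the tool
    specifications: str = ""                            # technical specifications
    risks: list = field(default_factory=list)           # foreseeable risks and how to deal with them
    reporting_mode: str = ""                            # within the team / in the final report
    responsible: str = ""                               # responsibilities in implementing the tool
    quality_criteria: list = field(default_factory=list)
    schedule: str = ""
    resources: str = ""

focus_group = ToolPreparation(
    tool="focus group",
    sub_questions=["How do beneficiaries perceive the quality of services?"],
    risks=["low attendance: schedule two sessions in different locations"],
    responsible="team member in charge of social issues",
)
```

A record of this kind makes it straightforward to verify, before the field phase, which preparatory items are still empty for each tool.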

From evaluation questions to interviews

The evaluation questions and sub-questions should not be copied and pasted into interview guides or questionnaires. 

Evaluation questions are to be answered by the evaluation team, not by stakeholders. 

The evaluation team may build upon stakeholders' statements, but only through a careful cross-checking and analysis.

[External evaluation team]

First phase report (desk)

The team writes a draft version of the first phase report (desk) which recalls and formalises all the steps already taken. The report includes at least three chapters:

  • A question-by-question chapter including the information already gathered and limitations if there are any, a first partial answer, the issues still to be covered and the assumptions still to be tested, and the final version of the design table.
  • An indicative approach to the overall assessment of the project/programme.
  • The list of tools to be applied in the field phase, together with all preparatory steps already taken.

If required, the evaluation team presents the work already accomplished at a reference group meeting. The presentation is supported by a series of slides.

[Evaluation manager]

Facilitating access to information and following up the evaluation work

The evaluation manager facilitates the retrieval of any relevant document and the access to key informants in the EC and partner Government(s).
He/she receives the first phase report (desk) which recalls the steps already taken and adds the following elements:

  • Progress of the documentary analysis and limitations if there are any.
  • Definition of any unclear term.
  • Partial answers to the evaluation questions on the basis of documents.
  • Issues still to be studied and assumptions to be tested during the field phase.
  • Final methodological design including evaluation tools ready to be applied.
  • Work plan for the field phase.

The evaluation manager submits the draft report to the reference group members for consultation. If appropriate, he/she convenes and chairs a meeting where the report is presented and discussed. Comments are taken into account by the evaluation team in a final version of the report. The evaluation manager approves the report and authorises the launching of the field phase.

Approval of reports

The members of the reference group comment on the draft version of the report. All comments are collected by the evaluation manager and forwarded to the evaluation team. The team prepares a new version of the report, taking the comments into account in two distinct ways:

  • Requests for improving methodological quality are satisfied, unless there is a demonstrated impossibility, in which case full justification is provided by the evaluation team.
  • Comments on the content of the report are either accepted or rejected. In the latter instance, dissenting views are mentioned in the report.

The manager verifies that all comments have been handled properly and then approves the report.

Checklists

See for inspiration the checklists for geographic, thematic and other complex evaluations.

Author: FC, Former Capacity4dev Member
Last update: 7 December 2022
