Each step of the field phase is described according to the respective roles of:
- The evaluation manager
- The external evaluation team
The duration of this phase is typically a matter of weeks when the work is carried out by international experts. The time frame can be extended if local consultants are in charge, with benefits in terms of more in-depth investigation and reduced pressure on stakeholders.
Preparation
The evaluation team leader prepares a work plan specifying all the tasks to be implemented, together with responsibilities, time schedule, mode of reporting, and quality requirements.
The work plan is kept flexible enough to accommodate last-minute difficulties in the field.
The evaluation team provides key stakeholders in the partner country with an indicative list of the people to be interviewed, the surveys to be undertaken, the dates of visits, the itinerary, and the names of the responsible team members.
Interviewing and surveying outsiders
A key methodological issue is how far the project/programme objectives were achieved in terms of benefits for the targeted group and of wider impact. Achievement of objectives is therefore judged from the beneficiaries' perception of the benefits received, rather than from the managers' perspective of outputs or results delivered. Consequently, interviews and surveys should focus on outsiders (beneficiaries and other affected groups beyond the beneficiaries) as well as insiders (managers, partners, field-level operators). The work plan should clearly state the planned proportion of insiders and outsiders covered by interviews and surveys.
Surveying outsiders may require that language and/or cultural gaps be bridged.
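To make the insider/outsider balance explicit, the team may find it helpful to tally the interview plan before finalising the work plan. The short Python sketch below is purely illustrative and is not part of the guidance; the category labels, the sample interview list and the 50% outsider target are assumptions made for the example.

```python
# Illustrative sketch only: the guidance does not prescribe any tool or format.
# The category labels and the 50% target below are assumptions for the example.
from collections import Counter

# Hypothetical interview plan: (interviewee, category), where category is
# "insider" (managers, partners, field-level operators) or "outsider"
# (beneficiaries and other affected groups).
planned_interviews = [
    ("programme manager", "insider"),
    ("implementing partner", "insider"),
    ("field operator", "insider"),
    ("beneficiary group A", "outsider"),
    ("beneficiary group B", "outsider"),
    ("local authority (affected)", "outsider"),
    ("non-beneficiary household", "outsider"),
]

counts = Counter(category for _, category in planned_interviews)
total = sum(counts.values())
outsider_share = counts["outsider"] / total

print(f"Planned insiders:  {counts['insider']}")
print(f"Planned outsiders: {counts['outsider']}")
print(f"Outsider share:    {outsider_share:.0%}")

# Assumed target: at least half of the interviews with outsiders, so that
# achievement of objectives is judged from the beneficiaries' side.
if outsider_share < 0.5:
    print("Warning: the plan leans towards insiders; rebalance before field work.")
```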
The evaluation manager checks that:
- Public authorities in the partner country/countries are informed of the future field work through the appropriate channel.
- Project/programme management are provided with an indicative list of people to be interviewed, dates of visits, itinerary, and names of team members.
- Logistics are agreed upon in advance.
The work plan is kept flexible enough to accommodate unforeseen circumstances in the field.
Specific guidance:
In the case of a multi-country programme, the country case studies allow the evaluation team to gather information on the programme at country level. Together with the desk phase, the findings of the country case studies feed the overall assessment made by the evaluation team. The work plan should make it clear that country case studies are not to be considered stand-alone evaluations. Time permitting, the first country case study can be used as a test of the methodology.
In the case of a participatory evaluation, the work plan includes a series of workshops or focus groups allowing beneficiaries to frame the data being gathered.
Initial meeting
Where relevant, the evaluation team proposes an information meeting in the country/area within the first days of the field work. The following points are covered:
- Presentation and discussion of the work plan.
- How to access data and key informants.
- How to deal with and solve potential problems.
Data collection and analysis
The evaluation team implements its field data collection plan. Any difficulties which arise are immediately discussed in the team. Wherever necessary, solutions are discussed with the evaluation manager.
Ethical behaviour in collecting data
The evaluation team has a responsibility not only towards the commissioning body, but also towards groups and individuals involved with or affected by the evaluation, which means that the following issues should be taken into careful consideration:
- Interviewers should ensure that they are familiar with and respectful of interviewees' beliefs, manners and customs.
- Interviewers must respect people's right to provide information in confidence, and ensure that sensitive data cannot be traced to its source.
- Local members of the evaluation team should be left free to either endorse the report or not. In the latter case, their restricted role is clearly described in the report.
- The evaluation team should minimise demands on interviewees' time.
- While evaluation team members are expected to respect other cultures, they must also be aware of the EU's values, especially as regards minorities and particular groups, such as women. In such matters, the United Nations Universal Declaration of Human Rights (1948) is the operative guide.
- Evaluation team members have a responsibility to bring to light issues and findings which relate indirectly to the Terms of Reference.
- Evaluations sometimes uncover evidence of wrongdoing. What should be reported, how and to whom are issues that should be carefully discussed with the evaluation manager.
It must be clear to all evaluation team members that the evaluation is neither an opinion poll nor an opportunity to express one's preconceptions. Field work is meant to collect evidence that is as strong as possible, i.e., from strongest to weakest:
- Direct observation of facts including track records, photographs, etc. (strongest).
- Statements by informants who have been personally involved.
- Proxies, i.e. observation of facts from which a fact at issue can be inferred.
- Indirect reporting on facts by informants who have not been personally involved (weakest).
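When reviewing its factual base, the team may find it convenient to tag each finding with the strongest type of evidence supporting it. The Python sketch below is an illustration only, not part of the guidance: the numeric ranking, the sample findings and the cross-checking threshold are assumptions made for the example.

```python
# Illustrative sketch only: the four levels come from the list above; the
# numeric scores and the flagging threshold are assumptions for the example.
from enum import IntEnum

class EvidenceStrength(IntEnum):
    INDIRECT_REPORT = 1     # informants not personally involved (weakest)
    PROXY = 2               # facts from which the fact at issue is inferred
    DIRECT_STATEMENT = 3    # informants personally involved
    DIRECT_OBSERVATION = 4  # track records, photographs, etc. (strongest)

# Hypothetical findings, each tagged with the strongest evidence backing it.
findings = {
    "Training sessions took place as scheduled": EvidenceStrength.DIRECT_OBSERVATION,
    "Beneficiaries report higher incomes": EvidenceStrength.DIRECT_STATEMENT,
    "Spill-over effects in neighbouring villages": EvidenceStrength.INDIRECT_REPORT,
}

# Flag findings whose factual base is weak and needs cross-checking
# before the debriefing meeting.
for finding, strength in findings.items():
    if strength <= EvidenceStrength.PROXY:
        print(f"Cross-check needed: {finding} (evidence: {strength.name})")
```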
Preventing and correcting biases
The evaluation team members are constantly aware of potential biases like:
- Confirmation bias, i.e. tendency to seek out evidence that is consistent with the expected effects, instead of seeking out evidence that could disprove them.
- Empathy bias, i.e. the tendency to create a friendly (empathetic) atmosphere, if only for the sake of achieving a high response rate and completing interviews quickly, with the consequence that interviewees make over-optimistic statements about the project/programme.
- Self-censorship, i.e. reluctance of interviewees to freely express themselves and to depart from the views of their institution or hierarchy, simply because they feel at risk.
- Strategy of interviewees, i.e. purposely distorted statements made with a view to pulling the evaluation conclusions closer to their own opinions.
- Question-induced answers, i.e. answers are distorted by the way questions are asked or the interviewer's reaction to answers.
The evaluation team improves the reliability of data by:
- Asking open questions, which prevents confirmation bias.
- Mixing positive and negative questions, which prevents empathy bias and question-induced answers.
- Constantly focusing on facts, which allows for subsequent cross-checking of data and counters interviewees' strategy bias.
- Promising anonymity (and keeping the promise), which prevents interviewees' self-censorship.
Follow-up
The evaluation manager facilitates interviews and surveys by any appropriate means, such as mandate letters or informal contacts within the Government. The manager is prepared to intervene swiftly at the evaluation team's request if a problem encountered in the field cannot be solved with the help of the project/programme manager.
Quality control
The evaluation team leader checks the quality of data and analyses against the quality criteria set for each tool and against general principles such as:
- Clear presentation of the method actually implemented
- Compliance with work plan and/or justification for adjustments
- Compliance with anonymity rules
- Self-assessment of the reliability of data and the validity of analyses.
Debriefing
One or several debriefing meetings are held in order to assess the reliability and coverage of data collection, and to discuss significant findings. At least one of these meetings is organised with the reference group.
The evaluation team holds an internal debriefing meeting at the end of the field phase. It reviews its data and analyses, cross-checks sources of information, assesses the strength of the factual base, and identifies the most significant findings.
Another debriefing meeting is held with the reference group in order to discuss the reliability and coverage of the data collection, as well as the significant findings.
The evaluation team presents a series of slides related to the coverage and reliability of collected data, and to its first analyses and findings. The meeting is an opportunity to strengthen the evidence base of the evaluation. No report is submitted in advance and no minutes are provided afterwards.
Specific guidance:
In the case of a multi-country programme, the evaluation team holds a debriefing meeting in each country visited, either in liaison with the Delegation or with its participation. A country note is written and circulated to actors in the country for a factual check.
In the case of a participatory evaluation, the evaluation team extends its initial interviews in order to understand the expectations of beneficiaries and other outside stakeholders. A stakeholder analysis is performed and discussed in the inception meeting.
Check lists
See, for inspiration, the check lists for geographic, thematic and other complex evaluations.