Everyday evaluation template

Is the evaluation budget too small to give serious attention to the 45 evaluation questions you are supposed to answer within four weeks? Hanneke de Bode has the solution! She has shared a long rant about the contentious power of evaluations on a popular evaluation mail server.

Hanneke contributed to a discussion about the lack of published evaluations commissioned by non-governmental organisations (NGOs). Arguably, one reason is the limited quality that can be achieved with the often very limited resources for smaller evaluations. She has made such a beautiful point that I am not the only one sharing it on my blog – our much-esteemed colleague Jindra Cekan is also going to spread it across her networks, with Hanneke’s kind permission. And here comes Hanneke’s 101 for small evaluations! Does that ring a bell?

Most important elements of a standard evaluation report for NGOs and their donors – about twenty days of work, for about 20,000 € (VAT included)

In reality, the work takes at least twice as much time as calculated and will still be incomplete/ quick and dirty, because it cannot decently be done within the proposed framework of conditions while answering all 87 or so questions that normally figure in the ToR.

EXECUTIVE SUMMARY

The main issues in the project/ programme, the main findings, the main conclusions and the main recommendations, presented in a positive and stimulating way (the standard request from the Comms and Fundraising departments) and pointing the way to the sunny uplands. This summary is written after a management response to the draft report has been ‘shared with you’. The management response normally says:

  • this is too superficial (even if you explain that it could not be done better, given the constraints);
  • this is incomplete (even if you didn’t receive the information you needed);
  • this is not what we asked (even if you had agreement about the deliverables);
  • you have not understood us (even if your informants do not agree among themselves and contradict each other);
  • you have not used the right documents (even if this is what they gave you);
  • you have got the numbers wrong; the situation has changed in the meantime (even if they were in your docs);
  • your reasoning is wrong (meaning we don’t like it);
  • the respondents to the survey(s)/ the interviews were the wrong ones (even if the evaluand suggested them);
  • we have already detected these issues ourselves, so there is no need to put them in the report (meaning don’t be so negative).

BACKGROUND

Who the commissioning organisation is, what they do, who the evaluand is, what the main questions for the evaluators were, who got selected to do this work and how they understood the questions and the work in general.

METHODOLOGY

In the Terms of Reference for the evaluation, many commissioners already state how they want the evaluation done. This list is almost invariably forced on the evaluators, reducing them from independent professionals to ‘hired help’ from a temp agency:

  • briefings by director and SMT members for scoping and better understanding
  • desk research leading to notes about facts/ salient issues/ questions for clarification
  • survey(s) among a wider stakeholder population
  • 20-40 interviews with internal/ external stakeholders
  • analysis of data/ information
  • recommendations
  • processing feedback on the draft report

DELIVERABLES

In the Terms of Reference, many commissioners already state which deliverables they want and in what form:

  • survey(s)
  • interviews
  • round table/ discussion of findings and conclusions
  • draft report
  • final report
  • presentation to/ discussion with selected stakeholders

PROJECT/PROGRAMME OVERVIEW

Many commissioners send evaluators enormous folders with countless documents, often amounting to over 3,000 pages of uncurated text of unclear status (re. authors, purpose, date, audience), more or less touching upon the facts the evaluators are on a mission to find. This happens even when the evaluators provide a short list of the most relevant documents (such as the grant proposal/ project plan with budget, time and staff calculations, work plans, intermediate reports, intermediate assessments and contact lists). Processing them leads to the following result:

According to one/ some of the many documents that were provided:

  • the organisation’s vision is that everybody should have everything freely and without effort
  • the organisation’s mission is to work towards providing part of everything to not everybody, in selected areas
  • the project’s/ programme’s ToC indicates that if wishes were horses, poor men would ride
  • the project’s/ programme’s duration was four/ five years
  • the project’s/ programme’s goal/ aim/ objective was to provide selected parts of not everything to selected parts of not everybody, to make sure that the competent authorities would support the cause and enshrine the provisions in law, that the beneficiaries would enjoy the intended benefits, understand how to maintain them and teach others to get, enjoy and amplify them, that the media would report favourably on the efforts in all countries/ regions/ cities/ villages concerned, and that the project/ programme would be able to sustain itself and have a long afterlife
  • the project’s/ programme’s instruments were fundraising and/ or service provision and/ or advocacy
  • the project/ programme had some kind of work/ implementation plan

FINDINGS/ ANALYSIS

This is where practice meets theory. It normally ends up in the report like this:

Due to a variety of causes:

  • unexpectedly slow administrative procedures
  • funds being late in arriving
  • bigger than expected pushback and/ or less cooperation than hoped for from authorities/ competitors/ other NGOs/ local stakeholders
  • sudden changes in project/ programme governance and/ or management
  • incomplete and/ or incoherent project/ programme design
  • incomplete planning of project/ programme activities
  • social unrest and/ or armed conflicts
  • Covid

the project/ programme had a late/ slow/ rocky start. Furthermore, the project/ programme was hampered by:

  • partial implementation because of a misunderstanding of the Theory of Change (which few employees know about/ have seen/ understand), design and/ or planning flaws and/ or financing flaws and/ or moved goalposts and/ or mission drift and/ or personal preferences and/ or opportunism
  • a limited mandate and insufficient authority for the project’s/ programme’s management
  • high attrition among and/ or unavailability of key staff
  • a lack of complementary advocacy and lobbying work
  • patchy financial reporting and/ or divergent formats for reporting to different donors, taking time and concentration away
  • absent/ insufficient monitoring and documenting of progress
  • little or no adjusting because of absent or ignored monitoring results/ rigid donor requirements
  • limited possibilities of stakeholder engagement with birds/ rivers/ forests/ children/ rape survivors/ people in occupied territories/ murdered people/ people dependent on NGO jobs & cash etc.
  • internal tensions and conflicting interests
  • neglected internal/ external communications
  • un/ pleasant working culture/ lack of trust/ intimidation/ coercion/ culture of being nice and uncritical/ favouritism
  • the inaccessibility of conflict areas
  • Covid

Although these issues had already been flagged in:

  • the evaluation of the project’s/ programme’s first phase
  • the midterm review
  • the project’s/ programme’s Steering Committee meetings
  • the project’s/ programme’s Advisory Board meetings
  • the project’s/ programme’s Management Team meetings

very little change seems to have been introduced by the project managers/ has been detected by the evaluators.

In terms of the OECD/ DAC criteria, the evaluators have found the following:

  • relevance – the idea is nice, but does it cut the mustard?/ others do this too/ better
  • coherence – so so, see above
  • efficiency – so so, see above
  • effectiveness – so so, see above
  • impact – we see a bit here and there, sometimes unexpected positive/ negative results too, but will the positives last? It is too soon to tell, but see above
  • sustainability – unclear/ limited/ no plans so far

OVERALL CONCLUSION

If an organisation is (almost) the only one in its field, or if the cause is still a worthy one, as evaluators you don’t want the painful parts of your assessments to reach adversaries. This also explains the vague language in many reports, and why overall conclusions are often phrased like this:

However, the obstacles mentioned above were cleverly navigated by the knowledgeable and committed project/ programme staff in such a way that in the end, the project/ programme can be said to have achieved its goal/ aim/ objective to a considerable extent.

RECOMMENDATIONS

Most NGO commissioners make drawing up a list of recommendations compulsory. Although there is a discussion within the evaluation community about evaluators’ competence to do precisely that, many issues found in this type of evaluation have organisational, not content-related, origins. The corresponding recommendations are rarely rocket science and could be formulated by most people with basic organisational insight or a bit of public service or governance experience. Where content is concerned, many evaluators are selected because of their thematic experience and expertise, so it is not necessarily wrong to make suggestions.

They often look like this:

Project/ programme governance
  • limit the number of different bodies and make remit/ decision making power explicit
  • have real progress reports
  • have real meetings with a real agenda, real documents, real minutes, real decisions and real follow-up
  • adjust
  • communicate
Organisational management
  • consult staff on recommendations/ have learning sessions
  • draft implementation plan for recommendations
  • carry them out
  • communicate
Processes and procedures
  • get staff agreement on them
  • commit them to paper
  • stick to them – but not rigidly
  • communicate

Obviously, if we don’t get organisational structure and functioning, programme or project design, implementation, monitoring, evaluation and learning right, there is scant hope for the longer-term sustainability of the results that we should all be aiming for.