
BAOBAB: Handbook on Implementation Management

6. Evaluation

6.1 What to expect in this chapter? An overview of topics

The Core issues part of this chapter covers:

• a definition of evaluation, and the difference between evaluation from within and from the outside
• how these types of evaluation fit into the implementation process
• choices in evaluation: performance evaluation and impact evaluation
• who we do evaluations for, and
• who does them.

Under Tools, internal evaluation tools (self-evaluation, SWOT analysis) and external evaluation tools (questions around commissioning an evaluation, participatory evaluation and working with an evaluator) are presented.

6.2 Core issues

6.2.1 What is an evaluation?

Basically, evaluation is about assessing what is (the actual / current situation) in relation to what was intended (the plans, targets, objectives).

The process becomes complex when one has to consider:

• different expectations, agendas, objectives and the related achievements,
• the lapse of time between setting objectives and achieving them (or: how over-ambitious one might have been when setting objectives), and
• the different views of reality (past and present) of the different people involved.

Because of this complexity, methods for evaluating have been designed to organise and manage the evaluation process, so that it is understood and transparent (and therefore acceptable), and useful to all those concerned. Usefulness is important, as evaluation is essentially a management tool, a mechanism for learning about a project, a programme, an enterprise or an institution:



Are you doing things right?
(for example: with respect to your intentions or plans)

Are you doing the right thing?
(for example: in terms of real impact, policy and current thinking)

The information gathered to answer these basic questions assists in decision-making about the ongoing management of a project, its life or its closure. The process can be considered “diagnostic and therapeutic”: it can result in changing activities, re-allocating resources, re-planning, shifting focus, expanding or rightsizing operations, organisational mergers, or even in closing down altogether.

A definition to consider...

An evaluation is a process to enable a project to examine its effectiveness and efficiency after a reasonable period of implementation. The purpose of an evaluation is to affirm the project’s work (plans, activities, impact), or to transform it, so that the project can effectively deliver, make an impact and meet the challenges it faces.

Questions:
What is effectiveness?
What is efficiency?
What does effectiveness translate into in your project?
What does efficiency translate into in your project?



6.2.2 Evaluation and project cycle management

An evaluation event is distinct from the ongoing monitoring of activities, resources, outputs, and even impact of a project. Monitoring involves gathering information for day-to-day decision-making and management. Evaluation focuses on in-depth reflection at a switch-point considered significant in the life of a project: this could be at the end of a work phase, whenever special circumstances (for example a crisis) demand it, or at the end of the project itself.

Evaluation asks: How have we done? (after a project period is completed)
Monitoring asks: How are we doing? (whilst a project phase is running)

[Diagram: the project cycle and its management functions (inform, plan, motivate, steer, decide, set objectives, organise, control, implement, monitor, evaluate), embedded in its environment. From the project idea, the Identification Phase (understand the actual situation: problems, actors, views; objectives, visions; potentials; establish a system of objectives) leads to the project objectives. The Design Phase (establish the project strategy; agree on the structural set-up, organisational and financial; prepare the decision to implement the project) leads to the project plan. The Phases of Implementation (operationalise planning; organise the internal structure and external relations; assess implementation through monitoring; evaluate; make required adjustments; report; re-plan; terminate the project) lead to the end of the project.]


6.2.3 Levels of evaluation: performance and impact

We repeat something already presented in previous chapters:

[Diagram: the project system. Inputs (resources: personnel, material, finances) feed activities, which produce outputs / services for the clients / customers / target groups / beneficiaries / members. The commissioning agency / donor / members set the task, within the frame conditions. Management decisions (strategic planning, organising, planning operations, re-planning) are based on information from monitoring. On the right-hand side, evaluating is shown at two levels: performance (resources, activities, outputs) and impact (use and benefit at the level of the target groups).]



Please refer to the above diagram, parts of which you will have seen in previous chapters. On the right-hand side, under “evaluating”, terms commonly used by professional evaluators are related to the project system we have introduced here, namely evaluating cost-effectiveness, benefits in relation to cost (cost-benefit), impact evaluation and performance evaluation. Performance and impact evaluation are the most common, and any comprehensive evaluation includes both:

Performance: are all resources being used efficiently?

• can you do better what you are doing?
• is your staff competent and efficient?
• can you produce more / better goods?
• can you deliver more / better services?
• can you improve on quality?

Impact: are the project beneficiaries using what we deliver?

• are the intended beneficiaries benefiting, or someone else?
• are we making the impact we set out to make?
• are our goods / services responding to the community’s need?

At all levels of evaluation appropriate measurement is necessary, for example:

How much …? Over what time period? At what cost?
How many people used our services? Over what period of time?
How many people benefited? Over the short, medium and long term?
What was the impact on the benefiting community?
Was the impact different among different groups?
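As a very simple illustration of such measurement, the sketch below (Python; the record fields and figures are our own hypothetical example, not prescribed by this handbook) shows how counts, periods and costs can be combined into basic performance and impact indicators.

# A minimal sketch: turning "how much, how many, at what cost, over what period"
# into simple performance and impact indicators. All names and figures are illustrative.

from dataclasses import dataclass


@dataclass
class PeriodRecord:
    period: str              # reporting period, e.g. "2012, first half"
    total_cost: float        # cost of delivering the service in this period
    services_delivered: int  # how much was delivered
    people_served: int       # how many people used our services
    people_benefiting: int   # how many of them report a benefit


def indicators(r: PeriodRecord) -> dict:
    """Performance: cost per unit / per person served. Impact: share of users who benefit."""
    return {
        "period": r.period,
        "cost_per_service": r.total_cost / r.services_delivered,   # performance
        "cost_per_person_served": r.total_cost / r.people_served,  # performance
        "benefit_rate": r.people_benefiting / r.people_served,     # impact
    }


if __name__ == "__main__":
    first_half = PeriodRecord("2012, first half", total_cost=12000.0,
                              services_delivered=300, people_served=250,
                              people_benefiting=180)
    print(indicators(first_half))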

6.2.4 Internal and external evaluation



Internal evaluation involves a self-reflection or self-evaluation process, and is conducted within a project or organisation. Project staff would take on an internal evaluation as a step following (and based on) the monitoring of project processes (planning, implementation, and interaction with the external environment). It is an event during which an internal assessment of information (gathered during monitoring) happens, focusing on progress in implementation in relation to plans and the project environment. It is conducted by project management and staff.

External evaluation happens with the involvement of someone from the outside, often as a result of an external impetus, for example from a donor. It is usually initiated at a crucial point in the life of a project, for example:

∗ when the project results and projected benefits defined in the project plan are checked (maybe there is evidence that they cannot be attained)
∗ when important frame conditions change (for example, qualified and experienced staff cannot be found)
∗ if a fresh look from the outside is considered necessary for a long-running project
∗ when a funding phase is nearing completion and the question of re-funding (and to what extent) arises
∗ when the experiences gained in a project are to be used to improve the implementation of other projects or to plan new ones.

6.2.5 For whom do you evaluate?

Evaluations yield information useful to many stakeholders in projects...



o as you have seen, donors or funders require us to evaluate
o partner organisations sometimes require assessments of our work
o the communities we serve require reports on our work and the funding we raise in their name
o the project staff need evaluations to see how they are doing, and how they can improve.

Depending on for whom an evaluation is meant, the orientations and the forms of evaluation may differ.

6.2.6 Who conducts evaluations?

Internal evaluations take place within the project, as staff and management participate in a self-reflection process without external intervention.

External evaluations have (historically) been conducted as externally driven evaluations or externally facilitated evaluations.

Externally driven evaluations are those arranged with external evaluators, and/or instigated (and funded) by donors. There are some potential strengths and some potential weaknesses built into this approach, which a project team may want to make use of, or guard against. Such evaluations are conducted by individuals or teams employed on short-term contracts to investigate the project and compile a report (for the donor, which may be copied to project staff).



The evaluators often work on site, and interview project staff in depth, but the frame, structure and process of the evaluation are decided between the expert evaluators and the commissioning agency (e.g. the donor).

Potential strengths:
• External person(s) observe internal matters.
• External (often international) expertise is brought in to comment on a project.
• Evaluators can bring in new objectives or strategies not considered before.

Potential weaknesses:
• Evaluators cannot get a realistic picture of the project in a short time.
• The report is in the donor’s language, which may differ from the project language.
• The donor’s questions may be answered in the evaluation, but not those of the project staff.
• Results can be judgmental and relate to the evaluators’ standards of performance or impact.
• Evaluators get information but do not give back.

What do you want to do?
• Prepare carefully to introduce external persons into the project reality.
• Get a full and uncensored report in the project language.
• Make sure staff questions are included in the evaluation brief.
• Ensure the project’s own objectives and strategies are used as the reference for assessments.

Generally, therefore, you may want to pursue the participatory evaluation approach:

In externally facilitated evaluations, people external to the project facilitate a participatory process in which the project stakeholders (staff, beneficiaries, partners) have a say in the frame, structure, scope and process of the evaluation, in order that the evaluation results may be of practical use to themselves. The external evaluators bring in external (often international) project management and technical subject expertise (e.g. in rural small business), whilst ensuring that the voices (concerns, expertise, opinions, complaints) of all local stakeholders are heard.¹ The facilitator ensures that the process is honest and open, and is in line with appropriate current thinking in the relevant fields.

¹ This is known as a constructivist approach to evaluation, as people are allowed to construct their own evaluation criteria, to participate in information gathering and assessment, to make their own consensus judgements, and to develop their own recommendations.



6.3 Tools

6.3.1 Internal evaluation

One helpful tool to prepare for and focus an internal self-reflection process is the SWOT analysis, which looks at Strengths and Weaknesses that have emerged in the past, and Opportunities and Threats anticipated for the future. It can be conducted in a workshop situation, and/or among different groups, ensuring that all views, opinions and realities are included in the picture.

Examples of useful questions are:

Past, general: How did we respond to which challenges? In which areas do conflicts arise?

Strengths (past, positive): What have we done well? What needs have we met? What have been our strong points? What have been our assets?

Weaknesses (past, negative): Where have we failed? At which levels have we had problems? What have been our weak points?

Future, general: Is government policy or legislation likely to change and affect us positively or negatively? Is the funding environment likely to support our work?

Opportunities (future, positive): Who will need us, our services, our products?

Threats (future, negative): Who else will be offering the same or similar services? Who else will be producing the same products? Will we complement one another or are we rivals?



SWOT is (only) a preparatory tool, as it helps to brainstorm ideas but does not help in the assessment. A specific tool for the assessment is the comparison between ideas / targets / hopes and where you are at this point in time: the comparison between the actual and the plan.

This can be done in the following seven-step sequence:

1. What did we want to achieve? What were our targets? Our hopes? Our expectations?
2. What would the above have meant specifically for now, for the point in time of the evaluation?
3. Where are we now? What have we really achieved? What is our actual situation? What effects did our work have that were not foreseen?
4. Where have we actually progressed faster than we thought? And where are we behind our targets / hopes / expectations? Where are we on schedule? Where are we off it?
5. What are the reasons for good progress? What are the reasons for being behind? What are the reasons for being on or behind schedule? How do we understand the deviations?
6. What do we specifically learn from all of the above for each issue? What do we conclude?
7. What do we want to change immediately? What do we want to do differently in the medium or long term?

These seven steps should be applied to every issue in your project. You will then get a systematic evaluation with an intention to change, learning from the past. Obviously, a table can be used to present the findings in summary form:

Issue     | 1. Targets | 2. Specifics | 3. Actual situation | 4. Deviations | 5. Reasons | 6. Learning / Conclusions | 7. Changes
Issue A:  | …          | …            | …                   | …             | …          | …                         | …
Issue B:  | …          | …            | …                   | …             | …          | …                         | …
Issue …:  | …          | …            | …                   | …             | …          | …                         | …
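If you capture these findings electronically rather than on flipcharts, a very small data structure is enough to hold the seven answers per issue. The sketch below (Python; the field names and example content are purely illustrative, not prescribed by this handbook) mirrors the columns of the summary table.

# A minimal sketch of the seven-step summary table as a data structure.
# Field names mirror the table columns; the example content is purely illustrative.

from dataclasses import dataclass, asdict


@dataclass
class IssueEvaluation:
    issue: str       # e.g. "Issue A: training of field staff"
    targets: str     # 1. What did we want to achieve?
    specifics: str   # 2. What would that mean for now?
    actual: str      # 3. Where are we now?
    deviations: str  # 4. Ahead, behind, on or off schedule?
    reasons: str     # 5. Why the deviations?
    learning: str    # 6. What do we learn / conclude?
    changes: str     # 7. What do we change, now and later?


findings = [
    IssueEvaluation(
        issue="Issue A: training of field staff",
        targets="Train 40 field staff by mid-year",
        specifics="Two courses of 20 participants completed by June",
        actual="One course of 18 participants completed",
        deviations="Behind schedule by one course",
        reasons="Trainer recruitment took longer than planned",
        learning="Start recruitment earlier; keep a reserve trainer",
        changes="Second course re-scheduled for September",
    ),
]

# Print the findings as rows of the summary table.
for row in findings:
    print(asdict(row))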

6.3.2 External evaluation

External expertise to facilitate the process of an evaluation is recommended here. In this case staff, management and stakeholders of a project participate in framing and answering the evaluation questions, assessing the answers, and recommending appropriate changes.

A desirable procedure for this process follows. It is basically the same as the one explained for the internal evaluation, only a bit more detailed. Use it as a basis in discussions with an evaluation facilitator, and to track his/her process.

Steps

1. Announcing the evaluation and at the same time affirming the type of evaluation process: Who is to be informed, actively involved, and how? Are the principles of participation and external facilitation clear to all concerned?

2. Deciding on the aim of the evaluation: Why an evaluation at this time? What are we trying to achieve? What do we want to evaluate? What is the purpose of the evaluation? What do the various stakeholder groups want from this evaluation?

3. Deciding what is to be evaluated: Is it important to look at performance or impact, or both? Are we looking at impact on all groups or choosing a narrower focus? What is most useful for which stakeholders (donors, beneficiaries, staff, managers) here?

4. Appointing the evaluation team: Who will co-ordinate, plan and organise the evaluation? Will the evaluation be facilitated by an individual or a team? Apart from someone with expertise in evaluation processes and facilitation, do we also need someone external with subject expertise in our area of work?

5. Formulating a written evaluation plan outlining the process: What are the steps in such an evaluation process? For example:
⇒ revisiting our objectives,
⇒ establishing indicators for measuring achievements,
⇒ designing methods of measuring,
⇒ collecting, organising and presenting data,
⇒ assessing what has been achieved against what was intended,
⇒ analysing the difference between the two,
⇒ making recommendations.
What are the time frames of the evaluation?

6. Revisiting the project objectives (what was planned): Is our original plan available? Do we have differing views on what was planned? Can we revisit and agree now on what was intended (project outputs, use and benefit)?

7. Redefining (if necessary) indicators to measure these objectives: How can achievement of these objectives be measured? Can we measure quantities and quality? Do we need direct or indirect indicators of achievement? Do we need to survey perceptions and opinions, as well as statistics and percentages?

8. Devising the methods of data collection: Questionnaires, interviews, workshops, surveys? Does the chosen method give us the specific information we want? Does the chosen method limit our responses? How much does each method cost (in staff time, in consultants’ time, cost of paper, processing of results)? Do we need extra expertise (in organising and analysing data using computers, for example)?

9. Preparing and testing the data collection methods chosen: Can the respondents answer easily? Are the questions clear? Does the method yield exactly the answers we want? What about language barriers? Are cultural barriers involved? Is any training required for our staff / volunteers in order to use this method? If so, what? (Include any training necessary at this stage.)

10. Collecting data according to the method(s) decided upon: Is our collection method yielding the data we require? If not, can we make amendments?

11. Analysing the data: Are our results different from those we anticipated when we planned the project? If so, by how much? Can we explain this? If so, how?

12. Preparing results for presentation (usually a written report plus a verbal presentation): Who will read the report? Who is our audience? What points need emphasis?

13. Making recommendations based on the results (include them in the report and presentation): What needs to be changed? Do we need to re-plan? Do we need more resources? Do we need to scale down our activities? Do we need more or fewer staff? Do we need to close down a section? Do we need to offer different services? Do we need to change our service times? Do we need to market ourselves differently?

14. Reporting back formally (presenting the report with a summary to the relevant authorities, conducting workshop report-back(s)): Who needs to know the results of the evaluation? Which decision-makers (politicians, staff of other departments, government ministers, donors)?

15. Following up on the decision-making based on the evaluation results: What can change immediately without staff or cost implications? What can be changed in the short, medium or long term? What process(es) of change are required (resistance, support)? Should any of the recommendations be dropped, and for what reasons?



6.4 A piece of management “software”

Evaluation results are contained in a report: this requires communicating the information / findings back to those who will use them in decision-making. This is just as important for your own internal evaluations as for external ones. Here are some hints on report writing.

Report Writing
Overview of problems, mistakes, and possible improvements in report writing

1. Reader - Writer - Relationship

Difficulties:
• The message is conveyed to an unknown or, at best, partially known reader who is remote from the writer in time and place.
• There is no intermediate and/or immediate feedback from the reader(s) to the messages while they are delivered.
• The reader(s) are left to interpret the messages (= duplicate and understand them) by themselves, without the opportunity to seek clarification with the author.

Unconducive practices:
• Not being aware of the reader while conceptualising a report, while actually writing, and while revising an already written draft.
• Not being aware of readers’ reactions.
• Not being aware of readers’ questions.

Recommendations:
• Investigate and imagine clearly who your possible readers are and what their interest in your text is (e.g. do they have to take a decision on the basis of your statements? do they need comprehensive information? do they want an executive summary?).
• Pick the reader up where you, as the writer, suppose her/him to be; have a “mock reader” give you feedback.
• Write as if you were in dialogue with the reader; put yourself into his/her shoes and ask yourself: what would be her/his main question? which questions would follow after each of your statements? (Re-edit!)

2. Writing as a special type of communication

Difficulties:
• The statements are unchangeable / cast in letters, by which the author puts him-/herself on the spot (positively as well as negatively).
• The writer is restricted to a series of words only (without the non-verbal parts of messages, such as tone of voice, gestures, etc.).
• The message usually consists of a more comprehensive set of statements than verbal messages do, or even of a series of arguments / a whole article or a book.

Unconducive practices:
• Pre-emptive excuses for clear statements, so that formulations are vague or unduly rigid / general: passive voice, impersonal pronouns, nouns created from verbs, safety-first formulations.
• Not writing close to how one talks, but trying to imitate “grand elegance”: long sentences, difficult words.
• Unclear writing that reflects unclear thinking (not structured clearly enough for the type of writing in which it is found).

Recommendations:
• (Style I) Use active voice, personal pronouns, action verbs.
• (Style II) Make it short and sweet: write short sentences; only put the minimum necessary information into a sentence / a paragraph / your whole text.
• (Style III) Think before / while writing: explain the background / purpose of your message; give overviews; make conclusive summaries; structure the sequence of your arguments convincingly.
convincingly



3. Prescribed formats for reports

Difficulties:
• Formats for reports are prescribed to different degrees of detail in order to assure comparability and/or quality of contributions.

Unconducive practices:
• Mechanically following the headings given, or filling out the given boxes.

Recommendation:
• Put your own messages into the report format while structuring them according to the prescriptions.

6.5 Closure of the chapter

In this chapter you can expect to have learned:

♦ what evaluation is
♦ what different types of evaluation are practised
♦ when these are useful
♦ how evaluation relates to implementation and how it differs from monitoring
♦ who evaluations are for
♦ who evaluates
♦ what the main / useful questions are in internal and external evaluation
♦ about some pointers that make for effective report writing.
