
Local Evaluation of Children's Services Learning from the Children's ...

highlighted in Table 1). These include: the knowledge-driven model (an organisation's work is driven by evaluation); the problem-solving model (evaluation is used to solve problems, address specific questions and provide a knowledge base for decision making); the enlightenment model (evaluation provides a greater understanding of concepts and issues); the interactive model (practice and evaluation interactively influence one another); and the political/tactical model (evaluation is drawn on selectively by an organisation to legitimise decisions). It is also useful here to draw on the distinction made by Clarke (2001): evaluation findings tend to be used by organisations in two ways, instrumentally and conceptually. The former denotes the ways organisations may act directly on evaluation findings in terms of, for example, changing practices and deciding to fund particular activities; the latter denotes the ways evaluation material may influence organisations' understanding of concepts more broadly, thereby informing strategies and strategic thinking.

In practice there were mixed perceptions among Children's Fund local evaluators about the extent to which they believed they had influenced partnerships' decision making, although for many the relationships between partnerships and evaluators appear to correspond with the problem-solving model (Young et al., 2002), with evaluation findings used instrumentally (Clarke, 2001). Evidence derived from local evaluation reports and interviews with local evaluators suggests that the majority of evaluations pragmatically gather and analyse data concerned with the impact of Children's Fund programmes, or with processes such as children's participation and partnership working, with a view to offering material intended to be used by partnerships instrumentally, such as through changing practices and deciding to re-commission particular projects. Indeed, many programme managers participating in the NECF programme managers' questionnaire survey [4] suggested that local evaluation had helped to identify successful/less successful projects (51%) and how they worked well/less well (56%), as well as to make decisions about continuing to fund projects (43%) or which projects to promote for mainstreaming (39%). Many programme managers also indicated that local evaluation had helped the partnership reflect on strategic practices and how to improve them (43%).

There was also some indication that evaluation had offered partnerships new concepts or broader frames of understanding (corresponding with the enlightenment

[4] Based on 120 Children's Fund programme managers' responses.

Chapter 3
