RESEARCH

Local Evaluation of Children's Services
Learning from the Children's Fund

The National Evaluation of the Children's Fund

Research Report RR783


Research Report No 783

Local Evaluation of Children's Services
Learning from the Children's Fund

The National Evaluation of the Children's Fund

The views expressed in this report are the authors' and do not necessarily reflect those of the Department for Education and Skills.

© The University of Birmingham 2006
ISBN 1 84478 776 1


When citing this report please show authorship in full as follows: Spicer, N., and Smith, P. (2006)

The National Evaluation of the Children's Fund (NECF) ran from January 2003 to March 2006. A large number of people were involved in a variety of ways. Here we list members of the team who worked on either part-time or full-time bases during the thirty-nine months of the evaluation.

NECF Team Members

This report has been produced with the help of members of the Birmingham-based NECF research team.

Lul Admasachew*
David Allan*
Professor Marian Barnes (Deputy Co-Director)
Hanne Beirens
Adrian Blackledge
Liz Chilton*
Jane Coad
Laura Day-Ashley
Professor Anne Edwards (Director)
Ruth Evans
Tony Fielding
Rachel Hek
Nathan Hughes
Lucy Loveless
Natasha MacNab
Angus McCabe*
Maggie McCutcheon
Kate Morris (Project Manager)
Paul Mason
Kath Pinnock
Gill Plumridge
Hiromi Yamashita

The report draws on work undertaken by the whole team. Those who have particularly contributed to producing it are marked by *.

NECF can be contacted at:
The National Evaluation of the Children's Fund
Ground Floor
The Watson Building
The University of Birmingham
Edgbaston
Birmingham B15 2TT
Tel: 0121 414 8089 – Amanda Owen-Meehan, Senior Project Administrator
Email: A.Owenmeehan@bham.ac.uk


Acknowledgements

NECF would like to thank the following people for the time they gave to the evaluation. It was much appreciated.

• The children, young people and their families.
• The practitioners who worked with the children and young people.
• Local evaluators.
• Members of partnership boards and other strategic groups.
• The programme managers and their teams.
• Regional managers and their teams.
• NECF reference groups who gave on-going feedback as analyses developed.


Contents

Executive Summary
Introduction

Chapter 1  Roles and Principles of Local Children's Fund Evaluations
  1.1 Roles of Children's Fund Local Evaluations
  1.2 Factors Influencing the Evaluation Process
  1.3 Children's Fund Local Evaluations' Principles and Standards
  1.4 Chapter Summary

Chapter 2  Methods Adopted by Children's Fund Local Evaluations
  2.1 Embracing Multiple-method Approaches
  2.2 Measuring Impacts: the Attribution Problem
  2.3 Evaluating Processes
  2.4 Engaging Children and Young People in Aspects of Local Evaluations
  2.5 Chapter Summary

Chapter 3  How Local Evaluation Supports Children's Fund Partnerships' Decision-Making
  3.1 Multiple Approaches to Disseminating Evaluation Findings
  3.2 Partnerships' Use of Evaluation Evidence in Decision-making
  3.3 Factors Facilitating and Inhibiting the Use of Evaluation Findings
  3.4 Chapter Summary

Chapter 4  Prevention, Participation and Partnership Working: Key Messages from Children's Fund Local Evaluation Reports


  4.1 Key Messages from Local Evaluation: Prevention and Preventative Services
  4.2 Key Messages from Local Evaluation: Children and Young People's Participation
  4.3 Key Messages from Local Evaluation: Partnership Working
  4.4 Chapter Summary

Conclusions
References

List of Tables
Table 1  Roles and orientations of social policy evaluation
Table 2  Summary of the multiple roles adopted by Children's Fund local evaluators over the course of partnerships' development


Local Evaluation of the Children's Fund Initiative: Opportunities, Challenges and Prospects

This report focuses on exploring four themes:

• The roles, orientations and principles of local evaluators;
• The evaluation methods that have been adopted by local evaluations;
• The ways local evaluations have sought to support partnerships' decision-making;
• Key messages emerging from local evaluators' reports and the ways local evaluators have sought to understand the underpinning themes of the Children's Fund initiative, namely, prevention, participation and partnership working.

Executive summary

Introduction

1. There is a growing emphasis on evidence-based policy and practice in the UK. The Children's Fund has fully embraced the importance of supporting partnerships' evidence-based decision-making with local evaluation.

2. Traditional experimental research and evaluation methods are now widely seen as inappropriate for identifying what works in complex community-based initiatives such as the Children's Fund, where the relationships between interventions and outcomes can be very complicated.

3. The importance of drawing on qualitative as well as quantitative research and evaluation methods is widely recognised in some government departments. It is also recognised that it may be appropriate to combine different evaluation methods to address different questions in different settings. 'Lay' experiences and perspectives, such as service users' views, are now far more widely seen as legitimate forms of evidence than they were historically.

4. In the past, evaluation has tended to be equated with auditing and performance management. Evaluation is now often seen as generating


learning for policy makers and practitioners by providing formative or even dialogic forms of feedback.

5. A number of factors and conditions influence the impact of research and evaluation on policy and practice. These include perceptions of the credibility of evaluation, the ways evaluation findings are presented and disseminated, the relevance and usefulness of findings, and the external pressures on policy makers and practitioners to act on findings.

Roles and principles of local Children's Fund evaluations

6. The Children's Fund Guidance establishes that partnerships should commission local evaluations of their programmes of work, although variations reflect individual partnerships' needs and priorities. Hence, local evaluators' roles have tended to correspond with partnerships' starting points (in terms of wave, for example), timetables of development, and social and geographical contexts. Some evaluators have been commissioned to undertake whole-programme evaluations, whilst others focus on particular aspects of programmes' work or selected activities and projects.

7. The range of roles adopted by local evaluators includes generating learning about processes intended to support partnerships' strategic planning, ongoing management and development, and providing an evidence base that informs the development of mainstream preventative strategies and children and families' services.

8.
Evaluators have also aimed to measure the impacts/outcomes of Children's Fund activities in terms of how sub-objectives have been addressed, how the initiative has impacted on the lives of service users and how effectively target groups have been reached. This is described by evaluators as forming an evidence base on which partnerships are able to make decisions, rather than simply as a summative record of performance. This material is also intended to inform decisions about projects and activities in terms of whether to continue to fund them or promote them for mainstreaming.

9. Some evaluators have actively supported projects' and programmes' capacities to undertake evaluation and monitoring activities themselves through, for example, providing training and designing 'toolkits'.


10. Local evaluators highlighted their experience of a number of engagement and methodological problems. Some evaluators indicated that time commitments and pressures among project staff and strategic stakeholders influenced their abilities to participate in the evaluation.

11. One methodological issue stems from the roles many local evaluators have as both generating learning to develop more effective practices, and adopting a role regarded by some stakeholders as analogous to auditing Children's Fund projects. Indeed, some local evaluators encountered service providers who were uncertain about the purpose of evaluation, and considerable time was spent convincing them of its benefits. Some concerns were raised that service providers only offered positive accounts of their progress rather than revealing problematic issues, since they connected evaluation to decisions about future commissioning, a concern that was sometimes justified.

12. Local evaluation is undertaken in a complex, changing policy environment. Uncertainties relating to the evolution of local-level structures such as children's trusts were described by a number of evaluators as problematic, as was the announcement of budget cuts for the Children's Fund initiative, which for some undermined a systematic approach to conducting local evaluation.

13. Local evaluators point to a number of principles and standards that have guided them.
These are: to provide useful and relevant material for partnerships; to be responsive to change; to offer independent, balanced accounts of partnerships' activities; to be inclusive of different groups of stakeholders; to use credible methods and evidence to support evaluation findings; and to adopt ethical approaches to evaluation.

Methods adopted by Children's Fund local evaluations

14. Local evaluators drew on multiple quantitative and qualitative data sources and analysis methods that reflected the different elements of partnerships' programmes they aimed to examine. Quantitative methods, sometimes complemented by qualitative methods, were frequently adopted for assessing the impacts of interventions. Qualitative methods, sometimes supplemented by quantitative methods, were primarily employed for generating learning


about processes such as the effectiveness of partnership working, commissioning and approaches to mainstreaming.

15. Quantitative methods adopted by evaluators include structured questionnaires and interviews with stakeholders and analysis of secondary statistical data sources, including the analysis of project monitoring data. Qualitative methods drawn on by evaluators include semi-structured or unstructured interviews with stakeholders, focus groups, observations of project activities and analysis of partnership documents.

16. Evaluators have aimed to measure the impacts of Children's Fund projects and other interventions on aspects of children's lives corresponding to the Children's Fund sub-objectives and the Every Child Matters outcomes. Some evaluators also explored impacts in terms of how effectively projects targeted their services at user groups, or as an assessment of the value for money of particular projects and other activities.

17. There are a number of challenges to measuring impacts attributable to specific Children's Fund programmes and projects. The problems of social exclusion tackled by the Children's Fund are complex, multi-layered and interacting, making it difficult to demonstrate causal links between a single intervention and a measurable outcome. A related difficulty is the limited extent to which time-limited local evaluation is able to identify longer-term outcomes for children using Children's Fund services.

18.
Some evaluators had difficulties responding to the Children's Fund sub-objectives and Every Child Matters outcomes frameworks, which define outcomes in relatively broad terms that are potentially open to a range of interpretations. Established statistical data sets that clearly show change tend not to be available, although those relating to educational attainment and attendance are widely available. Some evaluators also noted the problems of drawing on inaccurate or inconsistent monitoring data provided by projects.

19. Such challenges have led some evaluators to embrace the use of multiple methods and multiple data sources creatively, in ways that enable them to suggest with some confidence that Children's Fund projects have led to positive outcomes for children using the services.


20. Some evaluators involved children and young people as questionnaire respondents or participants in interviews and/or focus groups; many used 'participatory' methods of engaging children and young people. A minority of local evaluators involved children more fully through evaluation design, participation in fieldwork, analysis of data and presenting findings. This was seen as making the questions evaluators asked more relevant to children and young people and helping to generate richer evaluation data.

21. In some cases the lack of resources available to evaluators, difficulties establishing representative samples of those accessing services and the low priority given by the partnership to this type of activity limited the ways evaluators were able to engage children and young people.

How local evaluation supports Children's Fund partnerships' decision making

22. Evaluators have each adopted a range of summative, formative and, for some, dialogic approaches to engaging with partnerships, reflecting their multiple roles at different stages in partnerships' development. Evaluation material is disseminated through reports, presentations and workshops, and websites, and some evaluators attend partnership board meetings, sub-groups or other organisational structures. A number of evaluators emphasised the importance of ongoing dialogue with strategic partners about their development rather than simply providing summative evaluation outputs.

23.
The importance of adopting appropriate means of feeding back evaluation messages to a range of stakeholders, including strategic partners, service providers and the children, young people and families who used Children's Fund services, was recognised by some evaluators. Some produced 'child-friendly' versions of reports or disseminated findings at events that involved children and young people. However, it would appear that few evaluators prioritised service users, including children and young people, as an important audience for evaluation findings.

24. Many local evaluations developed approaches to raising the capacity of Children's Fund projects to self-evaluate their work, through means such as working alongside project staff in developing appropriate self-evaluation 'tools', or through providing training.


25. A number of local evaluation reports identified where their recommendations have been developed into tangible changes in the partnership. Others were uncertain about how evaluation material had influenced, or would influence, the partnership. Evaluations appear to be offering material to partnerships to be used instrumentally, that is, through changing their practices and/or deciding which projects to re-commission. Local evaluation also appears to be used conceptually, in terms of expanding partnerships' understandings relating to prevention, partnership working and participation (that is, an enlightenment model of the relationship between evaluation and decision making).

26. Many programme managers participating in the NECF programme managers' questionnaire survey indicated that local evaluation helped identify successful/less successful projects (51%) and how they worked well/less well (56%). 43% indicated that local evaluation had helped decision-making about continuing to fund projects, and 39% about promoting projects for mainstreaming. 43% of programme managers indicated that local evaluation had helped the partnership reflect on strategic practices and how to improve them. 47% of programme managers indicated that local evaluation had helped to develop the partnership's thinking about prevention, partnership working and participation, or as a means of helping to promote changes in practices and cultures among mainstream agencies (39%). Only 27% of programme managers suggested that local evaluation had been of limited value to the partnership.

27.
Evaluators suggested that they could have more influence on partnerships by providing findings that are relevant, useful, timely and accessible to a range of stakeholders, and by providing realistic recommendations. The importance of engendering a relationship in which the partnership recognises the value of evaluation and is open to both positive and more critical feedback was also noted.

28. A number of factors inhibit evaluators' influence on partnerships' decision making. Pressures from ongoing change nationally and within local authority areas, as mainstream agencies embrace a number of alternative and sometimes diverging agendas, mean that local Children's Fund evaluation may get drawn on selectively to legitimise decisions or to demonstrate the


successes of partnerships' work (that is, a politically/tactically driven model of the relationship between evaluation and decision making).

29. Some evaluators indicated that they experienced difficulties controlling the dissemination of evaluation outputs and how outputs would be used, and indeed had experienced difficulties reporting on negative aspects of partnerships' work. In some cases Children's Fund board members' multiple priorities and their commitments to their own agencies and other partnerships meant they had insufficient time to act on evaluation findings.

30. A number of evaluators experienced disagreement with stakeholders about the legitimacy of different evaluation methods. Some stakeholders favoured 'hard' quantitative evidence and rejected the value of qualitative methods, whilst others appeared to have limited understandings of notions such as indicators and impacts, or had unrealistic expectations of what evaluations could produce. Maintaining ongoing dialogue in order to manage partnerships' expectations is therefore important.

Prevention, participation and partnership working: key messages from Children's Fund local evaluation reports

31. The focus of local evaluators' work on prevention has been to analyse the activities of Children's Fund projects. Most evaluations conceptualised prevention in relation to the Children's Fund sub-objectives, and/or to the Every Child Matters outcomes framework, in assessing projects' progress. A number of evaluators related the work of projects to the four 'tiers' of prevention adopted in the Children's Fund Guidance.

32. Local evaluators widely report on the beneficial impacts of Children's Fund preventive projects on children and young people's lives, and that many projects are effective at targeting 'hard to reach' groups of children and young people. Some evaluators suggest that voluntary and community organisations have been particularly successful in engaging and addressing the needs of traditionally excluded groups through offering alternative approaches to those of the mainstream agencies, in terms of being flexible, accessible and non-stigmatising.


33. Understandings of 'prevention' among strategic stakeholders and service providers within partnerships tend to be described in local evaluation reports as divergent, although some acknowledge that progress is being made by partnership boards in negotiating a consensus of understanding.

34. Evaluators identified diverse definitions of participation among stakeholders in most partnerships. Many evaluators reported that professionals' and adults' definitions of participation tend to predominate, rather than those of children and young people themselves. Often these equate to consulting children and young people in order to inform the development of responsive services. Despite considerable commitment to participation, in practice progress tends to be relatively limited, with few examples of children and young people's direct participation in decision making at strategic and project levels.

35. A number of evaluators suggested that 'hard to reach' children and young people and younger children were particularly under-represented in Children's Fund participative activities, although some evaluators identified examples of participative activities that had been effective in engaging an inclusive range of children and young people.

36. Local evaluators pointed to the importance of partnerships resourcing dedicated participation officers, the need to train practitioners and service managers, and the importance of providing organisational support for service providers to develop effective participative practices. Considerable investment of time is also needed if participation strategies are to be effective. Some evaluations suggest that voluntary and community organisations may be particularly effective in promoting participation due to their abilities to work in flexible and innovative ways.

37. Evaluators reported different experiences and understandings of partnership working, although within many partnerships agreement had been reached on what partnership working means. Some stakeholders saw partnership boards' responsibilities as ensuring probity and demonstrating that they met Central Government requirements. Others emphasised collaborative working between organisations to bring about cultural change in children's services.


38. Conditions reinforcing effective partnership working include: sharing a vision of the purpose of the initiative; transparency of decision making; clarity about stakeholders' responsibilities; and good communications between groups of stakeholders, including members of the partnership board, Children's Fund staff, service providers and service users.

39. Evaluators reported that some collaborative arrangements and networks between Children's Fund projects had been facilitated by partnerships, although many suggested that such arrangements are relatively uneven across partnerships' activities. Other collaborative arrangements and networks were reported as having been established by the organisations themselves.

40. A number of evaluators commented that statutory agencies tended to lead strategically, whilst voluntary and community organisations tended to provide Children's Fund services. However, within some partnerships voluntary and community organisations had been strategically influential; indeed, the Children's Fund had raised the profile of the sector. Some stakeholders predicted, however, that the voluntary and community sector would be marginalised in the next phase of integrating children's services.


Introduction

Aims, scope and methods

There is a growing academic and policy literature on developments in thinking about how evaluation contributes to evidence-based policy and practice. Traditional 'hierarchies' of research and evaluation methods that endorse experimental and quantitative methods are being set aside in favour of combined quantitative and qualitative methods for evaluating complex social policy initiatives. The importance of embracing the experiences, perspectives and accounts of all stakeholders, including service users and children and young people, and of making evaluation findings accessible to multiple audiences, not just professionals and academics, has been an important development in thinking on evaluation. There is also a growing understanding of the complex ways evaluation may (or may not) be used by policy makers and practitioners.

This report considers the local evaluation of the Children's Fund, a national initiative that fully embraces the importance of evaluation in supporting partnerships to assess and develop their programmes of work. The nature and complexity of the initiative does, however, present a number of methodological and practical challenges to evaluators. One challenge is the highly divergent strategies and practices required at the local level to tackle what are complex, multi-layered and long-term issues of social exclusion. This makes it particularly difficult to assess the effectiveness of the initiative using traditional evaluation methods (Edwards & Fox, 2005). Such challenges have resulted in pragmatic creativity in the ways local evaluations have sought to demonstrate outcomes and support Children's Fund partnerships' planning in a context of rapid national and local policy change. The aim of this report is to review the range of approaches adopted by local evaluators, as well as acknowledging some of the tensions and compromises that have characterised their work. The report focuses on exploring four major themes:

• The roles, orientations and principles of local evaluators;
• The evaluation methods that have been adopted by local evaluations;
• The ways local evaluations have sought to support partnerships' decision-making;
• Key messages emerging from local evaluators' reports and the ways local evaluators have sought to understand the underpinning themes of the


Children’s Fund initiative, namely, prevention, participation and partnershipworking.This report is based on <strong>the</strong> analysis <strong>of</strong> local external evaluation reports that havebeen made available to <strong>the</strong> National <strong>Evaluation</strong> <strong>of</strong> <strong>the</strong> Children’s Fund (NECF) todate 1 and <strong>the</strong> analysis <strong>of</strong> detailed, semi-structured interviews conducted withselected local evaluators (n=10). Interviewees were drawn <strong>from</strong> NECF case studypartnerships toge<strong>the</strong>r with a sample <strong>of</strong> members <strong>of</strong> <strong>the</strong> NECF <strong>Local</strong> Evaluators’Reference Group. The programme managers’ questionnaire survey conducted by <strong>the</strong>NECF was also drawn on to highlight some <strong>of</strong> <strong>the</strong> ways programme managers sawlocal evaluation being used. 2 The emphasis <strong>of</strong> <strong>the</strong> analysis presented in this report isto illustrate <strong>the</strong> diversity <strong>of</strong> approaches and methods employed by local evaluators 3ra<strong>the</strong>r than exhaustively catalogue all local evaluation activity in all 149 partnerships.Hence, it is not possible for this report to quantify <strong>the</strong> numbers <strong>of</strong> evaluators adoptingdifferent approaches and methods. Nei<strong>the</strong>r is it <strong>the</strong> intention <strong>of</strong> this report to assess<strong>the</strong> quality <strong>of</strong> individual evaluations or particular evaluation reports. 
To this end, agencies/organisations conducting evaluations and individual Children’s Fund partnerships are not identified, to protect their anonymity.

Developments in evaluation and evidence based policy

In the UK there is growing emphasis on the use of evidence in policy making, reflecting the current government’s pragmatic ideology of ‘what works’, by which decisions at national and local levels are expected to be founded on robust evidence. This has led to considerable expansion in evaluation infrastructure across the country (Solesbury, 2001). Nevertheless, it is recognised that key social problems such as social exclusion and health inequalities have multiple and interrelated facets and causes, including poverty, unemployment, family background, education, neighbourhood, opportunity and lifestyle. Hence, there can be no simple, causal association between a policy intervention and outcomes (Percy-Smith, 2000; Pierson, 2002; Coote et al., 2004). Such recognition is reflected in complex, multiple-layered policy responses to tackling social exclusion and the acknowledgement that a number of agencies would need to work in partnership to address these issues.

Notes:
[1] At the time of writing, reports relating to 80 partnerships were available for analysis.
[2] Based on 120 Children’s Fund programme managers’ responses (October-November 2005).
[3] The local evaluation reports represent the three Children’s Fund waves, regional spread, type of local authority (rural, urban, unitary, two-tier, metropolitan authorities) and agency conducting the evaluation (university, consultant/consultancy, voluntary sector organisation).


These shifts in thinking are widely embraced at local level by area-based initiatives including the Children’s Fund.

This recognition has also led to changes in thinking about evaluation. In the UK, clinical medicine has historically embraced the need for evidence based decision making more than many other sectors, although it has tended to adopt a traditional approach to demonstrating ‘what works’. Standards of evidence in clinical medicine emphasise experimental methods; ‘randomised controlled trials’ and systematic reviews of randomised controlled trials in particular are seen as the ‘gold standard’ of evidence (Becker & Bryman, 2004). Conversely, qualitative methods such as interviewing professionals or service users would be considered the least robust forms of evidence. Whilst ‘lay’ perspectives such as the views of service users have not historically been valued, such experiences are now more readily acknowledged as legitimate and important forms of evidence (Coote et al., 2004; Becker & Bryman, 2004). Traditional experimental approaches are now widely seen as inappropriate for identifying what works in complex community and area-based initiatives in which there are no simple cause-and-effect relationships between interventions and outcomes. A Cabinet Office report, establishing a framework for assessing qualitative evidence, clearly signals that government departments now value the contribution of qualitative methods in social policy research and evaluation (Spencer et al., 2004).
Furthermore, it is widely recognised that a combination of evaluation methods should be used appropriately to address different questions in different settings (Coote et al., 2004; Becker & Bryman, 2004).

There are also changes in the ways evaluators are seen as engaging with policy makers and practitioners. Rather than being equated with traditional auditing and performance management roles, evaluation is frequently expected to provide formative or even dialogic feedback to help policy makers and practitioners develop programmes of activities, as well as to generate more sophisticated, theoretically driven understandings of concepts such as social exclusion. Some evaluation is now committed to redressing social power imbalances and embracing diverse and sometimes contradictory experiences, perspectives and accounts of social problems and policy solutions (Donaldson & Scriven, 2003). Some of these potential roles and orientations identified in the literature are summarised in Table 1. Whilst some of these roles are well established in evaluation practice, others are starting to emerge as important departures in the ways evaluation is seen as engaging with policy makers, practitioners and indeed other stakeholders of social policies, such as communities.

It is also recognised in the literature that the extent to which research and evaluation influence policy depends on a range of factors. These may include: factors relating to the credibility of evaluation, such as data quality, perceptions among practitioners of whether evaluation methods are valid ways of generating knowledge, and the extent to which evaluators are trusted; the ways in which evaluation findings are presented and disseminated, so that they are accessible, timely and meet the needs of the audience; the relevance and usefulness of evaluation findings to policy makers and practitioners; and the abilities of policy makers and practitioners to act on evaluation findings.
These may be influenced by alternative and sometimes conflicting pressures that influence their decision making, such as changing policy directions at national level, policy makers’ and practitioners’ abilities to interpret and apply findings, systems and institutional cultures of accepting and acting on evidence, and policy makers’ and practitioners’ abilities to act on evaluation findings within budgetary constraints (McDonald, 2000; Davies et al., 2000; Percy-Smith et al., 2000; Becker & Bryman, 2004; Coote et al., 2004).

Table 1: Roles and orientations of social policy evaluation

• Generating learning/informing organisations’ development: Generating formative learning to help organisations develop throughout their lives; understanding what works and why; acting as a critical friend
• Performance management: Demonstrating and showing accountability for the impact/success/failure of an organisation’s activities
• Consultant/technician/analyst: Providing solutions to organisations’ problems/questions; analysis of secondary data on behalf of an organisation
• Empowerment/capacity-building evaluation: Facilitating organisations’ reflection and supporting the capacity for self-evaluation/monitoring
• Research-orientated evaluation: Developing and expanding practitioners’/policy makers’ understandings of concepts
• Theory-driven evaluation: The explicit use of a theoretical position that informs analysis/interpretation of evaluation material
• Inclusive evaluation: Attempts to redress power imbalances and inequalities in society; sees communities as principal interest groups of evaluation


Chapter 1: Roles and Principles of Local Children’s Fund Evaluations

This chapter describes the roles adopted by local evaluations of Children’s Fund partnerships and reflects on a number of key factors that have influenced the abilities of local evaluators to undertake their roles at different stages in partnerships’ development. The chapter then explores the range of principles adopted by local evaluators that were seen as reinforcing the credibility and effectiveness of their work.

1.1 Roles of Children’s Fund Local Evaluations

The Children’s Fund Guidance clarifies that local programmes are expected to be innovative and develop new practices and approaches to tackle social exclusion for children and young people (CYPU, 2001). The lessons from the Children’s Fund programmes are to be elicited, and the ways in which projects and other Children’s Fund activities contribute to achieving outcomes for children are to be understood. The document indicates that local evaluators have the role of gauging the performance of Children’s Fund activities and projects; this was expected to support partnerships to make appropriate changes to their programmes:

The key purpose of local evaluation is to allow partnerships and services to better understand how well they are performing and to act on emerging evidence by making relevant changes to their programme.
This will also ensure that the development of the partnership and services is evidence-based.
(CYPU, 2001, p25)

Furthermore, it is intended that local evaluation will identify key lessons from practice to inform local policy on children’s services:

Local evaluation is to be commissioned by the partnership to enable you to understand what works, how and why; to monitor progress against local and national objectives and targets; and to feed in lessons to reconfigure services as a result of evaluation findings… the overall purpose (of local evaluation) should be to inform future policy in relation to the local context and the particular needs and circumstances of the community. The local evaluation should consider whether services provided by the partnership are reaching young people, children and families showing early signs of difficulty and what effect, if any, the services are having on their lives.
(CYPU, 2001, p7)

In practice, local evaluators have interpreted the aims defined in the Guidance and applied aspects of them as outlined in their reports. The Children’s Fund initiative has allowed for partnerships to develop strategies and practices according to their relevance at the local level. Reflecting this, local programmes have sought to define the specific roles evaluators take on to correspond with programmes’ priorities, starting points and timetables of development, and with the social and geographical contexts and histories of developing services. Some local evaluators have therefore been commissioned to undertake whole programme evaluations looking at all aspects of the progress of a programme, whilst others have focused on particular aspects of programmes’ work or selected activities and projects. It is clear that individual evaluators have adopted a wide range of tasks; the range of tasks adopted by a cross-section of evaluations is illustrated in Table 2.

Table 2: Summary of the multiple roles adopted by Children’s Fund local evaluators over the course of partnerships’ development

Principal role: Strategic planning
• Supporting a partnership by providing information and advice that is drawn on in strategic planning
• Providing baseline data, such as identifying existing services and activities and gaps in service provision; local socioeconomic data; and how the partnership fits alongside existing statutory/VSC services in the local authority area.
Such activities are depicted as informing early decisions such as commissioning projects and other activities.
• Assessing the position of partnerships in relation to local and national policy changes, drawing on policy and research documents, in order to keep partnerships up-to-date in a context of changing policy

Principal role: Ongoing management/development
• Evaluating processes such as partnership working and children’s participation
• Highlighting examples of successful/unsuccessful practices
• Highlighting examples of good practice from outside the local authority area to inform programmes
• Highlighting challenges and barriers to a partnership’s work
• Evaluating specified themes or aspects of programmes’ work
• Looking at change over the course of a partnership’s life
• Mapping partnership activities over time (for example, descriptions of the range of ways Children’s Fund service providers embrace children and young people’s participation)
• Providing sets of recommendations to partnerships
• Evaluating projects to inform ongoing commissioning/funding of projects

Principal role: Informing the mainstream
• Evaluating the Children’s Fund is seen as providing an evidence base that informs mainstream development of preventative strategies and children and families’ services

Principal role: Capacity building/evaluation support
• Supporting projects (alongside programmes’ central teams) in relation to evaluation and monitoring requirements through one-to-one visits with projects, evaluation seminars and developing evaluation ‘toolkits’. Evaluators have adopted training roles and have facilitated projects exchanging practices around monitoring and evaluation
• Supporting projects to seek ongoing resources; assisting in the development of action plans

Principal role: Measuring impacts/outcomes
• Assessing the extent to which (selected) Children’s Fund activities are successfully achieving the sub-objectives and overall strategic objective of the initiative at various stages in a partnership’s development
• Assessing the impacts of projects/activities on service users (children and young people, parents and carers)
• Assessing whether Children’s Fund projects/activities are effectively reaching specified groups of service users
• Assessing the extent to which partnerships have made an impact on the development of local children and families’ services/development of strategic plans
• Evaluating projects to inform the prioritisation of ongoing funding

Local evaluators have typically sought to divide their work, or at least their approach, in terms of a consideration of processes and/or impacts. Illustrating this point, the evaluation of a Wave Three partnership, for example, clarifies how their work was divided according to these areas:


1. Process evaluation: critical assessment of the processes used to deliver the Fund, mainly prevention, partnership and participation, but also strategic plans and the commissioning and monitoring processes.
2. Impact evaluation: critical assessment of the impact the processes have had on the children and families who access Children’s Fund services, specifically the impact that government envisions in the Children Act 2004.

Some evaluations placed more emphasis on generating learning about processes, while others placed more emphasis on evaluating impacts. Hence, local evaluations tend to correspond with both the generating learning/informing organisations’ development and the performance management roles outlined in Table 1.

Many evaluators appear, however, to have adopted different roles over the course of their work. In the early stages of partnerships’ development, evaluators tended to focus on providing feedback on processes such as commissioning and partnership working, and in doing so adopted formative forms of engagement with partnerships in which they supported early strategic planning and the ongoing management and development of programmes. As partnerships became more established, many evaluations switched their focus to producing summative outputs in which they measured impacts/outcomes.
They also focused on aspects of programmes which had been successful in meeting targets, and sought more systematically to identify areas of work which would be appropriate to continue to fund or to promote for mainstreaming.

The majority of evaluators described the role of evaluating impacts and that of generating information to inform the ongoing development of partnerships as overlapping; the former is expected to form an evidence base for partnership decision making, rather than simply a summative record of performance. Illustrating this overlap, an evaluation report of a Wave Three partnership emphasises the analysis of partnership and project performance, findings from which form an evidence base that is intended to inform the programme and the strategic development of mainstream agencies:

The aim of this local evaluation is to provide the partnership and projects with an analysis of current performance, and to use this understanding to develop a strong evidence base to make changes and improvements to the programme, which can also feed into mainstream provision and strategic development.


Similarly, a Wave One partnership report highlights that evaluation aims to identify what works and how, how these relate to targets, and that findings are expected to inform decision making beyond the Children’s Fund initiative:

The local evaluator is the ‘critical friend’ of the Children’s Fund programme, providing the local agencies with information about what works, what actual impact the projects have had, and how this impact came about. Monitoring the progress towards national as well as local objectives and targets is an important aspect of local evaluation. The evaluator is required to focus on the processes, outputs and short-term outcomes of Children’s Fund services and should be able to influence the short-term direction and thrust of local services, almost in an action research model.

Two further roles were apparent in local evaluators’ work. Firstly, some evaluations have tended to adopt consultant/technician/analyst roles (see Table 1), by which they have carried out specific tasks defined by partnerships, such as gathering and analysing baseline data at some stages in the evaluation. These activities include mapping activities introduced by a partnership, carrying out broader baseline surveys of existing services relevant to children and young people in a local authority area, or mapping the socioeconomic contexts that have informed early decisions on commissioning. These tasks indicate that some evaluators had relatively limited scope to define their own methods, interpret data and evaluate partnerships’ activities.
Secondly, a number of local evaluations have adopted capacity building roles (often also referred to as empowerment evaluation) in which they have sought to support partnership stakeholders, frequently project officers, to undertake evaluation activities themselves. The latter is detailed more fully in Chapter 2 of this report.

It is therefore evident that evaluations have often adopted multiple roles that change in emphasis over time, including varying degrees of capacity building. Some evaluators also referred to changes stemming from policy developments at national and local levels that have influenced their abilities to undertake different roles effectively; these are described below.

1.2 Factors Influencing the Evaluation Process

Relations with stakeholders

Local evaluators highlighted a number of engagement and methodological problems they encountered. They described a broad problem of engaging with service users through projects, and of ‘research fatigue’ among stakeholder groups, with responses being fewer than anticipated. Some reports pointed to the problems of time commitments and pressures among project providers and strategic stakeholders, limiting their engagement in the evaluation. It was noted by a Wave Three evaluator, for example, that the bureaucratic requirements already imposed upon service providers by the initiative made the evaluators reluctant to ask more of them:

… the amount of paperwork being asked of people and the amount of tasks they were being asked to do in order to get relatively small amounts of money from the Children’s Fund, were overburdening… we really felt that imposing more on them, either in terms of asking them to collect stuff for us… or providing a lot of time for interviewing… really wasn’t fair.

One methodological issue some local evaluators raised related to the perception of their role by different stakeholders. Whilst there was an understanding that evaluators should generate learning to inform partnerships’ and projects’ decision making, there was the more difficult perception, especially amongst service providers, that local evaluation was analogous to an audit of their activities. Indeed, a number of local evaluators described encountering service providers who were uncertain about the purpose of evaluation, and considerable time was spent convincing them of its benefits.
For example, a local evaluator suggested:

I invested a lot of time in just going to see them, reassuring them, being absolutely open with my interview questions and giving them the schedule, giving them the information once I’d written it up. Not for them to change or comment, but just to look at it, to say this is what I’m going to do. And that was also quite time consuming as well, as I remember.

In some cases there were concerns that service providers were offering only positive accounts to local evaluators because they were connecting evaluation to future commissioning decisions. Indeed, this concern was sometimes justified, as evaluators were asked by Children’s Fund programmes to provide information that could determine the future of particular services. This may have resulted in a reluctance among service providers to report more problematic aspects of their services, suspecting that this could threaten future funding. As an evaluator suggested: ‘Honest evaluation gets confused with the PR that is necessary to show that I’m [a service provider] doing this and we need on-going funding.’ Similarly, a report indicated that: ‘The Children’s Fund is both a funding and development body and providers are understandably wary of sharing too many difficulties and problems with the funders for fear of losing their funds’. An evaluator of a Wave Two partnership said: ‘What was also slightly inhibiting was that at that particular point the service providers weren’t used to being evaluated, and some almost saw it as the same as being inspected, and so were very cagey.’
Reflecting this issue, many local evaluation reports tend to provide affirmation of progress rather than highlight less successful aspects of project work. This approach may have meant the loss of important lessons learnt from mistakes made, perhaps occurring more frequently where projects are seen as innovative or ground-breaking.

Evaluation in a context of change

Evaluators have undertaken their work in a complex, changing policy environment in which there have been different understandings locally of how the Children’s Fund should be used. Policy changes in the lead-up to the Children’s Bill/Act and uncertainties about financial management arrangements for the Children’s Fund (the change in responsibilities from the Children and Young People’s Unit to the Department for Education and Skills) were described by a number of evaluators as problematic. Uncertainties relating to the ongoing evolution of local-level structures such as children’s trusts were also described as potentially problematic, as was the announcement of budget cuts for the Children’s Fund initiative, undermining a systematic approach to planning and conducting evaluations.
Indeed, NECF evidence suggests that this experience appears to have parallels with the difficulties Children’s Fund service providers have in planning their activities in the longer term. Illustrating this point, an evaluation report of a Wave Three partnership states: ‘[funding cuts] led to the evaluation having to be conducted on a year-by-year basis with no guarantee of continued funding. The continued uncertainty over funding for year 3 will adversely affect the strategic planning of the evaluation’. The changes were also seen as potentially further undermining the building of relationships with key stakeholder groups, including service providers. As one local evaluation report highlighted: ‘Funding uncertainties experienced by many projects also meant that there was some ambivalence to be engaged in external evaluation when there appeared to be little or no benefit’.

A number of local evaluators reported that their roles and positions had been subject to unanticipated change, or found that such roles resulted in tensions. For example, one local evaluator was originally engaged with Children’s Fund projects by providing facilitation and support to enable projects to learn, develop and improve from evaluation activities. At a later stage the same evaluator was called on to assist in commissioning decisions about future funding for projects, based on their analysis of projects’ performance. This role change was described as exaggerating potential disengagement and mistrust from some project staff, who saw evaluation as tantamount to auditing when originally it had been promoted as a tool for development. Local evaluators described how this motivated some projects to provide positive information to help support future funding rather than a more comprehensive representation of their successes and failures. One evaluator explained:

I have to say, that at the beginning of the evaluation, it was just going to be a programme for the programme itself. But once the evaluation was almost complete, it was quite clear to me that it was then going to be used to audit services, which was inhibiting. Because it puts one in a very difficult position about what level of information you put in, because it can be used as a rod against services. So that was quite difficult and quite ethically challenging.

Conversely, it was suggested by some evaluators that a number of projects have recognised that evaluation can be a useful tool for demonstrating effectiveness to potential funders. For example, an evaluator of a Wave One partnership suggested:

… if they don’t have any evaluation material it makes it very difficult for us to continue the service because we can’t see what impact they are having on the young people they’re working with. And I think service providers kind of understand that.
So <strong>the</strong>y do see a reason why <strong>the</strong>y should be evaluating andalso to kind <strong>of</strong> show <strong>of</strong>f really to people and develop some best practice in <strong>the</strong>area.The same evaluator indicated that more positive relationships had been built withprojects over time through building clear understandings <strong>of</strong> <strong>the</strong> purpose andimportance <strong>of</strong> evaluation:We definitely have a better working relationship than we used to I suppose. Ithink that’s because I’ve been here for a few years now and <strong>the</strong> <strong>Children's</strong> Fundhas been going for a long time. And <strong>the</strong>y kind <strong>of</strong> understand what’s expected <strong>of</strong><strong>the</strong>m more, <strong>the</strong>y understand why we need to evaluate and why <strong>the</strong>y need toevaluate as well.1.3 Children’s Fund <strong>Local</strong> <strong>Evaluation</strong>s’ Principles and StandardsThe Children’s Fund Guidance makes a number <strong>of</strong> statements about <strong>the</strong> standardslocal evaluators are expected to adopt; <strong>the</strong>se include <strong>the</strong> usefulness <strong>of</strong> evaluationsfor developing evidence based understandings <strong>of</strong> what works and why, as well asbeing ‘formative, open, independent and inclusive’ (CYPU, 2001, p11). This alsoreflects current government thinking on standards that evaluations <strong>of</strong> public policyshould adopt. A recent Cabinet Office report, for example, states that evaluationshould: contribute to advancing knowledge and understanding; be defensible interms <strong>of</strong> methods that are appropriate to <strong>the</strong> evaluation questions being addressed;12Chapter 1


conducted rigorously in terms <strong>of</strong> systematic, transparent data collection, analysis andinterpretation; and make credible claims that are well founded, plausible and basedon data ga<strong>the</strong>red (Spencer, et al., 2004).A number <strong>of</strong> local evaluators clearly explicate <strong>the</strong> principles guiding <strong>the</strong> conduct <strong>of</strong><strong>the</strong>ir evaluations in <strong>the</strong>ir reports. O<strong>the</strong>r evaluation reports allude to <strong>the</strong>ir guidingprinciples and standards in <strong>the</strong> descriptions <strong>of</strong> <strong>the</strong>ir approaches and methodologies.The commonly stated principles are described below although it should be noted that<strong>the</strong>re are considerable variations between different local evaluations reflecting <strong>the</strong>irparticular sets <strong>of</strong> aims.Usefulness and relevanceMany local evaluators placed significant emphasis on providing material that wouldbe useful and relevant to Children’s Fund programmes. The need is recognised forregular and frequent reporting <strong>from</strong> local evaluators to ensure that findings andrecommendations have not been made redundant due to a long time lapse in <strong>the</strong>irproduction. The importance <strong>of</strong> being useful is also reflected by <strong>the</strong> formative,developmental approach to relationships between evaluators and partnerships. Thisis reflected in practices such as providing material to inform programme planning, aswell as producing findings at relevant intervals in a partnership’s development inorder to help stakeholders to reflect on and adapt practices.As suggested above, documentary and interview analysis points to <strong>the</strong> ways localevaluators have responded to <strong>the</strong> needs <strong>of</strong> Children’s Fund programmes. 
A range <strong>of</strong>approaches to defining <strong>the</strong> scope and direction were apparent. For some evaluations<strong>the</strong> partnership closely defined <strong>the</strong> local evaluators’ roles and aspects <strong>of</strong> its work that<strong>the</strong> evaluator would examine, although most appeared to have some latitude togenerate research designs and employ methods <strong>of</strong> data ga<strong>the</strong>ring and analysis <strong>the</strong>yfelt were appropriate to address <strong>the</strong> specific questions and needs <strong>of</strong> partnerships.O<strong>the</strong>r evaluations appear to have had greater latitude to define <strong>the</strong> aspects <strong>of</strong>partnerships’ work that <strong>the</strong>y examined. For example, one evaluator <strong>of</strong> a Wave Twopartnership explained:Well it was a bit <strong>of</strong> both. The programme manager decided that she wanted asecond evaluation, and she wanted it to be more in-depth. So she gave me <strong>the</strong>remit in terms <strong>of</strong>, I want it to cover services and I want to know exactly what<strong>the</strong>y’re doing and what <strong>the</strong>y’re not doing. How it was executed was absolutelydown to me… I think although it was a free rein; I think that was an advantageChapter 1 13


to a certain extent, because I was able to employ <strong>the</strong> techniques that I thoughtwere best at that particular time.Many evaluators were engaged in detailed meetings with programme managers andmembers <strong>of</strong> management teams in <strong>the</strong> early stages <strong>of</strong> evaluations and at regularintervals in order to develop sets <strong>of</strong> questions to be addressed, to agreemethodologies, timescales, <strong>the</strong>mes and elements <strong>of</strong> <strong>the</strong> partnership to be evaluatedand a programme <strong>of</strong> feedback. Some evaluators appear to recognise <strong>the</strong> potentialsignificance and benefits <strong>of</strong> adopting a participative approach to evaluation (that is,responding to <strong>the</strong> views <strong>of</strong> service users in shaping <strong>the</strong>ir evaluation work), whilst alsoresponding to <strong>the</strong> articulated needs <strong>of</strong> those stakeholders managing <strong>the</strong> partnership(in particular programme managers and management teams). One Wave Onepartnership, for example, held an event that involved local young people who wereasked to raise questions <strong>the</strong>y would like to be addressed by <strong>the</strong> local evaluation.Chapter 2 <strong>of</strong> this report provides examples <strong>of</strong> participative approaches <strong>of</strong> some localevaluations.Responsiveness to changeThe majority <strong>of</strong> evaluation reports indicate that <strong>the</strong> aims and scope <strong>of</strong> evaluationswere defined in agreement with partnership members; most <strong>of</strong>ten <strong>the</strong> programmemanager and/or members <strong>of</strong> <strong>the</strong> management team. 
Many evaluations also stressed the importance of adopting a flexible approach enabling them to respond to the changing needs of the partnership over time. This is seen as important in a rapidly changing national and local policy environment (as noted in Section 1.2 above). Some evaluations emphasised the importance of regular catch-up meetings with key programme staff to ensure the ongoing relevance of the evaluation. Other evaluations adopted an integrated approach within partnership structures (such as being formal members of partnership sub-groups that focus on evaluation and learning) that enabled them to interactively engage with partnership stakeholders. One Wave One evaluator indicated that ideas about evaluation were shared through a number of networking events. Such gatherings brought together evaluation officers from initiatives such as Sure Start, the Children's Fund and Special Education projects to form a Monitoring and Evaluation Group that aims to combine approaches to evaluation consistently within the local authority area.


Independent, balanced accounts of partnerships' activities

Relatively few reports explicitly stated that a key principle of the evaluation was its independence. One report, for example, indicated that it offered an independent, outsider's view and hence offered perspectives that practitioners may not have. A number of reports indicated that it was their intention to generate learning from both the successes and the failures of a partnership and its projects, and to highlight both intended and unintended consequences of partnerships' work. Some reports stated that they had adopted a 'critical friend' stance in their relations with partnerships: negative comments were seen as assisting partnerships to improve practice rather than as a judgement of poor performance. For example, an evaluation report of a Wave One partnership states: 'Negative comments, where they are given, are intended to be constructive – in the spirit of the "critical friend"'. Many local evaluation reports included some criticism of the progress of programmes, offering explanations as to why progress had been slow and proposing possible solutions to bring about future improvements.
Where <strong>the</strong> role <strong>of</strong> critical friend appeared to workmost successfully was where local evaluators were sensitive to <strong>the</strong> complex policyenvironments programmes were operating within, and where key members <strong>of</strong> <strong>the</strong>Children’s Fund programme were receptive to <strong>the</strong> benefits <strong>of</strong> critical appraisal andprepared to use those findings to bring about improvements.A number <strong>of</strong> evaluators described <strong>the</strong> potential problems <strong>of</strong> balancing <strong>the</strong> intention <strong>of</strong>being responsive to <strong>the</strong> needs <strong>of</strong> partnerships, and maintaining <strong>the</strong>ir independence.An evaluator <strong>of</strong> a Wave Two partnership indicated that whilst she felt <strong>the</strong> evaluationwas in principle relatively independent: I think [<strong>the</strong>re was] minimal input <strong>from</strong> <strong>the</strong>partnership board as well, because <strong>the</strong>n it gets very political and agenda-driven. AndI didn’t feel that phase two [project evaluation] was that way, <strong>the</strong>re had been somedifficulties in managing this independence in practice in terms <strong>of</strong> providing messagesthat <strong>the</strong> partnership board required <strong>of</strong> <strong>the</strong> evaluation:… as an independent evaluator, it’s all great and well, wonderful words andeverything, but actually some people don’t understand that. They still think thatyou’re paid by <strong>the</strong> <strong>Children's</strong> Fund <strong>the</strong>refore you will find what <strong>the</strong>y want. 
And sometimes the partnership board think, well, we pay for you, so you will find what we want you to find, in the way that we want you to find it, right.

Some reports also stressed the importance of evaluation work being open to challenge or correction by partnerships. Indeed, the stage at which an evaluation engages with a partnership is important in terms of how definitive its findings are expected to be. Some reports, particularly early and interim reports, for example, acknowledge that the findings presented should be considered provisional, and that more definitive conclusions are expected in later reports. Some evaluations are taking steps to confirm and verify evaluation findings through means such as establishing reference groups of strategic partners and of children and young people that review and comment on material generated by the evaluation.

Inclusive to different groups of stakeholders

Most evaluators stated that they were drawing on the voices of different stakeholders to comment on different aspects of a partnership's work. Such stakeholders include programme managers and members of Children's Fund staff, both statutory and VCS representatives of partnership boards and other decision-making structures, service providers, and, importantly, the children and young people and parents and carers who are engaged in Children's Fund activities. It is also recognised that it is important to ensure that all participants in an evaluation receive effective feedback (see Chapter 3 of this report). As suggested in a Wave Three partnership evaluation report, it is important to take an inclusive approach 'so people feel involved in the evaluation process, not subject to it'. A small number of reports state that evaluations seek to balance the interests of various stakeholder groups, rather than to support any one group of stakeholders. As a local evaluation report of a Wave One partnership states:

The approach taken in the design, implementation and reporting of this evaluation is that of a public interest rather than 'partisan' evaluation.
It addresses itself primarily to the work that has been done in relation to the stated aims and objectives of the … Children's Fund. Its role is neither to advocate for any one group or organisation, nor to arbitrate within or between them.

Other evaluations have adopted inclusive and participative approaches, aiming to involve children and young people, parents and carers in the evaluation design and/or its practice, as well as involving them as interview/questionnaire/focus group respondents. For example, the specific aim stated by a Wave One local evaluator was to empower those potentially benefiting from the Children's Fund by eliciting their experiences of Children's Fund services:

We chose to focus on stakeholders like children and their carers due to the fact that they are the direct users of the Children's Fund projects. Here we intend to empower the beneficiaries of the Children's Fund by giving them a chance to express their views on 'what works' and 'what doesn't' regarding the projects that they have been taking part in.


It is not possible to accurately quantify the levels and types of engagement of children and young people, parents and carers in local evaluations, although analysis of reports suggests that around half do not involve these groups. Of the rest, most involve them as interview/questionnaire/focus group respondents. A minority of evaluations appear to be involving them in the design stages, in undertaking data collection or in analysing evaluation data. Nevertheless, there are some examples of local evaluations that have extensively involved children and young people. One such evaluation, of a Wave Three partnership that conducted a piece of work on the uptake of services, involved children and young people in recruiting the researcher, designing the evaluation, gathering data and analysing findings. See Chapter 2 of this report for a review of the means adopted by some evaluators to engage children and young people in various aspects of evaluations.

The credibility of methods and evidence

A number of evaluators refer to the importance of adopting a defensible evaluation design: that is, the evaluation is conducted rigorously and systematically in terms of methods of data collection, analysis and interpretation, and the methods adopted are appropriate for answering the particular questions evaluations set out to address. Some reports stress the importance of making credible claims and ensuring these are fully supported by the data.
O<strong>the</strong>r reports state that <strong>the</strong> local evaluations were based on<strong>the</strong> openness and transparency <strong>of</strong> <strong>the</strong> evaluation process and <strong>the</strong> methods used interms <strong>of</strong> data collection, analysis and interpretation <strong>of</strong> data. Hence, messagescontained in evaluation outputs should be fully supported by evidence. A broadly heldview is that evaluation methods should reflect <strong>the</strong> particular aspects <strong>of</strong> a partnershipthat evaluations are examining; this is reflected in use <strong>of</strong> multiple methods localevaluations use across different aspects <strong>of</strong> <strong>the</strong>ir work. O<strong>the</strong>r reports emphasise <strong>the</strong>importance <strong>of</strong> a broad-based view <strong>of</strong> partnerships both in terms <strong>of</strong> examining asmany aspects <strong>of</strong> partnerships’ work as possible and situating this work within localand national policy contexts and academic literature. The ways evaluation methodshave been used in combination in examining aspects <strong>of</strong> partnerships’ work andaddressing particular questions are explored in Chapter 2 <strong>of</strong> this report.Ethical approaches to evaluationA number <strong>of</strong> evaluation reports allude to ethical considerations that have guided <strong>the</strong>conduct <strong>of</strong> ga<strong>the</strong>ring data. Such considerations include <strong>the</strong> importance <strong>of</strong> beingculturally and linguistically sensitive in working with diverse groups and to berespectful to evaluation participants. Some evaluators indicated that <strong>the</strong>y had appliedChapter 1 17


methods that were seen as being appropriate to different groups <strong>of</strong> stakeholders;methods for working with children and young people, for example, would be verydifferent to those for interviewing strategic stakeholders. O<strong>the</strong>r evaluators point to <strong>the</strong>importance <strong>of</strong> building trust among stakeholders, in particular service providers andservice users as <strong>the</strong> basis <strong>of</strong> encouraging a willingness to commit time and provideaccurate information. An evaluation <strong>of</strong> a Wave Three partnership highlights <strong>the</strong>danger <strong>of</strong> building false expectations among service providers, who might believethat participating in <strong>the</strong> evaluation may lead to <strong>the</strong> continuation <strong>of</strong> funding.1.4 Chapter Summary1. The Children’s Fund Guidance establishes that partnerships shouldcommission local evaluations <strong>of</strong> <strong>the</strong>ir programmes <strong>of</strong> work, although variationsreflect individual partnership’s needs and priorities. Hence, local evaluators’roles have tended to correspond with partnerships’ starting points (in terms <strong>of</strong>wave, for example), timetables <strong>of</strong> development, and to social andgeographical contexts. Some evaluators have been commissioned toundertake whole programme evaluations, whilst o<strong>the</strong>rs focus on particularaspects <strong>of</strong> programmes’ work or selected activities and projects.2. The range <strong>of</strong> roles adopted by local evaluators includes generating learningabout processes intended to support partnerships’ strategic planning, ongoingmanagement and development, and providing an evidence base that informs<strong>the</strong> development <strong>of</strong> mainstream preventative strategies and children andfamilies’ services.3. 
Evaluators have also aimed to measure the impacts/outcomes of Children's Fund activities in terms of how the sub-objectives have been addressed, how the initiative has impacted on the lives of service users and how effectively target groups have been reached. This is described by evaluators as forming an evidence base on which partnerships are able to make decisions, rather than simply a summative record of performance. This material is also intended to inform decisions about projects and activities in terms of whether to continue to fund them or to promote them for mainstreaming.


4. Some evaluators have actively supported projects' and programmes' capacities to undertake evaluation and monitoring activities themselves through, for example, providing training and designing 'toolkits'.

5. Local evaluators highlighted their experience of a number of engagement and methodological problems. Some evaluators indicated that time commitments and pressures among project staff and strategic stakeholders influenced their ability to participate in the evaluation.

6. One methodological issue stems from the dual roles many local evaluators have: generating learning to develop more effective practices, whilst also adopting a role regarded by some stakeholders as analogous to auditing Children's Fund projects. Indeed, some local evaluators encountered service providers who were uncertain about the purpose of evaluation, and considerable time was spent convincing them of its benefits. Some concerns were raised that service providers only offered positive accounts of their progress rather than revealing problematic issues, since they connected evaluation to decisions about future commissioning – a concern that was sometimes justified.

7. Local evaluation is undertaken in a complex, changing policy environment. Uncertainties relating to the evolution of local-level structures such as children's trusts were described by a number of evaluators as problematic, as was the announcement of budget cuts for the Children's Fund initiative, which for some undermined a systematic approach to conducting local evaluation.

8.
<strong>Local</strong> evaluators point to a number <strong>of</strong> principles and standards that haveguided <strong>the</strong>m. These are: to provide useful and relevant material forpartnerships; to be responsive to change; to <strong>of</strong>fer independent, balancedaccounts <strong>of</strong> partnerships’ activities; to be inclusive to different groups <strong>of</strong>stakeholders; to use credible methods and evidence to support evaluationfindings; to adopt ethical approaches to evaluation.Chapter 1 19


Chapter 2: Methods Adopted by Children's Fund Local Evaluations

This chapter reflects on the ways local evaluations have embraced multiple (quantitative and qualitative) methods and types of evidence in assessing different aspects of Children's Fund partnerships' work. The chapter highlights the range of methods employed by evaluators to measure the impacts of Children's Fund projects and other interventions, and the methods used to generate learning from processes including partnership working and commissioning. The methods used to gather data from children, young people and their families, and other ways children and young people have engaged in the evaluation process, are then described.

2.1 Embracing Multiple-Method Approaches

Local evaluators tended to draw on multiple quantitative and qualitative data sources and analysis methods that reflected the different elements of partnerships' programmes they aimed to examine. Quantitative methods, sometimes complemented by qualitative methods, were frequently adopted for assessing the impacts of Children's Fund project interventions in terms of successfully reaching target groups and having a positive impact on aspects of their lives. Qualitative methods, sometimes supplemented by quantitative methods, were primarily employed for generating learning about the processes of Children's Fund activity, such as the effectiveness of partnership working, commissioning processes and approaches to mainstreaming.
This research strategy reflects a trend in the UK, referred to earlier in this report, towards broadening what is considered legitimate evidence in evaluative research, and in particular a move towards embracing and valuing qualitative as well as quantitative data collection, reflecting the acknowledgement that different methods are appropriate for evaluating different aspects of organisations' work (for example, Spencer et al., 2004).

A Wave One evaluator's multi-strategy approach, for example, typified this orientation through using both quantitative and qualitative methods differentially to evaluate varying aspects of the partnership's work. This evaluator carried out structured (quantitative) questionnaire surveys with strategic partnership members and with service providers, together with value-for-money assessments, alongside more qualitatively oriented methods such as focus groups, in-depth interviews with stakeholders and observations. The structured questionnaires were deemed appropriately confidential and non-intrusive for gathering honest opinions and perspectives to effectively '…measure Children's Fund performance against Every Child Matters outcomes and to gather further information about the Children's Fund aims of prevention, partnership working and participation'. Follow-up, qualitative in-depth interviews were then carried out, mainly with service providers, allowing the evaluators to '…delve deeper into the processes and outcomes (impact) of services and generate a rich collection of data for analysis'. The direct observation of service provision followed this and was regarded as an opportunity for evaluators '…to back up information gathered from questionnaires and interviews… and to contextualise analysis within the service setting'.

Similarly, another Wave One evaluation used both qualitative and quantitative methods to analyse processes at both strategic and operational levels. Qualitative analysis of key partnership documents, such as planning and discussion documents and records of partnership board meetings and other activities at strategic level, helped evaluators '…understand and describe the development of the Children's Fund in relation to the core aims and principles of the initiative'. A more quantitatively orientated analysis of survey data from a census of all service providers enabled local evaluators to establish a '…description of a range of processes and structures underpinning the planning and implementation of services within the Children's Fund framework'.
The sections below elaborate further on the methods adopted in evaluating impacts and processes such as those discussed above, and reflect on some of the challenges local evaluators have experienced in attributing impacts to specific Children's Fund activities.

2.2 Measuring Impacts: the Attribution Problem

The focus of measuring impacts varied between and within evaluation reports; many explored the impacts of project interventions on aspects of children's lives corresponding to the Children's Fund sub-objectives and the Every Child Matters outcomes. An extended exemplar of the work of an evaluation of a Wave One partnership, provided below, illustrates the use of multiple quantitative and qualitative data sources and methods at different levels for assessing the impacts of Children's Fund projects on school attendance rates and educational attainment. It also illustrates some of the problems of demonstrating that outcomes can be attributed to a particular intervention.

Some evaluators also explored impacts in terms of the extent to which the Children's Fund initiative was effectively targeting its services at specified user groups at a local level, or as an assessment of the value for money of particular activities and interventions. For these pieces of work, service providers were often asked to respond to structured surveys to establish whether successful targeting had taken place. In one Wave One evaluation this approach was deemed an appropriate method as opinions could be gathered directly from the source of provision: 'Respondents were asked to think about the services their project provides and the groups of individuals using those services in order to identify which level(s) of prevention project activity is oriented towards'. An example of an attempt to measure impact by focusing on value for money can be identified in a Wave Three evaluation. Assessing the value for money of Children's Fund projects involved comparing the costs of providing services with the numbers of interventions and numbers of service users, drawing on monitoring returns data.

Although the interpretation of 'impact' varied between local evaluators, many faced the significant challenge of attempting to confidently attribute identified impacts to the work of specific Children's Fund projects. As suggested earlier in this report, the problems of social exclusion are complex, multi-layered and interacting, making it difficult to demonstrate causal links between a single intervention and a measurable outcome (Percy-Smith, 2000; Pierson, 2002; Coote et al., 2004).
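The value-for-money comparison described above amounts to simple unit-cost arithmetic: dividing the cost of a service by its recorded numbers of interventions and service users. A minimal sketch is given below; the project names and figures are entirely invented for illustration, since the report reproduces no actual monitoring returns data.

```python
# Hypothetical sketch of the value-for-money comparison: unit costs
# derived from project monitoring returns. All figures are invented.
monitoring_returns = [
    {"project": "Project A", "cost": 48000.0, "interventions": 320, "users": 150},
    {"project": "Project B", "cost": 30000.0, "interventions": 100, "users": 80},
]

for record in monitoring_returns:
    # Cost per intervention and per service user, as compared across projects
    cost_per_intervention = record["cost"] / record["interventions"]
    cost_per_user = record["cost"] / record["users"]
    print(f'{record["project"]}: '
          f'£{cost_per_intervention:.2f} per intervention, '
          f'£{cost_per_user:.2f} per user')
```

Such unit costs only support comparison between projects delivering similar kinds of intervention; they say nothing by themselves about the quality or impact of the services compared.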
Writing about developments in thinking about conducting social policy evaluation, Coulton (1995) notes that the lives of children are complex and multi-layered: '…children and their families live within local communities…children interact with neighbours; participate in local institutions; receive social, health, and educational services'.

Local evaluators of Children's Fund partnerships described the difficulties they experienced in attempting to attribute impacts to the intervention of Children's Fund projects, especially in the context of the initiative's preventive agenda. This suggests that the difficulties of attribution are particularly acute in complex, multi-layered, multi-agency community initiatives such as the Children's Fund. An evaluation of a Wave One partnership that reports on detailed work to explore the impacts of selected project interventions concludes that, despite attempts to locate relationships between the initiative and indicators such as school attendance and educational attainment, it had not been possible to '… confidently attribute current trends directly to the work of the X's Children's Fund'. Caveats relating to work on impacts such as these are also evident in other local evaluation reports.
For example, a large-scale quantitative evaluation of preventative interventions commissioned by a Wave One partnership carried out extensive analysis of data sets, including quarterly project monitoring data, postcode data, Indices of Multiple Deprivation and school performance data, in order to demonstrate changes in school attendance and achievement. Although these statistics suggested that there had been notable improvements in school outcomes, the report was careful to point out that '… causation can not be attributed to the impact of the X Children's Fund'.

One related difficulty reflects the limited timescale of the Children's Fund initiative and the time-limited nature of local evaluators' work. For example, a local evaluator of a Wave One partnership described how the longer-term nature of the prevention of children and young people's social exclusion contradicts the short-term, snapshot nature of local evaluation: '… we wouldn't have any specific indicators with that particular project because it's a long term project and we won't know the results for many years to come, and Children's Fund won't be here'. Similarly, an evaluator for a Wave Three partnership pointed out the problems of attributing impact to projects in which children may not be involved over a sustained period: '… because it's a rolling programme and children dip in and out… that is difficult to evaluate, you won't get the same child sort of continually going to the same project, it's a moving sample'.

Other problems in assessing impacts reported by evaluators relate to the nature of the outcome measures and the data available to demonstrate impacts.
Some evaluators felt that it was difficult to respond to the Children's Fund sub-objectives and Every Child Matters outcomes frameworks, since these tend to define outcomes in relatively broad terms that are potentially open to a range of interpretations, and since established statistical data sets that clearly show change tend not to be available. This point was also noted in a report by the NECF (2004a) that explores the use of indicators in assessing the impact of Children's Fund activities. Illustrating this view, a Wave One partnership evaluator suggested:

… at the moment the whole thing is set up obviously kind of aiming vaguely at the kind of Every Child Matters outcomes or the original… the original CYPU outcomes or objectives… But many of those are so global and so broad that individual projects can't hope to demonstrate achievement against those and they have got to be helped to break that right down into small steps. That is still a major issue and I still find that there's a great deal of confusion even at project director level sometimes there's confusion between outcomes and outputs.

Linked to this issue, a number of evaluators also described Children's Fund quarterly monitoring data as the principal source of information for assessing project impacts, in terms of the deployment of budgets and the numbers and characteristics of the children and young people using the services.
However, difficulties related both to the ability of projects to provide this information and to the usefulness of these data to local evaluators. In particular, inconsistencies and unsubstantiated estimates provided by projects weakened the accuracy of this data set. Reflecting the views of a number of evaluators, a Wave One evaluator described how the impacts of Children's Fund interventions on educational targets, namely measures of educational attainment and attendance rates, are more easily demonstrated than other outcomes, since they are relatively precise, concrete concepts and statistical data sets exist that can be drawn on for secondary analysis:

There are also issues around availability of data and we've been quite reliant on being able to match up individual records, which services provide us with, with secondary data and we've only been able to do that for some areas of Children's Fund sort of targets… they're only able to look really at Children's Fund targets around educational attainment and inclusion [attendance].

A related issue, suggested in one local evaluation report, is that there has traditionally been reluctance among practitioners and decision makers to accept the credibility of outputs that are not what is often referred to as hard evidence, that is, quantitative data, in the context of evaluative and indeed other types of research. It is perhaps therefore inevitable that attempts by some evaluators to demonstrate the work of the Children's Fund by drawing on qualitative evidence, such as by eliciting the voices of service users, have not been viewed by some Children's Fund stakeholders as producing legitimate evidence of impacts.
One evaluator, for example, described the problems of demonstrating to members of a partnership board how a Children's Fund service had had an impact on the lives of those using it:

It might just be some kind of headline stats they [the partnership board] want really, something quite quick that they can kind of take in… I know that that drop in [Children's Fund service] is a lifeline for some of those parents that attend there. They wouldn't know what to do with their children if they didn't go to that drop in. And I can kind of write that up but it's not hard evidence. Sometimes our senior officers want just the hard evidence, just the stats really which sometimes doesn't really tell the whole story.

Example of an evaluation of impacts attributed to school-based Children's Fund projects

The local evaluation report of a Wave One Children's Fund partnership clearly demonstrates the challenges in attempting to attribute Children's Fund activity to outcome indicators, in this case those related to education. The evaluation took the five themes identified by the local Children's Fund programme and attempted to measure outcomes linked to projects related to these. One overarching difficulty identified by the evaluation related to measuring the effects of a time-limited initiative such as the Children's Fund, as the evaluation report states: '… a number of issues have been raised about the suitability of many of these indicators when measuring a relatively short-term, preventive programme such as the X's Children's Fund'.

The evaluation assessed the impacts of two Children's Fund-supported projects that involved one-to-one work with specific pupils and aimed to improve school attendance and attainment. The first, a project that focused on pupil attendance, aimed to '(build) supportive relationships with all those involved with individual pupils who are showing poor attendance'. The second, a volunteer reading project, was described as providing 'extra educational support for many children who have fallen below the expected reading age for their school year'. Although these projects were targeted within a number of schools, the former fourteen, the latter sixty, the evaluators experienced a number of difficulties in attempting to apply quantitative measures when assessing their impact, particularly in relation to the reading project. As they explain: 'measuring the impact of this kind of work…is problematic, as it tends to be informal, non-curriculum based and thus difficult to quantify'.

The evaluation encountered a number of difficulties in using quantitative data sets (school attendance and unauthorised absence, and performance at Key Stages 2 and 3) to measure outcomes of Children's Fund activity locally.
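One such analysis, comparing the local change in attendance rates against the national change over the same period, can be sketched as follows. All figures here are invented for illustration; the evaluators drew on published school attendance data for 2001 and 2003.

```python
# Hypothetical illustration of comparing local with national attendance
# trends. All figures are invented, not drawn from the evaluation reports.
national_attendance = {2001: 91.2, 2003: 91.8}  # mean attendance rate (%)
local_attendance = {2001: 89.5, 2003: 90.9}

# Percentage-point change over the same period for each series.
national_change = national_attendance[2003] - national_attendance[2001]
local_change = local_attendance[2003] - local_attendance[2001]

print(f"National change 2001-2003: {national_change:+.1f} percentage points")
print(f"Local change 2001-2003:    {local_change:+.1f} percentage points")

# A larger local improvement is consistent with a Children's Fund effect,
# but does not demonstrate one: other local factors may drive the difference.
```

As the closing comment notes, a comparison of this kind can only establish that local trends outpaced national ones, not that the projects caused the difference, which is precisely the attribution problem the evaluators report.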
Local and national school attendance figures were compared; improvements between 2001 and 2003 were higher locally than the national average. The evaluators were cautious, however, in suggesting that this could be attributed to the local Children's Fund projects, as they also found that attainment levels remained below the national average during this time. The evaluation therefore acknowledges the difficulties of attributing improvements to the work of X Children's Fund despite the high number of schools that were targeted, and instead focuses on examples of specific cohorts of pupils in specific schools potentially benefiting from X Children's Fund.

Using this comparison the evaluation finds 'clear improvements' in attendance during the time the project has been running. However, again caution is advised, as discrepancies between 'authorised' and 'unauthorised' absences make for a more complicated picture – when considered separately, either one of these had actually increased in some schools during this period. This cautionary note compounds the persistent issue of attribution, which the evaluators feel is not solved by focusing on individual schools: 'since these statistics are at school level…it is difficult to confidently attribute any trends in this data to the work of the X (project) as other factors may be affecting the figures'.

The evaluation then analysed child-level data based on twelve randomly selected children in the schools targeted by the project focusing on school attendance. At this level the evaluation report appears able to identify and evidence significant impacts of the project work with more confidence. For example, one pupil's attendance levels increased by 40% over a seven-month period during which s/he had received support from the project: '…it is clear that without the support and encouragement of the X (Project), all of these children and probably many others, would have missed a larger percentage of their early years education'. There were also attempts to measure the impacts of the reading project at individual child level by considering a national pupil survey conducted by the organisation operating the project. The survey measured impacts in terms of 'attitudes to reading' and demonstrated an improvement in this during 2002. Evaluators detailed the improved reading ages of five randomly chosen pupils at one of the targeted schools, as described by the staff delivering the project. Qualitative impacts were also described.
These were based on some one-to-one interviews with children accessing the service and on evaluation events during which children were encouraged to express their thoughts and opinions on the projects they had been accessing through artwork and writing. Analysis of such data led evaluators to note the development of 'supportive relationships built between the children and the project volunteers' and 'the effort… to incorporate the children's views on how their work should progress and particularly what books they enjoyed'.

It therefore appears that it was when the evaluator reported upon data gathered from child-level analysis that impacts could be described with greater confidence. These qualitative data typically referred to the experiences of individual children rather than being of a quantitative nature. There remained, however, potential difficulties with these data. For example, the selection of the five pupils by those providing the service may not have resulted in a typical sample.

Overall, however, the evaluators argued that their multiple-methods approach (analysis of quantitative data at national, local, school and individual child levels, together with qualitative interviews allowing service providers' and service users' perceptions of project impacts to be revealed), whilst imperfect, allowed them to suggest with some confidence that Children's Fund projects had led to positive outcomes for children using the services.

2.3 Evaluating Processes

Many local evaluations undertook assessments of key Children's Fund processes associated with the delivery of programmes, including: setting targets and commissioning services; processes supporting the infrastructure of the programme, including communications arrangements, financial management arrangements and cross-project information and referral arrangements; processes in place to report on progress, including monitoring and evaluation arrangements; work on establishing new models of working among mainstream agencies; children and young people's participation; and the effectiveness of strategic partnership arrangements.

A range of methods tended to be used by individual evaluators. A significant number of reports describe the use of data elicited from in-depth qualitative interviews or focus groups with strategic stakeholders when discussing perceptions and experiences relating to the effectiveness of strategic processes developed by local Children's Funds.
The focus for interviews was typically the experiences and learning that individual members of partnership boards could offer, with some evaluators carrying out interviews with stakeholders in the wider strategic landscape in the locality, for example key individuals in Social Services or large voluntary organisations who may be affected by Children's Fund activity.

Other methods adopted include observational work within strategic and project settings, quantitative questionnaire surveys to establish the systems and strategies in place between stakeholders, and documentary analysis of key partnership planning documents and board minutes. A Wave Three evaluator, for example, described how drawing on observational, interview and focus group work at both strategic and operational levels of Children's Fund activity enabled a more detailed, more holistic picture of partnership working, prevention and participation to emerge:

I think it gave me a holistic picture of everything for a start because of course like I say, the 'three Ps' [prevention, participation and partnership], I feel are almost concepts as well as… they're almost virtual… and people have different understandings in different areas, and may have different understandings of it. So it's about looking at their understanding, or their interpretation of prevention and participation, and partnership working. They gave me a more holistic understanding of that, and shaped that.


A Wave Two evaluation report describes how initial questionnaire survey work with strategic partners was followed by in-depth qualitative focus group research with these stakeholders; this had allowed them to examine a broad range of themes in depth and explore further issues that emerged over the course of the evaluation, and thereby gain a relatively complete understanding of key processes and their effectiveness:

… we looked at the composition of the Board; at the organisations represented. We looked at how Executive Board members were recruited on to the Board and then looked at the relationships between the organisations on the Board and how those had relationships with the Children's Fund team. We looked at how the involvement of each organisation affected the ways of working and allocation of resources within the programme; we looked at the frequency and nature of meetings and at the input of children, young people and parents and carers into that process; the effectiveness of communication and the extent to which they had increased the capacity of the local community and voluntary groups.

2.4 Engaging Children and Young People in Aspects of Local Evaluations

Many evaluations have involved children and young people in their work in a range of ways.
This has been particularly pertinent in the context of the Children's Fund programme, which has emphasised the participation of the potential beneficiaries of Children's Fund services, namely children, young people and their families, but also that of communities more broadly. Indeed, such groups have traditionally been excluded from evaluation and from social policy research more generally (for example, Coote et al., 2004). The analysis of local evaluation reports suggests that less than half of evaluations involved children and young people as questionnaire respondents or as participants in interviews and/or focus groups; of these, many indicated that they had used 'participatory' methods of engaging children and young people. One local evaluator of a Wave Three partnership, for example, emphasised the importance of capturing the 'felt' impact of the programme on the lives of individual children, young people and their families.
This evaluation team attempted to do this by employing a variety of data collection techniques designed to gather information in appropriate environments:

… where appropriate, interviews with service users made use of a portable diary room, to gauge user's opinions of the service they attend…the aim of the diary room was to provide a private, fun space in which users could give their thoughts and opinions on the service they attend…the diary room was child-friendly and safe.

A Wave One partnership evaluation report indicates that as a result of work in which children and young people's voices were elicited, they were also able to collate 'a rich source of data regarding perspectives about such issues as problems of daily life, physical environments and the experiences of accessing support services'. Another evaluator suggested that by tailoring the methods to children and young people, they would be more engaged and therefore more willing to express their views:

I mean we've definitely chosen specific methodologies for children and young people like the photos and drawing and the art and craft work because we wanted to make it fun really. We didn't really want to give them another kind of boring tick box questionnaire, so we've definitely tailored our methodologies around them.

A Wave One evaluation pursued a participatory approach in their work with children and young people. This involved various creative activities and pursuits that were considered appropriate on a number of levels in terms of encouraging individual expression, establishing appropriate targeting within services and empowering service beneficiaries:

Wherever possible a participatory approach was developed, for example using images, artwork and games, to engage young people and staff in communicating what they thought and felt about the activity they had been involved in. This sort of activity went some way in identifying outcomes relating to target children, along with children's engagement.
It was an approach that mirrored the original rhetoric of the Children's Fund Guidance in relation to empowering children, young people, families and communities to become involved in planning and delivering services, eventually taking responsibility and control of solutions to problems for themselves.

A minority of local evaluators appear to have involved children more fully through evaluation design, participation in fieldwork, analysis of data and presenting findings. For example, a Wave Three evaluation drew on the Alderson and Montgomery model of 'participation in decision making', whereby children and young people are included in the evaluation process from the outset. The process had two stages: firstly, including children and young people in decisions relating to the design of the evaluation, and then inviting them to have a significant input in how the research was conducted. Similarly, a Wave Three evaluation used a participatory action research model attempting to ascertain why children, young people and their families might not be inclined to use Children's Fund services in this partnership area. This involved drawing on these potential service users at all stages of the evaluation process, as described below:


• Recruitment of the researcher: children and young people were involved in writing the job description, advertising the post and interviewing candidates.
• Design of the evaluation: steering groups of children, young people and parents helped design and test the evaluation approaches and fieldwork instruments.
• Fieldwork: children and young people undertook fieldwork with support and guidance.
• Analyses of the findings: children helped to analyse and interpret the data collected and to consider how to present the findings.
• Presentation of the findings: children and young people were involved in disseminating findings.

A similar approach was adopted by a Wave One evaluation. This evaluation worked with the partnership to establish the '… questions they (children and young people) would want answered from an evaluation'. Following this, the intention was to work with a small team of older young people to 'select several questions and visit identified projects to talk to staff and young people involved…about the work they are doing'.
What were called 'Young Reporters' collected evaluation material from projects in an attempt to draw children and young people into the evaluation process. According to the evaluators, a key advantage was that children and young people brought informality to the research process and an ability to elicit appropriate data, the generation of which may not otherwise have been recorded, as is described in the evaluation report:

The Young Reporters are able, in a very direct way, to feedback to project staff what the children… are saying and also to ask challenging questions… Feedback from projects suggests that this is quite a disarming process that induces replies that might not have been so readily given to an adult evaluator.

For this and a number of other evaluations, recognising the skills of children and young people and applying these to research processes was seen as enriching evaluation data. By being interviewed by their peers, younger service users were able to 'give candid views about their experiences'. Furthermore, the success of the Young Reporters could be used as an exemplar of good practice within the Children's Fund partnership and be drawn upon as a useful evaluation tool:

It has been acknowledged within the programme that the Young Reporters initiative represents a model of good practice…projects are learning from the example. Further efforts, such as a proposed good practice poster, should be made to ensure that this model is effectively disseminated.

Despite the positive experiences of involving children and young people in evaluation processes, there were some limitations to this work. These relate to the lack of resources available to evaluators, difficulties establishing representative samples of those accessing services, and the low priority given by partnerships to this type of evaluation activity. Lack of adequate resources for substantial pieces of work with children and young people had potential consequences for important relationships of trust. A Wave One evaluation, for example, described how limitations in evaluation funding had undermined some aspects of their work:

Due to the evaluation parameters and resources, it was only possible to make one contact with the (young) advisers.
While <strong>the</strong> input received wasdisseminated widely in <strong>the</strong> planning process, it was not possible to conductfur<strong>the</strong>r consultation and feedback to <strong>the</strong> advisers.Fur<strong>the</strong>rmore, a Wave One evaluation faced difficulties when attempting to involvechildren with special needs in <strong>the</strong>ir work; this required evaluators to balance <strong>the</strong>methodological requirement for representation with <strong>the</strong> perceived complexity <strong>of</strong>working with children and young people characterised as having special needs: Wetried to make it as random as possible but I think inevitably given <strong>the</strong> nature <strong>of</strong> <strong>the</strong>work and in some cases <strong>the</strong> quite extreme difficulties that those children had, <strong>the</strong>rewere definite sensitivities involved which we had to be constantly mindful <strong>of</strong>.2.5 Chapter Summary1. <strong>Local</strong> evaluators drew on multiple quantitative and qualitative data sourcesand analysis methods that reflected <strong>the</strong> different elements <strong>of</strong> partnerships’programmes <strong>the</strong>y aimed to examine. Quantitative methods, sometimescomplemented by qualitative methods, were frequently adopted for assessing<strong>the</strong> impacts <strong>of</strong> interventions. Qualitative methods, sometimes supplementedby quantitative methods, were primarily employed for generating learningabout processes such as <strong>the</strong> effectiveness <strong>of</strong> partnership working,commissioning and approaches to mainstreaming.2. Quantitative methods adopted by evaluators include structured questionnairesand interviews with stakeholders and analysis <strong>of</strong> secondary statistical datasources, including <strong>the</strong> analysis <strong>of</strong> project monitoring data. Qualitative methods32Chapter 2


drawn on by evaluators include semi-structured or unstructured interviews with stakeholders, focus groups, observations of project activities and analysis of partnership documents.

3. Evaluators have aimed to measure the impacts of Children’s Fund projects and other interventions on aspects of children’s lives corresponding to the Children’s Fund sub-objectives and the Every Child Matters outcomes. Some evaluators also explored impacts in terms of how effectively projects targeted their services at user groups, or as an assessment of the value for money of particular projects and other activities.

4. There are a number of challenges to measuring impacts attributable to specific Children’s Fund programmes and projects. The problems of social exclusion tackled by the Children’s Fund are complex, multi-layered and interacting, making it difficult to demonstrate causal links between a single intervention and a measurable outcome. A related difficulty is the limited extent to which time-limited local evaluation is able to identify longer-term outcomes for children using Children’s Fund services.

5. Some evaluators had difficulties responding to the Children’s Fund sub-objectives and Every Child Matters outcomes frameworks, which define outcomes in relatively broad terms that are potentially open to a range of interpretations. Established statistical data sets that clearly show change tend not to be available, although those relating to educational attainment and attendance are widely available. Some evaluators also noted the problems of drawing on inaccurate or inconsistent monitoring data provided by projects.

6. Such challenges have led some evaluators to embrace the use of multiple methods and multiple data sources creatively, in ways that enable them to suggest with some confidence that Children’s Fund projects have led to positive outcomes for children using the services.

7. Some evaluators involved children and young people as questionnaire respondents or participants in interviews and/or focus groups; many used ‘participatory’ methods of engaging children and young people. A minority of local evaluators involved children more fully through evaluation design, participation in fieldwork, analysis of data and presenting findings. This was
seen as making the questions evaluators asked more relevant to children and young people and helping to generate richer evaluation data.

8. In some cases the lack of resources available to evaluators, difficulties establishing representative samples of those accessing services and the low priority given by the partnership to this type of activity limited the ways evaluators were able to engage children and young people.

Chapter 3: How Local Evaluation Supports Children’s Fund Partnerships’ Decision-Making

This chapter reviews the ways local evaluations have engaged with Children’s Fund partnerships in seeking to support their decision making. The chapter highlights the range of ways evaluators have disseminated findings to partnerships and the stakeholder groups they have addressed. Some of the key ways evaluation findings are used by partnerships are then explored, and the factors facilitating and inhibiting the abilities of evaluators to influence partnerships’ decision making are identified.

3.1 Multiple Approaches to Disseminating Evaluation Findings

Individual evaluators have adopted a range of summative, formative and, for some, dialogic approaches to engaging with partnerships, reflecting their multiple roles at different stages in partnerships’ development. Hence, evaluators have disseminated material to partnerships through written reports (including interim reports, full reports, reports that focus on particular themes and those that examine project impacts and good practices). They have also provided material through presentations and workshops for partnership stakeholders, by distributing material on websites, and through periodic attendance at Children’s Fund partnership board meetings, subgroups and other organisational structures.

A number of evaluations emphasised the importance of creating a dialogue with strategic partners about how they might take things forward, rather than simply providing summative evaluation outputs, as an evaluator of a Wave Two partnership suggested:

    I think there’s definitely a dialogue. I mean I think we… and it’s a supportive one. We don’t just kind of give information and then leave it; it’s about working with the partnership trying to say, we will feed it back but then do a little mini workshop to get them to do some thinking around it, and about what that means for them and their partners and things. So it’s much more of a supportive kind of role really.

Many evaluations sought to combine different approaches to engaging with partnerships. For example, a Wave One evaluator described how their adoption of an action research model of working led to the dissemination of formative and summative findings through ongoing discussions with the partnership. It was hoped that such conversations would encourage the partnership to reflect on practices and hence be developmental:

    We do say ‘this is what you’ve done’ but, obviously, part of what we’re doing is making recommendations around how they might develop in the future and
    then trying to encourage discussion around that. In that sense, it is kind of an action research type model, where we are making recommendations and trying to facilitate discussion around those recommendations, as well as saying ‘this is what you’ve done’, sort of reflecting back practice to them.

Another Wave One evaluation stressed the importance of breaking down boundaries between evaluation and practice; effective evaluation should engage in a partnership’s organisational and practice development:

    … our approach is action research and what that basically means is that instead of in a sense doing a systematic evaluation analysis of each individual project or anything like that… we pretty much collapse the boundaries between what we call evaluation, organisational development, organisational learning, you know, practice development and policy development. In a sense they all become part of a learning system.

Multiple audiences for evaluation findings

Many evaluators also pointed to the importance of adopting appropriate means of feeding back messages from the evaluation to a range of Children’s Fund stakeholders, including strategic partners, service providers, and the children, young people and families who used Children’s Fund services. Local evaluation reports tended not to comment on their intended audience, although their presentation suggests they are intended for members of Children’s Fund programme partnership boards. However, a relatively small number of local evaluators have also produced ‘child friendly’ versions of reports, and some have provided versions of their reports for managers and practitioners of specific Children’s Fund projects.

One evaluator suggested that events had been an effective way of highlighting some of the issues raised by evaluation to a wider constituency of stakeholders. The partnership had run a children’s event to highlight key evaluation issues relating to projects, which project workers, children, young people and their families attended, together with members of the partnership board, representatives from the children’s trust and other strategic managers from statutory agencies. Events drawing together those working at strategic and operational levels were also highlighted as potentially effective for disseminating messages in a Wave One partnership. Although initially sceptical, anticipating limited interest among invited professionals, the evaluator commented on the value of encouraging first-hand experiences of children and young people’s skills and abilities in an informal setting. It was suggested that such events can begin to develop strategic thinking around
participation and work towards the re-engagement of the strategic and operational aspects of Children’s Fund activity:

    … we planned a huge fun day in effect, an hour of which was taken up with us doing all sorts of work; asking questions and so on. Now my interpretation of that is that professionals would see the hour as being the bit that they should be at, which they may or may not get to, and the rest of it as being not work. But actually those people that were there for the day came away thinking that they’d got something quite special out of the process because they’d seen how the children had interacted across the whole day. They’d learnt something about children’s participation across the whole day and actually for some I think it was an interesting process of re-engaging with children. I mean clearly some had their own children and so on, and others have gone, in their own personal lives their own children are grown up and so on. But my sense is that there is a separation sometimes between children’s policy-makers and children.

It would appear, however, that few evaluators prioritised service users, including children and young people, as an important audience for evaluation findings. For example, one evaluator described how it was initially difficult to find the most appropriate ways of engaging parents in evaluation feedback:

    [I asked] I just want to know what you think about it. Is that something that you can relate to or…? Any of the recommendations, are they completely out of the water that you just think, what on earth are you talking about; don’t know what that’s about? And I think to be honest with you, people were quite afraid of the evaluation to begin with, like they couldn’t challenge it, or they couldn’t question it. Or they couldn’t say, well what do you mean by that, what on earth does that mean?

Building partnerships’ capacities to undertake evaluation

In addition to effectively communicating findings to partnership stakeholders, many local evaluations described a key focus of their work as developing approaches to raising the capacity of Children’s Fund projects to self-evaluate their work. Some evaluators regarded this as central to the evaluation process, whether through working alongside project staff to develop appropriate self-evaluation ‘tools’ during the life of the evaluation, or through more formal training within workshops involving members of the partnership. For example, one evaluation supported Children’s Fund projects to identify, summarise and record their successes and failures and future strategies. Another evaluator was specifically involved in building the capacity of children and young people to evaluate Children’s Fund services.

For some evaluations, developing self-evaluation tools contributed to a number of other dissemination and supportive activities. A Wave One evaluator, for example,
developed self-evaluation tools through a series of workshops during the lifetime of the evaluation, which aimed to offer ongoing support to projects developing their abilities to evaluate themselves. Self-evaluation agreements were drawn up for projects, which were required to report on aims, indicators and their methods, with a preference for drawing on data already available and complementing this with data elicited from surveys, interviews and observations. The development of evaluation tools was also a feature of an evaluation of a Wave Two partnership; a key aim was described as being to: ‘examine the processes by which the Management Team collect data and monitor progress… how can evaluation be embedded in the process?’.

A Wave Two evaluation was able to support those at the operational level through focus group work following the analysis of a survey on self-evaluation tools distributed to all service providers. Other evaluators were commissioned specifically to carry out this role, as a Wave One evaluator described: ‘…they’ve commissioned us, purely on a consultancy basis, to write an evaluation tool kit for their project; a self-evaluation tool kit’.

The emphasis on developing self-evaluation practices has varied according to the needs of the partnerships and available resources. A Wave One evaluator described how an attempt to carry out this work had been squeezed out as time and financial constraints took over. This has been a cause of some frustration amongst evaluators who may have regarded this aspect of the research as crucial, for example those describing themselves as action researchers. Conversely, a Wave One partnership described how concerns over future resources had positively driven the development of self-evaluation tools, as those at strategic level sought to encourage service providers to consider their place beyond the life of the Children’s Fund:

    It’s partly about an increasing focus on the mainstreaming agenda and the need to get projects to the point where they are able to pursue other forms of funding, etc. beyond the Children’s Fund and the feeling that sustainability was partly about being able to demonstrate that they were achieving key outcomes.

3.2 Partnerships’ Use of Evaluation Evidence in Decision Making

Young et al. (2002) propose a framework to describe the range of ways organisations draw on evaluation findings to support their decision making processes in practice (as distinct from the intended roles or orientations of evaluations
highlighted in Table 1). These include: the knowledge-driven model (an organisation’s work is driven by evaluation); the problem-solving model (evaluation is used to solve problems, address specific questions and provide a knowledge base for decision making); the enlightenment model (evaluation provides a greater understanding of concepts and issues); the interactive model (practice and evaluation interactively influence one another); and the political/tactical model (evaluation is drawn on selectively by an organisation to legitimise decisions). It is also useful here to draw on the distinction made by Clarke (2001): evaluation findings tend to be used by organisations in two ways, instrumentally and conceptually. The former denotes the ways organisations may act directly on evaluation findings, for example by changing practices and deciding to fund particular activities; the latter denotes the ways evaluation material may influence organisations’ understanding of concepts more broadly, thereby informing strategies and strategic thinking.

In practice there were mixed perceptions among Children’s Fund local evaluators as to the extent to which they believed they had influenced partnerships’ decision making, although for many the relationships between partnerships and evaluators appear to correspond with the problem-solving model (Young et al., 2002), with evaluation findings used instrumentally (Clarke, 2001). Evidence derived from local evaluation reports and interviews with local evaluators suggests that the majority of evaluations pragmatically gather and analyse data concerned with the impact of Children’s Fund programmes, or with processes such as children’s participation and partnership working, with a view to offering material that partnerships can use instrumentally, such as by changing practices and deciding to re-commission particular projects. Indeed, many programme managers participating in the NECF programme managers’ questionnaire survey (based on 120 responses) suggested that local evaluation had helped to identify successful/less successful projects (51%) and how they worked well/less well (56%), as well as to make decisions about continuing to fund projects (43%) or which projects to promote for mainstreaming (39%). Many programme managers also indicated that local evaluation had helped the partnership reflect on strategic practices and how to improve them (43%).

There was also some indication that evaluation had offered partnerships new concepts or broader frames of understanding (corresponding with the enlightenment
model and the notion of using evaluation conceptually). 47% of programme managers indicated that local evaluation had helped to develop the partnership’s thinking about prevention and partnership working, while others saw it as a means of helping to promote changes in practices and cultures among mainstream agencies (39%).

Importantly, few programme managers suggested that local evaluation had been of limited value to the partnership (27%). Indeed, a number of evaluation reports identified where their recommendations had been developed into tangible changes in the partnership, although others appeared uncertain about how evaluation material had influenced, or would influence, the partnership. An example from a Wave Two evaluation demonstrates how data gathered on Children’s Fund projects had a direct impact on how the partnership directed project work, for example by recommending the systematic collection of baseline data and the need to encourage the involvement of children with special needs at project level.

The evaluation report suggests that perhaps the most significant impact of the evaluation data was its influence on decisions around future commissioning of Children’s Fund projects, as the following extract describes: ‘Findings from the evaluation informed the deliberations of the X Children’s Fund Appraisal Panel that met in December to consider the X Children’s Fund Continuation Funding Proposals and Appraisal Records, 2004-2006’. Another evaluator, for example, pointed to the importance of disseminating the issues children raised as part of the evaluation; this had been particularly influential in terms of informing the partnership:

    There was work with children who had ADHD, a project where they were particularly good on participation and they really did, they really kind of brought work through that really did involve children and you could see the kind of pride that those children had in that work and that did, I believe anyway, have an impact on the partnership and also on the work of health professionals in that area.

Another evaluator suggested that evaluation findings had informed the partnership of successes among projects; the partnership had drawn on these findings in making decisions about which project activities to promote for mainstreaming:

    … there were pieces of work that I think were mainstreamed in the end because of what we said; because the reports that we produced did evidence success in a way that the partnership could understand and in a way which kind
    of had an impact. So we used a case study approach on work that we thought was particularly good or of particular attention.

3.3 Factors Facilitating and Inhibiting the Use of Evaluation Findings

A number of local evaluators pointed to factors that had helped them influence partnerships’ work effectively. For example, an evaluation report of a Wave One Children’s Fund partnership identifies a number of measures seen as enabling the partnership to respond to evaluation findings:

• the information provided is relevant and useful;
• evaluation outputs are delivered at appropriate and relevant times in a partnership’s development;
• results are made widely available to a range of stakeholders and presented in appropriate, accessible formats;
• recommendations are realistic in terms of the partnership’s ability to respond.

Another evaluator stressed the importance of engendering a relationship in which the partnership board and programme team recognised the value of evaluation and were open to both positive and more critical feedback:

    I would say that the management team were very open to our approach to evaluation, which is to identify the positives and to be honest about what could be done better… one of the things that I think made the delivery of the evaluation effective for us was that we shared the same values as the programme.

Problems using evaluation in a context of change

Many local evaluation reports and evaluators interviewed, however, emphasised a number of factors that had inhibited their ability to influence partnerships’ decision making effectively. These stem from the climate of ongoing change nationally and within local authority areas, meaning that, in practice, local evaluation of the Children’s Fund has tended to be politically/tactically driven (drawing on Young’s 2002 framework).

The Children’s Fund initiative presents a number of challenges in seeking to apply local evaluation findings to decision making and practice. The rapidly changing national context means that changes in partnerships’ directions are often politically
motivated rather than based on local evaluation. The introduction of the Central Government stipulation in 2002 that 25% of programmes’ budgets be allocated to crime prevention activities (CYPU and the Youth Justice Board, 2002), for example, created significant tensions between Central Government priorities and the abilities of partnerships to determine their own programmes (Morris & Spicer, 2003; Mason et al., 2005).

Partnerships also experience considerable change locally as mainstream health and social care agencies embrace a number of alternative and sometimes diverging agendas. Reflecting this context, as the ongoing work of the National Evaluation suggests, Children’s Fund partnerships have sometimes struggled to establish their agendas (for example, NECF, 2004b & c). Indeed, a small number of local Children’s Fund evaluators indicated that evaluation was to an extent being driven by shifting mainstreaming agendas, in which partnerships are under pressure to use evaluation both to demonstrate to mainstream agencies what Children’s Fund projects are achieving and to demonstrate the effectiveness of new practices promoted by the initiative. The implications of this were described by a Wave One local evaluator:

    … they had their own agenda and that was to do with the migration into children’s trust and the emergence of integrated services; I just saw that agenda kind of looming large… I think by the end of my time of evaluation with the Children’s Fund I had a sense that the Children’s Fund were really coming up against that limit… things becoming much more top down and much less bottom up. It was almost as though the Children’s Fund had done its job; it had created some good and interesting work… and some projects were being kind of cherry picked for mainstreaming; the others were simply being let go to the wall.

Some evaluators indicated that they experienced difficulties controlling how evaluation outputs would be disseminated, which groups would receive the messages, and how outputs would be used. The evaluator of a Wave One partnership highlighted the ways the messages contained within evaluation outputs had to be carefully poised when reporting on negative aspects of partnerships’ work; indeed, senior figures within the partnership had sought to exercise control over the distribution of evaluation material:

    … it’s a little bit sad that it’s so politically driven now, and it is, and reports have been written very, very sensitively. I don’t know whether you noticed that, but they have been written very sensitively. And we have been a bit pressured about where the reports go and… well I haven’t but powers above have, where they go, and if they’re in the public domain.

Similarly, an evaluator of a Wave Two partnership suggested:

… because it’s almost like they know what they want to find, and they’re just using the evaluation to get there. But they know exactly what they want to find; you know almost the conclusions. I could work the conclusion now, and I’ve only just started the evaluation, I’m only on week one of my phase three evaluation.

A related problem inhibiting partnerships’ abilities to absorb and act on evaluation material, noted by a local evaluator, appears to stem from strategic stakeholders’ multiple priorities and their commitments to their own agencies and to other partnerships in the locality, meaning they tend to have insufficient time to act on evaluation findings in a climate of rapid change:

… what they’ve said is, ‘we’d like to get more involved, but… we’ve got our day jobs. This is just one partnership board that we’re on, we’re on three, we’re on ten, we’re on eight. So we can’t input into it as much as we can’… So in terms of their learning, I think it’s quite minimal. But then having said that, because of all the other things that are going on in the borough, that doesn’t mean that they don’t know about them, or indeed that they don’t want to know about them.

Evaluators’ accounts suggest, however, that partnerships may be able to respond more fully to evaluation findings at different stages in their development.
According to an evaluator of a Wave Two Children’s Fund partnership, the partnership was open to receiving lessons from the evaluation and acting on them in its early stages:

I think what also helped the evaluation was that the programme was at a phase where it was open to suggestion. So it was kind of malleable, so if I said, we really need a more consistent referral process, or we really need the monitoring stats to be analysed, it was being done.

The same local evaluator noted how the partnership became less open to criticism during the course of the evaluation, as the local authority began to move towards the integration of children’s services and children’s trusts. Whereas at the inception of the evaluation the partnership was very open to change, as the evaluation has progressed the resulting reports have had to be ‘written very, very sensitively… we have been a bit pressured about where the reports go and… if they’re in the public domain’.

Perceptions of the value and credibility of local evaluation

A further key issue influencing evaluators’ abilities to influence the decision making of Children’s Fund partnerships is that, in some cases, local evaluators experienced a degree of dissonance between their views on what represented legitimate evaluation methods and those of some partnership board members. As noted earlier, the legitimacy of qualitative methods and of multiple (quantitative and qualitative) evaluation methods is increasingly recognised among UK policy makers as an appropriate approach to evaluating complex interventions (Coote et al., 2004; Spencer et al., 2004). Indeed, as outlined in Chapter 2 of this report, local evaluators have tended to embrace multiple methods fully, recognising that different methods are more appropriate for examining different elements of a programme’s work. An evaluator from a Wave One partnership, for example, described how the partnership board tended to favour ‘hard’ quantitative evidence rather than qualitative evaluation data, and found it difficult to accept the legitimacy and relevance of the latter:

It might just be some kind of headline statistics they want really… I know that the drop-in [project] is a life-line to some of those parents that attend there, they wouldn’t know what to do with their children if they didn’t go to that drop-in, and I can write that up but it’s not hard evidence, just the stats really which sometimes doesn’t really tell the whole story… And even though we might have say all the stats they’re wanting, what the actual people on the ground might say might be something completely different.

Similarly, a Wave Two evaluator described how partnership board members’ limited understandings of notions such as indicators and impacts had been a frustrating barrier to progress, in terms of their unrealistic expectations of the evaluation:

The partnership board didn’t really understand... why can’t you show how good a service is?... And I said, well actually, if you haven’t got those indicators in place, I can’t magic them out of the air. And you need to have them on an ongoing basis, because I can’t do them in retrospect, because you should have been recording what risk factor the children came in with, effectively what’s been done with them and what they came out with. I can do what they came out with, but I can’t compare that to anything at that point of entry.

Many of the reports also remained tentative in offering definitive findings on progress, instead giving observations that can continue to inform an evolving, formative programme. This is because local evaluators have had to take account of time lapses in the expected progress of programmes, resulting from problems in getting programmes started in some parts of the country and from funding uncertainties that have slowed progress. Additionally, local evaluators have produced final reports when some programmes are about half way through their lives, resulting in evaluations that are short or medium term in duration. As a consequence, local evaluators have been reluctant to identify lasting outcomes. Hence, there can be a tension between providing evaluation outputs that are appropriate for readers, in terms of being short and using accessible terms, and fully evidencing statements made and fully exploring complex issues.


Evaluators sometimes struggled to reconcile the multiple requirements of the evaluation with the timing of their engagement and the expectations of when they would feed back. It was recognised, for example, that early engagement and feedback might assist programmes to plan their development on an informed basis. However, this would limit the evaluation’s ability to present what were seen as definitive conclusions. Conversely, later evaluation engagement and feedback were seen as providing a greater degree of confidence in findings, but as a retrospective view that may be less helpful in informing a partnership about future directions.

Other evaluators pointed to the importance of maintaining ongoing dialogue with programme managers and members of the programme team in order to ensure that evaluation outputs are useful and accessibly presented, and that they are realistic about what the evaluation is able to produce.
For example, one evaluator pointed to the problems of not effectively managing the expectations of the partnership:

… the previous director didn’t do enough in the way of dialogue with us… what happened was that we were producing types of report and working on focuses which he didn’t particularly want or didn’t find particularly useful… what happened in the end though was that the… clearly regretted the fact that he’d come in half way through and hadn’t had a hands on approach because he felt very strongly that for local evaluation to work it had to be a sort of collaborative thing and in fact at times I feel he wants to be slightly too hands on for it to be called evaluation to be honest.

Another evaluator described a number of problems that limited the impact that the evaluation had on the work of the partnership.
Such problems included limited clarity among stakeholders about the purposes of evaluation, that the local evaluation had produced outputs that were too ‘academic’ in style, and that it had not been able to produce the ‘hard’ data demonstrating outcomes which the programme required:

… we began to get this sort of critical feedback about how you know the whole thing was too academic and too weighty and it wasn’t telling them what they needed to know and so on and when we tried to get to the bottom of this we realised that they were expecting to walk away with sets of targets met and you know hard data about outcomes and things; which we were never being asked to find actually.

3.4 Chapter Summary

1. Evaluators have each adopted a range of summative, formative and, for some, dialogic approaches to engaging with partnerships, reflecting their multiple roles at different stages in partnerships’ development. Evaluation material is disseminated through reports, presentations and workshops, and websites, and some evaluators attend partnership board meetings, subgroups or other organisational structures. A number of evaluators emphasised the importance of ongoing dialogue with strategic partners about their development rather than simply providing summative evaluation outputs.

2. The importance of adopting appropriate means of feeding back evaluation messages to a range of stakeholders, including strategic partners, service providers and the children, young people and families who used Children’s Fund services, was recognised by some evaluators. Some produced ‘child friendly’ versions of reports or disseminated findings at events that involved children and young people. However, it would appear that few evaluators prioritised service users, including children and young people, as an important audience for evaluation findings.

3. Many local evaluations developed approaches to raising the capacity of Children’s Fund projects to self-evaluate their work, through means such as working alongside project staff in developing appropriate self-evaluation ‘tools’, or through providing training.

4. A number of local evaluation reports identified where their recommendations have been developed into tangible changes in the partnership. Others were uncertain about how evaluation material had influenced, or would influence, the partnership.
Evaluations appear to be offering material to partnerships to be used instrumentally, that is, through changing their practices and/or deciding which projects to re-commission. Local evaluation also appears to be used conceptually, in terms of expanding partnerships’ understandings relating to prevention, partnership working and participation (that is, an enlightenment model of the relationship between evaluation and decision making).

5. Evaluators suggested that they could have more influence on partnerships by providing findings that are relevant, useful, timely and accessible to a range of stakeholders, and by providing realistic recommendations. The importance of engendering a relationship in which the partnership recognises the value of evaluation and is open to both positive and more critical feedback was also noted.


6. A number of factors inhibit evaluators’ influence on partnerships’ decision making. Pressures from ongoing change nationally and within local authority areas, as mainstream agencies embrace a number of alternative and sometimes diverging agendas, mean that local Children’s Fund evaluation may be drawn on selectively to legitimise decisions or to demonstrate the successes of partnerships’ work (that is, a politically/tactically driven model of the relationship between evaluation and decision making).

7. Some evaluators indicated that they experienced difficulties controlling the dissemination of evaluation outputs and how outputs would be used, and indeed had experienced difficulties reporting on negative aspects of partnerships’ work. In some cases Children’s Fund board members’ multiple priorities and their commitments to their own agencies and other partnerships meant they had insufficient time to act on evaluation findings.

8. A number of evaluators experienced disagreement with stakeholders about the legitimacy of different evaluation methods. Some stakeholders favoured ‘hard’ quantitative evidence and rejected the value of qualitative methods, whilst others appeared to have limited understandings of notions such as indicators and impacts, or had unrealistic expectations of what evaluations could produce. Maintaining ongoing dialogue in order to manage partnerships’ expectations is therefore important.


Chapter 4: Prevention, Participation and Partnership Working: Key Messages from Children’s Fund Local Evaluation Reports

This chapter reflects on a number of key messages presented within local evaluation reports and how these relate to the themes driving the Children’s Fund programme, namely prevention and preventative services, children and young people’s participation, and partnership working. Some of the ways local evaluators have sought to conceptualise prevention, participation and partnership working are also highlighted.

4.1 Key Messages from Local Evaluation: Prevention and Preventative Services

The focus of local evaluators’ work on prevention has been to analyse the activities of Children’s Fund projects. Relatively few reports concentrate on the place of prevention and early intervention, other than in very broad policy terms, such as how the local prevention strategy relates to Children’s Fund planning. Hence, most evaluations have sought to conceptualise prevention in relation to the Children’s Fund sub-objectives, and/or to the outcomes framework in Every Child Matters, in their assessments of projects’ progress. A number of evaluators also drew on the four ‘tiers’ of prevention (diversionary; early prevention; heavy end prevention; and restorative prevention) adopted in the Children’s Fund Guidance, based on the work of Hardiker et al.
(1991), as a frame of reference for locating the activities of Children’s Fund services.

As noted above, measuring and demonstrating effective prevention was most widely attempted through quantitative analysis of project monitoring data and secondary statistical data sets, and through multiple methods in which service providers’ and service users’ perspectives were elicited, in order to show that preventative services had had a positive impact on service users’ circumstances and to identify the effective practices they employed. There were many positive messages in local evaluation reports in relation to the beneficial impacts of Children’s Fund preventive projects. Illustrating the sorts of conclusions evaluations are reaching on preventative activities, a report on a Wave One partnership summarises that:

The children and young people who access services funded by the X Children’s Fund are benefiting from the support that is being provided. The Fund is reaching and supporting children and young people who are suffering social exclusion, some of them at high levels. Families are also benefiting from the work of the X Children’s Fund in ways that are helping to reduce the effects of poverty… The X Children’s Fund has actively supported innovative work which points to new models of working with children and young people.


A number of evaluations described how partnerships had effectively introduced projects focusing on early intervention and on drawing in traditionally ‘hard to reach’ groups. For example, a Wave One evaluation report suggests: ‘…a large number [of projects] have focused on reaching commonly excluded groups and in supporting them to access mainstream services’. Similarly, a report of another Wave One partnership states that: ‘…some projects were using the Children’s Fund to specifically target children often excluded from mainstream provision. These included projects whose primary focus was raising awareness in order to foster understanding and integration’.

Some evaluators also described how drawing upon the skills and expertise of the voluntary and community sector has been a particularly important factor in effectively engaging and addressing the needs of traditionally excluded groups. For example, a Wave One evaluation report describes how voluntary and community organisations have been able to offer alternative approaches to those of the mainstream:

… families and children are prepared to work with Children’s Fund services that are non-stigmatising, voluntary and non-compulsory, flexible and accessible.
As a result services have developed a variety of methodologies and approaches that meet the needs of children and young people and in some instances their families.

Similarly, a Wave One evaluator suggested that voluntary and community-led Children’s Fund projects were able to provide more flexible services than the statutory sector: ‘…there is some evidence that voluntary or community sector led projects tend to be more flexible in the timing or “coverage” of the services they offer when compared to statutory led projects (e.g. 50% vs. 18% were accessible at weekends)’.

Other evaluations describe how developing collaborative arrangements with other local initiatives and/or agencies has been important to the success of Children’s Fund preventive services. One evaluation report, for example, describes how collaboration between the Children’s Fund and other local preventive strategies, such as Sure Start and the Teenage Pregnancy Strategy, gave impetus to the work of Children’s Fund funded outreach workers, who have successfully offered support to families who are not normally in contact with any formal services. A further example of this is given in a Wave One evaluation report that describes a family support service which is successfully ‘fostering closer working between agencies in order to provide effective preventive services for children and families’. The partnership supplies workers who provide support in homes and schools in response to children and parents’ needs. The project is described as successfully focussing on low level preventive work by supporting families ‘… who are struggling and have a need but who are not yet at crisis intervention point and whom may be distrustful of social services or other professionals’.

A key message noted in local evaluation reports is that there tend to be divergent understandings of ‘prevention’ among stakeholders within partnerships, although progress is being made by some partnership boards in negotiating a consensus of understanding. This corresponds with the findings of the work of the NECF (NECF, 2004d). One local evaluation report, for example, notes that within the partnership: ‘There is no clear single agreed unitary definition of the meaning of prevention’. Similarly, another local evaluation suggested:

If there is no shared understanding [from interviews with key stakeholders] as to what prevention means there is a risk of confusion in implementation. Indeed as these extracts demonstrate prevention means so many different things, it perhaps risks meaning nothing.

Some reports highlighted that some service providers tend not to explicate what they hope to achieve and how their activities are connected with prevention.
Indeed, a number of evaluators noted very divergent views among service providers about their aims; for example, a report on a Wave One partnership states:

Some project managers said that they provided a service to ‘plug the gap’ and did not think it was necessary to change any practices whilst others had more of a reform agenda with clear explanations of how their services provided new opportunities to make a difference to the everyday lives of disadvantaged children and young people.

It was also noted in a number of reports, however, that despite limited clarity around prevention, in terms of what it means and how it can be applied, a number of partnerships were linked to networks and partnerships emerging locally that were taking thinking on prevention further. A local evaluation report from a Wave One partnership, for example, describes how a youth crime prevention project has created a multi-disciplinary panel to decide on preventive interventions for young people at risk of offending. As the panel consisted of key strategic players in the local authority, it is not unlikely that such discussions around prevention have been heard at partnership board level.


4.2 Key Messages from Local Evaluation: Children and Young People’s Participation

Local evaluators aimed to establish Children’s Fund stakeholders’ understandings of the aims of children and young people’s participation. Within some programmes partnership boards appear to have agreed understandings of participation; these tend to equate to a ‘consult and inform’ model of participation, whereby children’s views and experiences are elicited in order to inform the design and development of child-focused and responsive services. In most programmes, however, local evaluators reported diverse views of how participation should be defined and applied, ranging from promoting children’s involvement in all aspects of service design and development, through to some circumspection among stakeholders about the difficulties of overcoming current ways of working whilst avoiding tokenistic gestures of participation. Nevertheless, evaluators reported that professionals’ and adults’ definitions of participation have tended to predominate, rather than those of children and young people themselves. Similarly, among projects evaluated, participation was primarily equated with consultation activity.
Recommendations made in local evaluation reports emphasise the need for partnerships to develop shared definitions of participation and a coherent strategy for implementing participation. Such findings correspond with those of the NECF (see NECF, 2004c).

Within a number of evaluation reports participation is conceptualised in relation to hierarchical typologies, in which different activities are depicted in terms of the extent to which children initiate or control the decision making process, as a framework for locating Children’s Fund activities. Many local evaluators described high degrees of commitment to participation among stakeholders. In practice, however, progress in introducing participative activities tended to be reported as relatively limited, and activities often corresponded with lower degrees of initiation or control according to the hierarchical scales.

Many local evaluators pointed out, particularly in interim reports in which they tended to comment on the early stages of Children’s Fund partnerships’ development, that partnerships were under pressure to prioritise setting up their programmes over and above establishing the principles of how to generate meaningful engagement with children, parents and indeed the wider community. Some evaluators accepted, however, that they had offered preliminary findings based on evidence of emerging practices, rather than being able to comment on the longer term effects of partnerships’ work on participation.


Nevertheless, participation in practice typically excluded children, young people and their parents/carers from direct engagement in decision making about the development of the partnership, whether strategically or in particular projects. However, some evaluators pointed to examples of such forms of participation, albeit inconsistently applied across partnerships’ activities. Where participation was reported, it was usually where children and young people had been respondents to consultation exercises, rather than actively involved in decision making relating to service development. An evaluation report for a Wave One partnership, for example, suggested that despite broad levels of commitment, the practice of introducing participation that was meaningful to children and young people was more limited:

There was evidence of a commitment to participation through key objective setting, resources being dedicated and the monitoring of the levels and types of participation. From children’s interviews there was limited evidence of the numbers of children involved in service planning and development, with children saying they would welcome the opportunity of further involvement.

A number of local evaluators pointed to Children’s Fund partnerships’ and projects’ difficulties in ensuring that the children and young people participating were representative of the range of characteristics of children locally (a finding that is also noted in NECF, 2004c).
Children and young people who were assumed to be ‘hard to reach’, such as those from refugee families and young carers, were seen as particularly under-represented in participative activities. There are, however, exceptions. For example, a Wave One local evaluation report describes how a project supporting black and minority ethnic and refugee communities was encouraging active participation in the development of the service by responding to their ‘self-identified needs’. This project was described as succeeding in its purpose of being ‘empowering’, by developing ‘a variety of methodologies and approaches that meet the needs of children and young people’. Evaluators also commented on the age profile of children who had participated in projects as being older, and noted that steps needed to be taken to ensure younger children’s views were also heard. Some evaluators commented on the need to ensure that inappropriate generalisations were not made from a small group of children, and that children were not called upon to represent large constituencies of views when no arrangements were in place for the range of views and ideas to be captured.

Local evaluators identified key processes adopted by a number of partnerships that appeared to have effectively promoted children and young people’s participation. Among these is the importance of resourcing dedicated skills and time for staff, typically dedicated participation officers, to work on promoting participative strategies, and the need to train practitioners and service managers and to provide organisational support for Children’s Fund service providers to develop effective approaches to participation. It is also widely acknowledged that considerable investment of time is likely to be required for participation strategies to become effective; hence many evaluators recommend that partnerships factor in realistic timescales to enable participative practices to become an integral part of service delivery and strategic involvement.

The positioning of voluntary and community sector organisations within partnerships was identified by a number of local evaluators as an important factor in encouraging a wide range of stakeholders to participate in Children’s Fund activities, particularly at operational level. A Wave One report, for example, suggests that voluntary and community organisations have the skills to draw ‘…families and children who have not traditionally engaged well with mainstream services [into participative activities]’. Several reports describe how voluntary and community organisations have been able to bring a more flexible approach to project activity, and are more willing to experiment with appropriate and innovative ways of accessing and involving service users.

Some reports raise the concern, however, that lessons relating to participatory work could potentially be lost if engagement with the voluntary and community sector is not sustained. This could be addressed, for example, by ensuring that resources are in place to support ongoing staff training.
Similar concerns relating to sustainability were described in other evaluation reports; a number of evaluators highlighted the importance of partnerships having in place strategies, protocols and guidance to bring coherence to participative practices. For example, a Wave One local evaluation asserts that, in order to ensure ongoing accountability, permanent structures supporting participatory practices need to be implemented and sustained: 'Partnerships need to be both imaginative and transparent, with clarity of objectives and processes'.

4.3 Key Messages from Local Evaluation: Partnership Working

Many reports analysing partnership working adopted the Guidance document as a reference point in defining expectations of how partnership boards should be operating. Some later reports also referred to the emerging policy context of partnership working described in Every Child Matters to help define important dimensions of partnership working. Early interim reports concentrated on describing the activities of partnership boards in setting up their programmes of work. Later reports focused on the wider aspects of partnership working in terms of how it relates to the changing policy context of integrated children's services following legislative changes, placing more emphasis on mainstreaming and outcome analysis, with many making recommendations to widen the membership of the partnership board. For example, an evaluation report on a Wave One partnership points to the importance of sustaining partnership working as new configurations of children's services emerge. It is suggested that information-sharing and networking between user groups, policy makers and practitioners '…can provide a strategic basis for mainstreaming Children's Fund work'.

A number of evaluation reports noted that Children's Fund partnerships tended to start with small groups of statutory sector senior managers who assembled the infrastructure of the local programmes and commissioned services. Within some partnerships this was followed by a gradual expansion in the size and makeup of strategic stakeholders, although comments from stakeholders suggest that numbers of board members declined over time and that more junior managers were replacing some original senior managers.
This had eroded the continuity of stakeholders' contributions and reflected a lowering of the priority given to the Children's Fund by some agencies. Local evaluators also noted that few partnership boards had regular health service representation at meetings.

Local evaluators reported different experiences and understandings of partnership working among partnership board members. They also found that discussions had taken place and broad agreements been reached on what partnership working meant in terms of governance arrangements, the vision or direction of the partnership and a commitment to the structures and processes to take the partnership forward. Local evaluators noted that some members saw the Fund in administrative terms, with the responsibilities of the board being to ensure probity and to demonstrate that it met Central Government requirements. In contrast, some board members viewed their partnership as an expression of close collaborative working between agencies and sectors to help bring about cultural change in the way children's services were delivered. In particular, partnership working was viewed as providing opportunities for sharing information and learning about good practice as well as agreeing on service development and shared priorities, thereby reducing duplication of services.

Where partnership working was described as being effective, evaluators described the conditions that promoted that effectiveness and recommended preserving these features while adapting to a changing policy environment and the financial challenges this implied. Important conditions include having a shared vision of the purpose of the initiative and transparency of decision making, where it was clear to stakeholders where responsibilities lay, what decisions were being made and what actions were being taken by whom. A critical condition for effective partnership working was good communication between stakeholders representing the different aspects of a programme, including the partnership board, Children's Fund staff, service providers and service users. This was particularly notable in terms of perceived relationships between the operational and strategic levels of the Children's Fund partnership, the lack of which was regarded as a major barrier to the 'mainstreaming' of Children's Fund practices and philosophy. This difficulty was sometimes exacerbated by what is described by a Wave Two local evaluation report as '…complicated linkages between the vertical and horizontal layers of the partnership's structure'.

Some local evaluations identified cases of relatively adversarial relations in which unresolved differences existed between board members, sometimes stemming from negative experiences of partnership working between agencies prior to the Children's Fund.
This was described as having an adverse impact on the effectiveness of the partnership board, producing fragmented, single-agency and unfocused board activities.

Local evaluators also described the collaborative arrangements between Children's Fund projects, noting however that such arrangements tended to be relatively uneven across partnerships' activities. For example, an evaluation report of a Wave One partnership suggested: 'There were piecemeal operational relationships across projects that were forged through some enthusiastic staff developing contacts'. Nevertheless, some reports refer to established and emerging networks of voluntary and community sector organisations that provide Children's Fund services. Whilst some networks were facilitated by Children's Fund staff in order to share information and good practice, many were reported as having been established by the organisations themselves.

A number of local evaluators commented on the relationships between statutory and voluntary and community organisations in Children's Fund partnership arrangements, noting that in many localities local authority agencies took the lead in programme set-up, whilst voluntary and community organisations tended to be more active later in their roles as service providers. Some reports note a degree of frustration among voluntary and community stakeholders who felt they had relatively limited influence compared to some statutory agencies, although this was not the case across all partnerships. In some partnerships voluntary and community organisations had a high profile in partnership board activities, and the Children's Fund was seen as both raising the profile of the voluntary and community sector and promoting productive relations with statutory sector agencies.

A number of more recent evaluation reports speculated on the ways the relationships between the statutory and voluntary and community sectors may play out as children's trusts become established. A commonly held anxiety is that gains made through Children's Fund activity may be lost in the next phase of integrating children's services, as it was predicted that the voluntary and community sector would be relatively marginal in decision making.
Illustrating this point, one report suggests:

The Children's Fund has assisted working relations between the statutory and voluntary and community sectors, but some voluntary organisations see themselves as being at risk of again being on the margins with the move to the local authority taking the lead in integrating children's services through the Children's Trust.

Moreover, some local evaluation reports point to concerns among voluntary and community sector stakeholders that closer relationships with the statutory sector might undermine the distinctive skills and working practices that have been deemed valuable within Children's Fund partnerships. For example, a Wave One evaluation described how the 'anti-managerial' culture of voluntary and community organisations and their independence had facilitated work with 'hard-to-reach' families: '…sometimes independence becomes necessary for voluntary and community organisations to maintain a critical distance from mainstream arrangements for the benefit of particular groups of children'.

Some local evaluators identified links between partnership boards and other strategic groups in the locality, and found that board members were familiar with policy changes in other sectors that would impact on the Children's Fund programme. Other reports recorded mixed experiences among strategic Children's Fund stakeholders about how well connected they were to other strategic groups, with some board members across different programmes tending to feel relatively uninformed about wider strategic changes in children's services. Some evaluators noted that partnership boards generally felt dislocated from the broader strategic environment, and recognised they needed to connect with other strategic groups to prepare for the mainstreaming issues concerning the future of the Children's Fund. Some local evaluators also noted, however, that emerging networks and working practices would be valuable as partnerships prepared for the move towards more integrated children's services, as one Wave Two report suggests:

…[there has been] considerable progress in breaking down professional barriers between respective organisations that traditionally have not worked effectively together. This would appear to be one of the key legacies of the X Children's Fund that needs to be sustained in the Children's Trust.

4.4 Chapter Summary

1. The focus of local evaluators' work on prevention has been to analyse the activities of Children's Fund projects. Most evaluations conceptualised prevention in relation to the Children's Fund sub-objectives and/or to the Every Child Matters outcomes framework in assessing projects' progress. A number of evaluators related the work of projects to the four 'tiers' of prevention adopted in the Children's Fund Guidance.

2. Local evaluators widely report on the beneficial impacts of Children's Fund preventive projects on children and young people's lives, and that many projects are effective at targeting 'hard to reach' groups of children and young people. Some evaluators suggest that voluntary and community organisations have been particularly successful in engaging and addressing the needs of traditionally excluded groups through offering alternative approaches to those of the mainstream agencies, in terms of being flexible, accessible and non-stigmatising.

3. Understandings of 'prevention' among strategic stakeholders and service providers within partnerships tend to be described in local evaluation reports as divergent, although some acknowledge that progress is being made by partnership boards in negotiating a consensus of understanding.


4. Evaluators identified diverse definitions of participation among stakeholders in most partnerships. Many evaluators reported that professionals' and adults' definitions of participation tend to predominate, rather than those of children and young people themselves. Often these equate to consulting children and young people in order to inform the development of responsive services. Despite considerable commitment to participation, in practice progress tends to be relatively limited, with few examples of children and young people's direct participation in decision making at strategic and project levels.

5. A number of evaluators suggested that 'hard to reach' children and young people and younger children were particularly under-represented in Children's Fund participative activities, although some evaluators identified examples of participative activities that had been effective in engaging an inclusive range of children and young people.

6. Local evaluators pointed to the importance of partnerships resourcing dedicated participation officers, the need to train practitioners and service managers, and the importance of providing organisational support for service providers to develop effective participative practices. Considerable investment of time is also needed if participation strategies are to be effective. Some evaluations suggest that voluntary and community organisations may be particularly effective in promoting participation due to their ability to work in flexible and innovative ways.

7. Evaluators reported different experiences and understandings of partnership working, although within many partnerships agreements had been reached on what partnership working means. Some stakeholders saw partnership boards' responsibilities as ensuring probity and demonstrating that they met Central Government requirements. Others emphasised collaborative working between organisations to bring about cultural change in children's services.

8. Conditions reinforcing effective partnership working include: sharing a vision of the purpose of the initiative; transparency of decision making; clarity about stakeholders' responsibilities; and good communications between groups of stakeholders, including members of the partnership board, Children's Fund staff, service providers and service users.


9. Evaluators reported that some collaborative arrangements and networks between Children's Fund projects had been facilitated by partnerships, although many suggested that such arrangements are relatively uneven across partnerships' activities. Other collaborative arrangements and networks were reported as having been established by the organisations themselves.

10. A number of evaluators commented that statutory agencies tended to lead strategically, whilst voluntary and community organisations tended to provide Children's Fund services. However, within some partnerships voluntary and community organisations had been strategically influential; indeed, the Children's Fund had raised the profile of the sector. Some stakeholders predicted, however, that the voluntary and community sector would be marginalised in the next phase of integrating children's services.


Conclusions

Local evaluators have responded to the challenges of evaluating the complex, multi-layered arrangements developed by Children's Fund partnerships by adopting evaluation methods creatively. Many have drawn on multiple quantitative and qualitative methods, as well as data sets at a number of different levels, to show the impacts of projects and other activities, or have used methods differentially to answer questions about different aspects of partnerships' work. A widely experienced dilemma, however, relates to the difficulty of definitively demonstrating the impacts attributable to specific projects or activities, reflecting the complexity of the initiative and the limited time frames within which evaluations have worked. Some evaluators are very forthright in acknowledging the limits of what they are able to show, whilst others acknowledge the importance of openness and transparency in the evaluation process and in the methods used for data collection, analysis and interpretation. What this highlights is the importance of evaluators managing stakeholders' expectations about what evaluation is able to show, and its limits.

Many evaluators also recognise the value of evaluation evidence based on service users', and in particular children and young people's, voices, although the extent to which children and young people have been engaged in evaluation activities is very varied. The idea that children's voices should be heard, valued and acted upon is consistent with the participative ethos of the Children's Fund initiative. However, a number of factors have made it difficult for evaluators to work in this way. It is widely recognised that additional time and resources may be required to work effectively with children and young people. Moreover, many evaluators describe a degree of circumspection about, and occasionally hostility to, qualitative methods such as interviewing, in which children and young people express their views about their use of projects and their perceptions of the benefits of using them. Such methods are frequently seen as more 'anecdotal', with the implication that they produce lesser forms of evidence than 'hard' quantitative outcome data. Nevertheless, the value of qualitative methods in social policy research is increasingly recognised in government departments. This highlights the need for evaluators to work carefully with stakeholders to develop clear understandings of the ways different sets of evaluation methods can be used effectively in combination to assess different elements of programmes' work.


The importance of providing findings that are relevant, useful, timely and accessible to a range of stakeholders appears to be widely recognised by local evaluators, as is the importance of providing realistic recommendations to partnerships. Many local evaluators regarded ongoing dialogue with partnerships as critical to successfully influencing their work, through engagement in partnership structures such as evaluation sub-groups or disseminating material through events, rather than simply writing reports for strategic audiences. Indeed, running events may be a more inclusive way of disseminating findings to strategic stakeholders as well as front-line service providers and service users, including children and young people.

However, it is evident that many local evaluators have, in practice, experienced difficulties influencing the development of partnerships' strategies and practices, or are unclear about how their findings have been translated into strategies and practice. Some evaluators suggested they had experienced difficulties maintaining their independence or reporting on negative aspects of partnerships' work, or that partnerships had drawn on evaluation material selectively to demonstrate the successful aspects of their work or to legitimise decisions. This was seen in the context of pressures from ongoing change nationally and within local authority areas. Partnership board members' multiple priorities and their commitments to their own agencies and other partnerships meant they were unable to act on evaluation findings or had insufficient time to do so. This highlights the challenge that local evaluators have faced in providing material that is relevant and useful to partnerships whilst having the space to produce balanced, independent accounts of the partnerships' work.


References

Becker, S. and Bryman, A. (eds) (2004) Understanding Research for Social Policy and Practice: Themes, Methods and Approaches, Bristol: The Policy Press.

Clarke, A. (2001) 'Research and the policy-making process' in N. Gilbert (ed.) Researching Social Life (2nd ed.), London: Sage.

Coote, A., Allen, J., and Woodhead, D. (2004) Finding Out What Works: Understanding Complex, Community-Based Initiatives, London: King's Fund.

Coulton, C., Connell, J., Kubisch, A., Schorr, L., and Weiss, C. (eds) (1995) New Approaches to Evaluating Community Initiatives: Concepts, Methods and Contexts, New York: The Aspen Institute.

Children and Young People's Unit (CYPU, 2001) Children's Fund Guidance, London: DfES.

CYPU and Youth Justice Board (2002) Use of Children's Fund Partnership Funding for Crime Prevention Activities Jointly Agreed with Youth Offending Teams: Guidance.

Davies, H., Nutley, S., and Smith, P. (eds) (2000) What Works? Evidence-Based Policy and Practice in Public Services, Bristol: The Policy Press.

Donaldson, S. and Scriven, M. (eds) (2003) Evaluating Social Programmes and Problems: Visions for the New Millennium, London: Lawrence Erlbaum Associates.

Edwards, A., and Fox, C. (2005) 'Using Activity Theory to evaluate a complex response to social exclusion', Education & Child Psychology 21:1, pp50-60.

Hardiker, P., Exton, K., and Barker, M. (1991) Policies and Practices in Preventive Child Care, Aldershot: Ashgate.

Mason, P., Morris, K., and Smith, P. (2005) 'A complex solution to a complicated problem? Early messages from the national evaluation of the Children's Fund preventative programme', Children & Society 19:2, pp131-143.

McDonald, G. (2000) 'Social Care: rhetoric and reality' in Davies, H., Nutley, S., and Smith, P. (eds) What Works? Evidence-Based Policy and Practice in Public Services, Bristol: The Policy Press.

Morris, K., and Spicer, N. (2003) The National Evaluation of the Children's Fund: Early Messages for Developing Practice, London: DfES.

National Evaluation of the Children's Fund (NECF, 2004a) Assessing the Impact of the Children's Fund: The Use of Indicators, Birmingham: University of Birmingham.

National Evaluation of the Children's Fund (NECF, 2004b) Collaborating for the Social Inclusion of Children and Young People, London: Department for Education and Skills.

National Evaluation of the Children's Fund (NECF, 2004c) Children, Young People, Parents and Carers' Participation in Children's Fund Case Study Partnerships, London: Department for Education and Skills.


National Evaluation of the Children's Fund (NECF, 2004d) Prevention and Early Intervention in the Social Inclusion of Children and Young People, London: Department for Education and Skills.

Percy-Smith, J. (2000) Policy Responses to Social Exclusion: Towards Inclusion? Maidenhead: Open University Press.

Pierson, J. (2002) Tackling Social Exclusion, London: Routledge.

Solesbury, W. (2001) Evidence Based Policy: Whence it Came and Where it is Going, UK Centre for Evidence Based Policy and Practice Working Paper 1, London: ESRC.

Spencer, L., Ritchie, J., Lewis, J., and Dillon, L. (2004) Quality in Qualitative Evaluation: A Framework for Assessing Research Evidence, London: Cabinet Office.

Young, K., Ashby, D., Boaz, A., and Grayson, L. (2002) 'Social science and the evidence-based policy movement', Social Policy & Society 1:3, pp215-224.


Copies of this publication can be obtained from:

DfES Publications
P.O. Box 5050
Sherwood Park
Annesley
Nottingham
NG15 0DJ

Tel: 0845 60 222 60
Fax: 0845 60 333 60
Minicom: 0845 60 555 60
Online: www.dfespublications.gov.uk

© The University of Birmingham 2006
Produced by the Department for Education and Skills
ISBN 1 84478 776 1
Ref No: RR783
www.dfes.gov.uk/research
