Because they interact with the policy process, initiatives focused on context share many things in common with advocacy and policy change efforts in general. Both may include similar strategies such as coalition building, communication campaigns, grassroots organizing, and media advocacy. And both are “hard to measure” because they evolve over time and their activities and outcomes can shift quickly.29

These similarities in strategy and purpose allow systems initiatives focused on context to draw from the now growing body of work on advocacy and policy change evaluation to identify useful evaluation approaches.30 For example, evaluation methods that are applicable to both advocacy efforts and systems initiatives may include public polling, media tracking, policy tracking, policymaker or bellwether interviews, or intense-period debriefs.31

Examples

The Build Initiative evaluation uses a theory of change approach. The initiative has three theories of change that focus on 1) what an early childhood system must include to produce results; 2) the actions or strategies needed to build an early childhood system; and 3) the special role that outside Build supports can play in catalyzing change.32 Since Build began in 2002, the evaluation has focused on the second theory of change. In recent years, the evaluation has attempted to plausibly trace state-level policy and funding changes back to Build Initiative activities.

The Child and Family Policy Center leads the evaluation and partners with local evaluators to collect data in the Build states. Local evaluation partners document theory of change components using a case study approach that employs the methods they determine are most relevant in their state, such as interviews, surveys, document review, and participant observation. Evaluators use data from these methods to critically examine each state’s progress against the Build theory of change and to determine whether system-related results can be linked back to Build Initiative activities. The Child and Family Policy Center then produces an annual report on overall initiative progress that includes a cross-state analysis of results.

Another example comes from the Urban Health Initiative evaluation conducted by New York University’s Center for Health and Public Service Research. In a unique design choice, evaluators integrated a theory of change approach with a quasi-experimental comparison group design. Evaluators identified 10 non-initiative cities to compare with initiative cities on outcome and impact measures, including leadership, collaboration, and the use of data. Like other theory of change evaluations, this one compared program theory with experience; but the evaluators believed they could strengthen their approach by integrating a comparison group into the design to rule out alternative explanations for evaluation findings.33

29 Harvard Family Research Project (2007). Advocacy and policy change. The Evaluation Exchange, 13(1). Cambridge, MA: Author.
30 Ibid.
31 For a description of the bellwether methodology, see Blair, E. (2007). Evaluating an issue’s position on the policy agenda: The bellwether methodology. The Evaluation Exchange, 13(1), 29. For a description of the intense-period debrief, see Bagnell Stuart, J. (2007). Necessity leads to innovative evaluation approach and practice. The Evaluation Exchange, 13(1), 10-11.
32 Bruner, C. (2004). Toward a theory of change for the Build Initiative: A discussion paper. Retrieved June 27, 2007, from http://www.buildinitiative.org/docs/TowardaTheoryofChange.doc
33 Weitzman, B.C., Silver, D., & Dillman, K. (2002). Integrating a comparison group design into a theory of change evaluation: The case of the Urban Health Initiative. American Journal of Evaluation, 23(4), 371-385.
