A Framework for Evaluating Systems Initiatives

10) Account for and examine externalities. Systems initiatives take place within, and are affected by, externalities: political, cultural, and other factors that are exogenous to systems initiative actions. Evaluations should take these externalities into account and factor them in when making generalizations from evaluation findings.

Especially for systems initiatives that take place in multiple locations, progress may look different in one location compared to another. Each site “is a laboratory of democracy, with its own political culture and its own set of opportunities and challenges. There clearly is no ‘one size fits all’ approach, and the sheer scope and scale of systems [work] entails many different parts (not all of which are likely to be funded or acted upon at any one time).”45 Consequently, different sites are likely to take different systems change pathways. Results should be interpreted through lenses that consider what those pathways are and how difficult systems change is in a given site, taking into account, for example, the momentum for change that already exists there and the policy and economic climate in which change is expected.

11) Make continuous feedback and learning a priority. Regardless of design or methodological choices, systems initiatives generally benefit from evaluations that make continuous feedback and learning a priority. Evaluators should establish adequate feedback loops to ensure timely reporting of both formative and summative findings. Evaluations that address questions raised by those who plan and implement systems initiatives can be invaluable for continuous learning and adaptation.
This is particularly true for systems initiatives in which a significant aspect of the work involves building political will and in which strategy is constantly evolving and being reconsidered. Because systems initiatives often unfold without a predictable script, efforts to evaluate them should inform initiative strategies as they unfold, so that stakeholders can make good choices and identify midcourse corrections as needed. More traditional evaluation approaches, in which the evaluator develops the evaluation plan and then reports back when the data are all collected and analyzed, are less appropriate here.46

Renowned evaluator Michael Quinn Patton labels this kind of approach developmental evaluation, which he posits is well suited for evolving, innovative, and transformative processes such as systems initiatives.47 Developmental evaluation features evaluators partnering with stakeholders and essentially becoming initiative team members. The evaluator’s role is to ask evaluative questions, provide data-based feedback, and generally support developmental or emerging decision making.48 Developmental evaluation is especially appropriate for many systems initiatives that feature high uncertainty and unpredictability, and “where the strategy is [to bring people] and resources to shake up a system, increase the rate and intensity of [interaction] among system elements and actors, and see what happens…The very act [of documenting] what emerges and feeding information back into the evolving system [makes this] form of developmental evaluation part of the intervention.”49

45 Bruner, C. (2006). Build evaluation symposium: Evaluating early learning systems building initiatives. Unpublished manuscript.
46 Weiss, H. (1995). New approaches to evaluating systems: The Gwen R. Iding Brogden Distinguished Lecture Series (pp. 411-416). In Liberton, C., Kutash, K., & Friedman, R. (Eds.), 8th Annual Research Conference Proceedings on A System of Care for Children’s Mental Health: Expanding the Research Base. Tampa, FL: The Research and Training Center for Children’s Mental Health.
47 Patton, M.Q. (2006). Evaluation for the way we work. The Nonprofit Quarterly, 13(1), 28-33.
48 Ibid., p. 28.
49 Patton, M.Q. (in press). Chapter 10: Conceptualizing the intervention: Alternatives for evaluating theories of change. In Patton, M.Q., Utilization-focused evaluation (4th ed.). Thousand Oaks, CA: Sage Publications.
