
The data difference – Using evaluation research to inform policy and practice in early childhood

Claudia Coulton, Ph.D. & Rob Fischer, Ph.D., Center on Urban Poverty & Community Development, Mandel School of Applied Social Sciences
Rebekah Dorman, Ph.D. & Robert Staib, MSSA, Office of Early Childhood/Invest in Children, Cuyahoga County
MSASS Research to Practice Seminar, March 4, 2010

Presentation Outline

• Context and origins
• Role and process of evaluation
• Meaning from data
• Data into practice



Context and Origins
• Comprehensive community initiatives
▫ Invest in Children is an example
▫ Effort to build community capacity for all children and families in their early years
▫ Theory of change about specific programs and system/community change
• Role of evaluation in CCIs
• Methodological challenges

Context and Origins
• Comprehensive community initiatives
• Role of evaluation in CCIs
▫ Implementation and early outcomes
▫ Quality of program components
▫ Long-term benefits
▫ Continuous course correction
• Methodological challenges



Context and Origins
• Comprehensive community initiatives
• Role of evaluation in CCIs
• Methodological challenges
▫ Numerous pieces and parts
▫ Measuring system change
▫ Population impacts vs. program outcomes
▫ Comparison groups
▫ Quality control

The example of Invest in Children
• Launch of Early Childhood Initiative in mid-1999 (now called Invest in Children)
• Brain research
• Return on investment
• Early childhood system
• Evaluation as a core value



Invest in Children in Context
• Originally a three-year initiative that began July 1, 1999, with $40m in funding from governmental and 23 private-sector funders; now a $22m annual budget
• IIC mission is to plan, fund, coordinate, and evaluate services for children prenatal to six and their families
• Organized under the Cuyahoga County Office of Early Childhood in 2004

Invest in Children: Goals & Programs
• Goal 1: Effective Parents & Families
▫ Home Visiting
▫ Early Childhood Mental Health
• Goal 2: Safe & Healthy Children
▫ Health Insurance
▫ Primary Lead Poisoning Prevention
▫ Medical Home
• Goal 3: Children Prepared for School
▫ Professional Development, centers & homes
▫ Special Needs Child Care
▫ Universal Pre-Kindergarten
• Goal 4: A Community Committed to Children



Role & process of evaluation
• Role of evaluation adopted as a core operating principle by IIC
• Evaluation is ongoing, integrated into program development and implementation
▫ External evaluation
▫ Program monitoring
• Commitment of philanthropic funds has allowed investment in evaluation over time

IIC evaluation: Building on strengths
▫ Powerful child registry of all kids born in the County since 1992 and served by IIC and others
▫ Ability to draw on routinely available administrative data to monitor program delivery and outcomes (see the sketch after this list)
▫ Solid experience in measuring:
- Parent and caregiver knowledge, attitudes, and behavior
- Quality of care settings to which children are exposed
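To make the registry-plus-administrative-data idea concrete, here is a minimal, hypothetical sketch in Python (pandas) of how registry births might be linked to service records for monitoring purposes; the file names, column names, and the age-at-contact indicator are illustrative assumptions, not the actual IIC data systems.

```python
import pandas as pd

# Hypothetical extracts; the real registry and administrative systems differ.
registry = pd.read_csv("child_registry.csv", parse_dates=["birth_date"])     # one row per child born in the county
services = pd.read_csv("service_records.csv", parse_dates=["service_date"])  # one row per service contact

# Link service contacts back to registry children on a shared identifier.
linked = services.merge(registry[["child_id", "birth_date"]], on="child_id", how="left")

# Age in months at each contact, to check whether children are being reached early.
linked["age_months"] = (linked["service_date"] - linked["birth_date"]).dt.days / 30.44

# Monitoring view: unduplicated children served per program, and the share of
# contacts occurring before the child's first birthday.
summary = linked.groupby("program").agg(
    children_served=("child_id", "nunique"),
    share_of_contacts_before_age_1=("age_months", lambda m: (m < 12).mean()),
)
print(summary)
```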



Evaluation process for Invest in Children
• Develop program logic models to reflect the basic theory underlying the strategy
• Work with lead agencies and other key stakeholders to develop specific evaluation questions
• Create proposal and evaluation plan for review by Invest in Children Executive Committee
• Conduct evaluation; methods include surveys, interviews, administrative data, observations, case record review, etc., as appropriate
• Report findings to Executive and Partnership Committee, programs, and community

Evaluation process for Invest in Children
• Instruments
• Data Systems
• Encouragement
• Frustrations



Conceptualizing the overall approach

Invest in Children Logic Model
• Strategies: Effective Parents and Families (prenatal-to-three system); Safe & Healthy Children (access & utilization of preventative health care); Children Prepared for School (early care and education system); Community Committed to Children
• Programs: Home visiting; Early Literacy and Learning; Early Childhood Mental Health; Healthy Start Outreach; Medical Home; Primary Lead Prevention; Family Child Care Homes Regional System; professional development for centers; T.E.A.C.H.; UPK; Special Needs Child Care; communications campaign; community mobilization; disseminate findings
• Outputs: Number of children and families reached; children reached early and with continuity; families effectively access the range of available services; caregivers effectively engaged
• Intermediate Outcomes: Parents and caregivers have increased knowledge and skills; children receive appropriate care at home and in other settings; children develop appropriately
• Longer-term Outcomes: Positive movement on community-level child well-being indicators; a community ethic around our youngest children

Scope of Services Delivered
[Bar chart: number of children reached by each service (Any IIC Service, Medicaid, Newborn Home Visiting, Family Child Care, Ongoing Home Visiting, Early Intervention, Special Needs), with counts ranging from 2,764 to 177,121 children. Service data July 1999 through December 2007.]



Disseminating evaluation findings
• The potential audiences for evaluation findings will not come looking for you
• Use creative ways to take findings to:
▫ Program staff
▫ Media
▫ Key stakeholders
▫ Community
• Models that can be replicated by other systems



Evaluation purposes
• Program planning and improvement
▫ formative evaluation
• Accountability
▫ summative evaluation
• Knowledge generation

Elements of a learning partnership
• Ongoing external evaluation by university-based social science researchers
• Development of quarterly program indicators for tracking the progress of strategies in relation to service targets (see the sketch after this list)
• Creation of program logic models to specify the intentions of the program elements and to provide a framework for the evaluation of program effects
• Forming of an Executive Committee, including County officials, funders, and program operators, who engage in program refinement decision making based on evaluation data and other input
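As one illustration of how quarterly indicators might be compared against service targets, here is a small Python sketch; the program names, counts, annual targets, and the 90% "on track" threshold are made-up assumptions for illustration, not actual IIC indicators or rules.

```python
from dataclasses import dataclass

@dataclass
class QuarterlyIndicator:
    program: str
    quarter: str
    children_served_to_date: int  # cumulative count for the year so far
    annual_target: int

    def percent_of_prorated_target(self, quarters_elapsed: int) -> float:
        """Compare cumulative service counts to the annual target prorated by quarter."""
        prorated_target = self.annual_target * quarters_elapsed / 4
        return 100 * self.children_served_to_date / prorated_target

# Illustrative numbers only.
indicators = [
    QuarterlyIndicator("Home Visiting", "Q2", children_served_to_date=1480, annual_target=3200),
    QuarterlyIndicator("Early Childhood Mental Health", "Q2", children_served_to_date=610, annual_target=1100),
]

for ind in indicators:
    pct = ind.percent_of_prorated_target(quarters_elapsed=2)
    status = "on track" if pct >= 90 else "below target"
    print(f"{ind.program} ({ind.quarter}): {pct:.0f}% of prorated target, {status}")
```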



Making meaning from data
• Balancing scope with quality
• Working to blend efforts over time
• The challenge of singular focus
• Evaluation as a tool, not an end in itself

Pitfalls – Example #1
• Agreeing on evaluation measures upfront, but interpretation becomes problematic
▫ Quality of home-based child care (makes program look bad)
▫ Child maltreatment (makes program look good)



Raising quality in home-based care
[Bar chart: number of FCCH providers at Time 1 (N=68) and Time 2 (N=68)]


Pitfalls – Example #2
• Complications by design
▫ Selling rigor to get better answers, but nobody’s buying
▫ Eventually finding opportunities to build in better designs

No assignment to conditions
• Creation of comparable groups for evaluation prevented by ethical, legal, clinical, and logistical barriers
• Eventual progress achieved by focusing on (see the sketch below):
▫ Roll-out of new service additions that cannot be offered to everyone
▫ Group-based assignment
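A hypothetical sketch of the roll-out idea: sites that have not yet received a new service addition serve as a comparison group for sites that have. The site labels, outcome scores, and phase structure below are invented for illustration and are not IIC data.

```python
import pandas as pd

# Invented site-level data: which roll-out phase each site entered and a
# mean child outcome score measured during the evaluation window.
sites = pd.DataFrame({
    "site": ["A", "B", "C", "D", "E", "F"],
    "rollout_phase": [1, 1, 1, 2, 2, 2],  # phase 1 sites receive the new service first
    "mean_outcome_score": [72.4, 68.9, 75.1, 66.2, 64.8, 69.5],
})

# While phase 2 sites are still waiting, they act as a comparison group for
# the phase 1 sites that already received the service addition.
treated = sites.loc[sites["rollout_phase"] == 1, "mean_outcome_score"]
comparison = sites.loc[sites["rollout_phase"] == 2, "mean_outcome_score"]

print(f"Phase 1 (served) sites:  mean outcome = {treated.mean():.1f}")
print(f"Phase 2 (waiting) sites: mean outcome = {comparison.mean():.1f}")
print(f"Difference:              {treated.mean() - comparison.mean():.1f}")
```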



Pitfalls – Example #3
• Determining if the evaluation relationship really matters
▫ Did anything change? (program design, program development)

Examples of changes based on evaluation in IIC
• Goal 1 – Effective parents
▫ Enhanced services for teens
▫ Expanded newborn home visits to older moms
▫ Engagement strategies for ongoing home visiting
• Goal 2 – Safe & healthy children
▫ Pursuit of a medical home strategy
• Goal 3 – Children ready for school
▫ Pilot of technical assistance doses for family child care home providers
▫ Informing the UPK pilot



Evaluation challenges
• Invitation – who called for the evaluation?
• Language – do we understand each other?
• Fear – what is the true goal of the evaluation?
• Scope/focus – can we really measure this?
• Control – internal or external?
• Resources – when and with what budget?

Data into practice
Observations…
▫ Data don’t make policy… people with data make policy
▫ Policy shapes research
▫ Everyone wants outcomes… few want to pay for them (or pay very much)
▫ The great divides still need to be bridged



Do’s: For the evaluator
▫ Push program staff to ask the research questions
▫ Understand the program you are evaluating. Numbers tell you one part of the story… people tell you the other part
▫ Consider the viewpoint/feelings of program staff when you present results
▫ Present your data so policymakers and program staff can understand it. Keep it short.
▫ Draw implications. Be ready for the media.

Do’s: For program staff
• Embrace evaluation as a tool rather than a threat
• Get ALL staff on board to prevent sabotage
• Learn basic principles regarding data collection
• Be ready to listen to hard truths. Use the data to improve the program.



Do’s: For Funders/Policymakers
• Provide adequate funding for the evaluation component (a mandate is even better!)
• Allow programs time to learn from the evaluation and improve before you pull the plug
• Make decisions based on data (not perceptions, relationships, beliefs, etc.)

In the perfect world…
• All programs have adequately funded, rigorous yet practical evaluations that were developed through a collaborative conversation between program staff and evaluators.
• The evaluation results are presented to program staff in an understandable and sensitive fashion, and staff embrace the findings as a means to improve the program and show impact.
• Policymakers/Funders reward the above.

