
S I R C
Seattle Implementation Research Collaborative

2nd Biennial Conference: Solving Implementation Research Dilemmas

Seattle, Washington — May 16-17, 2013

www.seattleimplementation.org


TABLE OF CONTENTS

Welcome from the Conference Directors ..... 2
Acknowledgements ..... 2
Schedule & Hotel Map
  Schedule-At-A-Glance & Hotel Map ..... 4
  Thursday, May 16 ..... 6
  Friday, May 17 ..... 8
  Conference Tracks ..... 10
Symposia Abstracts — Thursday, May 16
  Should EBPs Be Locally Grown or Factory Farmed? ..... 15
  Innovative Approaches for Making EBPs Work ..... 19
  Leadership & Implementation ..... 27
  Breakout Sessions
    Breakout A: EBP Champion Symposium: Is My Patient Getting Better? ..... 31
    Breakout B: Learning from Implementation Observation ..... 33
    Breakout C: Implementation Through Collaborations with Policymakers ..... 37
    Breakout D: Advancing Fidelity Measurement ..... 41
    Breakout E: Implementing Primary Care Interventions ..... 45
    Breakout F: EBP Champion Symposium: Solving Research Dilemmas Related to Implementation Fidelity ..... 49
    Breakout G: Sustainability & Adaptation in Social Services ..... 51
    Breakout H: Implementation of Critical Time Intervention ..... 55
    Breakout I: Matching Implementation to Setting ..... 59
    Breakout J: Research-Community Relationships ..... 63
  Leveraging Technology ..... 67
Symposia Abstracts — Friday, May 17
  SIRC Instrument Review Taskforce: An Overview of Progress Made & Plans for the Future ..... 71
  Key Findings & Future Paths in Implementation Research ..... 73
  Implementation in Zambia ..... 77
  Breakout Sessions
    Breakout K: Global Models of Implementation ..... 81
    Breakout L: Innovative Substance Abuse Treatment ..... 85
    Breakout M: Statistical Methods Workshop Part I ..... 89
    Breakout N: Learning from Scale-Up ..... 91
    Breakout O: Fidelity of Interventions Across the Age Spectrum ..... 95
    Breakout P: EBP Champion Symposium: Implementation of TF-CBT Across Washington State ..... 99
    Breakout Q: Sustainability ..... 101
    Breakout R: Statistical Methods Workshop Part II ..... 105
    Breakout S: New Implementation Measures ..... 107
    Breakout T: Outcomes from New Interventions ..... 111
  Final Symposium: Interagency Collaborative Teams to Scale-Up Evidence-Based Practices ..... 115
Poster Session Abstracts ..... 119
Additional Note Pages ..... 142
Nearby Restaurants & Map of Downtown Seattle ..... 147
Program Evaluation Form ..... 149
Continuing Education Form ..... 151



WELCOME FROM THE CONFERENCE DIRECTORS

2013 Conference Core Planning Committee

Kate Comtois, PhD, MPH 1 — Co-Director
Cara C. Lewis, PhD 2 — Co-Director
Andria K. Pierson, MEd 1 — Conference Co-Coordinator
Kristin Bumgardner, BS 1 — Conference Co-Coordinator
Rinad Beidas, PhD 3
Cameo Borntrager, PhD 4
Adam Carmel, PhD 1
Kay Yu Yuan Chai 1
Doyanne Darnell, PhD 1
Shannon Dorsey, PhD (Co-I) 1
Karin Hendricks, BA 2
Meghan Keough, PhD 1
Suzanne Kerns, PhD (Co-I) 1
Sara J. Landes, PhD (Co-I) 5
Aaron R. Lyon, PhD (Co-I) 1
Ruben Martinez, BA 2
Maria Monroe-DeVita, PhD (Co-I) 1
Stephen O'Connor, PhD 1
Landon Sach 6
Jennifer Villatte, PhD(c) 1

1 University of Washington; 2 Indiana University; 3 University of Pennsylvania; 4 University of Montana; 5 National Center for PTSD & VA Palo Alto Health Care System; 6 Seattle University

On behalf of the SIRC Core Committee and staff, we welcome you to the 2nd Biennial Conference of the newly renamed Seattle Implementation Research Collaborative. Because we now have several initiatives beyond the conferences themselves, we changed our name to reflect this larger mission. We encourage you to explore our initiatives on our website (www.seattleimplementation.org), including:

- Instrument Review Project
- Strategic Planning Group (SPG) and its related projects
  - Implementation Research Development Workshop (IRDW) and mock grant reviews
  - Junior SPG Mentoring Program
- Impact of Infrastructure on Implementation International Survey
- Dissemination and Implementation Training Opportunities
- Dissemination and Implementation Reading Group

This year's conference theme is Solving Implementation Research Dilemmas, and many of the presentations offer innovative and effective solutions. As with our first conference, mornings will be spent together in two symposia; afternoons will offer two breakout sessions, each with a choice of five symposia; and we will come back together for a final symposium in the late afternoon. There are no set assignments for the breakouts; you may attend whichever you like as seating permits. (Last month's survey simply gave us a general idea of planned attendance so we could size the rooms.)

We would also like to call your attention to this year's conference tracks. Please see page 10 for a schedule highlighting the presentations that focus on each of these tracks:

EBP Champions*        Scale-Up
Fidelity              Sustainability
Global Perspectives   Technology
Measurement           Training

* i.e., the presentation includes clinicians or leaders from the practice community who have successfully implemented EBPs and champion them in their system

We invite all of you to attend our reception and poster session Thursday evening at 5:30, which features the work of our undergraduate, junior, and senior colleagues. Please enjoy the hors d'oeuvres and music, take the chance to socialize, and encourage our junior colleagues as they start their careers in implementation research.

You will find evaluation pages for the conference in the back of this program. We encourage you to complete the evaluation to tell us how this meeting goes for you, and also to offer suggestions for new ventures on which SIRC might embark in the next two years.



We want to extend our gratitude to the many people who have contributed to this conference and to SIRC in general. Thank you to our Advisory Committee for their wonderful ideas. Thank you to our Core Committee, who have met bimonthly for over a year and volunteered an enormous amount of time, as well as ideas, to make the conference work. Thank you to our sponsors, including NIMH, which funded the majority of this conference through the R13 mechanism, and to the University of Washington (including the CHAMMP and WIMIRT centers), Indiana University, and the National Center for PTSD and Department of Veterans Affairs for their in-kind support. We want to thank our many students and SIRC volunteers for their hard work in keeping the collaborative and this conference moving. Finally, we want to thank Andria Pierson and Kristin Bumgardner for their coordination, managing all of the logistics and creating a setting where we can all learn and enjoy.

Make the most of these days together and take advantage of the many opportunities to meet new colleagues and network. We look forward to seeing you at the reception tonight. We hope you will become active in SIRC, our dynamic collaborative.

Welcome!

Kate Comtois, PhD, MPH
Associate Professor, Department of Psychiatry & Behavioral Sciences, University of Washington
Co-Director, SIRC

Cara C. Lewis, PhD
Clinical Assistant Professor, Department of Psychological & Brain Sciences, Indiana University
Co-Director, SIRC

2013 Advisory Board

Eric Bruns, PhD 1
Paul Ciechanowski, MD, MPH 1
Dean L. Fixsen, PhD 2
Dan Fox, MSW, LICSW 3
David Kolko, PhD, ABPP 4
Antoinette (Toni) Krupski, PhD 1
Leif Solberg, MD 5
Bradley Steinfeld, PhD 6
Shannon Wiltsey Stirman, PhD 7
Jürgen Unützer, MD, MPH, MA 1

1 University of Washington; 2 FPG Child Development Institute, University of North Carolina at Chapel Hill; 3 Lutheran Community Services Northwest; 4 Western Psychiatric Institute & Clinic, University of Pittsburgh; 5 HealthPartners Institute for Education & Research; 6 Group Health Cooperative; 7 VA Boston Healthcare System, National Center for PTSD, & Boston University

FUNDED BY: Grant No. 5 R13 MH086159

SPONSORED BY:



HOTEL DECA MAP

[Floor plans: Mezzanine Level and Lobby Level]

Registration & Badges

All SIRC conference attendees must be registered. The conference registration table will be located outside the Grand Ballroom.

Badges are required for admission to all sessions, meals, and receptions. Please wear your badge during the conference, and remember to remove it outside the hotel.

Lunch

Additional seating for lunch will be available in the Governor Room on the lower level (not shown), across the hall from The District Lounge. For additional dining options, please see the list of nearby restaurants on page 147.



SCHEDULE-AT-A-GLANCE

Wednesday 5/14
9:00-4:00    Implementation Research Development Workshop (IRDW)
Time TBA     Dinner Excursions

Thursday 5/16
7:00-8:00    Registration & Continental Breakfast
8:00-8:30    Welcome Ceremony
8:30-9:45    Innovative Approaches for Making EBPs Work
9:45-10:00   Break
10:00-11:15  Leadership & Implementation
11:15-12:30  Buffet Lunch
12:30-1:45   Concurrent Breakout Sessions: A-E
1:45-2:00    Break
2:00-3:15    Concurrent Breakout Sessions: F-J
3:15-3:30    Break
3:30-4:45    Leveraging Technology
5:30-7:30    Reception & Poster Session with Live Music, Hors d'oeuvres, & Cash Bar
Time TBA     Dinner Excursions

Friday 5/17
7:00-8:00    Registration & Continental Breakfast
8:00-8:45    Day 2 Welcome & SIRC Instrument Review Taskforce
8:45-10:00   Understanding Implementation Scale-Up
10:00-10:15  Break
10:15-11:30  Implementation in Zambia
11:30-12:45  Buffet Lunch
12:45-2:00   Concurrent Breakout Sessions: K-O
2:00-2:15    Break
2:15-3:30    Concurrent Breakout Sessions: P-T
3:30-3:45    Break
3:45-5:15    Final Symposium, Closing Remarks, Presentation of Awards, & SIRC Updates
5:15-6:00    Strategic Planning Group (SPG): Optional Review
6:15         SIRC Fun Run
Time TBA     Dinner Excursions

Saturday 5/18
All Day      Strategic Planning Group (SPG): Invited Members Only
Time TBA     Dinner Excursions



SCHEDULE—THURSDAY, MAY 16
(Note: The schedule includes presenters only. Please see the abstracts for the full lists of authors.)

7:00-8:00   REGISTRATION & CONTINENTAL BREAKFAST

8:00-8:15   Welcome to SIRC & Orientation to Conference ..... Grand Ballroom
  Kate Comtois, PhD, MPH

8:15-8:30   Should EBPs be Locally Grown or Factory Farmed? ..... Grand Ballroom
  Greg Simon, MD, MPH

8:30-9:45   Innovative Approaches for Making EBPs Work (MC: Cameo Borntrager, PhD) ..... Grand Ballroom
  Bridge Over Troubled Waters: The Interactive Systems Framework for Dissemination & Implementation
  Abraham Wandersman, PhD
  Implementation of School-Wide Positive Behavior Interventions & Supports: The Influence of Emotion, Self-Efficacy, & Organizational Commitment
  Zed Kramer, MA, Molly K. McDonald, MA, Brandon Rennie, EdS, & Cameo Borntrager, PhD
  Seeing is Believing: Behavioral Rehearsal Methodology
  Shannon Dorsey, PhD, Rinad Beidas, PhD, & Wendi Cross, PhD

9:45-10:00  BREAK

10:00-11:15 Leadership & Implementation (MC: Adam Carmel, PhD) ..... Grand Ballroom
  Leadership & Implementation
  Bruce J. Avolio, PhD
  Taking a Lesson from Usual Care: Predictors of Use of Evidence-Based Practices for Youth
  Charmaine K. Higa-McMillan, PhD

11:15-12:30 BUFFET LUNCH

12:30-1:45  Concurrent Breakout Sessions

Breakout A: EBP Champion Symposium (MC: Sara J. Landes, PhD) ..... President
  Is My Patient Getting Better? Implementation of Mental Health Progress Monitoring/Outcomes System in an Integrated Delivery System
  Bradley Steinfeld, PhD

Breakout B: Learning from Implementation Observation (MC: Suzanne Kerns, PhD) ..... Chancellor
  Training in Triple P (Positive Parenting Program): Exploring Implementation Outcomes Across Practitioner Groups in the United States, Australia, England & Canada
  Suvena Sethi, PhD
  Factors Associated with Adoption of a Mental Health Intervention for Autism Spectrum Disorders
  Colby Chlebowski, PhD
  Observed Barriers to Implementation of Empirically-Supported Treatments by Clinicians Working with Military & Veteran Patients
  Craig J. Bryan, PsyD, ABPP, & David S. Riggs, PhD

Breakout C: Implementation Through Collaborations with Policymakers (MC: Aaron Lyon, PhD) ..... College
  Effective Implementation of EBP Legislation by Engaging Providers in a Coaching Process
  Eric Trupin, PhD, & Gabrielle D'Angelo, MSW
  Negotiating Implementation Science & Evaluation Research: Lessons Learned from a National Teen Pregnancy Prevention Implementation Study
  Jacqueline Berman, PhD
  Identifying the Needs of OEF/OIF Veterans with TBI & Co-Occurring Behavioral Health Issues
  Lisa Brenner, PhD, Jennifer Olson-Madden, PhD, Bridget Matarazzo, PsyD, & Gina Signoracci, PhD

Breakout D: Advancing Fidelity Measurement (MC: Jennifer Villatte, PhD(c)) ..... Grand Ballroom
  Fidelity Measurements in the Real World: Feasibility of BECCI & MITI for Motivational Interviewing in Child & Youth Mental Health
  Melissa Kimber, MSW, PhD(c), Raluca Barac, MA, PhD, & Melanie Barwick, PhD, CPsych
  Comparisons Among Six Methods for Measuring Fidelity: Implications for Research & Practice
  Kristin Duppong Hurley, PhD
  An Update on Project BEST (Bringing Evidence-Supported Treatments to South Carolina Children & Families): Challenges to Measuring Provider Fidelity
  Rochelle F. Hanson, PhD



Breakout E: Implementing Primary Care Interventions (MC: Maria Monroe-DeVita, PhD) ..... Regent
  Transformation & Spread of Primary Care Clinics into Medical Homes: It's Slow, Hard Work
  Leif I. Solberg, MD
  A Qualitative Study of Fidelity: Understanding Variations in Implementation of the Patient-Centered Medical Home
  Rosalind Keith, PhD
  CADET: Clinical & Cost Effectiveness of Collaborative Care for Depression in UK Primary Care: A Cluster Randomized Controlled Trial
  David A. Richards, PhD

1:45-2:00   BREAK

2:00-3:15   Concurrent Breakout Sessions

Breakout F: EBP Champion Symposium (MC: Cara C. Lewis, PhD, & Cameo Borntrager, PhD) ..... President
  Solving Research Dilemmas Related to Implementation Fidelity
  Rebecca Selove, PhD, MPH, Kathryn Mathes, BSN, RN, MS, PhD, & Heather Wallace, PhD

Breakout G: Sustainability & Adaptation in Social Services (MC: Maria Monroe-DeVita, PhD) ..... Chancellor
  Implementation Strategies in Social Service Settings: A Research Agenda
  Byron J. Powell, AM
  DBT Teams in Training 2008-2011: Implementation Follow-up in 2012
  Anthony DuBose, PsyD, & André Ivanoff, PhD
  Understanding Modifications to CBT in Community Settings: A Comparison of Providers in Adult & Child Mental Health Service Settings
  Shannon Wiltsey Stirman, PhD

Breakout H: Implementation of Critical Time Intervention (MC: Meghan Keough, PhD) ..... Regent
  From Inception to Practice: Taking an Evidence-Based Practice from Development to Implementation
  Challenges & Successes in Assessing Fidelity to the CTI Model Over Time
  Assessing the Implementation of the Critical Time Intervention Model Across 20 Homeless-Service Agencies
  R. Neil Greene, PhD, & Melissa Martin, MSW

Breakout I: Matching Implementation to Setting (MC: Sara J. Landes, PhD) ..... Grand Ballroom
  Matching Training to Setting: A New Implementation Model for Dialectical Behavior Therapy
  Helen Best, MEd, Kate Comtois, PhD, MPH, & Nancy A. McDonald, MS, CADC, LPC
  User-Centered Design & the Implementation of Evidence-Based Interventions
  Aaron R. Lyon, PhD
  Designing an Implementation Strategy to Support the Multi-Site Scale-Up of an Evidence-Based, Culturally Appropriate Practice Model for Intensive Family Support Services Across the Northern Territory, Australia
  Robyn Mildon, PhD

Breakout J: Research-Community Relationships (MC: Suzanne Kerns, PhD) ..... College
  Evaluating the Success of a Statewide EBP Scale-Up Project: The Children's Administration-University of Washington EBP Partnership
  Eric Bruns, PhD, Andrea Negrete, MEd, & Tammy Cordova, MSW
  Reviewing the Use of Research-Community Partnerships to Facilitate Implementation of Evidence-Based Practices in Children's Community Services
  Nicole Stadnick, MS, MPH
  Developing the Autism Model of Implementation for ASD Community Providers: Use of a Research-Community Partnership
  Amy Drahota, PhD

3:15-3:30   BREAK

3:30-4:45   Leveraging Technology (MC: Rinad Beidas, PhD) ..... Grand Ballroom
  Scaling Up Assessment of Therapist Fidelity in Motivational Interviewing: Preliminary Development of the AutoMITI
  David C. Atkins, PhD
  PracticeGround: An Online Platform to Help Therapists Learn, Implement, & Measure Impact of EBPs
  Gareth Holman, PhD
  Dialectical Behavior Therapy Implementation Process & Outcomes in VA & Community Settings
  Sara J. Landes, PhD, & Matthew Ditty, MSW

4:45-5:30   BREAK

5:30-7:30   Reception & Poster Session with Live Music, Hors d'oeuvres, & Cash Bar ..... Grand Ballroom
  Featuring Classical Guitarists: Teresa Jaworski & Mark Hilliard Wilson




SCHEDULE—FRIDAY, MAY 17

7:00-8:00   REGISTRATION & CONTINENTAL BREAKFAST

8:00-8:15   Day 2 Welcome ..... Grand Ballroom
  Kate Comtois, PhD, MPH

8:15-8:45   SIRC Instrument Review Taskforce: An Overview of Progress Made & Plans for the Future ..... Grand Ballroom
  Cara C. Lewis, PhD

8:45-10:00  Understanding Implementation Scale-Up (MC: Suzanne Kerns, PhD) ..... Grand Ballroom
  Implementation Science in an Era of Health Reform & Patient-Centered Comparative Effectiveness Research: New Threats, New Expectations, New Opportunities
  Brian Mittman, PhD
  Synthesis of Findings from 3 Lifestyle Behavior Change Program Implementation in the VA
  Laura J. Damschroder, MS, MPH
  Racial/Ethnic Disparities & the Implementation of Evidence-Based Practices in Public Youth-Serving Systems
  Antonio R. Garcia, PhD

10:00-10:15 BREAK

10:15-11:30 Implementation in Zambia (MC: Shannon Dorsey, PhD) ..... Grand Ballroom
  Implementation of TF-CBT in Zambia: Perspectives from Local Supervisors & Counselors
  Margaret Kasoma
  Organizational Implementation Barriers & Facilitators for Mental Health Programs in Zambia: A Mixed-Methods Study
  Laura Murray, PhD
  Mixed Methods Assessment of Implementation Barriers & Facilitators for Mental Health Programs in Zambia: Provider Level Themes
  Rinad Beidas, PhD

11:30-12:45 BUFFET LUNCH

12:45-2:00  Concurrent Breakout Sessions

Breakout K: Global Models of Implementation (MC: Rinad Beidas, PhD) ..... Chancellor
  Scaling Up Care for Orphans in Tanzania: A Task-Sharing Approach to Mental Health Treatment
  Shannon Dorsey, PhD
  A Transdiagnostic Mental Health Intervention in Low Resource Countries: An Alternative Solution to Mental Health Implementation Challenges
  Laura Murray, PhD
  Implementation of Cognitive Processing Therapy Provided by Community-Based Paraprofessionals in the Democratic Republic of Congo: Influence of Therapist Factors Randomized Clinical Trial
  Debra Kaysen, PhD

Breakout L: Innovative Substance Abuse Treatment Implementation (MC: Doyanne Darnell, PhD) ..... President
  Scaling Up & Sustaining Alcohol & PTSD Screening & Intervention in US Trauma Care Systems
  Douglas Zatzick, MD
  Lessons Learned from Implementing a Web-Based Tool for Brief Alcohol Interventions in a Large Integrated Health Care System
  Kenneth R. Weingardt, PhD
  Disseminating Contingency Management: A Training & Implementation Trial
  Bryan Hartzler, PhD

Breakout M: Statistical Methods Workshop Part I (MC: Kate Comtois, PhD, MPH) ..... College
  Design & Analysis Challenges with Multilevel Implementation Data
  David C. Atkins, PhD, & Scott A. Baldwin, PhD

Breakout N: Learning from Scale-Up (MC: Meghan Keough, PhD) ..... Grand Ballroom
  Overcoming Implementation Research Challenges While Studying CPT Training & Implementation Across Canada
  Shannon Wiltsey Stirman, PhD
  Financing & Scaling Up Early Intervention Services
  Howard H. Goldman, MD, MPH
  System Improvement Through Service Collaboratives: Closing Gaps & Improving Access & Coordination
  Brian Rush, PhD



Breakout O: Fidelity of Interventions Across the Age Spectrum (MC: Suzanne Kerns, PhD) ..... Regent
  Implementation of the Program to Encourage Active & Rewarding Lives for Seniors (PEARLS)
  Lesley Steinman, MSW, MPH
  Common Issues with Assessing Fidelity to Complex Multi-Modal Service Programs: Lessons Learned from Assessing Fidelity to the ACT Model
  Maria Monroe-DeVita, PhD
  Assessing Implementation Fidelity of the Family Check-Up: Development & Validation of the COACH Rating System
  Justin D. Smith, PhD

2:00-2:15   BREAK

2:15-3:30   Concurrent Breakout Sessions

Breakout P: EBP Champions Symposium (MC: Shannon Dorsey, PhD) ..... President
  Implementation of TF-CBT Across Washington State
  Joe Leroy, MSW, Dan Fox, MSW, Ron Gengler, MS, & Lori Vanderburg, MS

Breakout Q: Sustainability (MC: Adam Carmel, PhD) ..... Chancellor
  Sustainability of CBT for Youth Anxiety in Community Settings Following Implementation
  Rinad Beidas, PhD
  Supporting Implementation of the Triple P System: A Standardized Framework
  Jacquie Brown, MES, RSW, & Sara van Driel, PhD
  Research Implementation within a Clinical Practice: Resolving the Science/Practice Dialectic
  Sally A. Moore, PhD

Breakout R: Statistical Methods Workshop Part II (MC: Kate Comtois, PhD, MPH) ..... College
  Design & Analysis Challenges with Multilevel Implementation Data
  David C. Atkins, PhD, & Scott A. Baldwin, PhD

Breakout S: New Implementation Measures (MC: Doyanne Darnell, PhD) ..... Grand Ballroom
  Measuring an Evidence-Based Model of Implementation: Preliminary Development of a Survey Instrument
  Josef I. Ruzek, PhD
  Solving Measurement Issues in Implementation Science
  Ruben Martinez, BA, & Cara C. Lewis, PhD
  Common Elements for Implementing Evidence-Based Practices in Children's Mental Health
  Lisa Saldana, PhD

Breakout T: Outcomes from New Interventions (MC: Meghan Keough, PhD) ..... Regent
  Team-Based Exposure & Ritual Prevention for Adults with Obsessive Compulsive Disorder: An Open Trial Implemented in a Community Mental Health Center
  Maria Mancebo, PhD
  Implementation of the Family Check-Up in Community Mental Health Agencies: Clinical Effectiveness, Fidelity, & Other Outcomes
  Justin D. Smith, PhD
  Cognitive Retraining (CR) for Attention & Working Memory for Older Adults: What to Train, to Whom, & How Long?
  Lee Hyer, PhD, ABPP

3:30-3:45   BREAK

3:45-5:00   Final Symposium: Interagency Collaborative Teams to Scale-Up Evidence-Based Practices: Preliminary Results from a Large Scale Implementation (MC: Maria Monroe-DeVita, PhD) ..... Grand Ballroom
  Interagency Collaborative Teams for Capacity Building to Scale-Up Evidence-Based Practice
  Michael Hurlburt, PhD, Gregory A. Aarons, PhD, Danielle Fettes, Cathleen Willging, Lawrence A. Palinkas, PhD, & Mark J. Chaffin
  Collaboration, Negotiation, & Coalescence for Interagency-Collaborative Teams to Scale-Up Evidence-Based Practice
  Gregory A. Aarons, PhD, Michael Hurlburt, PhD, Danielle Fettes, Cathleen Willging, Lara Gunderson, MA, Mark Chaffin, & Lawrence A. Palinkas, PhD
  Leadership & Practice in the Face of Policy: How Supervisors & Providers Exercise Discretion in Evidence-Based Practice Implementation
  Lara Gunderson, MA, & Cathleen Willging

5:00-5:15   Closing Remarks, Presentation of Awards, & Updates to SIRC Website ..... Grand Ballroom

5:15-6:00   Strategic Planning Group (SPG) Members: Optional Session to Review Key Themes Identified in Conference & Areas on which to Follow Up ..... Grand Ballroom

6:15        SIRC Fun Run ..... Meet in Lobby




CONFERENCE TRACKS

The 2nd Biennial SIRC Conference identified eight themes related to implementation research represented in the presentations. To assist those who would like to see all talks with a specific theme, we have organized them into tracks. The talks that relate to each theme are listed below with their time and location.

Notes: If no Breakout is listed, the presentation is part of a symposium presented to the full conference in the Grand Ballroom. If the First Author is not presenting at the conference, the name is marked with an asterisk (*).

EBP CHAMPIONS
Presentations featuring clinicians or leaders from the practice community who have successfully implemented EBPs and champion them in their system and beyond.

Time | Room | Breakout | First Author | Title

THURSDAY
12:30 | President | A | Steinfeld | Is My Patient Getting Better? Implementation of Mental Health Progress Monitoring/Outcomes System in an Integrated Care Delivery System
12:50 | Chancellor | B | Bryan | Observed Barriers to Implementation of Empirically-Supported Treatments by Clinicians Working with Military & Veteran Patients
2:00 | President | F | Selove | Solving Research Dilemmas Related to Implementation Fidelity
2:00 | Grand | I | Best | Matching Training to Setting: A New Implementation Model for Dialectical Behavior Therapy

FRIDAY
10:15 | Grand |  | Kasoma | Implementation of TF-CBT in Zambia: Perspectives from Local Supervisors & Counselors
2:15 | President | P | Leroy | Implementation of TF-CBT Across Washington State
2:55 | Chancellor | Q | Moore | Research Implementation within a Clinical Practice: Resolving the Science/Practice Dialectic

FIDELITY

Time | Room | Breakout | First Author | Title

THURSDAY
8:15 | Grand |  | Simon | Should EBPs be Locally Grown or Factory Farmed
8:50 | Grand |  | Kramer | Implementation of School-Wide Positive Behavior Interventions & Supports
9:10 | Grand |  | Dorsey | Seeing is Believing: Behavioral Rehearsal Methodology
10:00 | Grand |  | Avolio | Sourcing & Transmitting Leadership to Optimize Organizational Change
12:30 | Chancellor | D | Kimber | Fidelity Measurements in the Real World: Feasibility of the BECCI & MITI for Motivational Interviewing in Child & Youth Mental Health
12:50 | Grand | D | Hurley | Comparisons Among Six Methods for Measuring Fidelity: Implications for Research & Practice
1:10 | Grand | D | Hanson | An Update on Project BEST: Challenges to Measuring Provider Fidelity
12:50 | Regent | E | Keith | A Qualitative Study of Fidelity: Understanding Variation in Implementation of the Patient-Centered Medical Home
1:10 | Regent | E | Richards | CADET: Clinical & Cost-Effectiveness of Collaborative Care for Depression in UK Primary Care: A Cluster RCT
2:00 | President | F | Selove | Solving Research Dilemmas Related to Implementation Fidelity



2:20 | Regent | H | Conover* | Challenges & Successes in Assessing Fidelity to the Critical Time Intervention (CTI) Model Over Time
3:30 | Grand |  | Atkins | Scaling Up Assessment of Therapist Fidelity in Motivational Interviewing: Preliminary Development of the AutoMITI

POSTERS (THURSDAY EVENING RECEPTION)
5:30 | Chancellor |  | Mason | Fidelity Assessment of Widely-Disseminated but Understudied Prevention Programs: A Framework & Illustration from the Common Sense Parenting Trial
5:30 | Grand |  | White | Development & Use of a Fidelity Checklist for Permanency Roundtables: A New Child Welfare Intervention
5:30 | Grand |  | Meisel | Comparing Self, Clinician, & Observer Reports of Cognitive Processing Therapy (CPT) Adherence

FRIDAY
10:15 | Grand |  | Kasoma | Implementation of TF-CBT in Zambia: Perspectives from Local Supervisors & Counselors
1:25 | Chancellor | K | Kaysen | Implementation of Cognitive Processing Therapy Provided by Community-Based Paraprofessionals in the Republic of Congo: Influence of Therapist Factors Randomized Clinical Trial
1:25 | President | L | Hartzler | Disseminating Contingency Management: A Training & Implementation Trial
12:45 | Grand | M | Atkins | Part I: Design & Analysis Challenges with Multilevel Implementation Data
12:45 | Regent | O | Steinman | Implementation of the Program to Encourage Active & Rewarding Lives for Seniors (PEARLS)
1:05 | Regent | O | Monroe-DeVita | Common Issues with Assessing Fidelity to Complex Multi-Modal Service Programs: Lessons Learned from Assessing Fidelity to the ACT Model
1:25 | Regent | O | Smith | Assessing Implementation Fidelity of the Family Check-Up: Development & Validation of the COACH Rating System
2:15 | President | P | Leroy | Implementation of TF-CBT Across Washington State
2:15 | College | R | Atkins | Part II: Design & Analysis Challenges with Multilevel Implementation Data
2:15 | Regent | T | Mancebo | Team-Based Exposure & Ritual Prevention for Adults with Obsessive Compulsive Disorder: An Open Trial Implemented in a CMHC

GLOBAL PERSPECTIVES

Time | Room | Breakout | First Author | Title

THURSDAY
12:30 | Chancellor | B | Sethi | Training in Triple P: Exploring Implementation Outcomes Across Practitioner Groups in the US, Australia, England, & Canada
12:30 | Grand | D | Kimber | Fidelity Measurements in the Real World: Feasibility of the BECCI & MITI for Motivational Interviewing in Child & Youth Mental Health
1:10 | Regent | E | Richards | CADET: Clinical & Cost-Effectiveness of Collaborative Care for Depression in UK Primary Care: A Cluster RCT
2:40 | Grand | I | Mildon | Designing an Implementation Strategy to Support the Multi-Site Scale-Up of an Evidence-Based, Culturally Appropriate Practice Model for Intensive Family Support Services Across the Northern Territory, Australia



POSTERS (THURSDAY EVENING RECEPTION)
5:30 | Grand |  | Aarons | Implementation of an HIV Preventive Intervention in Mexico: The Roles of Context, Organizational Structure & Process, & Community Violence
5:30 | Grand |  | Spielvogle | Utilization of the Hybrid Model to Evaluate an Adolescent Treatment Engagement Intervention

FRIDAY
10:15 | Grand |  | Kasoma | Implementation of TF-CBT in Zambia: Perspectives from Local Supervisors & Counselors
10:35 | Grand |  | Murray | Organizational Implementation Barriers & Facilitators for Mental Health Programs in Zambia: A Mixed-Methods Study
10:55 | Grand |  | Beidas | Mixed Methods Assessment of Implementation Barriers & Facilitators for Mental Health Programs in Zambia: Provider Level Themes
12:45 | Chancellor | K | Dorsey | Scaling Up Care for Orphans in Tanzania: A Task-Sharing Approach to Mental Health Treatment
1:05 | Chancellor | K | Murray | A Transdiagnostic Mental Health Intervention in Low Resource Countries: An Alternative Solution to Mental Health Implementation Challenges
1:25 | Chancellor | K | Kaysen | Implementation of Cognitive Processing Therapy Provided by Community-Based Paraprofessionals in the Republic of Congo
12:45 | College | N | Wiltsey Stirman | Overcoming Implementation Research Challenges while Studying CPT Training & Implementation Across Canada
1:25 | College | N | Rush | System Improvement Through Service Collaboratives: Closing Gaps & Improving Access & Coordination
3:05 | Chancellor | Q | Brown | Supporting Implementation of the Triple P System: A Standardized Framework

MEASUREMENT

Time | Room | Breakout | First Author | Title

THURSDAY
9:10 | Grand |  | Dorsey | Seeing is Believing: Behavioral Rehearsal Methodology
10:00 | Grand |  | Avolio | Sourcing & Transmitting Leadership to Optimize Organizational Change
12:30 | President | A | Steinfeld | Is My Patient Getting Better? Implementation of Mental Health Progress Monitoring/Outcomes System in an Integrated Care Delivery System
12:30 | College | C | Berman | Negotiating Implementation Science & Evaluation Research: Lessons Learned from a National Teen Pregnancy Prevention Implementation Study
12:30 | Chancellor | D | Kimber | Fidelity Measurements in the Real World: Feasibility of the BECCI & MITI for Motivational Interviewing in Child & Youth Mental Health
12:50 | Grand | D | Hurley | Comparisons Among Six Methods for Measuring Fidelity
1:10 | Grand | D | Hanson | An Update on Project BEST: Challenges to Measuring Provider Fidelity
1:10 | Regent | E | Richards | CADET: Clinical & Cost-Effectiveness of Collaborative Care for Depression in UK Primary Care: A Cluster RCT
2:20 | Regent | H | Conover* | Challenges & Successes in Assessing Fidelity to the Critical Time Intervention (CTI) Model Over Time



3:30 | Grand |  | Atkins | Scaling Up Assessment of Therapist Fidelity in Motivational Interviewing: Preliminary Development of the AutoMITI

POSTERS (THURSDAY EVENING RECEPTION)
5:30 | Grand |  | Martinez | A Multi-Level Framework for Implementation Science
5:30 | Grand |  | White | Development of an Assessment of Organizational Readiness for EBP Implementation in Public Child Welfare

FRIDAY
8:15 | Grand |  | Lewis | SIRC Instrument Review Taskforce: An Overview of Progress Made & Plans for the Future
1:25 | President | L | Hartzler | Disseminating Contingency Management: A Training & Implementation Trial
12:45 | Grand | M | Atkins | Part I: Design & Analysis Challenges with Multilevel Implementation Data
12:45 | Regent | O | Steinman | Implementation of the Program to Encourage Active & Rewarding Lives for Seniors (PEARLS)
1:05 | Regent | O | Monroe-DeVita | Common Issues with Assessing Fidelity to Complex Multi-Modal Service Programs: Lessons Learned from Assessing Fidelity to ACT
1:25 | Regent | O | Smith | Assessing Implementation Fidelity of the Family Check-Up: Development & Validation of the COACH Rating System
2:15 | College | R | Atkins | Part II: Design & Analysis Challenges with Multilevel Implementation Data
2:15 | Grand | S | Ruzek | Measuring an Evidence-Based Model of Implementation: Preliminary Development of a Survey Instrument
3:05 | Grand | S | Martinez | Measurement Issues in Implementation Science
3:25 | Grand | S | Saldana | Common Elements for Implementing Evidence-Based Practices in Children's Mental Health

SCALE-UP

Time | Room | Breakout | First Author | Title

THURSDAY
8:15 | Grand |  | Simon | Should EBPs be Locally Grown or Factory Farmed
8:30 | Grand |  | Wandersman | Bridge Over Troubled Waters: The Interactive Systems Framework for Dissemination & Implementation
12:30 | Chancellor | C | Trupin | Effective Implementation of EBP Legislation by Engaging Providers in a Coaching Process
1:10 | College | C | Brenner | Identifying the Needs of OEF/OIF Veterans with Traumatic Brain Injury (TBI) & Co-Occurring Behavioral Health Issues & their Families
1:10 | Grand | D | Hanson | An Update on Project BEST: Challenges to Measuring Provider Fidelity
12:30 | Regent | E | Solberg | Transformation & Spread of Primary Care Clinics into Medical Homes: It's Slow, Hard Work
1:10 | Regent | E | Richards | CADET: Clinical & Cost-Effectiveness of Collaborative Care for Depression in UK Primary Care: A Cluster RCT
2:40 | Regent | H | Zerger* | Assessing the Implementation of the Critical Time Intervention Across 20 Homeless-Service Agencies
2:40 | Grand | I | Mildon | Designing an Implementation Strategy to Support the Multi-Site Scale-Up of an Evidence-Based, Culturally Appropriate Practice Model for Intensive Family Support Services Across the Northern Territory, Australia
2:00 | College | J | Bruns | Evaluation of the Success of a Statewide EBP Scale-Up Project: The Children's Administration-University of Washington EBP Partnership



3:30 | Grand |  | Atkins | Scaling Up Assessment of Therapist Fidelity in Motivational Interviewing: Preliminary Development of the AutoMITI
4:10 | Grand |  | Landes | Dialectical Behavior Therapy Implementation Process & Outcomes in VA & Community Settings

FRIDAY
9:25 | Grand |  | Garcia | Racial/Ethnic Disparities & the Implementation of Evidence-Based Practices in Public Youth-Serving Systems
10:15 | Grand |  | Kasoma | Implementation of TF-CBT in Zambia: Perspectives from Local Supervisors & Counselors
12:45 | Chancellor | K | Dorsey | Scaling Up Care for Orphans in Tanzania: A Task-Sharing Approach to Mental Health Treatment
1:05 | Chancellor | K | Murray | A Transdiagnostic Mental Health Intervention in Low Resource Countries: An Alternative Solution to Mental Health Implementation Challenges
1:25 | Chancellor | K | Kaysen | Implementation of Cognitive Processing Therapy Provided by Community-Based Paraprofessionals in the Republic of Congo
12:45 | President | L | Zatzick | Scaling Up & Sustaining Alcohol & PTSD Screening & Intervention in US Trauma Care Systems
1:05 | President | L | Weingardt | Lessons Learned from Implementing a Web-Based Tool for Brief Alcohol Interventions in a Large Integrated Health Care System
12:45 | College | N | Wiltsey Stirman | Overcoming Implementation Research Challenges while Studying CPT Training & Implementation Across Canada
1:05 | College | N | Goldman | Financing & Scaling Up Early Intervention Services
1:25 | College | N | Rush | System Improvement Through Service Collaboratives: Closing Gaps & Improving Access & Coordination
2:15 | President | P | Leroy | Implementation of TF-CBT Across Washington State
3:05 | Chancellor | Q | Brown | Supporting Implementation of the Triple P System: A Standardized Framework
2:15 | Grand | S | Ruzek | Measuring an Evidence-Based Model of Implementation: Preliminary Development of a Survey Instrument
3:25 | Grand | S | Saldana | Common Elements for Implementing Evidence-Based Practices in Children's Mental Health
2:35 | Regent | T | Smith | Implementation of the Family Check-Up in Community Mental Health Agencies: Clinical Effectiveness, Fidelity, & Other Outcomes
3:45 | Grand |  | Hurlburt | Interagency Collaborative Teams for Capacity Building to Scale-Up Evidence-Based Practice
4:05 | Grand |  | Aarons | Collaboration, Negotiation, & Coalescence for Interagency-Collaborative Teams to Scale-Up Evidence-Based Practice
4:25 | Grand |  | Gunderson | Leadership & Practice in the Face of Policy: How Supervisors & Providers Exercise Discretion in Evidence-Based Practice Implementation

SUSTAINABILITY

Time | Room | Breakout | First Author | Title

THURSDAY
2:20 | Chancellor | G | DuBose | DBT Teams in Training 2008-2012: Implementation Follow-up in 2012
2:40 | Chancellor | G | Wiltsey Stirman | Understanding Modifications to CBT in Community Settings: A Comparison of Providers in Adult & Child Mental Health Settings



FRIDAY
12:45 | President | L | Zatzick | Scaling Up & Sustaining Alcohol & PTSD Screening & Intervention in US Trauma Care Systems
2:15 | President | P | Leroy | Implementation of TF-CBT Across Washington State
2:15 | Grand | Q | Beidas | Sustainability of CBT for Youth Anxiety in Community Settings Following Implementation
3:05 | Chancellor | Q | Brown | Supporting Implementation of the Triple P System: A Standardized Framework

TECHNOLOGY

Time | Room | Breakout | First Author | Title

THURSDAY
8:15 | Grand |  | Simon | Should EBPs be Locally Grown or Factory Farmed
12:30 | Chancellor | B | Sethi | Training in Triple P: Exploring Implementation Outcomes Across Practitioner Groups in the US, Australia, England, & Canada
2:20 | Regent | H | Conover* | Challenges & Successes in Assessing Fidelity to the Critical Time Intervention (CTI) Model Over Time
2:20 | Grand | I | Lyon | User-Centered Design & the Implementation of Evidence-Based Interventions
3:30 | Grand |  | Atkins | Scaling Up Assessment of Therapist Fidelity in Motivational Interviewing: Preliminary Development of the AutoMITI
3:50 | Grand |  | Koerner* | PracticeGround: An Online Platform to Help Therapists Learn, Implement, & Measure Impact of EBPs
4:10 | Grand |  | Landes | Dialectical Behavior Therapy Implementation Process & Outcomes in VA & Community Settings

POSTERS (THURSDAY EVENING RECEPTION)
5:30 | Grand |  | Green | Adapting a Research Tested Automated Electronic Health Record Intervention for Implementation in Safety Net Clinics

FRIDAY
1:05 | President | L | Weingardt | Lessons Learned from Implementing a Web-Based Tool for Brief Alcohol Interventions in a Large Integrated Health Care System
12:45 | College | N | Wiltsey Stirman | Overcoming Implementation Research Challenges while Studying CPT Training & Implementation Across Canada

TRAINING

Time | Room | Breakout | First Author | Title

THURSDAY
9:10 | Grand |  | Dorsey | Seeing is Believing: Behavioral Rehearsal Methodology
10:00 | Grand |  | Avolio | Sourcing & Transmitting Leadership to Optimize Organizational Change
12:30 | Chancellor | B | Sethi | Training in Triple P: Exploring Implementation Outcomes Across Practitioner Groups in the US, Australia, England, & Canada
12:50 | Chancellor | B | Bryan | Observed Barriers to Implementation of Empirically-Supported Treatments by Clinicians Working with Military & Veterans
2:20 | Chancellor | G | DuBose | DBT Teams in Training 2008-2012: Implementation Follow-up in 2012
2:00 | Regent | H | Herman* | From Inception to Practice: Taking an Evidence-Based Practice from Development to Implementation
2:00 | Grand | I | Best | Matching Training to Setting: A New Implementation Model for DBT



3:50 | Grand |  | Koerner* | PracticeGround: An Online Platform to Help Therapists Learn, Implement, & Measure Impact of EBPs
4:10 | Grand |  | Landes | Dialectical Behavior Therapy Implementation Process & Outcomes in VA & Community Settings

POSTERS (THURSDAY EVENING RECEPTION)
5:30 | Grand |  | Aarons | Implementation of an HIV Preventive Intervention in Mexico: The Roles of Context, Organizational Structure & Process, & Community Violence
5:30 | Grand |  | Landy | Improving Our Capacity for Evidence-Based PTSD Treatment: Developing an Effective Model of Post-Workshop Consultation

FRIDAY
10:15 | Grand |  | Kasoma | Implementation of TF-CBT in Zambia: Perspectives from Local Supervisors & Counselors
12:45 | Chancellor | K | Dorsey | Scaling Up Care for Orphans in Tanzania: A Task-Sharing Approach to Mental Health Treatment
1:25 | Chancellor | K | Kaysen | Implementation of Cognitive Processing Therapy Provided by Community-Based Paraprofessionals in the Republic of Congo
1:25 | President | L | Hartzler | Disseminating Contingency Management: A Training & Implementation Trial
12:45 | College | N | Wiltsey Stirman | Overcoming Implementation Research Challenges while Studying CPT Training & Implementation Across Canada
2:15 | President | P | Leroy | Implementation of TF-CBT Across Washington State
2:15 | Grand | Q | Beidas | Sustainability of CBT for Youth Anxiety in Community Settings Following Implementation
2:15 | Regent | T | Mancebo | Team-Based Exposure & Ritual Prevention for Adults with Obsessive Compulsive Disorder: An Open Trial in a CMHC



May 16. 8:15‐8:30
SHOULD EBPS BE LOCALLY GROWN OR FACTORY FARMED?
(MC: Kate Comtois, PhD, MPH)
Greg Simon, MD, MPH
Group Health Research Institute



SHOULD EBPS BE LOCALLY GROWN OR FACTORY FARMED?
Greg Simon, MD, MPH
Group Health Research Institute
Contact: simon.g@ghc.org

Efforts to implement evidence‐based psychosocial treatments have typically assumed that those treatments would be delivered by local providers. Recent research regarding care management and structured psychotherapy programs for depression should cause us to question this assumption. Effectiveness trials strongly support the fidelity and clinical effectiveness of centrally produced or “factory‐farmed” depression treatment programs. The limited data available suggest that centrally produced treatments clearly out‐perform “locally grown” models of depression care. For any specific treatment, the likelihood that a centralized delivery model will prove superior depends on two questions: First, can this treatment be provided over distance (via telephone, video conference, or some other telehealth medium) without significant loss of clinical effectiveness? Second, does local variation in this treatment lead to better outcomes or simply lower quality? Ultimately, the choice between centrally produced and locally produced mental health treatments should depend not on the needs or preferences of providers or researchers, but on the clinical benefits and value they provide to patients or consumers.

Track(s): Fidelity, Scale‐Up, Technology



May 16. 8:30‐9:45
INNOVATIVE APPROACHES FOR MAKING EBPS WORK
(MC: Cameo Borntrager, PhD)

Bridge Over Troubled Waters: The Interactive Systems Framework for Dissemination & Implementation
Abraham Wandersman, PhD
University of South Carolina

Implementation of School‐Wide Positive Behavior Interventions & Supports: The Influence of Emotion, Self‐Efficacy, & Organizational Commitment
Zed Kramer, MA, Molly K. McDonald, MA, Brandon Rennie, EdS, & Cameo Borntrager, PhD
University of Montana


Seeing is Believing: Behavioral Rehearsal Methodology
Shannon Dorsey, PhD, 1 Rinad Beidas, PhD, 2 & Wendi Cross, PhD 3
1 University of Washington; 2 University of Pennsylvania, Perelman School of Medicine; 3 University of Rochester Medical Center



BRIDGE OVER TROUBLED WATERS: THE INTERACTIVE SYSTEMS FRAMEWORK FOR DISSEMINATION & IMPLEMENTATION
Abraham Wandersman, PhD
University of South Carolina
Contact: wandersman@sc.edu

Reducing the gap between research and practice requires a practical framework that brings funders, practitioners, researchers/evaluators, and consumers together for effective implementation and scaling up. Abe Wandersman and colleagues, in collaboration with CDC Division of Violence Prevention staff, developed the Interactive Systems Framework for Dissemination and Implementation (ISF) (2008 and 2012 special issues of the American Journal of Community Psychology). The ISF is now widely cited and used in domains throughout public health and in education. The presentation will (a) introduce the ISF; (b) briefly describe one of the contributions of the ISF special issue, Practical Implementation Science; and (c) provide examples of bridge‐building, including at CDC and elsewhere.

Track(s): Scale‐Up



ASAP: Applying Science. Advancing Practice.

Understanding the Interactive Systems Framework for Dissemination and Implementation

State public health agencies and sexual assault coalitions have developed prevention plans with goals and objectives to prevent first time perpetration of sexual assault as part of the Rape Prevention Education (RPE) program. These plans were intended to prepare the way for successful and sustainable implementation of evidence-based prevention programs. However, very few of these evidence-based programs exist for the prevention of sexual violence. This lag in the development of evidence-based programs continues to challenge both sexual violence prevention researchers and practitioners.

The public health approach to violence prevention (Figure 1) uses four steps to systematically define the problem, identify risk and protective factors, develop and test prevention strategies, and finally, ensure widespread adoption. This model assumes that the tested interventions will be used in the field, but it provides very little information on how this should be accomplished.

The Interactive Systems Framework (ISF) for Dissemination and Implementation was developed to address the “how to” gap that exists between scientifically determining what works and moving that knowledge into the field for the benefit of the public.

Those who work in the various fields of violence prevention are motivated to develop, evaluate, disseminate, and implement effective strategies for preventing violence with the goal of building a safer, healthier society. Ideally we would select programs, practices, or policies that have been proven to be effective—meaning there is strong, scientific evidence that they work.

What is ASAP? Applying Science. Advancing Practice. (ASAP) is a series of informational briefs created by CDC’s Division of Violence Prevention to help apply scientific knowledge to the practice of primary prevention of violence.

Who is it for? This series of ASAPs is written for state health departments and statewide sexual assault coalitions, the current support system for rape prevention education activities. Sharing with other violence prevention partners is encouraged.

How will it help? ASAP offers specialized, topic‐specific information necessary for successful, sustainable violence prevention efforts. This particular series on sexual violence prevention is intended to provide information and resources for RPE grantees who provide prevention support to community-based prevention education activities.

National Center for Injury Prevention and Control, Division of Violence Prevention



Figure 1: The Public Health Approach to Violence Prevention and the ISF. [Figure: four steps in sequence: Define the Problem; Identify Risk and Protective Factors; Develop and Test Prevention Strategies; Ensure Widespread Adoption, with the ISF positioned between the last two steps.]

The Interactive Systems Framework (ISF) for Dissemination and Implementation provides a framework for understanding how to address the gap between the third and fourth stages of the public health approach to violence prevention, often referred to as the research to practice gap.

Figure 2: The Interactive Systems Framework for Dissemination and Implementation. [Figure: three interconnected systems: Implementing Prevention (Prevention Delivery System: General Capacity Use; Innovation‐Specific Capacity Use), Supporting the Work (Prevention Support System: General Capacity Building; Innovation‐Specific Capacity Building), and Distilling Information (Prevention Synthesis and Translation System: Synthesis; Translation), set within a climate shaped by Funding, Macro Policy, and Existing Research and Theory.]



The ISF was developed specifically with the fields of youth violence and child maltreatment prevention in mind, where much evidence has been gathered over the past several decades about what works and does not work. Despite this growing evidence, widespread use of these effective strategies has been less than ideal. The ISF addresses this gap by taking up questions such as:

• How do we achieve the widespread use of effective practices, policies, and programs to prevent violence?
• What infrastructures or systems are necessary to ensure that dissemination and implementation are carried out successfully?
• How do organizations and practitioners build the capacity necessary to bring effective violence prevention strategies to scale community‐wide?

One advantage the ISF offers to the sexual violence prevention field is a well‐thought‐out underlying process for how to move science to practice. By spending the time understanding these underlying processes now, the field will be better prepared to more rapidly move effective programs, practices, or policies into the hands of communities as they become available later.

A Closer Look at the Interactive Systems Framework

Figure 2 shows the ISF and how it connects three systems to work together for successful dissemination and implementation of prevention innovations. The term “system” is used broadly to describe a set of activities that accomplish one of the three identified functions that make dissemination and implementation possible. These systems are:

Prevention Synthesis and Translation System
Here scientific knowledge is distilled into understandable and actionable information. Research institutions, universities, and the Division of Violence Prevention (DVP) at CDC are all institutional examples of this system.

Prevention Support System
This system supports the work of the other two systems through building capacity for carrying out prevention activities. Agencies like CDC, state health departments, or state sexual assault coalitions are often in the role of prevention support for grantees or local programs.

Prevention Delivery System
This is where innovations are actually implemented or where “the rubber meets the road.” Community-based organizations often function in the role of the prevention delivery system.

As depicted in Figure 2, these three systems work together and are embedded within an underlying context that influences decision-making and adoption of prevention strategies. These underlying conditions include: legislation that supports funding for sexual assault prevention; the best available theory and research related to the prevention of sexual assault; the community and/or organizational context in which sexual assault strategies are implemented; and macro‐level policy factors such as state‐ or federal‐level budget constraints or legislative changes. These underlying considerations are graphically displayed as the climate in which the three systems exist, and all of them have an impact on successful dissemination and implementation. Each system within the ISF also builds upon or influences the functions of the other two systems. These relationships and influences are represented by the arrows that connect the systems to each other.

“If we keep doing what we are doing, we will keep getting what we are getting.” –Anonymous

For sexual violence prevention, where the research evidence is scant and still being built, the ISF can be especially helpful. What the ISF can do is take what we do know about effective prevention principles and processes and distill that knowledge into understandable concepts through the Prevention Synthesis and Translation System. The Prevention Support System builds the capacity of local organizations to put these prevention principles and processes into practice. The Prevention Delivery System serves to strengthen and deliver prevention principles and processes on the ground.

To illustrate how the ISF would function in the prevention of sexual violence, consider the following examples of activities that may occur within each system:



Distill (PSTS)
• Review and condense scientific literature on risk and protective factors for sexual violence.
• Translate research findings about risk and protective factors for sexual violence into user friendly language.

Support (PSS)
• Build the capacity of local organizations to develop strong leaders, understand how to use data, or form long-lasting partnerships.
• Provide training and technical assistance about specific prevention strategies.

Delivery (PDS)
• Implement sexual violence prevention strategies across a community.
• Support the spread and uptake of effective sexual violence prevention principles.
• Monitor and evaluate programmatic activities to further improve the program.

While the ISF includes activities or functions that are carried out by people in many different kinds of roles and within three distinct systems, these systems work together to distill, support, and deliver prevention strategies. By understanding the functions of these three systems and how they work together, organizations, stakeholders, funders, and practitioners can communicate better and work together to disseminate and more effectively implement prevention strategies.

You may have noticed that in the example above, many of the RPE grantee roles and/or functions showed up in the Prevention Support System. This makes sense because, as an RPE grantee, the role of state public health agencies and state‐level sexual assault coalitions is to provide support for local programs to ensure they can implement rape prevention education at the community level. These support activities can be seen as an important link between taking scientifically derived information and putting it into practice.

Future editions of ASAP will focus on the PSS in more detail. Specifically, they will describe how to understand the capacities necessary for individuals and organizations (which are linked through systems) to prevent sexual violence and build healthier and safer communities.


Key Terms
The following key terms are found throughout this brief.

Capacity: The ability, skills, and motivation to conduct and sustain prevention work at the individual, organizational, and systems levels. The ISF views capacity as carrying out important functions in two distinct ways:
• General Capacity – the capacity to implement or improve any programmatic strategy or activity.
• Innovation‐Specific Capacity – the capacity needed to plan, implement, evaluate, and sustain primary prevention strategies.

Dissemination: The intentional, targeted spreading of an innovation from the originators to the intended users, resulting in a targeted and facilitated process of distributing information and materials to organizations and individuals who want and can use them to improve health.

Implementation: A purposeful set of specific activities that result in individual or organizational use of an innovation.

Innovation: New prevention knowledge or information, such as a product, practice, program, policy, idea, research findings, or results.

Strategy: An approach to addressing a problem, such as the promotion of respectful relationships to reduce interpersonal violence.

Synthesis: A process for obtaining and summarizing scientifically derived information, including evidence of effectiveness (risk and protective factors, core elements, key features, etc.).

Translation: The process of converting scientific and technically complex research into everyday language and applicable/actionable concepts in the practice setting.

More Information
More information about the ISF can be found in the following article at www.cdc.gov/ViolencePrevention/sexualviolence/translation.html:

Wandersman, A., Duffy, J., Flaspohler, P., Noonan, R., Lubell, K., Stillman, L., et al. (2008). Bridging the gap between prevention research and practice: The Interactive Systems Framework for Dissemination and Implementation. American Journal of Community Psychology, 41(3‐4).

For more information, please contact:
Centers for Disease Control and Prevention
1-800-CDC-INFO • www.cdc.gov/violenceprevention • cdcinfo@cdc.gov


IMPLEMENTATION OF SCHOOL‐WIDE POSITIVE BEHAVIOR INTERVENTIONS & SUPPORTS: THE INFLUENCE OF EMOTION, SELF‐EFFICACY, & ORGANIZATIONAL COMMITMENT
Zed Kramer, MA, Molly K. McDonald, MA, Brandon Rennie, EdS, & Cameo Borntrager, PhD
University of Montana
Contact: zed.kramer@umontana.edu

Both organizational and individual factors are known to influence implementation. However, there is little research investigating the interaction of these influences. For example, Klimes‐Dougan et al. (2009) found counterintuitive results in a study examining factors related to fidelity: organizational culture was negatively related to fidelity, whereas low job satisfaction was positively related to fidelity. Additionally, research on the role of emotion suggests complex dynamics are at work. Emotional valence has been found to function as both a barrier and a facilitator to implementation (Choi et al., 2011; Peters et al., 2011). Self‐efficacy has also been identified as a factor involved in the implementation of evidence‐based practices (Aarons et al., 2012). This study further assesses the relationship between organizational initiatives that appear facilitative and individual factors presupposed to affect implementation. Using self‐report data from elementary school teachers, the relationships between emotion, self‐efficacy, and organizational commitment are investigated within the context of an ongoing implementation project, initially presented at SIRC 2011, involving School‐Wide Positive Behavior Interventions & Supports (SWPBIS). SWPBIS emphasizes prevention, data‐based decision‐making, and evidence‐based practices for addressing difficult behaviors in youth. Implications for the potential existence of recursive feedback between affect and self‐efficacy and its influence on organizational climate are discussed.

Track(s): Fidelity
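The interaction between an organizational factor and an individual factor described above is the kind of relationship a moderated regression can express. The sketch below regresses a hypothetical fidelity score on self‐efficacy, organizational commitment, and their product term; all variable names and values are simulated for illustration and are not study data.

```python
# Hypothetical moderated regression: fidelity ~ self-efficacy * commitment.
# Simulated teacher-level data for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 200
self_efficacy = rng.normal(0, 1, n)
commitment = rng.normal(0, 1, n)
fidelity = (0.3 * self_efficacy + 0.2 * commitment
            + 0.25 * self_efficacy * commitment + rng.normal(0, 1, n))

df = pd.DataFrame(dict(fidelity=fidelity, self_efficacy=self_efficacy,
                       commitment=commitment))

# "*" in the formula expands to both main effects plus the interaction term.
fit = smf.ols("fidelity ~ self_efficacy * commitment", data=df).fit()
print(fit.summary())
```

A significant interaction coefficient in a model of this form is one way to quantify the recursive or conditional effects the abstract raises; the study itself may use a different analytic approach.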



SEEING IS BELIEVING: BEHAVIORAL REHEARSAL METHODOLOGY
Shannon Dorsey, PhD, 1 Rinad Beidas, PhD, 2 & Wendi Cross, PhD 3
1 University of Washington; 2 University of Pennsylvania, Perelman School of Medicine; 3 University of Rochester Medical Center
Contact: dorsey2@uw.edu

Evidence‐based interventions (EBIs) are available for a variety of problems, populations, and settings. Nevertheless, intervention effectiveness hinges upon the skill level of the people implementing them, making cost‐effective training and supervision programs a critical focus for implementation science. Adult learning theory indicates that implementer skill development is most likely when active learning strategies, which include modeling and practice opportunities, are employed. In addition, behavioral rehearsal (BR) and standardized patient (SP) methods offer a more rigorous and objective means of assessing skill development and fidelity than commonly used strategies (e.g., self‐report), and a more cost‐effective strategy than audio or video coding. Understanding how to use BR and SP strategies for cost‐effective dissemination and implementation, and for more rigorous evaluation, is an important emerging research focus.

We present findings from three studies using similar BR methodology (Cross et al., 2007) as both a training and a fidelity assessment tool. The presentation will include an in‐depth focus on the use of BR in a CBT common elements training initiative in Washington State for child and adolescent depression, anxiety, trauma, and behavior problems. It will also include BR vignette examples, decision points in BR use (e.g., actor selection, coding strategy), actual audio/video of BR, and study outcomes.

Track(s): Fidelity, Measurement, Training



May 16. 10:00‐11:15
LEADERSHIP & IMPLEMENTATION
(MC: Adam Carmel, PhD)

Leadership & Implementation
Bruce J. Avolio, PhD
Center for Leadership & Strategic Thinking, Michael G. Foster School of Business, University of Washington

Taking a Lesson from Usual Care: Predictors of Use of Evidence‐Based Practices for Youth
Charmaine K. Higa‐McMillan, PhD
University of Hawaii at Hilo



SOURCING & TRANSMITTING LEADERSHIP TO OPTIMIZE ORGANIZATIONAL CHANGE
Bruce J. Avolio, PhD, Marion B. Ingersoll Professor, Executive Director, Center for Leadership & Strategic Thinking, Michael G. Foster School of Business, University of Washington
Contact: bavolio@uw.edu

The focus of my presentation will be on examining the various sources of leadership and how those sources are transmitted and contribute to or detract from individual, unit, and organizational transformation. Today, the source of leadership is recognized as not being associated only with the designated or formal leader. Indeed, the source of leadership can be a group or even a crowd! Moreover, leadership is now being distributed throughout organizations, communities, and nation states in ways that are creating opportunities for fundamental change in the way we configure our institutions and lead them in the 21st century, including for‐profit, not‐for‐profit, and government agencies. Participants can expect to learn the following from our discussion:

• Where their leadership comes from and how it can be effectively transmitted.
• How the source and transmission of leadership can drive more ownership at all organizational levels.
• What it means to consider the total leadership system in one’s organization and its development when engaging in organizational transformation.
• One insight that each participant can apply to his or her own leadership development.

Track(s): Fidelity, Measurement, Training



TAKING A LESSON FROM USUAL CARE: PREDICTORS OF USE OF EVIDENCE‐BASED PRACTICES FOR YOUTH
Charmaine K. Higa‐McMillan, PhD, 1 Ashley Usita, MA, 1 & Brad J. Nakamura, PhD 2
1 University of Hawaii at Hilo; 2 University of Hawaii at Manoa
Contact: higac@hawaii.edu

A growing body of research suggests that in addition to examining adoption of evidence‐based psychosocial interventions (EBPIs), studying practices in usual care might be a complementary approach to solving a number of implementation research dilemmas. In a recent study of usual care for youth with disruptive behavior problems, Brookman‐Frazee, Garland, et al. (2010) found few youth, family, and therapist characteristics that predicted use of EBPIs. The current study expanded on these findings by examining therapists (N=74) providing services for youth with anxiety, trauma, depressive, attention, and disruptive behavior problems (N=514) in a large public mental health system. Using multilevel modeling, this study examined therapist‐level characteristics (e.g., training background, theoretical orientation) that predict use of EBPIs after accounting for child‐level characteristics (e.g., age, gender, diagnosis, functional impairment, episode length). Results suggest that while no child‐level characteristics predict therapist use of EBPIs, child age, gender, functional impairment at episode start, and service type predict use of practices that do not have a strong evidence base. Further, while most therapist characteristics do not predict use of EBPIs, theoretical orientation accounts for a significant amount of variance in provider use of EBPIs with youth in usual care. Implications for implementation research will be discussed.

Track(s): None specific to this conference
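As a rough sketch of the kind of multilevel model described above, with youth nested within therapists and a therapist‐level random intercept, the snippet below fits a mixed‐effects regression on simulated data. All variable names (ebpi_use, orientation_cbt, etc.) and values are hypothetical placeholders, not data or code from the study.

```python
# Hypothetical two-level model: youth (level 1) nested within therapists
# (level 2), predicting a continuous index of EBPI use. Simulated data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_therapists, youth_per = 74, 7
therapist_id = np.repeat(np.arange(n_therapists), youth_per)
orientation_cbt = np.repeat(rng.integers(0, 2, n_therapists), youth_per)  # therapist-level
age = rng.integers(6, 18, therapist_id.size)                              # child-level
impairment = rng.normal(50, 10, therapist_id.size)                        # child-level
therapist_effect = np.repeat(rng.normal(0, 0.5, n_therapists), youth_per)
ebpi_use = 0.4 * orientation_cbt + 0.01 * age + therapist_effect + rng.normal(0, 1, therapist_id.size)

df = pd.DataFrame(dict(ebpi_use=ebpi_use, therapist_id=therapist_id,
                       orientation_cbt=orientation_cbt, age=age, impairment=impairment))

# Random intercept for therapist; fixed effects for child- and therapist-level predictors.
model = smf.mixedlm("ebpi_use ~ age + impairment + orientation_cbt",
                    data=df, groups=df["therapist_id"])
print(model.fit().summary())
```

The groups argument identifies the clustering unit, so variance in EBPI use is partitioned into between‐therapist and within‐therapist components before the fixed effects are interpreted.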



May 16. 12:30‐1:45
Breakout A: EBP Champion Symposium
(MC: Sara J. Landes, PhD)

Is My Patient Getting Better? Implementation of Mental Health Progress Monitoring/Outcomes System in an Integrated Delivery System
Bradley Steinfeld, PhD
Group Health Cooperative



IS MY PATIENT GETTING BETTER? IMPLEMENTATION OF MENTAL HEALTH PROGRESS MONITORING/OUTCOMES SYSTEM IN AN INTEGRATED DELIVERY SYSTEM
Bradley Steinfeld, PhD, Allie Franklin, MSSW, Mariam Sarikhan, MA, & Brian Mercer
Group Health Cooperative
Contact: steinfeld.b@ghc.org

Knowing whether patients are getting better is foundational to providing evidence‐based care. Implementing a progress monitoring system that clinicians can use effectively in real time to track progress at the individual patient level, while also providing feedback on mental health outcomes at the clinic or program level, is a complex endeavor. This symposium will describe the multiyear experience of a behavioral health department in an integrated delivery system in implementing a progress monitoring system. Strategies for engaging practicing clinicians in the use of a progress monitoring tool, as well as how to spread tool use across multiple locations and providers, will be discussed. A particular focus will be on how to integrate a progress monitoring system into an electronic medical record. The symposium will include multiple speakers providing perspectives on implementation of a progress monitoring system from the individual clinician, clinic manager, organizational leader, and information systems analyst.

Track(s): EBP Champions, Measurement
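To make the session‐level tracking concrete, the sketch below shows one minimal way a progress monitoring record might flag improvement from baseline. The PHQ‐9 measure, the 5‐point change threshold, and all names are illustrative assumptions rather than details of the system described in the abstract.

```python
# Minimal sketch of per-patient progress monitoring, assuming PHQ-9 scores
# recorded at each visit; measure, threshold, and names are illustrative.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProgressRecord:
    patient_id: str
    phq9_scores: List[int] = field(default_factory=list)  # one score per session

    def add_score(self, score: int) -> None:
        self.phq9_scores.append(score)

    def improved(self, threshold: int = 5) -> bool:
        # "Improved" here means the latest score is at least `threshold`
        # points below the first recorded score.
        if len(self.phq9_scores) < 2:
            return False
        return self.phq9_scores[0] - self.phq9_scores[-1] >= threshold

record = ProgressRecord("patient-001")
for score in (18, 15, 12, 9):
    record.add_score(score)
print(record.improved())  # True: a 9-point drop from baseline
```

In practice this kind of flag would sit alongside clinic‐ and program‐level aggregation inside the electronic medical record rather than in a standalone script.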



May 16. 12:30‐1:45
BREAKOUT B: LEARNING FROM IMPLEMENTATION OBSERVATION
(MC: Suzanne Kerns, PhD)

Training in Triple P (Positive Parenting Program): Exploring Implementation Outcomes Across Practitioner Groups in the United States, Australia, England, & Canada
Suvena Sethi, PhD
Parenting & Family Support Centre, University of Queensland, Australia

Factors Associated with the Adoption of a Mental Health Intervention for Autism Spectrum Disorders
Colby Chlebowski, PhD
University of California, San Diego

Observed Barriers to Implementation of Empirically‐Supported Treatments by Clinicians Working with Military & Veteran Patients
Craig J. Bryan, PsyD, ABPP, 1 & David S. Riggs, PhD 2
1 National Center for Veterans Studies; 2 Center for Deployment Psychology



TRAINING IN TRIPLE P (POSITIVE PARENTING PROGRAM): EXPLORING IMPLEMENTATION OUTCOMES ACROSS PRACTITIONER GROUPS IN THE UNITED STATES, AUSTRALIA, ENGLAND & CANADA
Suvena Sethi, PhD, & Matthew Sanders, PhD
Parenting & Family Support Centre, University of Queensland, Australia
Contact: s.sethi@uq.edu.au

The Triple P Positive Parenting Program is a multi‐level parenting and family support strategy that aims to reduce behavioral, emotional, and developmental problems in children by enhancing parental knowledge, skills, and confidence to promote positive and supportive relationships with their children. An integral element of the implementation of Triple P is practitioner training. This paper explores the results of an analysis of training outcomes of participants who received practitioner training in Level 4, Group Triple P (n=4080; for example, GPs, psychologists, nurses, teachers, and social and family support workers). A series of ANOVAs were conducted to analyze training outcomes across practitioner groups, including perceived adequacy of skills and proficiency in the delivery of Triple P strategies. While outcomes in each country show significant improvements in practitioner confidence in the delivery of positive parenting strategies, qualitative responses from participants also highlight the potential challenges practitioners face in accessing evidence‐based interventions. The successful implementation of a Triple P Online Training Program represents the next crucial step in program development. Discussion will draw on a public health model that engages a wider practitioner cohort and will include an emerging focus on translating Triple P provider training for an online implementation strategy. The opportunities and challenges of such a process will be discussed.

Track(s): Global Perspectives, Training, Technology
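The "series of ANOVAs" comparing training outcomes across practitioner groups corresponds to models like the one‐way analysis sketched below; the group labels and simulated confidence ratings are hypothetical and are not Triple P training data.

```python
# Hypothetical one-way ANOVA comparing a post-training confidence rating
# across practitioner groups; groups and values are simulated, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
groups = {
    "psychologists": rng.normal(5.8, 0.6, 120),
    "nurses":        rng.normal(5.5, 0.6, 150),
    "teachers":      rng.normal(5.3, 0.6, 110),
    "social_work":   rng.normal(5.6, 0.6, 140),
}

f_stat, p_value = stats.f_oneway(*groups.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

A significant F statistic would indicate that at least one practitioner group differs on the outcome, which would then motivate follow‐up pairwise comparisons.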



FACTORS ASSOCIATED WITH ADOPTION OF A MENTAL HEALTH INTERVENTION FOR AUTISM SPECTRUM DISORDERS
Colby Chlebowski, PhD, & Lauren Brookman‐Frazee, PhD
Department of Psychiatry, Child & Adolescent Services Research Center, University of California, San Diego
Contact: cchlebowski@ucsd.edu

Community and school‐based mental health (MH) programs play an important role in caring for school‐age children with Autism Spectrum Disorders (ASD). AIM HI (“An Individualized Mental Health Intervention for ASD”) was developed to address concerns about the quality of publicly‐funded MH care for this population. It is a package of evidence‐based (EB) strategies developed specifically for delivery in MH services and designed in collaboration with community stakeholders, based on a community needs assessment and data on EB strategies for this clinical population. This presentation will report findings related to intervention adoption from a large‐scale effectiveness and implementation study of AIM HI. Child MH programs from one large, geographically diverse county are being recruited to participate and randomized to immediate AIM HI implementation or a waitlist control condition. To date, 100% (n=18) of eligible programs approached to participate have enrolled in the study. Preliminary themes emerging from the program recruitment process highlight implementation facilitators at the system and organizational levels (e.g., system leader champions, program leader support based on perceived need for therapist ASD training, and strong implementation climate). Data from agency and program leader and therapist surveys reporting on their decisions to adopt AIM HI, implementation climate, leadership, and attitudes towards evidence‐based practices will be presented.

Track(s): None specific to this conference



OBSERVED BARRIERS TO IMPLEMENTATION OF EMPIRICALLY‐SUPPORTED TREATMENTS BY CLINICIANS WORKING WITH MILITARY & VETERAN PATIENTS
Craig J. Bryan, PsyD, ABPP, 1 & David S. Riggs, PhD 2
1 National Center for Veterans Studies; 2 Center for Deployment Psychology
Contact: craig.bryan@utah.edu

During the past two years, the National Center for Veterans Studies and the Center for Deployment Psychology have provided training workshops to over 9,000 military and civilian mental health professionals focused on several empirically‐supported treatments for psychiatric conditions of particular relevance to military and veteran patients: prolonged exposure (PE) and cognitive processing therapy (CPT) for PTSD, cognitive behavioral therapy for insomnia (CBTi), imagery rehearsal therapy (IRT) for nightmares, brief cognitive behavioral therapy (BCBT) for suicide risk, and cognitive behavioral therapy for chronic pain. Very few mental health professionals have been exposed to these treatments, and several common barriers to full adoption of these treatment models have been noted, including misconceptions about treatment safety and efficacy, strong commitment to unsupported theories and practices, and pseudoscientific thinking. Furthermore, adoption of these practices has remained limited despite these efforts. In this presentation, individual and systemic barriers to the implementation of these protocols will be discussed, as well as a few examples of successful adoption that may provide guidance for improving dissemination of empirically‐supported treatments across large and dispersed health care systems.

Track(s): EBP Champions, Training



May 16. 12:30‐1:45
BREAKOUT C: IMPLEMENTATION THROUGH COLLABORATIONS WITH POLICYMAKERS
(MC: Aaron R. Lyon, PhD)

Effective Implementation of EBP Legislation by Engaging Providers in a Coaching Process
Eric Trupin, PhD, & Gabrielle D’Angelo, MSW
University of Washington Division of Public Behavioral Health & Justice Policy

Negotiating Implementation Science & Evaluation Research: Lessons Learned from a National Teen Pregnancy Prevention Implementation Study
Jacqueline Berman, PhD
Mathematica Policy Research

Identifying the Needs of OEF/OIF Veterans with TBI & Co‐Occurring Behavioral Health Issues
Lisa Brenner, PhD, Jennifer Olson‐Madden, PhD, Bridget Matarazzo, PsyD, & Gina Signoracci, PhD
Veterans Integrated Service Network (VISN) 19 Mental Illness Research, Education, & Clinical Center (MIRECC)



EFFECTIVE IMPLEMENTATION OF EBP LEGISLATION BY ENGAGING PROVIDERS IN A COACHING PROCESS
Eric Trupin, PhD, & Gabrielle D'Angelo, MSW
Division of Public Behavioral Health & Justice Policy, University of Washington
Contact: dgabriel@uw.edu

Legislation to re‐define evidence‐based practice (EBP) and inventory all EBPs currently in practice in Washington was passed in Spring 2012 (HB 2536). Responding to HB 2536, the University of Washington Evidence‐Based Practice Institute (UW) and the Washington State Institute for Public Policy (WSIPP) engaged with community providers to overcome barriers to these policy changes. First, UW and WSIPP educated providers about the law and dialogued about its implications for practice as usual at all levels of practice. Next, UW engaged in a coaching process with selected providers to help them move their existing interventions into the promising or research‐based categories, or to prepare their organizations for implementation of current EBPs. Case studies of organizational evolution toward evidence‐based practice reflect the ongoing challenges at the national, state, and individual stakeholder levels. The strategies used by UW and WSIPP for engaging community providers offer extractable models for overcoming EBP implementation barriers across service systems. The case studies and strategies presented could suggest solutions to many of the “Implementation Research Dilemmas.” Common stakeholder needs for coaching include maintaining fidelity; accessibility of training, especially considering turnover rates and referral streams; and analysis of existing qualitative data to demonstrate the effectiveness of existing interventions.

Track(s): Scale‐Up



NEGOTIATING IMPLEMENTATION SCIENCE & EVALUATION RESEARCH: LESSONS LEARNED FROM A NATIONAL TEEN PREGNANCY PREVENTION IMPLEMENTATION STUDY
Jacqueline Berman, PhD, 1 & Ellen Kisker, PhD 2
1 Mathematica Policy Research; 2 Twin Peaks Partners, LLC
Contact: jberman@mathematica‐mpr.com

Implementation research seeks to guide the adoption, initial implementation, and continuous improvement of evidence‐based programs over time. Implementation science, which focuses on how to translate, replicate, adapt, and scale up evidence‐based programs or practices in “real world” settings, can serve as a key support to implementation research. This paper explores emergent lessons about the application of tools from implementation science to the design and execution of systematic, high‐quality implementation evaluations. Because implementation science evolved primarily in clinical settings, however, its practices require some negotiation and translation when applied to the evaluation of public policies and programs. This paper examines lessons learned with regard to how to (1) use concepts and tools from implementation science to identify key drivers and elements of implementation necessary for replication and scalability of effective programs; (2) select valid quantitative and qualitative measures of these elements; and (3) determine multiple, appropriate data sources for these measures. These lessons are drawn from the development of a conceptual framework and measures of core program elements for an in‐depth implementation study, embedded in a large‐scale impact evaluation of evidence‐based teen pregnancy prevention programs funded by the Administration for Children and Families, DHHS.

Track(s): Measurement



IDENTIFYING THE NEEDS OF OEF/OIF VETERANS WITH TRAUMATIC BRAIN INJURY (TBI) & CO‐OCCURRING BEHAVIORAL HEALTH ISSUES AND THEIR FAMILIES
Lisa Brenner, PhD, Jennifer Olson‐Madden, PhD, Bridget Matarazzo, PsyD, & Gina Signoracci, PhD
Veterans Integrated Service Network (VISN) 19 Mental Illness Research, Education, & Clinical Center (MIRECC)
Contact: lisa.brenner@va.gov

Background: Estimates of TBI among Soldiers deployed to Iraq and Afghanistan vary, and the long‐term implications of such injuries are not understood. In light of concerns regarding the behavioral health needs of Veterans with TBI, the U.S. Department of Health and Human Services Administration awarded the State of Colorado, in collaboration with the Veterans Integrated Service Network (VISN) 19 Mental Illness Research Education and Clinical Center (MIRECC), a grant to improve the community mental health system’s ability to provide services to individuals with TBI.
Methods: The MIRECC team has conducted focus groups with stakeholders throughout Colorado. In addition, a consensus conference with national experts specializing in the assessment and treatment of TBI and co‐occurring behavioral health issues was conducted.
Results: Data obtained from the focus groups will be shared along with findings from the consensus conference.
Conclusions: Utilizing the data obtained, researchers will work with Statewide Steering Committee (SSC) members to develop assessment and treatment guidelines. A training module and toolkit based on these guidelines will also be created. Additionally, the VISN 19 MIRECC and the SSC will develop a statewide Brain Injury Resource Team comprised of community mental health providers to act as internal facilitators for the best practices described above.

Track(s): Scale‐Up



May 16. 12:30‐1:45
BREAKOUT D: ADVANCING FIDELITY MEASUREMENT
(MC: Jennifer Villatte, PhD(c))

Fidelity Measurement in the Real World: Feasibility of BECCI & MITI for Motivational Interviewing in Child & Youth Mental Health
Melissa Kimber, MSW, PhD(c), Raluca Barac, MA, PhD, & Melanie Barwick, PhD, CPsych
Hospital for Sick Children, Toronto, Canada

Comparisons Among Six Methods for Measuring Fidelity: Implications for Research & Practice
Kristin Duppong Hurley, PhD
University of Nebraska‐Lincoln

An Update on Project BEST (Bringing Evidence‐Supported Treatments to South Carolina Children & Families): Challenges to Measuring Provider Fidelity
Rochelle F. Hanson, PhD
National Crime Victims Research & Treatment Center, Medical University of South Carolina



FIDELITY MEASUREMENT IN THE REAL WORLD: FEASIBILITY OF BECCI & MITI FOR MOTIVATIONAL INTERVIEWING IN CHILD & YOUTH MENTAL HEALTH
Melissa Kimber, MSW, PhD(c), Raluca Barac, MA, PhD, & Melanie Barwick, PhD, CPsych
Hospital for Sick Children, Toronto, Canada
Contact: melanie.barwick@sickkids.ca

Treatment fidelity is an essential element of good scientific research and clinical practice (Waltz et al., 1993). Treatment fidelity helps researchers and practitioners draw solid conclusions about treatment effects, inform on the implementation of evidence‐based practices (EBPs) so that cross‐site comparisons can be made, and monitor therapist training and the need for additional training and supervision. Despite its undisputed importance, treatment fidelity is often neglected in practice. Schulte et al. (2009) noted that fidelity monitoring is least likely to occur precisely when there is high risk of compromising fidelity, namely during EBP implementation. Fidelity neglect stems from a lack of theoretical knowledge and implementation guidelines and from the increased costs associated with its use (Perepletchikova et al., 2009). METHOD: We addressed the issue of fidelity in the context of implementing Motivational Interviewing (MI) in four child and youth mental health organizations in Ontario, Canada. Across the four organizations, 20 clinicians audiotaped monthly therapy sessions with their clients to capture practice at several time points: 3 months leading up to MI training, while receiving coaching support over 9 months, and 3 months following support. All sessions were coded using the Behavioral Change Counseling Index (BECCI; Lane et al., 2005), and 50% of the sessions were coded by an expert coder using the Motivational Interviewing Treatment Integrity scale (MITI; Moyers et al., 2005). Both instruments measure therapists’ competence in MI and have acceptable psychometric properties (Wallace et al., 2009). However, the two instruments differ in practical and economic respects: BECCI is a brief tool, scored in one pass and requiring minimal training, whereas MITI is more elaborate, has been shown to be reliable when used by expert raters, and requires relatively longer training. The brevity of the BECCI makes it a very appealing instrument for practice settings, with great potential for use in mental health settings to ensure MI fidelity. This is important because fidelity assessment will only become a reality if it is simple and practicable. To date, the BECCI has only been tested with data from simulated therapy sessions; how it stands up in real‐world clinical practice, and how it relates to other MI fidelity instruments, is less well known. Thus, the present study examined (a) the concordance between the MITI and BECCI and (b) the extent to which the two fidelity measures detect change in therapists’ competence following training and coaching in MI. These findings have significant implications for implementing fidelity checks as standard practice in mental health.

Track(s): Fidelity, Global Perspectives, Measurement
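Concordance between two fidelity instruments scoring the same sessions is commonly summarized with a correlation coefficient. The sketch below computes a Pearson correlation between paired BECCI‐style and MITI‐style session scores on simulated data; the study's actual coding and analytic procedures may differ.

```python
# Illustrative concordance check between two fidelity ratings of the same
# sessions (a brief scale and a more elaborate expert-coded scale).
# Values are simulated; this is not the study's analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_sessions = 40
true_skill = rng.normal(0, 1, n_sessions)          # latent MI competence per session
becci = 3.0 + 0.8 * true_skill + rng.normal(0, 0.4, n_sessions)
miti = 3.5 + 0.7 * true_skill + rng.normal(0, 0.4, n_sessions)

r, p = stats.pearsonr(becci, miti)
print(f"BECCI-MITI Pearson r = {r:.2f} (p = {p:.3f})")
```

A strong positive correlation would support using the briefer instrument routinely in practice settings, with the more elaborate one reserved for periodic expert checks.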



COMPARISONS AMONG SIX METHODS FOR MEASURING FIDELITY: IMPLICATIONS FOR RESEARCH & PRACTICE
Kristin Duppong Hurley, PhD, 1 & Mark Van Ryzin, PhD 2
1 University of Nebraska‐Lincoln; 2 Oregon Social Learning Center
Contact: kristin.hurley@unl.edu

Developing valid, reliable, and cost‐effective fidelity assessment tools that can be used in practice settings is a challenge for many evidence‐based interventions. One of the goals of this pilot study was to compare the psychometrics of six methods for measuring the fidelity of an adaptation of the Teaching Family Model. These six methods of assessing fidelity included external observations, internal‐agency observations, supervisor ratings, staff self‐ratings, youth ratings, and archival data. The study included 145 youth with disruptive behavior disorders, 120 direct‐care staff, and 16 supervisors. Fidelity process data and youth mental health outcome data were collected longitudinally. We will briefly discuss the findings of this pilot study comparing the fidelity measurement approaches. Individually, each fidelity approach had acceptable psychometric properties, and its ratings were correlated over time. However, the different assessment approaches were not strongly correlated with each other. Interestingly, the supervisor ratings showed some evidence of rater bias: by comparing the multiple assessments, certain supervisors were identified as likely being stricter in their ratings, causing an unexpected inverse correlation with youth outcomes. Only youth ratings of fidelity and some of the archival data were predictive of positive youth mental health outcomes. Implications and directions for future research will be discussed.

Track(s): Fidelity, Measurement
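Comparing multiple fidelity assessment methods typically starts with how strongly their ratings agree. The sketch below builds a correlation matrix across a few hypothetical rating sources for the same staff; the values are simulated and the set of methods is abbreviated for illustration.

```python
# Illustrative correlation matrix among several fidelity rating sources for
# the same staff; ratings are simulated, not data from the pilot study.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n_staff = 120
fidelity = rng.normal(0, 1, n_staff)  # latent fidelity per staff member

ratings = pd.DataFrame({
    "external_obs": 0.7 * fidelity + rng.normal(0, 0.7, n_staff),
    "supervisor":   0.4 * fidelity + rng.normal(0, 0.9, n_staff),
    "self_rating":  0.3 * fidelity + rng.normal(0, 1.0, n_staff),
    "youth_rating": 0.6 * fidelity + rng.normal(0, 0.8, n_staff),
})

print(ratings.corr().round(2))  # pairwise Pearson correlations among methods
```

Weak off‐diagonal correlations in a matrix like this are the pattern the abstract describes: each method can look acceptable on its own while the methods still disagree with one another.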



AN UPDATE ON PROJECT BEST (BRINGING EVIDENCE‐SUPPORTED TREATMENTS TO SOUTH CAROLINA CHILDREN & FAMILIES): CHALLENGES TO MEASURING PROVIDER FIDELITY
Rochelle F. Hanson, PhD, 1 Benjamin E. Saunders, PhD, 1 Libby Ralston, PhD, 2 Michael de Arellano, PhD, 1 & Angela Moreland, PhD 1
1 National Crime Victims Research & Treatment Center, Medical University of South Carolina; 2 Dee Norton Low Country Children's Center
Contact: hansonrf@musc.edu

This presentation provides an update on Project BEST (Bringing Evidence‐Supported Treatments to South Carolina children and families; funded by the Duke Endowment), an ongoing statewide initiative designed to support the dissemination and implementation of Trauma‐Focused Cognitive Behavioral Therapy (TF‐CBT). Project BEST utilizes the Community‐Based Learning Collaborative (CBLC) dissemination/implementation model to build community capacity to deliver and sustain trauma‐informed services to abused children and their families. Since its inception, we have completed three CBLCs, and two are nearing completion. These have involved 477 clinicians, brokers, and senior leaders from 105 different agencies serving 38 of South Carolina’s 46 counties. One of our key goals is to train mental health providers in the delivery of TF‐CBT, and an ongoing challenge is to determine the most feasible, cost‐effective ways to measure therapist fidelity to the model. After providing an update on Project BEST activities to date (e.g., total number of participants who completed all training requirements; pre/post treatment outcome data for TF‐CBT training cases), the focus will be on our measure of therapist fidelity to TF‐CBT, including factors associated with TF‐CBT fidelity, and preliminary findings on the relations between therapist fidelity and child/family treatment outcomes. The presentation will conclude with a discussion of challenges encountered in this statewide initiative, lessons learned, and future plans.

Track(s): Fidelity, Measurement, Scale‐up



May 16. 12:30‐1:45
BREAKOUT E: IMPLEMENTING PRIMARY CARE INTERVENTIONS
(MC: Maria Monroe‐DeVita, PhD)

Transformation & Spread of Primary Care Clinics into Medical Homes: It’s Slow, Hard Work
Leif I. Solberg, MD
HealthPartners Institute for Education & Research

A Qualitative Study of Fidelity: Understanding Variations in Implementation of the Patient‐Centered Medical Home
Rosalind Keith, PhD
Mathematica Policy Research

CADET: Clinical & Cost Effectiveness of Collaborative Care for Depression in UK Primary Care: A Cluster Randomized Controlled Trial
David A. Richards, PhD
University of Exeter, United Kingdom



TRANSFORMATION & SPREAD OF PRIMARY CARE CLINICS INTO MEDICAL HOMES: IT’S SLOW, HARD WORK
Leif I. Solberg, MD, Juliana Tillema, MPA, A. Lauren Crain, PhD, Robin Whitebird, PhD, & Patricia Fontaine, MD
HealthPartners Institute for Education & Research
Contact: leif.i.solberg@healthpartners.com

Although many advocates of patient‐centered medical homes talk as though transformation occurs quickly and fairly abruptly, there is very little evidence to support that. We have studied the first 135 primary care clinics in Minnesota to be certified as health care homes (our local term for medical homes) through a fairly rigorous review and certification process. Our data show that while these clinics have implemented fairly large changes in the practice systems needed to function as medical homes over a three‐year period, there has been relatively little change in standardized performance rates for diabetes and cardiovascular disease care. Moreover, while average performance rates for certified health care homes are higher than for uncertified primary care clinics, there is wide variation in both groups and considerable overlap. Based on intensive interviews with leaders in nine certified clinics and a survey of all of them, we have been able to identify the strategies and characteristics that most clearly differentiate the certified clinics with the greatest transformation from those with the least. This information has been translated into recommendations for what clinics should pay most attention to if they wish to truly transform and achieve Triple Aim results for their patients.

Track(s): Scale‐Up



A QUALITATIVE STUDY OF FIDELITY: UNDERSTANDING VARIATION IN IMPLEMENTATION OF THE PATIENT-CENTERED MEDICAL HOME
Rosalind Keith, PhD
Mathematica Policy Research, Inc.
Contact: rkeith@mathematica-mpr.com
Evaluating fidelity is particularly salient for complex, multi-faceted interventions, such as the patient-centered medical home (PCMH) model of care delivery, where poor implementation of different model components can compromise the effectiveness of the intervention as a whole in improving patient outcomes. This presentation will focus on the qualitative assessment of fidelity to the PCMH model. A comparative case analysis was conducted to examine fidelity to the PCMH in six primary care clinics affiliated with a large, academic, integrated health system. The clinics in the sample had similar organizational structures and resources (e.g., health information systems, incentives, quality initiatives centralized within the system, PCMH tools and processes, collaborative learning opportunities focused on PCMH implementation). Interview and observational data were analyzed deductively to assess variation in fidelity across the clinics. Categorical measures were constructed to reflect relative ratings of clinic-level fidelity for each PCMH component. The findings show that despite having similar organizational structures, considerable variation in fidelity to the various PCMH components existed across the six clinics.
Track(s): Fidelity
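The categorical, clinic-by-component ratings described above lend themselves to a simple cross-case matrix. The sketch below is a hypothetical Python illustration only (the clinic labels, PCMH components, and rating values are invented, not the study's data) of how such ratings might be tabulated to surface variation across sites.

    # Hypothetical illustration: organizing clinic-level fidelity ratings by PCMH component.
    import pandas as pd

    # Ordinal ratings (0 = low, 1 = moderate, 2 = high) assigned from coded interview and
    # observation data; clinic and component names are invented for this sketch.
    ratings = pd.DataFrame(
        {
            "team_based_care":   [2, 1, 0, 2, 1, 1],
            "care_coordination": [1, 1, 2, 0, 2, 1],
            "enhanced_access":   [2, 2, 1, 1, 0, 2],
        },
        index=[f"clinic_{i}" for i in range(1, 7)],
    )

    # Summarize variation across clinics for each component.
    summary = ratings.agg(["mean", "min", "max"]).T
    summary["range"] = summary["max"] - summary["min"]
    print(summary)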



CADET: CLINICAL & COST EFFECTIVENESS OF COLLABORATIVE CARE FOR DEPRESSION IN UK PRIMARY CARE: A CLUSTER RANDOMIZED CONTROLLED TRIAL
David A. Richards, PhD
University of Exeter, UK
Contact: d.a.richards@exeter.ac.uk
Background: Collaborative care for depression is effective in the US but effects are uncertain internationally.
Design: Multi-centre, cluster randomised controlled trial with two parallel group arms.
Results: 581 participants recruited from 49 primary care practices; 276 in collaborative care, 305 usual care. Participants in collaborative care had a mean reduction in depression score 1.33 PHQ-9 points greater (95% CI 0.35 to 2.31, p=0.009) compared to usual care (effect size = 0.26, 95% CI 0.07 to 0.46). Odds ratios for PHQ-9 scores below depression threshold (PHQ
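As a rough cross-check of the reported between-group difference (and not a substitute for the trial's cluster-adjusted analysis), the arithmetic below works backward from the published point estimate and 95% CI under a simple normal approximation; the implied standard deviation follows from the reported effect size of 0.26.

    # Back-of-the-envelope check of the reported difference, assuming a normal approximation.
    from scipy.stats import norm

    diff, ci_low, ci_high = 1.33, 0.35, 2.31     # PHQ-9 points, collaborative vs. usual care
    se = (ci_high - ci_low) / (2 * 1.96)         # standard error implied by the 95% CI
    z = diff / se
    p = 2 * (1 - norm.cdf(abs(z)))               # two-sided p ~ 0.008, consistent with the reported p=0.009
    implied_sd = diff / 0.26                     # SD implied by the reported effect size of 0.26
    print(f"SE={se:.2f}, z={z:.2f}, p={p:.3f}, implied SD={implied_sd:.1f}")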


May 16. 2:00-3:15
BREAKOUT F: EBP CHAMPION SYMPOSIUM
(MC: Cara C. Lewis, PhD, & Cameo Borntrager, PhD)
Solving Research Dilemmas Related to Implementation Fidelity
Rebecca Selove, PhD, MPH, Kathryn Mathes, BSN, RN, MS, PhD, & Heather Wallace, PhD
Centerstone Research Institute



SOLVING RESEARCH DILEMMAS RELATED TO IMPLEMENTATION FIDELITY
Rebecca Selove, PhD, MPH, Kathryn Mathes, BSN, RN, MS, PhD, & Heather Wallace, PhD
Centerstone Research Institute
Contact: rebecca.selove@centerstone.org
The Interactive Systems Framework (ISF) offers a broad context within which a variety of implementation projects can be described and improved. Centerstone Research Institute (CRI) is uniquely positioned to evaluate implementation processes and outcomes for a diverse range of community-based health improvement projects. Three CRI-evaluated programs were selected for systematic retrospective review using the ISF to identify barriers to and facilitators of implementation with fidelity. The services provided by these programs include (a) community development of a system of care for transition-age youth, (b) intensive in-home intervention with parents involved in the judicial system in connection with substance abuse histories, and (c) a school-based educational program to reduce risk of teen pregnancy. Data came from staff observations, evaluation records, and interviews with program managers and service providers. The symposium will provide an overview of potentially critical elements of fidelity related to each of the three systems of the ISF and a description of the investigation methodology. Presenters will highlight challenges to researchers in the Translation and Synthesis System and their implications for implementation planning. The three panelists will discuss cross-cutting approaches to research dilemmas related to planning for implementation fidelity in a variety of programs and services.
Track(s): EBP Champions, Fidelity



May 16. 2:00-3:15
BREAKOUT G: SUSTAINABILITY & ADAPTATION IN SOCIAL SERVICES
(MC: Maria Monroe-DeVita, PhD)
Implementation Strategies in Social Service Settings: A Research Agenda
Byron J. Powell, AM
Washington University in St. Louis
DBT Teams in Training 2008-2011: Implementation Follow-up in 2012
Anthony DuBose, PsyD, 1 & André Ivanoff, PhD 1,2
1 Behavioral Tech, LLC; 2 Columbia University
Understanding Modifications to CBT in Community Settings: A Comparison of Providers in Adult & Child Mental Health Service Settings
Shannon Wiltsey Stirman, PhD
VA Boston Healthcare System, National Center for PTSD, & Boston University



IMPLEMENTATION STRATEGIES IN SOCIAL SERVICE SETTINGS: A RESEARCH AGENDA
Enola K. Proctor, PhD, 1 Byron J. Powell, AM, 1 & J. Curtis McMillen, PhD 2
1 Washington University in St. Louis; 2 The University of Chicago
Contact: bjpowell@wustl.edu
The prioritization of implementation research has yielded conceptual, empirical, and methodological advances that contribute to our understanding of the structures, processes, and outcomes of implementation. This is perhaps an ideal time to generate a rich set of research questions pertaining to the use of implementation strategies. This paper draws upon both the published literature and ongoing implementation research to demonstrate the challenges and opportunities associated with a number of these key questions, including:
• How can community stakeholders inform research on implementation strategies?
• Who are the appropriate stakeholders to be deploying implementation strategies?
• To what extent are implementation strategies generalizable?
• Are tailored implementation strategies more effective than other approaches?
• Can the methods for selecting and designing implementation strategies be strengthened?
• Can implementation strategies be adequately specified?
• How can we disentangle the mutative factors of multifaceted implementation strategies?
• How can (inexpensive) technologies be harnessed in developing innovative implementation strategies?
• What is the economic impact of implementation strategies?
• Can we develop "learning organizations" and evidence-based systems of care?
In exploring these questions, we identify associated methodological challenges and innovative approaches to addressing them, and suggest promising directions for future studies aimed at increasing our knowledge of how to implement effective evidence-based treatments.
Track(s): None specific to this conference



DBT TEAMS IN TRAINING 2008-2011: IMPLEMENTATION FOLLOW-UP IN 2012
Anthony DuBose, PsyD, 1 André Ivanoff, PhD, 1,2 Erin Miga, PhD, 1,3 Linda Dimeff, PhD, 4 & Marsha Linehan, PhD 3
1 Behavioral Tech, LLC; 2 Columbia University; 3 Behavioral Research & Therapy Clinics, University of Washington; 4 Portland DBT Institute
Contact: apdubose@behavioraltech.org
To address the challenges of implementing evidence-based therapies in large systems, Behavioral Tech, LLC (BTECH) has begun a significant evaluation of team- and system-based DBT implementations toward developing improved methods and outcomes. As part of this work, a pilot study drew a random sample of 50% (n=77) of the teams completing BTECH intensive training from 2008-2011 (n=154). Mixed-methods data collection was used; an online survey including the DBT Program Elements of Treatment Questionnaire (PETQ: Schmidt, Ivanoff & Linehan, 2009) was followed by a semi-structured telephone interview to discuss team leader questions, program goals, and next steps. Data reveal that most programs offer individual treatment, skills training, a weekly DBT consultation team, and after-hours coaching. Approximately half of programs provide regular DBT supervision. Thirty-nine percent of programs conduct manual-based self-assessment of DBT adherence, although fewer (18%) provide adherence data to staff and/or clients for purposes of quality improvement. Significant implementation barriers included time constraints and staff turnover. Team leads identified "careful selection of intensive team members" and "sending more team members to intensive training" as ways to improve program functioning. In sum, while the majority of programs deliver all DBT modes, onsite DBT supervision and fidelity/adherence assessment are insufficient.
Track(s): Training, Sustainability



UNDERSTANDING MODIFICATIONS TO CBT IN COMMUNITY SETTINGS: A COMPARISON OF PROVIDERS IN ADULT & CHILD MENTAL HEALTH SERVICE SETTINGS
Shannon Wiltsey Stirman, PhD, 1,2 Rinad Beidas, PhD, 3 Christopher Miller, PhD, 4 Julie Edmunds, MA, 5 Mary Margaret Downey, BA, 3 Matthew Gallagher, BA, 3 Philip Kendall, PhD, ABPP, 5 Katherine Toder, 3 & Amber Calloway 6
1 National Center for PTSD; 2 VA Boston Healthcare System & Boston University; 3 University of Pennsylvania; 4 VA Center for Leadership, Organization, & Management Research; 5 Temple University; 6 University of Massachusetts at Boston
Contact: sws@bu.edu
Little is known about the modifications to CBT that providers make following training and consultation. To optimize clinician implementation and sustained use of CBT, it is necessary to investigate provider perspectives on the modifications needed to make CBT feasible and sustainable in a community setting. However, a challenge to understanding the nature and implications of modifications to EBPs is a lack of consistency or comprehensiveness in the classification of adaptations. This study addresses that challenge with both adult and child providers in the Philadelphia community, using a framework developed to characterize adaptations made to evidence-based interventions. The present study examines follow-up interviews conducted two years after the CBT training and consultation provided in Beidas et al. (2012) and Stirman et al. (2010, 2012) with adult (n=30) and child (n=50) mental health providers. The same coding system was used for both samples to examine modifications made to CBT, facilitating comparisons between groups. The findings from this study will provide much-needed insight into whether and how providers modify evidence-based treatments to make them more usable in their settings. Implications for CBT implementation and for further research will be discussed.
Track(s): Sustainability



May 16. 2:00-3:15
BREAKOUT H: IMPLEMENTATION OF CRITICAL TIME INTERVENTION
(MC: Meghan Keough, PhD)
From Inception to Practice: Taking an Evidence-Based Practice from Development to Implementation
Challenges & Successes in Assessing Fidelity to the CTI Model Over Time
Assessing the Implementation of the Critical Time Intervention Model Across 20 Homeless-Service Agencies
R. Neil Greene, PhD, & Melissa Martin, MSW
Center for Social Innovation, Needham, MA



FROM INCEPTION TO PRACTICE: TAKING AN EVIDENCE-BASED PRACTICE FROM DEVELOPMENT TO IMPLEMENTATION
Daniel Herman, PhD, 1 Sarah Conover, MS, 1 Jeff Olivet, MA, 2 & Melissa Martin, MSW 2
1 Critical Time Intervention Global Network, Silberman School of Social Work at Hunter College, New York, NY; 2 Center for Social Innovation, Needham, MA
Contact: mmartin@center4si.com
Critical Time Intervention (CTI) is a time-limited case management model designed to prevent homelessness and other adverse outcomes in people with severe mental illness (SMI) during periods of transition in their lives, such as following discharge from hospitals, shelters, prisons and other institutions. During transitions, people often have difficulty re-establishing themselves in satisfactory living arrangements with access to needed supports. CTI provides focused, time-limited case management assistance during this critical period and can have enduring positive impacts. In this session, presenters will provide an overview of the CTI model and how it was developed to fit an urgent need, as well as the subsequent experimental and quasi-experimental research on the model in a variety of settings. Presenters will describe effective approaches used to provide consultation and training of individuals and agencies for adaptation and implementation of the CTI model, and current collaborations that are enabling the model to be brought to scale to broader national and international audiences.
Track(s): Training



CHALLENGES & SUCCESSES IN ASSESSING FIDELITY TO THE CTI MODEL OVER TIME
Sarah Conover, MS, 1 Suzanne Zerger, PhD, 2 & R. Neil Greene, PhD 2
1 Critical Time Intervention Global Network, Office of Scholarship & Research, Silberman School of Social Work at Hunter College, New York, NY; 2 Center for Social Innovation, Needham, MA
Contact: mmartin@center4si.com
In this session, presenters will describe the development of the original "fidelity scale" for the CTI model and share considerations of how to assess fidelity within existing case management paperwork practices and protocols. The fidelity scale was developed by Sarah Conover at the Silberman School of Social Work at Hunter College. The scale has been tested and used to assist numerous agencies in implementing the model in a wide range of settings and with increasingly diverse populations. In our recently completed Phase II SBIR study, Sarah Conover helped the research team modify the scale and the accompanying case management tracking forms. These tools were adapted to "fit" the needs and practices of the 20 homeless-service agencies enrolled in the study, and to enable tracking of both provider- and client-level outcomes. We describe challenges and successes in helping these agencies adapt the fidelity measures to their existing practices, and in collecting, analyzing, interpreting, and sharing results from the fidelity scale. With funding from this study, the research team was able to adapt the pen-and-paper fidelity assessment process into an electronic format; we describe our development of a user-friendly electronic fidelity assessment product that agencies can employ when they choose to learn and implement the CTI model.
Track(s): Fidelity, Measurement, Technology



ASSESSING THE IMPLEMENTATION OF THE CRITICAL TIME INTERVENTION MODEL ACROSS 20 HOMELESS-SERVICE AGENCIES
Suzanne Zerger, PhD, R. Neil Greene, PhD, Melissa Martin, MSW, & Rachael R. Kenney, MA
Center for Social Innovation, Needham, MA
Contact: mmartin@center4si.com
With funding from a Phase II Small Business Innovation Research (SBIR) grant from the National Institute of Mental Health, the Center for Social Innovation developed an online multi-media training on CTI which incorporates a Community of Practice approach to encourage peer-based learning. The primary aim of this longitudinal, randomized controlled study was to compare and contrast this online training modality with face-to-face training on implementation of and fidelity to the CTI model over time. In this presentation, we describe research methodologies and lessons learned in our exploration and documentation of the implementation (and adaptation) experiences of 20 diverse homeless-service agencies across the U.S. and nearly two hundred direct service providers engaged in implementing the model over the course of one year. We describe challenges and promising approaches used to assess the "readiness" of the agencies to implement the model prior to training their staff, and to document implementation facilitators and barriers from the perspectives of agency administrators and direct service providers through surveys, interviews, fidelity tracking forms, and in-depth case studies. We also discuss strategies and challenges associated with assessing impacts on clients enrolled in the model.
Track(s): Scale-Up



May 16. 2:00-3:15
BREAKOUT I: MATCHING IMPLEMENTATION TO SETTING
(MC: Sara J. Landes, PhD)
Matching Training to Setting: A New Implementation Model for Dialectical Behavior Therapy
Helen Best, MEd, 1 Kate Comtois, PhD, MPH, 2 Nancy A. McDonald, MS, CADC, LPC, 3 & Jamie F. Edwards, LCSW, CMFSW 4
1 Treatment Implementation Collaborative, LLC; 2 University of Washington; 3 Chester County Department of Human Services; 4 Community Care Behavioral Health
User-Centered Design & the Implementation of Evidence-Based Interventions
Aaron R. Lyon, PhD
University of Washington
Designing an Implementation Strategy to Support the Multi-Site Scale-Up of an Evidence-Based, Culturally Appropriate Practice Model for Intensive Family Support Services Across the Northern Territory, Australia
Robyn Mildon, PhD
Knowledge Exchange & Implementation, Parenting Research Centre, Australia



MATCHING TRAINING TO SETTING: A NEW IMPLEMENTATION MODEL FOR DIALECTICAL BEHAVIOR THERAPY
Helen Best, MEd, 1 Katherine Anne Comtois, PhD, MPH, 2 Nancy A. McDonald, MS, CADC, LPC, 3 & Jamie F. Edwards, LCSW, CMFSW 4
1 Treatment Implementation Collaborative, LLC; 2 University of Washington; 3 Chester County Department of Human Services; 4 Community Care Behavioral Health
Contact: hbest@ticllc.org
While Dialectical Behavior Therapy (DBT) has been widely disseminated, most large-scale system initiatives have focused on training clinicians to adherence and on how to integrate the EBP into standard system structures. The Treatment Implementation Collaborative, LLC (TIC) is testing a new model of implementation that is organized around how systems implement a new treatment rather than how clinicians are trained in the treatment. It is no accident that DBT skills training is the component most often misconstrued by clinicians and consumers as comprehensive DBT, as it is the component of DBT that is most accessible to a broad audience. With this in mind, the implementation model being tested by TIC focuses first on laying a solid foundation: administrative orientation for leadership in conjunction with a solid base in DBT skills training for clinicians and programs. Once this core component is in place, clinicians and teams are trained to implement all modes of comprehensive DBT. This presentation will highlight TIC's model for implementation using a case example and data from the implementation of DBT across 11 teams in three counties in PA, including Chester County.
Track(s): EBP Champions, Training



USER-CENTERED DESIGN & THE IMPLEMENTATION OF EVIDENCE-BASED INTERVENTIONS
Aaron R. Lyon, PhD
University of Washington
Contact: lyona@uw.edu
A well-documented "research-to-practice gap" exists in which evidence-based interventions (EBI) are unlikely to be adopted by mental health practitioners working in community settings, limiting their public health impact. This presentation discusses how the design of EBI detracts from their ability to be effectively implemented on a large scale. Although EBI frequently produce robust effects for well-specified problems, their design is characterized by excessive complexity, inflexibility (e.g., fidelity requirements), and steep learning curves. In this way, EBI can be said to be very well engineered (functional and able to produce their intended outcome), but badly designed.
This presentation will draw from the literature on user experience and user-centered design to address EBI design as a key implementation issue. To date, the mental health research community has done relatively little to ensure that existing EBI are appealing and accessible to their target audience. A variety of design heuristics and principles of good design will be applied to the construction of EBI with the goal of better meeting the needs of the end user (i.e., mental health practitioners). These include building EBI that are more readily learnable, demonstrate functional minimalism, decrease a user's cognitive load, and exploit the natural constraints of the context of use. Examples drawn from ongoing projects initiated to create contextually appropriate and usable supports for quality improvement in school-based mental health will also be presented.
Track(s): Technology



DESIGNING AN IMPLEMENTATION STRATEGY TO SUPPORT THE MULTI-SITE SCALE-UP OF AN EVIDENCE-BASED, CULTURALLY APPROPRIATE PRACTICE MODEL FOR INTENSIVE FAMILY SUPPORT SERVICES ACROSS THE NORTHERN TERRITORY, AUSTRALIA
Robyn Mildon, PhD, 1 & Fiona Arney, PhD 2
1 Director of Knowledge Exchange & Implementation, Parenting Research Centre; 2 Director, Australian Centre for Child Protection, University of South Australia
Contact: rmildon@parentingrc.org.au
In recent years, we have seen growth in Australia in the funding and delivery of "Intensive Family Support Services" for vulnerable families in an effort to improve the health, safety and wellbeing of children and prevent family involvement in our child protection system, including out-of-home care. Despite this trend, few services adopt a coherent, evidence-based program model, effective and full implementation is rarely reached or sustained, and little evaluation is done in child protection and family support on any large scale.
Family support service providers and policy makers seeking an evidence-based Practice Model will find that little has been written for a community service setting about how to make the critical decision of choosing and refining a comprehensive Model with the potential to significantly impact child and family outcomes. Furthermore, there are relatively few comprehensive guidelines on quality implementation of practice specific to intensive family support. A framework based on the Quality Implementation Framework (Meyers, Durlak & Wandersman, in press) and the work of the National Implementation Research Network (NIRN) is being applied to support the scale-up of a purpose-built, evidence-based, culturally appropriate Practice Model in multiple health and child welfare service delivery sites across the Northern Territory, Australia.
This paper will describe in detail the implementation framework being applied, the implementation support strategies utilized to date to achieve early practice change and the effect these have had, and the conceptual model being used to guide the evaluation of this work.
Track(s): Global Perspectives, Scale-Up



May 16. 2:00-3:15
BREAKOUT J: RESEARCH-COMMUNITY RELATIONSHIPS
(MC: Suzanne Kerns, PhD)
Evaluating the Success of a Statewide EBP Scale-Up Project: The Children's Administration-University of Washington EBP Partnership
Eric Bruns, PhD, 1 Andrea Negrete, MEd, 1 & Tammy Cordova, MSW 2
1 University of Washington; 2 Washington State Children's Administration
Reviewing the Use of Research-Community Partnerships to Facilitate Implementation of Evidence-Based Practices in Children's Community Services
Nicole Stadnick, MS, MPH
SDSU/UCSD Joint Doctoral Program in Clinical Psychology & Child & Adolescent Services Research Center
Developing the Autism Model of Implementation for ASD Community Providers: Use of a Research-Community Partnership
Amy Drahota, PhD
San Diego State University & Child & Adolescent Services Research Center



EVALUATION OF THE SUCCESS OF A STATEWIDE EBP SCALE-UP PROJECT: THE CHILDREN'S ADMINISTRATION-UNIVERSITY OF WASHINGTON EBP PARTNERSHIP
Eric Bruns, PhD, 1 Eric Trupin, PhD, 1 Suzanne Kerns, PhD, 1 Sarah Walker, PhD, 1 Andrea Negrete, MEd, 1 Rima Ellard, MSW, 1 Tim Kelly, 2 & Tammy Cordova, MSW 2
1 University of Washington; 2 Washington State Children's Administration
Contact: ebruns@uw.edu
In 2012, the Washington State Children's Administration (CA) and the University of Washington (UW) Division of Public Behavioral Health and Justice Policy (PBHJP) launched an initiative to expand the availability and utilization, and improve the fidelity and outcomes, of evidence-based practices (EBPs) relevant to the core child welfare outcomes of child safety, permanency, and well-being. This presentation will describe how key theoretical models for implementation science (e.g., Fixsen et al., 2005; Proctor et al., 2004; Shortell et al., 2009) were used to (1) create a logic model for the project, (2) develop core implementation strategies, (3) identify key indicators of success, and (4) keep the project and its collaborators organized as the Partnership evolved over time. The presentation will then present data on our identified indicators of success, such as the number of providers trained, number of agencies implementing EBPs, rate of EBP referrals, rate of providers meeting criteria for fidelity, rate of billed units of service, and associations between EBPs and child welfare outcomes. We will conclude with a discussion of how these results point to the successes and challenges faced by this innovative state-academic partnership.
Track(s): Scale-Up
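Indicators like those named above are typically simple aggregates over provider-level records. The sketch below is hypothetical (the field names and values are invented, not the Partnership's data) and only illustrates how a few such indicators could be computed.

    # Hypothetical illustration of computing scale-up indicators from provider-level records.
    import pandas as pd

    providers = pd.DataFrame(
        {
            "provider_id":   [1, 2, 3, 4, 5],
            "trained":       [True, True, True, True, False],
            "ebp_referrals": [12, 4, 0, 9, 0],
            "met_fidelity":  [True, False, True, True, False],
        }
    )

    indicators = {
        "n_providers_trained": int(providers["trained"].sum()),
        "referrals_per_provider": providers["ebp_referrals"].mean(),
        "pct_trained_meeting_fidelity": 100 * providers.loc[providers["trained"], "met_fidelity"].mean(),
    }
    print(indicators)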



REVIEWING THE USE OF RESEARCH-COMMUNITY PARTNERSHIPS TO FACILITATE IMPLEMENTATION OF EVIDENCE-BASED PRACTICES IN CHILDREN'S COMMUNITY SERVICES
Nicole Stadnick, MS, MPH, 1,2 Lauren Brookman-Frazee, PhD, 2,3 Aubyn Stahmer, PhD, 2 Amy Herschell, PhD, 4 & Ann Garland, PhD 2,5
1 SDSU/UCSD Joint Doctoral Program in Clinical Psychology; 2 Child & Adolescent Services Research Center; 3 Department of Psychiatry, University of California, San Diego; 4 University of Pittsburgh School of Medicine; 5 University of San Diego
Contact: nstadnic@ucsd.edu
Research-community partnerships (RCPs) are increasingly conceptualized in implementation models as critical to facilitating the uptake and sustainability of evidence-based practices (EBPs) in community-based services. There are a growing number of RCPs in the field of mental health services, particularly within pediatric service settings. The purpose of this study is to examine RCPs that have been used to adapt EBP interventions, training, and broader implementation models to address mental health and behavioral issues for children served in community-based service systems. Through a comprehensive literature and grants search, independent review, and consensus coding, 38 studies using RCPs for these purposes were identified. A web-based survey completed by project principal investigators and community partners will be used to characterize the use of RCPs by examining: (1) characteristics of research studies using RCP models; (2) RCP functioning, processes, and products; (3) processes of tailoring EBPs for implementation in the community; and (4) investigator perceptions of the benefits and challenges of collaborating with community providers and consumers. Respondents included 28 PIs and community partners reporting on 18 studies involving RCPs. Survey data were analyzed using mixed quantitative and qualitative methods. Common themes may inform future collaborative projects and the development of RCP theory.
Track(s): None specific to this conference



DEVELOPING THE AUTISM MODEL OF IMPLEMENTATION FOR ASD COMMUNITY PROVIDERS: USE OF A RESEARCH-COMMUNITY PARTNERSHIP
Amy Drahota, PhD, 1,2 Gregory A. Aarons, PhD, 2,3 & Aubyn C. Stahmer, PhD 2,3
1 San Diego State University; 2 Child & Adolescent Services Research Center; 3 University of California, San Diego
Contact: adrahota@projects.sdsu.edu
ASD community providers (ASD-CPs) provide services to children with any severity of ASD symptoms using a combination of treatment paradigms, some with an evidence base and some without. When evidence-based practices (EBPs) are successfully implemented by ASD-CPs, they can result in positive outcomes. Despite this, EBPs are often implemented unsuccessfully, and other treatments used by ASD-CPs lack supportive evidence, especially for school-age children with ASD. While it is not well understood why ASD-CPs are not implementing EBPs, organizational and individual characteristics likely play a role. An academic-community collaboration (ACC) partnering ASD-CPs and researchers will develop the autism model of implementation (AMI), a systematic process specifically for use by ASD community-based agencies to facilitate implementation of EBPs. Using social networking and focused recruitment strategies, 13 members have joined the ACC and begun building the collaborative. Using mixed methods, the purpose of this study is to evaluate the (1) development, (2) collaborative process, (3) function, and (4) tangible products (i.e., the AMI and related materials) of the ACC. Qualitative and quantitative data will be integrated and analyzed using triangulation, expansion, and complementarity. This study is designed to address the real-world implications of EBP implementation in ASD community-based agencies.
Track(s): None specific to this conference



May 16. 3:30-4:45
LEVERAGING TECHNOLOGY
(MC: Rinad Beidas, PhD)
Scaling Up Assessment of Therapist Fidelity in Motivational Interviewing: Preliminary Development of the AutoMITI
David C. Atkins, PhD
University of Washington
PracticeGround: An Online Platform to Help Therapists Learn, Implement, & Measure Impact of EBPs
Gareth Holman, PhD
Evidence-Based Practice Institute, University of Washington
Dialectical Behavior Therapy Implementation Process & Outcomes in VA & Community Settings
Sara J. Landes, PhD, 1 & Matthew Ditty, MSW 2
1 National Center for PTSD, VA Palo Alto Health Care System; 2 University of Pennsylvania School of Social Policy & Practice



SCALING UP ASSESSMENT OF THERAPIST FIDELITY IN MOTIVATIONAL INTERVIEWING: PRELIMINARY DEVELOPMENT OF THE AUTOMITI
David C. Atkins, PhD, 1 Zac E. Imel, PhD, 2 Doğan Can, MSc, 3 Bo Xiao, 3 Panayiotis Georgiou, MEng, 3 & Shrikanth Narayanan, PhD 3
1 University of Washington; 2 University of Utah; 3 University of Southern California
Contact: datkins@uw.edu
Implementation and dissemination are by nature large-scale endeavors: How do we take evidence-based practices and move them to general clinical use? As such, common tools in the clinical research setting do not easily translate to general use. One example is assessing therapist fidelity, where the typical approach is to use behavioral coding systems and human raters. This "low tech" route to assessing fidelity does not scale up to larger applications and is a non-starter for widespread use. The current authors are part of a larger, interdisciplinary team developing automated methods for assessing therapist fidelity in Motivational Interviewing (MI). The current talk will provide an overview and current status of this work, discussing initial examination of automated detection of therapist reflections and empathy. Thus far, detecting reflections using linguistic tools has been quite successful, whereas assessing therapist empathy has proved more challenging. We will review both of these tasks and some reasons why there is differential effectiveness across these two domains of therapist fidelity to MI. In addition, we will briefly comment on the underlying methodology for this work, arising out of the new field of "behavioral informatics" with tools from engineering, computer science, and related disciplines.
Track(s): Fidelity, Measurement, Scale-Up, Technology
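For readers unfamiliar with automated fidelity coding, the sketch below shows the kind of baseline linguistic classifier such work often starts from: a bag-of-words model that labels therapist utterances as reflections or not. It is a toy illustration with invented utterances and labels, not the authors' AutoMITI system.

    # Toy sketch: classify therapist utterances as "reflection" vs. "other" from word features.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    utterances = [
        "It sounds like you're worried about how drinking affects your family.",
        "So you're saying the mornings are the hardest part.",
        "How many drinks do you have on a typical day?",
        "Tell me about your week.",
        "You feel pulled in two directions about quitting.",
        "What time do you usually get home from work?",
    ]
    labels = ["reflection", "reflection", "other", "other", "reflection", "other"]

    # Word and bigram features feeding a simple linear classifier.
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(utterances, labels)

    print(model.predict(["So the stress at work is what sets off the urge to drink."]))

Production systems of this kind are trained on large coded corpora and combine lexical, acoustic, and dialogue features; the point here is only the basic shape of the task.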



PRACTICEGROUND: AN ONLINE PLATFORM TO HELP THERAPISTS LEARN, IMPLEMENT, & MEASURE IMPACT OF EBPS
Kelly Koerner, PhD, & Gareth Holman, PhD
Evidence-Based Practice Institute, University of Washington
Contact: kellykpracticeground@gmail.com
Effective post-graduate training and consultation are essential to successfully disseminate and implement evidence-based practices (EBPs). However, commonly used continuing education methods produce little change in practitioner behavior. Instead, intensive training models, which combine training and ongoing practice with supervision, appear most effective (Rakovshik & McManus, 2010). Yet such intensive models are expensive and difficult to take to scale. Even in the best case, in-person expert-led training and consultation can only reach a limited number of practitioners.
PracticeGround is a scalable online alternative to traditional continuing education methods. It is a training and performance support platform through which practitioners, trainers, and researchers work together to achieve the best possible therapy outcomes for clients. PracticeGround integrates learning, implementation support, and measurement into practitioners' routine workflow. In this talk I will lay out our long-range strategy to develop and test training and implementation methods using PracticeGround. I will report findings from our first three studies: training in behavioral activation (Puspitasari et al., in press), enhancing therapeutic relationship skills (Kanter et al., in press), and implementing progress monitoring (Persons et al., 2012).
Track(s): Technology, Training



DIALECTICAL BEHAVIOR THERAPY IMPLEMENTATION PROCESS & OUTCOMES IN VA & COMMUNITY SETTINGS
Sara J. Landes, PhD, 1 & Matthew Ditty, MSW 2
1 National Center for PTSD, VA Palo Alto Health Care System; 2 University of Pennsylvania School of Social Policy & Practice
Contact: sjlandes@uw.edu
Dialectical Behavior Therapy (DBT; Linehan, 1993) is an evidence-based cognitive behavioral psychotherapy for suicidal individuals with Borderline Personality Disorder (BPD); it is considered the gold standard treatment for BPD, suicidal behavior, and severe behavioral dyscontrol. DBT is a comprehensive treatment and consists of four modes: group skills training, individual therapy, skills coaching outside of session, and therapist consultation team. Limited data are available about how teams in real-world settings implement full DBT programs following intensive training. Two ongoing projects are evaluating the process of implementing DBT programs in community settings and in Department of Veterans Affairs (VA) settings. We will present data from these studies, including which components are implemented at different time points following intensive training, barriers encountered, and qualitative descriptions of the process. We will discuss whether these data support implementation strategies that encourage implementation of all components at once or a modular approach (e.g., implementing one component at a time), as well as future research directions.
Track(s): Scale-Up, Technology, Training



May 17. 8:15-8:45
SIRC INSTRUMENT REVIEW TASKFORCE: AN OVERVIEW OF PROGRESS MADE & PLANS FOR THE FUTURE
Cara C. Lewis, PhD
Indiana University



UPDATE FROM THE SIRC INSTRUMENT REVIEW TASK FORCE
Cara C. Lewis, PhD, 1 Ruben Martinez, BA, 1 Cameo Borntrager, PhD, 2 & Bryan Weiner, PhD 3
1 Indiana University; 2 University of Montana; 3 University of North Carolina at Chapel Hill
It is critical that researchers utilize psychometrically validated instruments when studying their implementation efforts to build a strong knowledge base and to avoid drawing incorrect or inappropriate conclusions. Navigating the seemingly dense landscape of instruments used in implementation science is an arduous task even for the most experienced reviewer. In an effort to move the field forward, the Seattle Implementation Research Collaborative (SIRC) has coordinated a multi-site effort that attempts to systematically review, compile, and empirically rate instruments relevant to the study of implementation. SIRC first identified 33 distinct constructs integral to the implementation process delineated by the Consolidated Framework for Implementation Research (Damschroder et al., 2009) and the Implementation Outcomes (Proctor et al., 2010). Through SIRC's systematic review of these constructs, we have identified and categorized over 450 instruments to be empirically rated and made available to members of SIRC. SIRC's Instrument Review Task Force (containing over 50 members) will use the modified evidence-based assessment criteria to rate each instrument for its psychometric strength (e.g., reliability and validity). This talk will present the results of the rating process for approximately 115 instruments tapping implementation outcomes (i.e., acceptability, adoption, appropriateness, feasibility, penetration, sustainability). Additionally, we will unveil the interactive SIRC Instrument Review Project website page.
Track(s): Measurement
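As one concrete example of the psychometric evidence such ratings weigh, the sketch below computes Cronbach's alpha, a common internal-consistency index of reliability. It is illustrative only, not part of SIRC's rating procedure, and the example responses are invented.

    # Illustrative computation of Cronbach's alpha for a multi-item instrument.
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: respondents x items matrix of scale scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars / total_var)

    # Five respondents answering a hypothetical 4-item acceptability measure (1-5 Likert).
    example = np.array([
        [4, 5, 4, 4],
        [2, 2, 3, 2],
        [5, 5, 5, 4],
        [3, 3, 2, 3],
        [4, 4, 4, 5],
    ])
    print(round(cronbach_alpha(example), 2))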



May 17. 8:45-10:00
KEY FINDINGS & FUTURE PATHS IN IMPLEMENTATION RESEARCH
(MC: Suzanne Kerns, PhD)
Implementation Science in an Era of Health Reform & Patient-Centered Comparative Effectiveness Research: New Threats, New Expectations, New Opportunities
Brian S. Mittman, PhD
VA Greater Los Angeles Healthcare System & Kaiser Permanente Southern California
Synthesis of Findings from 3 Lifestyle Behavior Change Program Implementations in the VA
Laura J. Damschroder, MS, MPH
VA Ann Arbor Healthcare System
Racial/Ethnic Disparities & the Implementation of Evidence-Based Practices in Public Youth-Serving Systems
Antonio R. Garcia, PhD
School of Social Policy & Practice, University of Pennsylvania



IMPLEMENTATION SCIENCE IN AN ERA OF HEALTH REFORM & PATIENT-CENTERED COMPARATIVE EFFECTIVENESS RESEARCH: NEW THREATS, NEW EXPECTATIONS, NEW OPPORTUNITIES
Brian S. Mittman, PhD
VA Greater Los Angeles Healthcare System & Kaiser Permanente Southern California
Contact: brian.mittman@va.gov
The field of implementation science is enjoying a surge of attention and interest throughout the world, driven by increasing recognition of gaps in the quality and outcomes of health services (and other social services), and gaps in the appropriate use of effective practices and innovations, as well as by economic pressures and additional policy initiatives (e.g., health reform in the US). This new attention offers the promise of increased funding and other support, but it is accompanied by heightened expectations and scrutiny. This presentation begins with a brief overview of these and other trends in the field of health research (e.g., the rise of patient-centeredness and engagement as a dominant value in clinical, health services, and health behavior research), and then identifies several specific threats and opportunities they pose for implementation science. Possible responses are suggested, including suggested actions by individual implementation scientists, research program leaders and funding agencies, and other key stakeholders.
Track(s): None specific to this conference



SYNTHESIS OF FINDINGS FROM 3 LIFESTYLE BEHAVIOR CHANGE PROGRAM IMPLEMENTATIONS IN THE VA
Laura J. Damschroder, MS, MPH, & Julie C. Lowery, PhD
VA Ann Arbor Healthcare System
Contact: laura.damschroder@va.gov
There is an urgent need to implement evidence-based lifestyle change interventions more widely in healthcare settings. However, evidence supporting multi-dimensional, complex implementation strategies and techniques consistently shows mixed results, with repeated calls for research into "context." Research on context has been dominated by single or small-sample case studies that do not use a theoretical structure to promote comparison across studies. We used the Consolidated Framework for Implementation Research (CFIR), a theory-based taxonomy of contextual constructs, in a series of three implementation studies of lifestyle behavior change programs for Veterans. Some contextual factors were influential regardless of the program, such as the nature and quality of networks and communications, though specific behaviors varied by program. Other contextual factors varied; for example, the group-based program implementation was heavily influenced by stakeholder perceptions of the relative advantage of the new program compared to other options, but this construct did not influence success for the other programs. Insights into why and how these constructs manifest differently based on differences in intervention characteristics, settings, and processes will be presented. Use of a common theoretical framework allowed this synthesis to take place, demonstrating how a knowledge base can be more readily built to help accelerate implementation of evidence-based lifestyle behavior change interventions.
Track(s): None specific to this conference



RACIAL/ETHNIC DISPARITIES & THE IMPLEMENTATION OF EVIDENCE-BASED PRACTICES IN PUBLIC YOUTH-SERVING SYSTEMS
Antonio R. Garcia, PhD, 1 Lawrence A. Palinkas, PhD, 2 Lonnie Snowden, PhD, 3 Lisa Saldana, PhD, 4 & Patricia Chamberlain, PhD 4
1 School of Social Policy & Practice, University of Pennsylvania; 2 School of Social Work, University of Southern California; 3 School of Public Health, University of California-Berkeley; 4 Center for Research to Practice, Eugene, Oregon
Contact: antgar@sp2.upenn.edu
Many of the disparities in the delivery and outcomes of child welfare services can be attributed to differences in the implementation of high-quality evidence-based practices (EBPs). However, no research to date has identified socio-environmental and organizational barriers to and facilitators of successful implementation of EBPs among high and low minority concentrated areas (MCAs). To address this gap and understand the role of research evidence use in bringing EBPs to scale, the current study utilized data from a larger randomized trial comparing the use of Community Development Teams (CDTs) versus standard implementation strategies to implement Multidimensional Treatment Foster Care to scale in California and Ohio. While findings point to no differences in the number of implementation activities completed, multi-group path analyses revealed significant differences in which socio-ecological and organizational factors predict implementation outcomes between high and low MCAs. Study findings highlight that more attention should be devoted to improving the organizational social context in preparing for and actively implementing EBPs to scale in order to disrupt structural disparities. Further implications for practice innovation and research to improve implementation outcomes will be discussed.
Track(s): Scale-Up



May 17. 10:15-11:30
IMPLEMENTATION IN ZAMBIA
(MC: Shannon Dorsey, PhD)
Implementation of TF-CBT in Zambia: Perspectives from Local Supervisors & Counselors
Margaret Kasoma
Serenity Harm Reduction Programme Zambia (SHARPZ), Lusaka, Zambia
Organizational Implementation Barriers & Facilitators for Mental Health Programs in Zambia: A Mixed Methods Study
Laura Murray, PhD
Johns Hopkins University Bloomberg School of Public Health
Mixed Methods Assessment of Implementation Barriers & Facilitators for Mental Health Programs in Zambia: Provider Level Themes
Rinad Beidas, PhD
University of Pennsylvania



IMPLEMENTATION OF TF-CBT IN ZAMBIA: PERSPECTIVES FROM LOCAL SUPERVISORS & COUNSELORS
Margaret Kasoma
Serenity Harm Reduction Programme Zambia (SHARPZ), Lusaka, Zambia
Contact: chimfwembe1957@yahoo.co.uk
Global mental health is receiving increasing attention given the high burden of mental health disease in low- and middle-income countries (LMIC) and the fact that fewer than 10% of individuals in need receive treatment. One implementation strategy for scaling up care that has received a great deal of attention is task-shifting mental health care to lay counselors, who have little or no mental health experience. However, training lay counselors requires additional time, supervision, and supports. In addition, most evidence-based practices (EBP) are developed in high-income countries and transported to LMIC, which have different cultures and contexts.
This unique presentation focuses on the perspectives and experiences of a local counselor and now supervisor, Ms. Kasoma, on learning, implementing, and supervising a child mental health EBP (trauma-focused CBT) in Lusaka, Zambia over seven years. Ms. Kasoma first served as a counselor on an NIH-funded feasibility study and currently is a supervisor on a nearly completed USAID randomized trial.
Ms. Kasoma will speak to issues related to learning an EBP, counselor and consumer acceptability, monitoring counselor fidelity, and cultural adaptation and acceptability.
Track(s): EBP Champions, Fidelity, Global Perspectives, Scale-Up, Training



ORGANIZATIONAL IMPLEMENTATION BARRIERS & FACILITATORS FOR MENTAL HEALTH PROGRAMS IN ZAMBIA: A MIXED-METHODS STUDY
Laura Murray, PhD, 1 Rinad Beidas, PhD, 2 Stephanie Skavenski, MPH, MSW, 1 Shannon Dorsey, PhD, 3 & John Mayeya 4
1 Johns Hopkins University Bloomberg School of Public Health; 2 University of Pennsylvania; 3 University of Washington; 4 Ministry of Health, Zambia
Contact: lamurray@jhsph.edu
The Dissemination and Implementation (D&I) literature in low-resource countries (LRC) is often described as being in its infancy relative to that of the West (Thorncraft et al., 2009). Although the field of global mental health has now shown that various evidence-based practices (EBPs) are feasible, adaptable, and effective (e.g., Bolton et al., 2007), the uptake of these interventions by the Ministry of Health (MoH), non-governmental organizations (NGOs), or community-based organizations (CBOs) has been sluggish at best (despite no evidence that "psychosocial programming" is effective; Bryant et al., 2012). It is likely that major barriers to uptake of EBPs include implementation factors.
Participants include 65 individuals who were part of a Trauma-Focused Cognitive Behavioral Therapy (TF-CBT) feasibility and/or effectiveness pilot in Zambia. A mixed-methods design will include a sequential collection of qualitative data, followed by quantitative data (Palinkas et al., 2011).
This study will examine how organizational structure impacts implementation factors, including adoption, appropriateness, feasibility, penetration, and sustainability (Proctor et al., 2011). A semi-structured interview will be followed by administration of the DOOR and ORC. Results will be discussed across different organizational levels, including JHU, CBOs, NGOs, and the MoH. Policy implications and future research ideas will be discussed in relation to implementation dilemmas.
Track(s): Global Perspectives



MIXED METHODS ASSESSMENT OF IMPLEMENTATION BARRIERS & FACILITATORS FOR MENTAL HEALTH PROGRAMS IN ZAMBIA: PROVIDER LEVEL THEMES
Rinad Beidas, PhD, 1 Laura Murray, PhD, 2 Shannon Dorsey, PhD, 3 Stephanie Skavenski, MSW, MPH, 2 Margaret Kasoma, 4 & John Mayeya 5
1 University of Pennsylvania; 2 Johns Hopkins University Bloomberg School of Public Health; 3 University of Washington; 4 Serenity Harm Reduction Programme Zambia (SHARPZ); 5 Ministry of Health
Contact: rbeidas@upenn.edu
A better understanding of the implementation factors involved in the uptake of mental health programming in low-resource countries (LRC) is needed. Although the field of global mental health has now shown that various evidence-based treatments (EBTs) are feasible, adaptable, and effective (Bolton et al., 2003; 2007; Rahman et al., 2008), the uptake of these interventions by locally based organizations has been sluggish at best. It is likely that major barriers to uptake of EBPs include implementation factors. To our knowledge, no study has been conducted globally on barriers to and facilitators of implementation of evidence-based mental health practices. Participants will include 65 individuals who were part of a TF-CBT feasibility and effectiveness pilot in Zambia. The goal of this study is to qualitatively and quantitatively examine implementation factors, including acceptability, adoption, appropriateness, feasibility, fidelity, penetration, and sustainability (Proctor et al., 2011). The study will employ mixed methods, specifically semi-structured interviews and the EBPAS-50, to examine stakeholders' perspectives and attitudes on implementation of evidence-based mental health programming in Zambia. This presentation will focus on themes at the individual-provider ecological level.
Track(s): Global Perspectives



May 17. 12:45-2:00
BREAKOUT K: GLOBAL MODELS OF IMPLEMENTATION
(MC: Rinad Beidas, PhD)

Scaling Up Care for Orphans in Tanzania: A Task-Sharing Approach to Mental Health Treatment
Shannon Dorsey, PhD
University of Washington

A Transdiagnostic Mental Health Intervention in Low Resource Countries: An Alternative Solution to Mental Health Implementation Challenges
Laura Murray, PhD
Johns Hopkins University Bloomberg School of Public Health

Implementation of Cognitive Processing Therapy Provided by Community-Based Paraprofessionals in the Democratic Republic of Congo: Influence of Therapist Factors in a Randomized Clinical Trial
Debra Kaysen, PhD
University of Washington



SCALING UP CARE FOR ORPHANS IN TANZANIA: A TASK-SHARING APPROACH TO MENTAL HEALTH TREATMENT

Shannon Dorsey, PhD, 1 Karen O'Donnell, PhD, 2 Kate Whetten, PhD, 3 Wenfeng Gong, MA, 4 Dafrosa Itemba, 5 & Rachel Manongi 6
1 University of Washington; 2 Duke University School of Medicine; 3 Duke University; 4 Johns Hopkins University; 5 Tanzania Women Research Foundation; 6 Kilimanjaro Christian Medical Centre
Contact: dorsey2@uw.edu

Global mental health is increasingly receiving research attention. Nearly ten randomized clinical trials in low and middle income countries (LMIC) have demonstrated the effectiveness of evidence-based practices (EBPs); however, only one trial has focused on adolescents and none on children, despite the high mental health gap for this population (Saxena et al., 2007). Furthermore, very few D&I questions have been included in trials, despite the focus on task-sharing (training non-mental-health professionals to deliver mental health interventions) and its relevance to implementation science. We examined feasibility and clinical outcomes for children and adolescents receiving Trauma-focused Cognitive Behavioral Therapy (TF-CBT) in Moshi, Tanzania, an area with high HIV prevalence and many orphaned children. TF-CBT was provided to single-sex groups (ages 7-10; 11-13) using a task-sharing approach. The study employed the Apprenticeship Training Model (Murray, Dorsey et al., in press), developed specifically for training, supervision, and iterative, collaborative adaptation with local lay counselors. Post-treatment, children had significantly reduced PTSD and traumatic grief or shame (PTSD child report; β=15.38; p


A TRANSDIAGNOSTIC MENTAL HEALTH INTERVENTION IN LOW RESOURCE COUNTRIES: AN ALTERNATIVE SOLUTION TO MENTAL HEALTH IMPLEMENTATION CHALLENGES

Laura Murray, PhD, 1 Shannon Dorsey, PhD, 2 Maythem Alyasiry, 3 Amir Haydary, 4 & Paul Bolton, MBBS, MPH 1
1 Johns Hopkins University Bloomberg School of Public Health; 2 University of Washington; 3 Department of Psychiatry, Iraq; 4 Ministry of Health, Iraq
Contact: lamurray@jhsph.edu

A growing research base demonstrates that evidence-based treatments (EBTs) are transportable, adaptable, acceptable, and effective in low and middle income countries (LMIC) (e.g., Bolton et al., 2007; Patel et al., 2011) using a task-shifting approach (i.e., counselors with limited or no prior mental health training). However, the singular focus of most EBTs on one clinical problem (e.g., depression) is a barrier to scale-up, reducing the ability to address the substantial mental health treatment gap in LMIC. Transdiagnostic interventions teach a set of common practice elements delivered in varying combinations to address a range of problems. Components-Based Intervention (CBI) is a transdiagnostic approach developed for LMIC and is currently being tested in two RCTs for adult survivors of torture in Southern Iraq and at the Thailand-Burma border. The presentation will focus on the novel intervention development, training, supervision, and outcomes at both sites. Pilot cases showed a 75% decrease in clinical symptoms (Iraq) and 54.5%/55.2% decreases in depression and trauma symptoms, respectively (Thailand). Functioning impairment also decreased at both sites. These findings, combined with growing US-based evidence, suggest that a transdiagnostic approach may be an alternative solution to the implementation and scale-up challenges of addressing mental health problems in LMIC.

Track(s): Global Perspectives, Scale-Up



IMPLEMENTATION OF COGNITIVE PROCESSING THERAPY PROVIDED BY COMMUNITY-BASED PARAPROFESSIONALS IN THE DEMOCRATIC REPUBLIC OF CONGO: INFLUENCE OF THERAPIST FACTORS IN A RANDOMIZED CLINICAL TRIAL

Debra Kaysen, PhD, 1 Shelly Griffiths, MSW, LICSW, 1 Cindy Stappenbeck, PhD, 1 Janny Jinor, MSW, LCSW, 2 Paul Bolton, MBBS, MPH, 3 Jeannie Annan, PhD, 4 Katie Robinette, MPH, 4 & Judith Bass, PhD, MPH 3
1 University of Washington; 2 Morgan State, Baltimore, MD; 3 Johns Hopkins University Bloomberg School of Public Health; 4 International Rescue Committee
Contact: dkaysen@uw.edu

The need for mental health care services for sexual violence victims in the eastern Democratic Republic of Congo (DRC) has been documented, but few services exist. Therapies developed in the West with established efficacy for female rape victims have not been tested in low-resource settings like DRC. A growing literature addresses the adaptability of evidence-based psychotherapies cross-culturally and in resource-poor contexts. One dilemma for implementation in low-resource settings is the extent to which complex treatments can be delivered successfully by paraprofessionals. In this study, we will discuss results of Cognitive Processing Therapy adapted for use in DRC. Congolese community-based paraprofessionals were trained and supervised in delivering group CPT. Hierarchical linear modeling was used to examine change over time. Based on preliminary analyses of weekly self-report measures, there was a significant reduction in mental health symptoms over time (b= -2.04, p


May 17. 12:45-2:00
BREAKOUT L: INNOVATIVE SUBSTANCE ABUSE TREATMENT IMPLEMENTATION
(MC: Doyanne Darnell, PhD)

Scaling Up & Sustaining Alcohol & PTSD Screening & Intervention in US Trauma Care Systems
Douglas Zatzick, MD
University of Washington School of Medicine, Harborview Injury Prevention & Research Center

Lessons Learned from Implementing a Web-Based Tool for Brief Alcohol Interventions in a Large Integrated Health Care System
Kenneth R. Weingardt, PhD
Veterans Health Administration & Stanford University

Disseminating Contingency Management: A Training & Implementation Trial
Bryan Hartzler, PhD
Alcohol & Drug Abuse Institute, University of Washington



SCALING UP & SUSTAINING ALCOHOL & PTSD SCREENING & INTERVENTION IN US TRAUMA CARE SYSTEMS

Douglas Zatzick, MD, 1,2 Dennis Donovan, PhD, 1,3 Chris Dunn, PhD, 1,2 Frederick Rivara, MD, MPH, 1,2 Larry Gentilello, MD, 1,2 Joan Russo, PhD, 1 Jin Wang, PhD, 2 Jeff Love, BA, 1 Collin McFadden, BA, 1 & Gregory Jurkovich, MD 4
1 University of Washington School of Medicine; 2 Harborview Injury Prevention & Research Center; 3 Alcohol & Drug Abuse Institute; 4 Denver Health Care System, Denver, CO
Contact: dzatzick@uw.edu

The American College of Surgeons Committee on Trauma tightly regulates United States (US) trauma center care through policy mandates and clinical guideline best practice recommendations. College mandates are reinforced through verification site visit implementation criteria. The American College of Surgeons has successfully linked trauma center funding to verification site visits and other quality indicators. This presentation will describe a unique investigative-policy collaboration whereby federally funded empiric research on posttraumatic stress disorder (PTSD), and alcohol/drug screening and intervention has been directly translated to policy mandates and clinical guidelines for acute care medical trauma centers nationwide. The presentation will first describe the randomized clinical trial evidence base supporting alcohol and PTSD screening and intervention in acute care medical settings. Next, the presentation will focus on recent policy summits with the American College of Surgeons that have allowed for the scaling up of results of single-site randomized clinical trials across trauma care systems nationwide. Findings suggest that regulatory requirements developed in concert with multidisciplinary implementation team oversight may optimally enhance the scaling up and sustainability of behavioral health treatment integration in acute care trauma center medical settings.

Track(s): Scale-Up, Sustainability



LESSONS LEARNED FROM IMPLEMENTING A WEB-BASED TOOL FOR BRIEF ALCOHOL INTERVENTIONS IN A LARGE INTEGRATED HEALTH CARE SYSTEM

Kenneth R. Weingardt, PhD, 1,2 Michael A. Cucciare, PhD, 2,3 Paula L. Wilbourne, PhD, 1 & John S. Baer, PhD 4
1 Veterans Health Administration; 2 Stanford University; 3 Center for Health Care Evaluation, VA Palo Alto; 4 University of Washington & VA Puget Sound Health Care System
Contact: kenweingardt@va.gov

Research has demonstrated that computer-based brief motivational interventions can be an efficacious means of reducing alcohol use and alcohol-related problems. The authors designed and built a web-based brief alcohol intervention for use in the Department of Veterans Affairs (VA) in 2005. This presentation summarizes the projects and initiatives that have been undertaken in the ensuing years to support the implementation of this tool in clinical practice. Projects have focused on a variety of clinical settings, including outpatient Readjustment Counseling (Vet Centers), specialty care for PTSD and Substance Use Disorders, and Integrated Primary Care. The authors will provide a narrative description of each initiative and use the Consolidated Framework for Implementation Research (CFIR) to communicate how various contextual factors acted as barriers and facilitators within each setting. The presentation concludes with a summary of lessons learned and a description of how these lessons are informing current efforts to implement the web-based brief alcohol intervention in HCV clinics and as part of a national initiative to roll out Motivational Enhancement Therapy (MET).

Track(s): Scale-Up, Technology



DISSEMINATING CONTINGENCY MANAGEMENT: A TRAINING & IMPLEMENTATION TRIAL

Bryan Hartzler, PhD
Alcohol & Drug Abuse Institute, University of Washington
Contact: hartzb@uw.edu

Contingency Management (CM) is an empirically validated behavioral treatment for which community dissemination has been surprisingly slow. Given the considerable evidence accumulated for its efficacy, examination of implementation outcomes is paramount. A recent study involved the collaborative design of a CM intervention by a university investigator and a partnering community addiction treatment clinic. The study then formally evaluated immediate impacts of a 16-hour training workshop with clinic personnel, as well as eventual impacts following a 90-day trial implementation period. A mixed-method design allowed repeated measurement of clinician-focused implementation outcomes (acceptability, appropriateness, adoption, fidelity) among 17 staff clinicians and qualitative retrospective measurement of management-focused implementation outcomes (cost, feasibility, penetration, sustainability) among 5 executive staff. Broad clinical outcomes of the intervention were also examined via review of patient medical records and compared to a historical control. The presentation will outline: 1) the processes of intervention design and staff training, 2) immediate and eventual impacts of training on clinician-focused implementation domains, 3) qualitative reports of clinic executives concerning management-focused implementation domains following the trial implementation period, and 4) patient-based intervention outcomes. The study offers a formal, comprehensive evaluation of staff training and implementation of CM at a community addiction treatment clinic, and collective results may provide useful insights for those working in or collaborating with community treatment programs.

Track(s): Fidelity, Measurement, Training



May 17. 12:45-2:00
BREAKOUT M: STATISTICAL METHODS WORKSHOP PART I
(MC: Kate Comtois, PhD, MPH)

Design & Analysis Challenges with Multilevel Implementation Data
David C. Atkins, PhD, 1 & Scott A. Baldwin, PhD 2
1 University of Washington; 2 Brigham Young University



DESIGN & ANALYSIS CHALLENGES WITH MULTILEVEL IMPLEMENTATION DATA

David C. Atkins, PhD, 1 & Scott A. Baldwin, PhD 2
1 University of Washington; 2 Brigham Young University
Contact: datkins@uw.edu; scott_baldwin@byu.edu

Implementation research often involves multilevel data (sometimes called hierarchical, clustered, or nested data). Examples of such data include patients clustered within providers, providers clustered within sites, and therapist fidelity items clustered within therapists. Such multilevel data present a number of design and analysis challenges. Our presentation will provide a brief, general overview of multilevel models and then focus on specific challenges related to implementation research. Topics will include:

• How sample sizes at the provider and site levels affect the use of multilevel models
• Advantages and disadvantages of randomizing between or within clusters (e.g., randomizing therapists within a site or randomizing sites to treatment condition)
• Power and sample size calculations, including costs (e.g., costs of adding providers vs. sites) and attrition
• How multilevel designs influence the assessment of therapist fidelity, including reliability and psychometric considerations

Our goal is to provide a non-technical introduction to these topics, emphasizing concepts as opposed to statistics. Moreover, we hope to have a highly interactive session with input and questions from the audience. Finally, there will be time for general questions on implementation designs at the end of the session, not necessarily specific to multilevel data.
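As a concrete companion to the topics listed above, the brief Python sketch below fits an illustrative two-level mixed model (providers nested within sites) and applies the usual design-effect adjustment when thinking about sample size. It is a minimal sketch rather than workshop material; the data file and column names (fidelity_ratings.csv, fidelity, condition, site, provider) are hypothetical.

```python
# Illustrative sketch only: a multilevel model of therapist fidelity with
# providers nested within sites, plus the design-effect calculation often used
# when planning cluster sample sizes. All names below are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.read_csv("fidelity_ratings.csv")  # hypothetical data set

# Random intercept for site, with a provider variance component nested in site.
model = smf.mixedlm(
    "fidelity ~ condition",                      # fixed effect of condition
    data,
    groups=data["site"],                         # highest-level clusters
    re_formula="1",                              # random intercept per site
    vc_formula={"provider": "0 + C(provider)"},  # providers within sites
)
print(model.fit().summary())

def effective_n(n_total: float, cluster_size: float, icc: float) -> float:
    """Effective sample size under the design effect 1 + (m - 1) * ICC."""
    return n_total / (1 + (cluster_size - 1) * icc)

# 400 patients seen in clusters of 20 per provider with ICC = 0.10 behave
# roughly like 138 independent observations.
print(round(effective_n(400, 20, 0.10), 1))  # 137.9
```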

Track(s): Fidelity, Measurement



May 17. 12:45-2:00
BREAKOUT N: LEARNING FROM SCALE-UP
(MC: Meghan Keough, PhD)

Overcoming Implementation Research Challenges While Studying CPT Training & Implementation Across Canada
Shannon Wiltsey Stirman, PhD
VA Boston Healthcare System, National Center for PTSD, & Boston University

Financing & Scaling Up Early Intervention Services
Howard H. Goldman, MD, PhD
Department of Psychiatry, University of Maryland School of Medicine

System Improvement Through Service Collaboratives: Closing Gaps & Improving Access & Coordination
Brian Rush, PhD
Centre for Addiction & Mental Health; Health Systems & Health Equity Research, Department of Psychiatry, University of Toronto, Canada



OVERCOMING IMPLEMENTATION RESEARCH CHALLENGES WHILE STUDYING CPT TRAINING & IMPLEMENTATION ACROSS CANADA

Shannon Wiltsey Stirman, PhD, 1 Candice Monson, PhD, 2 Josh Deloriea, BA, 2 Jennifer Belus, BA, 3 Marta Maslej, BA, 2 & Norman Shields, PhD 4
1 VA Boston Healthcare System, National Center for PTSD, & Boston University; 2 Ryerson University; 3 University of North Carolina; 4 Operational Stress Injuries National Network, Veterans Affairs Canada
Contact: sws@bu.edu

The overall objectives of this study are 1) to compare three different post-workshop consultation strategies (technology-enhanced, standard, and independent use of resources) for the implementation of Cognitive Processing Therapy with clinicians who serve Veterans in Canada, and 2) to identify system-, site-, and provider-level barriers and facilitators to implementation. Consistent with the theme of this year's conference, we will present data relevant to our efforts to solve implementation research dilemmas that arose as we executed the study. Two grant submissions and three cycles of recruitment have allowed us to iteratively refine strategies to address a number of challenges inherent in implementation research, including considerations regarding study design, recruiting and engaging clinicians, managing over 16 separate IRB submissions, addressing problems with study technology, collecting data from clinicians based at clinics throughout the country, consenting patient participants, ethical dilemmas, budget constraints, and fidelity monitoring (both consultation-condition fidelity at the consultant level and CPT fidelity at the clinician level). We will present data gathered and analyzed from meeting minutes, data tracking logs, monthly clinician activity reports, weekly consultation reports, and interviews with clinicians and consultants to demonstrate both the decision-making processes around addressing these challenges and the outcomes of our implementation of these strategies.

Track(s): Global Perspectives, Scale-Up, Technology, Training



FINANCING & SCALING UP EARLY INTERVENTION SERVICES

Howard Goldman, MD, PhD, 1 Mustafa C. Karakus, PhD, 2 & Kirsten Beronio, JD 3
1 Department of Psychiatry, University of Maryland School of Medicine; 2 Westat; 3 US Department of Health & Human Services, ASPE
Contact: hh.goldman@verizon.net

This presentation will focus on the scalability of supported employment and early intervention services in mental health care. It is based on a series of studies performed by Westat for the Office of the Assistant Secretary for Planning and Evaluation (ASPE) in the Department of Health and Human Services. Over the past several years, ASPE has been interested in examining policies that would promote the implementation of supported employment and early intervention services for mental disorders. The passage of the Affordable Care Act (ACA) has improved the prospects for financing some of these services, but barriers remain to paying for supported employment and supported education, as well as to scaling up other aspects of team-based early intervention services. We will report on our discussions with policy makers and service providers and on our observations from site visits across the U.S.

Track(s): Scale-Up



SYSTEM IMPROVEMENT THROUGH SERVICE COLLABORATIVES: CLOSING GAPS & IMPROVING ACCESS & COORDINATION

Brian Rush, PhD, 1 Fiona C. Thomas, 2 & Heather McKee 2
1 Centre for Addiction & Mental Health, Department of Psychiatry, University of Toronto; 2 Centre for Addiction & Mental Health, University of Toronto
Contact: brian.rush@camh.ca

The Systems Improvement through Service Collaboratives (SISC) initiative, sponsored by the Centre for Addiction and Mental Health, aims to improve access and coordination for those with mental health and addictions problems, with priority on children and youth, in regions across Ontario, Canada. This provincial initiative spans multiple sectors and six different Ministries, including those in the health, justice, and education sectors. SISC is grounded in Implementation Science (IS) frameworks, including Quality Improvement (QI), and has a strong emphasis on both developmental evaluation and more traditional performance measurement.

The presentation will focus on how this initiative embedded the National Implementation Research Network (NIRN) framework for implementing evidence-based interventions (EBI) across the province. Evaluation data will be presented, including performance indicators related to key transitions (e.g., hospital-community, health-justice, youth-adult) and measures of implementation progress, including maintenance of fidelity to EBIs. The evaluation further comprises assessments of partnership and collaboration, measures of health equity, and case studies. The challenges of balancing provincial and local indicators will be discussed. Finally, contributions to the theoretical framework of IS, based on lessons learned from this initiative, will be shared.

Track(s): Global Perspectives, Scale-Up



May 17. 12:45-2:00
BREAKOUT O: FIDELITY OF INTERVENTIONS ACROSS THE AGE SPECTRUM
(MC: Suzanne Kerns, PhD)

Implementation of the Program to Encourage Active & Rewarding Lives for Seniors (PEARLS)
Lesley Steinman, MSW, MPH
UW Health Promotion Research Center

Common Issues with Assessing Fidelity to Complex Multi-Modal Service Programs: Lessons Learned from Assessing Fidelity to the ACT Model
Maria Monroe-DeVita, PhD
Department of Psychiatry & Behavioral Sciences, University of Washington

Assessing Implementation Fidelity of the Family Check-Up: Development & Validation of the COACH Rating System
Justin D. Smith, PhD
Child & Family Center, University of Oregon



IMPLEMENTATION OF THE PROGRAM TO ENCOURAGE ACTIVE & REWARDING LIVES FOR SENIORS (PEARLS)

Lesley Steinman, MSW, MPH, 1 & Mark Snowden, MD, MPH 2
1 Health Promotion Research Center, Department of Health Services, University of Washington; 2 Department of Psychiatry & Behavioral Sciences, University of Washington
Contact: lesles@uw.edu

Background: Depression is often undertreated in older adults. PEARLS is an evidence-based depression care management program for homebound elders. Working with multiple community partners, we studied several approaches for improving implementation and adapting PEARLS to address commonly identified barriers.

Methods: Ten focus groups with 40 staff were analyzed using thematic analysis to identify barriers to implementation. A formal agency plan was developed to improve implementation, and the plan was evaluated using process and outcome measures. We developed a 20-item fidelity instrument through key informant interviews and validated the instrument using the known-groups method with 12 agencies.

Results: Focus groups revealed that strict eligibility criteria interfered with agencies' missions to serve all clients. PEARLS modifications were piloted with interpreters for clients with limited English proficiency and for clients with major depression. Depression response and remission rates were similar to the original model (80%). Implementation coaching resulted in modest improvements in referral (9% to 15%) and enrollment rates (4% to 8%). Fidelity instrument testing showed PEARLS programs had higher fidelity scores compared to other types of depression programs (p


COMMON ISSUES WITH ASSESSING FIDELITY TO COMPLEX MULTI-MODAL SERVICE PROGRAMS: LESSONS LEARNED FROM ASSESSING FIDELITY TO THE ACT MODEL

Maria Monroe-DeVita, PhD
Department of Psychiatry & Behavioral Sciences, University of Washington School of Medicine
Contact: mmdv@uw.edu

Fidelity assessment is a key element in ensuring that treatment programs are adherent to the intended model and can therefore anticipate achieving desired clinical outcomes; however, comprehensive evaluation of fidelity to more complex multi-modal service programs can be difficult to achieve. For some programs, the broad range of biopsychosocial service needs of the population served requires clinicians to employ more than one evidence-based practice (EBP) within the context of the larger program; in some cases, service recipients may receive treatments delivered by more than one clinician and/or in a variety of community-based settings outside of the office. This presentation will focus on how these complex program elements may be assessed, using the new Assertive Community Treatment (ACT) fidelity scale, the TMACT, as a case study and focusing on key issues in fidelity assessment such as balancing evaluation of: (1) process and structure; (2) team and individual clinical skills; and (3) other EBPs integrated or blended within the larger service program. While this presentation will use ACT, an EBP for adults with serious mental illness, to illustrate how these core dilemmas in fidelity assessment can be handled, implications for other service programs and different populations will be discussed.

Track(s): Fidelity, Measurement



ASSESSING IMPLEMENTATION FIDELITY OF THE FAMILY CHECK-UP: DEVELOPMENT & VALIDATION OF THE COACH RATING SYSTEM

Justin D. Smith, PhD, 1 Elizabeth A. Stormshak, PhD, 1 & Thomas J. Dishion, PhD 1,2
1 Child & Family Center, University of Oregon; 2 Prevention Research Center, Arizona State University
Contact: jsmith6@uoregon.edu

Objective: We present a series of studies concerning the development and validation of an observational rating system for fidelity of implementation of the Family Check-Up (FCU). The FCU is a family-based intervention shown to improve family management practices and reduce problem behaviors in youth ages 2-18 (e.g., Dishion et al., 2008; Stormshak & Dishion, 2009). Method: Therapists treating families of children ages 2-17 from two randomized trials (one efficacy and one effectiveness) were rated for fidelity to the FCU in three separate studies, the final study being an experimental manipulation of the rating procedures in an attempt to improve the reliability of the ratings. Results: Study 1: Variations in fidelity were associated with observed positive parenting of toddler-age children one year after receipt of the FCU, which in turn predicted reductions in child problem behaviors the following year. Study 2: Therapists employed at community mental health agencies achieved adequate levels of fidelity, which were associated with family- and child-level outcomes. Study 3: Two aspects of previous fidelity rating studies were identified that likely contributed to less-than-optimal reliability: coder training in the FCU and access to families' assessment data. Observed caregiver engagement, a single item in the rating system that has been found to be an important intervening variable in the relationship between fidelity and family outcomes, was also examined for validity. Conclusions: The FCU is an effective family-based intervention that is feasible for scale-up in multiple community service settings. Accurate and reliable assessment of fidelity of implementation is a crucial factor in training providers to deliver the intervention as intended. These studies demonstrate the validity and reliability of our fidelity rating system.

Track(s): Fidelity, Measurement



May 17. 2:15-3:30
BREAKOUT P: EBP CHAMPION SYMPOSIUM
(MC: Shannon Dorsey, PhD)

Implementation of TF-CBT Across Washington State
Joe Leroy, MSW
HopeSparks Family Services
Dan Fox, MSW
Lutheran Community Services Northwest
Ron Gengler, MS
Central WA Comprehensive Mental Health
Lori Vanderburg, MS
Compass Health



IMPLEMENTATION OF TF-CBT ACROSS WASHINGTON STATE

Joe Leroy, MSW, 1 Dan Fox, MSW, 2 Ron Gengler, MS, 3 & Lori Vanderburg, MS 4
1 HopeSparks Family Services; 2 Lutheran Community Services Northwest; 3 Central WA Comprehensive Mental Health; 4 Compass Health
Contact: jleroy@hopesparks.org

The dissemination and implementation literature is replete with challenges, barriers, and some successes in implementing and scaling up evidence-based treatments (EBT) in community-based settings. With a few exceptions, however, the perspectives and voices of clinicians, supervisors, and administrators in public mental health agencies who are implementing, or being asked or mandated to implement, EBT are generally missing from the research dialogue.

In Washington State, the Division of Behavioral Health and Recovery funded a yearly Trauma-focused CBT training initiative for public mental health agencies. Now in its seventh year, with a broadened, common-elements focus on CBT for anxiety, depression, and behavioral problems, the initiative has led a number of agencies to adopt CBT. This presentation brings together champions from a wide range of EBT-adopting agencies in Washington State to present their experiences and perspectives on facilitators, barriers, success stories, and ongoing challenges at the various stages of implementation. Of particular focus for the presentation will be the balance between effective and feasible strategies for 1) obtaining agency-wide reach of EBT and 2) assessing clinician competence and fidelity, given challenges of turnover, a tight budget climate, and, in Washington State, pending EBT legislation.

Track(s): EBP Champions, Fidelity, Scale-Up, Sustainability, Training



May 17. 2:15-3:30
BREAKOUT Q: SUSTAINABILITY
(MC: Adam Carmel, PhD)

Sustainability of CBT for Youth Anxiety in Community Settings Following Implementation
Rinad Beidas, PhD
University of Pennsylvania

Supporting Implementation of the Triple P System: A Standardized Framework
Jacquie Brown, MES, RSW, 1 & Sara van Driel, PhD 2
1 Triple P International; 2 Triple P America

Research Implementation within a Clinical Practice: Resolving the Science/Practice Dialectic
Sally A. Moore, PhD
Evidence-Based Treatment Centers of Seattle & University of Washington Department of Psychiatry & Behavioral Sciences



SUSTAINABILITY OF CBT FOR YOUTH ANXIETY IN COMMUNITY SETTINGS FOLLOWING IMPLEMENTATION

Rinad Beidas, PhD, 1 Julie Edmunds, MA, 2 Margaret Mary Downey, BA, 1 Mark Gallagher, BA, 1 Jessica Watkins, 3 & Philip C. Kendall, PhD, ABPP 2
1 University of Pennsylvania; 2 Temple University; 3 Bryn Mawr College
Contact: rbeidas@upenn.edu

A recent experimental study examined whether various training methods and ongoing support resulted in differential outcomes in clinicians' skill and adherence in implementing CBT for youth anxiety (Beidas, Edmunds, & Kendall, 2012). Results indicated that any of the three training methods resulted in somewhat improved skill and adherence but that consultation was the most robust predictor of therapist outcomes. However, little is known about the sustainability of therapist implementation of CBT following training and consultation. To optimize therapist implementation and sustained use of CBT, it is necessary to investigate therapist perspectives on their sustained use of CBT, as well as barriers and facilitators to use of CBT. The present study used semi-structured interviews conducted 2 years following the training and consultation provided in Beidas et al. (2012) with 50 therapists who participated in the initial study. Provider interviews were coded for the following themes: attitudes, practice change, facilitators, barriers, adaptation, organizational factors, self-efficacy, eclecticism, client factors, treatment factors, EBP language, and consultation. The findings from this study will shed much-needed insight on how CBT for youth anxiety is sustained in community settings following training and consultation.

Track(s): Sustainability, Training



SUPPORTING IMPLEMENTATION OF THE TRIPLE P SYSTEM: A STANDARDIZED FRAMEWORK

Jacquie Brown, MES, RSW, 1 Jenna McWilliam, PhD, 1 Sara van Driel, PhD, 1 Randy Ahn, PhD, MLIS, 1 Debbie Easton, 2 Thomas Dirscherl, 3 Ronja Born, 3 Brad Thomas, 4 Sarah Munro, 1 & Rita Bostick, MA, LPC 4
1 Triple P International; 2 Triple P Parenting Canada Inc.; 3 Triple P Deutschland; 4 Triple P America
Contact: jacquie.brown@triplep.net; sara@triplep.net

As a result of 15 years of dissemination and expansion in more than 20 countries, Triple P recognized the need to develop a flexible but standardized framework to support implementation of the Triple P System. The standardized framework is based on the RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance) and National Implementation Research Network (NIRN) Active Implementation Frameworks and includes 5 main steps: 1) Engagement, 2) Commitment and Contracting, 3) Implementation Planning, 4) Training and Accreditation, and 5) Implementation Maintenance. In using this framework, Triple P hopes to increase utilization rates of trained practitioners, improve sustainability of the program in communities, and expand the use of Triple P as a public health approach.

In this presentation, we will provide an overview of the framework, including each of the steps and the key activities within those steps, and discuss how the integration of implementation experience and implementation science contributed to the development of the framework.

Track(s): Global Perspectives, Scale-Up, Sustainability



RESEARCH IMPLEMENTATION WITHIN A CLINICAL PRACTICE: RESOLVING THE SCIENCE/PRACTICE DIALECTIC

Sally A. Moore, PhD, Stacy Shaw Welch, PhD, Travis Osborne, PhD, Jennifer Sayrs, PhD, & Jennifer Tininenko, PhD
Evidence-Based Treatment Centers of Seattle & University of Washington Department of Psychiatry & Behavioral Sciences
Contact: smoore@ebtseattle.com

Conducting research within a clinical setting is a complex endeavor, particularly given that research and clinical work are often viewed as conflicting interests. Clinicians may fear that investment in research will detract from providing optimal treatment. We will discuss the Evidence-Based Treatment Centers of Seattle (EBTCS) as a case study illustrating the process of research implementation in a clinical setting. EBTCS is a treatment center with two primary missions: to provide evidence-based specialty care for individuals with anxiety disorders, borderline personality disorder, and other difficulties, and to conduct research relevant to the individuals we treat. We will discuss the tension that can naturally arise between research and clinical domains and our efforts to move research from the back burner into the spotlight without sacrificing clinical care. When implementing a research program, clinical settings face multiple dilemmas, including resource allocation, creating a research infrastructure, and data collection methods. We will discuss our evolution in meeting these challenges and potential future directions for research at EBTCS, such as outcome benchmarking, mechanisms of change, and modular treatment of anxiety disorders. We have come to view research as complementary to our clinical mission, and this perspective promotes successful research implementation in clinical practice.

Track(s): EBP Champion



May 17. 2:15-3:30
BREAKOUT R: STATISTICAL METHODS WORKSHOP PART II
(MC: Kate Comtois, PhD, MPH)

Design & Analysis Challenges with Multilevel Implementation Data
David C. Atkins, PhD, 1 & Scott A. Baldwin, PhD 2
1 University of Washington; 2 Brigham Young University



DESIGN & ANALYSIS CHALLENGES WITH MULTILEVEL IMPLEMENTATION DATA

David C. Atkins, PhD, 1 & Scott A. Baldwin, PhD 2
1 University of Washington; 2 Brigham Young University
Contact: datkins@uw.edu; scott_baldwin@byu.edu

Implementation research often involves multilevel data (sometimes called hierarchical, clustered, or nested data). Examples of such data include patients clustered within providers, providers clustered within sites, and therapist fidelity items clustered within therapists. Such multilevel data present a number of design and analysis challenges. Our presentation will provide a brief, general overview of multilevel models and then focus on specific challenges related to implementation research. Topics will include:

• How sample sizes at the provider and site levels affect the use of multilevel models
• Advantages and disadvantages of randomizing between or within clusters (e.g., randomizing therapists within a site or randomizing sites to treatment condition)
• Power and sample size calculations, including costs (e.g., costs of adding providers vs. sites) and attrition
• How multilevel designs influence the assessment of therapist fidelity, including reliability and psychometric considerations

Our goal is to provide a non-technical introduction to these topics, emphasizing concepts as opposed to statistics. Moreover, we hope to have a highly interactive session with input and questions from the audience. Finally, there will be time for general questions on implementation designs at the end of the session, not necessarily specific to multilevel data.

Track(s): Fidelity, Measurement



May 17. 2:15-3:30
BREAKOUT S: NEW IMPLEMENTATION MEASURES
(MC: Doyanne Darnell, PhD)

Measuring an Evidence-Based Model of Implementation: Preliminary Development of a Survey Instrument
Josef I. Ruzek, PhD
National Center for PTSD, VA Palo Alto Health Care System

Solving Measurement Issues in Implementation Science
Ruben Martinez, BA, & Cara C. Lewis, PhD
Indiana University

Common Elements for Implementing Evidence-Based Practices in Children's Mental Health
Lisa Saldana, PhD
Oregon Social Learning Center



MEASURING AN EVIDENCE-BASED MODEL OF IMPLEMENTATION: PRELIMINARY DEVELOPMENT OF A SURVEY INSTRUMENT

Josef I. Ruzek, PhD, 1,2 Joan M. Cook, PhD, 1,3 Richard Thompson, PhD, 4 Stephanie Dinnen, MS, 3 James C. Coyne, PhD, 5 & Paula P. Schnurr, PhD 1,6
1 National Center for PTSD; 2 Stanford University; 3 Yale School of Medicine; 4 University of Illinois at Chicago; 5 University of Pennsylvania & University of Groningen, the Netherlands; 6 Geisel School of Medicine at Dartmouth
Contact: josef.ruzek@va.gov

To facilitate testing of comprehensive models of implementation, measures are needed that assess a broad range of theory-based constructs within a single measurement instrument. We developed a measure assessing the six broad elements of the Greenhalgh et al. (2005) model: perceived innovation characteristics; individual characteristics of potential adopters; communication and influence; system antecedents and readiness; outer context; and implementation process. Survey and interview measures were developed via systematic literature review of measures for the associated constructs. Keywords representing 53 separate model constructs were searched to identify existing measures for each construct. These were assessed for adequate reliability, validity, and applicability to healthcare settings. Items meeting these criteria were used to guide survey/interview design; where no measure was deemed appropriate, items were created through a consensus-based process. Approximately two items were used to assess each construct, to ensure low respondent burden. The measure was used to assess factors affecting implementation of two PTSD treatments (Prolonged Exposure and Cognitive Processing Therapy). All 229 PTSD treatment providers in 38 VA residential PTSD treatment settings were asked to complete the measure; 216 (94.3%) completed the survey. Internal consistency was generally good, and results suggested that the measure may be helpful in researching and planning implementation. CPT scored significantly higher than PE on a number of factors (including relative advantage, compatibility, trialability, potential for reinvention, task issues, and augmentation-technical support) and lower on perceived risk. The measure has several possible applications for research and implementation and can be adapted for other organizations, settings, and innovations.
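As a concrete illustration of the internal-consistency checks mentioned above, the brief Python sketch below computes Cronbach's alpha for a short scale with roughly two items per construct. It is an illustrative sketch only; the simulated scores are placeholders, not data from this study.

```python
# Illustrative sketch only: Cronbach's alpha for a short multi-item scale.
# alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Simulated placeholder data: 216 respondents, two 1-5 items tapping one construct.
rng = np.random.default_rng(0)
construct = rng.integers(1, 6, size=(216, 1))
items = np.clip(construct + rng.integers(-1, 2, size=(216, 2)), 1, 5)
print(round(cronbach_alpha(items), 2))
```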

Track(s): Measurement, Scale-Up



MEASUREMENT ISSUES IN IMPLEMENTATION SCIENCE

Ruben Martinez, BA, & Cara C. Lewis, PhD
Indiana University
Contact: rgm@indiana.edu

According to Siegel (1964), "science is measurement," which raises the question: is measurement necessarily scientific? Unfortunately, the answer is "no," particularly for the field of implementation science, which, in its nascent state, has become vulnerable to measurement issues that threaten the strength of the developing knowledge base. It is as though the demand for the implementation of evidence-based practices is outpacing the science (Chamberlain, Brown, & Saldana, 2011), resulting in measurement that is not always scientific (Cook & Beckman, 2006; Proctor et al., 2011; Weiner, 2009). This situation presents an alarming paradox whereby investigators draw conclusions from instruments that have not been psychometrically validated, leaving the field on unstable methodological ground. In order to make interpretations or draw conclusions from data and confidently generalize findings to different settings, it is imperative that the measures we utilize assess what we think they are measuring, a claim that only repeated analysis and reporting of psychometrics can affirm (Hunsley & Mash, 2011). If our measures are not empirically validated, we run the risk of constructing "a magnificent house with no foundation" (Achenbach, 2005). Perhaps a bold question is worthy of our consideration: what is the value of performing evaluative research if it is not possible to place confidence in one's interpretations of the data? We will present data from a survey completed by 80 implementation stakeholders who shared their perspectives on the most pressing measurement issues in the field. Specifically, we will report on their use of measures, theoretical frameworks, and mixed methods, and on their recommendations for advancing the science of implementation.

Track(s): Measurement



COMMON ELEMENTS FOR IMPLEMENTING EVIDENCE-BASED PRACTICES IN CHILDREN'S MENTAL HEALTH

Lisa Saldana, PhD, & Patricia Chamberlain, PhD
Oregon Social Learning Center
Contact: lisas@cr2p.org

With the increased focus and effort to scale up evidence-based practices (EBPs) in real-world settings comes recognition of the complexity of the process, which involves planning, training, quality assurance, and interactions among developers, system leaders, practitioners, and consumers. Little is known about which aspects of these methods are essential for successful implementation, or how to measure whether they have occurred well. The Stages of Implementation Completion (SIC) was developed to assess sites' implementation process and attainment of implementation milestones for Multidimensional Treatment Foster Care (MTFC). The SIC is an 8-stage tool that maps onto three phases of implementation (pre-implementation, implementation, and sustainability). Items are defined by the activities identified by the developer/purveyor as necessary to implement MTFC. The SIC is being adapted for other EBPs in children's services that address highly prevalent (e.g., anxiety) and costly (e.g., parenting for child welfare populations) mental health and behavioral problems. This presentation will examine the common/universal elements of the implementation process that have been identified through the SIC adaptation process with multiple EBP developers/purveyors. Preliminary data will be presented to demonstrate the frequency with which these elements are completed by adopting sites and the influence that this completion has on successful scale-up.
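To make concrete how completion of staged implementation activities might be summarized, the brief Python sketch below computes the proportion of logged activities completed within each phase for a single hypothetical site. The phase, stage, and activity labels are placeholders, not the actual SIC items.

```python
# Illustrative sketch only: summarizing a staged implementation checklist.
# Each record is (phase, stage, activity, completed?); labels are placeholders.
from collections import defaultdict

site_log = [
    ("pre-implementation", "engagement", "first contact with purveyor", True),
    ("pre-implementation", "readiness planning", "funding plan reviewed", True),
    ("implementation", "staff training", "initial workshop held", True),
    ("implementation", "fidelity monitoring", "first fidelity review", False),
    ("sustainability", "competency", "local certification achieved", False),
]

def proportion_completed(log):
    """Proportion of logged activities completed within each phase."""
    done, total = defaultdict(int), defaultdict(int)
    for phase, _stage, _activity, completed in log:
        total[phase] += 1
        done[phase] += int(completed)
    return {phase: done[phase] / total[phase] for phase in total}

print(proportion_completed(site_log))
# {'pre-implementation': 1.0, 'implementation': 0.5, 'sustainability': 0.0}
```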

Track(s): Measurement, Scale-Up



May 17. 2:15-3:30
BREAKOUT T: OUTCOMES FROM NEW INTERVENTIONS
(MC: Meghan Keough, PhD)

Team-Based Exposure & Ritual Prevention for Adults with Obsessive Compulsive Disorder: An Open Trial Implemented in a Community Mental Health Center
Maria Mancebo, PhD
Butler Hospital, Brown University

Implementation of the Family Check-Up in Community Mental Health Agencies
Justin D. Smith, PhD
Child & Family Center, University of Oregon

Cognitive Retraining (CR) for Attention & Working Memory for Older Adults: What to Train, to Whom, & How Long
Lee Hyer, PhD, ABPP
Psychiatry & Family Health, Mercer; Georgia Neurosurgical Institute & Mercer School of Medicine



TEAM-BASED EXPOSURE & RITUAL PREVENTION FOR ADULTS WITH OBSESSIVE COMPULSIVE DISORDER: AN OPEN TRIAL IMPLEMENTED IN A COMMUNITY MENTAL HEALTH CENTER

Maria Mancebo, PhD, Gail Steketee, PhD, Jordana Muroff, PhD, Steven Rasmussen, MD, & Caron Zlotnick, PhD
Butler Hospital, Alpert Medical School at Brown University
Contact: maria_mancebo@brown.edu

Objectives: This study assessed the acceptability and feasibility of a portable training program and a team-based approach to delivering Exposure and Ritual Prevention (Ex/RP) for OCD in a community mental health center (CMHC). Method: Group Ex/RP for OCD was adapted to meet the needs of low-income adults receiving treatment in a CMHC. A training program, treatment manual, and fidelity ratings were developed to train community therapists and case managers to deliver Ex/RP. Low-income individuals with OCD were included if they identified OCD as their primary DSM-IV disorder, had at least moderate OCD symptoms, were on a stable psychotropic medication regimen, and had never received a trial of Ex/RP. Participants received 14 group therapy sessions with a therapist and 10 individual coaching sessions with a case manager. Results: Nine participants entered active treatment. Feasibility and acceptability of the intervention were high, with two-thirds (n=6) of patients completing treatment, 89% of scheduled sessions attended, and high treatment satisfaction ratings. Two therapists and six case managers completed the training program. Fidelity ratings indicated staff were able to effectively deliver manualized Ex/RP. Treatment completers experienced significant reductions in symptoms on the Yale-Brown Obsessive-Compulsive Scale at posttreatment. Conclusions: Team-based Ex/RP appears to be a feasible and acceptable intervention for individuals receiving treatment for OCD at CMHCs.

Track(s): Fidelity, Training



IMPLEMENTATION OF THE FAMILY CHECK-UP IN COMMUNITY MENTAL HEALTH AGENCIES: CLINICAL EFFECTIVENESS, FIDELITY, & OTHER OUTCOMES

Justin D. Smith, PhD, Elizabeth A. Stormshak, PhD, & Katherine Kavanagh
Child & Family Center, University of Oregon
Contact: jsmith6@uoregon.edu

Objective: We examine the process and outcomes of implementation of the Family Check-Up (FCU; Dishion & Stormshak, 2007) in low-resource community mental health agencies serving high-risk families. The FCU is a family-based intervention shown to improve family management practices and reduce problem behaviors in youth ages 2-18 (e.g., Dishion et al., 2008; Stormshak & Dishion, 2009). The model was also designed for implementation in community settings serving families, such as schools, primary care, and community mental health. Method: Of the 36 therapists engaged in the study, 17 were randomly assigned to the FCU condition and 19 to TAU. Therapists in the FCU condition received intensive training followed by ongoing implementation support to promote uptake and counter drift. Families were pseudo-randomly assigned to a therapist in one of the conditions: 42 families received the FCU while 32 received community treatment as usual (TAU). Results: ITT analysis revealed intervention effects on youths' conduct problems and increased parental monitoring, which differed for male and female youth. Interventionists were found to have positive attitudes toward evidence-based practices, a desire to learn evidence-based approaches, and acceptable implementation fidelity, assessed using the COACH observational rating system. Furthermore, variations in fidelity were associated with change in specific parenting behaviors. Conclusions: The FCU is a viable and effective family-based community intervention as a precursor to typical community mental health services. The implications of these findings for future scale-up efforts of the FCU will be discussed.

Track(s): Scale-Up



COGNITIVE RETRAINING (CR) FOR ATTENTION & WORKING MEMORY FOR OLDER ADULTS: WHAT TO TRAIN, TO WHOM, & HOW LONG?

Lee Hyer, PhD, ABPP, 1 Christine Mullin, 2 & Laura McKensie 2
1 Georgia Neurosurgical Institute & Mercer School of Medicine; 2 Mercer School of Medicine
Contact: lhyer@ganeuroandspine.com

It is now accepted that cognitive retraining (CR) improves attention and working memory for older adults. What to train, to whom, and for how long are now important questions. We conducted three independent clinical studies in a medical school addressing different memory programs: cogmed, memory/attention training, and brainwaveR. These studies were distinctive, one being computer-based (N=64), one attention/memory focused and holistic (N=112), and one a structured memory class (N=24). All subjects had memory complaints ranging from AAMI to mild dementia. Pre- and post-testing were applied in all studies. Each study included a sham or control condition: cogmed used both a sham and a control group, memory/attention training used a control group, and brainwaveR had a control group. We also separated the memory group by risk status (low, medium, and high). Targets addressed new learning, memory and executive functioning, and memory habits, function, and affect. Results showed that the interventions improved most outcomes, especially cognitive and adjustment markers, emphasizing far transfer. We discuss the research and practical aspects of the three studies, implementation issues, and lessons learned.

Track(s): None specific to this conference

NOTES<br />



May 17. 3:45‐5:00

INTERAGENCY COLLABORATIVE TEAMS TO SCALE‐UP EVIDENCE‐BASED PRACTICES: PRELIMINARY RESULTS FROM A LARGE SCALE IMPLEMENTATION
(MC: Maria Monroe‐DeVita, PhD)
Symposium Chair: Gregory A. Aarons, PhD, 1,2

Interagency Collaborative Teams for Capacity Building to Scale‐Up Evidence‐Based Practice
Michael Hurlburt, PhD, 2,3 Gregory A. Aarons, PhD, 1,2 Danielle Fettes, 1,2 Cathleen Willging, 4 Lawrence A. Palinkas, PhD, 2,3 & Mark J. Chaffin 5

Collaboration, Negotiation, & Coalescence for Interagency‐Collaborative Teams to Scale‐Up Evidence‐Based Practice
Gregory A. Aarons, PhD, 1,2 Michael Hurlburt, PhD, 2,3 Danielle Fettes, 1,2 Cathleen Willging, 4 Lara Gunderson, MA, 4 Mark Chaffin, 5 & Lawrence A. Palinkas, PhD 2,3

Leadership & Practice in the Face of Policy: How Supervisors & Providers Exercise Discretion in Evidence‐Based Practice Implementation
Lara Gunderson, MA 4 & Cathleen Willging 4


1 University of California, San Diego; 2 Child & Adolescent Services Research Center; 3 University of Southern California; 4 Pacific Institute for Research & Evaluation; 5 University of Oklahoma Health Science Center



INTERAGENCY COLLABORATIVE TEAMS FOR CAPACITY BUILDING TO SCALE‐UP EVIDENCE‐BASED PRACTICE

Michael Hurlburt, PhD, 1,2 Gregory A. Aarons, PhD, 1,3 Danielle Fettes, 1,3 Cathleen Willging, 4 Lawrence A. Palinkas, PhD, 1,2 & Mark J. Chaffin 5

1 Child & Adolescent Services Research Center; 2 University of Southern California; 3 University of California, San Diego; 4 Pacific Institute for Research & Evaluation; 5 University of Oklahoma Health Science Center

Contact: gaarons@ucsd.edu

Background: System‐wide scale‐up of evidence‐based practice (EBP) in child welfare is a complex process that requires consideration of multiple system and organizational levels. Yet few strategic approaches exist to address such concerns and support EBP implementation and sustainment across a service system. Building on the Exploration, Preparation, Implementation, and Sustainment (EPIS) implementation framework, we developed and are testing the Interagency Collaborative Team (ICT) process model to implement an evidence‐based home visitation model within a large children’s service system. The ICT model emphasizes the role of local agency collaborations in creating structured supports for successful implementation. Methods: In this paper, we describe utilization of the ICT model to scale up an EBP and present preliminary qualitative results from use of the implementation model. Qualitative interviews were conducted to assess challenges in building system, organization, and home visitor capacity to implement the EBP. Data collection and analysis centered on EBP implementation issues, as well as the experiences of home visitors under the ICT model. Results: Six notable issues relating to the implementation process emerged from participant interviews, including (a) initial commitment and collaboration among stakeholders, (b) leadership, (c) communication, (d) practice fit with local context, (e) ongoing negotiation and problem solving, and (f) early successes. These issues highlight strengths and areas for development in the ICT model. Conclusions: Use of the ICT model led to sustained and widespread use of SafeCare in San Diego County. Although some aspects of the implementation model may benefit from enhancement, rich qualitative data from implementation of SafeCare in San Diego suggest that the process model generates strong structural supports for implementation and creates conditions in which tensions between EBP structure and local contextual needs can be resolved in ways that support the expansion and maintenance of an EBP while preserving potential for public health benefit.

Track(s): Scale‐Up

NOTES



COLLABORATION, NEGOTIATION, & COALESCENCE FOR INTERAGENCY‐COLLABORATIVE TEAMS TO SCALE‐UP EVIDENCE‐BASED PRACTICE

Gregory A. Aarons, PhD, 1,2 Michael Hurlburt, PhD, 2,3 Danielle Fettes, 1,2 Cathleen Willging, 4 Lara Gunderson, MA, 4 Mark Chaffin, 5 & Lawrence A. Palinkas, PhD 2,3

1 University of California, San Diego; 2 Child & Adolescent Services Research Center; 3 University of Southern California; 4 Pacific Institute for Research & Evaluation; 5 University of Oklahoma Health Science Center

Contact: gaarons@ucsd.edu

Objective: Implementation and scale‐up of evidence‐based practices (EBPs) involves multiple stakeholders and a process that requires collaboration, negotiation, compromise, and a shared vision of improving care. The present study examined the complex process involved in EBP scale‐up across an entire service system using the Interagency Collaborative Team (ICT) approach and the EPIS implementation framework. Methods: Participants were key stakeholders in a large‐scale, county‐wide implementation of an EBP to reduce child neglect, SafeCare®. Semi‐structured interviews and/or focus groups were conducted with 27 individuals representing various constituents in the service system. A grounded theory approach to qualitative data collection and analysis was utilized. Results: Several challenges affected the eventual coalescence of community stakeholders in their implementation of SafeCare, including differing organizational cultures, varied organizational strategic approaches, differing definitions of collaboration, varying and competing priorities across leaders and leadership levels, the resolution of power struggles, role ambiguity, and communication effectiveness. While the process resulted in eventual stakeholder coalescence and collaboration, each of the factors identified above affected how stakeholders approached the EBP implementation process. Conclusions: System‐wide scale‐up of EBPs involves multiple stakeholders in a complex process that provides a nexus for differing agendas, priorities, leadership styles, and negotiation strategies. The term “collaboration” oversimplifies the multifaceted nature of the scale‐up process. Implementation efforts should openly acknowledge and consider the complex agendas, priorities, and interaction styles of organizations and individual stakeholders during the Exploration, Preparation, and Implementation phases across outer (system) and inner (organizational) contexts. This will allow for facilitative resolution of the concerns of each participant in the process.

Track(s): Scale‐Up

NOTES



LEADERSHIP & PRACTICE IN THE FACE OF POLICY: HOW SUPERVISORS & PROVIDERS EXERCISE DISCRETION IN EVIDENCE‐BASED PRACTICE IMPLEMENTATION

Lara Gunderson, MA, & Cathleen Willging, PhD

Pacific Institute for Research & Evaluation, University of New Mexico

Contact: lgunderson@pire.org

Background: As more community‐based organizations (CBOs) contract with government to deliver public services, we analyze how a unique partnership forged among a child welfare agency, a private foundation, and CBOs facilitated use of a Collaborative Training and Supervision Model (CTSM) to roll out an evidence‐based practice (EBP) to decrease child neglect throughout a large county in California. Partners were connected via complex, frequently evolving contracts, the contents of which were filtered through the lenses of diverse stakeholders, ranging from upper‐level county administrators to those at the frontlines of service delivery. Methods: Qualitative data were collected via focus groups and semi‐structured interviews, and grounded theory analysis methods were utilized. Results: Analyses provided insight into how different stakeholders experienced and understood the implementation process, allowing for mid‐course corrections and informing replication of both the CTSM and the EBP within other service systems. Drawing on Michael Lipsky’s theoretical approach to street‐level bureaucracy, we describe how two sets of stakeholders directly involved in EBP provision—home visitors and their supervisors—negotiated the constraints of a seemingly rigid contract to implement the EBP. We illustrate how both parties also exert discretion in their respective efforts to (a) realize their ethical obligation to serve clients, (b) adhere to CBO policies, and (c) function under limited financing. Discussion: The sometimes conflicting roles of supervisor and supervisee are complex and impacted by factors in both the outer system context and the inner organizational context. Such complexity, if not resolved, may compromise fidelity to CTSM and EBP implementation requirements.

Track(s): Scale‐Up

NOTES



May 16. 5:30 pm

POSTER SESSION 4

1. Barriers to & Facilitators of Implementation of Evidence‐Based Mental Health Treatments
Adam Chuong, BA, Marlanea E. Peabody, BA, Shannon Wiltsey Stirman, PhD, & Jennifer E. Johnson, PhD

2. Fidelity Assessment of Widely‐Disseminated but Understudied Prevention Programs: A Framework & Illustration from the Common Sense Parenting Trial
W. Alex Mason, PhD, Robert G. Oats, MA, Wendi F. Cross, PhD, Mary Casey‐Goldstein, MSEd, Kevin G. Haggerty, PhD, & Koren G. Hansen

3. Adapting a Research Tested Automated Electronic Health Record Intervention (Systems of Support to Increase Colorectal Cancer Screening) for Implementation in Safety Net Clinics (Strategies & Opportunities to Stop Colorectal Cancer in Priority Populations)
Beverly Green MD, MPH, Jennifer DeVoe, MD, DPhil, & Gloria D. Coronado, PhD

4. Adapting a Multidimensional Knowledge Translation Strategy to Improve Pediatric Pain Practices in Canada: Evidence‐Based Practice for Improving Quality
Bonnie Stevens, RN, PhD, Janet Yamada, RN, MSc, Carole A. Estabrooks, PhD, Jennifer Stinson, PhD, RN‐EC, CPNP, Fiona Campbell, MD, FRCA, & Shannon D. Scott, RN, PhD

5. Development of An Assessment of Organizational Readiness for EBP Implementation in Public Child Welfare
Catherine Roller White, MA, Adam Darnell, PhD, Lien Bragg, Kirk O'Brien, PhD, & Erin Maher, PhD

6. Development & Use of a Fidelity Checklist for Permanency Roundtables: A New Child Welfare Intervention
Catherine Roller White, MA, Kirk O'Brien, PhD, Tyler Corwin, MA, & Anne Buher

7. Using Qualitative Research to Understand VA Provider Perspectives
Gina M. Signoracci, PhD, Nazanin H. Bahraini, PhD, Bridget B. Matarazzo, PsyD, Jennifer H. Olson‐Madden, PhD, & Lisa A. Brenner, PhD

8. Implementation of an HIV Preventive Intervention in Mexico: The Roles of Context, Organizational Structure & Process, & Community Violence
Gregory A. Aarons, PhD, Thomas L. Patterson, PhD, Claudia V. Chavarin, MD, & Lawrence A. Palinkas, PhD

9. Utilization of the Hybrid Model to Evaluate an Adolescent Treatment Engagement Intervention
Heather Spielvogle, PhD, & Faye Mishna, PhD

10. Doulas to Fill the Gap: A Proposed Model of Doula Delivery of Cognitive‐Behavioral Therapy for Maternal Anxiety & Depression
Margaret Mary Downey, BA, & Rinad Beidas, PhD

11. A Mixed‐Methods Approach to the Intersection of Attitudes & Organizational Factors by Provider Type in Dissemination & Implementation of Evidence‐Based Practice for Child Anxiety
Margaret Mary Downey, BA, Mark Gallagher, BA, Jessica Watkins, BA, Prianna Pathak, Julie Edmunds, MA, & Rinad Beidas, PhD

12. Conceptualizing the Dilemmas of Implementation Through a Socio‐Cultural Lens: A Qualitative Study Examining Tools, People, & Organizations
Meaghan McCollow, MST, BCBA, Grace Blum, MA, Jacob Hackett, MEd, Yelena Patish, & Jennifer Pierce

13. Elucidating the Practical Challenges & Opportunities for Implementing & Sustaining Alcohol Screening & Brief Intervention Services in Level I Trauma Centers
Mijung Park, PhD, MPH, RN, Ashley Jones, Jeff Love, BA, Jin Wang, PhD, Joan Russo, PhD, Dennis Donovan, PhD, Chris Dunn, PhD, Gregory Jurkovich, MD, Frederick Rivara, MD, Larry Gentilello, MD, & Douglas Zatzick, MD

14. Identifying the Predictors of Early Versus Late Engagement in a Foster Parent Training Intervention
Natalia Escobar Walsh, MS, & Joseph M. Price, PhD

(Continued on next page)


4 The poster list includes all authors, but all may not necessarily be present for the poster session.



May 16. 5:30 pm

POSTER SESSION CONTINUED

15. Determining Factors Important in Influencing ASD Community Stakeholders Participation in an Academic‐Community Collaboration
Rosemary Meza, BA

16. A Multi‐Level Framework for Implementation Science
Ruben G. Martinez, BA, & Cara C. Lewis, PhD

17. Exploring Training Therapist Use of Cognitive Behavioral Therapy to the Prediction of Early Depression Symptom Reduction
Taylor Marshall, Brigid Marriott, Sofia Braga, MA, Meredith Boyd, Mark Crossen, BA, & Cara C. Lewis, PhD

18. Community Assessment & EBP Implementation: Illness Management & Recovery (IMR) & Integrated Dual Disorder Treatment (IDDT)
Shannon Blajeski, MSW

19. Who Receives Evidence‐Based Psychotherapies? Characteristics of Female Veterans from a Practice Setting
Andrea DeVito, Cassidy Gutner, Alexandra Dick, MA, Sam Meisel, & Shannon Wiltsey Stirman, PhD

20. Comparing Self, Clinician, & Observer Reports of Cognitive Processing Therapy (CPT) Adherence
Samuel Meisel, Amber Calloway, Alexandra Dick, MA, Andrea DeVito, Ann Rasmusson, & Shannon Wiltsey Stirman, PhD

21. Improving our Capacity for Evidence‐Based PTSD Treatment: Developing an Effective Model of Post‐Workshop Consultation
Meredith S. H. Landy, Kelly McShane, Sheena Bance, Shannon Wiltsey Stirman, PhD, Josh Deloriea, BA, Marta Maslej, & Candice M. Monson, PhD



1. BARRIERS TO & FACILITATORS OF IMPLEMENTATION OF EVIDENCE‐BASED MENTAL HEALTH TREATMENTS

Adam Chuong, BA, 1 Marlanea E. Peabody, 1 Shannon Wiltsey Stirman, PhD, 2 & Jennifer E. Johnson, PhD 1

1 Brown University; 2 VA Boston Healthcare System, National Center for PTSD, & Boston University

Contact: adam_chuong@brown.edu

Individuals with psychiatric disorders have substantially increased risk of incarceration. There is a critical need to incorporate evidence‐based treatment (EBT) into prison mental health treatment. Stakeholders from two different prison systems were surveyed to identify barriers to and facilitators of EBT implementation. Descriptive analyses suggest that stakeholders view rehabilitation as an important aspect of incarceration. Furthermore, the majority of stakeholders express a high interest in, awareness of, and concern about treatments for depression. Stakeholders rated workload burden and potential inconvenience as the greatest obstacles to implementation. Supervision and clinical fit were rated as the greatest facilitators of implementation.

Track(s): None specific to this conference

NOTES



2. FIDELITY ASSESSMENT OF WIDELY‐DISSEMINATED BUT UNDERSTUDIED PREVENTION PROGRAMS: A FRAMEWORK & ILLUSTRATION FROM THE COMMON SENSE PARENTING TRIAL

W. Alex Mason, PhD, 1 Robert G. Oats, MA, 1 Wendi F. Cross, PhD, 2 Mary Casey‐Goldstein, MSEd, 3 Kevin G. Haggerty, PhD, 3 & Koren G. Hansen 3

1 Boys Town National Research Institute; 2 University of Rochester Medical Center; 3 University of Washington

Contact: walter.mason@boystown.org

Common Sense Parenting (CSP) is a promising and widely disseminated but understudied parenting program that is being tested in a recently initiated randomized trial. To facilitate evaluation, programs must be implemented as manualized, which requires monitoring of fidelity and feedback for performance improvement. This is challenging for programs, like CSP, that were developed in community contexts and often lack well‐established fidelity assessment tools and protocols that meet rigorous research standards. This presentation describes a framework for fidelity assessment relevant to widely used but understudied preventive interventions and reports fidelity data from the CSP trial. Overall, 321 families were recruited into the study and randomly assigned to the standard CSP program (n = 118), a modified CSP+ program (n = 95), or a control condition (n = 108). Parent workshops were videotaped, and a random sample of videotapes was coded for adherence to the key components of this structured program and for the quality of delivery. There was a high degree of correspondence in the ratings (96% agreement), and adherence was high (95%). Quality ratings met overall standards and generally showed improvement from the beginning to the end of the intervention phase, as fidelity data were used for supervision and feedback.
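To make the reported fidelity metrics concrete, here is a minimal sketch, assuming two coders rating the same randomly sampled workshop segments, of how percent agreement and adherence could be computed; the 0/1 codes below are hypothetical placeholders, not the CSP coding data.

import numpy as np

# Each entry is one coded segment: 1 = key component delivered as manualized, 0 = not delivered.
coder_a = np.array([1, 1, 0, 1, 1, 1, 1, 0, 1, 1])
coder_b = np.array([1, 1, 0, 1, 1, 1, 1, 1, 1, 1])

agreement = np.mean(coder_a == coder_b) * 100        # percent of segments on which coders agree
adherence = np.mean((coder_a + coder_b) / 2) * 100   # average share of components delivered

print(f"Inter-rater agreement: {agreement:.0f}%")
print(f"Adherence: {adherence:.0f}%")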

Track(s): Fidelity

NOTES



3. ADAPTING A RESEARCH TESTED AUTOMATED ELECTRONIC HEALTH RECORD INTERVENTION (SYSTEMS OF SUPPORT TO INCREASE COLORECTAL CANCER SCREENING) FOR IMPLEMENTATION IN SAFETY NET CLINICS (STRATEGIES & OPPORTUNITIES TO STOP COLORECTAL CANCER IN PRIORITY POPULATIONS)

Beverly Green MD, MPH, 1 Jennifer DeVoe, MD, DPhil, 2 & Gloria D. Coronado, PhD 3

1 Group Health Research Institute; 2 OCHIN; 3 The Center for Health Research, Kaiser Permanente Northwest

Contact: green.b@ghc.org

Background: Systems of Support to Increase Colorectal Cancer (CRC) Screening (SOS) was a randomized trial that leveraged electronic health record (EHR) data to implement a stepwise intervention to increase CRC screening. Compared to usual care (UC), intervention patients were more likely to be current for CRCS in both study years, with incremental increases by intervention intensity: UC 26.5%, Automated 50.7%, Assisted 57.7%, and Navigated 64.4% (P


4. ADAPTING A MULTIDIMENSIONAL KNOWLEDGE TRANSLATION STRATEGY TO IMPROVE PEDIATRIC PAIN PRACTICES IN CANADA: EVIDENCE‐BASED PRACTICE FOR IMPROVING QUALITY

Bonnie Stevens, RN, PhD, 1 Janet Yamada, RN, MSc, 1,2 Carole A. Estabrooks, PhD, 2 Jennifer Stinson, PhD, RN‐EC, CPNP, 1,2 Fiona Campbell, MD, FRCA, 1,2 & Shannon D. Scott, RN, PhD 2

1 Hospital for Sick Children & University of Toronto; 2 University of Alberta

Contact: b.stevens@utoronto.ca

An innovative, multidimensional knowledge translation (KT) approach, the Evidence‐Based Practice for Improving Quality (EPIQ), was originally developed and effectively used to reduce nosocomial infection and bronchopulmonary dysplasia in Neonatal Intensive Care Units (Lee et al., 2009). EPIQ was adapted by the CIHR Team in Children’s Pain (Stevens et al., 2006‐2011) and evaluated in a broader hospitalized pediatric population in terms of pediatric pain practices (assessment and management). EPIQ is guided by the Promoting Action on Research Implementation in Health Services (PARiHS) framework and rooted in continuous quality improvement (CQI) methods. EPIQ consists of two phases: 1) Preparation Phase: establishing a group of healthcare professional facilitators to promote practice change at the unit level, training them to implement the EPIQ strategy, and identifying a pain practice aim as the focus of improvement; 2) Implementation and Change Phase: implementing change processes in rapid Plan‐Do‐Study‐Act (PDSA) cycles; choosing, developing, and implementing KT strategies (e.g., audit and feedback) to address the unique needs and culture of the unit; and monitoring improvement. Results indicate that EPIQ was successful in promoting improved pediatric pain practices in 8 pediatric hospitals across Canada.

Track(s): Global Perspectives

NOTES



5. DEVELOPMENT OF AN ASSESSMENT OF ORGANIZATIONAL READINESS FOR EBP IMPLEMENTATION IN PUBLIC CHILD WELFARE

Catherine Roller White, MA, Adam Darnell, PhD, Lien Bragg, Kirk O'Brien, PhD, & Erin Maher, PhD

Casey Family Programs

Contact: crwhite@casey.org

Although recent Children’s Bureau waiver and grant applications encourage the use of implementation science (e.g., Title IV‐E waiver applications), the child welfare sector has been particularly slow to embrace this methodology to examine successful adoption of evidence‐based practice. Implementation science has identified a number of factors at the organization and system level that can hinder or promote adoption of evidence‐based practice. Further, implementation science is developing a body of literature to examine these factors specifically in the child welfare arena. Based on a review of this literature and first‐hand knowledge of challenges faced in promoting organizational and system change in public child welfare jurisdictions, we will describe a conceptual model of organizational readiness for adoption of evidence‐based practice. This conceptual model serves as the foundation for development of a brief instrument to assess organizational readiness for change. The instrument is intended for use by internal or external organizational change agents and is expected to serve a dual role to support organization development and to provide data for implementation research. A poster session is an ideal forum for dialogue as it will assist us in refining the content of the instrument, ultimately maximizing its utility for both practitioners and researchers.

Track(s): Measurement

NOTES



6. DEVELOPMENT & USE OF A FIDELITY CHECKLIST FOR PERMANENCY ROUNDTABLES: A NEW CHILD WELFARE INTERVENTION

Catherine Roller White, MA, Kirk O'Brien, PhD, Tyler Corwin, MA, & Anne Buher

Casey Family Programs

Contact: crwhite@casey.org

Permanency Roundtables (PRTs) are intensive case consultations in which action plans are developed to help children and youth in foster care achieve legal permanency (defined as adoption, guardianship, or reunification). To date, PRTs have occurred in over 130 jurisdictions in 30 states across the country, and while specific elements are considered to be central to the PRT model, there has been no standard instrument with which to measure fidelity. The PRT Fidelity Checklist was developed in response to this need. Surveys were administered to 248 case managers who participated in PRTs in Alabama, Colorado, Florida, and Ohio in 2010. Using confirmatory factor analysis, four subscales were created from 16 items: Engagement, Resources, Identifying Relatives, and Focus. Items that did not load onto any factors were discarded or used as single‐item predictors. Predictive analyses examined the degree to which fidelity to the PRT model predicts achievement of legal permanency in a sample of 726 youth, controlling for youth demographics, risk factors, placement history, and caseworker background and attitudes. We will discuss lessons learned about the creation of a fidelity checklist with conference attendees and seek their input on further enhancement of the instrument.
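As a rough illustration of the predictive analysis described above, the sketch below fits a logistic regression of legal permanency on fidelity subscale scores plus one covariate; the variable names and simulated values are placeholders rather than the PRT dataset, and the actual analyses controlled for a richer set of covariates.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 726  # sample size reported above; all values below are simulated for illustration
df = pd.DataFrame({
    "engagement": rng.normal(0, 1, n),              # hypothetical fidelity subscale scores
    "resources": rng.normal(0, 1, n),
    "identifying_relatives": rng.normal(0, 1, n),
    "focus": rng.normal(0, 1, n),
    "youth_age": rng.integers(2, 18, n),            # example covariate
})
# Simulated outcome: 1 = legal permanency achieved (adoption, guardianship, or reunification).
logit = 0.4 * df["engagement"] + 0.2 * df["focus"] - 0.05 * df["youth_age"]
df["permanency"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["engagement", "resources", "identifying_relatives", "focus", "youth_age"]])
fit = sm.Logit(df["permanency"], X).fit(disp=False)
print(fit.summary())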

Track(s): Fidelity

NOTES



7. USING QUALITATIVE RESEARCH TO UNDERSTAND VA PROVIDER PERSPECTIVES

Gina M. Signoracci, PhD, Nazanin H. Bahraini, PhD, Bridget B. Matarazzo, PsyD, Jennifer H. Olson‐Madden, PhD, & Lisa A. Brenner, PhD

Veterans Integrated Service Network (VISN) 19 Mental Illness Research, Education, & Clinical Center (MIRECC); University of Colorado Denver, School of Medicine, Department of Psychiatry

Contact: Gina.Signoracci@va.gov

Background: Operation Enduring Freedom (OEF) and Operation Iraqi Freedom (OIF) military personnel are presenting to the Veterans Affairs (VA) healthcare system with complex needs that may require utilization of various service lines for comprehensive treatment. This study aimed to gather information from VA professionals regarding necessary resources to provide services to OEF/OIF Veterans. Thirteen semi‐structured interviews were conducted to describe provider perspectives regarding: OEF/OIF Veteran clinical needs; collaboration and referral processes; barriers to providing treatment; needs and resources regarding service delivery; psychiatric outcomes; and professional satisfaction.

Methods: De‐identified interviews were transcribed in their entirety and cross‐checked for accuracy. Members of the research team first identified themes in the above‐listed domains independently, and then met as a group to achieve thematic consensus. Saturation was considered to have been reached at the point at which no new information or themes were identified.

Results: Themes included acuity and intensity of OEF/OIF distress and symptoms, need for professional trainings, and a sense of purpose with regard to providing services to this cohort. In‐depth information regarding themes and select quotes will be presented.

Conclusions: Study results may help inform VA healthcare systems about provider experiences and service needs of OEF/OIF Veterans.

Track(s): None specific to this conference

NOTES



8. IMPLEMENTATION OF AN HIV PREVENTIVE INTERVENTION IN MEXICO: THE ROLES OF CONTEXT, ORGANIZATIONAL STRUCTURE & PROCESS, & COMMUNITY VIOLENCE

Gregory A. Aarons, PhD, 1 Thomas L. Patterson, PhD, 1 Claudia V. Chavarin, MD, 1 & Lawrence A. Palinkas, PhD 2

1 University of California, San Diego, Department of Psychiatry; 2 University of Southern California

Contact: gaarons@ucsd.edu

This study describes the initial implementation of Mujer Segura (Healthy Woman), an evidence‐based HIV/STI preventive intervention for female sex workers in Mexico. Our conceptual framework addresses outer (system) and inner (organizational) contextual factors that can enhance or limit effective implementation. Mixed quantitative/qualitative methods are being used to assess the train‐the‐trainer implementation model being utilized in twelve cities and eight states in Mexico. Mujer Segura is currently being implemented at thirteen sites throughout Mexico. We conducted focus groups and interviews with participants at multiple organizational levels (i.e., site directors, physicians, nurses, outreach workers) and completed observations of meetings and interactions of study stakeholders. For the present study we analyzed focus group data using grounded theory techniques to identify predominant themes and issues impacting implementation progress. Preliminary results indicate that early implementation was influenced by variability in leadership, team member commitment to effective outreach, workforce stability across organization levels, creative problem solving, and a facilitative and supportive work environment. Variability in state‐level policies and community violence also impacted implementation progress and effectiveness. Our examination of contextual, organizational, and individual implementation factors will inform the further development of strategies to support effective evidence‐based practice implementation and sustainment in low‐resourced settings.

Track(s): Global Perspectives, Training

NOTES



9. UTILIZATION OF THE HYBRID MODEL TO EVALUATE AN ADOLESCENT TREATMENT ENGAGEMENT INTERVENTION

Heather Spielvogle, PhD, 1 & Faye Mishna, PhD 2

1 University of Washington; 2 University of Toronto

Corresponding author contact: spielvog@uw.edu

Premature service dropout is a common problem in child and adolescent community mental health settings. Blending elements of efficacy and effectiveness research, this research used a hybrid model (Carroll & Rounsaville, 2003) to explore the impact of an engagement intervention (i.e., motivational interviewing) on adolescent treatment attendance in partnership with four Toronto‐based community mental health centers. The results of this randomized pilot study (n=51) demonstrated that the engagement intervention had a medium effect (d=.51) on initial treatment attendance. This poster will focus on the outcomes of the intervention on mediating variables (i.e., self‐efficacy, autonomy, and alliance) and initial treatment attendance. This poster will also demonstrate the process of collaborating with agency leadership and clinical staff from stakeholder engagement to follow‐up.

Track(s): Global Perspectives

NOTES



10. DOULAS TO FILL THE GAP: A PROPOSED MODEL OF DOULA DELIVERY OF COGNITIVE‐BEHAVIORAL THERAPY FOR MATERNAL ANXIETY & DEPRESSION

Margaret Mary Downey, BA, & Rinad Beidas, PhD

University of Pennsylvania

Contact: mdowney@upenn.edu

Evidence‐based interventions (EBIs) have grown exponentially in recent decades. However, calls continue for broader adoption in the United States in order for their potential to be affordably and effectively realized. Proposals to close the development‐implementation gap include brief interventions (O’Connor and Whaley 2003) and utilizing paraprofessionals (Rotheram‐Borus et al 2012). One sector of consideration for such proposals is maternal mental health, which remains a pressing issue for EBIs given the prevalence of maternal anxiety and depression and their association with other health indicators for mother, partner, and newborn (Patel 2004). Using task‐shifting/sharing approaches, this concept poster outlines a hypothetical model of doula delivery of Cognitive Behavioral Therapy (CBT), an EST for anxiety and depression. Doulas are trained individuals who provide emotional, physical, and informational support to clients across pregnancy. Continuous labor support as offered by doulas is a demonstrated effective practice for improving several maternal health outcomes (e.g., shortening labor, lowering instrumental intervention rates, improving newborn health factors and maternal self‐esteem and satisfaction) and continues to grow in popularity (Hodnett 2007). Doulas operate as “paraprofessionals,” i.e., agents of knowledge transfer between mother and clinical staff who also share support duties. Doula provision of CBT effectively and efficiently complements doula care models and could fill a critical EBI gap.

Track(s): None specific to this conference

NOTES



11. A MIXED‐METHODS APPROACH TO THE INTERSECTION OF ATTITUDES & ORGANIZATIONAL FACTORS BY PROVIDER TYPE IN DISSEMINATION & IMPLEMENTATION OF EVIDENCE‐BASED PRACTICE FOR CHILD ANXIETY

Margaret Mary Downey, BA, 1 Mark Gallagher, BA, 2 Jessica Watkins, BA, 3 Prianna Pathak, 3 Julie Edmunds, MA, 2 & Rinad Beidas, PhD 1

1 University of Pennsylvania; 2 Temple University; 3 Bryn Mawr College

Contact: mdowney@upenn.edu

As research on dissemination and implementation of evidence‐based practice (EBP) proliferates, understanding contextual characteristics that influence implementation is critical. Two levels of the ecological model, therapist and organizational, are worthy of consideration, as quantitative evidence suggests attitudes and organizational characteristics remain significant factors in implementation (Aarons 2006). However, little research explores the relationship between therapist attitudes toward EBPs, therapist type (e.g., doctoral vs. non‐doctoral), organizational factors, and their impact on dissemination and implementation (Aarons 2004). Previous work indicates doctoral providers may endorse differences in implementation factors, including adequacy of resources and motivation for change, compared to non‐doctoral counterparts (Downey et al 2012). The current study approaches this question using a mixed‐methods framework. Participants include 50 therapists trained in an EBP for child anxiety two years prior (Beidas, Edmunds, Marcus & Kendall, 2012). We conducted semi‐structured qualitative interviews to examine themes and relationships between attitudes and organizational characteristics stratified by therapist type (doctoral (6) vs. non‐doctoral (44)). We developed and applied a comprehensive coding scheme to produce fine‐grained descriptive analyses of the role of organizational and therapist‐level characteristics in implementation. To assess reliability and robustness, a sample of transcripts was separately coded by researchers to compare applications of the scheme.

This research was supported by NIMH grant F31 MH083333.

Track(s): None specific to this conference

NOTES



12. CONCEPTUALIZING THE DILEMMAS OF IMPLEMENTATION THROUGH A SOCIO‐CULTURAL LENS: A QUALITATIVE STUDY EXAMINING TOOLS, PEOPLE, & ORGANIZATIONS

Meaghan McCollow, MST, BCBA, Grace Blum, MA, Jacob Hackett, MEd, Yelena Patish, & Jennifer Pierce

University of Washington

Contact: meaghm@uw.edu

With the emphasis placed on evidence‐based practices (EBPs) in special education (IDEiA, 2004) and the gap that exists between research and practice (Fixsen et al., 2005; Odom, 2008), it is important to know how evidence‐based practices are understood by implementers.

A framework was developed that drew upon socio‐cultural theory of practice (Rogoff, Baker‐Sennett, Lacasa, & Goldsmith, 1995). The framework provides a means of analyzing factors of implementation related to implementers (people), classrooms/schools/districts (organizations), and methods of assessment/curriculum/practice (tools). A qualitative study was then conducted to examine how special education teachers and administrators in five school districts interpret and make decisions about the implementation of EBPs. The findings indicate the complexities and factors that shape practitioners’ interpretations and actions related to the implementation of evidence‐based practices. The analysis of the data using the framework made evident potential variables that might influence the implementation of EBPs in schools. We will propose and discuss the implications for how this conceptual framework can deepen our understanding of the complexities of implementation.

Track(s): None specific to this conference

NOTES



13. ELUCIDATING THE PRACTICAL CHALLENGES & OPPORTUNITIES FOR IMPLEMENTING & SUSTAINING ALCOHOL SCREENING & BRIEF INTERVENTION SERVICES IN LEVEL I TRAUMA CENTERS

Mijung Park, PhD, MPH, RN, 1 Ashley Jones, 2 Jeff Love, BA, 2 Jin Wang, PhD, 2 Joan Russo, PhD, 2 Dennis Donovan, PhD, 2 Chris Dunn, PhD, 2 Gregory Jurkovich, MD, 2 Frederick Rivara, MD, 2 Larry Gentilello, MD, 2 & Douglas Zatzick, MD 2

1 University of Pittsburgh School of Nursing; 2 University of Washington School of Medicine

Contact: parkm@pitt.edu

Alcohol use is common among patients who are admitted to Level I trauma centers. The American College of Surgeons has mandated that Level I trauma centers screen injured patients for alcohol‐use problems and provide an intervention to those who screen positive. However, marked variability has been observed in prior nationwide surveys of trauma center readiness to implement alcohol screening and brief intervention (SBI).

The purpose of this paper is to elucidate the challenges and opportunities for implementing and sustaining alcohol SBI at trauma centers. We conducted semi‐structured telephone interviews with clinicians from intervention (n=10) and control (n=8) trauma centers that were part of a multi‐site randomized clinical trial designed to test the implementation of high‐quality alcohol SBI. Transcribed interviews were organized and managed using the Atlas‐Ti computer program. Thematic analysis was used as an analytic method.

The ability to implement alcohol SBI varied markedly across sites, yet implementation occurred at all sites in response to the American College of Surgeons mandate. Overall, intervention sites reported that receiving implementation guidance was helpful for sustaining SBI services after study completion. Practical challenges occurred at all sites and included lack of implementation resources and/or existing resources that were not well aligned with SBI implementation needs.

Track(s): None specific to this conference

NOTES



14. IDENTIFYING THE PREDICTORS OF EARLY VERSUS LATE ENGAGEMENT IN A FOSTER PARENT TRAINING INTERVENTION

Natalia Escobar Walsh, MS, 1 & Joseph M. Price, PhD 2

1 SDSU/UCSD Joint Doctoral Program in Clinical Psychology; 2 Department of Psychology, San Diego State University

Contact: nataliawalsh@gmail.com

While several studies have identified predictors of participant engagement, most have measured engagement through attendance and none have examined engagement among foster parents. During an effectiveness trial, the KEEP intervention reduced foster child behavior problems by providing parent training to foster parents in a group setting. In the current study, pre‐intervention foster parent and child characteristics in the KEEP intervention group (n = 359) were used to predict process‐oriented engagement (e.g., homework completion, level of participation) in early and later stages of the intervention. Regression analyses showed that the number of baseline foster child behavior problems (b = 0.01, SE = 0.01, p = .029) was associated with early engagement (R2 = .03, F(2, 317) = 5.40, p = .005). Baseline foster parent distress (b = 0.16, SE = 0.07, p = .03) and number of children in the home (b = ‐0.03, SE = 0.02, p = .029) were associated with late engagement (R2 = .04, F(3, 302) = 4.78, p = .003). These findings provide an opportunity to give targeted recommendations to child welfare agencies that will facilitate the effective dissemination and implementation of the KEEP intervention.
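For readers who want to see the shape of such an analysis, the following is a minimal sketch of an ordinary least squares regression of an engagement score on baseline characteristics; the variable names and simulated values are placeholders, not the KEEP data.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 359  # intervention group size reported above; the data below are simulated
df = pd.DataFrame({
    "child_behavior_problems": rng.poisson(8, n),   # baseline count of foster child behavior problems
    "parent_distress": rng.normal(0, 1, n),         # baseline foster parent distress
    "children_in_home": rng.integers(1, 6, n),      # number of children in the home
})
# Simulated process-oriented engagement score (e.g., homework completion, level of participation).
df["early_engagement"] = 0.01 * df["child_behavior_problems"] + rng.normal(0, 0.5, n)

X = sm.add_constant(df[["child_behavior_problems", "parent_distress", "children_in_home"]])
fit = sm.OLS(df["early_engagement"], X).fit()
print(fit.summary())  # reports b, SE, and p for each predictor plus the model R-squared and F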

Track(s): None specific to this conference

NOTES



15. DETERMINING FACTORS IMPORTANT IN INFLUENCING ASD COMMUNITY STAKEHOLDERS PARTICIPATION IN AN ACADEMIC‐COMMUNITY COLLABORATION

Rosemary Meza, BA

San Diego State University & Child & Adolescent Services Research Center

Contact: Rosemarydmeza@gmail.com

Successful academic‐community collaborations (ACCs) increase communication, cooperation, and trust between researchers and community stakeholders, generate feasible and useful innovations, and help to close the gap between research and community practice. However, the factors influencing an individual’s decision to participate in an ACC are not well understood. ASD community stakeholders, previously contacted to participate in an ACC, completed the Decision to Participate Questionnaire (DPQ). Ten ACC participants and 8 non‐participants completed the DPQ, which asks individuals to rate the importance of items selected a priori in their decision to participate in an ACC or not. Using independent‐samples t‐tests, four items were found to be statistically and meaningfully different between the groups. ACC participants rated networking with other providers (p=.007; effect size (ES)=1.74), the fit of collaboration with agency philosophy (p=.011; ES=1.31), and the opportunity for future training/consultations (p=.034; ES=1.16) as more important in their decision to participate in the ACC than did non‐participants. Non‐participants rated the number of requests to participate in research as more important in their decision (p=0.15; ES=1.48) than did participants. Considering the networking opportunities, collaboration philosophy, opportunities for training, and amount of research requests being made of community stakeholders may aid in building and sustaining successful ACCs.
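The comparison above is a standard independent‐samples design; as a minimal sketch, the snippet below shows how one DPQ item rating could be compared between participants and non‐participants with a t‐test and Cohen's d. The ratings are invented placeholders, not the study data.

import numpy as np
from scipy import stats

# Hypothetical importance ratings for one item (e.g., 1 = not important, 5 = very important).
participants = np.array([5, 4, 5, 4, 5, 4, 5, 5, 4, 5])      # 10 ACC participants
non_participants = np.array([3, 2, 3, 4, 2, 3, 3, 2])        # 8 non-participants

t_stat, p_value = stats.ttest_ind(participants, non_participants)

# Cohen's d using the pooled standard deviation.
n1, n2 = len(participants), len(non_participants)
pooled_sd = np.sqrt(((n1 - 1) * participants.var(ddof=1)
                     + (n2 - 1) * non_participants.var(ddof=1)) / (n1 + n2 - 2))
cohens_d = (participants.mean() - non_participants.mean()) / pooled_sd

print(f"t = {t_stat:.2f}, p = {p_value:.3f}, d = {cohens_d:.2f}")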

Track(s): None specific to this conference

NOTES



16. A MULTI‐LEVEL FRAMEWORK FOR IMPLEMENTATION SCIENCE

Ruben G. Martinez, BA, & Cara C. Lewis, PhD

Indiana University

Contact: rgm@indiana.edu

Over 60 theoretical implementation frameworks exist (Tabak et al, 2012). These frameworks range from heuristic models, or models that detail what constructs are important to measure, to process models, which detail a temporal sequence of implementation. Even so, a gap in the framework literature has been identified. No testable theoretical framework exists that simultaneously delineates the temporal process of an implementation, acknowledges the multiple stakeholder levels ripe for investigation, and situates the constructs in the stage and level within which they are implicated. The author extracted, utilized, and combined aspects from three well‐established theoretical frameworks: the Consolidated Framework for Implementation Research (Damschroder, et al., 2009), Implementation Outcomes (Proctor et al., 2010), and the Stages of Implementation (Fixsen, et al., 2006). The proposed framework will detail not only what constructs to measure at what time, but at what stakeholder level (client, provider, administration, organization) and the relation between constructs at specific points in the implementation. The proposed framework will be populated with instruments identified in the SIRC Instrument Review, resulting in a practical guide for stakeholders. The proposed framework will be presented using two contrasting examples of implementation in mental health. Implications for the field will be discussed.

Track(s): Measurement

NOTES



17. EXPLORING TRAINING THERAPIST USE OF COGNITIVE BEHAVIORAL THERAPY TO THE PREDICTION OF EARLY DEPRESSION SYMPTOM REDUCTION

Taylor Marshall, Brigid Marriott, Sofia Braga, MA, Meredith Boyd, Mark Crossen, BA, & Cara C. Lewis, PhD

Indiana University

Contact: taymarsh@indiana.edu

In an effort to address the well‐documented science‐practice gap, implementation researchers have focused on training doctoral student therapists to deliver empirically supported treatments, such as cognitive behavioral therapy (CBT), that produce rapid early symptom changes for clients. The current study sought to identify components of CBT responsible for early changes in depressive symptoms in order to isolate which core features are ripe for dissemination. Participants were advanced doctoral student therapists (n=6) and clients (n=14) who presented with an Axis I diagnosis at a psychotherapy training clinic. The training model included a weekly didactic session and 1.5 hours of weekly group supervision over one school semester (approximately 16 weeks). Each therapy session was video recorded and coded using the Collaborative Study Psychotherapy Rating Scale to identify use of cognitive therapy elements. The Beck Depression Inventory or Patient Health Questionnaire‐9 was completed by clients at each session. We hypothesize the Behavioral Methods/Homework subscale will be most closely related to early depressive symptom reduction. Results will be presented with respect to implications for the dissemination of CBT in a community mental health setting.

Track(s): None specific to this conference

NOTES



18. COMMUNITY ASSESSMENT & EBP IMPLEMENTATION: ILLNESS MANAGEMENT & RECOVERY (IMR) & INTEGRATED DUAL DISORDER TREATMENT (IDDT)

Shannon Blajeski, MSW

University of Washington School of Social Work

Contact: blajes@uw.edu

Beginning in 2010, The Washington Institute for Mental Health Research & Training, part of the University of Washington Department of Psychiatry & Behavioral Sciences, Division of Public Behavioral Health & Justice Policy, worked to implement pilots of the Illness Management & Recovery (IMR) and Integrated Dual Disorder Treatment (IDDT) models in Washington State. IMR is a manualized, module‐based psychoeducational wellness program, while IDDT is a team‐based community mental health treatment program for adults with severe mental illness and co‐occurring substance use disorders. Both are evidence‐based practices linked to positive outcomes for adults with both schizophrenia‐spectrum and bipolar disorders who have significant functional impairments in the community. With annual fiscal support from the State of Washington Division of Behavioral Health & Recovery, the Institute worked to develop these EBPs at the agency level, including initial conversations, assessment of desire/fit, training, consultation, and fidelity measures.

The poster session will explore these two EBP pilots with a focus on community assessment at the agency and Regional Support Network (RSN) level as a necessary first step in the implementation process. Initial program fidelity data and first‐hand experience will be presented and linked to the level of community assessment that was granted within the scope of the project’s funding. Lessons learned and future ideas will be presented.

Track(s): None specific to this conference

NOTES



19. WHO RECEIVES EVIDENCE‐BASED PSYCHOTHERAPIES? CHARACTERISTICS OF FEMALE VETERANS FROM A PRACTICE SETTING

Andrea DeVito, 1 Cassidy Gutner, 1 Alexandra Dick, MA, 2 Sam Meisel, 1 & Shannon Wiltsey Stirman, PhD 1,2,3

1 Boston University; 2 VA Boston Healthcare System; 3 National Center for PTSD

Contact: adevito@bu.edu

The Department of Veterans Affairs is working to implement the use of Evidence‐Based Psychotherapies (EBPs) in its clinics for returning veterans who seek psychological treatment. A key implementation challenge is estimating the penetration of EBPs into mental health systems. The goals of our study were to determine the proportion of women in a VA clinic who (a) were offered an EBP and (b) engaged in EBPs, and (c) to understand why others were not receiving these treatments. Using a program evaluation tool designed to be feasible for delivery in a clinical practice setting, and including validated symptom, satisfaction, and functioning inventories, we surveyed 165 women Veterans and their providers over 13 months during intake and follow‐up appointments. Of those women, 58 presented for individual psychotherapy and 55 were offered an EBP through the clinic. Clinicians identified 14 individuals who refused treatment, were not judged stable enough for trauma‐focused treatment, or were noncompliant. We found that the most common forms of EBP offered were Cognitive Processing Therapy (CPT) and cognitive behavioral therapy (CBT), with 37.9% of women reporting engaging in one of these treatments after intake. We also found that a significant proportion of women received a different type of treatment not listed on the patient and clinician forms. We compared characteristics of women Veterans to eligibility criteria for EBPs offered in the clinic. We also compared a subset of these instruments to clinical documentation to better understand what occurred in session when they did and did not receive EBPs. The data gathered from our study will, we hope, shed light on EBP penetration in a VA setting, as well as on why some women veterans do not receive EBPs.
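Penetration is often summarized as simple proportions along the care pathway; the sketch below computes such proportions from the counts reported above (165 surveyed, 58 presenting for individual psychotherapy, 55 offered an EBP). The choice of denominators is an assumption made here for illustration, since the abstract reports the 37.9% engagement figure without specifying its denominator.

# Counts taken from the abstract; the denominators used below are illustrative assumptions.
surveyed = 165        # women Veterans surveyed over 13 months
presented = 58        # presented for individual psychotherapy
offered_ebp = 55      # offered an EBP through the clinic

print(f"Presented for psychotherapy, of those surveyed: {presented / surveyed:.1%}")
print(f"Offered an EBP, of those presenting: {offered_ebp / presented:.1%}")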

Track(s): None specific to this conference

NOTES



20. COMPARING SELF, CLINICIAN, & OBSERVER REPORTS OF COGNITIVE PROCESSING THERAPY (CPT) ADHERENCE

Samuel Meisel, 1 Amber Calloway, 2 Alexandra Dick, MA, 3 Andrea DeVito, 1 Ann Rasmusson, 1,3 & Shannon Wiltsey Stirman, PhD 1,3,4

1 Boston University; 2 UMass Boston; 3 National Center for PTSD; 4 VA Boston Healthcare System

Contact: meisels@bu.edu

Most previous research has assessed treatment fidelity using time‐ and cost‐intensive observer ratings. In line with the theme of this year’s conference, the current study looks to offer a new, more efficient way of analyzing fidelity of treatment for CPT for PTSD. The objectives of the current study are to (1) assess the degree to which clinicians, clients, and an objective rater agree on a newly developed CPT adherence measure and (2) explore its psychometric properties. The current study will assess agreement between clients (n>50), clinicians (n>10), and observers (n=3). Reliability of the new measure will be determined through an intraclass correlation coefficient, and criterion‐related validity will be assessed through a comparison of the measure to the CPT Adherence and Competence Observer Rating Scale. We will conduct a preliminary evaluation of the predictive validity of this measure by examining the association between adherence and PTSD symptom change. Validation of the self‐report measure could greatly improve the efficiency of monitoring adherence, a key implementation outcome.
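As a rough illustration of the planned reliability analysis, here is a minimal sketch that computes a one‐way random‐effects intraclass correlation coefficient, ICC(1,1), from a small ratings matrix; the ratings are invented, and the study's actual ICC model may differ.

import numpy as np

# Rows = rated sessions, columns = raters (e.g., client, clinician, observer); values are invented.
ratings = np.array([
    [4, 4, 5],
    [2, 3, 2],
    [5, 5, 4],
    [3, 3, 3],
    [4, 5, 5],
], dtype=float)

n, k = ratings.shape
grand_mean = ratings.mean()
row_means = ratings.mean(axis=1)

# One-way ANOVA mean squares: between-session and within-session variability.
ms_between = k * np.sum((row_means - grand_mean) ** 2) / (n - 1)
ms_within = np.sum((ratings - row_means[:, None]) ** 2) / (n * (k - 1))

icc_1_1 = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
print(f"ICC(1,1) = {icc_1_1:.2f}")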

Track(s): Fidelity

NOTES



21. IMPROVING OUR CAPACITY FOR EVIDENCE‐BASED PTSD TREATMENT: DEVELOPING AN EFFECTIVE<br />

MODEL OF POST‐WORKSHOP CONSULTATION<br />

Meredith S. H. Landy, 1 Kelly McShane, 1 Sheena Bance, 2 Shannon Wiltsey Stirman, PhD, 3 Josh Deloriea, BA, 1<br />

Marta Maslej, 1 & Candice M. Monson, PhD 1<br />

1 Ryerson University; 2 Ontario Institute for Studies in Education (OISE), University of Toronto; 3 VA Boston Healthcare System, National Center for PTSD, & Boston University

Contact: meredith.landy@psych.ryerson.ca<br />

Current best practice in training clinicians in the administration of evidence‐based psychotherapies (EBPs) includes workshop attendance followed by post‐workshop consultation. In spite of data suggesting that consultation is important, little is known about what makes for successful consultation. The aim of this study is to understand how consultation changes the behaviors of clinicians learning to administer Cognitive Processing Therapy, an EBP for Posttraumatic Stress Disorder. We will undertake a qualitative analysis of consultant and clinician behaviors and interactions, and develop and test a theory detailing the context and mechanisms that make for successful consultation. Consistent with the theme of this year’s conference, in this presentation we will describe the challenges we faced in our efforts to develop a model of clinical consultation. Specifically, we will describe how we selected which consultation calls to code, the process of developing the plan for analyzing the data, and how we used these early data to further refine the plan. We will also discuss how we used a realist approach, a paradigm frequently adopted by social scientists, to develop and test a theory describing the context and mechanisms that make for successful consultation. The strengths and limitations of this approach will be discussed.

Track(s): Global Perspectives, Training<br />

NOTES<br />



ADDITIONAL NOTES<br />



RESTAURANTS NEAR HOTEL DECA<br />

Agua Verde Cafe<br />

(206) 545‐8570<br />

1303 NE Boat Street, <strong>Seattle</strong>, WA 98105<br />

Espresso ‐ Monday‐Friday 7:30 am to 2:00 pm<br />

Monday‐Saturday 11:00 am to 9:00 pm<br />

http://www.aguaverde.com/<br />

Bulldog News & Fast Espresso

(206) 632‐6397<br />

4208 University Way NE<br />

Newsstand open daily 8am to 8pm

Weekdays 6:30am to 7pm; Weekends 8am to 7pm<br />

www.bulldognews.com<br />

Chaco Canyon Organic Café<br />

(206) 522‐6966<br />

4757 12th Ave NE

Seattle, WA 98105

http://chacocanyoncafe.com/<br />

Costas Restaurant<br />

(206) 633‐2751<br />

4559 University Way NE<br />

Open daily 7am to 10pm

www.costasontheave.com<br />

Earl's on the Ave<br />

(206) 525 4493<br />

4333 University Way NE<br />

Open daily 11am to 2am

www.earlsuw.com<br />

Pagliacci Pizzeria<br />

(206) 726‐1717<br />

4529 University Way NE<br />

Monday‐Thursday 11am to 11pm; Friday‐Saturday 11am to 12am

www.pagliacci.com<br />

Shultzy's<br />

(206) 548‐9461<br />

4114 University Way NE<br />

Monday‐Saturday 10am to 2am<br />

Sunday 10am to 12am<br />

www.shultzys.com<br />

Which Wich?<br />

(206) 588‐0471<br />

4730 University Way NE #102<br />

Monday‐Thursday 11am to 8pm; Friday‐Saturday 11am to 9pm

Sunday 11am to 8pm<br />

www.whichwich.com<br />

Big Time Brewery<br />

(206) 545‐4509<br />

4133 University Way NE<br />

Monday‐Thursday 11:30am to 12:30am (kitchen closes at 11pm)

Friday‐Saturday 11:30am to 1:30am (kitchen closes at 12am)

www.bigtimebrewery.com<br />

Cafe Allegro<br />

(206) 633‐3030<br />

4214 University Way NE<br />

Monday‐Friday 6am to 10pm<br />

Saturday 7:30am to 10pm; Sunday 8am to 10pm

www.cafeallegro.com<br />

Continental Restaurant & Pastry Shop<br />

(206) 632‐4700<br />

4549 University Way NE<br />

Open daily 7pm to 11pm

Dick's Drive In<br />

(206) 633‐2751<br />

111 NE 45th St<br />

Open daily 10am to 2am

www.ddir.com<br />

Ivar's Salmon House<br />

(206) 632‐0767<br />

401 NE Northlake Way<br />

Monday‐Thursday 11am to 9:30pm<br />

Friday‐Saturday 11am to 10pm; Sunday 9:30am to 9:30pm

www.ivars.com<br />

Qdoba Mexican Grill<br />

(206) 547‐0803

1200 NE 45th St

Monday‐Saturday 10am to 11pm<br />

Sunday 11am to 10pm<br />

www.qdoba.com<br />

Village Sushi<br />

(206) 985‐6870

4741 12th Ave NE

Tuesday‐Friday 11:30am to 2pm and 5pm to 9:30pm; Saturday 12pm to 9:30pm; Sunday 5pm to 9pm

www.villagesushiseattle.com<br />



[Map: Downtown Seattle — Seattle Center (Space Needle, Pacific Science Center, EMP Museum, Chihuly Garden & Glass), the waterfront piers and ferry terminals, Pike Place Market, the Washington State Convention Center, Pioneer Square, and the Chinatown–International District. Legend: Major Attractions, Bus/Light Rail Tunnel Stops, South Lake Union Streetcar, Parks.]


SEATTLE IMPLEMENTATION RESEARCH CONFERENCE (<strong>SIRC</strong>)<br />

MAY 16‐17, 2013<br />

PROGRAM EVALUATION<br />

Please complete this evaluation form and return it to the <strong>SIRC</strong> Registration Desk prior to your departure.<br />

The planning committee appreciates your reactions and suggestions, which will help us develop and improve future conferences.

Thank you!<br />

Answer each item by circling the appropriate number on the following scale. Please provide additional information in the space provided.

Scale: 1 = Strongly Disagree; 2 = Disagree; 3 = Neutral; 4 = Agree; 5 = Strongly Agree

1. Overall, the <strong>Seattle</strong> <strong>Implementation</strong> <strong>Research</strong> <strong>Conference</strong> met my expectations.<br />

1 2 3 4 5<br />

Why or Why Not?<br />

2. The Panel Discussions, Breakout Sessions, and Materials were useful.<br />

1 2 3 4 5<br />

Why or Why Not?<br />

3. I will recommend this conference to my colleagues.<br />

1 2 3 4 5<br />

Why or Why Not?<br />

4. The registration process and conference administration were effective.<br />

1 2 3 4 5<br />

Why or Why Not?<br />

5. Compared to other recent implementation‐focused conferences, SIRC offered a unique experience that made me glad I attended. (Please highlight below what contributed to your response.)

1 2 3 4 5<br />



6. What talks or topics were of particular use for you?<br />

7. What talks or topics were not particularly helpful or did not belong in this conference?<br />

8. What topics would you like to have covered at the next <strong>SIRC</strong> conference?<br />

9. What speakers would you like to hear from at the next <strong>SIRC</strong> conference?<br />

10. Apart from the conferences themselves, what ventures would you recommend SIRC take on in the next 1‐2 years? (If you would be interested in participating in such a venture should we take it on, please include your email address so we can contact you.)

11. Is there anything about the facilities you would like to compliment, or any concerns you would like to raise?

