
effectiveness found in the meta-analysis and the basis for rating them in the SPEP are as follows.

Type of program. The SPEP covers only program types that take a therapeutic approach, as defined in the program categories used in the meta-analysis (e.g., family counseling, mentoring, cognitive-behavioral therapy, vocational training). The relative effectiveness of each program type for reducing recidivism that was found in the statistical analysis was used to categorize program types as having, on average, high, medium, or low effects on recidivism, keeping in mind that even the low program types nonetheless have positive average effects. The total number of points (which represents the proportionate contribution of program type to predicting recidivism effects) is distributed across these categories so that the maximum number of program type points goes to those in the high category, with discounted scores given to program types in the medium and low categories.

To determine which program type a local program represents, and thus what its SPEP score is on that factor, descriptive information about the nature of the services it provides must be examined. That information is compared with the descriptions in a glossary of program types that was developed from the descriptions provided in the corresponding research studies included in the meta-analysis. The local program is then identified with regard to the program type it represents and, depending on whether that program type is classified as having low, medium, or high effectiveness, the corresponding SPEP rating is assigned. If a program does not match any of the program types in the glossary, it means that insufficient research exists for estimating the effectiveness of that type of program.

Many programs involve combinations of services that may represent different program types. In those cases, primary and supplementary services are distinguished and, if the supplementary services are of a different type from the primary service, but of a type shown to be effective in the research, bonus points are awarded for them.

Amount of treatment. Service amount is divided into duration and total contact hours, with the latter receiving somewhat more points in light of its slightly stronger relationship to outcomes. Service duration is assessed as the time (e.g., number of weeks) between the date of service intake and the date of service termination for each juvenile with a closed case who was served by the program over the period of time to which the SPEP is applied (e.g., SPEP ratings might be made annually). Similarly, total contact hours are assessed as the number of hours of direct exposure each juvenile had to substantive program activities. In both cases, these values must be determined from actual service records, not estimated subjectively.

The SPEP ratings for these service dimensions assign a greater or lesser proportion of the points available for amount of service according to the proportion of the juveniles served whose service duration or contact hours reach or exceed specified target values. Those target values are set at the average found in the corresponding research studies for programs of that type. This is based on the assumption that, if the amount of service provided at least reaches the average reported in the respective research studies, the program should attain at least the average effects on recidivism found for that program type.
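The calculation described above can be sketched as follows, assuming for illustration a program type whose research-based targets are 25 weeks of service and 40 contact hours. The target values and available points are hypothetical, and the sketch simply scales the available points by the proportion of closed cases that reach or exceed each target.

```python
# Illustrative sketch only: target values and available points are hypothetical,
# not the actual SPEP thresholds or point allocations for any program type.

from dataclasses import dataclass


@dataclass
class ClosedCase:
    weeks_of_service: float  # weeks from service intake to service termination
    contact_hours: float     # hours of direct exposure to substantive program activities


# Hypothetical targets: the averages reported in the research for this program type.
TARGET_WEEKS = 25
TARGET_HOURS = 40

# Hypothetical points available; contact hours receive somewhat more weight than duration.
DURATION_POINTS = 10
CONTACT_HOUR_POINTS = 15


def amount_of_service_score(cases):
    """Scale the available points by the proportion of closed cases whose
    duration or contact hours reach or exceed the target values."""
    if not cases:
        return 0.0
    n = len(cases)
    met_duration = sum(c.weeks_of_service >= TARGET_WEEKS for c in cases) / n
    met_hours = sum(c.contact_hours >= TARGET_HOURS for c in cases) / n
    return DURATION_POINTS * met_duration + CONTACT_HOUR_POINTS * met_hours


cases = [ClosedCase(30, 45), ClosedCase(20, 50), ClosedCase(26, 30)]
print(round(amount_of_service_score(cases), 1))  # 2/3 met each target -> 16.7
```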

Quality of treatment. The quality of the treatment implementation is the most difficult SPEP factor to rate on the basis of actual program data. This factor, as it is represented in the research studies and analyzed in the meta-analysis, refers to the extent to which the program was implemented as intended for every juvenile recipient. Such information is not generally collected as part of the management information or client-tracking systems used by juvenile justice agencies and may have to be developed in order to support full SPEP ratings. Drawing on the representation of this factor in the research studies, we identify the key dimensions of implementation quality as (1) a written protocol describing the intended service, (2) provision of training on the intended service for those delivering it, (3) a regular procedure for monitoring service to assess whether it is being delivered as intended, and (4) a procedure for taking corrective action when service delivery strays from what is intended. Note that these are not dimensions of clinical quality, which may be important but are not captured well in the research on which the SPEP is based. Rather, these are organizational matters that can be assessed in terms of the operating procedures established and maintained by the provider delivering the program being rated.
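A minimal sketch of how these four organizational dimensions might be turned into a rating appears below. The per-dimension point value is a hypothetical placeholder, and the dimension names are informal labels for the four items listed above.

```python
# Illustrative sketch only: the per-dimension point value is a hypothetical placeholder.

# Informal labels for the four implementation-quality dimensions described above.
QUALITY_DIMENSIONS = (
    "written_protocol",    # a written protocol describing the intended service
    "staff_training",      # training on the intended service for those delivering it
    "service_monitoring",  # a regular procedure for monitoring service delivery
    "corrective_action",   # a procedure for correcting off-protocol service delivery
)

POINTS_PER_DIMENSION = 5  # hypothetical


def quality_of_service_score(provider_procedures):
    """Award points for each implementation-quality dimension the provider's
    operating procedures show to be in place."""
    return sum(POINTS_PER_DIMENSION
               for dim in QUALITY_DIMENSIONS
               if provider_procedures.get(dim, False))


example = {
    "written_protocol": True,
    "staff_training": True,
    "service_monitoring": False,
    "corrective_action": False,
}
print(quality_of_service_score(example))  # 10
```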
