(PDF) - Key Steps in Outcome Management - Urban Institute


Introduction

Like the leaders of private companies, nonprofit executives and managers need to know whether their programs are providing satisfactory results. Outcome management enables organizations to define and use specific indicators to continually measure how well services or programs are leading to the desired results. With this information, managers can better develop budgets, allocate their resources, and improve their services.

A successful outcome management program includes a process to measure outcomes plus the use of that information to help manage and improve services and organizational outcomes.

This is the first in a series of guidebooks from the Urban Institute. It covers the necessary steps for nonprofit organizations that wish to implement outcome management (also known as "managing for results"), and includes guidance on establishing an outcome-oriented measurement process and practices for using the information internally.

Additional guides in the series will examine in more detail some components of outcome management, such as undertaking client surveys. Please check http://www.urban.org to see what guides are currently available.

What Contributes to Success

A nonprofit should have certain characteristics to successfully develop and implement an outcome management process. They include the following:

Leadership support. There must be visible support from top management in the organization.

Commitment of time and staff resources. Initial development and introduction of the process often requires the time and effort of many staff members. Once the process is in place, the effort required typically decreases, as outcome management becomes part of basic program management.

Program stability. Programs that are undergoing major change in mission or personnel are not good candidates for introducing performance measurement. A stable organizational environment is needed.

Computer capability. Even if the organization is very small, the capacity to use computers to record data and prepare reports is very desirable. Hardware and software (even if rudimentary), as well as staff with the necessary expertise, are needed.


About This Guidebook

Funders, including local governments, United Ways, and foundations, increasingly ask for, and sometimes require, reports that include outcome information, in order to demonstrate that their services have value. The United Way of America, with its 1996 manual Measuring Program Outcomes: A Practical Approach, became a major impetus in encouraging the measurement of outcomes. Other national service organizations have also created materials and provided resources to help their affiliates move into this area. This report and the others forthcoming in this series build on these sector-wide efforts and are intended to provide assistance to nonprofits that wish to collect outcome measurement data and use the information to help improve services to clients.

Exhibit 1 lists 13 key steps to implement outcome management that are detailed in this guidebook. The steps are grouped into four sections:

SECTION I. Setting Up includes steps 1 through 3, the initial organizational tasks.

SECTION II. Deciding What and How to Measure includes steps 4 through 9, what is needed to develop the outcome measurement process.

SECTION III. Analyzing the Data includes steps 10 through 12, reviewing and reporting on the information collected.

SECTION IV. Using the Results includes step 13, the potential uses for outcome data, focusing on how outcome information can be used to improve services.

These steps can help create and maintain a meaningful outcome management process. But remember, it is impossible to measure perfectly all ideal outcomes, or even any particular outcome. The goal for nonprofit organizations is to develop, at least roughly, outcome information that can be used by program managers and staff to improve services on a continuing basis. When managers make decisions based on outcomes, the result is more effective programs with increased benefits to clients and the community year after year.


EXHIBIT 1. Key Steps to Implement Outcome Management

SECTION I. Setting Up
Step 1: Select programs to include
Step 2: Determine who will be involved in developing the process and how
Step 3: Establish an overall schedule

SECTION II. Deciding What and How to Measure
Step 4: Identify the program's mission, objectives, and clients
Step 5: Identify the outcomes (results) sought by the program
Step 6: Select specific indicators to measure the outcomes
Step 7: Select data sources and data collection procedures for each indicator
Step 8: Identify key client and service characteristics to be linked to outcome information
Step 9: Pilot test the procedures, make needed modifications, and implement

SECTION III. Analyzing the Data
Step 10: Examine the outcome data
Step 11: Report the findings
Step 12: Seek explanations for unusual or unexpected findings

SECTION IV. Using the Results
Step 13: Use the outcome information to improve services


SECTION I. Setting Up

Step 1: Select Programs to Include

Definitions of "program" may vary. Generally, a program constitutes some set of activities with a defined objective, such as an employment training program, a tutoring or mentoring program, a sports program, or a literacy development program for youth. In some nonprofits, there may be one supervisor per program, while in others, a supervisor will oversee more than one program.

If an organization has more than one group providing the same service, managers can use a common measurement approach for both. For example, if two offices provide adult mental health services, each with its own supervisor, then "adult mental health services" can be treated as one program and use the same outcome measurement process.

Each different program or service will have its own set of ways to measure performance. Thus, each program will need its own outcome indicators and measurement procedures.[1]

A nonprofit can begin its outcome management program by covering only some of its programs or services. This incremental approach could make the initial effort easier and allow a focus on programs whose staff are more interested or that have already started the process. A major disadvantage, however, is that full implementation by the organization will take much longer.

Step 2: Determine Who Will Be Involved in Developing the Process and How

A good approach is to establish an outcome management working group for each program. The working group members work out the details of the outcome management process and oversee its initial implementation.

[1] Alternatively, the organization can identify a set of general service characteristics applicable to all its programs, such as service timeliness and helpfulness. The organization can then survey clients of all its programs to obtain ratings of each of these characteristics. However, to be most useful for improving services, more detailed information, specific to individual programs, is needed.


Working groups that include representatives from the program and also other parts of the organization can provide a rich variety of perspectives on what outcomes should be measured, how outcome information can be collected, and the ways the outcome information can be used. The working group approach can also reduce the likelihood that program staff will feel the outcome process was imposed on them by outsiders (the "not-invented-here" problem).

An alternative is to have a small number of managers and consultants develop the outcome measurement process. This puts less burden on the staff and may speed up the process. However, it is not likely to provide sufficient perspective on what should be measured, and it is less likely to be accepted by program personnel. Staff support is essential for quality data collection and for use of the resulting outcome information.

Exhibit 2 lists the possible composition of the working group. The number of members will vary with the size of the organization, from a few to as many as 12 or more in large nonprofits.

Step 3: Establish an Overall Schedule

It is important to allow enough time to work through the many issues that will arise in the outcome management process. Don't rush; it takes time to do it right!

The working group will almost certainly need many sessions to work through the issues and questions that inevitably arise. Work by one or more of the working group members may also be needed between formal meetings to help resolve specific issues.

Exhibit 3 illustrates a schedule for the development of the system, from the start to use of the outcome data collected. It would need to be adapted for each organization based on such factors as the number and complexity of programs included, the experience and knowledge of the working group members, and the time they are available to work on this process.

Exhibit 4 provides sample agendas for meetings of the working group.


EXHIBIT 2. Potential Members of Outcome Management Working Groups

- Program manager (typically also the facilitator)
- Members of the program staff
- Representatives of at least one other program in the organization
- A measurement "expert" (in the organization, or a volunteer from outside)
- A representative of upper management, to provide overall organizational perspective
- Possibly, one or two former or current clients (another option for obtaining client perspective is to hold focus groups)


EXHIBIT 3. Sample Outcome Management System Development Schedule (24 months)

Project steps (see exhibit 1), in rough chronological order across the 24 months:

Steps 1-3: Initial organizational steps
Step 4: Identify mission and clients
Steps 5-6: Identify what is to be measured
Step 7: Identify data sources and data collection procedures
Step 8: Determine data breakouts, comparisons, and analysis plan
Step 9: Pilot test
Steps 10-12: Analyze the data
Step 13: Use the data (continuing)

Source: Adapted from Performance Measurement: Getting Results (Washington, D.C.: The Urban Institute, 1999).


EXHIBIT 4. Sample Agendas for Working Group Meetings

Meeting One
- Identify the purposes and uses of outcome data.
- Discuss working group mission, objectives, and overall schedule.
- Begin defining program mission, objectives, and customers.
- Plan for focus groups to obtain input from customers.

Meeting Two
- Complete the defining of program mission, objectives, and customers.
- Begin identifying outcomes and efficiency aspects to be tracked.
- Role-play as customers.
- Prepare outcome sequence charts.
- Work out details of customer focus groups (held before meeting three).

Meeting Three
- Review findings from focus groups.
- Finalize list of outcomes to track.
- Begin selecting outcome and efficiency indicators.
- Discuss possible data sources and data collection procedures.

Meeting Four
- Work on identifying outcome indicators, data sources, and basic data collection procedures.
- Identify desirable breakouts of indicator data.
- Plan for development of detailed data collection procedures, such as customer survey questionnaires.

Meeting Five
- Finalize outcome indicators and data sources.
- Review initial cuts at detailed data collection procedures, such as customer survey questionnaires, and develop a plan for analyzing the performance data (including specifying comparison benchmarks).
- Begin planning for pilot-testing of new data collection procedures.

Meeting Six
- Complete plan for the pilot test and initiate the test.

Meetings Seven, Eight, and Nine
- Review progress of pilot test.
- Work out test problems.
- Select performance report formats and identify needed tabulations for the outcome data coming from the pilot test.

Meeting Ten
- Review pilot test outcome data.
- Review results of pilot test procedures.
- Identify and make necessary modifications.

Meeting Eleven
- Begin documenting outcome measurement procedures for the ongoing implementation process.
- Identify specific ways to make the outcome data most useful. (This includes determining frequency of reporting, improving analysis and presentation of the performance information, methods of report dissemination, and developing ways to follow up on findings.)

Meeting Twelve
- Review all aspects of the performance measurement process.
- Finalize documentation.
- Develop a multiyear schedule for full implementation.

Source: Adapted from Performance Measurement: Getting Results (Washington, D.C.: The Urban Institute, 1999).


SECTION II. Deciding What and How to Measure

Step 4: Identify the Program's Mission, Objectives, and Clients

After the outcome management working group has selected programs to measure, the detailed work begins.[2]

First, the program's goals and constituents, customers, or clients should be identified. In many cases, organizations have already formulated mission statements for their programs. If not, they should be prepared and disseminated to all program staff. Typically, the mission and objective statements should take the following form:

To: (List the basic objectives the program seeks, and any major negative consequences the program should avoid)
By: (Describe the service that is provided to achieve those objectives)

For example, the statement for a youth development program might read:

To improve self-esteem, school attendance, and learning, and reduce negative behavior, for youths age 10 to 14,
By providing a variety of social and educational activities in the program's facilities.

A common error in forming the mission/objective statement is to identify the nature of the services to be delivered (the "By" part of the statement) without indicating the objectives of those services (the "To" part of the statement).

Step 5: Identify the Outcomes of the Program

Next, the mission/objective statement should be translated into specific client and program results. These should be as specific as possible, as they become the basis for identifying specific outcome indicators.

[2] For more information, see Measuring Program Outcomes: A Practical Approach (Alexandria, Va.: United Way of America, 1996); Performance Measurement: Getting Results (Washington, D.C.: The Urban Institute, 1999); and Developing Useful Measures and Outcomes (Louisville, Ky.: Community Foundations of America, 2002).


To help identify specific results, the working group can

- Examine outcomes used by similar programs. With the increasing emphasis on outcome measurement, other nonprofits or government agencies may already have identified specific outcomes for similar programs that could be used as a starting point.

- Talk to program staff. Their views about what clients need and want from the service will be a major source of information for selecting outcomes.

- Hold focus groups with current and former clients. Focus groups are facilitated meetings that attempt to solicit information on specific questions from participants. They usually involve a small number of people (perhaps up to 12) in an informal two-hour meeting. At the session, participants would address such questions as "What has changed in your life because of this program?" and "What don't you like about the service?" Responses to such questions help identify characteristics of successful programs and client expectations. For more details on the focus group process, see exhibit 5. A program staff member might attend but should remain silent, to ensure that the participants are not inhibited or influenced.

- Use outcome sequence charts (often called logic models). These diagram the steps from "inputs" (dollars and staff) that lead to "activities" that lead to "outputs." Outputs lead to "intermediate outcomes" that are expected to result in "end outcomes," the ultimate goal of the program. See exhibit 6 for basic definitions and exhibit 7 for an example of an outcome sequence chart.

Using these sources, the working group should be able to identify the program outcomes that should be tracked.

Step 6: Select Specific Indicators to Measure the Outcomes

After program outcomes are defined in general terms, the next step is to translate the statements into specific indicators that will be measured.

For each outcome, the working group needs to identify one or more outcome indicators that could be measured to track progress toward the outcomes. Key criteria are the feasibility and cost of measurement.

Outcome indicators should almost always begin with words such as "The number of . . ." or "The percent of . . ."

Exhibit 8 illustrates the relationship between a program's mission/objectives, outcomes, and outcome indicators.

Although data sources and collection procedures are discussed in step 7, the final selection of the specific outcome indicators should not occur without considering the source of the data and the likely data collection procedure. Thus, steps 6 and 7 overlap.


EXHIBIT 5. Focus Group Steps

- Plan the sessions. Determine the information needed, the categories of participants, the timing, location, and other administrative details of the sessions.

- Select a facilitator who is experienced in conducting focus groups to manage the meeting, and a person to take notes on the information provided by participants.

- Invite 8 to 12 current and former clients to each focus group meeting. Members can be chosen from lists of clients without regard to the statistical representation of the selection. The main selection criteria are that the participants be familiar with the program and be at least somewhat varied in their characteristics.

- Set a maximum of two hours. Hold the meeting in a pleasant and comfortable location. Soft drinks and snacks help provide a relaxed atmosphere.

- Begin with introductions and an overview of the purpose of the meeting.

- The facilitator can then ask the participants three questions: What do you like about the service? What don't you like about the service? In what ways has the service helped you?

- The facilitator can ask these questions in many different ways. The fundamental requirement is to establish an open, nonthreatening environment and to obtain input from each participant.

- The recorder and the facilitator should work together to produce a meeting report. The report should identify outcome-related characteristics raised explicitly or implicitly by one or more participants. The program should consider tracking these characteristics.

Source: Adapted from Performance Measurement: Getting Results.


EXHIBIT 6. Some Basic Definitions

Inputs: Indicate the amount of resources applied; for example, the amount of funds or number of employees. When related to output or outcome information, the combined information will provide indicators of efficiency/productivity.

Outputs: Show the quantity of work activity completed. Outputs are expected to lead to desired outcomes, but by themselves do not tell anything about the outcomes.

Intermediate outcomes: Events or results that are expected to lead to the end outcomes, but are not themselves "ends." These also include characteristics relating to the quality of the service provided to clients, such as accessibility, response time, and overall satisfaction.

End outcomes: The consequences/results of what the program did, not what the program itself did. These are likely to be aspects of the client's condition or behavior that the program seeks to affect.

Benchmarks: Data that can be used as targets for outcomes or as a comparison with observed outcomes.


EXHIBIT 7. Outcome-Sequence Chart: Parental Involvement in Dropout Prevention Program

Activity/Output: (1) School holds parenting classes → (2) Parents attend program → (3) Parents complete program.

Intermediate outcomes: (4) Parents provide more school encouragement to their children → (5a) Children have better attendance; (5b) Children have fewer behavioral problems in school; (5c) Children have improved grades.

End outcomes: (6) Fewer children drop out → (7) Long-term economic well-being is increased.

Source: Adapted from Harry P. Hatry and Mary Kopczynski, Guide to Outcome Measurement for the U.S. Department of Education (Washington, D.C.: Planning and Evaluation Service, U.S. Department of Education, February 1997).


EXHIBIT 8. Linking Outcomes to Outcome Indicators to Data Sources

Example: Foster Home Services

Mission/Objective: Ensure the physical and emotional well-being (safety) and normal development of children, by placing them into stable, safe, high-quality foster homes.

Child safety

- Outcome: Physical well-being. Indicator: Number and percent of children with serious health problems at follow-up. Data sources: Agency records; trained observer ratings.
- Outcome: Repeated abuse and neglect. Indicator: Number and percent of children identified as either abused or neglected by time of follow-up. Data sources: Agency records; trained observer ratings; client survey.
- Outcome: Safety concerns. Indicator: Number and percent of children removed from foster home by time of follow-up for other than permanent placement. Data sources: Agency records; trained observer ratings.

Child development

- Outcome: Physical development. Indicator: Number and percent of children who met normal growth curves and height/weight expectations at time of follow-up. Data sources: Agency records; trained observer ratings.
- Outcome: Social development. Indicator: Number and percent of children who displayed "age-appropriate" social skills at time of follow-up. Data sources: Trained observer ratings; client survey.
- Outcome: Educational development. Indicator: Number and percent of school-age children who were progressing satisfactorily in school at time of follow-up. Data sources: Agency records; client survey.


While the working group should be responsible for selecting indicators, or at least providing specific recommendations, organization management should review them to ensure that the indicators chosen are comprehensive and do not neglect important outcomes.

Step 7: Select Data Sources and Data Collection Procedures for Each Indicator

Advice from someone experienced in outcome measurement or program evaluation is helpful for this step: possibly someone from a local university or community college, a volunteer expert, or a consultant. Volunteer faculty members may have students in a policy analysis, economics, engineering, or statistics course who are willing to help as part of a class project.

Data collection procedures need to be selected carefully so that the program obtains quality information. Basic data sources include the following:

- Organization records, or records from other similar organizations
- Client or customer surveys
- Ratings by trained observers (for example, of a client's ability to perform the activities of daily living, or of environmental conditions)
- Tests of clients (usually of knowledge)
- Observations using equipment, such as chemical tests to track quality of water and air

The first three are likely to be the sources used by most nonprofit organizations and are described more fully below.

Organizational Records

Most programs can use their own records to obtain outcome information for at least some of their outcome indicators. For example, homeless shelters can track the number of overnight uses of their facilities. Food distribution programs can track the number of meals they provide or the number of different people they serve. And most programs should be able to use their records to track their response times from client request to service provision.

Many programs can also use their own records to obtain information on other intermediate outcomes, such as the percentage of youth who completed the scheduled program, and on the performance of youths while in that program, such as their scores on knowledge and attitude tests.

Some programs, such as drug and alcohol abuse programs or family counseling programs, may want to track "recidivism," the number of clients who had to return for further help (indicating the lack of complete success of prior participation).

Some programs seek outcomes that can only be measured by obtaining information from outside sources. For example, youth development programs may seek to


14 <strong>Key</strong> <strong>Steps</strong> <strong>in</strong> <strong>Outcome</strong> <strong>Management</strong>improve youth learn<strong>in</strong>g <strong>in</strong> school and reduce juvenile del<strong>in</strong>quency. To obta<strong>in</strong> <strong>in</strong>formationon the success of these outcomes, the programs need data from schools (suchas grades, test scores, or records of disturbances) and the crim<strong>in</strong>al justice system(such as arrest <strong>in</strong>formation).If measur<strong>in</strong>g an outcome <strong>in</strong>dicator requires data from another organization orgovernment agency, cooperative agreements may be needed, probably with a guaranteeof confidentiality for <strong>in</strong>formation on <strong>in</strong>dividual clients.Requirements for Quality InformationKeep records up-to-date. Enter<strong>in</strong>g client <strong>in</strong>formation directly on the computer is likely toenhance the quality of record keep<strong>in</strong>g.Have clear and complete def<strong>in</strong>itions and <strong>in</strong>structions so data are entered properly.Record <strong>in</strong>formation on important client characteristics and the type and amount ofservice received by <strong>in</strong>dividual clients. This means that outcomes can be related to this<strong>in</strong>formation and used to improve programs.Client or Customer SurveysMost nonprofit health and human service programs will need to obta<strong>in</strong> <strong>in</strong>formationfrom clients on service outcomes. For example, outcome <strong>in</strong>dicators may call forcount<strong>in</strong>g the number and percentage of clients whose condition or behavior was asexpected at a time after services were provided. 
To obtain credible information, a formal client-survey process will probably be necessary.

Organizations should consider seeking the following types of information from surveys:

- Information on the client's behavior and/or condition, and the client's perception of the degree of improvement, since entering the program
- Ratings of service timeliness, accessibility of staff and service facility, condition and safety of facilities
- Ratings of staff competence and courtesy
- Overall satisfaction with the services provided
- Reasons for any poor ratings given
- Suggestions for improvements

Client surveys can provide a wealth of information that can help in improving services, gauging success levels, and planning new programs. They are often an essential part of a successful outcome management process.

Programs should seek feedback from clients as a regular part of their activities. For nonprofits with large numbers of clients, program administrators might choose to draw samples of clients, perhaps seeking feedback from every fifth client.
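The every-fifth-client sampling suggested above is a simple systematic sample. A minimal Python sketch (not part of the guide; the client list and interval are hypothetical):

```python
# Systematic sampling of clients for a feedback survey.
# The interval of 5 matches the "every fifth client" example in the text.

def sample_clients(clients, interval=5, start=0):
    """Return every `interval`-th client, beginning at index `start`."""
    return clients[start::interval]

clients = [f"client-{i:03d}" for i in range(1, 101)]  # 100 hypothetical clients
sampled = sample_clients(clients, interval=5)
print(len(sampled))  # 20 of 100 clients selected for the survey
```

Varying the `start` index from period to period would keep any one subgroup of clients from being systematically skipped.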


Deciding What and How to Measure

Exhibit 9 lists a number of basic tasks most nonprofits will need to take to implement a client survey process. Additional guides in this series provide details on the survey process and following up with former clients.

Nonprofit organizations should seek expert advice when starting up a client survey process, on such issues as the wording of the survey questions.³

Trained Observer Ratings

This data collection procedure can be used for those outcomes that can be assessed by visual ratings of physical conditions. It requires a detailed rating guide that staff or program volunteers use to rate the relevant conditions in a consistent manner over time. That is, different trained observers rating a particular condition should each give the condition approximately the same rating and rate the same condition found in future reporting periods approximately the same.⁴ For example, if a community development program tries to keep neighborhoods attractive and clean, trained observers could be used to assess the physical appearance of the neighborhoods served. After the program identifies the specific characteristics important to local citizens (such as cleanliness of the streets; physical appearance of buildings, houses, and yards; absence of unwanted rodents and insects; and adequacy of traffic and street signs), trained observers can rate these characteristics for inclusion in the outcome reports.
Trained observers might rate the condition of animals and shelter facilities in periodic assessments of animal shelters.

For environmental programs such as water clean-up programs, trained observers might assess water visibility. For hiking and biking trail programs, trained observers might assess various physical conditions such as the condition of the trails and their cleanliness.

A form of trained observer procedures has been used to assess the progress of clients of various rehabilitation and physical care programs, for example, to rate the ability to undertake activities of daily living.

Exhibit 10 lists the basic tasks for implementing trained observer measurements. Program staff or volunteers (perhaps students from nearby high schools or colleges) might be used. For example, the Near Northside Partners Council in Fort Worth, Texas, trained more than a dozen 12- to 18-year-old volunteers to rate neighborhood conditions. When using volunteers, however, it is important to consider that the outcome management process needs to have the trained observer ratings done on a regular basis.

³ For more information, see Customer Surveys for Agency Managers: What Managers Need to Know (Washington, D.C.: The Urban Institute, 1997); and Don A. Dillman, Mail and Internet Surveys: The Tailored Design Method (New York: Wiley, 2000).
Numerous other publications are available on various aspects of surveys, but most of them are focused on more in-depth, one-time surveys and surveys of a community's households, rather than the regular surveying of clients of a particular program.

⁴ For more detailed information, see Performance Measurement: Getting Results; “Use of Ratings by Trained Observers,” chapter 10 in Handbook of Practical Program Evaluation (San Francisco: Jossey-Bass, 1994); and “Rating Street Level Conditions Using Handheld Computers: Reference Guide for Neighborhood Raters,” available at http://www.urban.org.


EXHIBIT 9
Basic Tasks in Implementing a Regular Client Survey Process

1. Identify the specific information needed.
2. Develop the questionnaire, with help from an expert if possible. Each question included should provide information related to one or more of the outcome indicators.
3. Decide when to administer the questionnaire. For example, if a program seeks to help clients sustain an improved condition, then each client might be surveyed 6 or 12 months after completing the service. In other programs, clients could provide outcome information at the time the services are completed. Institutionalized clients might be surveyed periodically, for example, at one-year intervals.
4. Determine how the questionnaire will be administered. Common options include:
   - Mail, if addresses are available and most clients are literate (a low-cost method);
   - Telephone interview, if clients have telephones (a more time-consuming and expensive method);
   - In-person interviews, which will likely be too costly unless the questionnaire can be administered at the program's offices; and
   - A combination of these methods.
   Consider low-cost incentives (free meals, movie tickets, or a chance to win a TV or other items) to improve the response rate.
5. Assign staff to track which clients should be surveyed and when, and to oversee the survey administration and ensure completion, including arranging for second or third mailings or telephone calls to nonrespondents.
6. Enter and tabulate survey information, preferably using a computer to prepare reports.
7. Provide and disseminate easily understood reports to staff and interested outsiders at regular intervals. Usually, it is not appropriate to report on the responses of individual clients (and some programs may provide clients with a guarantee of confidentiality).
8. Encourage use of the survey information to identify program weaknesses and improvement needs.
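Task 5 above, tracking which clients are due for a follow-up survey, can be automated with very little code. An illustrative Python sketch; the client records and the roughly six-month window are hypothetical, not taken from the guide:

```python
# Flag clients who are past the follow-up window and not yet surveyed.
from datetime import date, timedelta

FOLLOW_UP = timedelta(days=182)  # roughly 6 months after service completion

clients = [
    {"name": "A. Lopez", "completed": date(2024, 1, 10), "surveyed": False},
    {"name": "B. Chen",  "completed": date(2024, 6, 1),  "surveyed": False},
    {"name": "C. Osei",  "completed": date(2023, 12, 5), "surveyed": True},
]

def due_for_survey(clients, today):
    """Clients past the follow-up window who have not yet been surveyed."""
    return [c["name"] for c in clients
            if not c["surveyed"] and today - c["completed"] >= FOLLOW_UP]

print(due_for_survey(clients, today=date(2024, 7, 15)))  # ['A. Lopez']
```

The same list, run each week, also tells staff whom to send second or third mailings.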


EXHIBIT 10
Basic Tasks in Implementing Regular Trained Observer Measurements

- Identify what specific information is wanted.
- Develop the trained observer rating guide. Test with a number of raters to make sure the rated items and rating categories are clear.
- Decide when the ratings will be made and how frequently they will be reported during the year.
- Select and train the observers.
- Assign staff to oversee the process, including (a) making sure the ratings are done on schedule; (b) periodically checking the ratings to make sure that each trained observer is still providing accurate ratings; and (c) providing retraining when necessary and training new observers.
- Arrange for the ratings to be entered and tabulated, preferably electronically, using a computer to tabulate that information and prepare reports. (In recent years, many organizations have begun using hand-held computers to record the ratings. The use of such computers can greatly reduce data entry, tabulation, and reporting time.)
- Provide and disseminate regular reports on the findings to staff and interested outside organizations. The reports should be clear and understandable.
- Encourage use of the rating information to identify program weaknesses and improvement needs. (See later section on using outcome information.)
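The periodic accuracy check in exhibit 10 amounts to comparing two observers' ratings of the same sites. A minimal Python sketch; the 1-4 rating scale and the street segments are hypothetical illustrations, not from the guide:

```python
# Compare two observers' ratings of the same items to spot inconsistent raters.

def mean_abs_difference(ratings_a, ratings_b):
    """Average absolute gap between two observers' ratings of the same items."""
    diffs = [abs(a - b) for a, b in zip(ratings_a, ratings_b)]
    return sum(diffs) / len(diffs)

# Ratings of the same five street segments (1 = very clean, 4 = very dirty)
observer_1 = [1, 2, 2, 3, 4]
observer_2 = [1, 2, 3, 3, 4]

gap = mean_abs_difference(observer_1, observer_2)
print(gap)  # 0.2; a small average gap suggests the rating guide is applied consistently
```

A rising gap for one observer over time would signal the need for the retraining that exhibit 10 calls for.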


Step 8: Identify Key Client and Service Characteristics to Link to Outcome Information

The outcome data should not only be totaled for all the program's clients but also be tabulated for specific groups of clients, where appropriate. This information will enable the program manager and staff, as well as upper management, to assess the extent to which the program has been, or has not been, successful in helping individual client groups and to determine if changes are needed. For example, health and human service programs are likely to find it useful to report outcomes by one or more of the following:

- gender
- age group
- race/ethnicity
- income group
- type of disability
- educational level
- housing type

The working group should identify client characteristics that may be related to outcomes. Then, when the data collection procedures are established, the information on the selected characteristics for each client can be obtained. Many of these demographic characteristics would be obtained at intake. Another way to examine outcome data is to link the outcomes to one or more of various service characteristics, such as

- specific office or facility, if more than one
- specific caseworker or clinician (usually should not be reported outside the program)
- key characteristics of the service, such as type (whether group or individual counseling was used, mode of delivery, location, etc.) and amount of service (such as the number of hours or visits)

Such information can help identify staff who need training or technical assistance and help identify successful service procedures that may be transferable to other staff. Exhibit 11 provides an example of the reporting of breakout information by service unit and client race/ethnicity.

Step 9: Pilot Test the Procedures, Make Needed Modifications, and Implement

Any new data collection procedures, such as client surveys or trained observer ratings, should be pilot tested. Inevitably, glitches and problems will occur that need to be corrected. Any data collected in the first round of data collection should be used with caution.


EXHIBIT 11
Report Format: Outcomes by Organizational Unit and Race/Ethnicity

Percent of Clients Whose Adjustment Level Has Improved 12 Months after Intake (a)

                  Family Services   Family Services   Family Services
Race/Ethnicity        Unit 1            Unit 2            Unit 3        All Units
Black                   52              35 (b)              56              47
Hispanic                35                30              54 (c)            39
Other                   58                69                61              63
Total                   48                44                57              50

a. Tables such as these should also identify the number of clients in each cell. If a number is too small, the percentages may not be meaningful.
b. The outcomes for Unit 2 black clients are considerably poorer than those for the other units. Unit 2 should look into this (such as identifying what the other units are doing to achieve their considerably higher success rates), report on their difficulties, and provide recommendations for corrective actions.
c. A substantial proportion of Unit 3 Hispanic clients showed improvement. The program should attempt to find out what is leading to this higher rate of improvement so Units 1 and 2 can use the information. Unit 3 should be congratulated for these results.
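Breakout tables like exhibit 11 can be produced with a short tabulation over client records. A minimal Python sketch, using hypothetical records and an assumed yes/no "improved" outcome per client:

```python
# Tabulate an outcome percentage broken out by a client or service characteristic.
from collections import defaultdict

records = [
    {"unit": "Unit 1", "improved": True},
    {"unit": "Unit 1", "improved": False},
    {"unit": "Unit 2", "improved": True},
    {"unit": "Unit 2", "improved": True},
    {"unit": "Unit 2", "improved": False},
    {"unit": "Unit 3", "improved": True},
]

def percent_improved_by(records, key):
    """Percent of clients with improved outcomes, broken out by `key`."""
    totals, improved = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r[key]] += 1
        improved[r[key]] += r["improved"]
    return {group: round(100 * improved[group] / totals[group]) for group in totals}

print(percent_improved_by(records, "unit"))
# {'Unit 1': 50, 'Unit 2': 67, 'Unit 3': 100}
```

Swapping the `key` argument (for example, to a race/ethnicity field) produces the other breakouts Step 8 describes; a two-key version would yield the full cross-tabulation shown in the exhibit.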


SECTION III
Analyzing the Outcome Information

Step 10: Examine the Outcome Data

Both the aggregate outcome data and the breakouts by client group and service characteristics should be reviewed. The findings can be compared to benchmarks, such as

- the most recent data compared to those of the previous time period (the traditional comparison)
- targets established for each outcome indicator for the time period
- outcomes by characteristic of each identified client group, such as outcomes for males compared to those for females
- outcomes by the different offices or facilities, if appropriate (for meaningful and fair comparisons, sites should be providing approximately the same services to similar types of clients)
- outcomes by individual caseworkers/clinicians (for meaningful and fair comparisons, staff should be providing similar services to similar types of clients)
- outcomes by service delivery (if variations in types or amounts)

Substantial differences between the latest findings and the benchmarks should be identified for later consideration and possible action.

A note on setting targets, the second comparison listed above: For some programs, funders may require such targets as part of the grant application. Even if not required, setting targets is a good management practice. If the outcome indicator is new, for example, one whose data are to be obtained from a new client survey procedure, probably only a reasonable “guesstimate” can be made of what can be achieved.
After the program has gained experience, however, more reasonable targets for the outcome indicators can be set.

When setting outcome targets, programs should consider the following:

- Outcome data from previous reporting periods. This information will likely be the single major factor in establishing targets for the next reporting period.


- Expected budget and staffing levels. Any anticipated changes in funding or staff (including volunteers) that may affect service levels, and thus expected outcomes, should be considered.
- The range of recent outcome values reported among the various customer groups, offices, and/or caseworkers/clinicians. A program pushing for high outcomes might set targets for all groups equal to that achieved for the client group with the best outcomes. A more conservative approach is to use past average values of the indicator.
- External factors. These include anything that might affect the program's ability to achieve outcomes during the next reporting period, for example, predicted changes to the local or national economy or changes in demographic characteristics of the community served.
- Changes in the program's procedures or processes. Planned or recent changes that can be expected to alter future outcome levels should be considered.

Most often, targets are set for one-year periods. However, nonprofits will likely find it useful to establish targets for each reporting period, such as by quarter or month, to provide more timely information and to reflect expected seasonal variations.

Step 11: Report the Findings

Presentation is critical in making the information useful, and it is often neglected. Reports should be clear, understandable, and meaningful.
Even if all other steps in the outcome measurement process have gone well, poorly reported information will discourage use or mislead readers.

Outcome information can be reported in a variety of ways, as shown in exhibits 11, 12, and 13.

Exhibit 11 presents data on one outcome indicator for a given reporting period, broken out by individual service unit and race/ethnicity. Any two characteristics can be readily “cross-tabulated” and reported in such a format.

Exhibit 12 illustrates a format for displaying two types of comparisons for a number of outcome indicators. In this case, the report compares outcomes for two different time periods and reports actual values against targets for each time period.

Exhibit 13 is a one-page format that compares outcomes on one outcome indicator across a number of client demographic characteristics. This format enables program personnel to quickly assess which categories of clients had successful or poor outcomes for the reporting period and the extent to which the outcomes varied by characteristic. This type of examination can assist program staff in identifying potential problems and in planning improvements.


To make the outcome reports user-friendly:

- Include full and clear labels.
- Keep it simple: don't clutter up reports with too much information.
- Highlight what is particularly important (see exhibit 11).
- Use bar charts to illustrate major comparisons.

Step 12: Seek Explanations for Unusual or Unexpected Findings

This explicit step requires the program to seek explanations for unusually high or low outcomes on each indicator. The organization should establish a routine process for examining the findings from the latest outcome reports. For example, exhibit 11 compares outcomes for three family service units on one particular demographic characteristic. The two outlying values circled in the exhibit indicate unusual outcomes that a program manager will likely want to investigate.

To implement a follow-up process, program managers should take actions such as the following:

- Identify those key outcome values that appear to represent unusually positive or negative outcomes.
- Hold “How Are We Doing?” sessions with program staff shortly after each outcome report becomes available. Such sessions are one way to obtain staff input after outcome reports have been issued. Ask staff for their interpretation of why outcomes outperformed or underperformed expectations.
- Form a small working group to examine the reasons for unusual outcome levels.
- Convene a client focus group to explore why those outcomes occurred.
- Recruit an outside organization, such as a local university or community college, to study the reasons behind the outcomes.
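The first action above, screening for unusually positive or negative values, can be mechanized as a simple distance-from-overall check, akin to circling the outlying cells in exhibit 11. A Python sketch; the 10-point threshold is an illustrative choice, not a rule from the guide:

```python
# Flag outcome values that sit unusually far from the overall rate.

def flag_outliers(rates, overall, threshold=10):
    """Return labels whose rates differ from the overall rate by >= threshold points."""
    return [label for label, rate in rates.items()
            if abs(rate - overall) >= threshold]

# Percent of black clients improved, by unit (figures from exhibit 11); overall 47
unit_rates = {"Unit 1": 52, "Unit 2": 35, "Unit 3": 56}
print(flag_outliers(unit_rates, overall=47))  # ['Unit 2']
```

The flagged values are candidates for the “How Are We Doing?” discussion, not verdicts; small cell sizes can make a large-looking gap meaningless, as note (a) to exhibit 11 warns.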


EXHIBIT 12
Report Format: Actual Outcomes versus Targets; Two Time Periods

                                               Last Period               This Period
Outcome Indicator                        Target Actual Difference  Target Actual Difference
Percent of children returned to home
  within 12 months                         35     25     -10         35     30      -5
Percent of children who had more than
  two placements within the past
  12 months                                20     20       0         15     12      -3
Percent of children whose adjustment
  level improved during the past
  12 months                                50     30     -20         50     35     -15
Percent of clients reporting
  satisfaction with their living
  arrangements                             80     70     -10         80     85      +5

This format compares actual outcomes to targets for each of the two periods. Plus (+) indicates actual is better than target; minus (-) indicates actual is worse than target. The actuals for each period can also be compared to each other to show trends.

Source: Performance Measurement: Getting Results.
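The actual-versus-target columns in exhibit 12 are simple differences and can be generated automatically each reporting period. A Python sketch using a subset of the "This Period" figures from the exhibit; the indicator names are shortened for illustration:

```python
# Compute actual-minus-target differences for a set of outcome indicators.

indicators = {
    "Returned home within 12 months":     (35, 30),  # (target, actual)
    "Adjustment level improved":          (50, 35),
    "Satisfied with living arrangements": (80, 85),
}

def differences(indicators):
    """Actual minus target for each indicator; positive means the target was exceeded."""
    return {name: actual - target for name, (target, actual) in indicators.items()}

print(differences(indicators))
```

Note that for indicators where lower is better (such as the repeated-placements indicator in the exhibit), the sign would need to be flipped before reading a positive difference as good news.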


EXHIBIT 13
Report Format: Responses to Client Survey Questions Broken Out by Demographic (or Program) Characteristics

Clients Reporting Change in Their Problem Condition from the Time They Entered Service to Time of Measurement

                                        Extent of Improvement (%)
Respondent             Total
Characteristics      Responding     None    A Little   Somewhat   Considerable
                     (N = 625)    (N = 50)  (N = 83)   (N = 429)    (N = 63)
Sex and race
  White male             265          7        11          71           10
  White female           284          8        13          68           11
  Nonwhite male           36         11        30          53            6
  Nonwhite female         40         10        17          65            8
Age
  18-34                  272         13        16          58           13
  35-49                  125          6        11          75            8
  50-64                  105          3        11          80            5
  65 and over            123          6        11          74            9
Family income
  Less than $20,000      150         12        22          55           11
  $20,000-$49,999        286          9        15          66           10
  $50,000 and over       189          2         6          83            9
Total                    625          8        13          69           10

This format enables program staff to identify what categories of clients show unusually positive or negative results on a particular outcome indicator.

Source: Adapted from Performance Measurement: Getting Results.


SECTION IV
Using the Results

Step 13: Use the Outcome Information to Improve Services

The major impetus for the collection of outcome information for many nonprofit organizations is probably funder requirements. However, regularly collected outcome information has many other vital uses, including the following:

- Identify where improvements are needed: for which client groups, which offices, and which staff. This is likely to be the single most important use for most nonprofits.
- Determine how effectively program modifications improved services. This will help identify whether further changes are needed.
- Motivate staff to continually strive for service improvements.
- Identify offices and staff that have performed well and deserve recognition.
- Encourage innovation.
- Improve fundraising and community relations by including outcome information in communications.

Program Improvement and Evaluation of These Program Changes

Probably the major ultimate use of outcome management is program improvement. The data, such as the information broken out by key client and service characteristics, provide the starting point for identifying problems. The analysis can identify which particular client groups, offices, or caseworkers or clinicians are doing well, or not well. This starts off the dialogue about where problems exist, or about major accomplishments achieved.

The search for explanations will provide information as to why the problems occurred.
And “How Are We Doing?” sessions will help identify causes and action plans.

A plan to improve outcomes based on analysis of the data might include

- who is responsible for each component of the improvement plan
- a list of actions for those responsible


- deadlines for the completion of the actions
- a list of expected results

Subsequent outcome reports should be used to assess the extent to which the improvements have led to the expected results. This will help determine whether further actions are needed.

Motivating Employees

Most nonprofit personnel are clearly dedicated to helping their clients and communities. Outcome management, however, can encourage nonprofit staff (and volunteers) to maintain an explicit, continuous focus on results (outcomes) and program improvement. Outcome measurement is intended to provide an independent and objective assessment of the extent to which clients have been helped and to identify problem areas.

To encourage staff to pursue continuous learning and continuous improvement efforts, the following actions might be considered:

- Provide recognition awards to individuals and/or groups whose outcomes have met or exceeded targets and/or have significantly improved.
- Disseminate each outcome report to all interested employees, and post it in each office, if appropriate. Values that represent greater-than-expected, as well as worse-than-expected, outcomes (such as the items circled in exhibit 11) could be highlighted.
- Hold “How Are We Doing?” sessions with managers and staff using the latest outcome report as a basis for discussion. The group should be asked why certain functions have been going well and what is needed to extend successful strategies to underperforming areas.
For problem areas, the group should attempt to identify why outcomes are not as expected and suggest ways to improve the situation.

In subsequent “How Are We Doing?” sessions, the group should assess whether the actions previously taken have led to the expected improvements.

This process, used constructively rather than as a finger-pointing opportunity, can be one of the best ways to motivate personnel. Such sessions held regularly after the release of each outcome report will also show management's commitment to outcome management and continuous improvement.

Encouraging Innovation

Having outcome information available on a regular basis can also encourage managers and their staff to identify and try innovative approaches to service delivery. For example, a program might test different lengths and numbers of sessions with clients, or use different ways to present information to clients, and use the outcome measurement process and resulting data to compare the old and new service delivery procedures. This would provide strong evidence on the merits of the innovation compared to the previous procedure.

For example, if program staff want to try different ways of presenting information to clients, clients coming into the program could be assigned on an alternating basis to the different presentation approaches. After sufficient time to show results has passed, staff could then compare the outcomes for each of the two groups. This type of experiment would require only a minor modification (to identify which clients received which service approach) to the ongoing outcome measurement process. Subsequent outcome tabulations would identify the outcomes for each group. While not yet widely implemented, this use of outcome information appears to have considerable promise.

Improving Fundraising and Communicating with the Public

Outcome information can also be used in proposals and other materials seeking funding. It is assumed that organizations that can document the beneficial outcomes of programs to clients and the community are more likely to receive support. Such information can also encourage potential clients to use the services of the nonprofit.
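The alternating assignment described in the innovation example is the minor record-keeping modification the text mentions: each incoming client simply gets tagged with one of the two approaches. A Python sketch with hypothetical client identifiers:

```python
# Tag incoming clients with two presentation approaches in strict alternation,
# so later outcome tabulations can be broken out by approach.

def assign_alternating(clients, approaches=("A", "B")):
    """Assign approaches to clients in rotation, in order of arrival."""
    return {client: approaches[i % len(approaches)]
            for i, client in enumerate(clients)}

arrivals = ["c1", "c2", "c3", "c4", "c5", "c6"]
print(assign_alternating(arrivals))
# {'c1': 'A', 'c2': 'B', 'c3': 'A', 'c4': 'B', 'c5': 'A', 'c6': 'B'}
```

The assignment label then becomes one more service characteristic for the breakout tabulations described in Step 8, allowing outcomes for the two groups to be compared directly.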


The Urban Institute
2100 M Street, NW
Washington, DC 20037
Phone: 202.833.7200
Fax: 202.429.0687
E-mail: paffairs@ui.urban.org
http://www.urban.org
