
Measuring Program Outcomes and Impacts

Dr. Ayanava Majumdar

Extension Entomologist, State SARE Coordinator

Gulf Coast Research and Extension Center

8300 State Hwy 104, Fairhope, AL 36532

Email: bugdoctor@auburn.edu

Cell phone: 251-331-8416


Objectives

• Why measure outcomes and impacts

• Types of evaluations (Logic model)

• Major evaluation techniques

• Criteria of Evaluation

• Sources of information

This presentation contains information unique to ACES, and the ideas should be taken as suggestions. You may modify the concepts given in this presentation to suit your needs.


Some critical sources of information

• Kirkpatrick’s Four Levels of Evaluation (1994), Encyclopedia of Education

• Logic Model Development Guide (W.K. Kellogg Foundation, 1998)

• Developing a Logic Model (Taylor-Powell & Renner, 2000, UWEX)


Conventional measurements (old-school evaluations)

• Participant reaction: usefulness of program

• Teaching & facilitation: suggestions for improvement

• Outcomes: what did you learn today?

• Future programming: what do you want to learn more about?

Taylor-Powell & Renner, 2000


Conventional measurements (old-school evaluations)

…focus on outputs and immediate effects (learning)

…no information about action and conditions

…in effect, this was the push strategy (linear TOT)

Taylor-Powell & Renner, 2000


Welcome to the Accountability Era!

What gets measured gets done!

If you don’t measure results, you can’t differentiate success from failure.

If you demonstrate results, you can win public support.

Osborne & Gaebler, 1992


Accountability & evolution of concepts

Outputs. The activities, products, and participation generated through the investment of resources. Goods and services delivered.

Outcomes. Results or changes from the program such as changes in knowledge, awareness, skills, attitudes, opinions, aspirations, behavior, practice, decision-making, policies, social action, condition, or status.

Impact. The social, economic, civic and/or environmental consequences of the program. Impacts tend to be longer-term and so may be equated with goals.

Taylor-Powell & Renner, 2000


Revisiting the LOGIC MODEL

Remember…

• It is not a theory, it is not a reality

• It is only a MODEL…a framework for visualizing relationships

Taylor-Powell & Renner, 2000
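To make the framework concrete, the chain of relationships can be sketched as a simple data structure. The Python sketch below is purely illustrative; the program entries are hypothetical and not taken from ACES materials.

# A minimal sketch of a logic model as a plain Python data structure.
# Every entry below is a hypothetical example, not actual ACES program data.
logic_model = {
    "inputs":   ["staff time", "grant funds", "demonstration plots"],
    "outputs":  ["6 field days held", "250 growers trained"],
    "outcomes": ["increased pest-management knowledge", "new practices adopted"],
    "impact":   ["reduced input costs", "improved farm profitability"],
}

# Walking the model backwards (impact >> outputs >> inputs) mirrors the
# planning advice given in the final tips of this presentation.
for stage in reversed(logic_model):
    print(stage, "->", "; ".join(logic_model[stage]))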


Measuring success is complicated!

• Measuring outcomes, by itself, requires resources!

• Assumptions and external factors create variations in outcomes.


TYPES OF EVALUATION


Types of evaluation

Four basic types:

1. Needs assessment

2. Process evaluation

3. Outcome evaluation

4. Impact evaluation

When to conduct

• First thing we should be doing!

• Establishes priorities

• Tip: don’t forget your camera and writing instruments; print surveys and give respondents time to respond

What questions to ask

• Characteristics of audience

• Needs of audience (prioritize)

• Where do they find information

• Best learning method

• Find barriers to knowledge adoption


Types of evaluation contd.

Four basic types:

1. Needs assessment

2. Process evaluation

3. Outcome evaluation

4. Impact evaluation

When to conduct

• During program implementation

• E.g., quality survey, satisfaction survey, future needs survey

What questions to ask

• Were you satisfied with the delivery methods?

• Was there too much information?

• Are you reaching the targeted audience?


Types of evaluation contd.

Four basic types:

1. Needs assessment

2. Process evaluation

3. Outcome evaluation (Part 1)

4. Impact evaluation

When to conduct

• Measure learning

• During on-site programs: workshops, field days, etc.

What questions to ask

• Short-term change: key words in questions “awareness”, “knowledge”, “opinion”, “motivation”

• Document who is not benefiting (analyze sample and understand biases)


Types of evaluation contd.

Four basic types:

1. Needs assessment

2. Process evaluation

3. Outcome evaluation (Part 2)

4. Impact evaluation

When to conduct

• Measure behavioral changes

• During one-on-one visits, farm visits, or by telephone, mail, email…repeat surveys!

What questions to ask

• Medium-term changes: key words in questions “behavior”, “practices”, “decision”, “action”

• Are you meeting goals? Any unintended outcomes?


Types of evaluation contd.

Four basic types:

1. Needs assessment

2. Process evaluation

3. Outcome evaluation

4. Impact evaluation

When to conduct

• You should have partially achieved this if you did the previous steps right.

What questions to ask

• Long-term changes: change in “condition”

• Separate real impact from “background noise”

• Try to document final consequences: new products, innovations, services, community changes, motivation to act in the absence of the program
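One way to make “separating real impact from background noise” concrete is a simple comparison-group calculation. The sketch below uses entirely hypothetical numbers and is an illustration, not a method prescribed in this presentation.

# Hypothetical average input costs ($/acre) before and after a program,
# for participants and for a comparison group that did not take part.
participants_before, participants_after = 120.0, 90.0
comparison_before, comparison_after = 118.0, 110.0

participant_change = participants_after - participants_before  # -30.0
comparison_change = comparison_after - comparison_before       # -8.0

# The comparison group's change approximates the "background noise";
# subtracting it isolates the change attributable to the program.
program_effect = participant_change - comparison_change
print(f"estimated program effect: {program_effect:+.1f} $/acre")  # -22.0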


Needs assessments

Needs assessment is the first step toward identifying the critical desires and wants of communities. Assessments can be done with paper-based individual surveys or by group assessment techniques.

Pictured: evaluations in progress in Cullman and Andalusia, AL.


Impact assessments

On-farm impact assessments are friendly and easy to conduct with grower cooperators using a carefully developed questionnaire.

Pictured: two vegetable farmers in Baldwin & Mobile Co., AL.


EVALUATION TECHNIQUES


Evaluation techniques

• Survey: collect standardized information; may be mailed, done on-site, or via structured interviews (N, P)

N = Needs assess., P = Process eval., O = Outcome eval., I = Impact eval.

Taylor-Powell, 2002
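As an illustration of what “standardized information” allows, here is a minimal sketch of tallying survey results; the 5-point scale and the response values are hypothetical.

from collections import Counter

# Hypothetical responses on a 5-point usefulness scale
# (1 = not useful, 5 = very useful).
responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

tally = Counter(responses)
n = len(responses)
for rating in sorted(tally):
    print(f"rating {rating}: {tally[rating]} of {n} ({tally[rating] / n:.0%})")
print(f"mean rating: {sum(responses) / n:.1f}")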


Evaluation techniques (contd.)

• Case study: in-depth examination of particular groups or individuals (O, I)

• Interviews: face-to-face interaction, conversational, one-on-one or small groups (P, I)


Taylor-Powell, 2002


Evaluation techniques (contd.)

• Group assessment: use of nominal techniques like focus groups, brainstorming, community forum (N)


Taylor-Powell, 2002


Evaluation techniques (contd.)

• Testimonials: individual statements by people indicating personal reactions (O, I)


Taylor-Powell, 2002


Other techniques…

• Expert/peer review: examination by a review committee, Delphi method (“indicator”, I)

• Portfolio reviews: collection and presentation of materials and samples of work that indicate breadth of program (“indicator”)


Taylor-Powell, 2002


“INDICATORS” of Success

[Figure: indicators of success arranged along a continuum from relatively EASY to relatively DIFFICULT to measure]

Indicators can be qualitative or quantitative.


Evaluation techniques (contd.)

• Tests: assess knowledge, skills, performance, e.g., pre-test & post-test (P, O)

• Success or problem stories: narrative account by participants about adoption of new practices (“indicator”, N, I)


Taylor-Powell, 2002
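A minimal sketch of analyzing a pre-test & post-test pair to document short-term learning gains; the scores below are hypothetical, and the paired t statistic is one common (assumed, not source-specified) way to summarize the gain.

import math
from statistics import mean, stdev

# Hypothetical matched pre-test and post-test scores (%) for the same participants.
pre  = [55, 60, 48, 70, 62, 58, 66, 50]
post = [72, 75, 60, 85, 70, 74, 80, 65]

gains = [b - a for a, b in zip(pre, post)]
n = len(gains)

# Paired t statistic: mean gain divided by its standard error.
t = mean(gains) / (stdev(gains) / math.sqrt(n))
print(f"mean gain: {mean(gains):.1f} points across {n} participants (t = {t:.2f})")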


Evaluation Criteria

To be used when making an “Evaluation Plan”!


Four evaluation criteria

• Utility:

– Goal: how useful is your program evaluation to you & your audience

– Keep the following in mind:

• State purpose clearly

• Consider your audience

• Communicate findings & relevance of findings

Boyd, 2002


Evaluation criteria contd.

• Feasibility:

– Goal: how practical is your assessment technique

– Keep the following in mind:

• Keep evaluation practical

• Calculate the cost:benefit ratio

• Use appropriate evaluation technique/s

Boyd, 2002
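For the cost:benefit step, a minimal worked example; all dollar figures and line items below are hypothetical.

# Hypothetical evaluation costs and expected benefits, in dollars.
costs = {"staff time": 1200.0, "printing & postage": 300.0, "travel": 500.0}
benefits = {"documented adoption gains": 4000.0, "support for grant renewal": 2000.0}

total_cost = sum(costs.values())        # 2000.0
total_benefit = sum(benefits.values())  # 6000.0

# A ratio above 1.0 suggests the evaluation effort is worth its cost.
print(f"benefit:cost ratio = {total_benefit / total_cost:.1f}")  # 3.0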


Evaluation criteria contd.

• Appropriateness:

– Goal: how appropriate is your program evaluation for those involved

– Keep the following in mind:

• Respect people and their rights

• Use appropriate choice statements

• Disclose findings properly

Boyd, 2002


Evaluation criteria contd.

• Accuracy:

– Goal: how accurate is your program evaluation to you & your audience

– Keep the following in mind:

• Design repeatable surveys

• Use appropriate analyses

• Draw justifiable conclusions

Boyd, 2002


Final tips on program evaluations

• Consult a specialist in the planning phase

• Think backwards in the LOGIC model (impact >> output >> input) & allocate resources

• Think about “indicators” of success

• If you conduct surveys, allocate time to respond (don’t rush)

• Publicize your programs >> create a “pull” system >> more success


Thank you for listening patiently!

This presentation is archived at the Ext. Conference website: http://tinyurl.com/ydar9up

QUESTIONS FOR Dr. A
