
PUBLIC PROGRAM EVALUATION

PAD5327, Spring 2013

Prof. Fran Berry

Class Hours: Wednesday 5:30-8:15 p.m.

Office Hours: Tuesday 3:00-5:00 p.m. & Wednesday 3:00-5:00 p.m., or by appointment

Office: 003 Bellamy Building; office phone 644-7603; fax 644-7617; cell 443-0061

E-mail: fberry@fsu.edu

I. COURSE OBJECTIVES

This class is designed for the in-service and pre-service professional public manager and policy analyst, as well as for doctoral students who will teach program evaluation. The course is one of the requirements for MPA students taking the Policy Analysis concentration. At the end of the semester, each student in the course should be able to meet each of the following seven learning objectives:

Learning Objective One: Be able to describe the different types of program evaluations, and select and apply each type of evaluation for an appropriate purpose in the program environment in which evaluations are conducted;

Learning Objective Two: Know how to identify stakeholders for program evaluations and engage them in useful, economical ways, as well as understand the roles of evaluators in making evaluations useful and relevant to the stakeholders;

Learning Objective Three: Demonstrate familiarity with the primary types of research designs used in program evaluations;

Learning Objective Four: Introduce the practitioner to the primary methods (both quantitative and qualitative) for conducting evaluations, both to enable practitioners to conduct sound program evaluations and to critique evaluations they oversee or read;

Learning Objective Five: Develop and demonstrate skills in constructing logic models and program performance measures, collecting data, and writing succinct reports on evaluations;

Learning Objective Six: Discuss the actual and ideal roles of the evaluator in the executive and legislative policy processes, including potential conflicts of interest and ethical issues evaluators face; and

Learning Objective Seven: Learn how to successfully manage a program evaluation project to promote full utilization of the evaluation in the policy process, and gain the experience of actually conducting an evaluation of a public program.


II. COURSE REQUIREMENTS

Grading:

Evaluation Project: 30%
Short Project One: 15%
Short Project Two: 15%
Mid-term Exam: 30%
Class Participation: 10%

The class will be interactive, so it is important that you complete the readings for each class session and come prepared to discuss the material. Students will be asked to lead a discussion, group facilitation, or case-study review on a topic related to the overall class focus, mostly using materials I provide. I will assign additional short evaluations or cases throughout the semester that are not posted on the syllabus. There will be two short written projects, detailed in a handout; each will consist of a short written report in a briefing format or a critical review of one type of evaluation theory or approach. There will be an in-class mid-term exam, consisting of short-answer and/or essay questions or evaluation critiques, that will integrate major issues from the course. The major written product for the class is an evaluation project design, which will be described in detail in a written handout and will count for 30% of your class grade. You will give brief presentations on your evaluation project in class during April. Class members (in groups of two) will also be asked to make presentations of about 15 minutes on a topic related to the night's readings and themes, choosing from a list compiled by Dr. Berry. The topics, readings, and a sign-up sheet for the semester will be distributed early in the semester.

Please let me know if you will not be in class, as I do keep track of attendance. The student honor code is enforced in this class; if you are unfamiliar with it, please refer to the Student Handbook. I expect the short written projects and the mid-term exam to be your own work alone, completed without help from any other person.

If you see materials in the popular media or in your workplace on program evaluation that you think are pertinent, please bring them in to share with me and the class.

Note: For items submitted to me, please no binders; a staple in the top left-hand corner is very functional. Please also send me an electronic version of each hard-copy assignment you submit. Even with great care, sometimes things get lost.

ACADEMIC CONDUCT

Students are expected to uphold the Academic Honor Code published in The Florida State University Bulletin and the Student Handbook. The Academic Honor System of The Florida State University is based on the premise that each student has the responsibility (1) to uphold the highest standards of academic integrity in the student's own work, (2) to refuse to tolerate violations of academic integrity in the university community, and (3) to foster a high sense of integrity and social responsibility on the part of the university community. Please see the following website for a complete explanation of the Academic Honor Code: http://www.fsu.edu/Books/Student-Handbook/codes/honor.html

Academic dishonesty includes, but is not limited to:

Plagiarism: quoting or paraphrasing the ideas or opinions of others without appropriate attribution in text citations and a reference list. This includes books, journal articles, conference presentations, published or unpublished papers, and web-based materials.

Fraud: submitting work that was not prepared by you, or which you have previously submitted for another class.

Cheating: giving help to other students, or asking them for it, on an examination.

The consequence of academic dishonesty is a grade of F on the assignment in question.

AMERICANS WITH DISABILITIES ACT

Students with disabilities needing academic accommodation should:

(1) register with and provide documentation to the Student Disability Resource Center; and
(2) bring a letter to the instructor indicating the need for accommodation and what type.

This should be done by the second week of class. This syllabus and other class materials are available in alternative format upon request. For more information about services available to FSU students with disabilities, contact the:

Student Disability Resource Center
97 Woodward Avenue, South
108 Student Services Building
Florida State University
Tallahassee, FL 32306-4167
(850) 644-9566 (voice)
(850) 644-8504 (TDD)
sdrc@admin.fsu.edu
http://www.disabilitycenter.fsu.edu/

SYLLABUS CHANGE POLICY: This syllabus is a guide for the course and is subject to change with advance notice. Additional readings will be added in the future.

III. COURSE TEXTS

The following text is required reading and can be purchased at the University Bookstore, Bill's Bookstore, or online.

Evaluation in Organizations: A Systematic Approach to Enhancing Learning, Performance and Change by Darlene Russ-Eft and Hallie Preskill, Basic Books, New York, NY, second edition, 2009.

Optional as background materials, but not required for the class:

James Sanders, ed., The Program Evaluation Standards, third edition, Sage Publications, Thousand Oaks, CA, 2010.

Handbook of Practical Program Evaluation (Jossey-Bass Nonprofit & Public Management Series) by Joseph S. Wholey, Harry P. Hatry, and Kathryn E. Newcomer, third edition, 2010.

An excellent resource on how to conduct community-based evaluations is the Community Toolbox on the University of Kansas website, which has a 46-chapter handbook on how to promote community health through evaluation and analysis. The URL is: http://ctb.ku.edu/en/tablecontents/index.aspx

IV. CLASS SCHEDULE: TOPICS AND ASSIGNED READINGS

January 9th: INTRODUCTION AND COURSE OVERVIEW

January 16th: History and Where Evaluations Are Used; Standards for Evaluation; Types and Purposes of Evaluation; Designing Useful Evaluations

Russ-Eft and Preskill, Evaluation in Organizations, Chapter 1, Defining Evaluation, and Chapter 2, The Evolution of Evaluation

Topics to cover: Gain a brief overview of how program evaluations have developed over time and where they have been practiced; understand the different purposes for conducting program evaluations, what questions are asked, and how this can impact the design and implementation of an evaluation.

January 23rd: The Principles and Practice of Causal Inference; Using Logic Models; Performance Measurement

**Logic Models from the Community Toolbox, University of Kansas

Russ-Eft and Preskill, Evaluation in Organizations, Chapter 5, Focusing the Evaluation

Topics to cover: How to establish causal inference and why it is important in an evaluation research design; understanding logic models and how to develop them to illustrate the causal linkages among activities, sub-objectives or short-term outcomes, and longer-term outcomes; should all evaluations be causally directed? Understanding how to develop the evaluation's rationale and purpose, and developing logic models that will be useful to stakeholders and help direct the measures used for the evaluation.
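To make the logic-model idea concrete, here is a minimal, hypothetical illustration (a generic job-training program, not drawn from the course materials); each element is assumed to be causally linked to the next:

Inputs (staff, funding, curriculum) -> Activities (weekly job-readiness workshops) -> Outputs (120 clients complete training) -> Short-term outcomes (clients gain job-search and interview skills) -> Long-term outcomes (higher employment and earnings among clients)

Reading the chain from right to left is a useful check: ask whether each outcome plausibly follows from the element before it, and what measure would show that it did.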


January 30th: Working with the Evaluation's Stakeholders and Clients; Roles of the Evaluator

**Wholey, Hatry and Newcomer, Chapter 2, Analyzing and Engaging Stakeholders

Topics to cover: Identifying and engaging stakeholders of the program you are evaluating is a necessary and important part of the program evaluation process, both to get their views on how the program is operating and to find out what information and criteria they believe the program should be assessed by. Considering the diverse roles that evaluators can play in an evaluation, and the skills and professional duties that those roles require. Considering choices about what type of evaluator you want to be.

February 6th: Selecting a Research Design: Process/Implementation and Formative Evaluations; Time-Series Evaluations

Russ-Eft and Preskill, Evaluation in Organizations, Chapter 6, Selecting an Evaluation Design

**Tim Weston, "Formative Evaluation for Implementation" (BB)

**U.S. Government Accountability Office, Medicaid Expansion: States' Implementation of the Patient Protection and Affordable Care Act, August 2012.

Topics to cover: Research designs are the logical method of obtaining data and using analysis to determine the findings. Everyone should have a good familiarity with the various research designs available to the program evaluator, and with the logic of getting comparative data for a control or comparison group, as well as gathering data before and after treatment, if that is possible. Assessing whether a program or treatment is implemented before/during your evaluation; understanding the impact that program implementation has on program evaluation; learning the basic elements of how to conduct a process evaluation and an evaluability assessment.

February 13th: Research Designs: Experimental Designs and Randomized Experiments

**Ann Solberg, "Community Posthospital Follow-up Services," Evaluation Review, vol. 7 (1), Feb. 1983. (BB)

**Jay Greene et al., "School Choice in Milwaukee: A Randomized Experiment," from Evaluation in Practice by Bingham and Felbinger, 2009. (BB)

Topics to cover: Properties of experiments; the logic of experimental design, including the importance of random assignment; threats to internal validity; external validity; participation in programs, including bias and program coverage.
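As a pointer to why random assignment matters (a standard result, not specific to the assigned readings): when units are randomly assigned, the simple difference in mean outcomes,

\hat{\tau} = \bar{Y}_{treated} - \bar{Y}_{control},

is an unbiased estimate of the program's average effect, because randomization makes the two groups comparable in expectation on both observed and unobserved characteristics. Threats to internal validity are, in effect, the ways this comparability can break down (attrition, crossover, contamination).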


February 20th: Getting Customer/Client Input through Surveys; Sampling for Surveys

Russ-Eft and Preskill, Evaluation in Organizations, Chapter 10, Surveys and Questionnaires; Chapter 12, Sampling

Topics to cover: How to construct and implement effective surveys; sampling methods and when and why to use them.
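As background (a standard textbook formula, not drawn from the assigned chapters), the sample size needed to estimate a proportion from a simple random sample is often approximated as

n = z^2 p(1 - p) / e^2

where z is the confidence-level multiplier, p the assumed proportion, and e the desired margin of error. For example, with 95% confidence (z = 1.96), a conservative p = 0.5, and e = 0.05, n = (1.96^2)(0.25)/(0.0025) ≈ 384 respondents, which is why sample sizes near 400 appear so often in survey work.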

February 20th: First Project Due

February 27th: Quasi-Experimental Designs: Interrupted Time Series; Developing Measures and Collecting Data

Russ-Eft and Preskill, Evaluation in Organizations, Chapter 7, Choosing Data Collection Methods; Chapter 8, Archival Data; Chapter 9, Observation

**Ken Meier, "Executive Reorganization of Government: Impact on Employment and Expenditures," American Journal of Political Science, 1980.

Optional for short presentations:

Wholey, Hatry and Newcomer: Chapter 13, Using Trained Observer Ratings; Chapter 14, Collecting Data in the Field; Chapter 15, Using the Internet; Chapter 16, Focus Group Interviewing; and Chapter 17, Using Stories in Evaluation

Optional:

Kickham, Kenneth and David Ford, "Are State Marriage Initiatives Having an Effect? An Initial Exploration of the Impact on Divorce and Childhood Poverty Rates," in Public Administration Review, September/October 2009, pages 846-854. (BB)

Topics to cover: Quasi-experimental designs, and how interrupted time series can be used to rule out alternative explanations. How to collect data from multiple sources. Focus groups and field data are also widely used for many purposes, to get in-depth views of people or clients, or data from client files.
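For orientation (a common formulation, not taken from the assigned readings), an interrupted time-series analysis is often written as a segmented regression:

Y_t = \beta_0 + \beta_1 t + \beta_2 D_t + \beta_3 (t - t_0) D_t + \epsilon_t

where t indexes time, D_t equals 1 after the intervention at time t_0 and 0 before, \beta_2 captures the immediate change in level, and \beta_3 the change in trend. Ruling out alternative explanations amounts to showing that the pre-intervention trend (\beta_1) alone could not have produced the post-intervention pattern.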

March 6th: Analyzing Data: Quantitative and Qualitative Approaches

Russ-Eft and Preskill, Evaluation in Organizations, Chapter 11, Individual and Focus Group Interviews; Chapter 13, Analyzing Evaluation Data

Topics to cover: A broad overview of the most frequently used methods for data analysis and interpretation within the program evaluation field, and the issues we consider in choosing the best methods of data analysis. A brief overview of the worldview behind qualitative approaches to program evaluation and how they differ from quantitative approaches.


March 13th: NO CLASS - SPRING BREAK

March 15: Second Project Due

March 20th: Linking Performance Measurement and Management to Improving Programs, and Gathering Data for Ongoing Program Assessment

**Wholey, Hatry and Newcomer, Chapter 5, Performance Measurement (BB)

**Robert Kaplan, Presentation on the Balanced Scorecard: Performance Management in Public Sector Organizations (BB)

**Summary Information on the Sterling Award Criteria (BB)

**OPPAGA (December 2003), State Faces Challenges to Improving Community Public Health in Florida (No. 03-71), and the four-page update from June 2006, "Steps Have Been Taken to Improve Community Public Health Infrastructure Throughout Florida" (BB)

"Performance Measurement and Benchmarking," Chapter 4 from Evaluation in Practice by Bingham and Felbinger (BB)

Topics to cover: Using data collected for planning, budgeting, and annual reports for program assessment and evaluation, including the frameworks of the Balanced Scorecard and the Sterling Criteria. How can managers integrate program evaluation into other ongoing management duties?

March 27th: Mid-term Exam (in-class)

April 3rd: Analyzing Data and Communicating Your Findings; The Politics and Ethics of Evaluations

Russ-Eft and Preskill, Evaluation in Organizations, Chapter 4

Russ-Eft and Preskill, Evaluation in Organizations, Chapter 13, Analyzing Evaluation Data, and Chapter 14, Communicating and Reporting Evaluation Activities and Findings

**Program Evaluation Case

Topics to cover: A broad overview of the most frequently used methods for data analysis and interpretation within the program evaluation field.


April 10th: Managing Evaluations; Institutional Review Boards (IRBs); The Request for Proposal (RFP); Implementing and Evaluating an Evaluation

Russ-Eft and Preskill, Evaluation in Organizations, Chapter 15, Planning, Managing and Budgeting the Evaluation; Chapter 16, Evaluating the Evaluation; and Chapter 17, Strategies for Implementing Evaluations in Organizations

**An evaluation proposal (RFP) will be assigned to read

Topics to cover: Basic introduction to writing an evaluation proposal; sources of information for tracking evaluation requests for proposals (RFPs); roles for contract management; Gantt charts and project management; developing an evaluation budget.

April 17th: Cost-Benefit and Cost-Effectiveness Evaluations

**Bingham and Felbinger, Chapters 13 and 14 (BB)

**FL OPPAGA, Intermediate Sanctions for Non-Violent Offenders Could Produce Savings, March 2010.

The Program Evaluation Standards on cost-benefit analysis (BB)

Topics to cover: The role of cost-benefit and cost-effectiveness designs in policy analysis and program evaluation; issues to consider in conducting cost-benefit and cost-effectiveness designs, and the methodology for each.
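For orientation (standard definitions, not specific to the assigned readings): cost-benefit analysis compares a program's discounted benefits and costs, commonly summarized as net present value,

NPV = \sum_{t=0}^{T} (B_t - C_t) / (1 + r)^t

where B_t and C_t are the benefits and costs in year t and r is the discount rate; a program with NPV > 0 returns more than it costs in present-value terms. Cost-effectiveness analysis instead divides total cost by units of a single outcome (for example, dollars per offender diverted from prison), which is useful when benefits are difficult to monetize.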

April 24th: Evaluation Projects Presented in Class; Evaluation Project Paper DUE

April 29-May 3, 2013: Final Exam Week (NO FINAL EXAM FOR THIS CLASS)

May 3: Graduation

Websites of Evaluation Materials and Resources

American Evaluation Association (AEA), Magnolia, AR, http://www.eval.org. AEA is an international professional association of evaluators devoted to the application and exploration of program evaluation, personnel evaluation, technology, and many other forms of evaluation.

Campbell-Kibler Associates, Inc., Groton, MA, http://www.campbell-kibler.com. Campbell-Kibler Associates is a private evaluation consulting firm with free resources available on its website.

Horizon Research, Inc., Chapel Hill, NC, http://www.horizon-research.com. A private evaluation consulting firm with free resources available on its website.

The Evaluation Center at Western Michigan University, Kalamazoo, MI, http://www.wmich.edu/evalctr/index.html. The Evaluation Center's mission is to provide national and international leadership for advancing the theory and practice of program, personnel, and student/constituent evaluation, as applied primarily to education and human services.

Survey Suite at the University of Virginia, http://intercom.virginia.edu/surveysuite. This site allows you to develop online surveys for your evaluation, gives you secure access to allow participants to complete surveys, and allows you to download the data as a statistical summary and/or a Microsoft Excel file.

Web Publications

Bond, S. L., Boyd, S. E., & Rapp, K. A. (1997). Taking Stock: A Practical Guide to Evaluating Your Own Programs. Chapel Hill, NC: Horizon Research. [ www.horizon-research.com/publications/stock.pdf ]

Frechtling, J. & Sharp, L. (Eds.) (1997). User-Friendly Handbook for Mixed Methods Evaluation. Washington, DC: National Science Foundation. [ www.ehr.nsf.gov/ehr/rec/pubs/nsf97-153/start.htm ]

Stevens, F., Lawrenz, L. & Sharp, L. (1993). User-Friendly Handbook for Project Evaluation: Science, Mathematics, Engineering and Technology Education. Washington, DC: National Science Foundation. [ http://www.nsf.gov/pubs/2002/nsf02057/start.htm ]

Print Publications

Baker, T. L. (1998). Doing Social Research, 3rd Ed. New York, NY: McGraw Hill.

Berdie, D. R. & Anderson, J. F. (1974). Questionnaires: Design and Use. New Jersey: Scarecrow Press.

Berk, R. A. & Rossi, P. H. (1990). Thinking About Program Evaluation. Newbury Park, CA: Sage.

Chelimsky, E. & Shadish, W. R. (1997). Evaluation for the 21st Century: A Handbook. Newbury Park, CA: Sage.

Converse, J. M. & Presser, S. (1986). Survey Questions: Handcrafting the Standardized Questionnaire. Sage University Papers 63: Quantitative Applications in the Social Sciences. Beverly Hills, CA: Sage.

Davis, B. G. & Humphreys, S. (1993). Evaluating Intervention Programs. New York, NY: Teachers College Press.

DeVellis, R. F. (1991). Scale Development: Theory and Applications. Newbury Park, CA: Sage.

Fink, A. & Kosecoff, J. (1985). How to Conduct Surveys: A Step-by-Step Guide. Newbury Park, CA: Sage.

Fitz-Gibbon, C. T. & Morris, L. L. (1987). How to Analyze Data. Newbury Park, CA: Sage.

Fitz-Gibbon, C. T. & Morris, L. L. (1987). How to Design a Program Evaluation. Newbury Park, CA: Sage.

Freeman, H. E., Rossi, P. H. & Sandefur, G. D. (1993). Workbook for Evaluation 5: A Systematic Approach. Newbury Park, CA: Sage.

Glebocki, J. & Lancaster, D. (1984). In Search of the Wild Hypothesis. Arvada, CO: Anderson-Bell.

Gredler, M. E. (1995). Program Evaluation. Englewood Cliffs, NJ: Prentice Hall.

Greenbaum, T. L. (1997). The Handbook for Focus Group Research. New York, NY: MacMillan.

Guba, E. G. & Lincoln, Y. S. (1989). Fourth Generation Evaluation. Newbury Park, CA: Sage.

Henderson, M. E. & Morris, L. L. (1987). How to Measure Attitudes. Newbury Park, CA: Sage.

Herman, J. L., Morris, L. L. & Fitz-Gibbon, C. T. (1987). Evaluator's Handbook. Newbury Park, CA: Sage.

King, J. A., Morris, L. L. & Fitz-Gibbon, C. T. (1987). How to Assess Program Implementation. Newbury Park, CA: Sage.

Klecka, W. R. (1980). Discriminant Analysis. Sage University Papers 19: Quantitative Applications in the Social Sciences. Beverly Hills, CA: Sage.

Krueger, R. A. (1998). Analyzing and Reporting Focus Group Results. Newbury Park, CA: Sage.

Krueger, R. A. (1998). Developing Questions for Focus Groups, Vol. 3. Newbury Park, CA: Sage.

Krueger, R. A. & Casey, M. A. (2000). Focus Groups: A Practical Guide to Applied Research. Newbury Park, CA: Sage.

Lee, E. S., Forthofer, R. N. & Lorimore, R. J. (1989). Analyzing Complex Survey Data. Sage University Papers 71: Quantitative Applications in the Social Sciences. Beverly Hills, CA: Sage.

Lofland, J. & Lofland, L. H. (1995). Analyzing Social Settings: A Guide to Qualitative Observation and Analysis, 3rd Ed. Belmont, CA: Wadsworth.

Miller, D. C. (1991). Handbook of Research Design and Social Measurement, 5th Ed. Newbury Park, CA: Sage.

Mohr, L. B. (1990). Understanding Significance Testing. Sage University Papers 73: Quantitative Applications in the Social Sciences. Beverly Hills, CA: Sage.

Morris, L. L., Fitz-Gibbon, C. T. & Freeman, M. E. (1987). How to Communicate Evaluation Findings. Newbury Park, CA: Sage.

Osterlind, S. J. (1983). Test Item Bias. Sage University Papers 30: Quantitative Applications in the Social Sciences. Beverly Hills, CA: Sage.

Patton, M. Q. (1990). Qualitative Evaluation and Research Methods, 2nd Ed. Newbury Park, CA: Sage.

Patton, M. Q. (1987). How to Use Qualitative Methods in Evaluation. Newbury Park, CA: Sage.

Rossi, P. H., Freeman, H. E. & Lipsey, M. W. (1999). Evaluation: A Systematic Approach, 6th Ed. Newbury Park, CA: Sage.

Schroeder, L. D., Sjoquist, D. L. & Stephan, P. E. (1986). Understanding Regression Analysis: An Introductory Guide. Sage University Papers 57: Quantitative Applications in the Social Sciences. Beverly Hills, CA: Sage.

Scriven, M. (1991). Evaluation Thesaurus, 4th Ed. Newbury Park, CA: Sage.

Stecher, B. M. & Davis, W. A. (1987). How to Focus an Evaluation. Newbury Park, CA: Sage.

Torres, R. T., Preskill, H. S. & Piontek, M. E. (1996). Evaluation Strategies for Communicating and Reporting: Enhancing Learning in Organizations. Newbury Park, CA: Sage.

Wadsworth, E. M. (Ed.) (1996). Evaluation Resource Book. West Lafayette, IN: Women in Engineering Program Advocates Network.

Weiss, C. H. (1997). Evaluation. Englewood Cliffs, NJ: Prentice Hall.

Wholey, J. S., Newcomer, K. E. & Hatry, H. P. (2007). Handbook of Practical Program Evaluation. San Francisco, CA: Jossey-Bass.

Worthen, B. R. & Sanders, J. R. (1987). Educational Evaluation: Alternative Approaches and Practical Guidelines. White Plains, NY: Longman.
