1999-2003 Evaluation Handbook - Section 1


You are not a program evaluation specialist. You are an Extension subject-matter specialist or agent seeking to evaluate a program. Hence this handbook, which guides you through the process.

Why Evaluate?

Evaluation is a way to demonstrate that we are carrying out Extension's mission.

Data Collection:

  • What do you want to find out?
  • Who has the information you need?
  • How and when will you collect the information?
  • Who is responsible for seeing that the evaluation is carried out?
  • Can an existing evaluation tool be used?
  • What resources are needed (budget, equipment, training, etc.)?

Use of Evaluation Results:

  • Who will use the results?
  • What are the users' needs, interests, expectations, values?
  • How and when will the findings be presented?


Four Steps to Evaluation

The development of an evaluation plan requires four steps. These steps follow the same reporting format suggested in your annual promotion and tenure document (statement of the problem, objectives, methods, and results).

Decide what you want to study, learn, or solve.

Ex: Extension wants to study the impact of a concentrated series of public speaking lessons and activities on a group of Native American youth.

State your program's measurable objectives to be assessed.

Ex: To help youth improve their public speaking skills (measurable). Ask: who, what, what level of change, how many people, how will I know, and when are these changes expected?

Ex: To help youth develop their full potential (not measurable).
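To see what "measurable" buys you, here is a minimal sketch (in Python, with entirely hypothetical pre-test/post-test rubric scores) of how a measurable objective like the public speaking example might be assessed:

```python
# Hypothetical pre-test / post-test rubric scores (0-100) for each youth.
# Real data would come from rating sheets, checklists, or pre/post tests.
pre_scores = {"Ana": 55, "Ben": 60, "Cara": 48, "Dee": 70}
post_scores = {"Ana": 72, "Ben": 66, "Cara": 63, "Dee": 75}

# An objective is measurable when it names a level of change and a target,
# e.g. "at least 75% of participants improve their rubric score by 10 points."
THRESHOLD_POINTS = 10
TARGET_SHARE = 0.75

improved = [name for name in pre_scores
            if post_scores[name] - pre_scores[name] >= THRESHOLD_POINTS]
share = len(improved) / len(pre_scores)

print(f"{len(improved)} of {len(pre_scores)} participants "
      f"({share:.0%}) improved by {THRESHOLD_POINTS}+ points")
print("Objective met" if share >= TARGET_SHARE else "Objective not met")
```

The point is not the code but the structure: a vague objective ("develop their full potential") gives you nothing to put in place of the threshold and target values.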

Decide what method(s) will be used to collect the data.

Report current or project-end results.

Evaluation is an important component of the program development process. It is not a separate activity, not an add-on function after a program has been completed.

A Hierarchy for Obtaining Evidence for Program Evaluation Outcomes

  1. End results:
    • What is the long-term impact of the program?
    • How have participants, their families, and communities been helped, hindered, or harmed by the results of changes in practices, knowledge, attitudes, skills, and aspirations? To what degree?
  2. Practice change:
    • Have participants applied knowledge and learned skills?
    • Have participants acted upon attitudes, and have aspirations changed?
  3. KASA changes (knowledge, attitudes, skills, and aspirations):
    • Knowledge: Have participants changed their awareness, understanding, and/or problem-solving ability? In what specific areas?
    • Attitudes: Have participants changed their interest in ideas or practices that were part of the program content? Which ideas? Which practices?
    • Skills: Have participants changed their verbal or physical abilities? Learned new skills? Improved performance? What skills? What abilities?
    • Aspirations: Have participants selected future courses of action or made decisions based on program content? In what areas?
  4. Reactions:
    • How did participants react to the program? Were they satisfied?
    • Were their expectations met? Was the program appealing? Do they perceive any immediate benefits?
  5. People involvement:
    • How many participated? Who participated (descriptive characteristics)?
  6. Activities:
    • What resources were involved (content or subject matter; methods and techniques)?
  7. Inputs:
    • What resources were expended on the program (time, money)?

Tips for Using Evaluation Hierarchy

  • Evidence of a program's impact becomes stronger as the hierarchy is ascended.
  • The difficulty and cost of obtaining evidence also tend to increase as the hierarchy is ascended.
  • Evaluations are strengthened by assessing at several levels of the hierarchy.
  • Evaluation is more successful when you identify the levels of the hierarchy, and the specific criteria you will use, before conducting the evaluation.
  • The lower levels of the hierarchy may be more important if the people to whom you report are unfamiliar with the program being evaluated.

Summary of Program Outcome Model - State Major Program Plan

  • Target Audience
  • Performance Goals
  • Objectives
  • Key Program Components
  • Internal/External Linkages
  • Evaluation Plan


Outputs - the direct products of program activities:

  • number of classes taught
  • number of consultation sessions conducted
  • number of educational materials distributed
  • number of hours of service delivered
  • number of participants attending


Outcomes - benefits for participants during and after program activities:

  • New knowledge - 30% more consumers in Taos County understand how to read packaged food labels.
  • Increased skills - 50% of participants can now balance their checkbooks with 100% accuracy.
  • Changed attitudes or values - shifts in participants' value systems regarding hunger problems.
  • Modified behavior - county records show a 50% increase in the number of homeowners having water tested before using new wells.
  • Improved condition - farm incomes recorded economic gains.
  • Altered status - ten fisheries showed an average $25,000 increase in income after being trained to use Extension's record-book income tracking system, moving their economic well-being from poverty to middle income.
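Outcome percentages like those above are simple tallies over participant records. A minimal sketch, using entirely hypothetical follow-up data for the water-testing example (the record fields are invented for illustration):

```python
# Hypothetical follow-up records for a well-water-testing program.
records = [
    {"owner": "A", "tested_before": False, "tested_after": True},
    {"owner": "B", "tested_before": True,  "tested_after": True},
    {"owner": "C", "tested_before": False, "tested_after": False},
    {"owner": "D", "tested_before": False, "tested_after": True},
]

# Count owners testing their wells before and after the program.
before = sum(r["tested_before"] for r in records)
after = sum(r["tested_after"] for r in records)

# Percent change in the number of owners testing -
# the kind of figure behind "50% increase" outcome statements.
pct_increase = (after - before) / before * 100
print(f"Testing rose from {before} to {after} owners ({pct_increase:.0f}% increase)")
```

Whatever tool you use, the outcome statement should trace back to a count like this one, not to an impression.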

Evaluation Tool Selection

Which evaluation procedures or methods will you use to collect evidence?

  • Interview questionnaire (filled out by interviewers)
  • Mail questionnaire or survey
  • Questionnaire distributed at meeting(s)
  • Rating sheet, checklist or observation sheet with specific criteria listed
  • Documentation from observers
  • Telephone survey
  • Pre-test / post-test
  • Study of attendance, participation or other available records
  • Case study
  • During the meeting activity
  • Mail questionnaire (after)
  • Focus group interview
  • Direct observation during the meeting
  • Direct observation (follow-up location)

Make your evaluation as brief as possible. Collecting too much data is like planting too many zucchini.