
1999-2003 EVALUATION HANDBOOK Appendices

EVALUATION HISTORY

The federal government has issued four evaluation mandates for Extension.

  • Extension's first evaluation mandate was part of the 1914 Smith-Lever Act, which called for a "full and detailed report of its operations".
  • In 1977, Section 1459 of the Food and Agriculture Act ushered in a new era of evaluation activity when it directed the Secretary of Agriculture to transmit to Congress "an evaluation of the economic and social consequences of programs".
  • In 1993, the Government Performance and Results Act (GPRA) required strategic plans and numerical assessment of outcomes as part of its measurement of the performance of government organizations.
  • In 1998, the Agricultural Research, Extension and Education Reform Act required all institutions eligible for federal research and extension formula funds to submit a Plan of Work that includes an evaluation component.

Managing an evaluation is similar to managing a program: it requires planning the use of time, people, and resources.

PILOT TESTING YOUR EVALUATION TOOL

Pilot testing a questionnaire or other method of gathering data can save many headaches later. Unfortunately, many people consider pilot testing nothing more than a ritual, if they consider it at all. The time saved by skipping the pilot test, however, is usually lost later in trying to salvage data from instruments that were never pilot tested.

The pilot test helps you find out if the content and form of the questions are satisfactory. You can also get information about:

  • How long it takes to complete the tool.
  • Whether the order of the questions flows well.
  • Whether ample space is provided for responses.
  • Whether the directions, as well as the questions, are understood.
  • Whether your tool is both reliable and valid.

Another advantage to pilot testing is the opportunity to check the responses you get against the major evaluation question or issue you're exploring. Are patterns beginning to emerge? Are there identical responses showing up in the "Other (Please Specify)" blanks that you might want to include as a specific category?

When you pilot test, have your data analysis plans already in mind. By playing around with the data you get from the pilot test, you can begin to see whether those plans are on target.
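For instance, if your analysis plan calls for simple frequency counts by response category, you can run that same tally on the handful of pilot responses. The short sketch below (written in Python purely for illustration; the question and the ratings are made up) shows what such a trial run might look like:

    # A minimal sketch of trying an analysis plan on pilot-test data.
    # The question and the ratings below are hypothetical examples.
    from collections import Counter

    # Eight pilot participants rate "How useful was the workshop?"
    # on a 1 (not useful) to 4 (very useful) scale.
    pilot_responses = [4, 3, 4, 2, 4, 3, 4, 1]

    # The same frequency summary planned for the full survey.
    counts = Counter(pilot_responses)
    for rating in sorted(counts):
        share = counts[rating] / len(pilot_responses)
        print(f"Rating {rating}: {counts[rating]} responses ({share:.0%})")

Even a toy run like this can show whether your categories and summaries will actually answer the evaluation question you started with.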

Try your tool on people who resemble your program participants. Test under conditions similar to those you'll be using in gathering your data. For example, when pilot testing a mail questionnaire, have people answer it without help from you. When they are finished, ask for their opinions and suggestions. Check out an interview questionnaire by conducting an actual interview, either by phone or face-to-face, depending on how it will be used.

Before pilot testing, ask some of your colleagues to review the questionnaire, particularly those who have interest or experience in evaluation or who may be familiar with the program or audience you're examining.

Sometimes evaluators get so close to their work they can't see the forest for the trees. A reviewer can help out by taking a fresh look from a more distant perspective.

Are the questions you're proposing reliable? For example, if you issued a questionnaire and reissued it four weeks later, would you get the same responses?
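One simple way to look at this kind of test-retest reliability is to correlate the two sets of responses from the same people: the closer the correlation is to 1.0, the more stable the question appears to be. The sketch below (in Python, with made-up ratings from five pilot respondents) is an illustration only, not a prescribed procedure:

    # A minimal sketch of a test-retest reliability check.
    # The ratings are hypothetical; in practice, pair each person's
    # responses from the first and second administrations.

    first_round  = [4, 3, 5, 2, 4]   # responses at the first administration
    second_round = [4, 3, 4, 2, 5]   # responses four weeks later

    def pearson(x, y):
        """Pearson correlation between two equal-length lists of numbers."""
        n = len(x)
        mean_x, mean_y = sum(x) / n, sum(y) / n
        cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
        var_x = sum((a - mean_x) ** 2 for a in x)
        var_y = sum((b - mean_y) ** 2 for b in y)
        return cov / (var_x * var_y) ** 0.5

    print(f"Test-retest correlation: {pearson(first_round, second_round):.2f}")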

SUMMARIZING YOUR DATA

  1. Use qualitative comments, but judiciously. Sometimes people have been so conditioned to think of evaluation as "objective" that they're reluctant to venture even the most timid observations. It's perfectly permissible (in fact, even encouraged) for the evaluator to contribute his or her own interpretation. Subtle descriptive comments along the way can help your audience work its way through your analysis. Just be sure your claims can be supported by the data. Some examples of such phrasing are:
  • A substantial majority
  • Fairly substantial
  • A major influence
  • Considerable differences
  • Perhaps more importantly
  • Strong support for the idea that
  • Very strong association with
  • Related strongly to

  2. Identify supposition and speculation. Remember that interpretation involves explaining as well as describing. You might note something interesting or unexpected in your data that you cannot fully explain. Offer your explanation, but in terms that let your audience know you are speculating or supposing. For example:

More experienced leaders spent more time on leadership roles than less experienced leaders, possibly because they had taken on additional responsibilities.