Guidelines for Designing Evaluation Questionnaires

(Adapted from Understanding Evaluation:  The Way to Better Prevention Programs by Lana D. Muraskin, U.S. Department of Education, 1993, pp. 38-39.)

 

These tips can help you to ensure that your evaluation questionnaire is of high quality.

 

    • The items in the questionnaire reflect the program’s specific aims.  For example, what changes in participant behavior might be expected to occur as a consequence of the program?  What are the best indicators that teachers are implementing a curriculum?  The questions should measure what the program is designed to achieve.

    • The questions, language, and reading level are appropriate to the respondents.  The flow of the questionnaire and the ease of responding should be assessed.  For example, the questions should not ask young children about issues they are unlikely to comprehend or about behavioral expectations well beyond their stage of development.

    • Wording biases have been eliminated.  If the wording of the questions leads respondents to guess the desired answer, nothing will be learned from the questionnaire.

    • Questions are direct and focused, not indirect or open-ended.  Yes/no questions or simple 5-point scales (such as “strongly agree” to “strongly disagree” with a particular statement) should be used when possible.

    • The response format matches the question format.  A question that asks “How many times in the last month …” should have response choices tailored to that question.

    • Coding requirements are incorporated into the instrument.  The person who will be responsible for coding should review the instrument to make sure it will be easy to code and aggregate the information.

    • Items from widely used instruments are not changed unless there is a good reason. In general, use the exact wording of the items on widely used questionnaires.  In addition to issues of accuracy, comparability of findings with other evaluations or national trends could turn out to be of critical importance.  Thus, any item change should be weighed carefully.

    • The instrument is sensitive to “response burden” (the number of minutes it will take a respondent to fill out the questionnaire) and to the burden on evaluator time.  A long, poorly thought-out questionnaire will waste respondent and staff time.
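To make the coding and aggregation tips above concrete, here is a minimal sketch of how 5-point-scale responses might be coded numerically and summarized.  The item names, response labels, and label-to-number mapping are illustrative assumptions, not part of the original guidelines; an actual evaluation would follow the coding scheme agreed on with the person responsible for coding.

```python
# Hypothetical sketch: coding 5-point scale responses so they can be
# aggregated across respondents. Item names and the label-to-number
# mapping are assumptions for illustration only.

SCALE = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

def code_responses(raw_responses):
    """Convert one respondent's label answers into numeric codes."""
    return {item: SCALE[answer.lower()] for item, answer in raw_responses.items()}

def aggregate(coded_responses, item):
    """Average the numeric codes for one item across all respondents."""
    values = [r[item] for r in coded_responses]
    return sum(values) / len(values)

# Two hypothetical respondents answering two hypothetical items.
responses = [
    {"Q1": "agree", "Q2": "strongly agree"},
    {"Q1": "neutral", "Q2": "agree"},
]
coded = [code_responses(r) for r in responses]
print(aggregate(coded, "Q1"))  # 3.5
```

A coder reviewing the instrument in advance, as the tips suggest, would confirm that every printed response choice maps cleanly onto one numeric code like this, so the results are easy to aggregate.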
