Step 3: Design the Evaluation Tool(s)

Whether you are using the outcomes-based approach to evaluation or another approach, you can use the following general outline to set up the evaluation format (adapted from 1, p. 154).

Draft the Evaluation Format

Introduction

  • Purpose: in the first sentence of the introduction, state why you are conducting the evaluation. (Be sure to tell learners at the beginning of the educational event whether there will be an evaluation; they will sometimes pay closer attention.)
  • Disclosure: indicate who will have access to the results. For example, in an “anonymous” evaluation, no one other than the participant has access to their results, as names are not recorded; in a “confidential” evaluation, both the participant and the evaluator can access the participant’s results, as names are recorded. Indicate if names may be recorded in the evaluator’s files over time in order to track changes in individual or group scores during future evaluations.
  • Analysis: describe briefly how the results will be analyzed (see Step 5). For example, only the average of scores will be reported, or only themes or clusters of ideas will be identified. (A minimal scoring sketch follows this list.)
  • Distribution of results and follow-up: be clear about with whom the feedback will be shared and how it will be shared. Identify whether the group and/or individual results of the evaluation will be shared with the agency lead, with supervisors and/or with other participants. This could influence how participants respond.
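
To make the “Analysis” bullet above concrete, here is a minimal sketch of reporting only the average score per question on a 1–5 scale. The question wording and responses are hypothetical, and any real analysis should match whatever you promise participants in the introduction.

```python
# Minimal sketch: report only the average Likert-scale score per question.
# Question wording and responses below are hypothetical.
from statistics import mean

responses = {
    "The session objectives were clear": [4, 5, 3, 4, 5],
    "I can apply what I learned on the job": [3, 4, 4, 2, 5],
}

for question, scores in responses.items():
    print(f"{question}: average {mean(scores):.1f} of 5 (n={len(scores)})")
```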

Focus of Questions

  • The questions focus on two areas, each with a number of components:
    • The experience of in-session learning during the educational event (concepts, issues and skills on which learners will be assessed; effectiveness of event participation, productivity, logistics, organization)
    • The experience of post-session performance back in the workplace (application of learning in on-the-job performance, support for on-the-job application of learning)
  • The focus of questions is tailored according to the goals of each educational event.

Closing

  • Thank you
  • Directions for returning the completed form, if necessary.

When drafting the format of the evaluation, use the following tips to ensure the tool will provide you with the information you require to make decisions about future educational sessions in your organization.

Tips

  • Think about what methods have been used previously to gather evaluation information about your organization’s educational events and with what success. If your organization has used any standard templates, you may be able to adapt one to your needs.
  • Identify the pros and cons of each evaluation method in your organization. For example, surveys may be the most efficient but may produce a low response rate if people are suffering from survey fatigue; some approaches may be less affordable in terms of time and funding; and you may capture more insight from a brief focus group than from a long survey. Some educators regard the pre/post-test as the gold standard of evaluation for learning: it can be intensive and time-consuming, but it can also identify what is really important for learners to achieve. For example, one group embarking on an “Ethics of Touch” training was asked the same 8 questions at the beginning of the program and again at the end. The overall results were compared to determine whether an increase in knowledge would change service delivery (see the pre/post comparison sketch after this list). Instructors were also able to reflect on where participants missed certain points and think about a better way to present those ethical concepts next time.
  • If your organization requires a standard evaluation form, staff may feel compelled to have participants complete it (even though it does not give the event organizers the information they need about the intended outcomes back on the job). To compensate for a deficient evaluation template:
    • Add a short supplementary question or two tailored to the specific information required. For example, ask open-ended questions if the mandated form is multiple choice or a Likert scale.
    • Include more specific questions as part of the Q&A at the end of the educational event while people are still in the room. You may not get answers from everyone, but some answers may be better than none.
  • Set aside 10 minutes at the end for small-group discussion that focuses on specific questions. Working in teams may reduce the pressure on any one person to contribute and will also allow participants to learn from each other about how to inform their practice.
  • Give advance warning that a follow-up email will be sent asking for more specific information.
  • When reviewing the overall draft design, whether paper or electronic, ensure it looks inviting so as to attract quality responses.
  • Consider when the evaluation will happen in relation to the event agenda.
    • Formative evaluation contributes to a process while it is happening. If you have decided to ask some formative questions that will help you to shape the rest of the session’s agenda, when is it best to do this mid-session?
    • Summative evaluation happens at the end of an event or after the event. At what point do you want to ask participants about what went well, what needs to be improved and what happens next: At the end of the session? One week later back on-the-job? One month later after new skills have been practised?
  • Be realistic with your own knowledge about evaluation. If you require a more comprehensive approach, find someone who has more expertise.
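
The pre/post-test comparison mentioned in the tips above can be summarized with very simple arithmetic. Here is a minimal sketch, assuming each participant answers the same 8 questions before and after the program; the participant labels and scores are hypothetical.

```python
# Minimal sketch: compare pre- and post-session test scores to gauge knowledge gain.
# Scores are numbers of questions answered correctly out of 8; all data are hypothetical.
from statistics import mean

pre_scores = {"Participant A": 4, "Participant B": 5, "Participant C": 3}
post_scores = {"Participant A": 7, "Participant B": 6, "Participant C": 6}

gains = {name: post_scores[name] - pre_scores[name] for name in pre_scores}

print(f"Average pre-test score:  {mean(pre_scores.values()):.1f} / 8")
print(f"Average post-test score: {mean(post_scores.values()):.1f} / 8")
print(f"Average gain:            {mean(gains.values()):.1f} questions")
```

Reviewing the results question by question in the same way can also show instructors where participants missed certain points.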

Construct the Questions

Once you have drafted the evaluation format, you are ready to construct the questions. An outcomes-based approach to planning the educational event can be helpful here. This approach involves identifying the key components that learners must experience in order to make a change in their work performance: understanding certain concepts and issues, mastering certain skills, demonstrating what they can do in-session and performing post-session on the job (see Template: Designing the Session & PWS Sample). The “outcome” is what learners change in their work; it connects what they “know” with the “know-how” applied back on the job (2). Use the table provided below to identify evaluation questions for each of the outcomes-based approach components and the tool(s) you may be able to use to ask those questions. The following tips will help you construct clear and easy-to-understand questions (adapted from 3, p. 157):

Tips

  • Limit the total number of questions – short is better than long
  • Incorporate “plain language” so questions are easy for everyone to understand(4)
  • Be sure each question asks about only one point. For example, if you ask about both the length AND difficulty of the session in one question, an individual would not be able to differentiate within one response that it was too long but not difficult enough.
  • Check whether the space for a response is appropriate to the type of question. Don’t provide three lines when you want two words.
  • Only ask questions whose responses you intend to use in your analysis. Requesting demographic data, for example, and then never using it wastes everyone’s time.
  • When using a scale with a question, offer a choice of at least five options so respondents have a real choice, not a forced choice within a few options.
  • Look at how much variety there is among the types of questions, such as closed, open, multiple-choice and scaled. Different types of questions solicit different responses and appeal to different perspectives. For example, a question asking people to rate items on a list shapes a response that conforms to the list.  An open-ended question invites participants to respond based on their own “internal” list.  Forced-choice questions (respondent must choose yes or no, agree or disagree) can constrain or polarize thinking.
  • Anticipate your time: some find it more time-consuming to analyze the text responses to open questions than the numerical responses to multiple choice or scaled questions.
  • Avoid asking leading questions – questions that steer people in the direction of a particular answer. You can inadvertently skew the answers by the way you ask a question. As an example, if you ask people to identify what sector they are from and only give them one option for a response, those who work in two sectors will be forced to indicate only one of the sectors in which they work. Therefore, your results will not accurately reflect all of the sectors in which people work.
  • For the final question, encourage individuals to say whatever is important to them.
  • Test out your draft questions with someone who knows nothing about the topic; they will be able to identify wording that is confusing or difficult to understand.
  • Walk through how you will analyze the data once you have it. Ask yourself: what am I going to do with the results? (A brief walkthrough sketch follows this list.)
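
As one way of walking through the analysis in advance, here is a minimal sketch that mixes the question types mentioned above (scaled, multiple-choice and open-ended); the question wording and sample responses are hypothetical.

```python
# Minimal sketch: rehearse the analysis before running the evaluation.
# Question wording and sample responses are hypothetical.
from collections import Counter
from statistics import mean

scaled = [4, 5, 3, 4, 2]                       # 1-5 scale: "The pace of the session was right for me"
multiple_choice = ["Yes", "Yes", "No", "Yes"]  # "Will you use this skill in the next month?"
open_ended = [
    "More time for case examples, please.",
    "The role-play helped me see how to raise the issue with a client.",
]

print(f"Scaled question: average {mean(scaled):.1f} of 5 (n={len(scaled)})")
print(f"Multiple choice: {dict(Counter(multiple_choice))}")
print(f"Open-ended: {len(open_ended)} comments to read and group into themes by hand")
```

If a step like the last one looks too time-consuming for the number of responses you expect, that is a signal to adjust the mix of question types now rather than after the event.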

Designing the Evaluation: What Evaluation Questions Are Addressed in an Outcomes-based Approach (5)

In-session evidence of:

Assessment Tasks

What will learners do in the training to demonstrate evidence of what they can do? For example, in-session evidence of performance – assessment of tasks.

Did participants effectively demonstrate how they applied their new skills?

Tools:

  • Questionnaires or structured interviews with participants
  • Participant reflections
  • Simulations
  • Direct observations
  • Video or audio tapes

Process Skills

What skills must the learner master to demonstrate what they can do? For example, specific acts, ability to do tasks – process skills.

Can participants demonstrate their mastery of the intended skills?

Can they list the skills?

Can they communicate those skills to others?

Tools:

  • Paper or electronic questionnaires, including self-assessments and tests
  • Focus groups, interviews
  • Participant reflections
  • Personal learning logs
  • Case study analysis

Concepts and Issues

What must the learner understand to demonstrate what they can do? For example, specific facts, patterns, values, motivations – concepts and issues.

Can participants demonstrate their understanding of the concepts and issues?

Can they list them?

Can they communicate those concepts and issues to others?

Tools:

  • Paper or electronic questionnaires, self-assessments and tests
  • Focus groups, interviews
  • Participant reflections
  • Personal learning logs
  • Case study analysis

Event Agenda (productivity, participation)

How did the activities and facilitator contribute to participants’ learning? For example, provided activities that appealed to diverse ways of learning.

At our next educational event, how could participants support each other’s learning more effectively?

What did you do to contribute to the success of the educational event, other than participate?

What are you learning from other participants?

Tools:

  • Questionnaires – paper or electronic
  • Participant reflections
  • Structured interviews
  • Focus group

Event Environment (logistics, organization)

How did the facilities, technology and equipment contribute to participants’ learning in the session? For example, offered web-based alternatives to paper-based exercises.

What aspects of the facility supported (did not support) your participation?

What two words would you use to describe the facilities?

If we conduct the next education event at this facility, what could be improved?

Tools:

  • Questionnaires – paper or electronic
  • Participant reflections
  • Structured interviews
  • Focus group

Post-session evidence of:

Performance of learning at work

What do learners need to be able to DO “in their work” that we’re responsible for addressing “in the training session”? For example, post-session performance.

Are participants consistently changing their practice on-the-job? If so, what evidence supports that conclusion?

What was the impact of this change in practice on service recipients? (For example, applying AODA accessibility standards):

  • Did it affect individual well-being?
  • Did it influence individual cognitive, physical or emotional well-being?
  • Was implementation supported, advocated and facilitated by the organization?

Tools:

  • Review individuals’ records, original reports, etc.
  • Questionnaires or structured interviews with individuals, families, supervisors

Employment matters that affect at-work performance

What does the organization need to be able to do to advocate, facilitate and support application of learning in the workplace? For example, build on improvements implemented after previous educational events.

To what extent was implementation of learning supported, advocated for and/or facilitated by your organization?

Were any problems addressed right away so implementation continued?

Did the participants’ application of learning back at work affect the organization’s policies and procedures?  If so, how? (For example, applying new safety policies and procedures)

Tools:

  • Minutes from follow-up meetings
  • Questionnaires – paper or electronic
  • Structured interviews with participants and supervisors