To evaluate means to make a judgment about the amount, number or value of something. Who makes or influences that judgment about your educational event is critically important. People approach evaluation from different perspectives. For example, some people want to know:
- If participants like the training
- What knowledge or skills learners acquire
- The extent to which the training influences job performance
- How the training affects the organization’s productivity or client outcomes.
A Multi-stakeholder Approach
Stakeholders are individuals or groups who might influence or be affected by the session’s outcomes. A multi-stakeholder approach to evaluation provides useful information that various groups—including learners/participants, organizers, administrators, funders and supervisors—can use to support learning. The first step is to explore what the evaluation must accomplish, or its purpose. Ask two questions:
- What is it that you and/or others want to know?
- What are you going to do with this information – what difference is it going to make?
Determining what you and others want to know
Each evaluation will be shaped by the purpose of the educational event and by what different people will want to know about the outcomes of the event. The table “Designing the Evaluation: What You and Others Want to Know” (see Template: Designing the Evaluation: What You and Others Want to Know) includes a list of individuals or groups who might influence or be affected by the session’s outcomes (i.e. the stakeholders). The table provides a quick way to identify which stakeholders’ perspectives are important and what those stakeholders might want to know from the evaluation results. For example, stakeholders may want to know what happened during the educational event (in-session) or what happened when the participants tried to apply their learning back on-the-job (post-session). You can use the table to:
- Identify which parts of the session evaluation each stakeholder might be interested in (indicate with a ✓), whether or not they are attending the session. For example, your CEO may want the evaluation results of a Board-funded session so that members can discuss other funding opportunities at their next Board meeting.
- Remember that not all stakeholders have an interest in all aspects of your educational events. For example:
- The purpose of your educational event may focus on only one of the categories related to learning (e.g., to inform staff of new policies). Stakeholders may only be interested in evaluating participants’ understanding of those concepts and issues (the third column) and/or how participants applied them back on the job (the two columns to the right).
- You may be the only person interested in the results, using them to plan your next educational event; for other sessions, others may have a stake in different aspects of the evaluation’s results.
- Find out who wants to know what, and then incorporate those interests into the tool design. Use your discretion when anticipating which session evaluations (or a collation of session evaluations) will be of most interest to which stakeholders.
You can use this table to launch your thinking about stakeholders and what they may want to learn from the evaluation. Some components will be more important than others, so you won’t have entries in every box – there is no need to complete the entire chart. Once you are clear on the evaluation’s purpose, you can begin to craft the questions.
Identifying what you are going to do with this information – what difference it will make
For evaluation results to be meaningful, stakeholders have to identify what they plan to do with the information. This will also shape how the evaluation is designed. There are a variety of ways the evaluation results can be used. Ask yourself if you want to use your evaluation results to do any or all of the following:
- Increase participant satisfaction with the educational event experience?
- Improve participant attitudes toward educational events?
- Enhance how the educational events are conducted, how the activities are designed and how the educator leads the sessions?
- Increase the extent to which participants understand new concepts and issues, acquire new skills and translate their learning to on-the-job performance?
- Develop the organizational supports for professional development and/or continuing education?
- Track the number and type of educational events offered by a funder?
- Measure the impact of staff educational events on client outcomes?
- Address emerging issues in the organization?
Be sure to communicate with the evaluation participants about what you will do with the results. For example, “We will collate your responses and analyze those results for themes. A copy of this analysis will be included in a report to the Director of Client Services and to all participants”.
- Don’t ask questions if you know there is no opportunity or desire in your organization to act on the results. For example, some evaluators ask about the age and gender of respondents but have no plan to use that data.
- If there is evidence your organization repeatedly asks for, be sure to include a question that gathers that data – for example, data related to a specific strategic priority such as “reaching beyond our sector”.
- Identify evaluation criteria early when designing instruction/training. The learning outcomes you want inform both what you design and how you deliver the training.