Data analysis can become overly complicated when the purpose of the evaluation is forgotten. In this step, an approach to analyzing data is described that can be accomplished efficiently and can be accessible and useful for all stakeholders. There are two types of data generated when designing evaluation tools:
- Quantitative data (numbers, scores)
- Qualitative data (text that reflects people's attitudes, feelings and behaviours in greater depth).
Note that qualitative questions can piggyback on quantitative ones to explain why a particular numerical response was provided, for example: "please explain your score in the previous question". The next two sections describe how to analyze each type of data. Note that when evaluations are completed online, you may be able to access tools that do a basic analysis and can display the results in different types of graphs.
Enter the data from your information gathering into a computer, whether numbers from a questionnaire, a scored demonstration or highlights from a structured recorded activity. Check the data for errors – a procedure called data cleaning. Once you have cleaned the data, consider the responses to each question.
- When responses are scaled (recorded on a scale such as 1 to 5), calculate the average response by adding up the value of all the responses and dividing by the total number of responses.
- Note the distribution of responses so you can report on the central tendency of the scores as well as the variability across the scores – did participants mostly agree with the average response or did their responses indicate various perspectives (across the scale of 1 to 5)?
- In the following chart, consider the participant responses to the question: How would you rate your organization in addressing “transfer of learning” problems quickly and efficiently, so training implementation could continue?
- The average is 3.1, which looks good; however, the distribution indicates that half the group is pleased with how the organization addresses problems and half is not.
- It is important to reflect both the distribution and the average when reporting quantitative results.
- Frame each question the same way for consistency. On a Likert scale of 1 to 5, the 5 would be the most desirable response for every question and the 1 the least desirable response. This will ensure that you can readily determine which questions had the most “positive” responses – the 4’s and 5’s.
- Capturing all the nuances of data analysis can be taxing at the best of times and omissions can easily occur. Have one or two colleagues review the draft to ensure its accuracy. As the old saying goes, “sometimes the fish is the last to discover water”; sometimes we cannot see what is surrounding us.
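The averaging and distribution steps above can be sketched in a few lines of Python. The response values here are invented to reproduce the 3.1 example, where a middling average hides a split group:

```python
from collections import Counter
from statistics import mean

# Hypothetical responses to the "transfer of learning" question,
# on a 1-to-5 scale (5 = most desirable response).
responses = [1, 2, 2, 1, 2, 5, 4, 5, 4, 5]

average = mean(responses)          # central tendency
distribution = Counter(responses)  # variability across the scale

print(f"Average: {average:.1f}")
for score in range(1, 6):
    print(f"Score {score}: {distribution.get(score, 0)} response(s)")
```

The average of 3.1 alone would suggest a moderately satisfied group; printing the distribution shows five responses at 1 or 2 and five at 4 or 5, which is why both figures belong in the report.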
Enter the data from your information gathering into a computer, whether text from open-ended questions in a survey, an interview or a focus group.
- For each open-ended question, look for similarities and dissimilarities among the responses and cluster these together.
- Then look at the clusters to identify themes or patterns and highlight these manually, using highlighters or Post-it notes.
- If you are analyzing larger amounts of data, you can find a qualitative analysis program online that will serve your requirements.
- Keep in mind that although these programs have a number of helpful features, additional time is required to learn the program and to input and code the data.
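A first pass at the clustering step above can be sketched in Python. The responses, theme names and keywords below are all hypothetical, and keyword matching is only a starting point; the real work of identifying and revising themes remains a manual judgement call:

```python
# Hypothetical open-ended responses from a survey question.
responses = [
    "The in-home respite gave me a real break.",
    "Transportation reimbursement made attending possible.",
    "Respite care was the most valuable service for us.",
    "I'd like more safety training sessions.",
]

# Hypothetical themes, each with keywords spotted on a first reading.
themes = {
    "respite": ["respite"],
    "transportation": ["transportation", "transport"],
    "safety training": ["safety"],
}

clusters = {theme: [] for theme in themes}
unclustered = []  # responses needing manual review

for response in responses:
    text = response.lower()
    matched = [t for t, keywords in themes.items()
               if any(k in text for k in keywords)]
    if matched:
        for t in matched:
            clusters[t].append(response)
    else:
        unclustered.append(response)

for theme, items in clusters.items():
    print(f"{theme}: {len(items)} response(s)")
```

Anything that lands in `unclustered` is exactly the material to read and re-read by hand, since it may point to a theme the keyword list missed.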
Some common ways of looking for themes or patterns within each question, whether manual or electronic, include (6):
- Frequencies: how often the same or similar response occurs; you can rely on frequencies as a way of getting started with your analysis
- Magnitudes: levels or size of a situation or need
- Structures: the different types of needs (e.g., physical, mental, behavioural)
- Processes: the order among the elements of structure (e.g., do the needs begin with physical and then become more behavioural?)
- Causes: What are the causes of service gaps? Are they more common among those with only government funding? Among those clients with English as a second language?
- Consequences: How does the need affect caregivers and clients, in both the short and long term?
When documenting the analysis, provide the theme and a few examples that support that theme, so the reader has a sense of what the theme captures. For example, if the theme is “most important services”, examples might include: in-home respite, safety training and transportation reimbursement. Remain curious during your data analysis. Read and re-read the notes several times to identify and revise themes. Through the technique of “constant comparison”, compare passages you select with all other passages you have previously selected within the same theme. This technique will support consistency within each theme and also help you to further explore any inconsistencies. As an example, you may notice that almost all of the session participants agree on the most important educational services provided, but that they disagree on the learning needs that are being met by those services. You would then look to see if any of the responses might give you a clue as to why this might be the case.
- As in the quantitative analysis, have one or two colleagues review your qualitative data analysis for any clarifications or insights. You may find that, through discussion, a theme can be fine-tuned or jargon clarified so that understanding of your analysis is easier for readers.
- Keep in mind the original purpose for the evaluation when analyzing the data. What is my focus here? What am I trying to find out and how am I going to use this information?
- Be sure to clarify that an evaluation's results can provide some "evidence" but not "proof". Evidence refers to improvements in how participants perform on the job; proof means that an educational session is solely responsible for improvements in practice. Because practice settings are influenced by many variables, a single influence such as an educational session cannot be claimed as solely responsible for changes in job performance or client outcomes.
- Calculate and clearly identify the ratio of respondents to participants. If a small percentage of participants responded to the evaluation, the results cannot reflect the entire group – there is not enough evidence.
- Pay attention to unusual or atypical responses that are not part of a theme or pattern. Don’t disregard them; they may raise a unique insight that your stakeholders would want to know.
- Data analysis can be simple or complex – determine what is most appropriate for your needs and resources. Electronic tools or databases can help you to make sense of large amounts of data, if needed. For example, if there is a mixture of participants, you can explore which sub-groups said what and determine any similarities or differences that were evident.
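The respondent-to-participant ratio mentioned above is a simple calculation worth stating explicitly in any report. The numbers here are hypothetical:

```python
# Hypothetical figures: 42 people attended the session and 11
# returned the evaluation form.
participants = 42
respondents = 11

# Response rate as a fraction of all participants.
response_rate = respondents / participants

print(f"Response rate: {respondents}/{participants} "
      f"({response_rate:.0%})")
```

A rate this low (roughly a quarter of the group) would signal that the results cannot be read as reflecting the entire group, as the section cautions.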