Taking the smoke and mirrors out of qualitative analysis
Thursday, 1st March
Most program evaluations I do involve consulting stakeholders and clients. Usually the program is a complex one that does not offer a single product or service. A structured questionnaire will not capture the diversity of outputs and outcomes. If the evaluation goal is to explain how the program works, surveys are almost certainly not an option.
In such circumstances the evaluator has no choice but to conduct semi-structured interviews that go for anywhere between 15 and 90 minutes and are of widely varying value. As the small mounds of notebooks or sound recordings pile up on the desk(top), what is the evaluator to do?
First, does an analysis method really matter? Will it change the conclusions? Apparently many evaluators don’t think that it does.
The lazy approach to incorporating qualitative data into evaluations is to write up the recommendations based on what you remember and cherry-pick the interviews for great quotes. Lots of compelling evaluations do just that. If the client agrees with the conclusion, or the consultant brings enough industry- and environment-specific knowledge to the task to distinguish the novel and significant from the conventional, then such cursory use of qualitative data may be sufficient.
But the people who fund evaluations of social programs should not be satisfied with findings pulled like a rabbit out of a hat. Public monies, whether donations or tax dollars, demand greater accountability, and that means conclusions which can be justified. Having a clear analytical path from data collection to conclusion is part of expected standards.
I have a method for managing data from semi-structured interviews using Excel. It isn’t exactly a shortcut and it certainly isn’t groundbreaking but it works for me every time. It might help you be more systematic and more confident in your conclusions.
First: create a workbook with several worksheets. Rename the first worksheet Summary.
Second: make a different worksheet for each group of informants you want to analyse separately. For example, you might have interviewed managers, staff, funders and community members and you expect that each group has different but important perspectives. So you have four new worksheets in the same workbook named, you guessed it, Managers, Staff, Funders and Community.
Third: each interview will take up a single column.
Fourth: the first row will be the person’s name or some descriptor. The next rows might give basic information like their role, their location and some background. The following rows will relate to each of the main questions for your evaluation. Have a final row for additional issues. Copy the row titles to each worksheet, including the Summary sheet.
Fifth: make your notes from each interview directly on this sheet. This will take some time, but by organising your notes you have already started analysing. Once you have more interviews than can fit on a screen, use View > Split to scroll to new columns while keeping the first column with the row titles fixed.
Sixth: once you are done, read across each row and summarise what the managers said about each question in a new column. Do the same for staff, funders and community members. Make note of the column letter – let’s say it is J.
Seventh: this is the sexy bit. On the Summary page label columns Managers, Staff, Funders and Community. Automatically copy the summaries from each worksheet to their respective cells by typing a formula like ='Managers'!J5. The contents of J5 will appear as if by magic. Do the same for the remaining cells, copying and pasting down the columns. When you go back to the Managers worksheet you can alter the summaries and those changes will flow through to the Summary sheet.
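For readers who prefer scripting, the structure the seven steps build can be sketched in a few lines of Python. This is only an illustration of the idea, not part of the method itself: the group names, questions and one-line summaries below are all made up for the example.

```python
# Illustrative sketch of the cross-tabulation the workbook produces.
# All names and summaries here are invented for demonstration.

questions = ["How does the program work?", "What are the main outcomes?"]

# One "worksheet" per informant group: for each question, the
# summary of what that group said (the contents of column J).
worksheets = {
    "Managers":  {questions[0]: "Referrals drive uptake",
                  questions[1]: "Better retention"},
    "Staff":     {questions[0]: "Word of mouth matters",
                  questions[1]: "Higher workload"},
    "Funders":   {questions[0]: "Unclear on mechanism",
                  questions[1]: "Value for money"},
    "Community": {questions[0]: "Trusted local workers",
                  questions[1]: "Easier access"},
}

# The "Summary" sheet: one row per question, one column per group,
# each cell pulled from the group worksheet (like ='Managers'!J5).
summary = {
    q: {group: sheet[q] for group, sheet in worksheets.items()}
    for q in questions
}

for q, row in summary.items():
    print(q)
    for group, cell in row.items():
        print(f"  {group}: {cell}")
```

Reading across a row of `summary` gives you every group’s view on one question at a glance, which is exactly what the linked Summary worksheet does in Excel.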
Now you have all of the information in front of you. You can draw conclusions across your stakeholder groups, seeing the areas of agreement while being sensitive to diversity. Most importantly you can defend those conclusions.