
Friday 17 July 2020

Data wholes and holes

Image: A kaleidoscope metaphor describing one approach to analysing qualitative data (Suter, 2011, p. 349)
I have recently had an undergraduate student who had absolutely no idea where to start with qualitative data analysis. The student had planned the entire project, had successfully collected the primary data, but then was completely, utterly lost with what to do next with the recordings. They had no cultural context or feel for the data that had been collected.

While I was able to provide more chapters on analysing findings for the student to read, I quickly realised that what I was providing still did not give them a framework to work within. It was a really interesting problem: like the old adage, I was attempting to explain colour to the blind. I struggled to find a structure which would show the student some steps and help them gain the confidence to begin the process of analysis.

The student had no access to any specialist software, but did have access to Office 365. They had adequate Word skills and some Excel skills. They had recorded their interviews on their phone. My solution had to be relatively low-tech.

I did some thinking. I did not want to be overly prescriptive, but when you have a student who doesn't even know where to start, the desire not to be prescriptive becomes a problem in itself. What follows is the first draft of what I came up with for this particular student.

1. Preparation:
  • We gather our data (interviews, or focus groups) by recording it as video or sound files. We listen to it a number of times so we get familiar with the data (making a playlist and listening to it on an iPod or phone helps).
  • Transcribe the recordings into Word, with each timestamp separated from the script by a tab (transcribe the script as one sentence or paragraph until there is a pause or change of speaker, then start a new timestamp and paragraph).
  • Once the transcript is complete, copy the tabbed transcript into Excel. Excel will import the first column as timestamps, and the second column will be the script for that sentence, or set of sentences until there is a pause or change of speaker. We insert a new second column and add the speaker's name against each timestamp.
  • We add our field notes into the fourth column against the appropriate timestamp.
  • If we have any rough ideas of themes, we can note those in a fifth column against the appropriate timestamp.
Now we can start analysing what data we have.
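
As an aside, if you (or a supervisor) later want to automate or sanity-check this layout outside Excel, here is a minimal sketch of the same five-column structure in Python with pandas. It assumes the transcript has been saved as a tab-separated text file; the file name and column names are placeholders, not part of the method above.

import pandas as pd

# Assumed layout: the transcript saved as a tab-separated text file,
# one row per timestamped chunk of speech. "interview1.txt" is a placeholder.
transcript = pd.read_csv("interview1.txt", sep="\t", header=None,
                         names=["timestamp", "script"])

# The remaining columns of the Excel layout: speaker (new second column),
# field notes (fourth) and rough themes (fifth), to be filled in by hand.
transcript.insert(1, "speaker", "")
transcript["field_notes"] = ""
transcript["rough_themes"] = ""

print(transcript.head())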


2. Analysis (first run):
We need to start looking for "wholes and holes", to quote Suter (2011, pp. 348-349); see the image accompanying this post (this kaleidoscope metaphor comes from the 2000 work of Dye, Schatz, Rosenberg and Coleman, and is very helpful in allowing us to shift the data a little and see what the new 'frame' looks like). We could use a similar metaphor, such as working with Lego. First we sort through all our pieces or partial constructions and start to group all the similar pieces or elements together. We look for patterns. We deconstruct some pieces and reconfigure them, trying to understand how and why they were built in that particular way. We put them back together. We reorganise them.
Suter also suggests that we consider "a jigsaw puzzle (LeCompte, 2000). Assembling data into an explanation is akin to reassembling puzzle pieces. One strategy is grouping all pieces that look alike, sky for example, and placing these pieces near the top. Other sketchy-looking objects may be grouped together using any dimension (e.g., color) whose properties make conceptual sense. Puzzle pieces will have to be rearranged many times before the reassembled pieces emerge into a coherent pattern" (2011, p. 348).
  • We have our data in Excel. We now think about what impressions we have of our data. We can re-read our transcript and think that "communication" seems to come up a lot, so we could head up column 6 with "Communication" and add a '1' in this column whenever the issue of communication appears. We now have an initial code for communication. Add a new row at the top of the sheet, and sum the column to see how many times communication appears.
  • Look for more patterns and repeating ideas. Add new columns. Keep adding a '1' whenever the code appears in the data.
  • Rinse repeat.
  • Complete coding our first interview/focus group.
  • Add a workbook tab and, using the same Excel framework as for interviewee/focus group one, do the same for interviewee two's transcription. Add new codes where they appear, then go back and see whether those codes also appear in interviewee/focus group one's data.
  • Rinse and repeat until all interviewees/focus groups are complete.
Our first cut of the data and our initial coding are complete.
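
If it helps to see the indicator-column idea outside the spreadsheet, below is a minimal pandas sketch of the same counting step. The codes, values and interviewee labels are invented for illustration; in Excel this is simply a column of 1s with a SUM in the new top row, repeated on one worksheet tab per interviewee.

import pandas as pd

# Toy coded transcript: one row per timestamped chunk, one indicator column
# per code (1 = the code appears in that chunk). All values are invented.
coded = pd.DataFrame({
    "timestamp":     ["00:01:10", "00:02:35", "00:04:02", "00:05:48"],
    "speaker":       ["A", "B", "A", "B"],
    "communication": [1, 0, 1, 1],
    "trust":         [0, 1, 0, 1],
})

# The Excel SUM row at the top of each code column, in pandas form.
code_columns = ["communication", "trust"]
print(coded[code_columns].sum())

# One DataFrame per interviewee (one worksheet tab each in Excel),
# combined so the counts can be compared across interviews.
all_interviews = {"interviewee_1": coded, "interviewee_2": coded.copy()}
combined = pd.concat(all_interviews, names=["interviewee", "row"])
print(combined.groupby(level="interviewee")[code_columns].sum())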

3. Analysis (second run +):
We can consider Suter's review of Seidel (1998): "Qualitative data analysis is best understood as a symphony based on three elegant but simple notes—noticing, collecting, and thinking. Clearly not linear, the process is described as iterative (a repeating cycle), recursive (returning to a previous point), and 'holographic' (each 'note' contains a whole) with 'swirls and eddies.' When one notices, one records information and codes it using an organizing framework. When one collects, one shifts and sorts information" (2011, p. 348), and "When one thinks, one finds patterns, makes sense of them, and makes discoveries" (p. 349).
  • We now need to review our first run. We will need to re-read and review our data several times to see whether we understand what is really happening. We need to consider whether our initial codes are the 'real' codes, or whether there is something else going on. Are there ways we can group elements together: where we saw communication as being a code, is that actually a symptom of feeling connected? Or is it part of the culture? Or should it be broken up into body language, tone, inflection and silence? Or is it about power and lack of power?
  • As we start to get more familiar with what we are looking at, we will see more layers. Where these additional layers turn up, we can do a 'Save As' on our 'raw' Excel workbook and start playing with new codes and counts.
  • We need to make a lot of notes, and dig back into our literature to codify our thoughts, ideas, and questions. Add more columns. Do more counts.
  • We are looking for themes within the data. These might be language, opinions, beliefs, ideas, motivations, clusters, or codes which turn up in a particular order (CampusLabs, n.d.).
  • We cross-tab our data. We compare between questions, between interviewees, and between field notes and interview data. Because we are in Excel we can pull out graphs and percentages, cluster, tabulate and project.
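
For readers who want to see the cross-tab step as code rather than as an Excel pivot table, here is a small pandas sketch. The interviewees, questions and codes are made-up examples; the point is only the shape of the comparison.

import pandas as pd

# Toy long-format extract of the coded data: one row per coded moment.
# Interviewee names, questions and codes are invented examples.
moments = pd.DataFrame({
    "interviewee": ["Ana", "Ana", "Ben", "Ben", "Ben", "Cara"],
    "question":    ["Q1", "Q2", "Q1", "Q1", "Q3", "Q2"],
    "code":        ["communication", "trust", "communication",
                    "power", "trust", "communication"],
})

# Cross-tab of code frequency by interviewee (the Excel pivot-table step).
print(pd.crosstab(moments["code"], moments["interviewee"]))

# The same comparison between questions.
print(pd.crosstab(moments["code"], moments["question"]))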

4. Write up the patterns:
Now we can start writing our findings chapter. We need to think about who our reader will be, what we set out to discover, and how we can best tell our story, but we now have somewhere to start our process (CampusLabs, n.d.).

I hope this helps. It is one way, anyway :-)


Sam

References:
