Given the choice, how many people would swap a gloriously sunny Cambridge Saturday for a seven-hour tutorial about, wait for it, qualitative field research methods and analysis? Yet thirty-odd people did just that last Saturday and elected to be closeted in one of Churchill College's nicer rooms to listen to what User Centred Design (UCD) practitioner and researcher David Siegel had to say.
It turned out to be a highly motivating, fast-paced and anecdote-rich journey through the process of designing and analysing qualitative fieldwork in a product design context.
As anybody involved in UCD, User Experience (UX) or related work probably knows, fieldwork – be it usability tests, interviews, or focus groups – is an essential tool of the trade. Yet making sense of the data and field notes collected can often be a non-trivial task. The material can easily build up to a stack of notes, transcripts and visuals that is measured in inches. Even more challenging is the task of communicating the results to the client in a compelling and authoritative way. Drawing on extensive real-world examples, including his work on Microsoft's Tablet PC OS and similarly high-profile clients, David attacked these problems with vigour and enthusiasm. These are my top five takeaway tips from the day-long session:
- Preempt the Quantitative Dissenter
The big divide between qualitative and quantitative research is well known. Quantitative results often seem more appealing because "10%" is easier to understand and take home than a page-long narrative. Deal with this by recognising and showing that a quantitative study, such as a survey or questionnaire, requires a qualitative decision-making process a priori: choosing the kind and scope of the questions. This is an area where qualitative fieldwork can prove its worth by lending confidence and scientific credibility to that process (rather than leaving it up to some graduate intern).
- Quantitative Doesn’t Mean Numeric
It is true that quantitative summaries and comparisons may be easier to retain than long lists or narratives. So find things to count in your qualitative data and describe them qualitatively. What does this mean? Using qualitative quantifiers like 'rare', 'frequent', 'improbable' and similar words to describe your findings makes your results as memorable as the quantitative "10%" but avoids an all too common pitfall that qualitative researchers fall into. The pitfall is the temptation to say something like "80% of the people we researched did not like this feature" and when asked what quantity that represents you are forced to say "8" because you interviewed 10 people! Hardly a statistically significant sample in quantitative terms but very significant if positioned as "Oh yes, we observed that use case taking place frequently. Here is why they do it…"
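To make the idea concrete, here is a toy Python sketch of mapping raw counts onto qualitative quantifiers. The labels and thresholds are entirely my own invention for illustration, not anything David prescribed:

```python
# Sketch: report how often a behaviour occurred using qualitative labels
# rather than percentages a small sample cannot support.
# The thresholds and label names below are hypothetical.

def qualitative_quantifier(observed: int, total: int) -> str:
    """Describe frequency without implying statistical significance."""
    if total <= 0:
        raise ValueError("no participants observed")
    proportion = observed / total
    if proportion == 0:
        return "never observed"
    if proportion < 0.2:
        return "rare"
    if proportion < 0.6:
        return "occasional"
    return "frequent"

# "8 of our 10 participants did X" becomes a defensible qualitative claim:
print(qualitative_quantifier(8, 10))  # frequent
```

The point is that the function's output ("frequent") travels well in a report, while the raw "80%" invites a statistical challenge the sample size cannot answer.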
- Reliability Does Not Imply Validity
It seems like a no-brainer, but the central point is that just because your data sources and collection are reliable and traceable does not mean your conclusions are valid. Reliability (achieved through scientific rigour) is important because it sets the baseline for validity, i.e. reliability is a prerequisite for validity. However, to produce valid conclusions you need to be careful not to base them on personal bias, gut feelings, hunches, false correlations or invented cause-and-effect relationships. How do you do this? Well, experience helps, but so do triangulation (i.e. use several methods to arrive at the same conclusion) and judgmental heuristics (i.e. have different people carry out the same analysis to ensure the interpretation is the same or similar).
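That last check – having different people analyse the same material – can be made measurable. Here is a minimal Python sketch of a simple percent-agreement measure between two analysts' codings; the function name and example labels are mine, assumed for illustration:

```python
# Sketch: quantify how often two analysts, coding the same field notes
# independently, assign the same label. A crude agreement check, not a
# full inter-rater reliability statistic.

def percent_agreement(coder_a: list[str], coder_b: list[str]) -> float:
    """Fraction of items that two analysts labelled identically."""
    if len(coder_a) != len(coder_b) or not coder_a:
        raise ValueError("coders must label the same non-empty set of items")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

alice = ["frustrated", "confused", "satisfied", "confused"]
bob   = ["frustrated", "confused", "satisfied", "frustrated"]
print(percent_agreement(alice, bob))  # 0.75
```

A low score is a warning that the interpretation may rest on one analyst's hunch rather than on the data.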
- Forget the Blank Slates, Fill ’em Up
The way to be objective is not to pretend that you can go into the field with a ‘blank slate’, or a completely open mind. Rather, acknowledge your background, assumptions, expectations, objectives and domain knowledge (as an individual and as an organisation) and map them out to create what David calls a ‘focus structure’. Doing this will achieve the goal of being ‘open minded’ because when you observe something that doesn’t fit into your pre-defined categories – and you will – it will stick out like a sore thumb, forcing you to acknowledge it and give it due attention.
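One way to picture a focus structure is as a set of named categories you commit to before going into the field, so that anything that fits nowhere is flagged loudly. The categories, keywords and observations in this Python sketch are invented by me purely for illustration:

```python
# Sketch: a "focus structure" as explicit, pre-declared categories.
# Observations that match no category stand out instead of being
# silently bent to fit. All names and keywords are hypothetical.

FOCUS_STRUCTURE = {
    "navigation": ["menu", "search", "back button"],
    "data entry": ["form", "autosave", "validation"],
}

def categorise(observation: str) -> str:
    """Return the matching focus category, or flag the misfit."""
    for category, keywords in FOCUS_STRUCTURE.items():
        if any(keyword in observation for keyword in keywords):
            return category
    return "UNEXPECTED - give this due attention"

print(categorise("participant used search to find settings"))   # navigation
print(categorise("participant printed the page to annotate it"))  # flagged
```

The value is in the fallback branch: the misfit is surfaced rather than forced into a pre-defined box.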
- Deliver Compelling Field Findings: Behaviour and Conditions
“Be Compelling!” Thanks, but how? Two invaluable tips follow from the fundamental point that the big advantage of qualitative research is its richness. First, focus your reporting around behavioural characterisation. While quantitative work characterises and segments a market or demographic, qualitative work characterises and (tries to) explain behaviour. If you can show and shed light on why somebody is behaving in a particular way you’re well on your way to being compelling. Secondly, focus on conditions and once again avoid the “8 out of the 12 users we interviewed did such and such”. That is precisely the kind of statement that is itching for a smart-ass quantitative dissenter to pipe up… and she would be right! Showing frequency in this manner is not your job. Discuss instead the conditions under which something happens, how widespread those conditions are likely to be (or not) and what’s at stake when they do occur.
This short review hardly scratches the surface of the wealth of information and examples David brought to the table, and I would strongly recommend keeping an eye out for his (and his colleague Susan Dray's) sessions at any HCI conference coming to a town near you.
Oh, and once again thanks to Red Gate for both sponsoring the 23rd British HCI Conference and for sending me along!
If you liked this post follow me on Twitter, I’m @richardmuscat, or subscribe to this blog’s RSS feed.