Why Clients (and Designers) Don’t Trust Qualitative Research

One of the biggest challenges facing qualitative research is the misconceptions about the value it provides and the work involved in doing it right. This excerpt from Chapter 2 of Sheila Pontis’ new book, Making Sense of Field Research: A Practical Guide for Information Designers examines common objections to qualitative research and how different points of view shape attitudes and approaches to research:

Despite the benefits of research for enhancing the quality of information design work, its use is still often questioned among practitioners, and qualitative, contextual studies such as field research remain particularly rare. Many information design companies, for example, don’t use qualitative research because they don’t believe in it. In other cases, information designers who do use some form of field research as part of their regular practice need to “make the case” for it and convince clients of its value. Other information designers are sceptical of research in general and don’t appreciate its value because they see it only as compensation for a lack of experience. These designers rely heavily on their years of design experience and find research time-consuming; for them, any type of research also restricts their creativity. The following are common criticisms of the use of qualitative research in professional contexts:

  • Qualitative inquiry isn’t useful: “Findings are not generalizable.” Qualitative research is often described as too subjective and lacking analytic depth, its findings dismissed as not credible because there are neither numbers nor an appropriate set of criteria to determine quality. This is the view of someone used to quantitative research, where findings are based on large samples and treated as “measurements” and “data” (numbers) that provide “objective” insights. It is a very different perspective from that of qualitative research.
  • Anyone can do it: “Asking questions is easy.” The belief is that anyone can do research and collect data simply by asking questions, observing people doing something, and taking some notes. This thinking dismisses researchers’ expertise and the deep insight they bring.
  • Qualitative data is difficult to analyse: “Yes, we use qualitative methods, but analysing data is time-consuming and findings are hard to apply.” Even among those who do appreciate the value of qualitative research and believe that it reduces the unknowns involved in redesigning or creating a new design, the tendency to cut corners on analysis, due to either financial or time constraints, persists. The reality is that well-grounded interpretations and thorough analysis take time. When this part of the research process is rushed, the resulting designs are ambiguous, unclear, and hard to understand or use. Even the richest dataset would make no real difference to a design’s quality if it weren’t properly analysed.

These views suggest that a poor understanding of both what qualitative research entails and what it offers lies at the root of the problem. Qualitative research isn’t common knowledge beyond academic contexts; it is mostly foreign to design professionals, corporations, and organizations. These misconceptions may seem inconsequential in the bigger picture, because many information design companies do conduct research, but they have a cumulative effect with the potential to devalue qualitative work. If these misconceptions and tendencies persist, they could lead to the mistaken perception that qualitative research doesn’t have much to offer after all.

In addition, clients play a key role in the infrequent use of qualitative approaches in information design. Some hold the persistent view that quantitative research, such as market research, is all they need, and struggle to understand how a more interpretative type of research can make a difference to the final outcome. Even those who seem to understand its value are reluctant to allocate the necessary time and budget, because they are perpetually in a hurry and pressed for immediate results.

Regardless of whether they are information designers or clients, what stands between those who don’t see the value in qualitative research and those who do is their different points of view about reality. When we encounter something that seems to contradict our beliefs, our first reaction is to dismiss it as incorrect or invalid. Each research approach is rooted in what are called paradigms, philosophical traditions, or points of view about reality. Each paradigm represents a way of reasoning about the world and making sense of its complexities; that is, people who hold different points of view believe in different types of ‘truths.’ When a point of view doesn’t resonate with ours, we don’t understand it. Paradigm-related biases are the source of misunderstandings and the reason why someone may dismiss the value of one research approach and not another.

There are many quantitative and qualitative research paradigms, each proposing a different explanation of reality that shapes the way we see and understand things. In particular, this section discusses the views of positivists and constructive-interpretatives.

Many information designers, clients, marketers, product managers, and consultants are positivists: they judge what counts as “good” research, and as credible and valuable findings, based on “facts” they can observe. For example: how fast did people complete the new form? How many people downloaded the new app? How much faster is booking a flight with the new interface? For many organizations, truth is provided by numerical data gathered via scientific methods, not by thick descriptions. The value of these descriptions remains mostly unknown, undervalued, or unappreciated.

Positivists also assume that people’s actions can be predicted, and that there is a right answer that leads to accurate predictions. In information design, these assumptions manifest strongly during the evaluation of solutions. Some information designers and clients believe that the only way to rigorously measure whether an information design solution is successful after implementation is through numbers, such as data collected with Google Analytics or surveys. While numbers and statistics are important because they provide hard evidence of overall effectiveness, numbers alone don’t suggest the reasons why a design works or doesn’t work, show human experiences, or indicate whether the context of use affects a solution’s performance in any way.

In contrast, constructive-interpretatives follow a different point of view from that of positivists: they believe that, while you can make inferences and educated guesses based on experiences and patterns, people’s dynamics and interactions are too complex for binary judgements such as right or wrong responses. Instead of looking for “observable events”, this paradigm looks at what something means to people. For example: what was the experience of completing the new form like for each person? How do people feel about the new app? To answer these questions, rather than seeking facts or right answers, constructive-interpretative researchers look for descriptions of human experience and meanings manifested through stories, behaviours, feelings, and opinions. In other words, they seek depth of understanding.

People find meaning in every situation, for example when navigating a space, working out how to use a device, or interpreting an infographic. The overall experience involves both how it makes them feel (“This is nice, I like it”) and what they can learn from it; that is, how they extract understanding from it (“To start using the remote control, first I need to press the red button”). In most cases, this overall experience matters most to them, regardless of whether they find an exit more quickly or achieve their goals in less time. To identify the meanings that people attribute to information design solutions, information designers need more than just a quantitative approach; they need one that helps them understand people holistically.

Circling back to the earlier discussion, Big Q studies, such as field studies, involve the use of qualitative methods within a qualitative paradigm, such as the constructive-interpretative paradigm. These methods place people at the centre in order to understand their social world. The insights gathered, such as people’s experiences, satisfaction, and emotions, can add significantly to traditional questionnaires, leading to more actionable recommendations, helping information designers make more objective decisions, and lending credibility to their claims. This type of qualitative insight gives information designers a supporting theoretical framework (the why) that even experienced designers don’t have.


Making Sense of Field Research: A Practical Guide for Information Designers, by Sheila Pontis, is available in hardcover, paperback, and eBook editions.
