Qualitative, Quantitative, or Both? Tips for Choosing the Right Tool

Posted by Ashley Harrington

Wed, Aug 06, 2014

In market research, it can occasionally feel like the rivalry between qualitative and quantitative research is like the Red Sox vs. the Yankees. You can’t root for both, and you can’t just “like” one. You’re very passionate about your preference. But in many cases, this can be problematic. For example, using a quantitative mindset or tactics in a qualitative study (or vice versa) can lead to inaccurate conclusions. Below are some examples of this challenge—one that can happen throughout all phases of the research process:

Planning

Clients will occasionally request that market researchers use a particular methodology for an engagement. We always explore these requests further with our clients to ensure there isn’t a disconnect between the requested methodology and the problem the client is trying to solve.

For example, a bank* might say, “The latest results from our brand tracking study indicate that customers are extremely frustrated by our call center and we have no idea why. Let’s do a survey to find out.”

Because the bank has no hypotheses about the cause of the issue, moving forward with their survey request could lead to designing a tool with (a) too many open-ended questions and (b) questions/answer options that are no more than wild guesses at the root of the problem, which may or may not jibe with how consumers actually think and feel.

Instead, qualitative research could be used to provide a foundation of preliminary knowledge about a particular problem, population, and so forth. Ultimately, that knowledge can be used to help inform the design of a tool that would be useful.

Questionnaire Design

For a product development study, a software company* asks to add open-ended questions to a survey, such as: “What would make you more likely to use this software?” or “What do you wish the software could do that it can’t do now?”

Since most respondents are not engineers or product designers, questions like these can be difficult to answer. They are likely to yield a lot of not-so-helpful “I don’t know”-type responses rather than specific enhancement suggestions.

Instead of squandering valuable real estate on a question not likely to yield helpful data, a qualitative approach could allow respondents to react to ideas at a more conceptual level, bounce ideas off of each other or a moderator, or take some time to reflect on their responses. Even if the customer is not an R&D expert, they may have a great idea that just needs a bit of coaxing via input and engagement with others.

Analysis and Reporting

After reviewing the transcripts from an online discussion board, a client at a restaurant chain* states, “85% of participants responded negatively to our new item, so we need to remove it from our menu.”

Since findings from qualitative studies are not statistically projectable, applying quantitative techniques (e.g., descriptive statistics and frequencies) to them is not ideal: it implies a level of precision the findings don’t actually have. Further, it would not be cost-effective to recruit and conduct qualitative research with a group large enough to be projectable onto the general population.
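
To see why that “85%” deserves skepticism, here is a rough back-of-the-envelope check in Python. The participant count is an assumption made purely for illustration (the example doesn’t give one), as is the use of a simple confidence interval:

  # Rough sketch: how precise is "85% responded negatively" from a small
  # qualitative group? The participant count is a hypothetical stand-in.
  import math

  participants = 20        # assumed discussion-board size (not from the example)
  negative_share = 0.85    # the "85%" pulled from the transcripts

  # Normal-approximation 95% confidence interval for a proportion
  std_err = math.sqrt(negative_share * (1 - negative_share) / participants)
  low = negative_share - 1.96 * std_err
  high = negative_share + 1.96 * std_err

  print(f"95% CI: {low:.0%} to {high:.0%}")  # roughly 69% to 101%
  # Even this overstates the precision: the approximation breaks down at
  # samples this small, and discussion-board participants are not a random
  # sample of the customer base in the first place.

The point isn’t the exact interval; it’s that a percentage drawn from a group this small, and this far from a random sample, can’t bear the weight of a go/no-go menu decision.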

Rather than attempting to quantify the findings in strictly numerical terms, qualitative data should be thought of as more directional in terms of overall themes and observable patterns.

At CMB, we root for both teams. We believe both produce impactful insights, and that often means using a hybrid approach. The most meaningful insights come from choosing the approach or approaches best suited to the problem our client is trying to solve. However, being a Boston-based company, we can’t say that we’re nearly this unbiased when it comes to the Red Sox versus the Yankees.

*Example (not actual)

Ashley is a Project Manager at CMB. She loves both qualitative and quantitative equally and is not knowledgeable enough about sports to make any sports-related analogies more sophisticated than the Red Sox vs. the Yankees.


Topics: Methodology, Qualitative Research, Research Design, Quantitative Research

What's the Story? 5 Insights from CASRO's Digital Research Conference

Posted by Jared Huizenga

Wed, Mar 19, 2014

Who says market research isn’t exciting? I’ve been a market researcher for the past sixteen years, and I’ve seen the industry change dramatically since the days when telephone questionnaires were the norm. I still remember my excitement when disk-by-mail became popular! But I don’t think I’ve ever felt as excited about market research as I do right now. The CASRO Digital Research Conference was last week, and the presentations confirmed what I already knew—big changes are happening in the market research world. Here are five key takeaways from the conference:

  1. “Market research” is an antiquated term. It was even suggested that we change the name of our industry from market research to “insights.” In fact, the word “insights” was used by multiple presenters throughout the conference. This makes a lot of sense to me. Many people view market research as a process, whereas insights are the end result we deliver to our clients. Speaking for CMB, I can say that partnering with our clients to provide critical insights is a much more accurate description of our mission and focus. We and our clients know that percentages by themselves fail to tell the whole story and can, in fact, lead to more confusion about which direction to take.

  2. “Big data” means different things to different people. If you ask ten people to define big data you’ll probably get ten different answers. Some define it as omnipresent data that follows us wherever we go. Others define it as vast amounts of unstructured data, some of which might be useful and some not. Still others call it an outdated buzzword.  No matter what your own definition of big data is, the market research industry seems to be in somewhat of a quandary about what to do with it. Clients want it and researchers want to oblige, but do adequate tools currently exist to deliver meaningful big data? Where does the big data come from, who owns it, and how do you integrate it with traditional forms of data? These are all questions that have not been fully answered by the market research (or insights) industry. Regardless, tons of investment dollars are currently being pumped into big data infrastructure and tools. Big data is going to be, well, BIG.  However, there’s a long way to go before most will be able to use it to its potential.

  3. Empathy is the hottest new research “tool.” Understanding others’ feelings, thoughts, and experiences allows us to understand the “why behind the what.”  Before you dismiss this as just a qualitative research thing, don’t be so sure.  While qualitative research is an effective tool for understanding the “why,” the lines are blurring between qualitative and quantitative research. Picking one over the other simply doesn’t seem wise in today’s world. Unlike with big data, tools do currently exist that allow us to empathize with people and tell a more complete story. When you look at a respondent, you shouldn’t only see a number, spreadsheet, or fancy graphic that shows cost is the most important factor when purchasing fabric softener. You should see the man who recently lost his wife to cancer and who is buying fabric softener solely based on cost because he has five years of medical bills. There is value in knowing the whole story. When you look at a person, you should see a person.

  4. Synthesizers are increasingly important. I’m not talking about the synthesizers from Soft Cell’s version of “Tainted Love” or Van Halen’s “Jump.” The goal here is to once again tell a complete story and, in order to do this, multiple skillsets are required. Analytics have traditionally been the backbone of market research and will continue to play a major role in the future. However, with more and more information coming from multiple sources, synthesizers are also needed to pull all of it together in a meaningful way. In many cases, those who are good at analytics are not as good at synthesizing information, and vice versa. This may require a shift in the way market research companies staff for success in the future. 

  5. Mobile devices are changing the way questionnaires are designed. A time will come when very few respondents are willing to take a questionnaire over twenty minutes long, and some are saying that day is coming within two years. The fact is, no matter how much mobile “optimization” you apply to your questionnaire, taking it on a smartphone is still going to take longer than on a PC or tablet. Forcing respondents to complete the questionnaire on a PC isn’t a good solution either, especially since the already elusive sub-25-year-old population spends more time on mobile devices than on PCs. So what’s a researcher to do? The option of “chunking” long questionnaires into several modules is showing potential, but it requires careful questionnaire design and a trusted sampling plan (see the sketch below). This method isn’t a good fit for studies where the analysis requires each respondent to complete the entire questionnaire, and the number of overall respondents needed is likely to increase. It also requires client buy-in. But it’s something that we at CMB believe is worth pursuing as we leverage mobile technologies.
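
For the curious, here is a minimal sketch of what “chunking” can look like, written in Python purely for illustration. The core questions, module contents, and base-size target are all invented, and this is not CMB’s actual process; a real design would also need balanced assignment, linking questions, and weighting.

  # Illustrative sketch of questionnaire "chunking": each respondent answers a
  # short shared core plus one assigned module, so nobody faces the full
  # 20+ minute instrument on a phone. All names and numbers are hypothetical.
  import random

  core = ["brand_awareness", "overall_satisfaction"]      # asked of everyone
  modules = {
      "A": ["feature_priorities", "price_sensitivity"],
      "B": ["channel_usage", "support_experience"],
      "C": ["ad_recall", "purchase_intent"],
  }

  def assign_questionnaire(respondent_id):
      """Return the question list for one respondent: core + one module."""
      # Seeding on the respondent ID keeps the assignment stable on re-entry.
      module_key = random.Random(respondent_id).choice(sorted(modules))
      return core + modules[module_key]

  print(assign_questionnaire("r-00042"))

  # Each module is seen by only ~1/3 of respondents, so hitting the same
  # per-question base size means recruiting roughly 3x the completes.
  target_base_per_module_question = 400
  total_completes_needed = target_base_per_module_question * len(modules)
  print(total_completes_needed)  # 1200

The design choice worth noting is the shared core: it gives every respondent a common set of measures for linking the modules back together at analysis time.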

Change is happening faster than ever. If you thought the transition from telephone to online research was fast—if you were even around back in the good old days when that happened—you’d better hold onto your seat! Information surrounds every consumer. The challenge for insights companies is not only to capture that information but to empathize, analyze, and synthesize it in order to tell a complete story. This requires multiple skillsets as well as the appropriate tools, and honestly the industry as a whole simply isn’t there yet. However, I strongly believe that those of us who are working feverishly to not just “deal” with change but to leverage it, and who are making progress with these rapidly changing technological advances, will be well equipped for success.

Jared is CMB’s Director of Field Services and has been in the market research industry for sixteen years. When he isn’t enjoying the exciting world of data collection, he can be found competing at barbecue contests as the pitmaster of the team Insane Swine BBQ.

 


Topics: Qualitative Research, Big Data, Mobile, Research Design, Quantitative Research, Conference Insights

Deconstructing the Customer Experience: What's in Your Toolkit?

Posted by Jennifer von Briesen

Wed, Sep 25, 2013

More and more companies are focusing on trying to better understand and improve their customers’ experiences. Some want to become more customer-centric. Some see this as an effective path to competitive differentiation. Others, challenging traditional assumptions, are applying new strategic thinking about value creation (e.g., Experience Co-creation, originated by my former boss, Francis Gouillart, and his colleagues Prof. Venkat Ramaswamy and the late C.K. Prahalad). Decision-makers in these firms are starting to recognize that every single interaction and experience a customer has with the company (and its ecosystem partners) may either build or destroy customer value and loyalty over time.

While companies traditionally measure customer value based on revenues, share of wallet, cost to serve, retention, NPS, profitability, lifetime value, etc., we now have more and better tools for deconstructing the customer experience and understanding the components driving customer and company interaction value at the activity/experience level. To really understand the value drivers in the customer experience, firms need to simultaneously look holistically, go deep in a few key focus areas, and use a multi-method approach.

Here’s an arsenal of tools and methods that are great to have in your toolkit for building customer experience insight:

Qualitative tools

  • Journey mapping methods and tools

  • In-the-moment, customer activity-based tools

    • Voice capture exercises (either using mobile phones or landlines) where customers can call in and answer a set of questions related to whatever they are doing in the moment.

    • Use mobile devices and online platforms to upload visuals, audio, and/or video to answer questions (e.g., as you fill out your enrollment paperwork, take a moment to record a quick video of less than 10 seconds to share your thoughts on what you are experiencing).

  • Customer diaries

    • E.g., use mobile devices as a visual diary or to complete a number of activities

  • Observation tools

    • Live or virtual tools (e.g., watch/videotape in-person or online experiences, either live or after the fact)

    • On-site customer visits: companies I’ve worked with often like to join customers doing activities in their own environments and situational contexts. Beyond basic observation, company employees can dialogue with customers during the activities/experiences to gain immediate feedback and richer understanding.

  • Interviews and qualitative surveys

  • Online discussion boards

  • Online or in-person focus groups

Quantitative tools

  • Quantitative surveys/research tools (too many to list in a blog post)

  • Internal tracking tools

    • Online tools for tracking behavior metrics (e.g., landing pages/clicks/page views/time on pages, etc.) for key interactions/experience stages. This enables ongoing data-mining, research and analysis.

    • Service/support data analysis (e.g., analyze call center data on inbound calls and online support queries for interaction types, stages, periods, etc. to look for FAQs, problems, etc.).

What tools are you using to better understand and improve the customer experience? What tools are in your toolkit?  Are you taking advantage of all the new tools available?

Jennifer is a Director at South Street Strategy Group. She recently received the 2013 “Member of the Year” award from the Association for Strategic Planning (ASP), the preeminent professional association for those engaged in strategic thinking, planning, and action.

Topics: South Street Strategy Group, Strategic Consulting, Methodology, Qualitative Research, Quantitative Research, Customer Experience & Loyalty

Can Quantitative Methods Uncover Emotion?

Posted by Megan McManaman

Wed, Sep 14, 2011

Picture yourself pushing your cart down the grocery store aisle: you’ve planned your meals and are making choices that suit your family’s tastes and budget. The decisions you make are rational and logical. But as anyone who’s ever felt a sense of nostalgia over a chocolate chip cookie, or empowered by their choice of the natural peanut butter, can tell you, they are also emotional.

As market researchers, we’re interested in knowing what decisions consumers make, how they make them, and why. Traditionally, we’ve used quantitative (survey) approaches to discover the “what” and the “how,” and turned to qualitative methods (IDIs, focus groups) to understand the “why,” including the emotions underlying these decisions. But merely asking people to name their emotions is not enough: language biases, the tedium of having subjects choose from lists of 50 or more emotions, and the dangers of self-report for something so nebulous are all difficulties researchers face. At the same time, scientists and quantitative researchers have come to recognize the extent to which decision-making takes place in the subconscious mind. The question is: how can we apply rigorous measurement to what seem like the most irrational, unpredictable human characteristics?

Medical science has offered new possibilities, using relatively established technologies to gain insight into the relationships between human emotion and brain activity. EEGs, eye tracking, and even MRIs have helped us understand the nuances and complexity of the brain’s response in very concrete and visual ways. An fMRI like the one pictured below, and other technologies, are valuable in their ability to measure brain responses that the subject might not even know they’re having. But there are limitations: beyond being prohibitive from a cost perspective, the results lack the nuance and detail necessary for effective application by market researchers.

[Image: fMRI measuring brain response]
Researchers from AdSAM, a research company focused on Emotional Response Modeling, have developed a methodology using non-verbal techniques to identify and measure emotional response to understand consumer attitudes, preferences, and behavior. This approach uses pictorial scales to capture emotional reactions and predict behavior while minimizing the language biases common in verbal approaches and contextualizing the results of more costly brain imaging approaches.

Want to know more? Please join us on September 21st as CMB’s Jeff McKenna and AdSAM’s Cathy Gwynn discuss the development and application of this new approach to emotional response measurement.


Posted by Megan McManaman. Megan is part of CMB’s marketing team, and she isn't proud to say buying ketchup makes her happy.

Topics: Emotional Measurement, Webinar, Quantitative Research

Compilation Scores: Look Under the Hood

Posted by Cathy Harrison

Wed, Aug 03, 2011

My kid is passionate about math, and based on every quantitative indication, he excels at it. So you can imagine our surprise when he didn’t qualify for next year’s advanced math program. Apparently he barely missed the cut-off score - a compilation of two quantitative sources of data and one qualitative source. Given this injustice, I dug into the school’s evaluation method (hold off your sympathy for the school administration just yet).

Undoubtedly, the best way to get a comprehensive view of a situation is to consider both quantitative and qualitative information from a variety of sources.  By using this multi-method approach, you are more likely to get an accurate view of the problem at hand and are better able to make an informed decision.  Sometimes it makes sense to combine data from different sources into a “score” or “index.”  This provides the decision-maker with a shorthand way of comparing something – a brand, a person, or how something changes over time.

These compilation scores or indices are widely used and can be quite useful, but their validity depends on the sources used and how they are combined. In the case of the math evaluation, there were two quantitative sources and one qualitative source. The quantitative sources were the results of a math test conducted by the school (CTP4) and a statewide standardized test (MCAS). The qualitative source was the teacher’s observations of the child across ten variables, each rated on a 3-point scale. For the most part, I don’t have a problem with these data sources. The problem was in the weighting of these scores.

I’m not suggesting that the quantitative data is totally bias-free, but at least the kids are evaluated on a level playing field: they either get the right answer or they don’t. In the case of the teacher evaluation, many more biases can impact the score (such as the teacher’s preference for certain personality types, or for the kids of colleagues or teacher’s aides). The qualitative component was given a 39% weight – equal to the CTP4 (“for balance”) and greater than the MCAS (weighted at 22%). This puts a great deal of influence in the hands of one person. In this case, it was enough to override the superior quantitative scores and disqualify my kid.
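
To make the math concrete, here is a small worked example in Python. The 39/39/22 weights are the ones described above; the individual scores, the common 0-100 scale, and the cutoff are invented for illustration:

  # Worked illustration of the compilation score. The 39/39/22 weights come from
  # the post; the scores and the cutoff are made up, all on a 0-100 scale.
  weights = {"ctp4": 0.39, "teacher": 0.39, "mcas": 0.22}

  def composite(scores):
      """Weighted compilation score."""
      return sum(weights[k] * scores[k] for k in weights)

  strong_tests_low_teacher = {"ctp4": 95, "mcas": 96, "teacher": 60}
  weak_tests_high_teacher = {"ctp4": 80, "mcas": 78, "teacher": 95}

  cutoff = 85  # hypothetical qualification threshold
  examples = [
      ("strong tests, low teacher rating", strong_tests_low_teacher),
      ("weaker tests, high teacher rating", weak_tests_high_teacher),
  ]
  for label, scores in examples:
      total = round(composite(scores), 1)
      print(label, total, "qualifies" if total >= cutoff else "does not qualify")
  # -> strong tests, low teacher rating 81.6 does not qualify
  # -> weaker tests, high teacher rating 85.4 qualifies

With the teacher’s rating weighted as heavily as an entire standardized test, the second (hypothetical) student clears the bar and the first doesn’t, which is exactly the kind of swing described above.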

Before you think this is just the rant of a miffed parent with love blinders on, think of this evaluation process as if it were a corporate decision that had millions of dollars at stake.  Would you be comfortable with this evaluation system?

In my opinion, a fairer evaluation process would have been to qualify students based on the quantitative data (especially since there were two sources available) and then, for those on the “borderline,” to use the qualitative data to make a decision about qualification. Qualitative data is rarely combined with quantitative data in an index. Its purpose is to explore a topic before quantification or to bring “color” to the quantitative results. As you can imagine, I have voiced this opinion to the school administration but am unlikely to be able to reverse the decision.

What’s the takeaway for you?  Be careful of how you create or evaluate indices or “scores.” They are only as good as what goes into them.

Posted by Cathy Harrison.  Cathy is a client services executive at CMB and has a passion for strategic market research, social media, and music.  You can follow Cathy on Twitter at @virtualMR     

 

Topics: Advanced Analytics, Methodology, Qualitative Research, Quantitative Research