I recently went on a first date with a musician. We spent the first hour or so talking about our careers: the types of music he plays, the bands he’s been in, how music led him to the job he has now, and, of course, my unwavering passion for data. Later, when there was a pause in the conversation, he said: “so, do you like music?”
Um...how was I supposed to answer that? There was clearly only one right answer (“yes”) unless I really didn’t want this to go anywhere. I told him that, and we had a nice laugh...and then I used it as a teaching opportunity to explain one of my favorite market research concepts: Leading Questions.
According to Tull and Hawkins’ Marketing Research: Measurement and Method, a Leading Question is “a question that suggests what the answer should be, or that reflects the researcher’s point of view.” Example: “Do you agree, as most people do, that TV advertising serves no useful purpose?”
In writing good survey questions, we need to give enough information for the respondent to fully answer the question, but not so much that we give away either our own opinions or the responses we expect to hear. This is especially important in opinion research and political polling, where slight changes in word choice can create bias and impact the results. For example, in its 1937 poll, Gallup asked, “Would you vote for a woman for President if she were qualified in every other respect?” This implies that simply being a woman is a disqualification for President. (Just so you know: 33% answered “Yes.”) Gallup has since changed the wording—“If your party nominated a generally well-qualified person for President who happened to be a woman, would you vote for that person?”—and the question is now included in a series of questions in which “woman” is replaced with other descriptors, such as Catholic, Black, Muslim, gay, etc. Of course, times have changed, and we can’t know exactly how much of the bias was due to the leading nature of the question, but 92% answered “Yes” as recently as June 2015.
The ordering of questions is just as important as the words we choose within them. John Martin (Cofounder and Chairman of CMB, 1984–2014) taught us the importance—and danger—of sequential bias. In writing a good questionnaire, we’re not just firing off a bunch of questions and collecting responses—we’re taking the respondent on a 15 (or 20 or 30) minute journey, trying to capture his or her most unbiased, genuine opinions and preferences. For example, if we start a questionnaire by showing a list of brands and asking which ones are fun and exciting, and then ask unaided which brands respondents know of, we’re not going to get very good data. Just as if we ask someone whether he or she likes music after talking for an hour about the importance of music in our own lives, we might get skewed results.
One common rule when it comes to questionnaire ordering is to ask unaided questions before aided questions. Otherwise, the aided questions would remind respondents of possible options—and inflate their unaided answers. A couple more rules I like to keep in mind:
- Start broad, then go narrow: talk about the category before the specific brand or product.
Remember that the respondent is in the middle of a busy day at work or has just put the kids to bed and has other things on his/her mind. The introductory sections of a questionnaire are as much about screening respondents and gathering data as they are about getting the respondent thinking about the category (rather than what to make for the kids’ lunch tomorrow).
- Think about what you have already told the respondent: like a good date, the questionnaire should build.
In one of my recent projects, after determining awareness of a product, we measured “concept awareness” by showing a short description of the product to those who had said they were NOT aware of it and then asking whether they had heard of the concept. Later in the questionnaire, we asked respondents which product features they were familiar with. For respondents who had seen the concept awareness question (i.e., those who hadn’t been fully aware), we removed the product features that had been mentioned in the description—of course those respondents would know them.
- When asking unaided awareness questions, think about how you’re defining the category.
“What Boston-based market research companies founded in 1984 come to mind?” might be a little too specific. A better wording would simply be: “What market research companies come to mind?” Thinking about the client’s competitive set will usually help you figure out how to define the category.
So, remember: in research, just as in dating, what we put out (good survey questions and positive vibes) influences what we get back.
Talia is a Project Manager on CMB’s Technology and eCommerce team. She was recently named one of Survey Magazine’s 2015 Data Dominators and enjoys long walks on the beach.