Last weekend, my family and I took a trip to Charlotte, North Carolina. We rented a car and stayed at a hotel. Within 12 hours of arriving home, I received an online survey from each company. In both cases the experiences were excellent, and I was happy to share the details. One survey took me about 1 ½ minutes to complete; the other took about 10 minutes. When I reached the end of the 1 ½-minute survey, I thought, “Well, they asked about the key aspects of the experience and got what they needed.” In contrast, by the time I reached the midpoint of the 10-minute survey, I was exhausted and just wanted to end the damn thing – and then, when I finally reached the end, they asked if I wanted to answer more (!?!) questions.
In the 1 ½-minute survey, I could clearly see the questions focused solely on the experience and on managing the key aspects of the service. They probably have more than enough data to get deep insights: they know who I am, they know my travel details, and they have similar data for the thousands of other travelers who are also rating the experience.
In the 10-minute survey, I could see that the company was asking for details beyond the experience: they were seeking to understand competitive positioning and future intended travel behaviors – all things that are clearly outside the scope of the service experience. They also asked questions about very detailed aspects of the experience, e.g., the mechanical condition of the car and the softness of the towels. It led me to ask: “Really? You want me to rate this aspect of the service? Aren’t you guys smart enough to tell whether these things are up to standard?”
Here’s an example from another industry: homebuilding. I’ve seen surveys that ask buyers to rate the window quality in the home. Why?!? Shouldn’t the builder know if the windows they are putting into the home are high-grade or low-grade? Remember, we’re assessing the home purchase experience, NOT homebuyer preferences. If you’re trying to achieve both in the same research study, you’re going to be (as Mr. Miyagi says) “like the grasshopper in the middle of the road.”
As researchers and companies asking our valued customers for feedback, we need to be very aware of the unstated agreement for what’s in scope and out of scope for these customer experience surveys. I’m not opposed to having surveys do “double-duty,” but we should be clear with our customers that we are doing so, AND not kill them with gruelingly long surveys.
Jeff is VP, Market Science Solutions at CMB. He always takes time for a customer experience survey – but keep it short: he’s very busy, and he needs time to blog and occasionally tweet @McKennaJeff.