Our world has gotten feedback-happy. And it’s annoying.
In a 48-hour period at the beginning of May, I received five surveys.
One paper survey via snail mail with 66 questions to rate my son’s pediatrician. One online survey from a hotel I had stayed at the prior weekend and another from a hotel stay two weeks before. A third online survey from the conference I had just attended, and a fourth from OpenTable for the restaurant where I had dined the Friday before.
It seems there is little you can do these days without being asked to rate your experience. I had lunch last week with my friend Kathy who was bewildered by a gas station that asked her to rate her experience pumping gas.
Pumping gas. This has gotten out of hand, people.
The idea of using customer satisfaction to drive performance has run amok.
Coupled with the ease of online survey services like SurveyMonkey, every hotel stay, online purchase and doctor visit triggers a survey request.
And don’t even get me started about surveys from car dealerships.
You’d think that, as a marketing research professional, I would find this exciting.
It’s not.
I am a consumer like you, with limited time on Earth, none of which I want to spend answering 66 questions about my son’s pediatrician.
This survey diarrhea wastes our time and has not made a proportional improvement in the commercial world.
Why?
Because most companies and organizations fail to realize that it is not conducting the survey that improves customer experience and brand image; it’s how you respond to the information.
Moreover, many surveys are poorly done.
Too many questions.
Badly written questions.
Questions with industry jargon that customers don’t understand.
Meaningless ratings scales, like the one in the graphic above.
What happens when you do a survey poorly?
You leave yourself wondering how to respond, because your questions weren’t designed to produce answers you can act on.
Then you feel like you’ve wasted the time and money you invested in the research. Worse yet, you may have damaged your brand by asking something of your customers and then ignoring their responses.
It doesn’t have to be this way.
Good research can set you up for a quantum leap in brand success.
You can get on the path to good research right now by following these simple rules:
- Only ask when you need to know. If you have no plans to use the information, don’t waste your time or your audience’s.
- Begin with the end in mind. What are your goals for the research? Be specific. Every survey should be designed for a clear purpose and action.
- Focus on what you need to know. Need-to-know items are the ones that lead to decisions and prompt action. Nice-to-know items lengthen surveys unnecessarily and cloud the purpose.
- Put the most important thing first. Maybe make it the only thing you ask.
- Avoid jargon. Use the language your audience would use. If you don’t know the words they would use, spend time talking to them directly, or observe interviews or focus groups.
- Heed the results. Take time to understand the insights you get and then act on them.
Handle your brand’s time in front of your audience with care.
One brand that does this well is Zappos. If you go to their website and click on “feedback on our website,” they ask two questions. Two and only two.
The first asks how likely you would be to recommend Zappos to a friend or colleague, on a scale of 0-10.
The second is “How can we improve Zappos.com?”
They prompt a similar survey after each purchase, with the second question being “How could we have improved this purchase experience for you?”
You can see the specific goals of their surveys and easily envision how they can use the responses they get.
I’m not saying that you should only ask two questions. I am saying that every question should serve the purpose of your survey. When the purpose is covered, your survey is complete.
So what did I do with the five surveys in two days?
I answered the one from the conference on the day it arrived and the OpenTable survey two days later, because I thought those organizations would listen and take action. I recycled the paper survey and deleted the emails from the two hotel chains.
Have you had any ridiculous survey experiences? Please share them here.
You hit a nerve with this one. I recently called Schwab to ask for the company’s snail mail address and subsequently answered a short survey about this very short interaction. That same afternoon, a Schwab rep called me for details. I told him why I had said what I had said — and thereby doubled the time I had to spend with Schwab. After this call, I received two voice messages from Schwab, both asking me to call back and give them more info about this survey. I ignored them. I then got an email asking me for more input. I ignored that. Then I got ANOTHER phone call and spent about 10 minutes telling the woman how harassed I felt by this high level of demand for feedback. I have to say she didn’t appear to get it. Very irritating!
Wow, Kathryn, Schwab has really gone out of their way to abuse your kindness.
I do wonder if this is a case of lower scores having an impact on someone’s performance review. Some large companies have adopted this practice, and I think it is unhealthy.
No matter the reason, I have to believe your affinity for Schwab was damaged in the process. I hope they are at least generating good returns for you!
Completely agree about our survey-happy society; it is fatiguing.
Like step #2 the best. Start with the desired output and work back. Also think you are spot on about asking what you need to know and will act on versus what we are merely curious about, because knowing more is always fun (unless of course you are the respondent). Excellent practical piece on what we should consider when surveying our market. Keep ’em coming!
Thanks so much Jill! And yes, I think #2 is perhaps the most important. It’s the one that ensures the usefulness of your research efforts.
Three comments –
1. Totally agree with survey fatigue. It is maddening how many I receive in a given week, and lately the only ones I am inclined to fill out are for more ‘negative’ experiences. (My hope being that some good would come out of my response.)
2. I find calling Comcast infuriating (beyond the fact that they are Comcast.) Before I am allowed to be routed to an agent, I have to stop to agree to a survey or actively opt out. Given some recent experiences, I half believe that when I say I won’t do a survey, I get ‘less qualified’ reps and the calls take twice as long to resolve anything.
3. Your car dealership one really hit the nail on the head for me. I take my car in for regular service and am ‘reminded’ by the service contact that ‘corporate’ only cares about ‘Excellent’ responses and they are dinged for any lesser responses. So are you saying I should lie? To be honest, I can’t really ever say I have had an ‘excellent’ experience waiting for an oil change, you know what I mean? Then you finally get the survey and it is too long. AND, they send you reminders if you don’t complete it. I have received both email and phone calls to my house phone with reminders. To me, that is a 100% foolproof way of getting me to never reply.
Tina,
Thanks so much for your three comments.
1. You are not alone in your inclination to fill out surveys only for your negative experiences. Most respondents are either very happy or very unhappy; middle-of-the-road folks rarely take the time.
2. Comcast is maddening, and your theory is interesting. Why not test it next time by saying yes to the survey and seeing if you get a more qualified rep? There is no law saying you have to stay on the phone for the survey once your issue is resolved. I recently called Fidelity’s new credit card partner and was not asked but told I would have a one-minute survey after the call. I did not stay on the line.
3. Oh, the car dealership thing. I have adopted a policy of not filling out those surveys. The corporations have set their franchisees up to fail with those and clearly don’t care about the customer experience. To avoid wasting my time and hurting the dealership (the rewards are structured to take the focus off of service improvement and put it on begging for perfect scores), I just don’t answer. A bigger problem for the dealership and company: I stopped trusting them and no longer go there. I’ve only been back for recalls, and once because I was overdue for an oil change and my mechanic was out of the way.
Evelyn,
On the one hand, I can’t help but be happy that brands/companies are FINALLY interested in what would make a customer happy – but definitely hear you on the overkill – 66 questions… Um… NO Thanks.
I have not truly been inundated w/ surveys lately (probably jinxed myself), but I do hate AT&T’s. EVERY TIME I call… I get a text message telling me that a survey is coming, THEN a text message with the survey. Did I mention I hate text messages and have told them repeatedly, while on the phone, NOT to send them? (Sorry, lots of caps here. Kind of annoyed.)
With those calls, I somewhat feel like you know whether I’m happy or not by how the conversation is going, and if I have to escalate to feedback, I will.
I also agree with having a goal in mind. I wouldn’t mind seeing a “We heard you” type of message down the line, and how they implemented said feedback… Do you think that would be overkill?
I think most feedback can be achieved with a 5-7 question survey (or fewer, as Zappos is clearly doing something right). More than that… you’re getting too picky and taking advantage.
Ironically, today at Home Depot a man came up and said, “You’ll make me so happy if you give me 90 seconds.” He startled me and I didn’t know how to reply. At least he set the time parameters. It ended up starting with a question that disqualified me when I didn’t have said water heater… so it was less than 15 seconds. Still, this approach could use work too.
Great topic once again. You always know just what to write about!
Kris,
Isn’t it ironic that a tool that is supposed to make us happier with a brand annoys us enough to leave us less happy with the brand than if they hadn’t asked? AT&T’s constant and probably automated surveys strike me more as a box that needed to be checked (customer satisfaction survey in place?) than a mechanism for truly engaging. I wonder if anybody is even looking at the data they are collecting.
I do like the Home Depot interviewer’s approach and have used it myself. Setting time expectations, especially when the time is short, conveys that you value the respondent’s time and leaves it to them to choose whether to partake.
Thanks for the kudos! Glad you enjoyed this.