Every company claims to care about customer satisfaction. Slick TV salespeople claim that “customers make the difference” or that “they’re our bottom line.” On the Internet, the vast majority of company landing pages feature the words “values” or “goals” in bright colors, immediately followed by claims about how their customers come first.
While it's understandable why many companies make these claims—regardless of how valid they are—what's alarming to me is the percentage of companies that are completely backwards in the way they measure customer feedback. It's as though they think it's more important to claim satisfied customers than to actually find out what makes them tick. Not only do the majority lack the proper strategies to accurately gauge satisfaction, but they also compound the mistake by making serious decisions based on the inaccurate data they gather.
Let's take a look at the ways we can properly measure customer satisfaction and fix broken surveys!
Cementing Customer Satisfaction
The days of compliments and complaints delivered inside our offices or shops have fallen by the wayside as smartphones and social media have ushered in a radically more connected world. Our help tickets and personal conversations have, in many cases, been replaced by Twitter inquiries and automated customer satisfaction surveys.
The truth is that the resulting data often misleads companies. If the survey is not structured and conceptualized in a methodical manner, the results can often lead companies to attempt to fix issues that may not even be important to their customers. George Bradt, executive onboarding expert, explains in an article in Forbes:
“The fix is measuring satisfaction on an actionable dimension across a scientifically representative sample and then helping your front line managers use that data to effect change.”
Let's look at three ways that companies are misled by their customer satisfaction surveys.
1. They ask the wrong question.
Oftentimes, business owners try to focus on too many measurables in a single survey. As they begin to create it, they think of the variety of channels through which their customers interact—call centers, social media, online retailers, in person—and they attempt to make one all-encompassing survey.
Too often, however, their survey boils down to one simple question:
How satisfied were you with your most recent experience?
Phrased this way, the question focuses customers on a specific visit, purchase or interaction, which rarely reveals valuable data about our product or overall customer service. When this question is asked, customers are largely going to be focused on their experience with a specific customer support platform or service technician. Without guiding questions that go beyond asking customers about their most recent experience, you are likely to receive unreliable responses that change depending on your customers’ current moods (or how long they were left on hold).
2. They focus on call experience, not their product.
Along the same lines, the easiest trap to fall into is overvaluing call-center experiences when creating our customer satisfaction surveys. What I mean is that some business owners think of customer satisfaction as a metric that can only be measured through conversations, email or social media interactions.
Were my agents kind and caring throughout the interaction? Did they answer promptly and act effectively in resolving the issue?
While questions like these are important and do affect how customers perceive your brand, they largely ignore the most important part of the business—the product. If we are receiving complaints on our customer service channels but are only measuring the effectiveness of our answers, we might be missing the real problem. If the product or service had worked well in the first place, our service channels wouldn’t be swamped with unhappy customers. Thomas Wailgum, Vice President of Media at ASUG, explains in an article for CIO,
“In all, 47 percent of consumers say companies meet their expectations only sometimes, rarely or never, and 41 percent describe the quality of service they receive as just fair, poor or terrible.”
Too often, our customer satisfaction surveys address only issues with customer/agent interactions and ignore everything about our product, which is likely why the customer called or wrote us in the first place. If we get to the root of the problem and address the customer's underlying concerns with our product, they won't need to interact with customer service at all.
3. They rely on bad sampling.
Creating a proper customer satisfaction survey is a challenge, but understanding whom to target is a challenge in itself. Luckily, the Internet has made collecting and tracking data easier than ever. However, the measurables here can be quite tricky.
The issue is often in the responder. Such a small percentage of our customers actually respond to these surveys that it becomes difficult to apply the data to a much broader array of people. Sandy Rogers, Managing Director of FranklinCovey’s Loyalty Practice, suggests the following in an article for Forbes,
“You can check this by comparing the future spending of your survey responders to non-responders. If it’s different, your sample is not representative.”
By tracking the spending of our survey responders, we can better understand whether they’re representative of our broader customer segments as a whole, and whether our customer satisfaction levels are having a measurable effect on our business.
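For the analytically inclined, Rogers's check can be sketched in a few lines of code. The numbers below are made up purely for illustration; in practice you would pull future-spend figures per customer from your own sales data and flag who responded to the survey.

```python
# A minimal sketch of the representativeness check described above:
# compare the future spending of survey responders to non-responders.
# All figures here are hypothetical, for illustration only.
from statistics import mean

# Hypothetical future-spend figures (in dollars) per customer
responder_spend = [120, 95, 140, 110, 130]
non_responder_spend = [60, 75, 55, 80, 70]

avg_responders = mean(responder_spend)          # 119.0
avg_non_responders = mean(non_responder_spend)  # 68.0

# A large gap suggests the survey sample is not representative
gap = abs(avg_responders - avg_non_responders)
print(f"Responders average:     ${avg_responders:.2f}")
print(f"Non-responders average: ${avg_non_responders:.2f}")
print(f"Gap:                    ${gap:.2f}")
```

With these toy numbers, responders spend far more on average than non-responders, which would mean the survey results cannot safely be generalized to the whole customer base.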
Fix the Flaws
I have found flawed customer satisfaction surveys to be prevalent in almost every industry. Finding an actionable way to drive change in our businesses should always be at the top of our minds, and that means gathering better data to drive those changes. By restructuring our surveys and narrowing their focus, we can collect more insightful data that can be better applied to serving our customers.
What problems do you experience with customer satisfaction surveys? Join the conversation by tweeting us @EidsonPartners!