It was fun to pick apart T-Mobile’s NPS mistakes recently. So when I bought a shiny new MacBook Air online, I was very curious to receive an email asking about my shopping experience. After all, we all know Apple is a world leader in NPS.
How does the world’s NPS leader measure its own NPS?
It was not at all what I was expecting to see! Apple’s system for measuring NPS does not abide by many of the principles laid out on this blog. However, some of those choices may still make strategic sense precisely because it is Apple. Today we will review what Apple does right, wrong and just plain ugly when measuring NPS.
[NOTE: See bottom of the post for the entire survey in full]
Point #1: Not In-App NPS
Apple could have easily chosen to embed the NPS survey directly into their shopping cart. The response rate would certainly have been higher if they had, since only a fraction of people will end up opening the email, a smaller fraction will click the survey, and a smaller fraction still will finish it.
But Apple chose (wisely) to use the more intimate and personal email for measuring NPS and the overall customer experience. We have already discussed (to some heated debate in the comments) why in-app NPS is a mistake, so I won’t rehash the reasons for doing email NPS campaigns. But I was glad to see Apple making the right choice here.
Point #2: The Email Subject Line
The email received from Apple had some strengths and some weaknesses. The subject line was a big weakness. To understand why, let’s understand how people read email…
According to data from the US Consumer Device Preference Report: Q4 2013, well over half of all email (a full 65 percent) is now accessed via mobile devices in the U.S.
If people are reading email on phones, subject lines must be shorter. The subject line for Apple’s NPS email is “How was your Apple Online Store shopping experience?”
Sure it is descriptive, but on a phone the only thing you can see is “How was your Apple…”
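To see how this plays out, here is a quick sketch in Python. The 22-character cutoff is an assumption for illustration; actual truncation limits vary by mail client, device and font.

```python
# Illustrative sketch: how a mobile mail client might truncate a subject line.
# The 22-character limit is an assumption; real limits vary by client and device.
def subject_preview(subject: str, limit: int = 22) -> str:
    """Truncate a subject line at the last whole word that fits."""
    if len(subject) <= limit:
        return subject
    return subject[:limit].rsplit(" ", 1)[0] + "..."

print(subject_preview("How was your Apple Online Store shopping experience?"))
# -> How was your Apple...
print(subject_preview("Following up"))
# -> Following up
```

A short subject survives the cutoff intact; a long one loses everything after the first few words.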
For Apple, this mistake is not such a big deal. Their brand is so strong that when you get an email from Apple (no matter what the subject) you are very likely to open it and see what it says.
But for smaller brands, a shorter email subject, one that teases at what might be inside, will get you higher engagement rates. For example: “Following up” is a great subject line. It is short and readable, yet makes you want to know what the person is following up about. Let the email content do the explaining; let your subject line grab their attention.
Point #3: The Email Content
Apple did a good job with their email content. They begin with gratitude. “Thank you for shopping with us.” It would be an even stronger email if it could be slightly more customized: “Thank you for recently buying a MacBook Air.” Even though it is obviously an automated message, that little detail makes me feel like I am a particularly important customer of theirs.
Then they avoid a big mistake we have seen before. They do not offer any incentive for filling out their survey. In an attempt to get higher click rates, I have seen companies offer bribes (usually some kind of sweepstakes giveaway or gift card) to take their surveys.
These incentives ultimately get people to just fill out the forms as quickly as possible and put random data in them. That’s of no use to you, so I was glad to see Apple avoiding this mistake.
Point #4: The Placing of the NPS Question
Apple decided to put the NPS question third, behind how satisfied the customer was and how likely they are to purchase again.
At Promoter, we believe that of all surveys, the NPS survey is uniquely positioned to transform the growth trajectory of a business. Apple clearly isn’t unlocking NPS’s potential, as asking numerous questions reduces its overall effectiveness.
I should also note that NPS can already tell you who your “satisfied” customers are (hint: the passives) and predict future purchase behavior anyway. There is no point in making the survey harder to complete.
More questions = Lower response / completion (yes, it’s that simple)
Point #5: The Placing of the Open-Ended Question
The open-ended follow-up question that usually comes with NPS is placed at the end of Apple’s survey. This is a big mistake if you actually want feedback and plan to follow up with your customers after the survey.
Also, Apple’s wording is a little funny: “Anything else on your mind?”
At Promoter.io, we have sent millions of NPS surveys and the most effective open-ended question we have found is: “What is the most important reason for your score?”
It is more specific and invites people to offer more constructive criticism.
Point #6: The Number of Questions
This is one aspect I simply couldn’t understand about Apple’s survey. It had 23 questions spread across 15 separate steps. When running NPS, the rule of thumb is: with each added click or question, your chances of someone completing the survey drop by 50%.
By this estimate, we can calculate that the total response rate for this survey is approximately:
0.50 × 0.50 × 0.50 × … × 0.50 (23 times) = 0.50^23 ≈ 0.000012%
When you are Apple, selling 300 million devices a year, that naive estimate still means 36 people fill out this entire form in detail every single year (!!!). Of course Apple collects far more data than that, since answers are submitted incrementally as people progress through the survey. The drop-off isn’t really independent per question, either: someone who has already answered 10 questions is very likely to answer 20 as well. So they will get far more than 36 completed surveys. I would guess that around 1% of the 300,000,000 recipients who click on the survey fill it out entirely. That’s still 3,000,000 responses.
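The back-of-the-envelope math is easy to reproduce. Here is a quick sketch in Python using the same assumptions as above (a 50% drop per question, 23 questions, 300 million devices sold a year):

```python
# Rule of thumb from above: each added question halves the completion rate.
questions = 23
completion_rate = 0.50 ** questions        # tiny fraction of recipients finish
devices_sold = 300_000_000                 # rough annual figure used above
full_responses = devices_sold * completion_rate

print(f"{completion_rate:.6%}")            # -> 0.000012%
print(round(full_responses))               # -> 36
```

Treat this as a worst-case bound rather than a forecast: as noted above, drop-off per question is not really independent, so the true completion rate will be far higher.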
But if you limit your survey to just 2 questions, imagine the volume of data they would collect. In Apple’s case, that volume might be too large, which is why they feel it’s ok to ask so much. But if you are selling hundreds or thousands of your product a year, not hundreds of millions, you might want to think hard before asking so many questions.
The quality and quantity of responses are both critically important.
Point #7: Unnecessary Questions
Given how many questions were asked on this survey, it raises the question of how many of these questions are necessary in the first place. For example, Apple asks if the customer scheduled an appointment for a video setup session afterwards. This seems like something they should be able to detect on their own.
Especially when you have too many questions in your survey, it’s important to see how much data collection in the survey can be automated.
Point #8: The Shipping Question
Another rule of thumb when constructing your survey questions: the likelihood that your customer will respond to a question is inversely related to how many potential answers you offer them.
Look for example at the shipping question in the middle of the survey. “Which of the following best describes when your item(s) were received?”
There are 8 potential answers to click on. The chances that someone will read through all 8 and pick the appropriate one are slim to none. I would wager that most of the responses are “Don’t know/not sure” simply because people don’t want to read through so many options.
Point #9: The Time to Fill Out Survey
Along with the sheer length of this survey, it takes forever to fill out. For me, it took about 15-20 minutes, and I consider myself pretty proficient at filling in things on my mobile device. How many people are really willing to sit for 15-20 minutes for any task these days? The average person spends 15 SECONDS on any given website at a time. So expecting people to spend 15 minutes on your survey is a big mistake. Again, if you are dealing with millions of survey recipients per year, you can get by gathering data from the small sub-section willing to take their time with this. But then you have to ask yourself whether that data is poisoned by coming from a self-selected group of people with long attention spans and nothing better to do with their time. How valuable is that data really going to be?
Point #10: The Follow-Up To NPS
The worst part of Apple’s NPS survey is that after I spent 20 minutes, I got absolutely nothing but a thank you screen.
I’m not saying that Apple should call each and every person who responds to their survey and thank them personally (though that would be pretty freaking awesome). But if a customer spends 20 minutes of their time giving you their valuable input, it seems like they deserve some consideration.
Again, not a gift card or anything tangible. But a sincere, personalized thank you that takes my feedback into account and shows that someone actually read it, rather than letting it end up in a void (which, in this case, I am sure it did).
And, as I have pointed out countless times before on this blog, the NPS follow-up is the most important growth driver for doing NPS right. That’s why I give you a free cheat sheet to make it really easy for you to know how to respond when customers do give you feedback. No matter what that feedback is (positive, neutral or negative).
A lot of what Apple does with this survey only makes sense in the context of a company getting millions of customers every year. But I hope that deconstructing Apple’s survey has helped you see your own surveys in a different way. Even the best of the best has room for improvement (a lot of it).
To give credit where credit is due, Apple does a better job than anyone at cultivating Promoters, who have a big hand in growing their brand. Their promoters, advocates, “fanboys” (whatever you may call them) are their single biggest growth channel.
We will keep our eye out for more big companies and their NPS surveys to see what else we can learn, so subscribe to the blog for more updates.