Fred Reichheld



Ram Josyula

As a Six Sigma Master Black Belt (MBB) working on NPS and Voice of the Customer (VOC) surveys, I am grateful for Fred's timely observations on response rates from a Lean Six Sigma perspective. In 1997, as a newly minted MBB, I was chided and then guided by John Hollenback, the chief survey taker for GE, that any survey with less than a 60% response rate was in danger of bias. He also taught us survey ethics: if you do not expect an answer, do not raise the question in the first place. Fred has brilliantly put these concepts into the Lean Six Sigma paradigm: any unanswered question to which we had expected an answer is waste, or worse, a value-killer. In addition to relying on inadequate response rates, several practitioners continue to commit another set of grave mistakes: using statistical tools - designed for inference from sample data - to produce flawed but impressive-looking results, often drawing wrong and dangerous conclusions about customers, employees, and businesses. "Ignorance is not so much not knowing something as knowing something that ain't so" - Will Rogers.

Amy Shipley

As an organisation that works with companies to develop and implement online surveys, we find that achieving high response rates is a continuous challenge. We have been working with one large multinational organisation that surveys its customers throughout the world (in their own native languages) twice a year. The first time we sent the survey out, the response rate was 22%; 6 months later it was 30%; and the last time we did it, it was 39%. I believe this increase has been a direct result of their following our recommendation to feed back to customers the results of the survey and the areas they are working on improving. Their customers have seen they are serious about this process and now see that the time they spend completing the survey is worth it! (And funnily enough, their performance scores have increased!)

Shalini Malhotra

I would like to get some thoughts/experiences on what the minimum respondent count would be for an NPS score to be considered valid and meaningful. We conduct the NPS survey at a company level, but break it down by business unit for analysis. Doing that results in some businesses having fewer than 10 respondents - is that a reasonable sample size to analyze separately? I have been trying to find the answer to this in a lot of blogs but could not find anything. Please share any thoughts you may have on this.

Net Promoter Community

Shalini - we have routed your question to the discussion forum. Here is the link:
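In the meantime, a rough statistical take on Shalini's question: NPS is the difference of two proportions (%promoters minus %detractors), so its sampling error at small n is large. The sketch below is my own illustration using a standard normal-approximation formula - the function name and example numbers are assumptions, not anything from the post:

```python
import math

def nps_margin_of_error(promoters, passives, detractors, z=1.96):
    """Approximate 95% margin of error for an NPS estimate.

    NPS is the difference of two proportions, so its per-respondent
    variance is p_promoters + p_detractors - nps**2 (normal approximation).
    """
    n = promoters + passives + detractors
    p_p = promoters / n
    p_d = detractors / n
    nps = p_p - p_d
    se = math.sqrt((p_p + p_d - nps ** 2) / n)
    return nps, z * se

# Hypothetical segment of 10 respondents: 6 promoters, 2 passives, 2 detractors
nps, moe = nps_margin_of_error(promoters=6, passives=2, detractors=2)
print(f"NPS = {nps * 100:+.0f} points, +/- {moe * 100:.0f} points")
```

With 10 respondents the margin of error works out to roughly ±50 NPS points, which is why most practitioners treat segments that small as directional at best.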

Vinit Bharara

We are about to do our first NPS survey ever and are confused about how we can possibly generate such a high response rate. I recall from The Ultimate Question that phone surveys are frowned upon (because the answers would likely skew more positive) and e-mail is preferable. But simply getting 30% of folks to open any email these days is an enormous challenge, no?

Net Promoter Community

Here is a related discussion forum post (viewing posts requires brief registration):

Bob Vogel

We have a specialized hosted application with more than 13,000 customers logging in daily to run their IT Services businesses.

When we learned about the Net Promoter program, we developed an application that displays the Ultimate Question immediately after login, effectively "intercepting" the customer on the way into the app. We give them the choice of picking their score and submitting, or clicking "Ask Me Later." If they select the latter, they get a pop-up that explains how important their opinion is to us and that we will ask them again tomorrow. The system waits 24 hours before serving up the question again.

We just ran our first survey two weeks ago. Thus far we have a 60% response rate, and we're tracking the "Ask Me Laters." While we do have a handful of people who click that button every day, the vast majority answer within the first few requests. Once a customer responds, the system remembers when, and asks again 90 days later. This puts all of our customers on their own unique survey cycle.

We also built into our in-house CRM system a link to each company's Net Promoter score, so with the click of a button we can see all users from a given account and their individual scores. While we track their full response history, we show at a glance the most current response and the previous one. Graphic thumbs-up and thumbs-down icons are displayed next to promoters and detractors, respectively.
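For anyone building a similar dashboard, the bucketing behind those thumbs-up/thumbs-down icons follows the standard Net Promoter scale (9-10 promoter, 7-8 passive, 0-6 detractor). A minimal Python sketch - my own illustration, not Bob's actual implementation:

```python
def classify(score):
    """Standard Net Promoter buckets for a 0-10 likelihood-to-recommend score."""
    if score >= 9:
        return "promoter"    # would get the thumbs-up icon
    if score >= 7:
        return "passive"
    return "detractor"       # would get the thumbs-down icon

def net_promoter_score(scores):
    """NPS = %promoters - %detractors, reported as a whole number of points."""
    buckets = [classify(s) for s in scores]
    promoters = buckets.count("promoter")
    detractors = buckets.count("detractor")
    return round(100 * (promoters - detractors) / len(buckets))

# 3 promoters and 2 detractors out of 7 responses -> prints 14
print(net_promoter_score([10, 9, 9, 8, 7, 6, 3]))
```

Keeping the raw 0-10 responses (rather than only the computed score) is what makes the historical current-versus-previous comparison Bob describes possible.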

I'm sure we will be learning a lot as we get more experience with this fantastic tool.

Geoff Graham

I like Amy Shipley's initiative of rewarding respondents by sharing summary feedback from surveying.

My business surveys homeowners and homebuyers on behalf of about 600 builders, developers, and remodelers from around the United States. We've found that response rates correlate directly with company performance: the more likely the builder is to deliver an exceptional customer experience, the higher the response is going to be.

We've also seen a strong correlation between size of company and response rate: many of our smaller custom builders (under 20 homes per year) enjoy a 100% response rate. I suspect this is at least in part because the executives/owners generally know the customer personally (hence another great benefit of using a third-party to get objective feedback). Among our builders with more than 1,000 customers per year, our highest response rate is about 80%.

Response rates (and performance) are also clearly affected by the quality of customer contact information. The better a builder's information, the more likely they are to serve the customer effectively, and the more likely we are to elicit a survey response from the homeowner. This is challenging for some for-sale homebuilders who have not collected multiple phone numbers or email addresses, and whose customers' contact information may change after they move into the home. Often, the commencement of a survey initiative (and the builder's realization that a high response rate is critical) creates the incentive to collect better information.

While all of our surveys are brief, we've not seen a difference in response rate between 5-question and 20-question surveys.

Across our last 30,000 surveys, our response rate is a little better than 70%. For many of our building professionals it's above 90%, and for very few is it below 50%. We survey with a combined email, phone, and mail strategy.


Thanks for this interesting post!
