You sent out a survey. You waited. The responses trickled in. Now you are staring at a 9% response rate and wondering if this is normal, or if something went wrong.

The short answer: it depends entirely on the channel. A 9% response rate from an email link is solidly average. The same 9% from an SMS survey would be a disaster. Without benchmarks, you are flying blind.

This article compiles the latest research on survey response rates by distribution channel, explains why rates have been declining for decades, and gives you five evidence-based strategies to push your numbers higher.

1. Response Rate Benchmarks by Channel

Response rates vary dramatically depending on how you deliver the survey. The table below summarizes current benchmarks from peer-reviewed sources and industry research published in 2025.

| Channel | Average Response Rate | Source |
| --- | --- | --- |
| Email (embedded survey) | 15 -- 25% | Zonka Feedback 2025 |
| Email (linked survey) | 6 -- 15% | Zonka Feedback 2025 |
| SMS | 45 -- 60% | SurveySparrow 2025 |
| In-app (mobile) | 27 -- 36% | Qualaroo 2025 |
| In-app (web) | 20 -- 27% | Qualaroo 2025 |
| Employee engagement | 60 -- 80% | Simpplr 2025 |

The spread is enormous. SMS surveys routinely hit 45 -- 60% response rates because text messages have near-universal open rates and the survey is right there in the conversation. Email linked surveys, by contrast, require the recipient to open the email, click a link, load a new page, and then start answering -- every step is a drop-off point.

Key Takeaway

Comparing your response rate to a generic "industry average" is misleading. Always compare against the benchmark for your specific channel. A 20% rate is excellent for email-linked surveys but below average for in-app mobile.

Why in-app surveys outperform email

According to Qualaroo's 2025 analysis, in-app surveys on mobile devices achieve 27 -- 36% response rates compared to email's 6 -- 15%. The reason is context: the user is already engaged with your product when the survey appears. There is no channel-switching friction. The survey feels like part of the experience rather than an interruption.

Web-based in-app surveys perform slightly lower at 20 -- 27%, likely because desktop users have more tabs and distractions competing for their attention.

Employee surveys are the outlier

Internal employee engagement surveys routinely achieve 60 -- 80% response rates, according to Simpplr. This makes sense: employees have a built-in incentive (improving their own workplace), the survey comes from a trusted sender (their employer), and there is often implicit social pressure to participate. If your employee survey is below 60%, something is actively wrong -- trust issues, survey fatigue, or a distribution problem.

2. The Long Decline of Survey Response Rates

Survey response rates have been falling for decades, and the trend is accelerating. The most comprehensive data on this comes from the Pew Research Center.

Pew Research Center data shows that telephone survey response rates dropped from 36% in 1997 to just 6% in 2018 -- a staggering six-fold decline over two decades. This is not a blip. It is a structural shift in how people relate to unsolicited requests for their time and attention.

The causes overlap: caller ID and call screening make ignoring unknown numbers effortless, robocalls and spam have poisoned the channel, and trust in unsolicited requests for time and attention has eroded across the board.

The telephone survey decline is a preview of what happens to any channel that gets saturated. Email surveys are now following the same trajectory, and the 2025 deliverability crisis (below) accelerated that decline sharply.

3. The 2025 Email Deliverability Crisis

If your email survey response rates dropped suddenly in 2025, you are not alone. According to a detailed analysis by KL Communications, inbox placement rates for survey emails collapsed from 49.98% to 27.63% -- meaning more than 7 out of 10 survey emails never reached the recipient's inbox.

What caused this? A combination of factors converged in late 2024 and early 2025, most notably the stricter bulk-sender rules that Google and Yahoo began enforcing in February 2024: authenticated sending (SPF, DKIM, and DMARC), one-click unsubscribe, and hard spam-complaint thresholds. Senders who missed any of these requirements saw their mail silently routed to spam.

Warning

If you are still relying on email as your primary survey channel, your actual reach may be less than half of what your email platform reports as "delivered." Delivered does not mean seen. Check your inbox placement rate (not just delivery rate) using tools like Google Postmaster Tools, Validity Everest, or GlockApps.

The practical implication is clear: researchers and businesses that depend on email surveys need to diversify their distribution channels or accept dramatically lower sample sizes.

4. Why Your Response Rate Is Low

Before jumping to solutions, diagnose the actual problem. Low response rates usually stem from one or more of these root causes:

Your survey is too long

According to Qualaroo, surveys that take longer than 12 minutes to complete see steep drop-off in participation. The ideal survey length is 5 minutes or fewer. Every additional minute beyond that costs you respondents.

You are surveying at the wrong time

Timing matters more than most people realize. Formbricks research found that post-experience feedback collected immediately is 40% more accurate than feedback collected 24 hours later. The same principle applies to response rates: people are most willing to give feedback while the experience is still fresh.

Your invitation does not communicate value

A subject line that says "Please take our survey" gives the recipient zero reason to click. Contrast that with "Help us decide what to build next -- 2 minute survey." The second version communicates both the purpose and the time commitment.

You have not earned trust

First-time survey senders to cold lists will always see lower rates than established brands surveying existing customers. Response rate is, in part, a measure of relationship strength.

Your audience never received the invitation

This is the most overlooked cause. If your email ended up in spam -- and in 2025, there is a 72% chance it did -- no amount of question optimization will help. Fix deliverability before optimizing content.

5. Five Proven Ways to Improve Response Rates

These strategies are ranked by the strength of the evidence behind them.

5.1 Offer incentives (especially monetary ones)

A 2019 systematic review and meta-analysis hosted on PubMed Central (PMC), the NIH's open-access archive, analyzed multiple studies and found that monetary incentives increase survey response rates by an average of 19 percentage points, while non-monetary incentives (gift cards, prize drawings, charitable donations) increase rates by 8 percentage points.

The effect is consistent across survey types and populations. Prepaid incentives (included with the invitation) outperform promised incentives (given after completion) because they trigger reciprocity: the respondent feels obligated to return the favor.

Practical Application

You do not need large incentives. The NIH meta-analysis found that even small amounts ($1 -- $5 prepaid) significantly boost response rates. The act of giving matters more than the amount.
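The cost side is easy to sanity-check. The sketch below is a back-of-the-envelope model, not a figure from the meta-analysis: it assumes a hypothetical 10% baseline rate, the 19-point average uplift cited above, and a $2 prepaid incentive sent to every invitee.

```python
def cost_per_response(invites, base_rate, uplift_pp, incentive):
    """Back-of-envelope cost per completed response with a prepaid incentive.

    base_rate and uplift_pp are fractions (0.10 = 10%, 0.19 = 19 points).
    Prepaid incentives go to every invitee, responder or not.
    """
    responses = invites * (base_rate + uplift_pp)
    total_cost = invites * incentive
    return total_cost / responses

# Hypothetical campaign: 10,000 invites, 10% baseline, +19 pp, $2 prepaid.
print(round(cost_per_response(10_000, 0.10, 0.19, 2.00), 2))  # → 6.9
```

Whether roughly $7 per completed response is worth paying depends entirely on the stakes of the decision the data will inform.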

5.2 Personalize the invitation

According to Qualtrics research, personalized survey invitations can increase response rates by up to 48%. Personalization goes beyond inserting the recipient's first name: effective invitations reference the recipient's actual relationship with you, such as the product they use, a recent purchase, or the support ticket they just closed, and come from a named sender rather than a generic address.

The underlying principle is relevance. A survey that feels like it was written specifically for you gets answered. A survey that feels like a mass blast gets deleted.

5.3 Send strategic reminders

According to UserFeedback research, sending 1 to 3 follow-up reminders can increase survey response rates by up to 36%. Spacing and restraint are the key details: leave a few days between reminders, and stop after the third, where returns diminish and unsubscribe risk rises.

Each reminder should be a fresh message, not a "just following up" forward. Rephrase the value proposition, mention how many people have already responded (social proof), and restate the deadline.

5.4 Keep it short

Qualaroo's data is clear: surveys under 12 minutes maintain reasonable completion rates; beyond that, abandonment increases sharply. But "under 12 minutes" should be a ceiling, not a target. The best-performing surveys take 2 -- 5 minutes.

Practical rules for keeping surveys short: cut any question whose answer will not change a decision, use skip logic so respondents only see questions relevant to them, and prefer closed questions over open text boxes, which take far longer to answer.

5.5 Survey at the right moment

Formbricks research found that post-experience feedback is 40% more accurate when captured immediately versus 24 hours later. This accuracy effect also translates to higher response rates: people are most willing to share feedback when the experience is fresh in their mind.

Best timing practices by context: trigger post-purchase surveys at order confirmation, send support-experience surveys as soon as the ticket closes, and show in-app surveys right after the user completes the action you are asking about.

The Compound Effect

These five strategies are not mutually exclusive. Combining a short, personalized, well-timed survey with a small incentive and one follow-up reminder can push response rates well above channel benchmarks. The companies achieving 40%+ email survey response rates are typically doing all five simultaneously.

6. How to Calculate Your Response Rate

The formula is straightforward, but the denominator matters:

Response Rate = (Completed Responses / Total Invitations Delivered) x 100

Notice: the denominator is "delivered," not "sent." If you sent 10,000 survey emails but only 5,000 reached inboxes (the rest bounced or went to spam), your denominator is 5,000. Using "sent" as the denominator artificially deflates your response rate and masks a deliverability problem.

Also distinguish between the response rate defined above and the completion rate: the share of people who started the survey and actually finished it.

A survey where 30% of recipients start but only 12% finish has a design problem -- people open it but drop off partway through. That is a signal to shorten the survey or fix confusing questions in the middle.
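Both formulas can be sketched in a few lines. The function name and the figures below are illustrative, not from any cited study:

```python
def survey_metrics(delivered, started, completed):
    """Response rate uses delivered invitations as the denominator;
    completion rate uses survey starts."""
    return {
        "response_rate": completed / delivered * 100,
        "completion_rate": completed / started * 100,
    }

# 10,000 sent, but only 5,000 reached an inbox; 1,500 starts, 600 finishes.
print(survey_metrics(delivered=5_000, started=1_500, completed=600))
# → {'response_rate': 12.0, 'completion_rate': 40.0}
```

Note that using the 10,000 "sent" figure as the denominator would report a 6% response rate and hide the deliverability problem entirely.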

7. What Response Rate Is "Good Enough"?

This is the question everyone asks, and the answer is less about the percentage and more about what you plan to do with the data.

For statistical validity

If you are making high-stakes decisions (pricing changes, product launches, policy shifts), you need enough responses for statistical significance. For a population of 10,000 people, you need roughly 370 responses for a 95% confidence level with a 5% margin of error. That is a 3.7% response rate. The response rate itself does not determine validity -- the absolute number of responses does.
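The 370 figure comes from the standard sample-size formula with a finite-population correction. A minimal sketch, assuming z = 1.96 for 95% confidence and the most conservative proportion p = 0.5:

```python
import math

def required_sample(population, z=1.96, margin=0.05, p=0.5):
    """Cochran's sample-size formula with finite-population correction."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2           # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # correct for finite N

print(required_sample(10_000))  # → 370
```

Because the required count grows very slowly with population size, the same 5% margin needs only about 385 responses even from a population of millions, which is why absolute counts matter more than percentages.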

For directional insight

If you are looking for general trends rather than precise measurements ("Do customers prefer feature A or feature B?"), even 50 -- 100 responses can be useful. Many product decisions are made on this basis, and that is perfectly reasonable as long as you acknowledge the limitations.

For benchmarking over time

Sometimes the absolute response rate matters less than the trend. If your quarterly customer satisfaction survey gets a consistent 18% response rate and then drops to 11%, that decline itself is a data point worth investigating -- even if both numbers are "low" in absolute terms.
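To check whether a drop like 18% to 11% is noise or a real shift, a two-proportion z-test is the standard tool. A minimal sketch using the normal approximation; the per-wave sample sizes below are hypothetical:

```python
import math

def two_proportion_z(r1, n1, r2, n2):
    """z-statistic for the difference between two response proportions.
    r = responses, n = delivered invitations in each survey wave."""
    p1, p2 = r1 / n1, r2 / n2
    pooled = (r1 + r2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical waves: 180/1,000 responses (18%) vs 110/1,000 (11%).
z = two_proportion_z(180, 1_000, 110, 1_000)
print(abs(z) > 1.96)  # → True: the drop is significant at the 95% level
```

With a z-statistic well above 1.96, a drop of this size on samples of 1,000 is almost certainly real rather than sampling noise, and worth investigating.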

Non-Response Bias

The real danger of low response rates is not small sample size -- it is non-response bias. If the people who did not respond are systematically different from those who did (e.g., unhappy customers are less likely to take your satisfaction survey), your results may be misleading regardless of the response rate. Always consider who is missing from your data.

Build Polls That Get Answered

Poll Pixie makes it easy to create short, focused polls that respect your audience's time. Embed them anywhere, share via link, and watch results in real-time.

Create a Free Poll