28 Bad Survey Question Examples: How to Spot & Avoid Them
Discover 25+ bad survey question examples to avoid common survey mistakes and learn how to write good survey questions for accurate results.
Bad survey questions are the uninvited guests at your data party. They sneak in, ruin the vibe, and leave you with a mountain of confusing, unreliable answers. They don’t just annoy your respondents—they can lead to bad business decisions and embarrassing mistakes. In this article, you’ll discover the ugly truth about common survey mistakes, see real examples of question wording errors, learn how they show up, and find easy ways to avoid them. Stick around until the end for a brisk checklist to spot and fix survey blunders before they wreck your results.
Quick Primer – What Makes a Survey Question “Bad”?
What’s So Bad About a Bad Survey Question?
A bad survey question is like a malfunctioning compass: it points respondents in the wrong direction and leaves you lost in the land of unreliable data. So, what gives a question this dubious honor? Several culprits:
They introduce bias, nudging answers in a particular direction.
Ambiguity creeps in when questions are vague or unclear.
Double-barreled questions juggle two ideas at once, muddying insights.
Negative and double-negative wording makes interpretation tricky.
Some questions offer incomplete or overlapping options, trapping respondents in confusion.
Others may ask for overly sensitive or presumptive information, causing discomfort or dishonesty.
Problematic questions are often the result of rushed survey design, lack of feedback, or simply not knowing the rules of great survey design. Their impact? Skewed data, frustrated respondents, and wasted resources.
Armed with an understanding of what makes a question “bad,” you’re ready to spot the villains hiding in your next survey. Let’s break down the culprits and fix them for good!
Double-barreled questions, which address multiple issues simultaneously, can confuse respondents and compromise data quality. (en.wikipedia.org)

Create your survey. It's 100% free.
How to Create a Survey with HeySurvey in 3 Easy Steps
If you’re new to HeySurvey and want to build your own survey—whether it’s to avoid bad questions or gather reliable feedback—here’s how to do it in a snap. Not sure about survey design yet? No worries! You can start by opening one of HeySurvey’s handy templates with the button below these instructions.
Step 1: Create a New Survey
After logging into HeySurvey (or starting without an account), click “Create New Survey.”
Choose your preferred starting point:
- Empty Sheet to build from scratch,
- Pre-built Template to save time with ready-made structures, or
- Text Input Creation to simply type your questions and let HeySurvey format them.
Give your survey an internal name so you don’t lose it in your survey library.
Step 2: Add Your Questions
Inside the Survey Editor, click “Add Question” at the top or between existing questions.
Pick the question type: single or multiple choice, scales, text input, or statements for info display.
Enter your question wording thoughtfully—remember to apply what you learned about avoiding bias, double-barrels, ambiguous terms, and absolutes.
Mark questions as required if you want to make sure respondents can’t skip them.
Add images or apply markdown formatting for better readability and engagement.
Use the branching feature to create custom survey paths based on answers, making your survey more relevant and dynamic.
Step 3: Preview and Publish
Click the Preview button to see your survey exactly as respondents will, testing flow and usability.
After you’re happy, hit the Publish button.
You’ll receive a shareable link or iframe embed code for your website.
Just a heads up: publishing requires creating a free HeySurvey account so you can collect and view responses.
Bonus Step: Apply Branding, Define Settings, or Skip Into Branches
Branding: Upload your logo and customize colors, fonts, and backgrounds using the Designer Sidebar to make the survey feel truly yours.
Settings: Adjust start and end dates, set response limits, define redirect URLs after completion, or allow respondents to view aggregated results.
Branching: Use advanced branching to control the survey path based on respondents’ answers, showing only relevant questions and improving response quality (see the sketch below for the underlying concept).
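Curious what branching actually does? Conceptually, it’s just a map from answers to next questions. Here’s a minimal Python sketch of that idea (a generic illustration, not HeySurvey’s actual data format; the question ids are made up):

```python
# A generic sketch of answer-based branching (not HeySurvey's internal format).
# Each question id maps answer choices to the next question to show.
BRANCH_MAP = {
    "q1_satisfied": {"Yes": "q2_highlights", "No": "q3_problems"},
    "q2_highlights": {"*": "q4_wrapup"},  # "*" matches any answer
    "q3_problems": {"*": "q4_wrapup"},
}

def next_question(current: str, answer: str) -> str | None:
    """Return the next question id for an answer, or None to end the survey."""
    routes = BRANCH_MAP.get(current, {})
    return routes.get(answer, routes.get("*"))

print(next_question("q1_satisfied", "No"))  # -> q3_problems
```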
With these simple steps, HeySurvey makes crafting well-designed, respondent-friendly surveys easy—even for first-timers. Ready to get started? Just click the template button below and let the data magic begin!
Leading & Loaded Questions
The Pitfalls of Pushing a Point
Leading and loaded questions are sneaky saboteurs. They’re designed—sometimes unintentionally—to steer respondents toward a particular answer. This happens when your wording or phrasing hints at a “right” response or loads assumptions directly into the question.
You’ll recognize a leading question by its pushy tone or because it all but answers itself. A loaded question, meanwhile, often carries an assumption—sometimes absurd, sometimes offensive—that traps respondents into agreeing with your premise.
Why and When These Questions Get Used
It’s tempting to use leading or loaded questions when you want to confirm your own beliefs. Maybe you’re desperate for validation at a product launch, under pressure to prove a marketing campaign worked, or stuck in a stakeholder echo chamber. You might not even notice you’ve added a hint of bias—until it’s too late.
Leading and loaded questions also sneak into political polls and customer experience (CX) surveys, especially when teams want quick wins or compelling soundbites.
Five Sample Bad Questions: Leading the Witness
“Don’t you agree that our product is the best on the market?”
“How much did you love the keynote speaker?”
“How satisfied are you with our excellent customer service?”
“Why do you prefer brand A over inferior brand B?”
“How amazing was your flight experience today?”
Each of these questions nearly drags the answer out of respondents.
How to Fix: Keep It Neutral
Try this fix:
Use neutral wording free from adjectives or leading phrases.
Balance positive and negative response options to avoid telegraphing your desired outcome.
Ask, “Would someone disagree with this question’s premise?” If yes, rewrite it!
Your survey becomes infinitely more trustworthy when you ditch bias and let respondents speak their truth.
Leading and loaded questions in surveys can result in inaccurate data by influencing respondents to provide responses that may not reflect their true thoughts, feelings, or behaviors. (entropik.io)
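Want an automated first pass before a human review? A tiny script can flag superlatives and loaded adjectives in your draft questions. A minimal sketch, with an illustrative (not exhaustive) word list:

```python
import re

# Illustrative, not exhaustive: words that often signal a leading question.
LOADED_WORDS = {"best", "excellent", "amazing", "inferior", "love", "great"}

def flag_leading(question: str) -> list[str]:
    """Return any loaded words found in the question text."""
    tokens = re.findall(r"[a-z']+", question.lower())
    return sorted(LOADED_WORDS.intersection(tokens))

print(flag_leading("How satisfied are you with our excellent customer service?"))
# -> ['excellent']
```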
Double-Barreled Questions
The Danger of Two-for-One Questions
Double-barreled questions are data disasters in disguise. Each one combines two (or more!) topics into a single question, expecting just one answer. This isn’t just lazy writing—it’s sabotage for your survey’s clarity. Respondents are left confused about which part they’re supposed to answer, leading to messy, unreliable data.
Why and When They Sneak In
These questions pop up for a variety of reasons:
You’re running out of space and want to speed things up.
You’re rushing and don’t have time for careful question vetting.
You’ve fallen victim to survey fatigue and want to pack multiple issues into just one question.
When clarity loses to convenience, double-barreled questions rear their heads, and your insights suffer.
Five Sample Bad Questions: Double Trouble
“How satisfied are you with your salary and job security?”
“Rate the taste and packaging of our new snack.”
“Was the website easy to find and navigate?”
“Do you support lower taxes and increased spending on schools?”
“How helpful and friendly was the support team?”
How can anyone possibly answer these accurately? If you don’t split the issues, you’ll never know which half influenced the response.
Tips for Clarity
Here’s how to rescue your survey:
Split each concept into separate questions so you can analyze each topic independently.
For related items, consider matrix tables, where respondents can rate each attribute (like “taste” and “packaging”) on its own row.
Check each question for “and” or “or.” If you spot one, it’s likely a double-barreled culprit.
Clear, single-focus questions are your secret weapon for turning muddled surveys into powerful data tools.
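The “and/or” check above is easy to automate as a rough filter. A minimal sketch; treat hits as candidates for review rather than verdicts, since some conjunctions (like “black and white”) are harmless:

```python
import re

def looks_double_barreled(question: str) -> bool:
    """Heuristic: flag questions that join topics with 'and' or 'or'."""
    return re.search(r"\b(and|or)\b", question, flags=re.IGNORECASE) is not None

for q in [
    "How satisfied are you with your salary and job security?",
    "How satisfied are you with your salary?",
]:
    print(looks_double_barreled(q), "-", q)
# True  - the double-barreled original
# False - the single-topic rewrite
```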
Ambiguous or Vague Questions
When “Regular,” “Often,” and “Affordable” Mean…What, Exactly?
Ambiguity is the survey designer’s arch-nemesis. When a question is ambiguous or vague, each respondent interprets it differently. That means your data isn’t just unreliable—it’s basically fortune-cookie wisdom dressed up as analytics.
When you ask questions without clear time frames, definitions, or context, you leave answers open to a huge range of interpretation. That’s a sure ticket to unusable results.
Why and When They Appear
Vague questions usually sneak in because:
You skipped operational definitions—the clear, shared meanings behind terms.
Your survey draft was rushed, or lacked cross-team review.
You didn’t account for cultural or regional differences in how people define frequency and value.
Failing to define what “regular” means or exactly what “affordable” entails? That’s a recipe for chaos.
Five Sample Bad Questions: The Vagueness Hall of Fame
“Do you exercise regularly?”
“Do you use social media often?”
“Are our prices affordable?”
“Was the event satisfactory?”
“Do you think management communicates enough?”
What’s “often” for one person might be “rarely” for another. “Affordable” can mean bargain-basement to some and just-not-outrageous to others.
How to Fix Ambiguity
Ramp up clarity by:
Specifying clear time frames or frequencies (e.g., “How many times did you…”).
Swapping vague adjectives for concrete, defined terms.
Providing context to anchor interpretations (such as “in the last month” or “relative to competitors”).
With clear, defined questions, “regular” finally means something—and so does your data.
Ambiguous survey questions, such as those lacking clear definitions or time frames, lead to inconsistent interpretations and unreliable data. (pmc.ncbi.nlm.nih.gov)
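One way to operationalize those fixes: keep a glossary of vague terms mapped to concrete rewrites, and flag drafts that still use them. A minimal sketch (the glossary entries are illustrative, so tune them to your own operational definitions):

```python
# Illustrative glossary: vague terms mapped to concrete rewrites.
VAGUE_TERMS = {
    "regularly": "3 or more times per week",
    "often": "at least once a week",
    "affordable": "within your typical monthly budget",
}

def flag_vague(question: str) -> dict[str, str]:
    """Return vague terms found in a question, with suggested replacements."""
    lower = question.lower()
    return {term: fix for term, fix in VAGUE_TERMS.items() if term in lower}

print(flag_vague("Do you exercise regularly?"))
# -> {'regularly': '3 or more times per week'}
```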
Absolute & Extreme Wording (“Always,” “Never,” “Every”)
The Perils of Absolutism in Surveys
Questions with absolute or extreme wording force respondents into extreme positions. When a question uses words like “always,” “never,” or “every,” it creates an all-or-nothing trap, even when the truth is messy and nuanced.
These kinds of questions dramatically inflate your error rate. Most people “sometimes” do things, but rarely “never” or “always” do them. Your respondents end up just picking whatever is closest, or worse, abandoning your survey entirely.
Why and When They Are Used
Absolutes sneak into surveys for a few reasons:
New survey writers crave decisive insights and think absolutes produce clearer data (spoiler: they don’t).
Legal or compliance checks may require rigid, all-encompassing questions.
Sometimes, it’s out of habit or impatience.
What you get: less accuracy, more frustration.
Five Sample Bad Questions: Absolutes Gone Wild
“Do you always recycle?”
“Have you ever missed a payment?”
“Do you never shop online?”
“Do you eat meat every day?”
“Will you purchase every product we launch?”
It’s easy to see how tricky and unrealistic these questions are.
How to Replace Absolutes with Real Answers
Swap out absolutes for frequency scales:
The best surveys offer ranges like “Never,” “Rarely,” “Sometimes,” “Often,” and “Always.”
Ask about behaviors during specific time frames (like “in the last month”).
Avoid double-checking your respondent’s entire life history with a single word.
Everyone makes exceptions. Your survey should, too.
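To make the swap concrete, here’s a minimal sketch that turns a yes/no absolute into a time-boxed frequency question with a balanced scale (the wording and scale values are illustrative):

```python
# Before: "Do you always recycle?" (an all-or-nothing trap)
# After: a time-boxed frequency question with a balanced scale.
FREQUENCY_SCALE = ["Never", "Rarely", "Sometimes", "Often", "Always"]

question = {
    "text": "In the last month, how often did you recycle household waste?",
    "options": FREQUENCY_SCALE,
}

def is_valid_response(answer: str) -> bool:
    """Accept only answers that appear on the defined scale."""
    return answer in question["options"]

print(is_valid_response("Sometimes"))  # True
print(is_valid_response("Kind of"))    # False
```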
Negatively Worded & Double-Negative Questions
When Surveys Become a Logic Puzzle
Double negatives and negative wording create mental gymnastics for your respondents. These questions load up on “not,” “disagree,” or “oppose,” which forces participants to slow down and decode what you’re actually asking.
This complexity increases cognitive load—the mental effort required to understand and answer the question. That means more mistakes, skipped questions, and dirty data.
Why and When Negativity Creeps In
Negatives pop up for several reasons:
Some survey designers try to reduce “yes” bias by turning questions upside down.
Legal teams sometimes insist on awkward phrasing.
Poor translation from other languages can twist questions into knots.
The end result is a headache (and unreliable data) for everyone involved.
Five Sample Bad Questions: The Negativity Wall of Shame
“Do you disagree that our policies are not clear?”
“Isn’t it true that you don’t dislike the new interface?”
“Do you not want to cancel your subscription?”
“Would you oppose not increasing the budget?”
“Are you against not hiring more staff?”
It takes a triple espresso just to answer some of these.
How to Fix: Keep it Simple and Positive
Try these tricks:
Rephrase questions in the affirmative—positive statements are much easier to comprehend.
Run readability tests or ask a friend to review tricky items (if their eyes glaze over, try again).
If your legal department insists on negatives, add a clarifying explanation.
Positive, direct questions = happy respondents and cleaner, smarter data.
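You can even script a rough check for negation pileups: count negation markers per question and flag anything with two or more. A minimal sketch, with an illustrative marker list:

```python
import re

# Illustrative set of negation markers; extend it for your own surveys.
NEGATIONS = {"not", "no", "never", "don't", "doesn't", "isn't",
             "disagree", "oppose", "against"}

def count_negations(question: str) -> int:
    """Count negation markers in a question."""
    tokens = re.findall(r"[a-z']+", question.lower())
    return sum(1 for t in tokens if t in NEGATIONS)

q = "Do you disagree that our policies are not clear?"
if count_negations(q) >= 2:
    print(f"Possible double negative ({count_negations(q)} markers): {q}")
```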
Non-Exhaustive or Overlapping Answer Options
When Answer Choices Become a Trap
Not every survey mistake comes from the questions themselves. Sometimes, it’s about bad answer options. If answer choices aren’t mutually exclusive and collectively exhaustive (MECE), your data gets messy fast.
Overlapping ranges force people to choose between two applicable answers.
Non-exhaustive sets leave respondents unrepresented—or, worse, make them abandon your survey.
Both problems warp your findings.
Why and When It Happens
Often, answer flaws originate from:
Hasty segmentation work and rushing to copy-paste options from prior surveys.
Lack of planning, leading to missed answers for some groups.
Oversights in designing age, income, and frequency ranges.
If your answer options don’t “fit” every respondent, your survey doesn’t fit the real world.
Five Sample Bad Questions: The Answer Trap Collection
“What is your age? 18-25, 25-35, 35-45, 45+”
“How often do you shop online? Rarely, Sometimes, Frequently, Often”
“Household income? < $30k, $30k-$50k, $50k-$75k, $75k-$100k”
“Which device do you use? Laptop, Desktop, Tablet, iPhone”
“How long have you been a customer? 0-6 months, 6-12 months, 1-2 years, 2+ years”
Some respondents won’t know where they fit. Others will fit in multiple categories. Neither is good for your numbers!
How to Design Flawless Answer Sets
Keep these rules close:
Create MECE answer options: no overlaps, and all possibilities are covered.
Offer an “Other (please specify)” option when possible, for unexpected answers.
Double check that no respondent is left without a place to fit their situation.
Great answer choices are just as important as well-written questions.
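For numeric brackets like age or income, the MECE rule can be verified programmatically: sort the ranges, then look for overlaps and gaps. A minimal sketch (the bracket values are illustrative):

```python
def check_mece(ranges: list[tuple[int, int]]) -> list[str]:
    """Report overlaps and gaps in inclusive (low, high) answer brackets."""
    problems = []
    ordered = sorted(ranges)
    for (lo1, hi1), (lo2, hi2) in zip(ordered, ordered[1:]):
        if lo2 <= hi1:
            problems.append(f"Overlap: {lo1}-{hi1} and {lo2}-{hi2}")
        elif lo2 > hi1 + 1:
            problems.append(f"Gap: nothing covers {hi1 + 1} to {lo2 - 1}")
    return problems

# The bad age brackets from above: boundaries 25 and 35 appear twice.
print(check_mece([(18, 25), (25, 35), (35, 45)]))
# ['Overlap: 18-25 and 25-35', 'Overlap: 25-35 and 35-45']

# A MECE version leaves no one out and no one doubled up.
print(check_mece([(18, 24), (25, 34), (35, 44)]))
# -> []
```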
Sensitive or Presumptive Questions
When Surveys Get a Little Too Personal
Sensitive and presumptive questions risk making your respondents squirm. Asking for deeply personal information or making assumptions can lead to dishonest answers, abandoned surveys, or worse: a reputation for being invasive.
Such questions can foster social desirability bias (people answer with what sounds “right,” not what’s true) and genuine privacy concerns.
Why and When They’re Asked
These tricky questions often show up because:
Market segmentation studies want precise personal details.
HR or compliance teams need in-depth demographic data.
Health and wellness researchers ask about habits or history.
But not every topic is fair game, especially when anonymity isn’t guaranteed.
Five Sample Bad Questions: The Overstep Files
“Why did you vote for candidate X?”
“How much do you weigh?”
“How often do you drink alcohol at work?”
“Why don’t you have children?”
“Where do you go to church?”
Yikes—these are more likely to spark eye-rolls or outright refusals than honest answers.
How to Respect Boundaries
Here’s your cheat sheet:
Make sensitive questions optional and be transparent on why you’re collecting the info.
Allow for anonymous responses, especially with touchy topics.
Use indirect questioning to soften the impact (e.g., “People like you…”).
Treat your respondents as you’d treat a guest at your own dinner table: with respect, discretion, and empathy.
Best Practices & Dos and Don’ts for Avoiding Bad Survey Questions
When you want data you can trust, these best practices help you dodge common survey pitfalls. Here’s a quick-scan checklist you can keep handy for every survey project.
Dos
Always pilot test your survey with a small sample before full launch.
Use plain language and keep sentences short.
Provide balanced answer scales—let respondents choose from clear, logically ordered options.
Ensure your answer sets are mutually exclusive and collectively exhaustive (MECE).
Design with a respondent-first mindset—ask, “Would I find this question easy and comfortable to answer?”
Don’ts
Don’t lead with bias or insert opinions into your questions.
Don’t combine unrelated ideas or cram double-barrels.
Avoid jargon, acronyms, or technical language that might confuse people.
Never use double negatives or complicated logic structures.
Don’t pry into sensitive topics without offering an opt-out or explaining why you need the data.
Great survey writers aren’t born—they’re made through continual learning, regular A/B testing of wording, and inviting outside experts to review their drafts. Every question is an opportunity to learn and refine your approach. Trust in good survey design, and your data (and business decisions) will thank you.
Ready to level-up your surveys? Download our free “Good Question Template” or subscribe for more research wisdom. Your respondents—and your future self—will be glad you did!