Most brands that start out on consumer intelligence projects are guilty of saying this phrase at some point early on: ‘survey scripting is easy, it’s just common sense’.
Unfortunately, that’s not always the case. Survey scripting is genuinely hard (and we’re here to help, if you need it!), precisely because some of its best practices are counterintuitive.
Read on for our list of the 10 most unexpected best practices for survey scripting, to ensure that your next consumer insights survey collects the actionable results you need.
There’s no such thing as a perfect survey. There are, however, better and worse approaches.
‘The perfect survey’ for one company might be so far from perfect for another company that the results it produces are untrustworthy and unusable.
To create the perfect survey, if such a thing exists, several variable elements including audience, scope, aim and timing all need to come together in harmony. It’s clear, then, that changing any of these fundamental factors will change the nature of the survey itself, so there cannot be an objectively perfect survey that works for everyone.
We can, however, work towards scripting the best survey possible to provide high quality, actionable responses to key business questions.
Don’t signal the answer you’re hoping for. Sometimes it’s clear to respondents which answer the survey’s creator is looking to receive, and whether or not that answer is true for them will affect how they respond throughout the rest of the survey.
If this question were posed:
‘How much would you be willing to pay for a bottle of wine in the supermarket?’
And the answers listed as:
‘Under £5
£5-£7.50
Over £7.50’
It’s clear to respondents that the survey’s creator is mainly interested in consumers who fall into the £5-£7.50 bracket, as this is the only one they’ve put the effort into quantifying precisely. Respondents who pick either of the other two options may disengage from the rest of the survey, aware that they’re not the target audience.
Furthermore, the brand running the survey has revealed that it produces relatively average-priced wine, giving the respondent preconceptions as they move through the rest of the survey.
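If you script surveys programmatically, one way to sidestep this trap is to generate brackets of equal width, so every option looks equally ‘quantified’. The helper below is a minimal, hypothetical sketch; the function name and boundaries are ours for illustration, not any platform’s API:

```python
def price_brackets(low, high, step):
    """Build equal-width price brackets so no single option looks
    more precisely quantified (and therefore more 'wanted') than
    the others."""
    brackets = [f"Under £{low:.2f}"]
    x = low
    while x < high:
        brackets.append(f"£{x:.2f}-£{x + step:.2f}")
        x += step
    brackets.append(f"Over £{high:.2f}")
    return brackets

print(price_brackets(5.0, 10.0, 2.50))
# ['Under £5.00', '£5.00-£7.50', '£7.50-£10.00', 'Over £10.00']
```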
Being wary of self-reported averages is a controversial point, as many surveys rely on them. However, the point isn’t that you should never ask for a self-reported average; it’s that you should be aware of the question you’re actually receiving an answer to.
If you were asked ‘How much do you spend on groceries in an average week?’, you might struggle to think back further than last week, and so simply offer up last week’s total as a fairly representative figure without being exact.
When you’re asking for self-reported averages, it’s worth bearing in mind that most respondents will reply with the figure from the latest time they completed this activity, as it takes the least effort and might be the only response they can recall.
This very simple takeaway is one of consistency: avoid drawing the respondent’s eye to one particular answer in a list, and thus skewing results towards that answer.
An answer may draw the eye because it is shorter than all the others, longer, or phrased differently, as in these examples:
‘What’s your favourite chocolate bar?’
Twirl
Twix
Reese’s Peanut Butter Cups
Mars
Snickers
Cadbury Dairy Milk
Maltesers Buttons
Hershey’s Cookies & Cream
‘When did you last buy a new mattress?’
Less than one month ago
1-6 months ago
Six months to one year ago
Over one year ago
Randomisation won’t counteract this phenomenon; instead, answers should be adjusted for consistency or, if that’s not possible in a written format, swapped for image answers.
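As a rough automated check, you could flag options whose length makes them stand out visually from the rest of the list. The sketch below is a hypothetical heuristic of our own devising; it catches length outliers but not inconsistent phrasing, so it complements rather than replaces a manual read-through:

```python
from statistics import mean, stdev

def flag_eye_catching(options, z_cutoff=1.2):
    """Return options whose character length deviates strongly from
    the list's average: a crude proxy for 'draws the eye'.
    The 1.2 cutoff is an arbitrary assumption; tune it to taste."""
    lengths = [len(o) for o in options]
    mu, sigma = mean(lengths), stdev(lengths)
    if sigma == 0:
        return []
    return [o for o in options if abs(len(o) - mu) / sigma > z_cutoff]

chocolate_bars = ["Twirl", "Twix", "Reese's Peanut Butter Cups", "Mars",
                  "Snickers", "Cadbury Dairy Milk", "Maltesers Buttons",
                  "Hershey's Cookies & Cream"]
print(flag_eye_catching(chocolate_bars))
# ["Reese's Peanut Butter Cups", "Hershey's Cookies & Cream"]
```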
At Attest, we optimise all surveys by capping them at 7 minutes (23 questions). Plenty of independent research shows this to be the optimal length for high-quality responses with sufficient depth to address the scope, and after millions of completed answers on our platform, it’s clear that the research holds: high-quality, considered answers are just as likely at question 23 as at question 3.
The industry average survey, however, runs to 20 minutes. Considering that research shows the average attention span of a Brit is just 14 minutes, surveys this long will see a significant drop in quality towards the end. Drop-out rates also rise with survey length (making it much more expensive to get the number of completes you need). And although some respondents will be willing to answer until the very last question, these people might not be attitudinally representative of the population.
Sticking to a survey length of 7 minutes also means the most important questions can fit into their natural flow in the survey, rather than being shoehorned in at the beginning where attention is highest. This provides a better experience to respondents, which again increases the likelihood of you receiving the highest quality data.
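The arithmetic behind that cap is simple: 7 minutes across 23 questions works out at roughly 18 seconds per question (420 ÷ 23 ≈ 18). If you want a quick sanity check while drafting, something like the back-of-the-envelope sketch below will do; the 18-second figure is our assumption here, not a published benchmark, so adjust it for your own question mix:

```python
def estimated_minutes(n_questions, seconds_per_question=18):
    """Back-of-the-envelope completion time: 7 min / 23 questions
    is roughly 18 s per question (an assumed average, not a benchmark)."""
    return n_questions * seconds_per_question / 60

for n in (10, 23, 40):
    print(f"{n} questions ≈ {estimated_minutes(n):.1f} minutes")
# 10 questions ≈ 3.0 minutes
# 23 questions ≈ 6.9 minutes
# 40 questions ≈ 12.0 minutes
```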
When asking a question on a complicated topic, it’s natural to add as many words to the question as you feel are necessary to properly convey the request.
Counterintuitively, though, the best practice is to include only the vital words and keep the question as short as possible to avoid misinterpretation.
For instance, you might shorten: ‘Now can you please select the following brands you have heard of’ to ‘Have you heard of…’.
Decreasing reading time in this way will also keep engagement high.
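To make ‘decreasing reading time’ concrete, a rough word count is enough. In the sketch below, the 240-words-per-minute figure is an assumed average silent-reading speed, not a measured one:

```python
def reading_seconds(question, words_per_minute=240):
    """Rough silent-reading time for a question's wording,
    assuming an average reading speed of 240 wpm."""
    return len(question.split()) / words_per_minute * 60

print(reading_seconds("Now can you please select the following brands you have heard of"))  # 3.0
print(reading_seconds("Have you heard of…"))  # 1.0
```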
It’s common sense to remove marketing jargon from consumer surveys: for instance, changing ‘Do you think product X is a good brand fit for brand Y?’ to ‘Would you be surprised if brand Y started selling product X?’.
But how far should this be taken?
Again, the answer lies in consistency. Choose the phrasing that is consistent with the way a respondent would answer each question.
For instance, when choosing whether a question should read ‘When did you last buy/purchase product X?’ consider that a respondent is more likely to answer ‘I bought product X yesterday’ than ‘I purchased product X yesterday’, so you should opt to ask when they last bought the item.
When asking respondents to rank their emotions on a scale, offering images of humans can seem like a great way to inspire thought and feeling, but it can actually lead to confusion.
While a face might represent a professional, happy person to one respondent, to another that same expression might look smug.
Perhaps it’s a sad state of affairs, but people increasingly represent their feelings with emojis, and these can be a less ambiguous way to convey emotion.
Unbalanced scales can be a good thing. How can that be true, when we spend so much time removing bias from surveys?
As counterintuitive as it sounds, there certainly is a place for unbalanced scales in survey scripting.
Perhaps a brand is asking consumers about a topic that is generally considered to be a positive thing, for instance going on holiday. They’re interested in the extent of positivity towards each of the destinations they’re offering up, and less interested in the level of negativity (because if someone felt negatively about a holiday destination they’d simply rule that out of consideration). In this context, a scale such as the one below can be used to assess the positive reactions more granularly than a balanced scale could (without being too long and confusing).
‘How likely are you to visit Spain on a holiday in the next few years?’
Unlikely / Neither likely nor unlikely / Slightly likely / Very likely / It’s a certainty
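Scoring responses from an unbalanced scale is straightforward; for example, you might report the share of respondents landing anywhere on the positive side. The sketch below uses made-up responses purely for illustration:

```python
from collections import Counter

# The three positive points of the unbalanced scale above.
POSITIVE = {"Slightly likely", "Very likely", "It's a certainty"}

# Hypothetical responses to the Spain question, for illustration only.
responses = ["Very likely", "Slightly likely", "It's a certainty",
             "Unlikely", "Very likely", "Slightly likely"]

counts = Counter(responses)
share_positive = sum(counts[r] for r in POSITIVE) / len(responses)
print(f"{share_positive:.0%} positive")  # 83% positive
```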
Feeling overwhelmed? We know the feeling well. After all, we were new to survey scripting once upon a time!
Though much of the process is a common-sense exercise, the examples above are just some of the factors that are less easily grasped without experience. What works for one company, or even for another team within your own company, may not suit your purposes, but trialling the survey with friends and colleagues as you go will highlight the areas that need extra attention.
If you’d like to explore Attest’s scalable intelligence platform, and receive help from our talented team, get in touch with us today.
Content Team
Our in-house marketing team is always scouring the market for the next big thing. This piece has been lovingly crafted by one of our team members. Attest's platform makes gathering consumer data as simple and actionable as possible.