If you want better results from your original research, take a hard look at the survey experience.
By Clare McDermott
More companies are using research as a form of content marketing – be it survey-based studies that dive deep into an industry trend or analyses of internal user data to show off the brand’s expertise and point of view.
Yet, all that new interest comes with a learning curve. Original research is one of those areas where so much can go wrong, especially if you’re inexperienced. Most problems I see relate to one of two things: poor survey design or faulty statistical analysis.
For this article, I’ll focus on one aspect of survey design – the survey experience.
Survey experience is about how well your survey takers think the questions are relevant, intelligent, and appropriate. Will they be able to (and feel motivated to) complete your survey? Will they answer honestly and openly? Would they answer a survey in the future based on this experience?
While it may seem like a nice-to-have, a good survey experience will boost completion rates and accuracy. A bad survey experience can nuke your survey results (more on that in a bit).
After 20 years of working with clients using research-as-content, I pay close attention to these eight experience elements.
The survey-taker experience begins before the first question and continues after they hit complete. In the invitation to take the survey, be sure to explain why you’re conducting the survey, what you aim to do with the data, and how long the survey will reasonably take. (Be brief in these explanations. You don’t want someone to drop out before they’ve even begun.)
If you plan to collect survey responses from anyone in Europe, even unintentionally, turn on the GDPR opt-in settings available on most major survey platforms and provide a link to your company’s data privacy policies.
Also, mind the post-survey experience. Someone who completes the survey should see a custom thank-you page, not the default page provided by your survey platform.
If you collect survey takers’ email addresses, be crystal clear about your intent. For example, at my company, we collect emails from those who want a copy of the final report or to enter the raffle for completing the survey. We never use an email for any other purpose (and in fact, I strip out the email column in my spreadsheet and put it in a different tab to decouple identities from responses).
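If you’d rather script that separation than do it by hand, here’s a minimal sketch in Python with pandas. The file and column names (responses.csv, email) are hypothetical stand-ins for whatever your survey platform exports:

```python
import pandas as pd

# Load the raw survey export (file and column names are hypothetical).
responses = pd.read_csv("responses.csv")

# Pull opted-in emails into their own file for the report list or raffle...
responses[["email"]].dropna().to_csv("raffle_emails.csv", index=False)

# ...then drop the column so the analysis file never touches identities.
responses.drop(columns=["email"]).to_csv("responses_anonymized.csv", index=False)
```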
If survey takers believe their responses will be used to market to them in any way, they will not want to take your survey.
Don’t you love the “please-take-our-20-minute-survey” invitation? That’s a hard no for me. Unless you pay someone to take your survey, it should never be more than eight minutes – and even eight minutes is a big ask.
Consider this: Completion rates drop with each additional minute required to answer the questions. In my experience, completion begins to fall off a cliff at around the eight-minute mark. Editing for survey length is an absolute necessity, and it’s an excellent way to ensure your survey is tightly focused.
An analysis of survey length by SurveyMonkey found drop-off rises with each additional question — an important reminder to keep survey length as tight as possible.
Most survey platforms warn you when questions or answer options run too long. That’s because long questions and answers lead to fatigue, speeding, and misread questions, and they look like hell on mobile. Avoid them except where length is absolutely mission-critical, and even then limit it to one or two questions.
Also, beware of the compound question (e.g., “Does your job give you satisfaction and pride?”). Boil each question down to a single idea or variable so your survey taker can answer easily and you can report findings clearly.
Your survey’s pace should resemble a conversation between two strangers. Don’t dive into the most probing, sensitive questions up front. Wait until the survey taker can see your study is worthwhile based on the quality of your questions. Then, they may be more willing to share sensitive details. Income is always a sensitive topic, but others make people uneasy too, such as plans to leave a job or questions that touch on confidential company information.
I advise a few antidotes to this awkwardness: Put those types of questions toward the end of the survey and make them optional (or add an option for “prefer not to answer”). You may even remind survey takers at sensitive moments that their responses will be fully anonymized and won't be used for any other purpose.
Capturing demographic information is essential to ensure the survey sample represents the audience you’re attempting to study. Demographic responses also can expand options for interesting “cuts” of the data, showing you how different cohorts of your study group (e.g., generations) differ.
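Once the responses are exported, those cuts are straightforward to produce. Here’s a minimal sketch in Python with pandas, assuming hypothetical generation and q5_answer columns in the anonymized export:

```python
import pandas as pd

responses = pd.read_csv("responses_anonymized.csv")

# Share of each answer within each generation (each row sums to 100%).
cut = pd.crosstab(responses["generation"],
                  responses["q5_answer"],
                  normalize="index") * 100
print(cut.round(1))
```

Before reporting a cut like this, check that every cohort is large enough to be meaningful; a segment of a dozen respondents can’t support a headline finding.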
In recent years, survey designers have evolved how they ask demographic questions to be more inclusive. For example, they ensure questions about sexual identity or gender identification use language that includes rather than alienates — an important aspect of survey experience.
The challenge is balancing inclusivity and brevity. Rather than list a dozen choices for gender identity, for example, I limit answer choices to man, woman, non-binary, and prefer to self-identify (write-in).
The prefer-to-self-identify option ensures everyone has a choice that fits. And fewer answer choices make it more likely you’ll have segments large enough to compare.
Your survey platform can be a good resource when designing demographic questions. (Platforms often have question libraries to draw from; SurveyMonkey’s library is particularly good.)
I like to look at what major research organizations like Pew Research Center use for demographic questions. Whether I’m asking questions about gender, race, ethnicity, age, sexual identity, or other characteristics, I consult established surveys to find consensus.
When does this advice go out the window? When gathering granular detail is part of your study’s primary aim (e.g., the research focuses on gender identity) and commonly used demographic questions don’t provide the specificity you need.
Don’t even think about asking self-serving or promotional questions. Invariably, I work with companies that want to toss in a few questions that not-so-subtly promote their product/service. The problem? Your survey takers are smart, and they will resent the question and even punish you for it.
I worked with a client a few years ago who was adamant they wanted to ask, “Which do you like better?” in relation to an analytics dashboard. They used illustrations of each option. One image was clearly superior (and belonged to the client), and one was the primitive Stone Age option.
Guess what? A third of respondents chose the Stone Age option. I suspect they knew it was a setup. The results were not usable.
Testing your survey is the most important thing to do before you release it into the wild. Recruit at least five people (more than 10 is better) to take the survey and comment on ANY issue that gives them pause. Testers should come from your target study group so they can gut-check the question wording and answer choices.
Tools like Alchemer let you generate dummy responses automatically, an excellent way to test your conditional logic and piping.
My company pays testers to ensure they take it slowly and record all their questions and concerns. Is any question unclear? Do the answer choices make sense? Are they able to answer every question or do some not apply?
Make sure some testers respond to the survey on mobile and others on a desktop to validate both experiences.
When your testers finish, run automated testing through your survey platform to generate responses. Comb through your summary report. These dummy responses can help pinpoint any problems with survey logic and piping.
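If your platform doesn’t offer automated testing, you can approximate the idea yourself. The sketch below (in Python, with an entirely hypothetical three-question survey structure) walks random answers through the routing rules and counts how often each question is reached, which surfaces skip-logic mistakes such as an unreachable question:

```python
import random

# Hypothetical survey: question id -> (answer options, routing rules).
# Routing maps a chosen answer to the next question id (None = end).
survey = {
    "q1": (["yes", "no"], {"yes": "q2", "no": "q3"}),
    "q2": (["a", "b", "c"], {"a": "q3", "b": "q3", "c": "q3"}),
    "q3": (["low", "medium", "high"], {"low": None, "medium": None, "high": None}),
}

def dummy_response(survey, start="q1"):
    """Walk the survey with random answers; return the questions answered."""
    path, current = {}, start
    while current is not None:
        options, routing = survey[current]
        answer = random.choice(options)
        path[current] = answer
        current = routing[answer]
    return path

# Generate 100 dummy responses, then check how often each question was hit.
# A question that is never (or always) reached may signal a logic error.
responses = [dummy_response(survey) for _ in range(100)]
print({q: sum(q in r for r in responses) for q in survey})
```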
Make sure to include a contact email address in the introduction, on disqualification pages, and at the end of the survey for people who have questions.
If you’ve designed a great survey experience, you likely won’t receive any emails (we rarely do). But providing the option can serve as an early warning system to any survey problems missed in testing.
(Note: You can’t substantially edit a question once the survey is live; at that point, your options are to restart the survey or drop the offending question from your analysis.)
Why does experience matter so much?
A frictionless experience increases your number of completes and your sample size – boosting your study’s credibility and making it more likely that you can tell interesting stories.
Plus, a good survey experience signals to survey takers that the research is worthwhile, which is critical when you ask customers or other audience members to participate. CCO
Clare McDermott is the founder of Ravn Research and former editor of CCO magazine. Follow her on Twitter @clare_mcd.