Too often, marketers try to use third-party data or personal opinion to define their customers and work out what they want. I find the simplest and most effective method is to ask them.
A well-designed survey will help you clearly define who your customers are, what they like about you, and what they think you could improve, as well as generating great product/service ideas for the future.
This article will look at the survey structure and question formats you should be using, common survey mistakes to avoid, questions for you to consider, and which software to use.
Defining the survey’s purpose
The first question to ask yourself should be “What information do I want about my customers?” i.e. what is the purpose of your survey?
Is it to find out who they are (e.g. demographics), their likes and interests, what motivates them to buy, what they like/dislike about you, their opinions on a specific product, etc.?
Once you have a clear idea of the survey’s purpose, you’ll find it much easier to think of questions to ask.
Do you want to find out about:
- Your customers’ demographic make-up
- Your place in the market
- How customers perceive your brand
- Your customer service
- How many of your customers also use your competitors
- How people heard about you
- Which features are most important to them
- How you could improve
Survey format & question types
Your survey has to strike a balance between capturing enough usable data to be worthwhile and not having so many questions that people don’t complete it.
Ask yourself what you will do with the information from each question.
If you cannot give yourself a satisfactory answer, leave it out. There are no hard and fast rules about survey length, but I tend to keep it to around the 10–15 question mark.
Open or closed questions (or both)?
The two broad types of questions found in surveys are called “open” and “closed”.
Open questions allow the participant to respond using any answer they see fit, usually in a text input field. An example would be “Do you have any ideas how we can improve our products?”
A closed question gives the participant a limited choice of answers to pick from, and these can be a radio button, a tick box or a drop down menu. This website has a nice list of survey question formats.
Open questions allow for any type of answer but are time-consuming to analyse; closed questions only offer predetermined answers, but they are very easy to analyse.
I would recommend using primarily closed questions, with one or two open questions added to capture data for anything you haven’t thought of.
Survey mistakes to avoid
The way you ask a question and the options you make available can greatly influence the data you receive. Here are some easy traps to fall into that you should avoid:
Overlapping answer options
This occurs in multiple-choice questions where we ask the respondent to choose between ranges of values. I commonly see the following:
How much do you typically spend?
a) £0 – £5
b) £5 – £10
c) £10 – £15
So if I spend £5, do I choose a) or b)?
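The fix is to make each band half-open, so every value maps to exactly one option. A minimal sketch of the idea (the `spend_band` helper and its band labels are hypothetical, purely for illustration):

```python
# Half-open spend bands: every amount falls into exactly one option.
# (Hypothetical helper for illustration; band labels are examples.)
def spend_band(amount: float) -> str:
    if amount < 5:
        return "Under £5"
    if amount < 10:
        return "£5 – £9.99"
    if amount < 15:
        return "£10 – £14.99"
    return "£15 or more"

# £5 now lands unambiguously in a single band.
print(spend_band(5))  # prints £5 – £9.99
```

The same principle applies to the survey itself: phrase the options as “Under £5”, “£5 – £9.99”, “£10 – £14.99” and so on, so no amount can belong to two answers.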
Two questions in one
A very easy trap to fall into is asking two questions in one. For example:
“How useful do you find Widget Inc’s Online Support Database and the email support centre?”
In this example, I may find the online support database very useful, but the email support centre a waste of time.
Which way do I go?
Leading questions
You have to keep your questions neutral and not guide the respondent to think a certain way. For example, this type of question presumes they liked the product in the first place:
“What did you like about our product?”
A better way to phrase this would be:
“How would you rate our product for the following…?”
Presuming they can give an answer
There will be questions that not every respondent can answer. This may be because they haven’t used the product in that way, or they simply can’t remember.
That ties in with another point: don’t ask people to think back over extended time periods; it is unrealistic to expect accurate answers.
Rather than forcing them to give a knowingly incorrect answer, or essentially asking them to guess, always give an ‘N/A’ or ‘I don’t remember’ option.
Inconsistent presentation of the scales
Keep scale ratings consistent throughout your survey, so that (for example) 5 = very good in every question. The same goes for the way you present the scales to the user:
How would you rate our support? Very Good, Good, Poor, Very Poor
How would you rate our reliability? Very Poor, Poor, Good, Very Good
If you choose to have “Very Good” on the left for one question, it should be on the left throughout the survey. This prevents respondents from training themselves to click in the same place without reading the question properly.
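The same consistency matters when you analyse the results: apply one fixed label-to-score mapping to every question, so that “Very Good” always scores highest regardless of how the options were displayed. A small sketch (the response data here is made up, purely for illustration):

```python
# One fixed label-to-score mapping applied to every question,
# regardless of the order the options were displayed in.
SCALE = {"Very Poor": 1, "Poor": 2, "Good": 3, "Very Good": 4}

# Hypothetical responses for two questions (not real survey data).
responses = {
    "support":     ["Very Good", "Good", "Poor"],
    "reliability": ["Good", "Very Good", "Very Good"],
}

# Average score per question on the shared 1-4 scale.
averages = {q: sum(SCALE[r] for r in rs) / len(rs)
            for q, rs in responses.items()}
print(averages)  # support averages 3.0, reliability about 3.67
```

Because the mapping is defined once, a flipped display order in one question cannot silently invert its scores.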
Using large scales
I have never understood surveys that use any scale bigger than 1 to 5. Beyond that, it becomes very difficult to differentiate what the values mean.
For example, how does a 6 differ from a 7? This is especially true when we go even higher, up to a 1–100 scale.
Being too long
The longer your survey goes on, the more people will drop off before the end. I personally would keep any survey to 10–15 questions, or 20 as an absolute maximum.
Anything over that and you can expect really low completion rates.
If you spread the questions over a number of pages, give respondents a progress indicator (a filling bar, “page X of Y”, etc.) so they know they don’t have long to go.
Read more about the science behind setting questions here: http://www.greatbrook.com/survey_question.htm
Getting customers to participate
To get your customers to take part in your survey, they must first know it exists. Your customer marketing communications (newsletter, control panel alert, specific email etc.) are perfect methods to drive them to your survey.
A few of your customers will give feedback because they have strong opinions (good and bad) or specific areas they want to feed back on, but I would expect the majority won’t take part unless you provide an incentive.
Entering all completed surveys into a prize draw to win a decent-value prize is a tried and tested approach to boost response rates.
Give your customers a clear cut-off date by which they need to have provided their answers, and send a reminder email to anyone who hasn’t completed the survey within the first five days.
Survey software
There are some really good survey software providers out there. A lot of companies use Survey Monkey, but I think its templates are looking very dated now.
I personally would recommend taking a look at Typeform, PollDaddy, or SmartSurvey.