Good survey scripting is a science: it can take a lot of training to ask questions in an unbiased way that gives reliable data to analyse. Your objectives in designing good online surveys are to achieve a high response level (i.e. the maximum number of respondents submitting a survey) and to collect reliable, accurate data. For the first, you need to give respondents a good reason to complete the survey and submit their data – incentives such as prize draws, access to the results or other defined benefits, membership of a trusted panel, or a donation to a specified charity for each response given.
The online environment also lets survey designers build surveys that are far more interactive and interesting to complete than paper or telephone surveys can be. You can use images, multimedia (audio and video), ranking questions, dynamic answer lists in matrices, and sliders to make the survey visually interesting. But be very conscious that good data comes only from well thought-out questions.
If questions are compulsory and the respondent doesn't have an opinion or answer at hand, they will become frustrated and either drop out or give you incorrect data. Missing an option on a commonly used list (e.g. leaving out a region in a location question) can unsettle respondents and undermine confidence in the veracity of the data set. Demographix allows you to create commonly used lists that can be quickly inserted into a new question.
Perhaps the best piece of advice we can give for building online surveys is to preview and check your survey as you build it. Get others to check it for literal mistakes, such as grammar and spelling, and also for simple common sense – for instance, is there an obvious answer missing?
Adopt a tone of voice that doesn't use jargon and speaks to a wide range of respondents. Tell them how long it will take, on average, to complete the survey, why the data will be useful, and where the data will be used or made available.
Make the experience as interesting and interactive as you can. Use "online-only" survey tools not available to paper, face-to-face or phone researchers. These tools include sliders, image maps, move & rank questions, answer piping, derived answers and multimedia.
Guide your respondents gently through your questioning – imagine you are talking directly to them, tell them what is about to happen, and explain things as simply and clearly as possible. For example: "Now, a few questions about your last shopping experience in Store X".
Check for possible misunderstanding or ambiguity in questions. Have others preview your survey before making it live, to test it and read it for possible misunderstanding. Ambiguity often comes from including too much information in one question.
Structure your survey so it focuses on specific, well signposted areas. It is best to break the survey into small groups of questions on different topics or related concepts. Try to avoid requiring respondents to scroll down in a browser window or tab: when that happens there are usually so many questions on a page that it all looks like too much hard work, and they opt out. They can also lose track of where the Next and Previous buttons are, and become confused or frustrated.
Be aware of the length of your survey – this will vary depending on how committed and involved you expect the cohort of respondents to be. If they have a vested interest in the subject, they are more likely to complete longer surveys than if they are consumers. Signposting where they are in a survey ("Page 6 of 12" or "Survey 45% complete") can reassure them. Survey rhythm is how the cumulative effect of the questioning is felt by respondents – varying the types of questions can speed up or slow down a survey, depending on the level of evaluation you are expecting them to do.
Ask only questions that are relevant to them. By careful use of conditional routing, you can show only those questions that the respondent needs to answer, and hide all those which they are not qualified to answer. This reduces the length of time the survey takes, and thus increases response rates. For instance, if you have a series of questions aimed at specific genders, regions or age groups, you can make them conditional on whether respondents have answered the initial questions in ways that qualify them. Others will not be shown these questions at all.
In paper surveying, you use the "Jump to Question XX" method; but with online surveys using Demographix, you use conditionality – i.e. "Show them this page of questions if they've answered this answer or these answers". It involves a different way of thinking, but allows for finer "embedded qualification" from respondents as they proceed through a survey.
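To make that way of thinking concrete, here is a minimal Python sketch of conditional routing logic. The question IDs, answer values and page names are hypothetical – Demographix handles conditionality for you in the survey builder, so this is only an illustration of "show this page if they answered that".

    # Minimal sketch of conditional routing, with made-up question IDs,
    # answer values and page names.
    def next_page(answers):
        """Return the next page of questions based on earlier answers."""
        # Page of follow-up questions shown only to qualifying respondents
        if answers.get("gender") == "Female":
            return "female_shopping_habits"
        # Region-specific follow-ups
        if answers.get("region") in {"Scotland", "Wales"}:
            return "regional_services"
        # Everyone else skips straight to the closing questions
        return "closing_questions"

    print(next_page({"gender": "Female", "region": "London"}))  # female_shopping_habits
    print(next_page({"gender": "Male", "region": "Wales"}))     # regional_services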
We recommend you put personal data collection at the end of a survey, not at the beginning. Respondents will be reluctant to tell you who they are or how to contact them before they know what questions they will be asked. If they are asked for personal details (name, email, phone) after they have answered the questions, they will be more inclined to provide them.
Don't exit certain categories of respondents unless you're absolutely sure you won't want their data. For example, do you really only want female respondents? You could create an exit page for those who answer male, or you could collect data from everyone and filter out the male respondents later using the analysis tools. Remember that Demographix is not pay-per-response, so you need not worry about the cost of "completes".
If you are offering a list of answer options derived from what you expect the main set of answers to be, always include an Other option so respondents can specify something that's missing. If they tick it, show them a write-in box to capture the verbatim, open-ended response. The Other data is often the most valuable to you, because it is where the unexpected, unknown or new information is collected.
A major consideration for all research is that all respondents see roughly the same set of questions in the same order (the content of the questions themselves is entirely consistent for all respondents). While answer options within a question can be rotated (see 18), randomly rotating pages in a survey, or questions on a page, can give different "experiences" with unexpected consequences. Just because you can do it online doesn't mean it's desirable.
Avoid sticking to one or two question types. A survey stuffed full of simple single-choice and multiple-choice questions can become incredibly boring to answer, and response levels will fall. Use matrix, ranking and slider questions to break the monotony.
Consistent answer lists used across all your surveys mean you can be consistent in data analysis and monitor long-term trends. Demographix allows survey builders to create commonly used sets of answers (such as lists of regions or job titles) that can be added to a question with a single click. There is no need to check for spelling mistakes or missing options each time you use the list, either.
Think about the type of question you are asking and what evaluation skills you are expecting respondents to use in each of them. Is your question asking them to evaluate in an objective way (Excellent to Very Poor; Very Satisfied to Very Unsatisfied; rate on a scale of 1 to 10) or to give more subjective opinions (agree/disagree with these statements; Warm to Cold slider scales to express feelings towards things)? We call this Head and Heart questioning: heart questions can reveal deeper insights and mean your survey will appeal to a wider set of respondents (all respondents are Head or Heart people, or a mix of the two).
Make sure your questions are not biased towards one answer or another. A question beginning "Do you agree with..." should be avoided because it positively encourages respondents to agree with a proposal or statement. Rewording it as "How much do you agree or disagree with..." is less directional.
Using a 10-point scale rather than a list of four or five specified responses gives finer data. A set of responses (such as "Totally agree, Agree, Neither agree nor disagree, Disagree and Totally disagree") allows for variations in responses on a 5-point scale, and you can assign numeric values to each option to calculate a mean/average figure. Using a bigger scale (e.g. from 1 to 10, where 1 = "Totally agree" and 10 = "Totally disagree") will give you a more granular set of responses and finer data. However, don't take this to the opposite extreme – too big a range (e.g. 1 to 25) gives too much granulation of scale, and if respondents find the scale confusing or difficult to work with, they will opt out of the survey.
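As a rough illustration of the arithmetic, here is a short Python sketch that assigns numeric values to a 5-point agreement scale and calculates a mean score. The scale labels and example responses are made up for the purpose.

    # Assign numeric values to a 5-point agreement scale and compute the
    # mean score across a set of (illustrative) responses.
    scale = {
        "Totally agree": 5,
        "Agree": 4,
        "Neither agree nor disagree": 3,
        "Disagree": 2,
        "Totally disagree": 1,
    }

    responses = ["Agree", "Totally agree", "Disagree", "Agree",
                 "Neither agree nor disagree"]

    scores = [scale[r] for r in responses]
    mean_score = sum(scores) / len(scores)
    print(f"Mean agreement score: {mean_score:.2f}")  # 3.60 on a 1-5 scale

The same idea extends to a 1-to-10 scale: the larger range simply gives the mean more room to discriminate between groups of respondents.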
Use answer rotation to avoid bias towards the selection of elements at the top of a list. Online surveys allow you to rotate a list of answers (e.g. in a multiple choice or ranking question) to avoid respondent bias towards the answers at the top of the list. Each time the survey is served, the answers are listed in a different order.
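For illustration only, here is a small Python sketch of the idea: each time the question is served, the answer list is presented in a fresh order (a random shuffle here, though a simple cyclic shift achieves a similar effect). The option names are placeholders, and Demographix performs the rotation for you when it serves the survey.

    # Sketch of answer rotation: each respondent sees the options in a
    # different order, so no option always benefits from being listed first.
    import random

    answers = ["Brand A", "Brand B", "Brand C", "Brand D", "Brand E"]

    def serve_question(options):
        """Return a freshly rotated copy of the answer list for one respondent."""
        rotated = options[:]       # copy, so the canonical order is untouched
        random.shuffle(rotated)    # new order for each respondent
        return rotated

    print(serve_question(answers))
    print(serve_question(answers))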
It can be good practice to ask a series of unprompted questions up front, to ascertain what things are at the top of respondents' minds. E.g. "Thinking about home insurance providers – can you name any?". Later you may ask: "Here's a list of home insurance providers – which of them do you know?". The first, unprompted question reveals which brands (qualities, elements...) have "recall value" to the respondent, such as emotional or experiential value.
There is some debate over how to ask the component parts of a gap analysis question online. Some designers say that the two questions ("How important is this criterion to you?" and "How satisfied were you with the fulfilment of this criterion?") need to be answered in parallel. Our considered view is that you should ask the questions sequentially: the importance question first, followed by the satisfaction rating (either immediately after, or later in the survey). When you come to analyse the results, you can assign values to the ratings and quantify the "gap" between importance and fulfilment.
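As a sketch of that final analysis step, the short Python example below quantifies the gap for each criterion as mean importance minus mean satisfaction, with both rated on the same 1-to-10 scale. The criteria names and ratings are invented for the example.

    # Gap analysis sketch: gap = mean importance - mean satisfaction,
    # using illustrative criteria and ratings on a shared 1-10 scale.
    ratings = {
        "Delivery speed":   {"importance": [9, 8, 10, 9], "satisfaction": [6, 7, 5, 6]},
        "Customer service": {"importance": [7, 8, 6, 7],  "satisfaction": [8, 7, 8, 7]},
    }

    def mean(values):
        return sum(values) / len(values)

    for criterion, r in ratings.items():
        gap = mean(r["importance"]) - mean(r["satisfaction"])
        print(f"{criterion}: gap = {gap:+.2f}")
    # Delivery speed: gap = +3.00
    # Customer service: gap = -0.50

A positive gap flags criteria that matter more to respondents than your current performance satisfies; a negative gap can suggest over-delivery on something they care less about.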
Don't forget to check out our tips on maximising your survey's response rates.