My husband says I ask too many questions and sometimes asks me, ‘Why do you even need to know that?’ My children say I ‘interrogate’ them. I think this is a bit harsh. I’m only trying to find out what they did at school that day. Any parent will tell you that you have to ask at least 20 questions to find out anything their children did during the day. My favourite silly question is one that my children use a lot. They choose some complicated word, such as ‘psychological’, and say, ‘Psychological is a very difficult word to spell. Can you spell it?’ Of course, the answer is I-T. Depending on how they thread this into the conversation, they can catch each other out regularly.
Questions are at the heart of research, so it’s important that you get them right. Asking the wrong questions, or asking questions that are badly phrased, confuse participants, are impossible to answer or lead the participant towards a particular kind of answer, will all lead to disaster. By disaster I mean that your research findings won’t be valid.
First you have to make sure that the questions you ask, from the project-level research questions down to the individual interview question or questionnaire item, are clearly linked to your research objectives. There has to be a reason for asking the question.
Here are my thoughts on common problems with question design in surveys, with some ideas about how to solve those problems.
Asking for information you don’t need, such as age or gender if you have no intention of using this information for analysis. If there is no research objective around comparison of results among different age groups or genders, don’t ask for this information. (You may need to ask for demographic information if you need to demonstrate that your sample is representative of a larger population, for which you have demographic data.)
Asking questions that really should be two questions, e.g. ‘Did you have bacon and eggs for breakfast? Yes/No’. What do participants say if they just had scrambled eggs or just had a bacon butty? Solution – ask two questions: ‘Did you have bacon for breakfast? Yes/No’ and ‘Did you have eggs for breakfast? Yes/No’. Or, better still, ‘Which of these did you have for breakfast?’ followed by a list of options for selection. I hope I’m not making you feel hungry. This sort of problem crops up a lot in surveys. I’ve seen statements such as ‘My office is warm and comfortable’ or ‘I’ve had a meaningful appraisal with my manager in the past year’ listed in surveys. The participant is forced to answer ‘No’ if their office is warm but not comfortable (and vice versa), and must answer ‘No’ if they have had an appraisal but did not feel it was meaningful, or if it was not in the past year. In fact, that last one could lead to the conclusion that no appraisals had taken place, which would be quite the wrong conclusion to draw.
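If it helps to see the fix laid out, here is a minimal sketch of the breakfast example in Python. The question texts, option labels and the little interpret helper are all invented for illustration – the point is simply that the multi-select version leaves no ambiguous answer:

```python
# Double-barreled version -- a 'No' is ambiguous (no bacon? no eggs? neither?).
bad_item = {
    "text": "Did you have bacon and eggs for breakfast?",
    "type": "yes_no",
}

# Multi-select version -- every combination has an unambiguous answer.
good_item = {
    "text": "Which of these did you have for breakfast?",
    "type": "multi_select",
    "options": ["Bacon", "Eggs", "Neither"],
}

def interpret(response: set[str]) -> str:
    """Summarise a multi-select response without guessing."""
    if response == {"Neither"}:
        return "had neither bacon nor eggs"
    return "had " + " and ".join(sorted(response)).lower()

print(interpret({"Bacon"}))          # had bacon
print(interpret({"Bacon", "Eggs"}))  # had bacon and eggs
```

With the Yes/No version, the scrambled-eggs-only participant has no honest answer; with the multi-select version, every breakfast maps cleanly onto a response.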
Asking questions without appropriate options for response, e.g. forgetting to use ‘Don’t know’, ‘Not applicable’, ‘Prefer not to say’ and ‘Other’ as options where they may be needed. When options are closed, participants must often choose something in order to proceed with the survey, so they may choose an answer that is wrong because the answer they want to give is not available.
Asking questions that lead the participant to a particular type of response, for example, ‘How good do you think the leader’s speech was? Outstanding/Excellent/Very good/Good’. What do participants do if they didn’t rate the speech? They have to choose something! Another example is ‘Politics can be very confusing for some people. What do you feel about politics?’ This question has already put in the participant’s mind the idea that politics is confusing, thereby potentially colouring their answer. I see this in dissertation surveys, particularly if the subject is something about which the student is passionate. ‘It’s really important that we protect the environment. What do you do to contribute to this?’ Ouch – shamed into saying you do something to protect the environment, even if you don’t. ‘Plastic in our oceans is killing wildlife. What are you doing to reduce your use of plastics?’ Ouch – shamed again into making something up about reducing your use of plastics. It’s important that questions are phrased objectively and that the researcher’s own bias doesn’t seep through.
Asking questions that should provide a filter for following questions, but don’t, such as ‘Do you read magazines? Yes/No’ followed by several questions asking for the title(s) of magazines you read, your favourite type of content, how much you pay, whether you subscribe or buy in store… If the participant has answered ‘No’ to the initial question, all the subsequent questions are irrelevant and a filter should be applied to lead the participant to the next relevant question instead. This is particularly irritating if the ‘Not applicable’ answer is not available in those subsequent questions, as participants who have answered ‘No’ might then be forced to give answers that are false, just to get through the survey. This will annoy your participant and may lead to them dropping out half-way through the survey. It will also make your results confusing. If half the participants have said they don’t read a magazine but all of them have selected favourite content, etc., that’s not usable data.
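Most survey tools let you set up this kind of routing (often called skip logic or branching). As a rough sketch of the idea, assuming invented question ids and texts based on the magazine example above:

```python
# Hypothetical sketch of filter (skip) logic: a 'No' to the screening
# question routes the participant past all the magazine follow-ups.
QUESTIONS = {
    "q1": {"text": "Do you read magazines?", "next": {"Yes": "q2", "No": "q5"}},
    "q2": {"text": "Which titles do you read?", "next": "q3"},
    "q3": {"text": "Do you subscribe or buy in store?", "next": "q4"},
    "q4": {"text": "How much do you pay per issue?", "next": "q5"},
    "q5": {"text": "First question on the next topic...", "next": None},
}

def route(question_id, answer):
    """Return the id of the next question, applying any filter."""
    nxt = QUESTIONS[question_id]["next"]
    if isinstance(nxt, dict):   # filtered question: route on the answer given
        return nxt[answer]
    return nxt                  # unfiltered question: fixed successor

print(route("q1", "No"))   # q5 -- the magazine questions are skipped entirely
print(route("q1", "Yes"))  # q2 -- continue into the magazine section
```

The non-reader never sees the magazine questions, so they are never tempted to invent a favourite title just to move on.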
Asking questions that assume too much prior knowledge or technical understanding, such as a survey about the use of e-books asking participants about the technical details of the operating system, or a survey about sustainability asking participants to say how their organisation is working towards the UN’s Sustainable Development Goals (SDGs) without providing information about those SDGs. This sort of question might be skipped by participants. Find another way of getting the information or provide more context so participants can respond.
Using inconsistent rating scales, such as Excellent/Very good/Good/Poor/Very poor for one question, a numbered scale from 1 to 5 for the next and a numbered scale from 1 to 7 for the one after that. Try to frame the questions so that they could all be answered using the same scale – this is much easier for participants to follow. How many points should be included on a rating scale is something I shall muse on another time!
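One way to keep yourself honest here is to define the scale once and reuse it for every item. A minimal sketch, with invented labels and example items:

```python
# Hypothetical sketch: one shared 5-point scale reused across every item,
# so all ratings map onto the same numeric range and are comparable.
SCALE_5 = ["Very poor", "Poor", "Average", "Good", "Excellent"]

def score(label):
    """Map a label on the shared scale to a number from 1 to 5."""
    return SCALE_5.index(label) + 1

# Every item is phrased so that the same scale fits all of them.
items = [
    "The venue was easy to find.",
    "The sessions started on time.",
    "The catering met my needs.",
]

for item in items:
    print(item, "->", score("Good"))  # each 'Good' scores 4 on the shared scale
```

Because every question draws on the same list, participants only have to learn one scale, and you can compare scores across items without converting between a 5-point and a 7-point range.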
Asking questions that are difficult to understand or are ambiguous, such as including this statement in a list to be used with a rating scale: ‘Being asked not to drop litter is not unreasonable.’ What? There are too many negatives for this statement to be clear. Rephrase it so it is clearly understood: ‘It is reasonable to ask people not to drop litter.’
Questions that jump from one subject to another and then back again – this can be confusing. Make sure questions on the same topic are grouped together. If you are asking the same question in different ways deliberately in order to test whether an individual’s answers are internally consistent, they should still be in the same section of the survey. A survey about healthy lifestyles that covers diet, exercise and leisure shouldn’t see a question on diet cropping up in the exercise section (although I appreciate there will be links between some of the ideas).
Expecting open-ended questions to lead to long responses giving lots of detail – leave this sort of question for interviews. Open-ended questions in surveys are more likely to lead to single-word answers, phrases or short sentences. Participants don’t want to write paragraphs in a survey that has been sold to them as ‘just needing five minutes’.
Surveys that are just too long. Your survey must be short enough for participants to complete in one sitting. If you can, give an indication of how long it will take to complete the survey at the start or show a ‘% completed’ or ‘Question 1 out of 25’ or similar. Participants like to know that the end is in sight.
So there you have it. My top tips on question design for surveys. Do share any tips you have in the Reply box!