Research on Research – uncovering what really matters

Open or closed?

Next up in our Research on Research series, which looks at how changes to questionnaire design can influence the data and ultimately the insights generated, we’re focusing on open versus closed questions.

Most online surveys consist predominantly of closed questions, perhaps peppered with the odd open question here and there, as encouraging detailed open responses is trickier in a survey environment: open questions eat up survey space, and the cost of coding and organising the responses can be prohibitive (especially if back translations are required or the sample is very large). However, open questions can be a great way to capture thoughts in respondents’ own words without priming them, to gather snapshots that bring other insights to life, and even to serve as a quality control measure.

When asking closed questions, the choice of options provided, the wording of each option, the number of options offered and the order in which the options are read can all influence how people respond. Offering an ‘other, please tell us’ option can serve as a backup to ensure a key answer isn’t missed from a closed list, but respondents may find it harder to use once they have been primed by the options provided, especially if the list of closed options is comprehensive. Therefore, if it’s important to capture respondents’ top-of-mind thoughts as cleanly as possible, without researcher-introduced bias or influence, an open question is likely the way to go.

Not two sides of the same coin

It might seem logical to assume that if a respondent gives a particular answer to an open question, that same answer would have been selected had the respondent instead been presented with a closed list, but this isn’t always the case. Towards the end of our survey, we asked respondents what a new video on demand service would need to offer in order for them to consider signing up. Half the respondents were prompted to answer in open text boxes, and the other half were offered a closed list of eight options, with the chance to fill in an ‘other’ open text box.

[Charts: responses to the open question vs the closed question]

When explicitly offered ‘ability to cancel anytime’ as an option in the closed question, 60% of respondents chose this answer, whereas only 3% volunteered it in the open box. This doesn’t mean that flexible contracts aren’t appealing or important, but that they aren’t top of mind for consumers when evaluating a service. Instead, the lower prominence of flexible contracts in the open question may suggest that they’re a hygiene factor rather than a marketing hook: something respondents expect of a service when shown, but not something they spontaneously think about when weighing a service up.

The most common answers in the open question related to affordability and content, often at quite a granular level, focusing on specific types or genres of content. This indicates that, ultimately, the service needs to offer something respondents really want to watch, and the verbatim answers provide detailed insight into what that content might look like.

While we gave respondents who saw the closed list an ‘other, please specify’ open option to capture anything not shown in the answer list, only 3% of respondents volunteered an additional answer – but a massive 66% of those answering the open version gave an answer that wasn’t included in the closed question list.
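
For readers who want to sanity-check whether gaps like these could simply be sampling noise, a standard two-proportion z-test is one option. Below is a minimal sketch based on the published figures (~500 respondents per cell, 60% choosing ‘ability to cancel anytime’ in the closed cell vs 3% volunteering it in the open cell); the exact counts are illustrative, derived from the rounded percentages, and the function name is our own.

```python
# Two-proportion z-test: is the gap between the two split cells larger than
# sampling noise? Counts are illustrative, back-calculated from the rounded
# published percentages (~500 per cell, 60% vs 3%).
from math import sqrt
from statistics import NormalDist

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Return (z, two-sided p-value) for H0: the two proportions are equal."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)      # pooled proportion under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))  # standard error under H0
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))            # two-sided
    return z, p_value

# Closed cell: 60% of ~500 chose the option; open cell: 3% of ~500 volunteered it.
z, p = two_proportion_z(successes_a=300, n_a=500, successes_b=15, n_b=500)
print(f"z = {z:.1f}, p = {p:.3g}")  # z around 19: far beyond sampling noise
```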

This shows the powerful influence our list of options has on respondents: it’s cognitively easy to consider only the options presented to us, and if one or more apply, respondents are likely to move on quickly rather than stop to think about whether they have any additional views. It also reflects the way closed questions aggregate themes, in contrast to the nuanced perspectives captured by open questions. This can be a pro or a con depending on your objectives: do you need to get a sense of scale for key factors you’re already focusing on, or to understand the granular detail of the consumer perspective and uncover any factors you might have missed?

When to use open questions

Open questions can be especially useful if there are knowledge gaps about how consumers use and talk about your product or service and there hasn’t been time or budget to run a preliminary qualitative stage. If the ultimate objective is still to assess the prevalence of key expectations or needs, an open question can be asked in either an upfront survey or an omnibus, prior to writing your main survey, allowing you to write a relevant closed list with confidence.

There are also instances where you may want to combine open and closed questions to ensure you’re not being misled by the data. For example, we always recommend including an open question when it comes to identifying and understanding KPIs such as brand satisfaction. We often run statistical analysis, such as key driver analysis, on the scores for key metrics, using questions with closed lists or banks of statements to uncover what’s driving satisfaction scores – but this type of analysis won’t reveal whether the score is being influenced by a factor that wasn’t captured in the closed list.
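
To illustrate the kind of analysis described here, the sketch below runs a simple regression-based key driver analysis on synthetic data: attribute ratings (a closed statement bank) predicting an overall satisfaction score. All attribute names and figures are hypothetical, and real key driver work often uses more robust variants (relative weights, Shapley regression). The point made above shows up in the R²: when an unmeasured factor drives satisfaction, the measured drivers explain less of the variance, and the closed data alone won’t tell you why.

```python
# A minimal regression-based key driver sketch on synthetic data. Attribute
# names and effect sizes are hypothetical, purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 500  # one survey cell, mirroring the ~500-per-cell design above

# Hypothetical 1-10 attribute ratings (the closed statement bank).
drivers = {
    "content_range": rng.integers(1, 11, n),
    "price_value":   rng.integers(1, 11, n),
    "ease_of_use":   rng.integers(1, 11, n),
}
X = np.column_stack(list(drivers.values())).astype(float)

# Satisfaction partly driven by a factor we did NOT measure (e.g. flexible
# contracts) - exactly the situation an open question can expose.
unmeasured = rng.integers(1, 11, n)
satisfaction = 0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.6 * unmeasured + rng.normal(0, 1, n)

# Standardise, then fit OLS; standardised betas rank the measured drivers.
Xs = (X - X.mean(0)) / X.std(0)
ys = (satisfaction - satisfaction.mean()) / satisfaction.std()
beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
r2 = 1 - ((ys - Xs @ beta) ** 2).sum() / (ys ** 2).sum()

for name, b in zip(drivers, beta):
    print(f"{name:14s} beta = {b:+.2f}")
print(f"R^2 = {r2:.2f}")  # noticeably below 1: a driver is missing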

Open questions can also be used to show rather than to know, providing snippets from the consumer perspective that bring other survey data to life. In our closed question, the response ‘an affordable subscription option’ can be illustrated with a quote from our open question: ‘a reasonable price, no adverts, no need for a premium subscription’, which gives a more rounded sense of what an affordable subscription option looks like for this respondent.

One watch-out, though: open questions should be avoided when the topic is too broad or unfamiliar, because without the guidance of a closed list consumers will struggle to answer meaningfully, with different respondents interpreting the question in different ways, potentially resulting in unusable data. And if you need to give examples within the question text of an open question to make the meaning clear, you’ll introduce bias into the results.

Often, to get the truest picture of what’s important to consumers, questions need to be asked in a mix of different ways, gradually priming respondents to ensure potentially crucial views aren’t missed because they were omitted from a closed list. That’s not to say closed lists need to be exhaustive; they will likely be tailored to what’s crucial to your objectives. Ultimately, every step and decision in questionnaire design links back to your objectives, with open questions used primarily when you genuinely need to find out what you don’t know.

Our survey was completed online by ~1,000 respondents, representative of the UK 16+ population in terms of age x gender, region, and household access to SVOD and Pay TV. Where two versions of the same question were asked, the sample was split into two cells of ~500 respondents, matched on the same quotas to ensure comparability of results.
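
The note above doesn’t spell out how the matched split was achieved, but one common approach is stratified random assignment within each quota cell. A minimal sketch, with hypothetical column names:

```python
# Sketch of splitting respondents into two quota-matched cells via stratified
# random assignment. Column names are hypothetical, for illustration only.
import numpy as np
import pandas as pd

def split_matched_cells(df, quota_cols, seed=0):
    """Randomly assign each respondent to cell A or B, balanced within
    every combination of the quota variables."""
    rng = np.random.default_rng(seed)
    df = df.copy()
    df["cell"] = ""
    for _, idx in df.groupby(quota_cols, observed=True).groups.items():
        idx = rng.permutation(np.array(idx))                 # shuffle within the stratum
        df.loc[idx[: len(idx) // 2], "cell"] = "A"           # first half -> open question
        df.loc[idx[len(idx) // 2 :], "cell"] = "B"           # second half -> closed question
    return df

# Usage with hypothetical quota columns:
# df = split_matched_cells(df, ["age_band", "gender", "region", "svod_access"])
```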
