Research on Research – the awareness illusion

Researchers shape the reality they capture

Here at Old Salt we believe the insights you get are only as good as the questions you ask. So, true to our nature as quantitative researchers, we decided to quantify just how much influence the design of a question has on the results it produces.

Naturally it’s important to have a well-thought-through survey that investigates your hypotheses using objective language, but how you ask your questions will also shape the results – and there are often several ways to ask the same question. We’re not just talking about instances where there’s a clear best-practice approach; we’re talking about seemingly small decisions that can affect how a question is perceived, and ultimately the data we get back.

The reality is there’s no such thing as a ‘true’ answer in market research; responses are primed by the position, structure, context, and even format of the question. As researchers we’re shaping the version of reality we capture, so expert design is imperative to ensure questions are consistently interpreted, responded to in the way they are intended, and optimised for the research objectives. The influence of the question design may also need to be considered when analysing and reporting results.

To provide an objective view on how the data might differ based on small adjustments to the question design, we’ve run our own UK nat rep survey asking a series of questions in different ways – our next few blog posts will share the findings!

Assessing awareness

First up in our Research on Research series we’re focusing on one of the most fundamental research questions – awareness. It’s captured by most brands on a regular basis and typically included in almost every survey. It seems simple on the surface: a clear-cut question with a ‘true’ answer – either respondents are aware of your brand, or they aren’t. And while this is broadly the case, the way your awareness question is designed can shape your results and the competitive context.

Generally speaking, multicode questions work well and are often the default choice for awareness; you can cover a lot of ground with one question by showing respondents a list of options (e.g. brands) and asking them to select as many as apply (i.e. which ones they’re aware of). This is a particularly good approach if the list is short, as respondents are likely to work through each option, considering them in turn. But often a wider competitive set is relevant, which can be problematic because unfortunately respondents’ attention spans can dwindle quickly. In these instances, using a different question format that forces more considered answers can be helpful. This is also a tactic worth considering if your brand, or even the category, isn’t well established; a more nuanced question could generate additional valuable insights that play a crucial role in understanding your brand’s market position.

To test this, early on in our survey (before brands could be introduced and influence awareness!), we asked respondents which video on demand services they’re aware of from a list of 11 options. We split our sample into two cells (matched on our demographic and TV quotas) to ask our awareness question in two different ways. In one version (a) we used a multicode question where respondents could select all that apply, while in the other version (b) we formatted our question into a carousel, where each brand was shown in turn and awareness was captured as one of three options that also accounted for degree of familiarity.

Awareness survey question, version a
Awareness survey question, version b

Awareness of every brand asked about was significantly higher when respondents were forced to engage with each brand individually (version b) and were able to give a more considered answer that accounted for awareness with limited familiarity. Awareness levels were between 8pp and 24pp higher depending on the brand, with the greatest differences seen amongst the lesser-known brands.
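For readers who like to sanity-check gaps like these themselves, a minimal sketch of the kind of significance test that applies here – a two-proportion z-test between two independent cells of ~500 respondents – is below. The counts used are illustrative assumptions, not our actual survey data.

```python
from statistics import NormalDist

def two_prop_z_test(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two independent proportions."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)  # pooled proportion under the null hypothesis
    se = (p_pool * (1 - p_pool) * (1 / n1 + 1 / n2)) ** 0.5
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Illustrative only: 60% aware in the multicode cell vs 75% in the carousel cell
z, p = two_prop_z_test(300, 500, 375, 500)
print(f"z = {z:.2f}, p = {p:.5f}")
```

A gap of this size between cells of 500 comfortably clears the conventional 95% significance threshold (|z| > 1.96); smaller gaps or smaller cells would need checking case by case.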

So what’s driving this difference?

The difference in results is likely driven in part by a tendency for respondents to scan options in a long list (like version a) and select a handful before moving on, which can result in an underestimation of awareness levels. This is evident when we net together the proportion who said they were aware of all services asked about, which was 19% in the multicode question (version a) but a whopping 90% in the carousel question (version b). Respondents may also be unsure whether to say they are aware of a brand if they have little knowledge of it beyond seeing or hearing the name before, causing them to skip past lesser-known brands in favour of selecting those they know best unless more granular answer options are available.

Why does this matter? Well, aside from giving a different sense of how well-known a brand is, using a multicode question to capture awareness can have implications if awareness is a filter for follow-on questions (possibly skewing your sample away from those who most need educating); if you are closely monitoring awareness as a KPI (masking potential growth); or if you are comparing awareness metrics captured through different questions (creating seemingly inexplicable jumps in levels of awareness).

For example, in our data, if the survey routing was designed to ask all those aware of DAZN follow-on questions about the service, the sample for these questions would be almost 1.7x larger using the more nuanced awareness definition rather than the multicode question, with the added ability to cut answers by degree of familiarity. The more granular carousel question also shows that the majority of those aware of NOW lack familiarity with it – a different insight to the relatively good overall awareness captured by the multicode question.
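To make the routing arithmetic concrete, here is a minimal sketch of how follow-on base sizes shift between the two formats. The awareness rates below are hypothetical placeholders chosen for illustration, not the DAZN figures from our survey.

```python
total = 500  # respondents in one cell

# Hypothetical awareness rates for the same brand under the two formats
aware_multicode = 0.25  # version a: select-all-that-apply list
aware_carousel = 0.42   # version b: per-brand carousel with familiarity options

base_a = round(total * aware_multicode)  # follow-on base under version a
base_b = round(total * aware_carousel)   # follow-on base under version b
print(base_a, base_b, round(base_b / base_a, 2))  # 125 210 1.68
```

Even modest differences in the awareness rate compound into materially different base sizes for any question routed off it, which in turn affects the precision of those follow-on results.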

All this might be leading you to think a multicode awareness question isn’t fit for purpose, but fear not! There are plenty of instances where it is, and don’t forget it has the advantage of being cognitively easy and taking up less valuable survey space than a more granular option. It works well for brands with good awareness that use the question for benchmarking and context, or where consistency with other internal data sources captured this way matters; and sometimes response detail has to be traded off to capture a broader competitive set. These are all considerations for questionnaire design that an experienced quanty will be able to weigh up, factor in and explain their rationale for.

Regardless of which format is used, designing questions that hold attention is crucial to the quality of data captured. When knowledge is scarce or budgets are tight it’s understandable to want to squeeze as much as possible into every survey, but we always champion shorter surveys with questions that are cognitively easy to answer and tailored to the objectives. Yes, this might mean prioritising questions or making sacrifices, but we believe the trade-off is more than worth it if it captures high quality data and ensures reliable, accurate insights.

Our survey was completed online by ~1000 respondents, representative of the UK 16+ population in terms of age x gender, region, and household access to SVOD and Pay TV. Where two versions of the same question were asked, the sample was split into two cells of ~500 respondents, matched in terms of the quotas to ensure comparability of results.

Photo by Tamanna Rumee on Unsplash