How do I prioritise marketing messages or product attributes? (Max Diff)

  1. How do I appeal to the largest number of consumers? (TURF analysis)
  2. How do I prioritise marketing messages or product attributes? (Max Diff)
  3. How do I find out what people value in my (new) product / service? (Conjoint)
  4. How do I identify what drives a desired behaviour or outcome? (Key driver analysis)
  5. How do I know what to prioritise to meet strategic goals? (Gap analysis)
  6. How do I build consumer loyalty? (Consumer journey mapping)
  7. How do I use behavioural science to improve my research? (Cognitive biases)
  8. How do I live without you? (LeAnn Rimes)
  9. How do I know how many people will buy my product at a given price? (Van Westendorp’s price sensitivity meter)
  10. How do I assess the impact of my advertising? (Ad effectiveness)
  11. How do I turn data into clear findings? (Data visualisation)
  12. How do I tap into the unconscious perceptions that influence decision-making? (Implicit response testing)
  13. How do I reduce a large amount of data into something more meaningful? (Factor analysis)
  14. How do I group people together based on shared characteristics? (Segmentation)
  15. How do I forecast market share at a given price point? (Brand price trade off)
  16. How do I account for cultural differences when surveying across markets? (ANOVA)
  17. How do I judge brand performance relative to competitors? (Correspondence analysis / brand mapping)

Why ranking questions fall short

Ranking questions can actually be among the more contentious question types to write in a survey. If you have 20 product attributes and you want to know how much a customer values them, can you reasonably expect them to rank 20 attributes listed out on a screen? Say you ask for their top three to make it a more meaningful question, how can you tell if those top three are valued almost the same amount, or if the attribute ranked number one is a clear leader, while two and three trail much further behind? When it comes to analysis, are you only interested in the number of times an attribute is ranked number one, or is any top-three ranking relevant? Can. Of. Worms.

If understanding how consumers rank a range of attributes, such as product features, marketing messages or brand performance indicators, is a core objective of your research, a choice-based technique can provide clear direction.

How can max diff help?

One of the most widely used choice-based techniques is maximum difference scaling, or max diff for short. Max diff is a bit of a does-what-it-says-on-the-tin technique: it ranks (or scales) the data by asking respondents to choose their most and least preferred / important answers (i.e. the answers that show the maximum difference in their preference / importance). This approach allows us to create a definitive, and relative, ranking of data points.
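To make that concrete, here's what a single max diff task boils down to as data: a handful of attributes shown together, and the respondent's "most" and "least" picks. This is a minimal sketch in Python, with made-up attribute names purely for illustration.

```python
# One max diff screen as the respondent sees it: a small set of attributes,
# from which they pick one "most important" and one "least important".
# Attribute names here are hypothetical, for illustration only.
screen = ["Free delivery", "Next-day delivery", "Loyalty points",
          "Gift wrapping", "Easy returns"]

# The answer captures the maximum difference in importance within that screen:
# the respondent's top pick and their bottom pick.
answer = {"most": "Free delivery", "least": "Gift wrapping"}
```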

Respondents are asked to pick their top and bottom option from a choice of typically four or five at any one time. The exercise repeats until each answer option has appeared a minimum number of times (the number of exercises and the design of the max diff depend on how many attributes are being tested).
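For a rough feel of how those exercises get assembled, here's a naive Python sketch that keeps dealing attributes into screens of five until each attribute has appeared a minimum number of times. It's illustrative only: production max diff designs are typically generated with balanced incomplete block designs so that attributes also appear alongside each other a balanced number of times.

```python
import random

def build_maxdiff_screens(attributes, items_per_screen=5, min_appearances=3, seed=42):
    """Naive illustration of a max diff design: keep dealing attributes into
    screens of `items_per_screen` until every attribute has appeared at least
    `min_appearances` times. (Real designs also balance which attributes
    appear together on the same screen.)"""
    rng = random.Random(seed)
    appearances = {a: 0 for a in attributes}
    screens = []
    while min(appearances.values()) < min_appearances:
        # Prefer the attributes that have been shown the least so far
        pool = sorted(attributes, key=lambda a: (appearances[a], rng.random()))
        screen = pool[:items_per_screen]
        rng.shuffle(screen)
        screens.append(screen)
        for a in screen:
            appearances[a] += 1
    return screens

# The 20 product attributes from the example above, with placeholder names
attributes = [f"Attribute {i}" for i in range(1, 21)]
for screen in build_maxdiff_screens(attributes):
    print(screen)
```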

Every choice a respondent makes feeds into a hierarchy of preference, which we then analyse collectively across audiences. The output is a share of preference (or importance, depending on what you're testing) for each attribute tested. Across all attributes tested, the share of preference / importance sums to 100%. Going back to our example of 20 product attributes, if all 20 were valued equally, each would take a 5% share of preference / importance. In practice the data never falls out this evenly, but the output does let us see the most preferred / important attributes, and how the rest compare.
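To show how those shares can be arrived at, here's a back-of-the-envelope counting sketch: tally how often each attribute is picked as most and least important, take most minus least, and rescale so the shares sum to 100%. In practice the shares are usually estimated with a multinomial logit or hierarchical Bayes model rather than simple counts, and the responses below are entirely made up.

```python
from collections import Counter

# Hypothetical responses: one record per completed max diff screen,
# noting which attribute was picked as "most" and which as "least".
responses = [
    {"most": "Free delivery", "least": "Gift wrapping"},
    {"most": "Free delivery", "least": "Loyalty points"},
    {"most": "Next-day delivery", "least": "Gift wrapping"},
    {"most": "Easy returns", "least": "Gift wrapping"},
    {"most": "Loyalty points", "least": "Easy returns"},
]

most_counts = Counter(r["most"] for r in responses)
least_counts = Counter(r["least"] for r in responses)
attributes = set(most_counts) | set(least_counts)

# Most-minus-least counts, shifted so every score is non-negative,
# then rescaled so the shares sum to 100% across all attributes tested.
raw = {a: most_counts[a] - least_counts[a] for a in attributes}
shift = 1 - min(raw.values())  # keep the bottom attribute's share above zero
total = sum(raw[a] + shift for a in attributes)

for attribute in sorted(attributes, key=lambda a: -raw[a]):
    share = 100 * (raw[attribute] + shift) / total
    print(f"{attribute}: {share:.1f}% share of preference")
```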

In this totally made-up example max diff output, we can see that for prospects of a content service, the top three genres are equally appealing. Although ranked first, second and third, they are all extremely important. Combined, the top three genres account for a third of all the new content prospects most want to watch. At the other end of the chart, we can see genres with more niche appeal vs. the other genres tested.

The output from max diff can be paired with TURF analysis to ensure the most effective strategy is deployed with the available resources. Would you believe we’ve got a blog post on TURF analysis too? What luck!