This week, The Guardian and the Daily Mail have both published stories questioning the effectiveness of the "When the fun stops, stop" campaign that promotes responsible gambling.
Both articles were based on the same new academic study, Testing a Gambling Warning Label’s Effect on Behaviour (Newall, Walasek, Singmann and Ludvig; University of Warwick, 2019).
As part of the team responsible for "When the fun stops, stop" ("WTFSS"), we were surprised: other results consistently show the campaign has worked very well. Of course, we appreciate the need to learn from new insight and understanding, especially on such a significant issue. It’s vital to get campaigns addressing socially sensitive topics right.
In this case, however, the study that has generated the headlines contains major flaws.
Let’s start with what’s fundamental: the only way to evaluate advertising is against its objectives. The objective of "WTFSS" is not to stop all gambling. It’s to reduce the likelihood of gamblers entering problem territory, that is, displaying problem behaviours as identified by the Problem Gambling Severity Index. Those behaviours take various forms, but the first symptom the index assesses is betting more than you can afford.
In the Warwick study, all participants were prompted to place a bet on last season’s FA Cup final. Some saw a virtual ad that included the "WTFSS" yellow banner at the bottom; others saw the same virtual ad without it. The stake they could place was… 10p. Now, it seems a stretch to imagine that most gamblers would feel they’ve gone too far by betting 10p on the cup final. What’s more, it wasn’t even 10p of their own money; it was part of the research incentive. That is a clear departure from the real-life signs of entering problem gambling territory.
Second, the study didn’t use the actual ad that now appears in public. It used an old design (with a larger "FUN") and omitted both the reference to the gambling charity GambleAware and the "18+" age restriction. Small details, yes, but ones that ground and strengthen the message; careful thought went into including them, and it seems a little unfair to strip them out of the "test".
Third, the study extrapolated from a single (inaccurate) example of the advertising to draw a conclusion about the whole campaign. "WTFSS" has featured in a range of different media, including stand-alone TV, press, digital and social activity – work that is understandably targeted at those with a higher propensity to gamble. Judging all this activity by just one conveniently small element is a bit like condemning "Vorsprung durch Technik" because of how it looks on oil-stained overalls.
Fourth, by focusing only on an artificially created gambling scenario, the study ignores all the ways in which "WTFSS" has shifted gambling attitudes and behaviour outside the immediate gambling moment. In the same way that most advertising increases "saleability" more than "sales" – creating a more favourable context for a brand or, in this case, a way of thinking – "WTFSS" has successfully promoted responsibility long before a finger hovers over the "bet" button, as demonstrated in some of the results below.
Fifth and finally, the study didn’t actually get a statistically significant result. In other words, the difference the researchers observed between the two groups was small enough that it could plausibly be down to chance, so they can’t confidently conclude the label had any effect either way. In the media coverage, only The Guardian noted this lack of statistical significance, and only halfway through its article. So readers were midway through the piece before any acknowledgement that none of this can confidently be taken to mean very much.
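For readers unfamiliar with the jargon, here is a minimal sketch of what a significance check on this kind of two-group comparison might look like. The numbers are invented purely for illustration; they are not the Warwick study’s data, and this is not the authors’ analysis.

```python
# A minimal sketch of a two-sample proportion test, the kind of check used to
# decide whether a difference between a "warning label" group and a control
# group is statistically significant. All numbers below are hypothetical,
# invented for illustration; they are not the Warwick study's data.
from statsmodels.stats.proportion import proportions_ztest

riskier_choices = [45, 52]   # hypothetical counts choosing the riskier bet (label, control)
group_sizes = [250, 250]     # hypothetical number of participants per group

z_stat, p_value = proportions_ztest(riskier_choices, group_sizes)
print(f"z = {z_stat:.2f}, p = {p_value:.3f}")

# With these made-up numbers, p is well above the conventional 0.05 threshold,
# so the observed gap could plausibly be chance - which is what "not
# statistically significant" means in the paragraph above.
```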
All in all, some big limitations.
Leaving aside this particular study, there have been various other industry-based reviews of "WTFSS" that we’re aware of. Of these, we can disclose a few historic figures from the independently run, twice-yearly tracker that gives some indication of the campaign’s impact. For example, more than one-fifth of regular gamblers (defined as people who gamble twice a month or more) who recognised the campaign in previous tracking agreed it had "led me to warn others about their gambling, if only jokingly", while, more importantly, 33% in the same study said it had "made me think about my gambling behaviour".
If we allow extrapolation from a 2,000-person poll to the population at large, that’s 3.2 million people claiming to think things through more. Of course, self-reported evidence has its limits – as do extrapolations – though it seems a far cry from the accusation that the campaign "doesn’t work".
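For transparency, here is the back-of-envelope arithmetic implied by those figures. The tracker’s underlying population base isn’t stated here, so the base below is simply inferred by working backwards from the numbers quoted; treat it as an illustrative assumption rather than published data.

```python
# Back-of-envelope check on the extrapolation above. The population base is not
# stated in this piece, so it is inferred from the quoted figures and should be
# read as an assumption for illustration, not as published tracker data.
share_thinking = 0.33          # "made me think about my gambling behaviour"
claimed_total = 3_200_000      # the 3.2 million people quoted above

implied_base = claimed_total / share_thinking
print(f"Implied base of campaign-recognising regular gamblers: {implied_base:,.0f}")
# Roughly 9.7 million people: the scale the 2,000-person poll is being scaled
# up to, which is exactly why the limits of extrapolation are worth flagging.
```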
None of this is to negate the importance of ongoing learning or the potential for comms to evolve. But a campaign as important as this requires informed judgment, not just attention-grabbing headlines.
Ollie Gilmore is strategy director at The Corner