As those of us in the UX world know, it’s quite common to watch people struggle to use a site, then hear them say afterwards how easy it was to use – which is why it’s important to observe what people do, not what they say. People often say what they think will produce the desired response. Could this also apply to political polls?
Are polls different from user surveys?
Polling companies are now under the spotlight after recent failures to predict correct outcomes (though most have been within the rarely reported margin of error). There are a number of problems with the approach polling companies use, the main one being the difficulty in maintaining representative samples:
Companies frequently poll from panels, so respondents are likely to be at least moderately politically engaged
‘Random’ sampling – either online (through pop-ups) or by phone (dialling random numbers from the phone book) – receives very low response rates; how representative is a poll in which 9 out of 10 people refuse to answer?
People aren’t always honest, either because they don’t want their true political views to be known, or possibly in order to galvanise a particular side by stating an opposing position.
Answers are only as good as the questions that are asked. It is important to understand the context, and the reasons behind the answers, which may be far more revealing than the headline result. Recent political events seem to have been motivated in large part by a desire to vote against the establishment, born of frustration – i.e. the question itself mattered less than the opportunity for many marginalised people to make their voices heard.
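That “rarely reported margin of error” is just sampling arithmetic. As a minimal sketch (assuming a simple random sample and a 95% confidence level – real polls use weighting that widens this further):

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a proportion p from a simple random sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical poll of ~1,000 respondents with a 50/50 split:
moe = margin_of_error(0.5, 1000)
print(f"+/- {moe * 100:.1f} percentage points")  # roughly +/- 3.1 points
```

So two candidates polling at 48% and 52% in a 1,000-person sample are, statistically, in a dead heat – which is why “failures” within the margin of error are not really failures at all.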
How not to conduct a poll
I was interested to hear recently that an old clip from the 1980s sitcom “Yes, Prime Minister” had gone viral on YouTube in India.
In November 2016, in an attempt to reduce corruption and “black money”, the Indian government suddenly announced high denomination bank notes (86% of all currency in circulation) would immediately cease to be legal tender, and gave citizens merely a few weeks to exchange them. This caused significant disruption as people were unable to pay for basic services in cash, and queues at banks and ATMs were so long that a number of people are reported to have died while waiting.
Despite the chaos caused by withdrawing most cash from a society where over 95% of payments are made in cash, the PM released his own survey results showing that an overwhelming number of people supported the changes.
Prime Minister Modi’s phone app asked users a series of leading questions that framed a final question, and then only the headline figure – showing a vast majority in favour – was reported. Aside from the biased question style, the sample was wholly unrepresentative of the population: it consisted only of smartphone users (i.e. affluent Indians with bank accounts – those least likely to be affected by the changes) who had installed the Prime Minister’s own app (those most likely to favour the government).
This clip from “Yes, Prime Minister” is a fantastic lesson in how to get any answer you want from a poll! (Don’t do this!)
The problem with statistics – even when the data is reliable – is that they never show the full picture. However, they are popular because they appear to reduce complex issues to simple numbers.
How to use numbers effectively
At Bunnyfoot, we often use statistics such as Google Analytics data, but only to point us towards issues we can then investigate further. We also use rating systems such as NPS and SUS, but for benchmarking and as tools to probe participants’ attitudes, rather than as conclusive summary measures. Qualitative methods are the only way to truly understand what’s going on.
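Both NPS and SUS reduce to simple arithmetic, which is exactly why they work as benchmarks but not as explanations. A sketch of the standard published scoring formulas (generic illustrations, not Bunnyfoot-specific code):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6) on a 0-10 scale."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

def sus(responses):
    """System Usability Scale: ten items rated 1-5, alternating positive/negative wording."""
    # Odd-numbered items (index 0, 2, ...) are positively worded: contribute (rating - 1).
    # Even-numbered items (index 1, 3, ...) are negatively worded: contribute (5 - rating).
    total = sum((r - 1) if i % 2 == 0 else (5 - r) for i, r in enumerate(responses))
    return total * 2.5  # scales the 0-40 raw total to a 0-100 score

print(nps([10, 9, 7, 3, 8, 10]))  # 3 promoters, 1 detractor, 2 passives -> 33.3
```

The numbers tell you *that* something changed between benchmarks; only the probing conversation around them tells you *why*.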
Politicians and journalists like to reduce everything to numbers, but by doing so they’re further distancing themselves from the people they exist to serve.
We need more empathy!
“When one man dies it’s a tragedy. When thousands die it’s statistics.” Joseph Stalin
Although he wasn’t talking about Google Analytics, Stalin clearly understood our difficulty in empathising with large numbers of people. I think recent events indicate a widening disconnect between the public and the politicians who are meant to represent us, and a lack of empathy is to blame.
At the moment, the main ways politicians interact with the public are:
Highly choreographed visits to schools and companies in which interaction is limited
Constituency surgeries – for people with problems who are willing to persevere; only the most committed attend
Doorstep visits, usually in the run-up to elections
How much can you learn in a five-minute conversation?
Politicians need to persuade, but they also need to listen, and their current approach doesn’t give them enough insight into the lives of ordinary people. There is an empathy gap. Engaging with and understanding the customer at all levels is key to a successful product or policy.
It’s difficult for organisations or governments to design good products or services without empathising with the end users. Human-centred design tools like personas exist to build empathy – perhaps politicians could try tools like these in the early stage of policy formulation, rather than exclusively relying on focus groups to test half-baked policy ideas.
Back to basics
Ahead of this year’s French elections, I am pleased to see one French newspaper has decided to change focus from relying on polls, and is forcing journalists back out into the field to spend time speaking to real people. Le Parisien’s editorial director admitted that a traditional over-reliance on polls has increased distance between the newspaper and its readers.
So what could politics learn from UX?
Empathy leads to good design – whether of products or services. I’d like to see politicians and journalists follow the lead of Le Parisien and get out there and spend significant time listening to people and understanding their lives – a quick chat on the doorstep is not enough. Only then should policymakers work to develop and test new services.