Bobby Duffy is the author of the new book Why We're Wrong About Nearly Everything: A Theory of Human Misunderstanding. He is director of the Policy Institute at King's College London, and he lives in London.
Q: You begin the book with a discussion of whether the Great Wall of China is visible from space. Why did you start there, and how does that question, as you write, "highlight why there might be this gap between perception and reality"?
A: It’s a great way to get people to see the wider causes and implications of even simple misperceptions – partly because we know from surveys that about 50 percent of people wrongly think it is visible from outer space. And I confirm that sort of figure in just about every talk I do, with a huge range of audiences.
The point is not to make people feel stupid in any way; it’s to highlight how even this illustrates four or five key points from the book about why we get things wrong and what the implications are.
First, we tend not to think about this sort of question very deeply, because it’s quite trivial. But we don’t give lots of day-to-day decisions a lot of thought either; that would be exhausting. Instead we use what Daniel Kahneman calls fast thinking, where we’re not engaging in careful consideration.
We also struggle with scales as humans, often mixing them up. The Great Wall of China is extremely big; in fact it is one of the largest man-made structures on earth. But it’s its length that gives it that scale, and length alone doesn’t make it visible from outer space.
We also suffer from the illusory truth effect, which means we’re more likely to believe and accept something when we hear it repeated several times. We’ll have heard this “fact” in many circumstances, and not thought much of it, but its repetition alone helps make it seem more real.
But it’s also more emotional than it might seem for such a trivial fact – we want it to be true, because it’s just a cool and unusual fact that humans have made something visible from space.
This is a key point of the book – that our views of reality are more affected by our emotions and identity than we often realise or would like to admit.
But finally, when I tell audiences I’ve looked into it, and the best evidence is that it’s not visible from outer space, they (usually!) believe me.
Again, this is an important point – people do change their minds with new information, and we need to hold on to the fact that we’re not all completely set in our ways.
Q: In the book, you discuss Brexit and the 2016 U.S. presidential election. What do those events say about why people misinterpret what's going on?
A: Well, I think they tell us more about how political actors and communications play on the causes of our misperceptions to make a connection with people.
So in the UK, Boris Johnson made a big play about the EU banning bendy bananas! This wasn’t really true, not in the sense it was presented, but it was very effective – partly because it’s such a vivid story, and partly because it implies that if the EU are meddling with our bright yellow, potassium-rich food, what else are they interfering in?
And in the US, on the campaign trail in 2016, Donald Trump repeatedly claimed, incorrectly, that unemployment was above 30 percent and that crime was much higher than decades ago. But his messages were less about the facts and more about showing he understood people’s sense of decline.
These messages play on key biases: that we naturally focus more on negative information, and that we think things were better in the past than they really were.
Our focus on the negative is a deep evolutionary trait: in our cave-dwelling past, negative information was often threat information, like a warning of a lurking sabre-toothed tiger, and people who didn’t take notice were edited out of the gene pool.
We’re the end of a long line of humans who did very well by focusing on the negative.
Q: What are some answers to this ongoing problem of human misunderstanding?
A: The key point of the book is that we go wrong because of both “how we think” and “what we’re told.”
This isn’t just about our biases on the one hand, or fake news and misleading politicians on the other – it’s about how the two interact. One plays off the other, and that points to what we can do.
The main point is that we can’t just teach people critical or news literacy and expect them to cope better. Nor can we simply get the social media platforms to clean up their act and expect that to solve it.
What we need are actions that deal with both sides of this “system of delusion”: doing better at bringing our skills up to meet the hugely different information environment we’re in now, and regulating and controlling more effectively how people can abuse this system, focusing on key decision points such as around elections.
We’re not even applying the same standards to online campaigning that we do to offline campaigning in many countries, and that’s a big gap, considering how important and effective online campaigning has become.
Q: What do you see looking ahead, given rapid technological advances?
A: It’s a really good question. There is a worrying combination of three technologies that could come together.
First, we have a greater ability to micro-target people online, based on the huge amount of data that is known about our behaviours and attitudes.
Second, AI has allowed communicators to create and test huge numbers of variations of a message to see which is most effective. In the 2016 campaign, for example, it was common for 100,000 tiny variations of a message to be tested every day, using the speed and processing power of AI to do something that simply would not be possible for humans.
And finally, deepfake capabilities, the ability to realistically fake videos, have come on hugely in recent years. The risk here is not so much fake videos of presidents or prime ministers starting wars, but how the tech can be used in campaigns.
Taking these three together, we could have personalized messages targeted at tight groups of individuals, where the president or PM is saying something entirely different to you from what he or she is saying to me.
This means we lose our common understanding of political positions and discourse, because it is all hidden from view, which is scary.
But much more important than any of these individual applications is the more general point that tech is always evolving beyond our ability to regulate or control it.
This is the real issue – that the accelerating pace of change means we’re always behind the curve. While we’re still worrying about how to deal with deepfakes, something else pops up.
It means we have to shift our approach to be less about individual capabilities and more about the principles we want to uphold. Future innovations then need to be tested against whether they fit with those principles.
That’s difficult to do, but vital if we’re not always going to be legislating for old threats.
Q: What are you working on now?
A: I’ve got two main themes.
The first is generational differences, which is the subject of my next book. I’m looking to separate myth from reality here too – there is a lot of nonsense talked about generational differences, and it obscures the very real changes that are going on underneath.
And in my “day job” I’m focusing on the growing threat of polarization in the UK, and how we need to understand it better so that, again, we don’t jump to the wrong conclusions. We think we’re more divided than ever, but actually there is a lot that brings us together, and we need to focus on supporting that, not talking up division.
--Interview with Deborah Kalb