Online disinformation campaigns ‘will get worse’ says Bangor professor
Both the government and public should keep a close eye on online disinformation campaigns, which are likely to get worse before they get better.
This is the advice from Bangor University’s Vian Bakir, a Professor in Journalism and Political Communication, to the UK’s Digital, Culture, Media and Sport (DCMS) Parliamentary Select Committee.
On Sunday 29 July 2018, the DCMS Committee published its interim report on Disinformation and ‘fake news’.
The report sees the entire structure and future of our democracy as threatened by misinformation and disinformation spread rapidly online.
Vian Bakir has made multiple contributions to this inquiry, including one oral submission and three written submissions – two of them with Andrew McStay, Professor of Digital Life, also from Bangor University.
Throughout, she has highlighted the propagandistic and commercial drivers of disinformation and misinformation online.
She has also highlighted that the causes of this problem are multi-faceted, and hence, so must the solutions be.
Disinformation is pumped out online by disreputable domestic political campaigners and malign foreign states, willing to spread falsehoods with emotive, xenophobic and racist ‘dark ads’ that only small segments of the population can see online.
“The business model of digital intermediaries like Facebook enables such emotive, deceptive campaign materials to spread rapidly, because those intermediaries make money from user engagement,” she said.
“Outrageous lies or heart-tugging calls to action garner plenty of engagement online.
“What is more, such campaigning material can be timed to hit people right when it can have the most impact: for instance, days before a national vote, or at a time of day when the target audience is most likely to be active on social media.
“Meanwhile, mainstream news is decreasingly trusted or engaged with, and struggles to make enough money to employ investigative researchers, leading to the recirculation of public relations material rather than stories that speak truth to power.
“Other segments of mainstream news are highly politicised, pushing partial truths or improperly contextualised facts that fail to properly educate the audience. They cannot be relied upon to reach people with the full facts of a matter.
“And people are predisposed to believe what they think is already true, finding it difficult to recognise lies and deception, especially if they confirm what they already think. In this way, disinformation spreads rapidly online through social media.”
While the Fake News Inquiry will continue to dig into the commercial drivers of online disinformation and misinformation, it has already made recommendations calling for much tougher action on the digital intermediaries.
In part, this is because of their central role in spreading false information, but it is also because their commercial underpinnings are opaque, and open to exploitation by disreputable political campaigners and foreign states.
Professor Bakir suggests that the problem is unlikely to go away soon, due to a number of issues:
- The difficulty that people face in recognising deception online, and their propensity to share false information rather than the truth
- The rapid development of technologies that make it even harder to recognise deception (e.g. DeepFakes)
- The way tech platforms are constantly looking for ways to increase user engagement
- The fact that there will probably always be disreputable politicians seeking to win at all costs.
As a result, Bakir’s over-arching recommendation to the Fake News Inquiry was to establish a working group with the skills to understand the multi-faceted nature of the problem and to monitor developments in the area of misinformation and disinformation.
“What is around the corner may be much more worrying than what we have experienced to date,” she said.
Building on this guidance, the Fake News Inquiry recommended that: ‘As technology develops so quickly, regulation needs to be based not on specifics, but on principles, and adaptive enough to withstand technological developments.’
Related, they recommended that: ‘A professional global Code of Ethics should be developed by tech companies, in collaboration with this and other governments, academics, and interested parties, including the World Summit on Information Society, to set down in writing what is and what is not acceptable by users on social media.
‘The Code of Ethics should be the backbone of tech companies’ work, and should be continually referred to when developing new technologies and algorithms. If companies fail to adhere to their own Code of Ethics, the UK Government should introduce regulation to make such ethical rules compulsory.’
Publication date: 6 August 2018