Advanced Television

MPs: ‘Big tech, social media information failures’

May 21, 2024

By Colin Mann

Ahead of the Ministerial evidence session for its inquiry into Defending Democracy with the Minister for Security, Tom Tugendhat, and the Secretary of State for Science, Innovation and Technology (DSIT), Michelle Donelan, the Joint Committee on the National Security Strategy (JCNSS) of the UK Houses of Parliament has published the final pieces of written evidence received from big tech and social media companies, raising serious concerns over the disparity in their approaches to mis- and disinformation, deepfakes and the monitoring of online content.

2024 is expected to see record numbers go to the polls around the world, with elections due in over 70 countries, including the UK, Europe, the USA and India. There is widespread acknowledgement of the escalating threat from malicious actors seeking to interfere in national elections, using emerging technologies such as artificial intelligence to create misleading deepfakes and to propagate mis- and disinformation on online platforms. Rapid advances in these technologies make mis- and disinformation hard to detect, especially at speed.

Against this backdrop, Dame Margaret Beckett MP, Chair of the JCNSS, has expressed concern over the differing approaches across big tech and social media companies to monitoring and regulating their online content, and over their level of engagement with the inquiry’s subject: defending democracy.

Much of the evidence shows companies developing individual policies, each based on their own set of principles, rather than coordinating standards and best practice, resulting in siloed responses to evolving and sophisticated digital threats.

Giving evidence in Parliament on March 18th 2024, a representative of the Oversight Board for Meta argued that transparency around how information is handled, assessed, processed and moderated should take precedence over platforms adjudicating on content themselves. In reality, opaque and diffuse statements of widely varying policies make it very difficult to hold these companies to account for how they put their policies into practice.

Another concern is the lack of moderation and regulation of algorithms, which can create ‘echo chambers’ for users on these sites. Such echo chambers limit the content and information users are likely to access and to draw on when forming judgments during an election period.

“The Committee understands perfectly well that many social media platforms were at least nominally born as platforms to democratise communications: to allow and support free speech and to circumvent censorship,” states Beckett. “These are laudable goals, but they never gave these companies or any individual running and profiting from them the right or authority to arbitrate on what legitimate free speech is: that is the job of democratically accountable authorities. That holds all the more true for the form that many of these publishing platforms have in fact taken: one of monetising information spread through addictive technologies.”

“So I am concerned to see the huge disparity in approaches and attitudes to managing harmful digital content in the written evidence submissions we have received from companies ranging from X, TikTok, Snap and Meta to Microsoft and Google. This year we have seen groups developing technology to help people decipher the veracity of the dizzying variety of information on offer at every moment online. We would have expected that kind of front-foot approach and responsibility from the companies profiting from spreading the information.”

“For a start, we expected social media and tech companies to engage proactively with our Parliamentary inquiry, especially one so directly related to their work at such a critical moment in our global history. And if we must pursue a company operating and profiting in the UK to engage with a Parliamentary inquiry, we expect much more than a regurgitation of some of its publicly available content that does not specifically address our inquiry.”

“Much of the written evidence that was submitted shows, with few and notable exceptions, an uncoordinated, siloed approach to the many potential threats and harms facing UK and global democracy. The cover of free speech does not extend to untruthful or harmful speech, and it does not give tech media companies a get-out-of-jail-free card on accountability for information propagated on their platforms.”

“Though we have not concluded our inquiry or come to our recommendations, there is far too little evidence from these global commercial operations of the foresight we expected: proactively anticipating and developing transparent, independently verifiable and accountable policies to manage the unique threats in a year such as this. There is far too little evidence of the learning and cooperation necessary for an effective response to a sophisticated and evolving threat, of the kind the Committee described in our report on ransomware earlier this year. The Government’s Taskforce on Defending Democracy might be a useful coordinating body to which social media companies could proactively submit and share their learning on foreign interference techniques.”

“The UK’s own communications regulator, Ofcom, gave us oral evidence on its new powers to counter online threats, powers that will likely only come into effect after our elections have concluded and the UK’s next government has been decided. The JCNSS will expect to hear a better account from the tech and social media companies in oral evidence in Parliament in the early autumn,” she concludes.

Categories: Articles, Business, Policy, Regulation, Social Media
