Twitter boycott: When is the government going to act to tackle ‘Online Harms’ to our democracy?

Author:
Josiah Mortimer, former Head of Communications

Posted on the 28th July 2020

Users have been boycotting Twitter this week, over what they say are staggering levels of hate speech and racism on the platform. 

But while social media companies have a lot to answer for, they act within a legal framework. And unfortunately, the laws we have are not fit for purpose in the digital age.

To make the internet a safer space, the government published a plan in April 2019 to tackle ‘online harms’, which it defines as ‘online content or activity that harms individual users’, particularly children. Momentum for change grew after the tragic death of 14-year-old Molly Russell, and hate speech also formed part of the harms the plan set out to address.

The proposals, published in a White Paper, called for properly enforced ‘codes of practice’ on internet companies, and a statutory ‘duty of care’ – as well as sanctions for tech giants if the codes are not complied with.

The plans were welcomed on the whole – although they were worryingly quiet on the need to clamp down on the threats to democracy online. That includes ‘dark ads’ – where political campaigners hide who is paying to spread messages online.

As we told Parliament, there is “a near-total lack of regulation of online political advertising” which is putting the integrity of our elections at risk. The Online Harms bill – if it has teeth – could get to grips with this. The All-Party Parliamentary Group for Electoral Campaign Transparency called for any Online Harms regulator to be properly resourced, independent from government – and able to tackle foreign interference in our elections.

It has been well over a year since that White Paper, and we’ve seen no legislation put on the table. Nor have we seen any action on ‘dark ads’ undermining trust and transparency in politics.

Despite countless regulators, campaigners and committees calling for action, there has been woeful stalling from the government when it comes to updating Britain’s analogue-age rules.

Last year, the Health Secretary, Matthew Hancock MP, warned tech companies including Facebook, Google and Twitter that they must remove inappropriate, harmful content, following the death of Molly Russell, who took her own life in 2017 aged just 14.

Then-minister Margot James MP pointed out that there have been no fewer than fifteen voluntary codes of practice agreed with platforms since 2008: “Where we are now is an absolute indictment of a system that has relied far too little on the rule of law.”

Those warnings, however, are just words without anything to back them up. As the past few weeks have shown, we cannot leave the UK’s online sphere at the whim of Silicon Valley giants.

It is time for some legislation with teeth – to tackle hate speech and ensure voters have confidence in democratic debate online.

Sign our petition for campaign rules fit for the 21st Century

Enjoy this blog? Sign up for more from the Electoral Reform Society

