Aug 2, 2021 | Blog

Plans to tackle legal but harmful content will threaten free speech

Howard Freeman

A report from the House of Lords has criticised the government’s forthcoming Online Safety Bill, which would impose a duty of care on tech platforms when dealing with what has been called “legal but harmful” content. The claim is that the bill threatens freedom of expression online. Is that necessarily a bad thing?

The government is planning to impose a duty of care on technology platforms when dealing with “legal but harmful” content in the Online Safety Bill. This would be ineffective and would threaten freedom of speech, a House of Lords report has warned.

Duty of Care

Under the Bill’s duty of care, tech platforms that host user-generated content or allow people to communicate will have legal obligations. They will be required to proactively identify, remove and limit the spread of both illegal and legal but harmful content, such as child sexual abuse, terrorism and suicide material. Tech platforms can be fined up to 10% of their turnover by the online harms regulator, now confirmed to be Ofcom.

The report was published on 22 July 2021. The House of Lords Communications and Digital Committee said that it welcomes the Bill’s proposals to oblige tech platforms to remove illegal content and protect children from harm. However, it does not support the government’s plan to make companies moderate content that is legal, but may be objectionable to some.

Instead, the Lords argued that existing laws should be properly enforced, and that any serious harms not already made illegal should be criminalised.

Racist Abuse

“For example, we would expect this to include any of the vile racist abuse directed at members of the England football team which is not already illegal,” peers wrote in the report.

“We are not convinced that they are workable or could be implemented without unjustifiable and unprecedented interference in freedom of expression. If a type of content is seriously harmful, it should be defined and criminalised through primary legislation.

“It would be more effective – and more consistent with the value which has historically been attached to freedom of expression in the UK – to address content which is legal but some may find distressing through strong regulation of the design of platforms, digital citizenship education, and competition regulation.”

Enforcement

The peers said platforms should also be made to contribute more resources to help police enforce pre-existing laws, particularly when dealing with illegal content.

The report also pointed out that platforms’ moderation decisions were often “unreasonably inconsistent and opaque”. It also said they could be influenced by commercial or political motivations.

It added that, given the market is dominated by a handful of powerful companies such as Facebook and Google, these platforms must not be allowed to monopolise the digital public square. Instead, there should be “a range of interlinked services between which users can freely choose and move”.

To achieve this, peers said the Digital Markets Unit (DMU) should make structural interventions to increase competition, including mandating interoperability between social media services. The DMU was set up to scrutinise the dominance of tech giants in the UK economy and has begun work on developing legally binding codes of conduct to prevent anti-competitive behaviour in digital markets.

“The benefits of freedom of expression online mustn’t be curtailed by companies such as Facebook and Google. They are too often guided more by their commercial and political interests than by the rights and wellbeing of their users,” said committee chair Lord Gilbert.

Lack of Competition

“People have little choice but to use these platforms because of the lack of competition. Tougher regulation is long overdue. The government must urgently give the Digital Markets Unit the powers it needs to end these companies’ stranglehold. The lack of competition in this market is unacceptable.”

The report said the DMU should therefore make further structural interventions in the search engine market, including “forcing Google to share click-and-query data with rivals” and preventing the company from paying to be the default search engine on mobile phones.

Gilbert added that freedom of speech is not an unfettered right, but that the right to speak your mind is the hallmark of a free society. “The rights and preferences of individuals must be at the heart of a new, joined-up regulatory approach. This will bring together competition policy, data, design, law enforcement and the protection of children.” I wonder whether Lord Gilbert considers the rights of racists to speak their minds.

Campaign Group

In late June 2021, the newly formed campaign group Legal to Say, Legal to Type critiqued the Online Safety Bill, stating that the bill was overly simplistic and ceded too much power to Silicon Valley firms over freedom of speech in the UK.

Conservative MP David Davis, speaking at the press conference to launch the group, characterised the Bill as a “censor’s charter”: Silicon Valley providers are being asked to adjudicate and censor “legal but harmful” content, yet the criteria are vague and the size of the fines is unclear. What we do know is how they will respond: they will lean heavily towards the side of caution.

“Anything that can be characterised as misinformation will be censored. Silicon Valley corporations are going to be the arbiters of truth online. The effect on free speech will be terrible.”

But will it? Surely regulation that keeps so-called fake news off platforms whilst silencing the racists can’t be all bad?

