WhatsApp already uses end-to-end encryption but Meta has delayed the rollout of similar technology on Facebook Messenger and Instagram. Photograph: Phil Noble/Reuters

UK could force messaging apps to adopt new technology to tackle abuse images


Amendment to online safety bill would require tech firms to make their ‘best endeavours’ to deploy new technology

Heavily encrypted messaging services such as WhatsApp could be required to adopt cutting-edge technology to spot child sexual abuse material or face the threat of significant fines, under changes to UK digital safety legislation.

The amendment to the online safety bill would require tech firms to make their “best endeavours” to deploy new technology that identifies and removes child sexual abuse and exploitation content (CSAE).

It comes as Mark Zuckerberg’s Facebook Messenger and Instagram apps prepare to introduce end-to-end encryption despite strong opposition from the UK government, which has described the plans as “not acceptable”.

The home secretary, Priti Patel, a longstanding critic of Zuckerberg’s plans, said the change in the law balanced the need to protect children with privacy for online users.

She said: “Child sexual abuse is a sickening crime. We must all work to ensure criminals are not allowed to run rampant online and technology companies must play their part and take responsibility for keeping our children safe. Privacy and security are not mutually exclusive – we need both, and we can have both, and that is what this amendment delivers.”

Child safety campaigners have said heavy encryption to ensure that only the sender and recipient can view their messages would prevent law enforcement and tech platforms from seeing illegal content. However, officials said the amendment was not an attempt to stop the rollout of more such services and that any technology deployed would have to be effective and proportionate.

Zuckerberg’s Meta business, which also owns the encrypted WhatsApp messaging service, has delayed the introduction of end-to-end encryption on Messenger and Instagram until 2023.

Vetting private messages for child abuse material has proved controversial, with campaigners warning of negative consequences for user privacy. One controversial method that could be considered by the communications watchdog Ofcom, which is overseeing implementation of the bill, is client-side scanning. Apple has delayed plans to introduce the technology, which would involve scanning user images for child sexual abuse material before uploading them to the cloud. The company has proposed deploying a technique that would compare photos with known images of child abuse when users opt to upload them to the cloud.
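For readers curious about the mechanics, the sketch below is a heavily simplified, hypothetical illustration of the hash-matching idea behind client-side scanning. It checks files against an invented KNOWN_ABUSE_HASHES set using exact cryptographic hashes; real proposals such as Apple’s rely on perceptual hashing matched against a vetted database of known material, not the exact-match approach shown here.

import hashlib
from pathlib import Path

# Hypothetical stand-in for a database of hashes of known abuse imagery.
# In deployed systems this would be supplied and vetted by child-safety bodies.
KNOWN_ABUSE_HASHES: set[str] = set()

def file_digest(path: Path) -> str:
    """Return the SHA-256 digest of a file's bytes (a simplification of perceptual hashing)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def should_block_upload(path: Path) -> bool:
    """Flag a file before upload if its digest matches a known hash."""
    return file_digest(path) in KNOWN_ABUSE_HASHES

if __name__ == "__main__":
    # Check every image queued for upload in a hypothetical local folder.
    for image in Path("pending_uploads").glob("*.jpg"):
        if should_block_upload(image):
            print(f"Match found, withholding upload: {image.name}")
        else:
            print(f"No match, uploading: {image.name}")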

Under the proposed amendment, Ofcom would be able to demand that tech firms deploy or develop new technology that can help find abuse material and stop its spread. The amendment tightens an existing clause in the bill that already proposes giving Ofcom the power to require deployment of “accredited technology”. The change would require companies to use their “best endeavours” to deploy or develop new technology if the existing technology is not suitable for their platform.


If a company fails to adopt that technology, Ofcom would have the power to impose fines of up to £18m or 10% of a company’s global annual turnover – whichever is higher.

The online safety bill returns to parliament next week after being scrutinised by a committee of MPs and is expected to become law around the end of the year or in early 2023.

There are between 550,000 and 850,000 people in the UK who pose a sexual risk to children, according to the National Crime Agency. Rob Jones, the NCA director general for child sexual abuse, said: “We need tech companies to be there on the frontline with us and these new measures will ensure that.”

The UK data watchdog has also intervened in the debate about end-to-end encryption, which is used by WhatsApp and other services such as Signal. In January the Information Commissioner’s Office said strongly encrypting communications strengthened online safety for children by reducing their exposure to threats such as blackmail, while also allowing businesses to share information securely.
