Popular messaging app WhatsApp has said it would rather be banned in the UK than comply with the government’s proposed online safety law.
It believes the bill, if enforced, would weaken users’ privacy by undermining end-to-end encryption, which ensures that no one but the sender and recipient of a message can view its contents.
The controversial bill aims to tackle the growing proliferation of child abuse material by allowing the national communications regulator, Ofcom, to require encrypted messaging apps to use ‘accredited technology’ to identify and remove such material.
Undermine privacy
Head of WhatsApp Will Cathcart said that “98% of our users are outside the UK, they don’t want us to lower the security of the product”, adding: “We have recently been blocked in Iran, for example. We have never seen a liberal democracy do [this].”
Signal, another popular encrypted messaging app, has already threatened to leave the UK if the law comes into effect. Its president, Meredith Whittaker, tweeted in support of WhatsApp, saying she looked forward to Cathcart and others working to “pull back” against the bill.
Cathcart believes the UK sets a bad example for other liberal democracies, saying that “when a liberal democracy says, ‘Is it OK to scan everyone’s private communications for illegal content?’ that encourages countries around the world that have very different definitions of illegal content to propose the same.”
He also voiced concern that other countries could demand scanning against their own lists of illegal content: “If companies… [users’] communication against a list of illegal content, what happens if other countries show up and give another list of illegal content?”
On the other side of the table, the UK government and the National Society for the Prevention of Cruelty to Children (NSPCC) argue that end-to-end encryption prevents authorities from stopping the spread of child abuse material online.
“It is important that technology companies do everything they can to ensure that their platforms do not become a breeding ground for paedophiles,” the Home Office said.
Richard Collard, Associate Head of Child Safety Online Policy at the NSPCC, added that the bill “will rightly make it a legal requirement for platforms to identify and disrupt child sexual abuse on their sites and services.”
He also said these companies could develop “technology solutions” that protect users’ privacy while ensuring the safety of victims of child abuse.
He claimed that “experts have shown that it is possible to address child abuse and grooming in end-to-end encrypted environments”.
The UK government has clarified that end-to-end encryption is not being banned, and maintains that privacy and child safety are not mutually exclusive in an online context.
However, critics argue that the only way to check for illegal content would be to scan messages on a user’s device with an additional service, meaning the content of those messages would no longer be private.
Lawyer Graham Smith likened it to digging a hole to get around a fence without breaking it, tweeting “once the hole is dug you might as well not have the fence.”
Dr. Monica Horten of the Open Rights Group said the bill could turn WhatsApp into a “mass surveillance tool” as all users’ messages could potentially be scanned.
The Information Commissioner’s Office, which works closely with Ofcom, told BBC News that “where less intrusive measures are available, they should be used”, and said it supported “technological solutions that facilitate the detection of illegal content without undermining privacy protections for all”.