Could the UK be about to break end-to-end encryption?

Once again there are indications the UK government intends to use the law to lean on encryption. A report in The Sun this week quoted a Conservative minister saying that should the government be re-elected, as polls suggest it will be, it will move quickly to compel social media firms to hand over decrypted data.

The paper quoted an unnamed government minister saying: “The social media companies have been laughing in our faces for too long”, and suggested that any tech company with more than 10,000 users would have to significantly adapt its technology to comply with the decryption law.

The relevant Statutory Instrument, which would enable UK government agencies to obtain warranted access to decrypted data from communications service providers and is currently sitting in draft form, will be voted through parliament within weeks of a new government taking office after the June 8 general election, according to the report.

As is typically the case when strong encryption comes back under political pressure in the modern Internet age, this leaked hint of an impending ‘crackdown’ on tech firms came hard on the heels of another terrorist attack in the UK, after a suicide bomber blew himself up at a concert in Manchester on Monday evening. The underlying argument is that intelligence agencies need the power to break encryption in order to combat terrorism.

Strong encryption, cryptic answers

The problem, as always in this recurring data access vs strong encryption story, is that companies that use end-to-end encryption to safeguard user data cannot hand over information in a readable form, because they do not hold the encryption keys needed to decrypt it.
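To make the point concrete, here is a minimal, illustrative sketch of the end-to-end principle, written in Python with the open source PyNaCl library. It is a simplification for illustration, not WhatsApp’s or anyone else’s actual implementation, but it shows the core property: the relaying server only ever sees ciphertext, so it has nothing intelligible to disclose.

```python
# Illustrative sketch only: a toy end-to-end exchange using PyNaCl
# (pip install pynacl). Real messengers layer far more on top of this
# (e.g. the Signal Protocol), but the core property is the same.
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device; private keys
# never leave that device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts directly to Bob's public key.
ciphertext = Box(alice_private, bob_private.public_key).encrypt(b"meet at noon")

# The service provider relays only ciphertext. Holding no private key,
# it cannot produce the plaintext a decryption warrant would demand.
relayed_by_server = bytes(ciphertext)

# Only Bob, holding his private key, can decrypt.
plaintext = Box(bob_private, alice_private.public_key).decrypt(relayed_by_server)
assert plaintext == b"meet at noon"
```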

So the question remains: how can the government compel companies to hand over information they cannot access?

Will it do so by outlawing the use of end-to-end encryption? Or by forcing companies to build in backdoors, thereby breaking strong encryption in secret? The latter would arguably be worse, since the government would be opening app users up to potential security vulnerabilities without letting them know their security had been compromised.

The UK government has been circling this issue for years. At the back end of last year it passed the Investigatory Powers Act, which threw up questions about the looming legal implications for encrypted communications in the UK, owing to a provision stating that communications service providers may be required to “remove electronic protection of data”.

It’s those powers that ministers are apparently intending to draw on to break social media firms’ use of strong encryption.

During the scrutiny process for the IP bill last year, ministers led a merry dance around the implications of the “electronic protection” removal clause for e2e encryption. The most charitable interpretation was that the government was trying to frame a law that encouraged tech platforms to eschew strong encryption, so as not to risk falling foul of an unclear law.

“He seems to be implying that providers can only provide encryption which can be broken and therefore can’t be end-to-end encryption,” was Lord Strasburger’s assessment of the government response to questions on the topic last July.

No clarity has emerged since then; there is still ongoing fuzziness about the legality of e2e encryption in the UK. To break or not to break, that is the question.

Arguably, as Strasburger suggested, this is strategic: intentional obfuscation on the part of the UK government, spreading FUD to discourage use of a technology its intelligence agencies view as a barrier to their work.

But the problem for the government is that use of e2e encryption has been growing in recent years as awareness of both privacy risks and cyber security threats has stepped up, thanks to data breach scandal after data breach scandal, as well as revelations of the extent of government agencies’ surveillance programs following the 2013 Snowden disclosures.

Not holding encryption keys allows tech firms to step outside the controversy related to digital snooping and to bolster the security cred of their services. Yet, as a result, popular services that have championed strong encryption are increasingly finding themselves in the crosshairs of government agencies. Be it the Facebook Messenger app, or Facebook’s WhatsApp messaging platform, or Apple’s iOS and iMessage.

After another terror attack in London in March, UK Home Secretary Amber Rudd was quick to point the finger of blame at social media firms — saying they should not provide “a secret place for terrorists to communicate with each other”, and asserting: “We need to make sure that our intelligence services have the ability to get into situations like encrypted WhatsApp.”

Of course she did not explain how intelligence agencies intended to “get into” encrypted WhatsApp. And that earlier political pressure on encryption morphed, at least publicly, into calls for social media firms to be more proactive about removing terrorist content from their public channels; whatever discussions were held about encryption itself were not made public.

But again, if the latest reporting is to be believed, Rudd is intent on breaking strong encryption after all.

Exceptional access, unacceptable risk 

It’s worth revisiting Keys Under Doormats, the paper written by a group of storied security researchers back in 2015 re-examining the notion of so-called “exceptional access” to encryption systems for security agencies, at a time when debate had also been re-ignited by politicians calling for ‘no safe spaces for terrorists’.

The report examined whether it is “technically and operationally feasible to meet law enforcement’s call for exceptional access without causing large-scale security vulnerabilities”, posing the question of whether such exceptional access can be built in without creating unacceptable risk.

Their conclusion was unambiguous: exceptional access without unacceptable risk is not possible, they wrote. Nor is it clear such access would even be feasible, given how the services in question criss-cross international borders.

Here’s one key paragraph from the paper:

Designing exceptional access into today’s information services and applications will give rise to a range of critical security risks. First, major efforts that the industry is making to improve security will be undermined and reversed. Providing access over any period of time to thousands of law enforcement agencies will necessarily increase the risk that intruders will hijack the exceptional access mechanisms. If law enforcement needs to look backwards at encrypted data for one year, then one year’s worth of data will be put at risk. If law enforcement wants to assure itself real time access to communications streams, then intruders will have an easier time getting access in real time, too. This is a trade-off space in which law enforcement cannot be guaranteed access without creating serious risk that criminal intruders will gain the same access. Second, the challenge of guaranteeing access to multiple law enforcement agencies in multiple countries is enormously complex. It is likely to be prohibitively expensive and also an intractable foreign affairs problem.

They further concluded:

From a public policy perspective, there is an argument for giving law enforcement the best possible tools to investigate crime, subject to due process and the rule of law. But a careful scientific analysis of the likely impact of such demands must distinguish what might be desirable from what is technically possible. In this regard, a proposal to regulate encryption and guarantee law enforcement access centrally feels rather like a proposal to require that all airplanes can be controlled from the ground. While this might be desirable in the case of a hijacking or a suicidal pilot, a clear-eyed assessment of how one could design such a capability reveals enormous technical and operational complexity, international scope, large costs, and massive risks — so much so that such proposals, though occasionally made, are not really taken seriously.

One thing the paper did not consider is that much politicking can be primarily intended as a theatre of influence for winning votes from spectators.

And the timing of the latest leaked call for ‘decryption on-demand’ coincides with an imminent UK general election, while also serving to shift potential blame for security failures associated with a terrorist attack that took place during the election campaign away from government agencies and onto a softer target: overseas tech firms.

As we’ve seen amply in recent times, populist arguments can play very well with an electorate. And characterizing social media companies as the mocking, many-headed pantomime villain of the story transforms complex considerations into a basic emotional attack that might well be aimed at feeding votes back to a governing party intent on re-election.

“…to disclose, where practicable… in an intelligible form”

Returning to UK law, the (still draft) ‘Investigatory Powers (Technical Capability) Regulations 2017’ is the legal route for placing obligations on comms service providers, under the IP Act, to maintain the technical capabilities needed to afford government agencies the warranted, on-demand access they keep demanding.

Yet exactly what those technical capabilities are remains unclear. (And “vague” technical requirements for exceptional access are also raised as a problem in Keys Under Doormats.)

Among the list of obligations Technical Capability Notices can place on comms service providers is the following non-specific clause:

To provide and maintain the capability to disclose, where practicable, the content of communications or secondary data in an intelligible form and to remove electronic protection applied by or on behalf of the telecommunications operator to the communications or data, or to permit the person to whom the warrant is addressed to remove such electronic protection.

The document also sets out that decrypted data must be handed over within a day of a CSP being served a warrant by a government agency, and that CSPs must maintain the capability to simultaneously intercept the communications and metadata of up to 1 in 10,000 of their customers. (For a service with, say, 50 million UK users, that would mean standing capacity to wiretap 5,000 of them at once.)

The technical details of how any encryption perforations could be achieved are evidently intended to remain under wraps. Which means wider data security risks cannot be publicly assessed.

“I suspect that all the vagueness about concrete technical measures is deliberate, because it allows the government to deal with the technical details within a particular technical capability notice, which would be under a gag order, and thus avoid any public scrutiny by the infosec community,” argues Martin Kleppmann, a security researcher at the University of Cambridge who submitted evidence to the parliamentary committees scrutinizing the IP bill last year, and who has blogged about how the law risks increasing cyber crime.

“It’s easy to criticize encryption technologies as providing ‘safe spaces’ for terrorists while forgetting that the exact same technologies are crucial for defence against criminals and hostile powers (not to mention protecting civil liberties),” he adds.

“Intelligence agencies don’t seem to actually want bulk access to encrypted data, but merely want the capability to intercept specific targets. However… if a system allows encryption to be selectively circumvented at the command of an intelligence agency, it’s not really end-to-end encryption in a meaningful sense!”

One possibility that has sometimes been suggested for enabling ‘exceptional access’ is a NOBUS, aka a ‘nobody but us’ backdoor: one designed so that finding or exploiting it is mathematically/computationally infeasible for anyone but its creator. However Kleppmann points out that even if the math itself is solid, it merely takes one person with knowledge of the NOBUS to leak it, and then, as he puts it, “all mathematical impossibility goes out of the window”.
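To see why, it helps to sketch the kind of escrow design an exceptional-access mandate tends to imply. The following is a hypothetical, deliberately simplified construction (all names invented for illustration, not any real agency’s scheme): each message key is wrapped for the recipient and, additionally, for a government-held escrow key. However sound the math, every user’s security then hangs on the secrecy of that single escrow private key.

```python
# Hypothetical escrow sketch (PyNaCl), purely to illustrate the
# structural weakness Kleppmann describes; the design and names are
# invented for illustration.
from nacl.public import PrivateKey, SealedBox
from nacl.secret import SecretBox
from nacl.utils import random

escrow_private = PrivateKey.generate()       # held by the agency
escrow_public = escrow_private.public_key    # baked into every client

def send_message(plaintext: bytes, recipient_public):
    # Encrypt the message under a fresh symmetric key...
    message_key = random(SecretBox.KEY_SIZE)
    body = SecretBox(message_key).encrypt(plaintext)
    # ...then wrap that key for the recipient AND for the escrow key.
    for_recipient = SealedBox(recipient_public).encrypt(message_key)
    for_escrow = SealedBox(escrow_public).encrypt(message_key)
    return body, for_recipient, for_escrow

# Anyone who obtains escrow_private (an insider leak, a breach, a
# court order in another jurisdiction) can decrypt every message
# ever sent through the system:
bob = PrivateKey.generate()
body, _, for_escrow = send_message(b"private note", bob.public_key)
leaked_key = SealedBox(escrow_private).decrypt(for_escrow)
assert SecretBox(leaked_key).decrypt(body) == b"private note"
```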

“The only way of making a system secure against adversaries who want to harm us is by designing it such that there are no known flaws or backdoors whatsoever, and by fixing it if any flaws are subsequently discovered,” he argues.

Crypto expert Bruce Schneier is also dismissive of the notion that there might be a ‘third way’ for authorities to securely gain exceptional access. Asked by TechCrunch whether there could be any way for the UK government to implement a technical capability without the law effectively removing e2e encryption, his response is very clear. “No,” he says. “Just like the US, the way for the government to get the access it wants is to destroy everyone’s security.”

Meanwhile, on the vulnerability front, Kleppmann notes that even with services that have open source components, such as WhatsApp, which uses the respected (and independently security reviewed) Signal Protocol for its encryption system, users are still required to trust that the company’s servers are doing what they say they are when handing over keys. Which could offer a potential route for a government-mandated backdoor to be slipped in.

“With WhatsApp/Signal/iMessage there is the remaining problem that you have to trust their server to give you the correct key for the person you want to communicate with,” he says. “Thus, even if the encryption is perfect, if a government agency can force the server to add the government’s special decryption key to your list of device keys, they can still subvert the security of the system. People are working on improving the transparency of key servers to reduce this problem, but we still have a long way to go.”
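This is why messengers expose key fingerprints (Signal calls them ‘safety numbers’) that users can compare out of band. Here is a simplified sketch of the idea using only Python’s standard library; real schemes hash rather more context than this, but the effect is the same: if a server swaps in a different key, the fingerprints the two users compute stop matching.

```python
# Simplified fingerprint check, Python stdlib only. Real protocols
# (e.g. Signal's safety numbers) hash identity keys plus extra
# context; this just demonstrates the principle.
import hashlib

def fingerprint(key_a: bytes, key_b: bytes) -> str:
    # Sort the keys so both parties derive the same value, then hash
    # and truncate into groups that humans can read out and compare.
    digest = hashlib.sha256(b"".join(sorted([key_a, key_b]))).hexdigest()
    return " ".join(digest[i:i + 5] for i in range(0, 30, 5))

alice_key = b"alice-public-key-bytes"      # placeholder key material
bob_key = b"bob-public-key-bytes"
substituted_key = b"server-inserted-key"   # what a coerced server might serve

# Matching strings read over a trusted channel (in person, a phone
# call) mean the key server has not swapped in a different key...
print(fingerprint(alice_key, bob_key))
# ...while a substituted key produces a visibly different fingerprint.
print(fingerprint(alice_key, substituted_key))
```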

“I do believe open source is very helpful here,” he adds. “It’s not a silver bullet, but it makes it more difficult to sneak in a backdoor unnoticed.”

Previously, UK government ministers have claimed that they neither want to ban end-to-end encryption nor are demanding that backdoors be built into digital services. Although they have also described the rise of e2e encryption as “alarming”.

When interrogated specifically on the e2e question, the former UK Home Secretary (and now UK Prime Minister) said that companies should take “reasonable steps to ensure that they are able to comply with the warrant that has been served on them”.

Yet — and you might be spotting a pattern here — there has been no definition of what those “reasonable steps” might be.

Therefore it remains unclear where the UK’s legal line will be drawn on encryption.

Backdoors and outlaws

If The Sun’s story is correct, and UK ministers-in-waiting are indeed preparing to demand that the likes of WhatsApp and Apple hand over decrypted messages, then those “reasonable steps” would presumably require an entire reworking of their respective security systems.

And if the companies don’t bow to such demands, what then? Will the UK government move to block access to WhatsApp’s e2e encrypted messaging service? Or ban the iPhone, given that Apple’s iMessage also uses e2e encryption? We just don’t know at this point.

A spokesperson for WhatsApp declined to comment when contacted for a response to this story.

Apple’s press team did not respond to a request for comment either. But the company has a history of strongly defending user privacy — taking to the courts in the US last year to fight the FBI’s demand to weaken iOS security to help facilitate access to a locked iPhone that had been used by a terrorist, for example.

WhatsApp has also had its service blocked multiple times in Brazil after it was taken to court for not handing over decrypted data to law enforcement authorities. Its response? To state in public that it cannot hand over information it does not hold.

However, the legal situation in the UK is different owing to the 2016 IP Act — with its troublesome clause about “removing electronic protection”.

And while there may be fresh moves afoot to introduce a decrypt bill in the US, such legislation has not yet come to pass. Whereas in the UK the relevant law is now framed in such a way that it can be interpreted as requiring CSPs to deliver up decrypted data on warranted demand.

So it’s not apparent that there would be any legal route for Apple to try to fight a decryption order for iMessage — should it be handed one by UK government agencies — given the company has a substantial presence in the UK. (As does Facebook, the parent of WhatsApp.)

“You can’t run a company as an outlaw,” says Danvers Baillieu, a former lawyer who is now COO of a startup, after a stint working for the VPN firm HideMyAss. “If you change the law and it is [a company’s] legal duty to do something, they don’t really have a leg to stand on. It’s all very well them saying they’re going to crusade for this and that, but they ultimately have to comply with the law.”

“As a VPN provider we obviously told people to get lost the whole time from other countries because we didn’t have a physical presence there and we said we just had to abide by UK law. So we were constantly having services taken down in countries like India and Turkey and other places — because the authorities there would then lean on our local server providers,” he adds.

“But we could get away with it because we weren’t physically there. But the moment you have a physical presence — and the moment we got taken over by a multinational [HMA was acquired by AVG in 2015] we suddenly had to think about these things far more, because suddenly we were part of a multinational with offices in all these countries. And we had to be a lot more sensitive to these things.”

At this point we simply do not know what these multinational tech giants might feel they have to do to their security systems behind closed doors when/if they are being leant on by the full force of UK law, as CSPs are forbidden from disclosing the existence of Technical Capability Notices.

And if they’re being leant on to build and test backdoors to afford UK intelligence agencies access to their systems we may never know as there’s no legal route for them to tell their users what’s happening.

Perhaps they’d just remove marketing materials that mention ‘end-to-end’ encryption from UK versions of their services, and, much as with a warrant canary, we’d have to infer that a certain service might no longer be trustworthy for UK users from that moment on.

“It would certainly make for some very bad PR, were a company to defy the gag order and make it publicly known,” says Kleppmann. “So maybe in such cases the government would choose not to serve a technical capability notice in the first place, and only rely on cooperation from companies that are happy to cooperate voluntarily. But now we’re really in guesswork territory.”

Meanwhile, plenty of tech services are of course built and maintained by overseas firms or developers with little or no presence in the UK.

Which raises the question of how the government would respond to that workaround for its plan to acquire decrypted data. And whether it would seek to block access to services that offer e2e encryption and cannot be legally compelled to build in backdoors.

A lawyer we spoke to for this story who did not wish to be identified suggested there may be some overseas providers that are willing to “do something” — “if they can find a way to do so, and want the comfort of a legal compulsion”.

For those overseas providers that are adamant they will not remove electronic protection when handed a UK warrant, it’s difficult to say what the government might do. The source suggested they could try blocking access to such services by leaning on other UK-based companies — such as ISPs and multinational app stores.

“We’ve seen in the Digital Economy Act, in the context of overseas porn sites which fail to comply with UK rule, the fall-back position is one of ISP blocking,” they said. “There is also the (seemingly non-binding) approach of having a chat with app store operators and other ‘ancillary service providers’, to encourage them to take action — presumably, removing an app from the store, or the removal of payment services provision from the app/service in question.”

A blocking strategy would be highly unlikely to render it impossible to access every service offering e2e encryption without government backdoors, so, as ever, the political desire for an absolute workaround for strong encryption would be doomed to fail. Meanwhile, the cost to mainstream app users of government requiring CSPs to build access exploits into their systems ‘just in case’ would be a greater risk of their communications being hacked, leaked and snooped on.

“I think ultimately the reputable, multinational companies would comply but then you’re always going to have some kid spinning up a service from their bedroom in the middle of nowhere — or you have the latest version of Telegram, or something like that — and then it’s not going to comply. So obviously any sensible criminal or terrorist is not going to use the mainstream ones,” says Baillieu. “Criminals are generally quite dumb about this sort of stuff. But whether that applies to the more motivated terrorists, we just don’t know.”

“I think equally there’s a very good argument to say you should make it hard for these people to do this stuff,” he adds. “They shouldn’t just be able to use the most convenient apps that everyone has on their phone. We should make it difficult for them — and they might slip up… And I think you can make it quite hard for non-compliant apps to get distributed.

“I think a lot of people, probably this week, are feeling a little bit vulnerable. And we have to do something to address this.”

While Baillieu’s view is understandable, given the horror and fear generated by terrorism, it does risk losing sight of the wider, day-to-day risks posed to all users of digital services if governments systematically undermine data security. And we don’t have to look far back in time for an example of those risks.

The WannaCry ransomware, which caused havoc globally earlier this month, including locking out healthcare systems in the UK, utilized an exploit, known as EternalBlue, that was developed by (and leaked from) the US National Security Agency.

So, really, a “clear-eyed assessment” is what’s called for here — despite, and perhaps even because of, the horrors of terrorism.

“These proposals are unworkable in practice, raise enormous legal and ethical questions, and would undo progress on security at a time when Internet vulnerabilities are causing extreme economic harm,” is how Keys Under Doormats assessed the “exceptional access” proposals of 2015.

Two years later their assessment would surely be that the risks of seeking to systematically backdoor encryption now are only greater — as more and more systems are being connected and more and more people are dependent on the data they contain.

Yet politicians in positions of power are apparently intent on waging yet another self-defeating crypto war. Where’s the sense in that?