
The Recent Ploy to Break Encryption Is An Old Idea Proven Wrong

Clipper Chip
Jon Callas,
Senior Technology Fellow,
ACLU
July 23, 2019

This is the fourth and final essay in a series about a proposal by officials at Britain’s GCHQ that would require encrypted communications platforms to be designed to secretly add an extra participant — the government — to a conversation. In the previous essay, I explained why network design and cryptography mean that the government cannot listen from afar, as the proposal’s metaphor of crocodile clips implies. The listeners must be on the participants’ devices, and yet must remain secret, because listening in when everyone knows there’s a listener is comically silly. In this essay, I explain how criminals and terrorists would take advantage of that technological fact to evade the so-called “ghost user.”

Whenever you build a system, you have to test it in two ways. Quality assurance teams make sure that the system can be used correctly and produces the correct results when its users do the things you expect them (and instruct them) to do. In my career as a software engineer and security specialist, I led a team that did adversarial testing, also known as Red Teaming. Red Teams do unexpected, incorrect, devious, willfully obtuse, and downright malicious things to a system to see how it responds. Both of these kinds of testing are necessary before any system is deployed. Tools must both work when given the correct commands and respond well when given incorrect ones. We design technology to resist people who try to trick it into the wrong behavior.

In the early 1990s, the U.S. government had another proposal that would purportedly preserve secure communications for the “good guys” and provide “wiretappability” for the “bad” ones. This proposal was the notorious Clipper Chip, and it was finally abandoned because a flaw in its access system ensured that criminals could get around it. In brief, Clipper Chip telephone handsets would encrypt calls but place 40 bits of the 80-bit encryption key in government hands. This gave the U.S. government an easy 40-bit break of the encryption, while making everyone else do an 80-bit key search, which is daunting but not impossible today. That a phone was correctly escrowing half the key was signaled through a metadata hint called the Law Enforcement Access Field, or LEAF. To the outside world, the handsets had rather strong encryption, but to U.S. agents, the LEAF would make breaking the encryption much easier, taking only a few hours or days. At least, that was the idea.
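To make the escrow arithmetic concrete, here is a toy sketch in Python. It is not the real Skipjack cipher or LEAF format, and all of the names are mine; it only shows how holding 40 of a key’s 80 bits collapses the government’s search space.

```python
import secrets

# Toy illustration of Clipper-style partial key escrow. This is not the
# real Skipjack cipher or LEAF format, just the arithmetic of the split.
KEY_BITS = 80       # Skipjack used 80-bit keys
ESCROWED_BITS = 40  # bits of each key held in government hands

session_key = secrets.randbits(KEY_BITS)
escrowed_half = session_key >> (KEY_BITS - ESCROWED_BITS)  # what the government holds

# An outsider must search the full keyspace; an agency holding the
# escrowed half only has to search the remaining unknown bits.
outsider_work = 2 ** KEY_BITS
agency_work = 2 ** (KEY_BITS - ESCROWED_BITS)

print(f"outsider brute force: 2^{KEY_BITS} = {outsider_work:.2e} keys")
print(f"agency brute force:   2^{KEY_BITS - ESCROWED_BITS} = {agency_work:.2e} keys")
```

The gap between those two numbers is the entire difference between “rather strong encryption” and “a few hours or days.”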

In fact, the cryptographer Matt Blaze did Red Team testing of the Clipper Chip and defeated its security. His analysis showed that one could forge a LEAF, and thus create a phone that would work alongside Clipper phones yet not give exceptional access to the government. If, for example, you had one of these forged phones and I had a Clipper phone, law enforcement would be able to decrypt my half of the conversation, but not yours. If we both had forged phones, we would have opted out of Clipper’s access system altogether. This discovery led to the Clipper proposal fading away — because it just didn’t work. Potential customers didn’t want one of these forged phones (how do you trust such a thing?), and the government didn’t want a system where someone could opt out by simply using a forged phone. If the proposal had been implemented, it would have created two populations of users: the “smart” ones evading surveillance by using forged equipment, and the “dumb” ones using the conventional system.
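Blaze’s forgery worked because the LEAF’s integrity was protected by only a 16-bit checksum, so roughly 2^16 random candidates suffice to find one a chip will accept. Here is a toy model of that observation; the real checksum algorithm was classified, so a truncated SHA-256 stands in for it, and the LEAF structure is simplified to its essentials.

```python
import hashlib
import os

# Toy model of Matt Blaze's LEAF forgery. A 16-bit check means roughly
# 2^16 random tries to produce a LEAF the receiving chip will accept.

def checksum16(body: bytes) -> bytes:
    """Stand-in for the LEAF's 16-bit internal checksum."""
    return hashlib.sha256(body).digest()[:2]

def leaf_is_valid(leaf: bytes) -> bool:
    """A chip accepts a LEAF whose trailing 16 bits match its checksum."""
    return leaf[-2:] == checksum16(leaf[:-2])

# Forge: try random 128-bit LEAFs until one happens to self-validate,
# without ever escrowing a real key.
attempts = 0
while True:
    attempts += 1
    candidate = os.urandom(16)
    if leaf_is_valid(candidate):
        break

print(f"bogus LEAF accepted after {attempts} tries (expected about {2**16})")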

The same thing would happen under the ghost user proposal. While law enforcement typically replies to such issues with the comment that most criminals are dumb, I believe that a system that permits intelligent criminals to operate with impunity, while everyday people can be spied upon, is an affront to nearly every principle of civil liberties, and certainly to the principles that the GCHQ authors use to justify their proposal, particularly those of fairness, proportionality, transparency, and trust.

Build Our Own Canary for This Coal Mine

Nate Cardozo and Seth Schoen of the EFF wrote an essay in which they show how the “ghost user” in the GCHQ proposal can be detected with some sophisticated cryptographic techniques. Their article is clever and worth a read. I take a different approach to defeating the ghost user system, one that is directly analogous to the defeat of Clipper. I can write an alternate app that runs alongside the official installation of WhatsApp or other software, performs the same function as the official app, and yet tells the user who all the other parties in the conversation are. There is no way to prevent such an app because, for the reasons I explained in previous essays, the conversation keys have to be on the device in order for the conversation to be end-to-end encrypted. The “client” software that operates on the computer or smartphone can always tell a user about all the participants and can report any user, ghost or not, entering or leaving the conversation. This app might do nothing more than tell me who the participants are, or alert on the addition or removal of devices. In security, we call this a “canary app,” after the proverbial canary in the coal mine. The way this canary app would work is that if it suddenly looks like Carol has just gotten a tablet, Alice might say, “Congratulations on the new tablet, Carol.” Carol replies, “What new tablet?” and then the jig is up. They know that someone or something is pretending to be Carol’s new tablet.
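Here is a minimal sketch of such a canary, under the assumption that a reverse-engineered client can enumerate the devices a conversation’s keys are negotiated with (it must be able to, for the reasons above). The fetch_participants function below is a simulation standing in for that client code; everything in it is hypothetical.

```python
import itertools

# Minimal sketch of a "canary app": watch the device list of a conversation
# and alert on any change. The conversation keys must live on each device
# for end-to-end encryption to work, so a reverse-engineered client can
# always enumerate the devices a message is encrypted to. The snapshots
# below simulate a ghost device quietly joining as "Carol's new tablet."
_snapshots = itertools.chain(
    [{"alice-phone", "bob-phone", "carol-phone"}] * 2,
    itertools.repeat({"alice-phone", "bob-phone", "carol-phone", "carol-new-tablet"}),
)

def fetch_participants(conversation_id: str) -> set:
    """Simulated; a real canary would read this from the wire protocol."""
    return next(_snapshots)

def watch(conversation_id: str, polls: int = 4) -> None:
    known = fetch_participants(conversation_id)
    for _ in range(polls):
        current = fetch_participants(conversation_id)
        for device in current - known:
            print(f"CANARY: new device joined the conversation: {device}")
        for device in known - current:
            print(f"CANARY: device left the conversation: {device}")
        known = current

watch("family-chat")
```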

A more sophisticated canary could simply reject the exceptional access request by refusing to negotiate the “ghost” encryption key exchange or by sending it bogus keys. A very clever app could use a chatbot to send the ghost different messages than the ones the real people see. Clever people will think of other ways to troll the spies on the line, starting with sending them malware.
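As a sketch of that more aggressive canary, the snippet below wraps the real session key only for devices the user has verified and hands any unrecognized device random garbage instead. The function names are hypothetical, and wrap_for stands in for whatever public-key wrapping the real protocol uses.

```python
import secrets

def wrap_for(device: str, key: bytes) -> bytes:
    """Stand-in for encrypting `key` to the device's public key."""
    return key  # a real client would seal this to the device's key pair

def distribute_session_key(session_key: bytes, devices: list, verified: set) -> dict:
    """Give the real key to verified devices; poison everyone else."""
    wrapped = {}
    for device in devices:
        if device in verified:
            wrapped[device] = wrap_for(device, session_key)
        else:
            # The ghost sees a well-formed key exchange but receives
            # random bytes, so it can decrypt nothing.
            wrapped[device] = wrap_for(device, secrets.token_bytes(len(session_key)))
    return wrapped

key = secrets.token_bytes(32)
out = distribute_session_key(key, ["alice-phone", "bob-phone", "ghost-device"],
                             verified={"alice-phone", "bob-phone"})
print("ghost got the real key?", out["ghost-device"] == key)
```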

Canary apps can’t be prevented. They can be created from existing open-source apps or created from whole cloth by reverse-engineering the network communications. People will write, publish, and provide these apps so that people vulnerable to attacks by their governments can protect themselves. Some criminals would install them, and there is no mechanism to prevent that.

Actual Bugs and Threats

There will be other security flaws in implementing a “ghost user” architecture beyond those I’ve identified. All software has bugs. Building a multi-user chat system is complex and there are many things to get wrong. For example, Apple’s multi-user FaceTime had an interesting bug in which someone could turn on another user’s microphone before they answered. Apple had to shut down multi-user functionality across the globe while the company fixed the problem.

In another example, the French government created a secure messenger called Tchap, intended for trusted government actors to use instead of messengers like WhatsApp and Telegram, which the government did not entirely trust. A security researcher found a way to create an account without a government email address within hours of the app’s release, thereby defeating the purpose of the app.

Software is hard to do correctly. It’s impossible to get it right the first time. Software that has a security goal that is in opposition to itself — be secure, but let certain parties break it — is even harder. It will be under attack from honest people who don’t want to be spied on. It will be under attack by criminals. It will be under attack by other governments who want to subvert the rules of exceptional access. For example, if the Chinese government learns to spy on UK citizens by pretending to be GCHQ, they will, and they aren’t going to tell anyone that they can.

If It Doesn’t Work, It Doesn’t Work

The government abandoned the Clipper Chip proposal because a researcher showed that an adversary who wanted encrypted calls the government couldn’t decrypt could cheat and get them. It wasn’t worth incurring the security problems and expense of the Clipper Chip when it couldn’t reliably give the government the access it needed. The same is true of the GCHQ proposal: some programmers will make canary apps that detect or thwart the spying. The more high-profile the target, and the more justified the exceptional access, the more resources and incentives the target will have to fight back against a secret government user.

The GCHQ proposal could be called “Clipper 2.” As with that discarded, flawed proposal, both citizens and government lose, while bad actors do as they wish with impunity. The GCHQ proposal introduces serious cybersecurity and public safety dangers without assuring that government agents get the data they want. It creates an international surveillance free-for-all in which smart criminals can opt out of government eyes while the law-abiding are left without security. It permits and encourages brazen governments to move their international information security battles into the phones of every honest person everywhere in the world. Like Clipper, the Ghost User proposal must be put aside.

Further Reading

Here is some further reading on the issues in this essay.

The French Government “Tchap” app

Romain Dillet, “French government releases in-house IM app to replace WhatsApp and Telegram”

, ""

Spyware, Malware, Stalkerware

Andy Greenberg, “Hacker Eva Galperin Has a Plan to Eradicate Stalkerware”

Michael M. Grynbaum, ""

EFF’s Ghost Detector

Nate Cardozo and Seth Schoen, “Detecting Ghosts By Reverse Engineering: Who Ya Gonna Call?”
