If you sat down and made a list of prominent privacy and security activists, Michael Chertoff isn’t likely to make the cut. Chertoff is a co-author of the Patriot Act who served as US Secretary of Homeland Security under George W. Bush. And this week, he told listeners at the RSA security conference that forcing Apple to build a version of iOS that could unlock the San Bernardino shooter’s iPhone for the FBI was like forcing the company to construct a biological weapon.
Bet you didn’t see that coming.
Mike McConnell, the former director of the NSA, also spoke at the RSA conference and echoed Chertoff’s statements. That’s particularly interesting, because it was McConnell who led the NSA’s push to require the use of the so-called “Clipper chip” in the 1990s.
The Clipper chip was a processor that the NSA first proposed in 1993. It was an encryption device with a built-in backdoor that would allow the United States government to decrypt and monitor any communication passing through the phone or other device. Each chip would contain a cryptographic key that was provided to the government when the device was built.
“Once you’ve created code that’s potentially compromising, it’s like a bacteriological weapon. You’re always afraid of it getting out of the lab,” Chertoff said.
He’s not the first to use a medical analogy. USA Today reports Tim Cook as saying the only way to give the FBI what it wants is to write software that Apple “view[s] as sort of the equivalent of cancer.”
Yet this week, McConnell stood on stage and stated “ubiquitous encryption is something the nation needs to have.”
When the Clipper chip was proposed, McConnell and other authorities who favored its use attempted to reassure the public by claiming the government would never abuse the privilege and that its use would be subject to strict judicial oversight. More than 20 years later, those assurances seem like quaint relics of another time. Had the Clipper chip been successfully deployed in the 1990s, there’s little doubt the post-9/11 NSA would have used the capability to push for complete surveillance of all smartphones, everywhere, arguing that Americans had no reasonable expectation of privacy given that they’d agreed to buy devices containing the chip.
The key embedded in each chip could then be used to decrypt communications passing through the device, once the government “established its authority” to listen to a communication. The proposal was defeated after enormous outcry from the general public and robust opposition from the tech industry. Much of that opposition hinged on the argument that no backdoored computer system can be made secure, and that knowledge of such a backdoor cannot be perfectly contained.
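The escrow scheme at the heart of the Clipper proposal can be sketched in a few lines. This is a deliberately toy model, not the real Clipper design (which used the classified Skipjack cipher and split the escrowed key between two agencies); the serial number, key, and “cipher” below are all hypothetical stand-ins to show the structural problem: a copy of every device key sits in a database the device owner does not control.

```python
import hashlib

def keystream_cipher(key: bytes, data: bytes) -> bytes:
    """Toy XOR stream cipher keyed off SHA-256 (illustrative only, not secure).
    XOR is its own inverse, so the same call encrypts and decrypts."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

# At manufacture, each device's key is copied into a government-held escrow.
escrow_database = {}  # hypothetical escrow, indexed by device serial

def manufacture_device(serial: str, device_key: bytes):
    escrow_database[serial] = device_key  # key handed over at build time

# Normal use: the device encrypts traffic with its embedded key.
device_key = b"per-device-secret-key-material!!"
manufacture_device("A-1001", device_key)
ciphertext = keystream_cipher(device_key, b"hello")

# Escrowed access: once authority is "established," the government simply
# pulls the escrowed copy of the key and decrypts the traffic.
recovered = keystream_cipher(escrow_database["A-1001"], ciphertext)
print(recovered)  # b'hello'
```

The sketch makes the opposition’s point concrete: the security of every device collapses to the security of one database, and anyone who obtains it inherits the government’s decryption capability.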
Could the First Amendment shield Apple?
Switching gears slightly, the EFF has published an amicus brief it filed with the US District Court in California. It argues that forcing Apple to sign code that complies with the FBI’s wishes is a violation of the company’s free speech rights. Apple hasn’t yet pursued a defense on First Amendment grounds, but it’s possible the company will make such a claim in its appeal of last month’s decision.
The EFF first notes that computer code already enjoys the full protection of the First Amendment. Like music, code is considered “an expressive means for the exchange of information and ideas.” This is settled law.

The question then becomes: is forcing Apple to write code that does what the FBI wants a violation of the company’s right to free speech? This is where things get interesting. Under Tim Cook, Apple has taken a strong public position on encryption. It did not begin doing so when the San Bernardino case became public, but years before the shooting even occurred. Its current refusal to cooperate with the FBI, in this case and in others around the country, is not a development of the past few months, but the culmination of a strategy Apple has pursued for several years.

In order to fulfill the FBI’s demand, Apple must first write code, then digitally sign it. Here’s the EFF:

Apple’s code and digital signature, separately and together, affirm a commitment and belief regarding the authenticity of the code and the value of their customer’s privacy and security. The order compels Apple and its engineers to repudiate that belief, and undermine the very security they designed.

I’m not sure this analysis will fly in a court of law, and it seems telling that Apple, to date, has not advanced this line of argument itself. Free speech is not a blanket defense against any and all government authority, and the FBI might counter that creating a tool for law enforcement to use is private conduct rather than a public statement of support; Apple would remain free to tell the public that it was compelled to create and sign the software under extreme duress. The judge could also rule that while corporations have some freedom of speech, that freedom does not extend to disobeying the government in a criminal investigation where a warrant has been legally obtained.
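The write-then-sign step the EFF describes can be illustrated with a textbook RSA signature. This is a toy sketch only: the parameters below are tiny classroom numbers, and the firmware string is hypothetical. Apple’s real signing chain uses far larger keys and a more elaborate pipeline, but the structure is the same: only the holder of the private key can produce a signature that devices will accept.

```python
import hashlib

# Textbook RSA with tiny, hypothetical parameters (never use sizes like this).
p, q = 61, 53
n = p * q        # modulus: 3233
e = 17           # public exponent, baked into every device
d = 2753         # private exponent, held only by the signer: e*d ≡ 1 mod (p-1)(q-1)

def sign(message: bytes) -> int:
    """The signer hashes the code, then applies the private key."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    """A device recomputes the hash and checks it against the signature
    raised to the public exponent."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

firmware = b"ios_update_build_1234"  # hypothetical code to be signed
sig = sign(firmware)
print(verify(firmware, sig))            # True: code the signer vouched for
print(verify(firmware, (sig + 1) % n))  # False: RSA signing is a permutation
                                        # mod n, so no other value verifies
```

This is why the signature, and not just the code, is the crux of the dispute: a device will only run what Apple’s private key has endorsed, so complying with the order means Apple must actively vouch for the FBI’s unlock tool.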
With prominent experts and advocates lining up on both sides, it’s going to be interesting to see how this plays out. If the FBI’s order stands, Apple will have been compelled to create an unlock tool that law enforcement around the country can then request. Considering those consequences, it’s hard to see how this isn’t “compelled speech.” But does the First Amendment apply? That’s unclear.