On October 22, former FBI general counsel Jim Baker published a lengthy and astonishing piece called “Rethinking Encryption.” In that article, Baker, now the director of national security and cybersecurity at the conservative-leaning R Street Institute, advised the Justice Department and law enforcement to “embrace reality and deal with it” when it comes to encrypted communications.
Running counter to the Justice Department’s and law enforcement’s now decades-long, on-again, off-again pursuit of a backdoor that would allow access to encrypted communications, Baker wrote that encryption “is one of the few mechanisms that the United States and its allies can use to more effectively protect themselves from existential cybersecurity threats, particularly from China. This is true even though encryption will impose costs on society, especially victims of other types of crime.”
What triggered Baker to write the piece was the recently renewed push by the Justice Department under William Barr to raise again the idea that law enforcement is “going dark” thanks to the rise of end-to-end communications encryption, unable to track terrorists and predators as they carry out their misdeeds. After three years of relative silence from federal law enforcement, Barr gave a speech on July 23 in which he called for “lawful access” to encrypted communications, saying that so-called “warrant-proof” encryption is “seriously impairing our ability to monitor and combat domestic and foreign terrorists.” He asked Silicon Valley to come up with technological solutions, warning that sooner or later a significant incident would “galvanize” public opinion against encryption.
In early October, the Justice Department sent a letter to Mark Zuckerberg asking Facebook not to proceed with its end-to-end encryption plans for its Messenger service after the US agreed with the UK to allow the two countries’ respective law enforcement agencies to demand electronic data regarding serious crimes. The next day, the Justice Department held what it called a Summit on Lawful Access, during which Barr and FBI Director Christopher Wray raised again the need for some encryption solutions that would give law enforcement access to secured communications.
Although Baker in his piece spells out a number of good reasons why he thinks the feds should just give up on the notion of encryption backdoors, he hits the nail on the head when he writes: “there is no law that clearly empowers governmental actors to obtain court orders to compel third parties (such as equipment manufacturers and service providers) to configure their systems to allow the government to obtain the plain text (i.e., decrypted) contents of, for example, an Android or iPhone or messages sent via iMessage or WhatsApp.”
Baker doesn’t explicitly cite it, but one significant impediment to the government’s ability to compel third parties to alter their systems is the First Amendment right to free speech, specifically the free speech rights of equipment makers and service providers themselves. Despite his omission of a First Amendment analysis, there is a long history of complex legal developments that wrestle with the notion of whether encryption is entitled to First Amendment protection and whether encryption code or technology qualifies as “speech” entitled to protection from government interference.
Starting in 1993, the National Security Agency (NSA) promoted what it called the “Clipper Chip,” an encryption chip with built-in backdoors. NSA wanted communications companies to use the Clipper Chip for voice communications to intercept potential international criminals and terrorist operatives. The Clinton Administration advocated for the Clipper Chip, saying law enforcement needed such a backdoor to track criminals and terrorists.
Although the problems that led to the ultimate demise of the Clipper Chip were primarily technological and privacy-related, not to mention telecommunications providers’ failure to adopt the chip, legal scholars and even some government attorneys at the time raised constitutional objections to the technology on First, Fourth, and Fifth Amendment grounds.
Then a famous case brought in 1995, known as Bernstein v. United States, involving UC Berkeley student Daniel Bernstein, who wanted to publish encryption code, ultimately produced a Ninth Circuit Court of Appeals ruling four years later that software source code is indeed speech protected by the First Amendment.
In 2016, Apple fought a high-profile battle with the FBI under James Comey over the Bureau’s demands that the iPhone giant be required to develop a system for bypassing the security features of the iPhone used by the San Bernardino shooter. A federal judge supported the FBI in its request and ordered Apple to help the Bureau.
In fighting that judge’s order, Apple laid out, among other things, a strong First Amendment case against complying with the FBI’s demands, saying that forcing the company to write software that would neutralize safety features was tantamount to government-compelled speech, a clear violation of the First Amendment. “Under well-settled law, computer code is treated as speech within the meaning of the First Amendment,” Apple’s attorneys wrote in their motion to vacate the federal judge’s order.
Of particular importance in the history of jurisprudence surrounding this issue is the fact that the FBI ordered Apple to cryptographically sign its security-bypassing code using the company’s proprietary methods. “This amounts to compelled speech and viewpoint discrimination in violation of the First Amendment,” Apple argued, saying that “[t]he Supreme Court has made clear that where, as here, the government seeks to compel speech, such action triggers First Amendment protections.”
Although Apple’s argument was not tested any further because the FBI abandoned its pursuit once it used other methods to access the San Bernardino shooter’s phone, the question arises again in light of Barr’s backdoor push: Would any effort by the federal government to force tech companies to build encryption backdoors be doomed to fail on First Amendment grounds?
To help answer that question, CSO talked with one of the country’s top First Amendment lawyers, Bob Corn-Revere of Davis Wright Tremaine, whose history on this question goes back to the Bernstein case, on which he worked with the Electronic Frontier Foundation in representing Bernstein. (Before that, Corn-Revere represented Phil Zimmermann, the creator of the Pretty Good Privacy program, the first widely available program implementing public-key cryptography. In the early 90s, Zimmermann landed in hot water for allegedly violating the Arms Export Control Act, which barred the export of cryptographic software because it was then considered a munition. The government ultimately dropped the case.)
“It’s a very complicated subject,” Corn-Revere says. “The issue comes back in various forms over time.”
Despite Apple’s arguments in the San Bernardino case, “there was quite a bit of disagreement, even among First Amendment experts, about whether or not forcing Apple to create a program to decrypt the phones was a First Amendment issue. It’s not settled.”
From Corn-Revere’s perspective, though, “If you force someone to create a program, you are compelling speech, and compelled speech is no different from banning speech in First Amendment terms.”
Others, however, would argue that cryptographic software is functional, and that the closer an activity gets to the conduct side of the line, as opposed to the speech side, the less First Amendment protection it will receive, Corn-Revere says.
Barr, in his defense of the lawful access concept, relies heavily on the Fourth Amendment right against unreasonable searches and seizures, which inherently balances the “individual citizen’s interest in conducting certain affairs in private and the general public’s interest in subjecting possible criminal activity to investigation,” he said during his July speech. On those grounds, Barr said, tech companies need to compromise, need to weigh the risk of insecure communications against the need to protect the public.
Corn-Revere, however, argues that unlike the Fourth Amendment, the First Amendment and the Fifth Amendment right against self-incrimination do not, as written, contain balancing requirements. Compelling someone to speak or to take an oath, for example, is unconstitutional without question and without balancing factors, at least in terms of the plain language of the Constitution.
While the government and courts have grappled with the introduction of new technology in weighing First Amendment rights, particularly regarding broadcast and cable television, when it comes to the internet, courts have granted it full First Amendment protection. “What’s interesting about the internet is that it was the first technology to come along where, the first time the Supreme Court looked at it, it said, ‘Oh, yeah, it gets full protection,’” Corn-Revere says, referring to the case of Reno v. ACLU.
“It is a new and complicated area. I think courts are reluctant to draw hard lines that say government can never do X or Y,” Corn-Revere says. From a pure Fourth Amendment perspective, the government has a better shot of requiring encrypted communications access. “It is a complicated question and one that is at this point, relatively untested in courts.”