Apple had intended to offer customers end-to-end encryption for full-device backups uploaded to iCloud. But then the FBI stepped in and put the kibosh on those plans.
The problem, according to law enforcement: Fully locked-down iPhones could be a roadblock to investigations, like the probe into a Saudi Air Force officer who shot three people dead at a Pensacola, Florida, naval base last month.
U.S. Attorney General William Barr publicly asked Apple to unlock the two iPhones the shooter had in his possession. The company eventually did hand over backups from his iCloud account, but the whole ordeal shone a light on the back-and-forth dialogue going on between the U.S. government and tech companies that disagree about whether or not end-to-end encryption should be allowed. Just last month, both Democratic and Republican senators considered legislation to ban end-to-end encryption, using unrecoverable evidence in crimes against children as an example.
Apple had been planning to introduce end-to-end encryption for over two years and had even told the FBI of those plans, according to a Reuters report that cited one current and three former Bureau officials, as well as one current and one former Apple employee. Shortly thereafter, the FBI’s cybercrime agents and its operational technology division came out staunchly opposed, because the change would have made it impossible for Apple to recover people’s messages for use in investigations.
“Legal killed it, for reasons you can imagine,” another former Apple employee told Reuters. “They decided they weren’t going to poke the bear anymore.”
In this case, the bear is the government. In 2016, a nearly identical showdown between the FBI and Apple took place after the two parties got into a legal battle over access to an iPhone owned by a suspect in the San Bernardino, California mass shooting.
The nixed encryption plans are a loss for iPhone users, because end-to-end encryption goes a step beyond today’s industry standard for security: basic encryption. Loads of companies use encryption, which scrambles the contents of a message or some other snippet of data, rendering it useless without the decryption key that can unscramble the gibberish and restore the original.
Under this framework, the company usually holds the decryption key, which means the data isn’t truly safe if a government or hacker gets hold of that key. With end-to-end encryption, though, only the, well, end device, the one receiving the data, holds the key. In theory, that person’s device could still be hacked and the key stolen, but the odds are much lower.
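The key-possession point above can be sketched with a toy cipher. This is illustration only, not real cryptography, and not Apple’s actual scheme: it just shows that the same data is gibberish to anyone without the key and trivially recoverable by whoever holds it.

```python
import hashlib
import itertools

def keystream(key: bytes):
    """Toy keystream: SHA-256 of the key plus a counter.
    Illustration only; NOT a real cipher."""
    for counter in itertools.count():
        yield from hashlib.sha256(key + counter.to_bytes(8, "big")).digest()

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """XOR the data against the keystream; the same call
    encrypts and decrypts."""
    return bytes(b ^ k for b, k in zip(data, keystream(key)))

backup = b"photos, messages, notes"
key = b"device-held secret"  # hypothetical key material

ciphertext = xor_cipher(key, backup)
assert ciphertext != backup                    # unreadable without the key
assert xor_cipher(key, ciphertext) == backup   # key holder restores it
```

The whole debate comes down to where that `key` variable lives: on the company’s servers, where a subpoena or a breach can reach it, or only on the user’s device.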
But that limitation on who has access to the encryption key is the very crux of law enforcement’s issue with end-to-end encryption: If Apple doesn’t have the encryption key to access backups of a person’s iPhone on the cloud, then the government can’t access that data either.
Still, it’s not entirely clear that the government is to blame for this project being killed. It’s entirely possible Apple didn’t want to have to deal with the headache of its customers accidentally locking themselves out of their own data.
For the rest of the world’s smartphone users, those who rely on the Android operating system, end-to-end encryption is already an option. Back in October 2018, Google announced that customers could keep backed-up data from their phones completely locked down with a decryption key that’s randomly generated on the user’s phone and protected by their lock screen PIN, pattern, or passcode.
“By design, this means that no one (including Google) can access a user’s backed-up application data without specifically knowing their passcode,” the company wrote in a blog post. This end-to-end encryption offering is still available.
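The idea of tying a backup key to a lock-screen passcode can be sketched with a standard key-derivation function. This is a simplified illustration, not Google’s actual implementation, which also relies on hardware security modules that rate-limit guesses; the passcode and salt values here are hypothetical.

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes) -> bytes:
    """Stretch a short lock-screen passcode into a 256-bit key
    with PBKDF2-HMAC-SHA256. Illustration only; Android's real
    scheme adds hardware-backed rate limiting."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)

salt = os.urandom(16)            # random salt stored alongside the backup
key = derive_key("1234", salt)   # hypothetical 4-digit PIN

# Only someone who knows the passcode re-derives the same key;
# a wrong guess yields a different key and an unreadable backup.
assert derive_key("1234", salt) == key
assert derive_key("9999", salt) != key
```

Because the derivation happens on the phone and the passcode never leaves it, the provider stores only ciphertext it cannot open on its own, which is exactly the property Google describes in its blog post.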