Reuters today reported that Apple bowed to pressure from the Federal Bureau of Investigation (FBI), which reportedly demanded that the Cupertino firm drop plans to roll out end-to-end encryption for iCloud device backups, claiming the move would harm investigations.
Although Apple withstood pressure from the FBI in 2016, when the agency wanted it to add a backdoor to iOS that would bypass the code limiting passcode guesses to ten consecutive attempts, the company has never employed end-to-end encryption for iOS device backups in iCloud, and now we know why.
From the report:
The tech giant’s reversal, about two years ago, has not previously been reported. It shows how much Apple has been willing to help US law enforcement and intelligence agencies, despite taking a harder line in high-profile legal disputes with the government and casting itself as a defender of its customers’ information.
According to a former Apple employee, the Cupertino tech giant was motivated to avoid bad PR and didn’t want to be painted by public officials as an enterprise that protects criminals.
“They decided they weren’t going to poke the bear anymore,” the person said. Another employee said, “legal killed it, for reasons you can imagine”.
Apple openly admits to providing iOS device backups from iCloud to law enforcement agencies, according to the company’s latest Transparency Report:
Examples of such requests are where law enforcement agencies are working on behalf of customers who have requested assistance regarding lost or stolen devices. Additionally, Apple regularly receives multi-device requests related to fraud investigations. Device-based requests generally seek details of customers associated with devices or device connections to Apple services.
Here’s what end-to-end encryption would mean in terms of keeping iOS device backups in iCloud safe from government requests, according to 9to5Mac’s Benjamin Mayo:
End-to-end encryption works by making an encryption key based on factors that are not stored on the server. This may mean entangling the key with a user password or some cryptographic key stored on the hardware of the local iPhone or iPad. Even if someone hacked into the server and got access to the data, the data would look like random noise without having the entangled key to decode it.
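The idea of entangling the key with a user password can be sketched in a few lines. This is an illustrative toy, not Apple’s actual scheme: a slow key-derivation function turns the password plus a per-user salt into the encryption key, so a server that stores only the salt and ciphertext cannot recover the plaintext.

```python
import hashlib
import os

def derive_key(password: str, salt: bytes) -> bytes:
    # Hypothetical parameters: 200,000 PBKDF2-HMAC-SHA256 iterations.
    # A deliberately slow KDF makes every brute-force guess expensive.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

salt = os.urandom(16)           # stored server-side; harmless on its own
key = derive_key("correct horse battery staple", salt)

# The same password and salt always yield the same 32-byte key...
assert key == derive_key("correct horse battery staple", salt)
# ...while a wrong password yields a completely different key.
assert key != derive_key("wrong password", salt)
```

Without the password, the derived key, and therefore the backup contents, would look like random noise to anyone holding only the server’s copy.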
Apple currently stores iCloud backups in a manner that is not end-to-end encrypted.
This means that the decryption key is stored on Apple’s servers. If a police entity comes to Apple with a subpoena, then the company has to give over all of the iCloud data — including the decryption key. This has further rounds of ramifications. For instance, whilst the iMessage service is end-to-end encrypted, the conversations stored in an iCloud backup are not.
In other words, even though the Messages in iCloud feature that keeps your messaging synced between devices uses end-to-end encryption, it becomes meaningless if you enable iCloud Backup because your device backup then includes a copy of the key protecting your Messages.
According to Apple, this ensures you can recover your Messages should you lose access to iCloud Keychain and all the trusted devices in your possession. “When you turn off iCloud Backup, a new key is generated on your device to protect future messages and isn’t stored by Apple,” the company claims in a support document on its website outlining iCloud security.
Moreover, Apple employs iCloud end-to-end encryption selectively for things like your calendar entries, the Health database, iCloud Keychain and saved Wi-Fi passwords, but not for your photos, files in iCloud Drive, emails and other categories.
Despite FBI officials’ recent accusations that Apple helps terrorists and sexual predators by refusing to “unlock” iPhones, Forbes reporter Thomas Brewster revealed that the agency used GrayShift’s GrayKey tool to obtain data from a locked iPhone 11 Pro Max during a recent criminal investigation.
That doesn’t mean that Apple’s latest iPhones are inherently insecure or prone to hacking with tools like the GrayShift device or Cellebrite’s software; it just means that iOS can be hacked. By no means does it imply that the embedded Secure Enclave, the cryptographic coprocessor that controls encryption and evaluates a passcode or Face ID/Touch ID, has been compromised.
Tools like GrayKey guess the passcode by exploiting flaws in the iOS operating system to remove the limit of ten passcode attempts. With that safeguard gone, they simply mount a brute-force attack, automatically trying thousands of passcodes until one works.
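With the attempt limit out of the way, the brute-force step itself is trivial. Here’s a toy sketch under invented assumptions (a salted-hash “verifier” stands in for the device’s real key hierarchy, and all names are hypothetical):

```python
import hashlib

SALT = b"demo-salt"  # hypothetical fixed salt for the toy example

def verifier(passcode: str) -> bytes:
    # Stand-in for whatever a real tool checks; real attacks target
    # the device's key hierarchy, not a simple stored hash.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), SALT, 100)

target = verifier("7391")  # verifier for the unknown passcode

def crack_four_digit(target: bytes):
    # Try every 4-digit passcode, 0000 through 9999, until one matches.
    for n in range(10_000):
        guess = f"{n:04d}"
        if verifier(guess) == target:
            return guess
    return None

print(crack_four_digit(target))  # → 7391
```

The loop is exactly why a short numeric passcode falls so quickly: the entire 4-digit keyspace is only 10,000 candidates.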
Jack Nicas, writing for The New York Times:
That approach means the wild card in the Pensacola case is the length of the suspect’s passcode. If it’s six numbers — the default on iPhones — authorities almost certainly can break it. If it’s longer, it might be impossible.
A four-number passcode, the previous default length, would take on average about seven minutes to guess. If it’s six digits, it would take on average about 11 hours. Eight digits: 46 days. Ten digits: 12.5 years.
If the passcode uses both numbers and letters, there are far more possible passcodes — and thus cracking it takes much longer. A six-character alphanumeric passcode would take on average 72 years to guess.
It takes 80 milliseconds for an iPhone to compute each guess. While that may seem small, consider that software can theoretically try thousands of passcodes a second. With the delay, it can try only about 12 a second.
Your key takeaway should be that the 80 millisecond processing time for passcode evaluation cannot be bypassed by hackers because that limitation is enforced in hardware by the Secure Enclave.
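The quoted figures check out under a simple model: 80 milliseconds per guess, with the attacker searching half the keyspace on average before hitting the right passcode.

```python
GUESS_SECONDS = 0.08  # Secure Enclave-enforced cost per attempt

def average_crack_seconds(alphabet_size: int, length: int) -> float:
    # On average, half of all possible passcodes are tried before a hit.
    return (alphabet_size ** length / 2) * GUESS_SECONDS

print(average_crack_seconds(10, 4) / 60)            # ≈ 6.7 minutes
print(average_crack_seconds(10, 6) / 3600)          # ≈ 11.1 hours
print(average_crack_seconds(10, 8) / 86400)         # ≈ 46.3 days
print(average_crack_seconds(10, 10) / 31_557_600)   # ≈ 12.7 years
# Six case-sensitive alphanumeric characters: 26 + 26 + 10 = 62 symbols.
print(average_crack_seconds(62, 6) / 31_557_600)    # ≈ 72 years
```

Note that the 72-year figure for a six-character alphanumeric passcode assumes a case-sensitive alphabet of 62 symbols; the arithmetic lines up with the Times’ numbers almost exactly.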
So, what does all of the above amount to?
As noted by Daring Fireball‘s John Gruber, if you’re concerned about your phone being hacked, use an alphanumeric passphrase as your passcode, not a 6-digit numeric passcode.
And when it comes to encryption, Apple contends that it cannot and will not subvert the encryption on the device, noting the following in its latest Transparency Report:
We have always maintained there is no such thing as a backdoor just for the good guys. Backdoors can also be exploited by those who threaten our national security and the data security of our customers. Today, law enforcement has access to more data than ever before in history so Americans do not have to choose between weakening encryption and solving investigations. We feel strongly encryption is vital to protecting our country and our users’ data.
Circling back to the Reuters story, I’m expecting additional attempts by the US government to gather public support for making encryption illegal.
How do you feel about the latest Apple vs. FBI, and encryption in general?
Let us know in the comments down below!