One of the things Apple appears to pride itself on in its many marketing campaigns is user privacy and security. But do the Cupertino-based company and its staff actually hold these values as dearly and close to the heart as they let on?
Many Apple fans wouldn’t think twice about answering the above question with a firm “yes,” or would at the very least argue that Apple is doing more in this space than other tech companies. But others would push back and argue an entirely different point of view, perhaps for good reason.
In an exclusive correspondence with 08Tc3wBB, we were given a security researcher’s firsthand perspective on the matter. The findings may open your eyes to an entirely new viewpoint, especially if you ordinarily stand behind Apple’s marketing claims.
If you didn’t already know, 08Tc3wBB is a hacker and security researcher who frequently makes headlines after finding security vulnerabilities in Apple’s operating systems. One of his most recognizable achievements was an exploit used by the Odyssey and unc0ver jailbreaks to pwn various versions of iOS 12 & 13. More recently, he announced the discovery of a 0-day vulnerability in iOS 15, though there are no immediate plans to release it publicly.
That impressive track record aside, let’s return to the point at hand…
Apple’s marketing is just that… marketing
Apple touts the security of its many platforms, including iOS, iPadOS, and macOS, on a dedicated web page. The company cites all sorts of security mechanisms designed to “maximize” user security, including the hardware itself, the software that runs on it, curated app and service offerings, and end-to-end data encryption.
But does any of that hold water if Apple’s operating systems don’t receive timely and adequate security patches soon after security researchers report their issues to the company?
08Tc3wBB is of the mindset that Apple makes a genuine effort to keep its users secure. While I believe that’s true to an extent, one thing 08Tc3wBB and I both agree on is that Apple could benefit from establishing internal company practices that reward higher-quality bug fixes, which would improve user security.
The vulnerability that never should have existed
In October of this year, 08Tc3wBB was awarded $52,500 by Apple after reporting a critical kernel vulnerability affecting M1 chip-equipped Macs that, if exploited, would have granted an attacker read and write privileges to the device’s kernel memory. This bug was officially fixed in May, but it’s worth noting that Apple was made aware of it several months earlier.
Exploits exist in virtually all hardware and software combinations, so the existence of one isn’t much of a concern on its own. The concern instead lies in how 08Tc3wBB was able to base much of the M1 vulnerability on details that were discussed in a post published to the Zimperium blog in mid-2017.
The blog post, authored by Adam Donenfeld, details at least seven different kernel-level bugs associated with a little-known driver module dubbed AppleAVE that, for whatever reason, was “neglecting basic security fundamentals, to the extent that the vulnerabilities…were sufficient enough to pwn the kernel and gain arbitrary read/write and root.”
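To make that bug class concrete, here is a minimal userspace sketch in C. Every name and buffer in it is invented for illustration and has nothing to do with the real AppleAVE code; it simply contrasts a handler that blindly copies data to a caller-supplied address (an arbitrary-write primitive of the kind the blog post describes) with one that validates the destination first.

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical stand-ins: a "kernel" region the caller must never
 * touch, and the one buffer the caller legitimately owns. */
static uint8_t kernel_heap[256];
static uint8_t user_buffer[64];

/* Vulnerable pattern: trusts the caller-supplied destination outright,
 * so a malicious caller can overwrite anything it can name. */
static void copy_untrusted(uint8_t *dst, const uint8_t *src, size_t len) {
    memcpy(dst, src, len); /* arbitrary write primitive */
}

/* Safe pattern: only accepts destinations inside the caller's own
 * buffer; anything else is rejected before the copy happens. */
static int copy_checked(uint8_t *dst, const uint8_t *src, size_t len) {
    uintptr_t d = (uintptr_t)dst;
    uintptr_t base = (uintptr_t)user_buffer;
    if (d < base || d + len > base + sizeof user_buffer)
        return -1; /* out-of-bounds destination: refuse */
    memcpy(dst, src, len);
    return 0;
}
```

In this toy model, `copy_untrusted(kernel_heap, payload, 4)` happily corrupts the “kernel” region, while `copy_checked` with the same arguments returns `-1` and writes nothing. The fix is a few lines of validation at the boundary, which is why “neglecting basic security fundamentals” is an apt description of the original bugs.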
Apple allegedly patched these vulnerabilities later on and assigned CVE ID numbers to them. This typically happens when Apple publishes an ‘about the security content of…’ page referencing a specific software update, and you’ve probably seen one of these yourself immediately following a software update for your own iPhone or iPad.
But in this case, 08Tc3wBB describes how Apple merely obfuscated the vulnerabilities by hardening the sandbox rather than fixing them outright. After new sandbox escape bugs re-enabled access to those vulnerabilities in 2019, 08Tc3wBB reported a slew of different vulnerabilities directly related to that same AppleAVE driver module.
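The difference between a mitigation and a root-cause fix is easy to sketch. The hypothetical C below (all names are invented; this is not Apple code) shows why hardening the sandbox around a buggy handler only holds until the next sandbox escape, whereas fixing the handler itself holds regardless of who can reach it.

```c
/* Hypothetical model: non-zero return means the bug fired. */

/* The underlying flaw: a handler that still misuses its input. */
static int buggy_handler(int attacker_controlled) {
    return attacker_controlled; /* non-zero input => "kernel pwned" */
}

/* Response 1 — mitigation only: block sandboxed callers but leave the
 * handler untouched. A new sandbox escape re-exposes the original bug. */
static int driver_call_mitigated(int caller_escaped_sandbox, int input) {
    if (!caller_escaped_sandbox)
        return 0;                /* stopped at the sandbox boundary */
    return buggy_handler(input); /* escape => the same old bug fires */
}

/* Response 2 — root-cause fix: validate the input in the handler's own
 * path, so reachability no longer matters. */
static int driver_call_fixed(int caller_escaped_sandbox, int input) {
    (void)caller_escaped_sandbox; /* irrelevant once the bug is gone */
    if (input != 0)
        return 0;                 /* malformed input rejected outright */
    return buggy_handler(input);
}
```

In this model, the mitigated path is safe only while the sandbox holds; combine it with an escape and the 2017-era bug fires unchanged, which is exactly the pattern 08Tc3wBB describes exploiting in 2019.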
In total, 08Tc3wBB earned $315,500 in bounties from various sources including SSD Secure Disclosure, ZecOps, and Apple, after he was able to re-access and exploit the obfuscated AppleAVE driver module. Apple finally addressed the module’s poor security implementation in a recent update, but what took the company so long to do so?
The circumstances might sound familiar if you remember the dyld overlapping segment problem exploited by the evasi0n jailbreak for iOS 6, as that exploit was only partially fixed until iOS 9.2. It was abused repeatedly by evasi0n, Pangu, TaiG, and others, with only minor adjustments to the exploitation logic, over the course of iOS 6, 7, 8, and 9. This happened because Apple’s early fixes were ineffective, and the problem was only taken more seriously later on.
Starting to connect the dots with the patterns described above? 08Tc3wBB certainly has…
Apple should try to boost morale regarding security research
These stories raise the question: if the AppleAVE driver module was as insecure as described from the very start, then why didn’t Apple’s security team address it properly in 2017 when it was originally disclosed? Perhaps more importantly, why did it take so many additional vulnerability reports related to the AppleAVE driver module, and hundreds of thousands of dollars in bounty payouts, before Apple’s security team officially took acceptable action?
The public may never discover the true answer to these questions since Apple’s security team staff are likely bound by non-disclosure agreements, but 08Tc3wBB is under the impression that motivation could have a lot to do with it.
Why, you ask? When a security researcher reports a bug in any of Apple’s platforms, they receive both public recognition and a considerable sum of money in return. But as highlighted by 08Tc3wBB, Apple’s security team staff don’t receive the same kind of credit, praise, or incentives for their continuous work. If that weren’t unfortunate enough, stringent deadlines roll down the chain of command and pressure those same security team staff members, as would happen in any job.
From here, basic human psychology kicks in, and it’s easy to understand why those security team employees may not be as motivated as they could be to do their best work. Instead, Apple’s current system incentivizes ‘good enough’ fixes delivered in shorter periods of time so that Apple can say it ‘did something about it,’ and this doesn’t benefit the end user in any way.
A direct consequence is that users receive lower-quality, small-effort software patches that are easily and repeatedly exploited by determined security researchers who change their method of attack. In a properly incentivized system, users could receive well-thought-out security patches that would be substantially more difficult for hackers to bypass.
Will this ever change?
Reality certainly hits hard if you were of the mindset that Apple does everything it can to keep your data secure, but 08Tc3wBB believes things could change if Apple simply afforded its security team staff the same levels of recognition, praise, and incentives that it provides to the researchers who discover the vulnerabilities. After all, it’s those staff members who dedicate their time and effort to continuously countering the onslaught of security vulnerabilities.
08Tc3wBB went on to tell us that Apple could and should be more transparent about the vulnerability-patching process, especially toward the security researchers who report the bugs.
Currently, Apple doesn’t provide any feedback to bug reporters regarding how they plan to patch a bug. But they should, because it’s often the security researchers themselves who can provide valuable perspective about how to effectively fix what they find.
It’s evident from Apple’s marketing language that the company takes security more seriously than a lot of other tech companies do. But based on 08Tc3wBB’s observations, it seems that the biggest weakness in Apple’s current bug-patching system is low morale.
To fix this, Apple should consider doubling down on efforts to boost said morale among its internal software security staff with incentives that reward satisfactory and timely bug fixes. This might include working with external security researchers to enhance the quality of those security patches and offering worthwhile recognition for all the hard work that goes into making this all possible.
Our discussions with 08Tc3wBB were indeed insightful. We’d certainly like to hope that Apple discovers a meaningful way to address these concerns, and not just a meaningful way, but the right way. Wouldn’t you agree? Share your thoughts about the matter in the comments section down below.