First, we have iPhone owners who wanted to install application software of their choosing on their devices, unlock their devices for use on a different mobile carrier, or both. It's pretty likely we've all heard or read about the iPhone hacking that took place. The attackers have been playing a game of cat-and-mouse with Apple, and so far their efforts appear to be accomplishing what they intended.
Second, we have Apple trying to protect their (and AT&T's) commercial interests by attempting to prevent iPhone owners from doing either of the above. They issued a software patch that bricked unlocked iPhones, essentially rendering them unworkable.
From there, things went dreadfully bad, and there are at least a couple of important security lessons we can learn by examining things a bit further.
Here we have a mobile device that is owned by, and under the complete control of, its end users. On the other hand, the vendor is trying to control what can be done on and with the device, essentially a Digital Rights Management (DRM) issue. As is their prerogative, Apple issues software updates and patches from time to time. Among other things, some of these patches have clearly been intended to thwart the efforts of the iPhone hacking community.
The problem with this approach, however, is the state of Apple's own software security on the iPhone. The iPhone is built on top of Apple's Darwin (essentially UNIX) kernel, but it turns out that many of the software security lessons learned in the decades since UNIX's dawn were largely ignored on the iPhone. For example, all of the applications on the iPhone run in a privileged (root) state on the device.
That is a precarious position to be in if their adversaries find even one software glitch, which they did.
At least one of the defects the iPhone hackers used was a vulnerability in the iPhone's Mobile Safari web browser, a vulnerability that had been patched in Apple's desktop version of the Safari browser at least a year prior. The vulnerability, a TIFF graphics rendering buffer overflow, gave attackers direct read and write access to the file system. Had the browser been running as a normal user or (better still) in a sandbox, the same buffer overflow wouldn't have been quite so catastrophic for Apple.
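To see why this class of defect is so dangerous, consider the general pattern behind an overflow like the TIFF one: a parser reads a length field from attacker-controlled file data and then copies that many bytes into a fixed-size buffer. This is a minimal illustrative sketch only, not Apple's actual code; the function and field names are invented for the example.

```c
#include <string.h>

#define FIELD_MAX 64   /* hypothetical fixed-size destination buffer */

/* Unsafe pattern: 'len' comes straight from the untrusted file, so an
   attacker who writes a large length field overflows 'dst' and can
   corrupt adjacent memory -- catastrophic in a process running as root. */
void copy_field_unsafe(char *dst, const unsigned char *src, size_t len)
{
    memcpy(dst, src, len);   /* overflows dst whenever len > FIELD_MAX */
}

/* Safe pattern: validate the attacker-controlled length against the
   real destination size before copying, and reject oversized fields. */
int copy_field_safe(char *dst, size_t dst_size,
                    const unsigned char *src, size_t len)
{
    if (len >= dst_size)
        return -1;           /* oversized field: refuse to parse */
    memcpy(dst, src, len);
    dst[len] = '\0';
    return 0;
}
```

Note that the bounds check alone only contains the parsing bug; running the parser as an unprivileged user or in a sandbox is what limits the damage when a check like this is missed.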
So, putting aside the political aspects of this battle and focusing on the technical ones, it seems that Apple was in a largely impossible situation. On one hand, they wanted to protect their intellectual property, but on the other, their own software security lapses made that impossible.
What lessons can we learn from this that apply to more general circumstances? I can think of several:
Don't trust the client. I've discussed this one here more than once. The client, in this case the iPhone device itself, is under the control of its owners. The lengths they will go to in order to alter their devices are nearly limitless. At least, I don't think we've seen the limit yet.
If you're going to try to restrict client activities, you'd better have your own software security in order. Bricking people's phones is tantamount to launching a volley across your adversary's bow. It's a deliberate and unfriendly action to take. Even if you agree with Apple's right to do this, it's a major escalation. If you're not prepared for what your adversary will do next, escalating a conflict is not a good idea. Didn't we learn that on the elementary school playground?
Today's mobile devices more closely resemble general-purpose computers than they do the old dumb phones of the past. We've got to treat their software the way we should treat the software on a general-purpose computer. Those old security principles that Saltzer and Schroeder taught us in the 1970s really do turn out to be important! The principle of least privilege, in particular, should have been applied judiciously here. Shame on Apple.
We are left with a situation that annoyed many customers and didn't accomplish what the vendor tried to do. That's quite a comprehensive failure from where I sit. Now, it seems that the situation is improving a little, with Apple announcing an upcoming software development kit that will open up iPhone application software to other developers. Clearly, this is a step in the right direction. Let's hope they've made use of some of UNIX's security mechanisms by then.
Let's further hope other product vendors have been paying close attention to this and won't make the same silly mistakes in the future. But don't ask me to take that bet.