People in glass houses shouldn’t brick phones. But that’s what Apple did to thousands of iPhone owners in September. Let’s take a closer look at this textbook example of two wrongs making for a really bad situation, from which no one emerged victorious.
First, we have iPhone owners who wanted to install application software of their choosing on their devices, unlock their devices for use on a different mobile carrier, or both. It’s pretty likely we’ve all heard or read about the iPhone hacking that took place. The hackers are playing a game of cat and mouse with Apple, and so far it appears to be getting them what they want.
Second, we have Apple trying to protect their (and AT&T’s) commercial interests by attempting to prevent iPhone owners from doing either of the above. They issued a software patch that “bricked” unlocked iPhones, essentially rendering them inoperable.
From there, things went dreadfully wrong, and there are at least a couple of important security lessons we can learn by examining the situation a bit further.
On one hand, we have a mobile device that is owned by, and under the complete physical control of, its end users. On the other hand, the vendor is trying to control what can be done on and with the device: essentially a Digital Rights Management (DRM) issue. As is their prerogative, Apple issues software updates and patches from time to time. Among other things, some of these patches have clearly been intended to thwart the efforts of the iPhone hacking community.
The problem with this approach is the state of Apple’s own software security on the iPhone itself. The iPhone is built on top of Apple’s Darwin (essentially UNIX) kernel, but it turns out that many of the software security lessons learned in the decades since UNIX’s dawn were largely ignored on the iPhone. For example, all of the applications on the iPhone run in a privileged (root) state on the device.
That is a precarious position to be in if their adversaries find even one software glitch…which they did.
At least one of the defects the iPhone hackers exploited was a vulnerability in the iPhone’s Mobile Safari web browser, one that had been patched in Apple’s desktop version of Safari at least a year earlier. The vulnerability, a buffer overflow in TIFF graphics rendering, gave attackers direct read and write access to the file system. Had the browser been running as an unprivileged user or (better still) in a “sandbox,” the same buffer overflow wouldn’t have been nearly so catastrophic for Apple.
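The remedy for this class of problem has been standard UNIX practice for decades: a process that starts with root privileges sheds them before it ever touches untrusted input. Here is a minimal sketch of that privilege-drop idiom in C. The “mobile” account name and the overall structure are my own illustration, not a claim about Apple’s actual code.

/* Hypothetical sketch: the classic UNIX privilege-drop idiom.
 * A process that starts as root sheds those privileges before
 * handling untrusted input (a downloaded image, for example).
 * The "mobile" account name below is a placeholder. */
#include <grp.h>
#include <pwd.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <unistd.h>

static void drop_privileges(const char *username)
{
    struct passwd *pw = getpwnam(username);
    if (pw == NULL) {
        fprintf(stderr, "unknown user: %s\n", username);
        exit(EXIT_FAILURE);
    }

    /* Order matters: clear supplementary groups, then the group ID,
     * then the user ID. Dropping the UID first would leave us without
     * permission to change the others. */
    if (setgroups(0, NULL) != 0 ||
        setgid(pw->pw_gid) != 0 ||
        setuid(pw->pw_uid) != 0) {
        perror("failed to drop privileges");
        exit(EXIT_FAILURE);
    }

    /* Verify the drop is irreversible: regaining root must fail. */
    if (setuid(0) == 0) {
        fprintf(stderr, "privilege drop did not stick\n");
        exit(EXIT_FAILURE);
    }
}

int main(void)
{
    drop_privileges("mobile");  /* placeholder unprivileged account */

    /* From here on, a buffer overflow in, say, a TIFF parser is
     * contained to this user's permissions rather than root's. */
    printf("running as uid %d\n", (int)getuid());
    return EXIT_SUCCESS;
}

Note that the drop is verified before continuing: if the process could still regain root afterward, the program refuses to proceed at all.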
So, putting aside the political aspects of this battle and focusing on the technical ones, it seems that Apple was in a largely untenable position. On one hand, they wanted to protect their intellectual property; on the other, the state of their own software security made that impossible.
What lessons can we learn from this that apply to more general circumstances? I can think of several:
• Don’t trust the client. I’ve discussed this one here more than once. The client, in this case the iPhone device itself, is under the control of its owners. The lengths to which they will go to alter their devices are nearly limitless. At least, I don’t think we’ve seen what the limit is yet.
• If you’re going to try to restrict client activities, you’d better have your own software security in order. Bricking people’s phones is tantamount to launching a volley across your adversary’s bow. It’s a deliberate and unfriendly action to take. Even if you agree that a vendor has the right to do this, it’s a major escalation, and if you’re not prepared for what your adversary will do next, escalating a conflict is not a good idea. Didn’t we learn that on the elementary school playground?
• Today’s mobile devices more closely resemble general-purpose computers than they do the old “dumb” phones of the past. We’ve got to treat their software the way we should treat software on a general-purpose computer. Those old security principles that Saltzer and Schroeder taught us in the 1970s really do turn out to be important! The principle of least privilege, in particular, should have been applied judiciously here (a sketch of the idea follows this list). Shame on Apple.
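To make the least-privilege point concrete, here is one way the risky work (rendering an untrusted image, say) could be isolated: run it in a separate child process that has been jailed with chroot() and stripped of root before it sees a single byte of input. This is my own illustration of the general technique, not Apple’s design; the /var/empty path and the UID/GID of 99 are arbitrary stand-ins.

/* Hypothetical sketch of least privilege via process isolation:
 * do the risky parsing in a child that has been chroot()ed into an
 * empty directory and stripped of root, so a compromise there cannot
 * touch the real file system. Paths and IDs are illustrative. */
#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

static void parse_untrusted_image(void)
{
    /* Imagine the TIFF decoder running here. */
    printf("parsing as uid %d inside the jail\n", (int)getuid());
}

int main(void)
{
    pid_t pid = fork();
    if (pid < 0) {
        perror("fork");
        return EXIT_FAILURE;
    }

    if (pid == 0) {  /* child: build the cage, then do the risky work */
        if (chroot("/var/empty") != 0 || chdir("/") != 0) {
            perror("chroot");
            _exit(EXIT_FAILURE);
        }
        /* Drop root; 99 is an arbitrary unprivileged GID/UID here. */
        if (setgid(99) != 0 || setuid(99) != 0) {
            perror("drop privileges");
            _exit(EXIT_FAILURE);
        }
        parse_untrusted_image();
        _exit(EXIT_SUCCESS);
    }

    /* parent: wait, and treat any child failure as a rejected input */
    int status = 0;
    if (waitpid(pid, &status, 0) < 0) {
        perror("waitpid");
        return EXIT_FAILURE;
    }
    return (WIFEXITED(status) && WEXITSTATUS(status) == 0)
               ? EXIT_SUCCESS
               : EXIT_FAILURE;
}

The design point is that the parent builds the cage before any untrusted data arrives, so even a perfect exploit of the parser yields control of an empty, unprivileged process rather than the whole device.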
We are left with a situation that annoyed many customers and didn’t accomplish what the vendor set out to do. That’s quite a comprehensive failure from where I sit. Now, it seems the situation is improving a little, with Apple announcing an upcoming software development kit that will open up iPhone application development to outside developers. That’s clearly a step in the right direction. Let’s hope they’ve made use of some of UNIX’s security mechanisms by then.
Let’s further hope other product vendors have been paying close attention to this and won’t make the same silly mistakes in the future. But don’t ask me to take that bet.