At the SysCan conference in Taiwan this week, security researcher Charlie Miller will describe a flaw he discovered in the iPhone’s web browser that allows a malicious app installed on the phone to download executable code from a remote server. Miller is well-known for finding security flaws in Apple software, and this latest instance could be the most serious flaw he’s uncovered yet. A hacker who sneaks an app that exploits this vulnerability into the App Store would essentially have free rein over the phones on which it is installed, including access to photos and contacts.
To prove his point, Miller did just that. He submitted and got approved a stock price ticker app called Instastock. Unbeknownst to users who installed it, the app called in to a server at Miller’s home in St. Louis, and from there Miller could control the compromised phones. When this came to light, Apple was irate. It took down the app from the App Store and suspended Miller from the iOS developer program for one year.
The reaction to the news was swift and damning, with headlines such as “Apple kicks out developer for exposing security bug” and “So that’s what happens when you highlight an iOS security hole.” Even Internet guru Lawrence Lessig tweeted to ask whether the Sixth Amendment’s guarantee of a trial by an impartial jury of one’s peers should apply. (Note to Lessig: This is not a criminal case.)
“It appears that Apple is the latest company to take a ‘kill the messenger’ approach to security vulnerabilities,” wrote Mike Masnick at Techdirt, encapsulating the mood. “The obvious implication: don’t search for security vulnerabilities in Apple products, and if you do find them, keep them to yourself.”
But should we really be so surprised by Apple’s reaction? The accepted norm among white hat hackers is “responsible disclosure,” which requires a researcher who finds a vulnerability to report it to the software author and give a deadline after which he will disclose his findings. This satisfies the hacker’s obligation to make the public aware of the vulnerability while also giving the software maker time to prepare a fix. In this case, Miller stretched the boundaries of the norm.
According to Miller, he did contact Apple about the vulnerability in mid-October, and for that information, Apple was no doubt grateful. Miller, however, did not tell Apple about his app, which had been in the App Store since September. The first time Apple heard about it was when he posted a video demonstrating the flaw last week.
Given that the iOS developer agreement clearly prohibits any “attempt to hide, misrepresent or obscure any features, content, services or functionality” in submitted apps, and given that other developers were no doubt watching, Apple did what it had to do and sanctioned Miller.
So let’s be clear: Apple did not ban Miller for exposing a security flaw, as many have suggested. He was kicked out for violating his agreement with Apple to respect the rules around the App Store walled garden. And that gets to the heart of what’s really at stake here: the fact that so many dislike the strict control Apple exercises over its platform.
Miller says that he had no choice but to sneak in his app to demonstrate that it could be done. “For the record, without a real app in the App Store, people would say Apple wouldn’t approve an app that took advantage of this flaw,” he tweeted.
The implicit argument is that the responsible disclosure norm is not compatible with a closed platform like Apple’s. That’s debatable, but it shouldn’t be shocking when the logical consequence follows and Apple holds Miller responsible for his choice to break his agreement.
As similar App Store restrictions come to the Mac, we’ll surely see more stories like this one, with heartless Apple shutting out helpless developers in the name of keeping the walled garden unsullied. What we have to remember is that as strict as Apple may be, its approach is not just “not bad” for consumers, it’s creating more choice.
As New York Law School’s James Grimmelmann recently wrote about the Mac, “Apple is now giving users the best of both worlds, open and closed. Users who want the power of openness can install applications directly. Users who want the safety of closure can install applications from the Mac App Store.”
And as for iOS, as long as there are healthy competitors with open platforms like Android, the same is true. Consumers, developers, and security researchers all now have a choice about which universe they want to inhabit. This is a better state of affairs than an all-open or all-closed world.
For Miller, this means accepting the consequences of breaking his contract with Apple. And the same holds for Apple: security researchers may well decide to suspend it for a year in turn and release vulnerabilities without first informing it. Open and closed platforms are not intrinsically good or bad, but choice is certainly always a good thing.