That's the question a BetaNews reader asked me earlier today, when forwarding news that esteemed researcher Charlie Miller had gotten the shaft from Apple. Miller released an app that exposed a serious security flaw in iOS. His reward: Banishment from Apple's developer program, for one year. Perhaps longer.
I asked colleague Ed Oswald to write the news story. My followup here seeks to answer the question asked by the reader: "Can you imagine if Microsoft took this approach?" No, because that would go against Microsoft's security policies. But I can imagine the response had Microsoft done something like this -- punish a respected researcher for bringing a major security flaw to its attention. Vilification. Condemnation. Damnation. In blogs. In news commentaries. On social networks. And Apple? There is little noise at all. Once again Apple can do no wrong.
From the most straightforward perspective, Miller got what he deserved from Apple. He released to the App Store a stock-ticker app that actually gave him broad control over devices that installed it. He was pretty much free to do what he wanted with them, although he chose to do nothing. That he released an app that did something other than its stated purpose violated Apple's terms of service. The app also left iOS devices vulnerable to attack and takeover, although it can be argued, as Miller does, that the vulnerabilities were already there -- which was the point of the app.
But the rogue app reveals something more, which makes Apple's seeming retribution all the more egregious. Apple is renowned for secrecy, particularly about new products and their development. But there also is secrecy about security matters -- security by stealth. Apple discloses little about its products' security problems, other than scant info when releasing occasional updates. According to the company's security website: "For the protection of our customers, Apple does not disclose, discuss or confirm security issues until a full investigation has occurred and any necessary patches or releases are available".
That reads to me like a policy of denial -- or, better stated, non-denial and non-admission. Another reason for Apple secrecy: managing perceptions, the company's public image. Secrecy serves that objective exceptionally well, particularly around sensitive security matters. Apple's products are perceived to be more secure -- Mac OS better than Windows, iOS more than Android -- and secrecy helps preserve that perception. For developers who can't keep silent, and most do (now why is that?), there is punishment.
It's Miller Time
This isn't the first time Miller has broken Apple's unwritten code of developer silence and embarrassed the company in doing so. In July he exposed something simply unbelievable -- a security vulnerability affecting, get this, Apple laptop batteries. Yeah, batteries. Apple left unchanged the default passwords used for accessing and programming the batteries, which savvy developers could exploit in their applications. Miller published a paper showing how. Apple might as well have set all its employee passwords to "password".
That's just a recent incident, and from an accomplished security researcher. In a July BetaNews post, Larry Seltzer wrote: "Charlie Miller is one of the best-known characters in the vulnerability research business. For years he has been famous as the only person to do serious research on Apple products, especially on Macs. He has a shelf full of Pwnie awards to show how good he is at it".
Miller also is a consecutive-year Pwn2Own winner -- iPhone 4 in 2011 and MacBook Pro, via a patched Safari browser, the three previous years. Perhaps Apple has had enough of his exploits, and the rogue iOS app is an excuse to banish him.
But what if Apple weren't so secretive? Would Miller need, or want, to release an iOS-compromising app? He has been very active on Twitter this week. "I did violate the ToS, but I doubt the ToS let's me do any of the crap I do. So why boot me now?" Miller asks.
It's common practice -- and one that Microsoft was instrumental in establishing -- that researchers disclose vulnerabilities to companies first. The policy is sound; it protects everyone from flaws being exploited before patches are available. Not all security researchers abide by the practice, and others will disclose publicly if they feel the vendor has been unresponsive.
One-and-a-Half-Way Street
In the case of the vulnerability Miller uncovered, he saw demonstration as the way to prove it. "Without a real app in the AppStore, people would say Apple wouldn't approve an app that took advantage of this flaw", he tweeted two days ago. Apple is known for its strict App Store approval policy and the perception that rogue or unsigned code can't sneak through. Miller proved that perception wrong, too. But in doing so, he didn't disclose to Apple beforehand. He explains: "I did contact Apple about vuln 3 weeks ago. Didn't tell'em there was an app. App has been in store since Sept."
Disclosure is fundamental to the work security researchers do with major developers. But with Apple, disclosure is a one-and-a-half-way street. The company expects developers to disclose what they find, yet doesn't publicly reveal as much -- certainly not like Microsoft.
Microsoft's security policy changed in the early Noughties. The company asked researchers not to publicly reveal security flaws before they were patched, and agreed in return to credit those researchers for their discoveries when disclosing the vulnerabilities. Microsoft also expanded the role of the Microsoft Security Response Center, with a mandate of transparency and disclosure and a regular monthly schedule for releasing updates. The amount of security information Microsoft releases is staggering compared to what Apple puts out. Security by stealth isn't sound policy.
You tell me, IT managers dealing with Apple and Microsoft software: do you feel the companies give you equally adequate security information, ranging from best practices to vulnerability disclosure? It's hard to conceive of Microsoft doing the same to a researcher of Miller's stature for exposing a major OS vulnerability.
Former hacker turned security consultant Kevin Mitnick offered Miller advice in a tweet yesterday: "So tell Apple they are on a 1 year suspension for being notified first before you publish their vulnerabilities".