Apple's goto fail needs a massive culture change to fix

"If that Apple SSL thing happened to Microsoft, literal s*** would be pouring down on Redmond right now. Pouring," tweeted @explanoit on Monday. And, as Kyle Maxwell added soon after, "Gates would be holding public executions in the courtyard". Both of these people show signs of knowing a bit about security. Both are, at least metaphorically speaking, 100 percent correct.


Thousands of words have already been written about Apple's little coding oopsie, so I'll just summarise things before moving on to my key point: Apple seems to have a serious cultural problem.


Secure Sockets Layer (SSL) authentication wasn't working properly in either iOS or OS X. A vast amount of software running on iDevices and Macs believed its encrypted connections were going to the right place, and displayed the reassuring padlock of security, when they may not have been. A key verification step, checking the signature on the server's part of the handshake, simply wasn't being done. Apps could well have been connecting somewhere else entirely, including to an impostor executing a "man in the middle" attack: decrypting and monitoring users' data before re-encrypting it and passing it on to the correct destination.


It is of course hilarious that the actual error consisted of the repeated words "goto fail;".
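Here's the heart of it, lightly abridged from the SSLVerifySignedServerKeyExchange() function in Apple's open-source sslKeyExchange.c (elisions mine, so treat the surrounding details as approximate). The indented second "goto fail;" looks conditional but isn't: it always executes, with err still holding success, so the final hash check and the signature verification are simply skipped.

    static OSStatus
    SSLVerifySignedServerKeyExchange(SSLContext *ctx, bool isRsa,
                                     SSLBuffer signedParams,
                                     uint8_t *signature, UInt16 signatureLen)
    {
        OSStatus err;
        /* ... set up the hash context ... */
        if ((err = SSLHashSHA1.update(&hashCtx, &serverRandom)) != 0)
            goto fail;
        if ((err = SSLHashSHA1.update(&hashCtx, &signedParams)) != 0)
            goto fail;
            goto fail;   /* the duplicate: always taken, with err == 0 */
        if ((err = SSLHashSHA1.final(&hashCtx, &hashOut)) != 0)
            goto fail;   /* never reached: the signature is never verified */
        /* ... */
    fail:
        SSLFreeBuffer(&signedHashes);
        SSLFreeBuffer(&hashCtx);
        return err;      /* 0 (success) when reached via the duplicate */
    }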


The legendary computer scientist Edsger Dijkstra wrote about the risks of the goto statement in programming languages way back in 1968, in his famous letter Go To Statement Considered Harmful, the text of which is available online in both the original 1960s-style formatting (PDF) and more modern typography. "The go to statement as it stands is just too primitive, it is too much an invitation to make a mess of one's program," he wrote. Dijkstra instead promoted the discipline of structured programming.
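Dijkstra's discipline helps even in small doses. A brace-everything house style, for instance, would have contained the slip, because the stray duplicate would stay inside the conditional instead of hijacking the control flow. A toy sketch of my own (not Apple's code):

    /* with mandatory braces, a duplicated goto is redundant but harmless */
    int braced_check(int hash_err)
    {
        int err = 0;
        if ((err = hash_err) != 0) {
            goto fail;
            goto fail;   /* still inside the condition; changes nothing */
        }
        /* later checks still run when hash_err == 0 */
    fail:
        return err;
    }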


Even though I was indoctrinated in structured programming, I don't think the goto is the real problem here. Anyone can screw up code with an ill-judged copy-paste or a slip of the mouse. We've all been there, right? Pointy-haired managers, think "reply all".


But Apple needs to answer some serious questions.


Why wasn't this broken code spotted by some sort of review process before it ended up in a software build? After all, this sort of mistake can even be picked up by various automated code analysis tools, let alone by human reviewers.
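This isn't hypothetical. Clang's -Wunreachable-code warning, for one, flags the dead check that a duplicated goto creates. A toy reproduction of my own (not Apple's code):

    /* compiling with clang -Wunreachable-code flags the final check,
       which is exactly the goto fail symptom */
    int unguarded_check(int a, int b)
    {
        int err = 0;
        if (a != 0)
            goto fail;
            goto fail;   /* the duplicated line */
        if (b != 0)      /* warning: code will never be executed */
            err = -1;
    fail:
        return err;
    }

Dead-code warnings aren't exotic tooling, either; they ship with the very compiler Apple itself sponsors.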


Why wasn't the failure picked up in the testing phase, before the software was published? After all, testing that each step in a security authentication process still works is kind of important.
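A single negative test, one that feeds the verifier a deliberately corrupted handshake and insists it be rejected, would have tripped over this bug immediately. Here's a self-contained sketch of the idea; the toy model and all the names are mine, not Apple's:

    #include <assert.h>
    #include <stdbool.h>

    /* stand-in for the real hash/signature comparison; in this toy model
       only the second stage can detect a tampered handshake */
    static int check_stage(int stage, bool tampered)
    {
        return (tampered && stage == 1) ? -1 : 0;
    }

    /* reproduces the buggy control flow */
    static int verify(bool tampered)
    {
        int err = 0;
        if ((err = check_stage(0, tampered)) != 0)
            goto fail;
            goto fail;   /* the duplicated line */
        if ((err = check_stage(1, tampered)) != 0)   /* never reached */
            goto fail;
    fail:
        return err;
    }

    int main(void)
    {
        assert(verify(false) == 0);   /* honest handshake accepted: passes */
        assert(verify(true) != 0);    /* aborts here: the forgery was accepted */
        return 0;
    }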


Why was a patch for iOS released first, thereby revealing the existence of the problem and giving security researchers, good and evil alike, the opportunity to reverse engineer it and check whether the problem also existed in OS X (it did) before that operating system was patched? After all, both operating systems are produced by the same company. Don't these people talk to each other?


I think we have some cultural problems here, folks.


The apparent lack of communication between the iOS and OS X teams is bad enough. But what's far more worrying is how such a serious error could have escaped detection (let's skip the more tinfoil-oriented explanation that it was a deliberate "mistake" to help the NSA, with a programming error giving Apple plausible deniability), and how the impact of the error is magnified by Apple's complete lack of transparency when it comes to security issues.


"For the protection of our customers, Apple does not disclose, discuss, or confirm security issues until a full investigation has occurred and any necessary patches or releases are available," says Apple. Which means it may know full well about unpatched vulnerabilities, but even if they're being actively exploited, you won't know about them.


Nothing must tarnish the image of Apple's pretty, pretty garden, even if beneath the surface it's rotten. Or poisoned.


That's why I agree with Eugene Kaspersky, head of Kaspersky Lab, who nearly two years ago wrote that when it comes to security, Apple is 10 years behind Microsoft. At the time, I called him a "glorious global megatroll" for that suggestion, but also wrote that Apple's supposed invulnerability is a myth based on ancient history.


Back when Windows was vulnerable to myriad viruses and worms, Bill Gates issued his Trustworthy Computing memo and Microsoft completely re-engineered the way it made software. The result was the Security Development Lifecycle (SDL) methodology. Windows was dramatically improved (well, at least from a security standpoint), so much so that the attackers moved up the stack and tore Adobe's products a new one.


Apple's goto fail is a clear sign that the magic garden needs weeding — or even a good dose of Agent Orange, rather than endless Kool-Aid. But the first step in fixing a problem is admitting that it exists, and Apple has yet to do that. It seems that when it comes to security, Apple still couldn't find its butt with both hands. Perhaps it should be using Apple Maps to help. No, wait.


Disclosure: Stilgherrian has travelled to US security events twice as Microsoft's guest, including a briefing on SDL. He uses a MacBook Pro, having been primarily a Mac user since 1985, and an Android phone.





