The Heartbleed bug appears to be a contender for Greatest Internet Security Fuckup of all time, but it also fits in the category of Here We Go Again.
At this point discoveries of massive data breaches have become pretty much routine. A study cited by Farhad Manjoo in the New York Times reported that 814 million data records were exposed in 2013, and that was before Heartbleed was outed.
It’s not data breaches, per se, that concern me, however. Rather, I’d like to point out a couple of Heartbleed’s broader implications.
The first is that we miss Heartbleed’s most important lesson if we think it applies only to the question of Internet security. The designers of all sorts of technologies regularly assure us, and themselves, that their machines and their systems are absolutely safe and secure. Just as regularly we discover they’re wrong. Recent examples include the Deepwater Horizon oil spill, the Fukushima meltdowns, the chemical leak that poisoned West Virginia’s Elk River and Michael Lewis’s revelations about how the stock market has been rigged by flash traders.
[Image: Deepwater Horizon]
Synthetic biology is especially relevant here because its advocates consciously base both their hopes for advancing the technology and their confidence in its safety, at least in part, on the open source computing model that gave us Heartbleed.
The thinking in both cases is that there’s safety in numbers. Open source advocates argue that more people looking at computer code makes it more likely that openings for hackers will be prevented or closed. Synthetic biologists argue that the more people who know how to manipulate strands of DNA, the more prepared we’ll be to respond to the accidental release of harmful organisms or a bioterrorist attack.
Manjoo argues that the Internet is less likely to correct its lapses than other large-scale industries have been because of its ubiquity, its complexity and its interdependence, and also because the human beings getting rich off the web pay more attention to building the applications that will make them rich than they do to ensuring that those applications are safe. I agree with all those contentions except two: the claim that the Internet is different from other large-scale industries, and the suggestion that industries become “locked down” as they mature.
Manjoo credits Upton Sinclair’s The Jungle and Ralph Nader’s Unsafe at Any Speed with helping alert the public and lawmakers to unsafe conditions in the “chaotic, unruly days” of the meatpacking and automobile industries, respectively. (Never mind that Nader’s book came out in 1965.) Those unsafe conditions have since been rectified, Manjoo says, by “a combination of regulation and industrywide cooperation.”
[Image: Mary Barra, CEO of General Motors, is sworn in to testify before the House Energy and Commerce subcommittee on Oversight and Investigation]
Well, yes and no. Certainly there have been improvements, but Manjoo seems not to have noticed that executives of General Motors have spent a lot of time recently testifying before Congress, trying to explain why they failed to correct a flaw in their ignition systems that killed 13 people over the past decade. And if Manjoo thinks The Jungle solved the problems in the meatpacking industry, he hasn’t read Eric Schlosser’s Fast Food Nation.
The point — painfully obvious yet persistently ignored — is that breakdowns in large-scale technological systems are inevitable, and breakdowns produce consequences. The complexity, ubiquity and interdependence of those systems contribute to that inevitability, as do the oversights and mendacities of the human beings who design and run them (not to mention the oversights and mendacities of the human beings who use them). The scale of consequences will vary from insignificant to catastrophic, but there will be consequences.
This brings me to the second point I’d like to make about Heartbleed.
In the days after the bug was revealed, most of the blame was pinned on the open source engineers who failed to detect it before the updated software was released. This seems to be the sort of explanation intended to make us all feel better. If only proper procedures had been followed, everything would have been fine.
Again, though, the oversight that opened the Heartbleed door is hardly an isolated incident. You don’t have to be a volunteer working on open source software for free to miss a flaw that will cause problems. Plenty of paid professionals do, too. Indeed, the Heartbleed bug was overlooked for more than two years by any number of major companies and institutions using the OpenSSL software that carried it, among them Google, Amazon, Cisco, Facebook, Netflix, Yahoo, the Pentagon and the FBI.
“Given enough eyeballs, all bugs are shallow,” says open source software guru Eric S. Raymond.
The Heartbleed fiasco shows that the bug that lays you low can hide in plain sight, no matter how many people are looking. It also affirms Manjoo’s point that Internet companies pay more attention to profits than to security.
All systems are fallible, and all systems are vulnerable. Anyone who says different is lying.
###
Note: This post originally stated that the Heartbleed bug was discovered by an engineer at Google. Other reports tell a different story, so I've eliminated the reference.
Earlier posts related to this subject can be found here, here and here.
Image credits: Heartbleed t-shirt: Martin Mulazzan. Eyeball: Thinkstock
©Doug Hill, 2014