April 27, 2014

Ecotone: Renegotiating boundaries between humans, animals and machines

Cover image from New York Times Magazine, by Alex Prager

He hangs between; in doubt to act or rest
In doubt to deem himself a god or beast
                         – Alexander Pope

The cover story in the New York Times Magazine today is an article by Charles Siebert on the campaign to win legal recognition of the rights of animals. The piece focuses on lawyer Steven Wise's efforts to establish precedents in court that would make it illegal to imprison and enslave chimpanzees and, eventually, other intelligent animals, including whales, dolphins and elephants.

I bring this up here because Wise’s lawsuit represents half of a pair of parallel phenomena in today’s culture that fascinate me. On the one hand, we seem to feel a need to reassess and redefine our relationship with animals. On the other, we're trying to figure out our relationship with machines.

Underlying both feelings is a recognition that human beings aren’t as unique as we’d previously believed. More and more we’re learning that the thinking and emotional lives of various animal species are far more sophisticated than had long been assumed; more and more we’re aware that at some point in the not-too-distant future our intelligence may be equaled or outpaced by various forms of artificial intelligence. Advances in biology and engineering are disassembling, brick by brick, the walls of particularity we’ve held in place for millennia to preserve the specialness of Homo sapiens.

Poster image from the 1977 film adaptation of H.G. Wells' "The Island of Dr. Moreau"

My own feeling is that the lengths many of us go to today to connect with animals — the attention lavished on dogs and cats, the whale-sighting tours, the resorts that let guests swim with dolphins — are manifestations of our desire to reestablish a vanishing bond with nature in an increasingly technological world. The problem isn’t only that we sense nature’s absence. On some intuitive level we also feel threatened by technology’s encroachment. The more lifelike our technologies become, the easier it is to see them as predators. For millennia we worried about becoming dinner; now we worry about becoming obsolete. Artificial intelligence enthusiasts joke that even when the machines take over, there will always be a place for us. As pets.

A concept that scholar Bruce Mazlish and others use to describe that uneasiness is called "The Fourth Discontinuity." It’s based on a comment by Sigmund Freud. Through most of their history, Freud said, human beings were confident they were at the center of the universe. In the past 450 years, however, that cherished self-image has suffered three major blows. The first of these was the Copernican Revolution, when we learned that Earth is a satellite of the Sun, rather than the other way around. The second was Darwin's On the Origin of Species, which revealed that human beings had descended, like every other species, from animal ancestors. The third, Freud modestly contended, was his theory of psychoanalysis, which demonstrated that our thoughts and our behaviors are not entirely in our control, but instead are influenced by drives and conflicts hidden deep within our subconscious.

The Fourth Discontinuity is a recent addition to the list that takes into account how quickly machine intelligence has advanced in the past fifty years. Just as the second discontinuity acknowledged that we can no longer claim to be an order of nature distinct from and superior to animals, the fourth discontinuity holds that we can no longer claim to be an order of nature that is distinct from and superior to machines. The result is an ongoing identity crisis, one that may help explain why people seem so pervasively angry these days.

I talk about all this at more length in a chapter of my book called “Ecotone.” The American Heritage Science Dictionary defines an ecotone as "a transitional zone between two ecological communities." The definition adds that each of the two overlapping communities in an ecotone retains its own characteristics while also sharing certain characteristics with the other. That speaks to our confusion about where the line between humans, animals and machines should be drawn. The etymology of "ecotone" implies as much: "tónos" is Greek for “tension."

©Doug Hill, 2014

Norbert Wiener: When curiosity becomes worship

As long as automata can be made, whether in the metal or merely in principle, the study of their making and their theory is a legitimate phase of human curiosity, and human intelligence is stultified when man sets fixed bounds to his curiosity. Yet there are aspects of the motives to automatization that go beyond a legitimate curiosity and are sinful in themselves. These are to be exemplified in the particular type of engineer and organizer of engineering which I shall designate by the name of gadget worshiper.

Norbert Wiener, God and Golem, Inc., 1964

April 21, 2014

Heartbleed's Companions: There Will Be Consequences

The Heartbleed bug appears to be a contender for Greatest Internet Security Fuckup of all time, but it also fits in the category of Here We Go Again.

At this point discoveries of massive data breaches have become pretty much routine. A study cited by Farhad Manjoo in the New York Times reported that 814 million data records were exposed in 2013, and that was before Heartbleed was outed.

It’s not data breaches, per se, that concern me, however. Rather, I’d like to point out a couple of Heartbleed’s broader implications.

The first is that we miss Heartbleed’s most important lesson if we think it applies only to the question of Internet security. The designers of all sorts of technologies regularly assure us, and themselves, that their machines and their systems are absolutely safe and secure. Just as regularly we discover they’re wrong. Recent examples include the Deepwater Horizon oil spill, the Fukushima meltdowns, the chemical leak that poisoned West Virginia’s Elk River and Michael Lewis’s revelations about how the stock market has been rigged by flash traders.

Deepwater Horizon
We are similarly assured that we need not fear the even greater risks posed by such developing technologies as nanotechnology and synthetic biology, both of which have the potential to unleash unexpected consequences of Biblical proportions. (See Bill Joy’s famous article in Wired, “Why the Future Doesn’t Need Us,” for details.)

Synthetic biology is especially relevant here because its advocates consciously base both their hopes for advancing the technology and, at least in part, their confidence in its safety on the same open source computing model that gave us Heartbleed.

The thinking in both cases is that there’s safety in numbers. Open source advocates argue that more people looking at computer code makes it more likely that openings for hackers will be prevented or closed. Synthetic biologists argue that the more people who know how to manipulate strands of DNA, the more prepared we’ll be to respond to the accidental release of harmful organisms or a bioterrorist attack.

The Heartbleed bug doesn’t prove that the open source philosophy is false, but it does demonstrate that it’s not perfect. As Farhad Manjoo put it, Heartbleed showed that “the Internet is still in its youth, and vulnerable to all sorts of unseen dangers, including simple human error. Today’s digital systems are complex and penetrate every corner of our lives. It is impossible to lock them down.” 

Manjoo argues that the Internet is less likely to correct its lapses than other large-scale industries have been because of its ubiquity, its complexity and its interdependence, and also because the human beings getting rich off the web pay more attention to building the applications that will make them rich than to ensuring that those applications are safe. I agree with all those contentions except two: the claim that the Internet is different from other large-scale industries, and the suggestion that industries become "locked down" as they mature.

Manjoo credits Upton Sinclair’s The Jungle and Ralph Nader’s Unsafe at Any Speed with helping alert the public and lawmakers to unsafe conditions in the “chaotic, unruly days” of the meatpacking and automobile industries, respectively. (Never mind that Nader’s book came out in 1965.) Those unsafe conditions have since been rectified, Manjoo says, by “a combination of regulation and industrywide cooperation.”

Mary Barra, CEO of General Motors, is sworn in to testify before the House Energy and Commerce subcommittee on Oversight and Investigation

Well, yes and no. Certainly there have been improvements, but Manjoo seems not to have noticed that executives of General Motors have spent a lot of time recently testifying before Congress, trying to explain why they failed to correct a flaw in their ignition systems that killed 13 people over the past decade. And if Manjoo thinks The Jungle solved the problems in the meatpacking industry, he hasn’t read Eric Schlosser’s Fast Food Nation.

The point — painfully obvious yet persistently ignored — is that breakdowns in large-scale technological systems are inevitable, and breakdowns produce consequences. The complexity, ubiquity and interdependence of those systems contribute to that inevitability, as do the oversights and mendacities of the human beings who design and run them (not to mention the oversights and mendacities of the human beings who use them). The scale of consequences will vary from insignificant to catastrophic, but there will be consequences.

This brings me to the second point I’d like to make about Heartbleed.

In the days after the bug was revealed, most of the blame was pinned on the open source engineers who failed to detect it before the updated software was released. This seems to be the sort of explanation intended to make us all feel better. If only proper procedures had been followed, everything would have been fine.

Again, though, the oversight that opened the Heartbleed door is hardly an isolated incident. You don’t have to be a volunteer working on open source software for free to miss a flaw that will cause problems. Plenty of paid professionals do, too. Indeed, the Heartbleed bug was overlooked for more than two years by any number of major companies and institutions using the OpenSSL software that carried it, among them Google, Amazon, Cisco, Facebook, Netflix, Yahoo, the Pentagon and the FBI.

“Given enough eyeballs, all bugs are shallow,” says open source software guru Eric S. Raymond, a maxim he has dubbed “Linus’s Law.”

The Heartbleed fiasco shows that the bug that lays you low can hide in plain sight, no matter how many people are looking. It also affirms Farhad Manjoo's point that Internet companies pay more attention to profits than they do to security.

All systems are fallible, and all systems are vulnerable. Anyone who says different is lying.


Note: This post originally stated that the Heartbleed bug was discovered by an engineer at Google. Other reports tell a different story, so I've eliminated the reference.

Earlier posts related to this subject can be found here, here and here.

Image credits: Heartbleed T-shirt: Martin Mulazzan. Eyeball: Thinkstock

©Doug Hill, 2014


April 14, 2014

A Problem with Reason

"The feelings of our heart, the agitation of our passions, the vehemence of our affections, dissipate all the conclusions of reason."
David Hume

April 13, 2014

The Company They Keep

"On the whole, technical people come to share the perspective of those who wield power rather than those over whom the power is wielded, with managers rather than labor, with officers rather than soldiers. If for no other reason, this happens simply because technical people do their work almost exclusively with the former rather than with the latter, and come to share a world with them. But they have very little, if any, contact with the others, about whom they typically remain woefully ignorant."
David F. Noble, Forces of Production