February 19, 2012

To Infinity and Beyond! (Annals of Childish Behavior™, continued)

Refusing to clean up after ourselves is a childish behavior that has polluted not only the planet but also outer space.

The New York Times reports today that so many pieces of space junk now surround the Earth – the Air Force currently tracks 20,000 of them, some as large as a Greyhound bus – that low-Earth orbits will eventually have to be abandoned if some method of cleaning up the mess isn't found.

“It will be a huge risk for an astronaut to go to space,” said John L. Junkins, a professor of aerospace engineering at Texas A&M University, adding: “No one will insure a space launch.”

The Times article includes a link to a paper that provides a more technical description of what scientists call "the Kessler Syndrome," named for the former head of NASA's office of space debris. That paper concludes, "As is true for many environmental problems, the control of the orbital debris environment may initially be expensive, but failure to control leads to disaster in the long-term."

Is modern culture being overwhelmed by an epidemic of childishness? José Ortega y Gasset, writing in 1930, thought so. Annals of Childish Behavior™ chronicles contemporary examples of that epidemic. The childish citizen, Ortega said, puts "no limit on caprice" and behaves as if "everything is permitted to him and that he has no obligations."

February 18, 2012

O'Reilly and Me

Just a note to say that O'Reilly Radar picked up my Falling Man essay this week.


Annals of Childish Behavior™ (continued)

"At the root of the reality distortion field was Jobs's belief that the rules didn't apply to him. He had some evidence for this; in his childhood, he had often been able to bend reality to his desires. Rebelliousness and willfulness were ingrained in his character. He had the sense that he was special, a chosen one, an enlightened one…. Even in small everyday practices, such as not putting a license plate on his car and parking it in handicapped spaces, he acted as if he were not subject to the strictures around him."
                                                             Steve Jobs, by Walter Isaacson


Talking Technology!

The New York Times published a report on the start of the Japanese government's attempt to "rehabilitate" the more than 8,000 square miles of fields, mountains, forests, office buildings, schools, roads, homes, businesses – everything – contaminated by the post-tsunami meltdowns of the Fukushima Daiichi nuclear reactors last March. An initial $13 billion has been allocated for the project, but how effective it will be is anyone's guess: the instruction manual for scrubbing down a radioactive country has yet to be written.

“We are all amateurs,” said a worker wiping down windows at an abandoned school. “Nobody really knows how to clean up radiation.” 

The hope is that if the campaign is successful, 80,000 or more people displaced by the meltdowns may be able to return to their homes. Early eradication attempts have proved disappointing, however, in part because residents have been reluctant to allow tons of collected radioactive dirt to be stored in their communities. Another concern is that even if radioactivity can be cleaned from towns and villages, more radiation will be deposited there by the wind or rain or washed down from the surrounding hills.

There's also considerable anger over the fact that the $13 billion allocated for the cleanup so far is going to the same giant construction companies that built the nuclear plants in the first place. "It's a scam," one critic told the Times. "The Japanese nuclear industry is run so that the more you fail, the more money you receive."

Photo credit: Reuters

Talking Technology!

According to the Centers for Disease Control, overdoses of prescription pain meds are up 90% since 1999 and accidental ODs now kill more Americans than car crashes.
                                        New York Daily News


February 10, 2012

RIP Roger Boisjoly: Some Underappreciated Lessons of the Challenger Disaster

Roger Boisjoly has died.

The name may not ring a bell, but Boisjoly's place in history certainly will: He was the engineer who tried in vain to persuade NASA that it was unsafe to launch the space shuttle Challenger on January 28, 1986.

The Challenger explosion remains today one of our most evocative images of technology gone wrong. This is due in part to the personal nature of the tragedy – the schoolteacher onboard, the family members watching – and in part to the subsequent revelations that NASA proceeded with the launch despite Boisjoly's warnings.

My intention here is not to rehash the chain of events that led to the Challenger's demise, but to show how some of those events demonstrate patterns of error that are commonplace – indeed, almost inevitable – in the operation of complex technological systems. 

These thoughts have been inspired mainly by the analysis of the Challenger explosion provided by Harry Collins and Trevor Pinch in their book, The Golem at Large: What You Should Know About Technology. Other key sources include Charles Perrow's Normal Accidents: Living With High-Risk Technologies and Jay Hamburg's reporting in The Orlando Sentinel.

I'll group the patterns to be discussed – let's call them Underappreciated Contributing Dynamics – in two categories, the first involving the question of certainty, the second involving the consequences of human interaction with machines.

Underappreciated Contributing Dynamic #1: There is no certainty.

The Challenger explosion is thought to have occurred because the O-rings sealing the joints between sections of the booster rockets that powered the shuttle's ascent into space failed to seal properly. The failed seals allowed a tiny gap to form between the sections. Flaming gas leaked through the gap and exploded.

The conventional wisdom is that NASA bureaucrats, anxious to press forward with the launch largely for public relations reasons, ignored the warnings of Boisjoly and others who recognized the danger and tried to stop the launch. There's truth to that narrative – comforting truth, because it reassures us that if we only follow the proper procedures, such accidents can be prevented. In practice, it's not that simple.

Engineers at NASA and Morton Thiokol, the contractor responsible for building the booster rockets, had known for years that there was a problem with the seals. The question was not only what was causing the problem and how to fix it, but also whether the problem was significant enough to require fixing.

According to Collins and Pinch, the O-rings were just one of many shuttle components that didn't perform perfectly and about which engineers had doubts. To this day, they add, we can't be sure the O-rings were the sole cause of the explosion. "It is wrong," they write,

to set up standards of absolute certainty from which to criticize the engineers. The development of an unknown technology like the Space Shuttle is always going to be subject to risk and uncertainties. It was recognized by the working engineers that, in the end, the amount of risk was something which could not be known for sure.

Part of the uncertainty regarding the O-rings was that NASA and Morton Thiokol could never determine exactly how large the gaps in the seals became in liftoff conditions, and thus how serious a danger they represented. Countless tests were run trying to answer that question, but they consistently produced inconsistent results. This was so in part because NASA's and Morton Thiokol's engineers couldn't agree on which measuring technique to trust. Each side, say Collins and Pinch, believed its methods were "more scientific," and therefore more reliable.

Charles Perrow writes that the inability to pinpoint the source of technical failures is especially common in what he calls "transformation" systems, such as rocket launches or nuclear power plants: the intricacy of the relationships between parts and processes ("tight coupling") makes it impossible to separate cause and effect. "Where chemical reactions, high temperature and pressure, or air, vapor or water turbulence [are] involved," he writes,

we cannot see what is going on or even, at times, understand the principles. In many transformation systems we generally know what works, but sometimes do not know why. These systems are particularly vulnerable to small failures that 'propagate' unexpectedly, due to complexity and tight coupling.

Roger Boisjoly's suspicion that cold weather was the source of the Challenger's O-ring problem was just that – a suspicion. As of the night before the Challenger launch, he had some evidence to back up his suspicion, but not enough to prove it. On the strength of Boisjoly's concerns, his superiors at Morton Thiokol initially recommended that the launch be delayed, but NASA's managers insisted on seeing data that quantified the risk. Unable to provide it, Morton Thiokol's managers reversed their recommendation, and the launch was approved. 

Roger Boisjoly

Underappreciated Contributing Dynamic #2: The Double Bind of the Human Factor

We know now that Morton Thiokol's managers should have supported their engineer's conclusions and held their ground, and that NASA, upon hearing there was a possibility of catastrophic failure in cold weather, should have exercised caution and postponed the launch. Again, all that is true, but it's not the whole truth. To pin the blame on irresolute and impatient managers is to underestimate the complexities of the human dynamics that led to the decision.

We like to think that sophisticated machines are reliable in part because they eliminate human error. In truth, complex technological systems always include a human component, and therein lies the dilemma. There's no shortage of examples before and after Challenger proving that the interaction of human beings and machines can end badly. It's also well known that we ask for trouble when we unleash powerful technologies without including human judgment in the mix. Human beings: can't live with 'em, can't live without 'em.

A subcategory of the human factor dilemma is what Charles Perrow calls the "double penalty" of high-risk systems. The complexity of those systems means that no single person can know all there is to know about the myriad elements that make them up. At the same time, when the system is up and running, one central person needs to be in control. This is especially true in crisis situations, when the person in control is called upon to take, as Perrow puts it, "independent and sometimes quite creative action." Thus complex technological systems present us with built-in "organizational contradictions."

Communication issues can exacerbate those organizational contradictions. Middle-level managers, for example, may decide that it's unnecessary to pass relevant information up the chain of command. In Challenger's case, many of NASA's senior executives were unaware of the ongoing questions regarding the booster seals. It's likely no one told the astronauts, either. Opportunities for misunderstanding also arise from the manner in which information is offered and the manner in which it's interpreted. On at least two occasions NASA managers shrugged off engineers' warnings about the risks of cold-weather launches because the engineers themselves didn't seem, as far as NASA's managers could tell, that alarmed about them.

Collins and Pinch stress that in many respects the arguments between NASA and Morton Thiokol the night before the Challenger launch were typical of the sorts of arguments engineers and their bosses (also engineers, usually) routinely engage in as they iron out problems in complex technological operations. And, as mentioned above, these were continuations of discussions that NASA and Morton Thiokol had been having over the O-ring problem literally for years.

The longevity of those arguments actually became a barrier to their resolution. Some of the engineers at NASA and Morton Thiokol had invested so much time and energy in the O-rings that they developed a sort of psychological intimacy with them. Believing the problem fell within acceptable margins of risk, they grew comfortable wrestling with it. It was a problem they knew. This is an example of a phenomenon called "technological momentum." Simply put, habits of organizational thought and action become embedded and increasingly resistant to change. Devising an entirely new approach to the booster seals – one that would surely have had its own problems – was a step the shuttle engineers were reluctant to take, given the pressure they were under to move the project forward. Roger Boisjoly was able to look at the booster problem differently because he joined Morton Thiokol several years after the shuttle project had begun.

A major reason NASA's engineers were inclined to resist Morton Thiokol's recommendation that the launch be scrubbed because of the cold weather was that temperature had never before been presented to them as a determinative element in a launch/no launch decision. This wasn't Roger Boisjoly's fault: the freezing temperatures on the eve of the launch were a fluke, and therefore presented conditions that hadn't been encountered before. Nonetheless the novelty of Boisjoly's theory helped sway the consensus against him, as did his admitted lack of definitive data.

"What the people who had to make the difficult decision about the shuttle launch faced," Collins and Pinch write,

was something they were rather familiar with, dissenting engineering opinions. One opinion won and another lost, they looked at all the evidence they could, used their best technical standards and came up with a recommendation.

This may seem a cold assessment in light of what occurred, and Collins and Pinch aren't arguing that the decision the engineers made that night was correct. Obviously it wasn't. Still, the question must be asked: Isn't this exactly the sort of rational decision-making we generally prize in our scientists and technicians?

We understand that human judgment is fallible, but when complex technological systems go awry, we want to insist that it shouldn't be. Which is to wish for another sort of double bind: to have our cake and eat it too.

©Doug Hill, 2012

February 2, 2012

Falling Man

AMC's Mad Men returns in March, but already the advertising for this show about advertising has successfully stirred a bit of controversy.

I refer to the video teasers and posters that exploit the Falling Man motif of the show's opening title sequence. The vertiginous imagery is controversial because it evokes, intentionally or not, one of the most harrowing news photographs ever taken: that of the "falling man" plunging to his death from the World Trade Center on 9/11.

I'm a fan of Mad Men, but I'm also among those who find the title sequence disturbing. That's not because of any personal connection to 9/11, I don't think, although as a longtime resident of New York, it hits close enough. The source of my reaction is the power of the Falling Man photograph itself.

I'm not the first to observe that the Falling Man image is evocative on at least two visceral levels. It captures, in an excruciatingly personal way, the literal terror of 9/11. It also captures what it feels like, existentially, to be living in a world of terrifying uncertainty. The source of our anxiety isn't only terrorism, although that's part of it now. It's about a loss of psychic footing in a world of overwhelming change.

Critics have noted an infatuation in contemporary culture with nostalgia. This isn't surprising, given the degree of change that's engulfed us lately, and that's engulfed us ever since Watt introduced his steam engine. The past, unlike the present, offers something to hold onto. No accident that even as the Industrial Revolution raged around them, Victorians celebrated medieval chivalry and piety, lounging in drawing rooms that excluded, as Lewis Mumford put it, “every hint of the machine.” World War I brutally ended any illusion that the machine could be kept at bay, an awakening depicted on the current season of PBS's Downton Abbey.

Mad Men gets terrific mileage out of nostalgia, but we also enjoy knowing a secret the show's characters mostly don't: that their world is about to be turned upside down. Executive producer Matthew Weiner suggested in a recent interview that the dislocating effects of change may be Mad Men's most important underlying theme. Specifically he noted the plaintive question asked by a character in the third season: "When is everything going to get back to normal?”

We know that change has been a constant of human affairs, of course, but we also know that technology has amplified the pace and scale of change exponentially. It's interesting that Alvin Toffler's concept of "future shock" doesn't get talked about much anymore, despite the fact that the acceleration of technological change responsible for that state of psychological dislocation has, as he predicted, only increased in the decades since he coined the phrase. Would-be tech billionaires are fond of bragging that the application or device they're selling promises to be the most "disruptive" technology to come along since Google and Facebook. But even if they succeed, they'll soon be looking over their shoulders for the next disruptive technology coming round the bend, as Google and Facebook already are.

In one form or another, the Falling Man has become the archetypal figure of the technological era, spinning his way into space from a center that cannot hold. A standard-bearer of Gilded Age displacement was Henry Adams, who in the opening pages of his autobiography described himself wondering, "What could become of such a child of the seventeenth and eighteenth centuries, when he should wake up to find himself required to play the game of the twentieth?...No such accident had ever happened before in human experience. For him, alone, the old universe was thrown into the ash-heap and a new one created."

Adams was far from alone, but it was no surprise he felt that way. Isolation is another symptom of the psychology of modernism – and another primary theme of Mad Men, according to Matthew Weiner. The nineteenth-century versions of "future shock" were Marx's "alienation" and Durkheim's "anomie." In 1897 Durkheim published a study on the alarming rise in the number of suicides across Europe, a rise he attributed to the "morbid disturbance" caused by "the brilliant development of sciences, the arts and industry of which we are the witnesses." The work of centuries, he said, "cannot be remade in a few years."

We're often told that in order to maintain some semblance of balance in the world technology has made, we have to get used to the fact that everything is never going to get back to normal. So it is that the nostalgic appeal of Mad Men is precisely equivalent to that of Downton Abbey: We get to watch complacently as complacency is overturned.

©Doug Hill, 2012

February 1, 2012

Talking Technology!

On Super Bowl Sunday, Jan. 22, 1984, Apple ran one of the most famous TV advertisements of all time. It opened with a gray theater full of people with shaved heads, wearing gray jumpsuits, staring expressionlessly at a large screen. From the screen, an Orwellian “Big Brother” intoned, “We are one people, one whim, one resolve, one course. Our enemies shall talk themselves to death, and we shall bury them with their own confusion. We shall prevail.”

As he spoke, a blond woman ran into the theater, bearing a sledgehammer. She threw it at the screen, and the screen exploded. An off-camera voice declared, “On Jan. 24, Apple Computer will introduce Macintosh. And you’ll see why 1984 won’t be like 1984.” Today, more than two decades later, the message remains tremendously powerful: Innovative technology in the hands of brave people can free us all from tyranny.

Twenty-five years later, in the fall of 2009, Apple officially launched the iPhone in China in partnership with a domestic mobile carrier, China Unicom. As a condition for entry into the Chinese market, Apple had to agree to the Chinese government’s censorship criteria in vetting the content of all iPhone apps available for download on devices sold in mainland China. (Most apps are created by independent developers— individuals, companies, or organizations—and then submitted to Apple for approval and inclusion in its app store.) On Apple’s special store for the Chinese market, apps related to the Dalai Lama are censored, as is one containing information about the exiled Uighur dissident leader Rebiya Kadeer. Apple similarly censors apps for iPads sold in China. So much for that revolutionary, Big Brother-destroying Super Bowl ad. Apple seems quite willing to accommodate Big Brother’s demands for the sake of market access.

Rebecca MacKinnon, from an excerpt on Slate of her new book, Consent of the Networked: The Worldwide Struggle for Internet Freedom.