I've confessed before to an
annoying fondness for saying "I told you so." Still, I can't help but
note that an article in the New York Times this morning
affirms arguments that are regular themes here.
The article is a profile
of computer scientist Peter G. Neumann, who has warned for decades that the complexity
of the computer systems we rely on makes it "virtually impossible" (as the Times put it) to ensure that they'll run
reliably and safely.
Neumann is recognized as one of the nation's leading
experts on computer security. Nonetheless, the hardware and software industries have
consistently ignored his predictions that the products they sell, as they proliferate, will become increasingly vulnerable to breakdown and attack. His mantra is that complex systems break in complex ways, a principle that's been vindicated by legions of computer bugs and viruses, regular reports of massive data breaches and thefts, and growing fears of cyber warfare.
“His biggest contribution is to stress the ‘systems’ nature of the security and reliability problems,” said Steven M. Bellovin, chief technology officer of the Federal Trade Commission. “That is, trouble occurs not because of one failure, but because of the way many different pieces interact.”
Neumann's current project, Clean Slate, has been funded by the Pentagon's Defense Advanced Research Projects Agency (DARPA). It's an effort to rethink computer design from the ground up. The goal: an entirely new race of machines that's simpler, more stable, and less easily violated.
If that sounds idealistic, it is, and Neumann knows better than to expect eager assent from the digital establishment.
“I’ve been tilting at the same windmills for basically 40 years,” he tells the Times' John Markoff. “And I get the impression that most of the folks who are responsible don’t want to hear about complexity. They are interested in quick and dirty solutions.”
Again, forgive me for
pointing out how clearly this echoes themes I harp on regularly in this
space (though certainly not original to me). Recent examples include "On immovable technologies," "Recycling Ellen Ullman," and "Technological Autonomy: Greasing the rails to Armageddon."
The broader point to be emphasized
is that the systemic nature of technology – and the fallibility that's
an inevitable consequence of that systemic nature – is not only an issue in the world of computers.
It applies to the technological society as a whole. Peter Neumann's argument that
the complexity of digital systems makes them unstable and insecure can be
extended to the inconceivably complex web of technological systems that
now dominate the ecosystem we inhabit.
If it's difficult to imagine
starting over with a "clean slate" in the world of computers, how
much more difficult is it to imagine rethinking and rebuilding the entire technological
edifice on which we depend, regardless of how unstable that edifice might be? As I've said before, it's not a problem we seem prepared to contemplate.
© Doug Hill, 2012