September 29, 2012

Talking Technology! ("Ouch!" edition)



When O'Reilly Radar posted my essay on Steve Jobs' affinity for Romanticism earlier this week, a commenter posted a link to a 2001 video interview with Jobs. 

The commenter was pointing out that some of what Jobs says in the interview speaks directly to the issues addressed in my essay (see 7:16 forward), and he was right.

It's interesting to note, in addition, that some of Jobs' other comments take on a certain resonance in the context of Apple's recent troubles with the Maps application for the iPhone 5 and iOS 6.

The interviewer asks (at 4:15) how Jobs motivates the employees of Apple to "think different." What do you do, she wants to know, to make sure that attitude "permeates" the company? Jobs answers:

You permeate it by example, ultimately. In other words, when something's not quite good enough, do you stop and make it better, or do you just ship it? And everybody watches to see how the senior management makes those decisions. And what we've tried to do is stop and make it great before we ship it. When we have problems, stop and fix them.…You can say anything you want but everybody watches very carefully when you're in a difficult situation, what decisions you make, what values you have.







September 25, 2012

Steve Jobs, Romantic

Samuel Taylor Coleridge
Steve Jobs


"…the season
Wherein the spirits hold their wont to walk
the fruitful matrix of Ghosts…"
                                                  Samuel Taylor Coleridge 


Steve Jobs died a year ago this October 5th, and we can expect his ghost to appear in any number of recollections and assessments as the anniversary approaches.

I'd like to talk here about a spirit that Jobs carried within himself. It's a spirit he relied on for inspiration, although he seemed at times to have lost track of its whisper. In any event what it says can tell us a lot about our relationship to machines.

I refer to the spirit of Romanticism. I spent much of this past summer reading about the Romantics – the original Romantics, that is, of the late eighteenth and early nineteenth centuries – and it's remarkable how closely their most cherished beliefs correspond to principles that Jobs considered crucial to his success at Apple.      

What Apple does that other companies don't, Jobs often said, is infuse the technologies it produces with human values. "It's in Apple's DNA that technology alone is not enough," he said during one of his famous product introductions. "We believe that it's technology married with the humanities that yields us the result that makes our heart sing."      

Jobs can be forgiven for never getting very specific about what he meant by marrying technology to the humanities. It's by definition a subject that's hard to pin down, though not especially hard to understand. Basically he was saying that Apple's products have soul, and that people are attracted to those products because they can feel that soul, both consciously and unconsciously. These are things the Romantics thought about a lot.

That the creative artist can bring life to inanimate objects was a central conviction of the Romantic poets. (I'm speaking of the thrust of the Romantic movement in general; individuals within the movement disagreed on specific issues.) For them the inanimate object in question was words; for Jobs it was technology. But the basic point – that a work of art, properly executed, carries within it an invisible, living essence – was the same. Devoid of this essence, said Samuel Taylor Coleridge, what's produced is as lifeless as the "cold jelly" of a corpse.


Jobs onstage presenting Apple's iPad 2

Put in contemporary terms, soul from the Romantic perspective is an emergent quality, a product of an organic, harmonious relationship between constituent parts. Even when those individual elements are familiar in other contexts, as the elements of Apple's products were often said to be, combining them with due attention to essence can bring something new into the world. As Coleridge put it, the true artist "places things in a new light…What oft was thought but ne'er so well exprest…[He] not only displays what tho often seen in its unfolded mass had never been opened out, but he likewise adds something, namely, Lights & Relations."

Relations, in turn, create unity. Each part is completely faithful to the creation as a whole. To construct a work in accord with some "mean or average proportion" is to dilute its essence, said William Hazlitt, "for a thing is not more perfect by becoming something else, but by being more itself."

This supports Jobs' insistence that Apple maintain control over both its hardware and its software, a policy that ensured they would work seamlessly together. Soul emerges on its own in nature, but not in art. The unity on which it depends is concealed, as one critic put it, beneath "a surface world of chaos and confusion." To reveal essence requires not only vision, but also focused attention and deliberate action. Coleridge coined a word to describe the unifying power of the creative imagination: "esemplastic," derived from the Greek for "to shape into one."

Nor will essence emerge on the strength of reason alone. Indeed, Romanticism was explicitly and decidedly a revolt against reason, a rejection of the empirical presumption of the Enlightenment. Coleridge considered the "Mechanico-corpuscular philosophy" his lifelong enemy; its endless reductionism smothered, he believed, any trace of vitality. What remained wasn't art, he said, but "a lifeless Machine whirled about by the dust of its own Grinding" – a fair description of how Steve Jobs viewed the products of Apple's longtime rival, Microsoft.          

The Romantic contemplates nature
There's no question that Jobs was intimately familiar with and sympathetic to the Romantics' convictions, if only because they were shared by two of his most formative influences, Eastern religion and the 60s counterculture. This is not to say he was directly aware of that coalescence; I've seen no interview with Jobs in which the Romantics are mentioned. Nor is there evidence to suggest he recognized how freely the streams of the three philosophies intertwined. Ralph Waldo Emerson, for example, wrote poetry based on the Bhagavad Gita and paid tribute in person to Coleridge and Carlyle. Autobiography of a Yogi, a book Jobs claimed to have read annually since he was in college, quotes Emerson several times. Values regularly celebrated in Romantic texts – passion, spontaneity, authenticity – were counterculture touchstones as well.

Jobs' philosophy, then, overlapped with the Romantics', whether he knew it or not. Coleridge famously said that every person is either a born Platonist or a born Aristotelian – the Romantics were Platonists, Bill Gates would qualify as an Aristotelian – and that no one changes from one orientation to the other. It may be that Jobs was, as he and many others contended, an exception to that rule, able to play successfully on both sides of the technology/humanities divide.

There were signs that Jobs wasn't finding it easy to hold on to his Romanticism as his business career progressed. In Apple's early days he'd been a believer in the messianic promise of computers, convinced they were the greatest force in history for human liberation. In more recent interviews he dismissed suggestions that technology can solve the problems of the world, and he was stung by critics who said that some of Apple's products were more about consumerism than creativity. He also expressed disappointment in the narrowness of vision he saw in the students who came to hear him speak on college campuses. The only thing that seemed to impress them, he said, was how much money he'd made.

Jobs' weariness speaks to a point I'd mentioned at the beginning of this article: that the spirit of Romanticism can tell us a lot about our relationship to machines. The belief that technology could be our savior was a minority opinion in the counterculture. The predominant sentiments of the time were more in tune with the Romantics, who believed that salvation was to be found not in mechanical power, but by living as simply and as close to nature as possible.

Pastoral retreat on any substantial scale isn't likely at this point. Our technologies are with us to stay. Living more simply would seem to be an option, though. We might also consider the possibility of constructing those technologies more Romantically. That would entail recognizing, as Steve Jobs did, that the things we create really do have souls, and that they speak a language we can hear.                  








Books that were especially useful in my research for this reflection were Richard Holmes' two-volume biography of Samuel Taylor Coleridge, M.H. Abrams' The Mirror and the Lamp, David Newsome's Two Classes of Men: Platonism and English Romantic Thought, and Walter Isaacson's Steve Jobs.


Note: This essay was published earlier this morning by O'Reilly Radar. Thanks to Mac Slocum for opening the door. 

Image credits: iPad 2 presentation: Rob Pegoraro, Washington Post; "Wanderer Above the Sea of Fog," Caspar David Friedrich, 1818.



©Doug Hill, 2012







September 24, 2012

Talking Technology! (No Free Lunch edition)


The New York Times published a superb report yesterday (the first part of a series) on the massive, and massively wasteful, power consumption that drives the Internet.

The article is notable not only for its reporting but also for its appreciation of the underlying issues.

Specifically, it emphasizes that the Internet is not, as we'd like to believe, an entity that exists in some frictionless, dream-like realm called "cyberspace" or "the cloud." In truth, the Internet isn't "virtual" at all: it runs on such Earth-bound realities as diesel generators and lead-acid batteries, consumes huge amounts of electricity, and produces substantial amounts of pollution.

Here are some quotes:
Most data centers, by design, consume vast amounts of energy in an incongruously wasteful manner, interviews and documents show. Online companies typically run their facilities at maximum capacity around the clock, whatever the demand. As a result, data centers can waste 90 percent or more of the electricity they pull off the grid, The Times found.

“It’s staggering for most people, even people in the industry, to understand the numbers, the sheer size of these systems,” said Peter Gross, who helped design hundreds of data centers. “A single data center can take more power than a medium-size town.”

Energy efficiency varies widely from company to company. But at the request of The Times, the consulting firm McKinsey & Company analyzed energy use by data centers and found that, on average, they were using only 6 percent to 12 percent of the electricity powering their servers to perform computations. The rest was essentially used to keep servers idling and ready in case of a surge in activity that could slow or crash their operations.

The inefficient use of power is largely driven by a symbiotic relationship between users who demand an instantaneous response to the click of a mouse and companies that put their business at risk if they fail to meet that expectation.

At least a dozen major data centers have been cited for violations of air quality regulations in Virginia and Illinois alone, according to state records. Amazon was cited with more than 24 violations over a three-year period in Northern Virginia, including running some of its generators without a basic environmental permit.

Each year, chips in servers get faster, and storage media get denser and cheaper, but the furious rate of data production goes a notch higher.

With no sense that data is physical or that storing it uses up space and energy, [Internet users] have developed the habit of sending huge data files back and forth, like videos and mass e-mails with photo attachments. Even the seemingly mundane actions like running an app to find an Italian restaurant in Manhattan or a taxi in Dallas requires servers to be turned on and ready to process the information instantaneously.

“If you tell somebody they can’t access YouTube or download from Netflix, they’ll tell you it’s a God-given right,” said Bruce Taylor, vice president of the Uptime Institute, a professional organization for companies that use data centers.

A crash or a slowdown could end a [data center manager's] career…A field born of cleverness and audacity is now ruled by something else: fear of failure.

 “When somebody says, ‘I’m going to store something in the cloud, we don’t need disk drives anymore’ — the cloud is disk drives,” Mr. Victora said. “We get them one way or another. We just don’t know it.”

“That’s what’s driving that massive growth — the end-user expectation of anything, anytime, anywhere,” said David Cappuccio, a managing vice president and chief of research at Gartner, the technology research firm. “We’re what’s causing the problem.”



September 17, 2012

Technological Autonomy: Greasing the rails to Armageddon


There are any number of ways to frame the apocalypse, I suppose. As one who spends a lot of time thinking about technology, I frame it in terms of a phenomenon known as "technological autonomy."

I'm convinced that technological autonomy may be the single most important problem ever to face our species and the planet as a whole. A huge statement, obviously, but there's plenty of recent evidence to back it up.

Briefly stated, technological autonomy describes a condition in which we are no longer in control of our technologies: they now function autonomously. This isn't as absurd as it may sound. The idea isn't that we can't switch a given machine from "on" to "off." Rather, technological autonomy acknowledges the fact that we have become so personally, economically, and socially committed to our devices that we couldn't survive without them.*

Technological autonomy is probably the most controversial theory in the rarefied but growing field known as the philosophy of technology. Paul T. Durbin, a professor emeritus of philosophy at the University of Delaware, has written that the discipline is roughly divided between those who interpret technology narrowly and those who interpret it broadly. If you think of technology as tools, period, scholars in the narrow camp agree with you. They tend to have engineering backgrounds and become irritated at any suggestion that machines have taken on a life of their own. "It is not the machine that is frightening," says Joseph Pitt of Virginia Tech, "but what some men will do with the machine; or, given the machine, what we fail to do by way of assessment and planning."

Scholars in the broad camp, who often come from philosophy or sociology backgrounds, say it isn't that simple. They insist that technology must be seen systemically, that it includes not only machines but also the social relationships and economic structures in which machines flourish. As Thomas Misa of the University of Minnesota puts it, technology is "far more than a piece of hardware," but rather "a shorthand term for the elaborate sociotechnical networks that span society."

From that perspective we can see that controlling our machines involves much more than just deciding, "Okay, we're not going to do that anymore." All of us, whether we like it or not, are enmeshed in a massively complex web of interconnected, interdependent technologies and technological systems. To extricate ourselves from those systems would inflict massive, probably irreparable, damage on our way of life. I use the term "de facto technological autonomy" to suggest that while we can literally turn off our machines, practically we are unable to do so.


The people of Japan have learned a lot about technological autonomy since the tsunami hit the Fukushima reactors. They'd love to get rid of nuclear power altogether, but their leaders are telling them that to do so invites economic disaster. In much the same way we Americans, along with most of the rest of the developed world, are trapped by our automobiles. We know that for lots of reasons we'd be better off if we stopped driving them tomorrow, but we can't. If we did, life as we know it would collapse, since in one way or another we depend on the internal combustion engine for our jobs, our food, and virtually everything else we need. It's impossible to overestimate the implications of that particular dilemma, politically, economically, militarily, and – most important – environmentally.

The reasons I think technological autonomy is the most crucial issue in history are contained in several reports I've come across in recent months. They're collected on my hard drive in a folder labeled "The End of Civilization." Together they testify, explicitly or implicitly, to a growing consensus in the scientific community that we humans are not going to find it within ourselves to act soon enough or dramatically enough to forestall catastrophic climate change. If the battle is about bringing our machines to heel, it's pretty certain at this point we're going to lose.


An example appeared in a New York Times essay in July by Roger Bradbury, an ecologist at the Australian National University. The world's coral reefs, sources of food for millions of human beings, have become "zombie ecosystems," he wrote. They will collapse entirely within a human generation. Although the evidence that this is happening is "compelling and unequivocal," scientists and politicians alike have consistently "airbrushed" the truth. There's hope to save the reefs, we're told, if only we take prudent action. Forget it, Bradbury said. There isn't any hope.

The scent of doom similarly emanated from a report in England's Guardian newspaper in February. "Civilization faces a 'perfect storm of ecological and social problems,'" the headline read. This was the conclusion of a group of 20 scientists who had all been winners of the Blue Planet prize, an international award the Guardian described as "the unofficial Nobel for the environment." The paper issued by the group made repeated use of the word "unprecedented," as in this passage:

"In the face of an absolutely unprecedented emergency, society has no choice but to take dramatic action to avert a collapse of civilization. Either we will change our ways and build an entirely new kind of global society, or they will be changed for us."

A month later the journal Science published a paper signed by an international group of 32 experts who specialize in environmental governance. “Societies must change course to steer away from critical tipping points in the Earth system that could lead to rapid and irreversible change," the paper said. "Incremental change is no longer sufficient to bring about societal change at the level and with the speed needed to stop earth system transformation."

Yet another apocalyptic paper soon followed, this one by a group of 22 scientists from a variety of fields, writing in the June issue of the journal Nature. Entitled “Approaching a state shift in Earth’s biosphere,” the paper warned that the planet's environmental systems are nearing breakdown on any number of fronts and that those "tipping points" will likely be sudden and dramatic rather than gradual. The Los Angeles Times quoted lead author Anthony Barnosky, a professor of integrative biology at UC Berkeley, as comparing the likely severity of the environmental shifts we're facing to the effects of an asteroid hitting the planet.



 
That the world seems to have taken little or no significant notice of these warnings strikes me as utterly astonishing. It's as if a family has been told that their house is on fire and they remain glued to their TV shows and video games, potato chips and soda at hand. Certainly climate change hasn't been anything close to a central issue in the current presidential campaign. The only conceivable explanation is that we are simply unable to contemplate the scope and depth of the changes that will be required to forestall the catastrophes the scientists are predicting. That, by definition, constitutes a condition of de facto technological autonomy.

The volume of scientific alarms increased in recent months in part in anticipation of Rio+20, an international colloquy on the environment held in Rio de Janeiro in June. Officially named the United Nations Conference on Sustainable Development, the gathering took its nickname from the fact that it convened on the twentieth anniversary of the 1992 Earth Summit, also held in Rio. At that meeting a global "blueprint" was adopted that would supposedly set the nations of the world on the path to a saner environmental future.


As it's turned out, the two Rio conferences can be considered benchmarks on our journey over the environmental cliff. Global warming, among other sources of degradation, has only accelerated since the first one, and nothing emerged from the second one to suggest we'll find a way to reverse that trend anytime soon. Kumi Naidoo, executive director of Greenpeace International, called the conference as a whole "a failure of epic proportions" and its rambling, inconclusive final report "the longest suicide note in history."  

Our ongoing impotence in the face of climate change prompted one of our better-known environmental activists, Bill McKibben, to publish an angry and pessimistic jeremiad in the August issue of Rolling Stone. He spent a few thousand words documenting the latest irrefutable evidence that disaster's approach continues unimpeded while the "charade" that we're actually dealing with it continues, as it has for decades.

"Since I wrote one of the first books for a general audience about global warming way back in 1989," McKibben said, "and since I've spent the intervening decades working ineffectively to slow that warming, I can say with some confidence that we're losing the fight, badly and quickly – losing it because, most of all, we remain in denial about the peril that human civilization is in."

The conclusion is inescapable that some challenges are just too difficult to face. Controlling the machines we've unleashed seems to be one of them. 



*For previous essays of mine on technological autonomy, see here and here.




©Doug Hill, 2012
 

September 14, 2012

Annals of Childish Behavior™ (continued)


On "The Innocence of Muslims":

Morning Edition, National Public Radio:
Fawaz Gerges is a Muslim scholar and professor of international relations at the London School of Economics. He says the film…"gathers up all the historical falsities spread about Mohammad for centuries and includes them in one film" — or, actually, a 14-minute trailer.
Gail Collins, New York Times:
The trailer looks as though it was made by a 13-year-old boy with access to a large supply of fake beards.
 Audio soundbite from the trailer: 
UNIDENTIFIED WOMAN: I have not seen such a murderous thug as Muhammad. He kills men, captures women and children, robs the caravans, breaches agreements and treaties. He sells the children as slaves after he and his men have used them.




Is modern culture being overwhelmed by an epidemic of childishness? José Ortega y Gasset, writing in 1930, thought so. Annals of Childish Behavior™ chronicles contemporary examples of that epidemic. The childish citizen, Ortega said, puts "no limit on caprice" and behaves as if "everything is permitted to him and that he has no obligations."




September 8, 2012

Talking Technology!



The New York Times reports today that some American companies are making big profits selling a gas that's been banned by international treaty because it damages the ozone layer.
   
The gas, HCFC-22, is used to repair older air conditioners; newer machines don't use it. A flourishing market has grown up around smuggling it into countries where it's illegal and exporting it to countries where it's legal.

DuPont makes more HCFC-22 than it can sell legally in the States and exports it to countries like Mexico. Marcone, a major supplier of appliance parts based in St. Louis, was recently convicted of selling smuggled HCFC-22 to American retailers with a discount promotion it called "Freaky Freon Fridays." A federal prosecutor said in court that the senior vice president in charge of the operation was considered "a hero" in the company because of the revenue he was bringing in.

Indeed, the profits involved in selling HCFC-22 are such that it's actually becoming more abundant and cheaper on the global market than it had been before the treaty banning it went into effect, the Times says.

The vast amount of goods shipped internationally in the global economy makes it easy to smuggle HCFC-22. According to the Times, some of the gas sold by Marcone was made in the U.S., sold to Mexico, then smuggled back into the States. Another shipment had been manufactured in China and exported to Ireland and the Dominican Republic before being shipped by freighter to Miami, hidden in cargo containers among other, legal goods. Fake invoices were produced to fool Customs inspectors.





September 2, 2012

Recycling Ellen Ullman

Ellen Ullman

I'm an admirer of the writer Ellen Ullman, the software engineer turned novelist. Her 1997 memoir, Close to the Machine: Technophilia and Its Discontents, is a wonderfully perceptive reflection on her years as a professional programmer.

Ullman recently wrote a commentary for the New York Times on the computerized trading debacle triggered last month by the brokerage firm Knight Capital. In it she reaffirmed a crucial point she'd made in Close to the Machine, a point I find myself coming back to repeatedly in this space. To wit: If you think we're in control of our technologies, think again. 

To refresh memories, Knight, one of the biggest buyers and sellers of stocks on Wall Street and one of its most aggressive users of automated trading systems, had developed a new program to take advantage of some upcoming changes in trading rules. Anxious to profit from getting in first, Knight set its baby loose the moment the opening bell sounded on the day the changes went into effect. The program went rogue, setting off an avalanche of errant trades that sent prices careening wildly all over the market. In the forty-five minutes it took to shut the system off, Knight lost nearly half a billion dollars in bad trades, along with many of its clients and its reputation.

Much of the finger-pointing that followed was aimed at Knight's failure to adequately debug its new system before it went live. If only the engineers had been given the time they needed to triple check their code, the story went, everything would have been fine. It was this delusion that Ullman torpedoed in her essay for the Times.

Wondering who's in charge here.
It's impossible to fully test any computer system, she said. We like to think there's a team of engineers in charge who know the habits and eccentricities of their programs as intimately as they know the habits and eccentricities of their spouses. This is a misconception. Systems such as these don't run on a single body of code created by one company. Rather, they're a collection of interconnected "modules," purchased from multiple vendors, with proprietary software that the buyer (Knight Capital in this case) isn't allowed to see. 

Each piece of hardware also has its own embedded, inaccessible programming. The resulting system is a tangle of black boxes wired together that communicate through dimly explained “interfaces.” A programmer on one side of an interface can only hope that the programmer on the other side has gotten it right.
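
Ullman's point is easy to see in miniature. Here's a minimal, hypothetical sketch (the vendor names and the units mismatch are my own invention for illustration, not details of Knight's actual system) of two modules that each pass their own vendor's tests yet fail at the seam where they meet:

# Hypothetical sketch: each function stands in for a vendor's black box.
# Neither module has a bug in isolation; the bug lives in the interface
# between them, where neither vendor can see it.

def vendor_a_quote(symbol: str) -> int:
    """Vendor A's pricing module: by its own convention, returns the price in cents."""
    return 10050  # i.e., $100.50

def vendor_b_submit_order(symbol: str, limit_price: float) -> dict:
    """Vendor B's order module: by its own convention, expects the price in dollars."""
    return {"symbol": symbol, "limit_price": limit_price}

# The integrator wires the two black boxes together, trusting the "interface."
# Each side passed its own tests; the combined system is off by a factor of 100.
order = vendor_b_submit_order("XYZ", vendor_a_quote("XYZ"))
print(order)  # {'symbol': 'XYZ', 'limit_price': 10050} (meant to be 100.50)

No amount of testing inside either box will surface that error; it appears only when the boxes are wired together.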

The complexities inherent in such a configuration are all but infinite, as are the opportunities for error. Forget, in other words, about testing your way to perfection. "There is always one more bug," Ullman said. "Society may want to put its trust in computers, but it should know the facts: a bug, fix it. Another bug, fix it. The 'fix' itself may introduce a new bug. And so on."

As I say, these were the sorts of issues Ullman explored with terrific insight in Close to the Machine. Ullman's experience as a programming insider affirmed what so many of us on the outside sense intuitively: that computer systems (like lots of other technologies) follow their own imperatives, imperatives that make them unresponsive to the more fluid needs of human beings. “I’d like to think that computers are neutral, a tool like any other,” she wrote, “a hammer that can build a house or smash a skull. But there is something in the system itself, in the formal logic of programs and data, that recreates the world in its own image.”



I discussed this tendency in my 2004 master's thesis on the philosophy of technology, citing a passage from Ullman's book as an example. Here's part of what I wrote:

In her opening chapter, Ullman describes a meeting she has with a group of clients for whom she is designing a computer system, one that will allow AIDS patients in San Francisco to deal more smoothly with the various agencies that provide them services. Typically, this meeting has been put off by the project’s initiating agency, so that the system’s software is half completed by the time Ullman and her team actually sit down with the people for whom it is ostensibly designed.

As the meeting begins, it quickly becomes apparent that all the clients are unhappy for one reason or another: the needs of their agencies haven't been adequately incorporated into the system. Suddenly, the comfortable abstractions on which Ullman and her programmer colleagues based their system begin to take on “fleshly existence.” That prospect terrifies Ullman.  “I wished, earnestly, I could just replace the abstractions with the actual people,” she writes.

But it was already too late for that. The system pre-existed the people. Screens were prototyped. Data elements were defined. The machine events already had more reality, had been with me longer, than the human beings at the conference table. Immediately, I saw it was a problem not of replacing one reality with another but of two realities. I was at the edge: the interface of the system, in all its existence, to the people, in all their existence.

The real people at the meeting continue to describe their needs and to insist they haven’t been accommodated. Ullman takes copious notes, pretending that she’s outlining needed revisions. In truth she's trying to figure out how to save the system. The programmers retreat to discuss which demands can be integrated into the existing matrix and which will have to be ignored. The talk is of “globals,” “parameters,” and “remote procedure calls.” The fleshly existence of the end users is forgotten once more.

“Some part of me mourns,” Ullman says,

but I know there is no other way: human needs must cross the line into code. They must pass through this semipermeable membrane where urgency, fear, and hope are filtered out, and only reason travels across. There is no other way. Real, death-inducing viruses do not travel here. Actual human confusions cannot live here. Everything we want accomplished, everything the system is to provide, must be denatured in its crossing to the machine, or else the system will die.


Ullman's essay on the Knight Capital trading fiasco shows that in the fifteen years since Close to the Machine was published, we still haven't gotten the bugs out of the human-machine interface, or out of the machine-machine interface, for that matter. Nor are we likely to anytime soon.



September 1, 2012

Talking Technology!



An edited version of a press release issued on July 16, 2012, by Cyclone Power Technologies Inc.:

Cyclone Power Technologies Responds to
Rumors about “Flesh Eating” Military Robot


POMPANO BEACH, Fla.– In response to rumors circulating the Internet about a “flesh eating” robot project, we would like to set the record straight: This robot is strictly vegetarian.

The Energetically Autonomous Tactical Robot (EATR™) is an autonomous robotic platform capable of performing long-range missions without the need for conventional re-fueling. It will be able to find, ingest and extract energy from biological material in the environment.

Despite reports that this biological material includes human bodies, the public can be assured EATR's engine runs on fuel no scarier than twigs, grass clippings and wood chips – small, plant-based items for which its robotic technology is designed to forage. 

EATR was developed by Cyclone Power Technologies Inc. and Robotic Technology Inc. for the Defense Advanced Research Projects Agency (DARPA).

“We completely understand the public’s concern about futuristic robots feeding on the human population, but that is not our mission,” stated Harry Schoell, Cyclone’s CEO.







Annals of Childish Behavior™ (continued)




Democratic government "implies tools for getting at truth in detail, and day by day, as we go along….Without such possession, it is only the courage of the fool that would undertake the venture to which democracy has committed itself."                            
                              John Dewey, 1899

Distortion in the news represents the first step toward "a sham universe," a step that leads inexorably to "the disappearance of reality in a world of hallucinations."
                              Jacques Ellul, 1954

The “judicious study of discernible reality” is “not the way the world really works anymore.” In the modern media environment, “we create our own reality.”
                              Official in the administration of President George W. Bush, 2004

“We’re not going to let our campaign be dictated by fact-checkers.”
                                  Neil Newhouse, Mitt Romney campaign, 2012
  
"Honesty is a lost art. Facts are for losers. The truth is dead." 
                                        Charles M. Blow, 2012
 

Sources:
Dewey: "Consciousness and Experience," in The Influence of Darwin on Philosophy, and Other Essays in Contemporary Thought
Ellul: The Technological Society
Bush official: Frank Rich, The Greatest Story Ever Sold: The Decline and Fall of Truth From 9/11 to Katrina
Romney pollster: Michael Cooper, "Campaigns Play Loose With Truth in a Fact-Check Age," New York Times, August 31, 2012.
Charles M. Blow, "The G.O.P. Fact Vacuum," New York Times, August 31, 2012.

  
Is modern culture being overwhelmed by an epidemic of childishness? José Ortega y Gasset, writing in 1930, thought so. Annals of Childish Behavior™ chronicles contemporary examples of that epidemic. The childish citizen, Ortega said, puts "no limit on caprice" and behaves as if "everything is permitted to him and that he has no obligations."