Sunday, March 29, 2009

The Technium Discussion

Kevin Kelly has a good new discussion about technology risks at his Technium blog.

His post lists four arguments against technology, which he summarizes as: "contrary to nature", "(contrary to) humanity", "(contrary to) technology itself", and "because it is a type of evil and thus is contrary to God".

He summarizes the "contrary to technology itself" argument as:

"Technology proceeds so fast it is going to self-destruct. It is no longer regulated by nature, or humans, and cannot control itself. Self-replicating technologies such as robotics, nanotech, genetic engineering are self-accelerating at such a rate that they can veer off in unexpected, unmanageable directions at any moment. The Fermi Paradox suggests that none, or very few civilizations, escape the self-destroying capacity of technology."
That statement immediately reminds me of Bill Joy's Why the Future Doesn't Need Us, John Robb's Brave New War and John Gray's Straw Dogs. As John Gray wrote in 2002: "The development and spread of new weapons of mass destruction is a side effect of the growth of knowledge interacting with primordial human needs. That is why, finally, it is unstoppable. The same is true of genetic engineering...It will occur haphazardly, as part of competition and conflict among states, business corporations and criminal networks". And so it may well "veer off in unexpected, unmanageable directions", as Kevin paraphrased. But as the AIG risk manager undoubtedly once asked, "what's the worst that realistically could happen?"

Much of the discussion is strongly supportive of technology, so his post offers viewpoints contrary to my own. Unfortunately, much of it focuses on the inconveniences of emerging technologies, when the urgent issues are existential risks and the sustainability of our core liberties.