There has been a ridiculous amount of speculation about “the Singularity” in recent years, most of it utopian fantasy you can safely ignore. A few thinkers understand that the Singularity is likely to be an event that doesn’t have a happy ending for homo sapiens — here are some of Omega’s favorites.
Eliezer Yudkowsky is an AI theorist who claims to be trying to save the world from an unfriendly intelligence explosion. Yudkowsky is clearly a brilliant megalomaniac who would be a powerful asset to our side, but he insists on operating within an ethical framework that is a) of no cosmic significance, and b) doomed to failure. In the winner-take-all race to build superintelligent machines, the “good guys” have little chance of beating the “bad guys” — and when the war is over, history will simply be rewritten to redefine who the good guys were! Nevertheless, Yudkowsky has many interesting insights and is well worth listening to:
Computer scientist Hugo de Garis is one of the few Singularity enthusiasts discussing it in the context of “gigadeath” wars and AI arms races, which is why he is also worth listening to:
Cybernetics professor/cyborg Kevin Warwick is a rare public figure who is willing to state the obvious: that we’re going to be saying “bye-bye” to legacy homo sapiens soon, and that’s a good thing!