Saturday, January 10, 2009

The Singularity

The singularity, the name given to a hypothesized future union of human intelligence and technology, is an idea that is both much celebrated and much reviled. Many people are either dismissive of or terrified by the notion of smarter-than-human machines and transhumanism.

But it's pretty hard to argue with history and the incredible pace of technological advance. Moore's law, the exponential growth of computing capacity (the number of transistors on new integrated circuits doubles roughly every two years), implies an enormously changed world over the next fifty years. Something like "the singularity" sounds esoteric and shocking to most people, but we tend to absorb technological change pretty smoothly. Today's internet would have seemed crazy to people twenty years ago (imagine showing them Google Earth on an iPhone). So when Marshall Brain, founder of HowStuffWorks, claims that we'll have $500 boxes with human-level intelligence by 2042 (if not earlier), I tend to take him seriously, and to suspect he is, if anything, being cautious.
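
To make that doubling arithmetic concrete, here is a minimal Python sketch using the two-year doubling period and fifty-year horizon mentioned above. It is purely illustrative; real hardware trends are of course messier than a clean doubling.

# Back-of-the-envelope Moore's law compounding: transistor counts double
# once every DOUBLING_PERIOD_YEARS, compounded over a YEARS-long horizon.
YEARS = 50
DOUBLING_PERIOD_YEARS = 2

doublings = YEARS / DOUBLING_PERIOD_YEARS   # 25 doublings
growth_factor = 2 ** doublings              # 2**25 = 33,554,432

print(f"{doublings:.0f} doublings -> roughly {growth_factor:,.0f}x the transistor count")
# prints: 25 doublings -> roughly 33,554,432x the transistor count

His provocative talk (including some sneering audience dissent at the end) from The Singularity Summit is below: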



UPDATE: Several people (including DJ via phone) have pointed out the reductionist economics in Brain's talk. I agree that Brain does not address the complexities of the effects he describes. I still think it's an interesting and worthwhile talk, largely because it is provocative: the implications of accelerating technological advance are not discussed very often. Linuxguy33 points to this worthwhile review of The Singularity Summit, which includes a highly critical take on Brain's talk.