
Razib Khan's Unsupervised Learning

Feb 23, 2024

On this episode of Unsupervised Learning, Razib talks about AI, the singularity and the post-human future with James D. Miller, a Smith College economist, host of the podcast Future Strategist and the author of Singularity Rising: Surviving and Thriving in a Smarter, Richer, and More Dangerous World. Miller and Razib first met at 2008’s “Singularity Summit” in San Jose, and though Singularity Rising was published in 2012, some of its ideas had already been presented in earlier talks, including at that conference. More than 15 years after Miller began formulating his ideas, Razib asks him how the theses and predictions in his book have held up, and how they compare to those in Ray Kurzweil’s The Singularity Is Near. On this last point, Miller is very bullish on Kurzweil’s prediction that artificial intelligence will surpass that of humans by 2030. He also believes that the “intelligence explosion,” Kurzweil’s “technological singularity,” when AI transforms the earth in unimaginable ways through exponential rates of change, will in fact come to pass.

But while Kurzweil predicts that the singularity will usher in an era of immortality for our species, Miller has a more measured take. He believes AI will drive massive gains in economic productivity, from cultural creativity to new drug development regimes (one of the original rationales behind IBM’s AI program). But while Kurzweil anticipates exaltation of conscious human life into an almost divine state, Miller suspects that AI may eventually lead to our demise. He estimates a 10% probability that Kurzweil is correct that we will become immortal, and a 90% probability that AI will simply shove us aside on this planet as it begins to consume all available resources.

Overall, Miller is satisfied with the predictions in the first third of Singularity Rising. Computational technology has become far more powerful than it was in the late aughts, with a supercomputer in everyone’s pocket. Though the advances in AI seem to exhibit discontinuities, in particular with the recent seminal inventions of transformers and large language models coming to the fore, the smoothed curve aligns with Kurzweil’s 2030 target for human-level intelligence. On the other hand, Miller has been disappointed by the merely modest advances in biological human engineering, with far fewer leaps forward than he had anticipated. Razib and Miller discuss whether this is due to limitations in the science, or to issues of governance and ethics. Miller closes by making the case for a program of cloning the great 20th-century genius John von Neumann and the statesman Lee Kuan Yew.

While the computational innovation driving AI seems to have advanced on schedule, even as the biological revolution has not taken off, the last section of Miller’s book focused on the economic impacts of the impending singularity. He still believes the next 10-20 years will be incredible, as our economy and way of life are both transformed for the better. Until, that is, humans become obsolete in the face of the nearly god-like forms of AI that will emerge around 2050. In the interim, Miller anticipates that the next generation will see rapid changes, with people making career shifts every half a decade or so as jobs become redundant or automated. If Singularity Rising proves correct, the next generation will be defined by what the economist Joseph Schumpeter termed “creative destruction.” If Miller is correct, it may be the last human generation.

For the first time ever, parents going through IVF can use whole genome sequencing to screen their embryos for hundreds of conditions. Harness the power of genetics to keep your family safe, with Orchid. Check them out at