The Coming AI & Biotech Wave Requires Us to Rethink What Change Looks Like
Mustafa Suleyman's recent book says there's no way forward without it, but it's going to be a turbulent ride
This is the second post on The Coming Wave: Technology, Power and the 21st Century's Greatest Dilemma, by Mustafa Suleyman, released Sept. 5, 2023. The first post, explaining why I’m reading it, is here. This one covers why the coming wave is so, so big. The third post, on the threat that the coming wave poses to the nation-state, our basic building block of civilization for the last several hundred years, is here. The fourth post, my final one, covers Suleyman's proposed solutions and can be read here.
In April, I will be going through John Inazu's forthcoming book Learning to Disagree: The Surprising Path to Navigating Differences with Empathy and Respect.
In February, I went through Nick Troiano’s The Primary Solution.
In January, I went through Michael Wear’s The Spirit of Our Politics.
The book of the month schedule is here.
I am currently finishing up The Coming Wave. After 222 pages of white-knuckle reading about the scope of the threat, I'm finally to the last section: about 60 pages of recommendations for what we might do to contain the AI wave. Those solutions will be the focus of the final post this month.
But first, here's another quote from the book that caught my attention. Comments like this are scattered throughout the text, and it's easy to run past them without taking note of how big a claim Mustafa Suleyman is making. Here's what he says on page 203:
"If the internet seemed big, this is bigger ... Within the next decade we must anticipate radical flux."
Ok then.
In this post I'm going to go over a few concepts that Suleyman discusses that I think are helpful in understanding the scope and scale of what he's talking about.
Next week I'll get into what he says about the potentially seismic impact on the stability of governments and nations in the face of these changes.
And in the final post, I'll review his recommendations for potential solutions, and also discuss what attempts have been made in real time to implement any of them.
Fundamentally different technologies
Here's another one of those head-turning quotes, this one from page 47:
"In the space of around a hundred years, successive waves took humanity from an era of candles and horse carts to one of power stations and space stations. Something similar is going to occur in the next thirty years."
So, in other words, he is predicting that the next 30 years will bring as much change as we've experienced since the early 1900s.
There are a few signposts in the book that give us parameters for why he thinks the changes coming are so big.
First, he notes that "the history of technology," including the changes described in the move from candles to space stations, is the story of how "our species has slowly extended its control over atoms" (55).
The changes on their way in these new technologies (AI and advanced biotech) build on advances that began roughly 60 or 70 years ago, when scientists realized that "information is a core property of the universe" and it "can be encoded in a binary format." These technologies are now "approaching an inflection point" and are fundamentally different from all that has come before.
"In other words, technology is undergoing a phase transition. No longer simply a tool, it's going to engineer life and rival — and surpass — our own intelligence" (55).
Swarm technology
Second, we are not talking about AI as a single thing. It is a "supercluster" of technologies that includes advances in AI, biotech, and robotics, all moving at hyperspeed because of simultaneous advances in quantum computing. The volume, variety, and speed of these technologies make them incredibly hard to predict and, at a certain point, potentially impossible to contain, he says. "Different parts of the wave spark and accelerate one another, sometimes with extreme unpredictability and combustibility" (57).
Multiple drivers
Third, the drivers of this wave are at least fourfold (119). There is a great power competition among the world's most powerful nations that drives each of them forward into this space at the fastest rate possible, out of fear that another country will develop capacities to dominate the world first. There's a clear parallel to the race to develop a nuclear weapon (and that also provides a case study for containment). Suleyman notes that China may have the lead in this race. There's also a global research ecosystem that drives advances forward. Then of course there is the money driver: a lot of people are becoming very rich by discovering these new technologies. And finally there is ego.
In addition, Suleyman acknowledges that the AI/biotech wave will bring many good things. In fact, he argues that it is impossible to move into the future without them, because we cannot overcome the challenges we face over the next few decades — in the environment, in our food supply, in our ability to treat and combat disease — without serious innovations. And, he adds, "the technologies of the wave will make life easier, healthier, more productive, and more enjoyable for billions" (140).
A “proliferation of power”
Fourth, he writes that "the coming wave is a story of the proliferation of power" (102). This is an important point that will factor largely in next week's post about the implications for nation-states and political stability. Suleyman writes that the creation of the internet, of social media and blogs and podcasts and easy video streaming, all unimaginable 30 years ago, "reduced the costs of broadcasting information." But this wave is going to reduce the cost "of acting" (102).
"Just as social media's one-to-many broadcast effect means a single person can suddenly broadcast globally, so the capacity for far-reaching consequential action is becoming available to everyone," (168) he writes.
"Today, no matter how wealthy you are, you simply cannot buy a more powerful smartphone than is available to billions of people," Suleyman says. "Those same billions will soon have broadly equal access to the best lawyer, doctor, strategist, designer, coach, executive assistant, negotiator, and so on. Everyone will have a world-class team on their side and in their corner" (164).
"This will be the greatest, most rapid accelerant of wealth and prosperity in human history. It will also be one of the most chaotic. If everyone has more access to capability, that clearly also includes those who wish to cause harm ... Democratizing access necessarily means democratizing risk" (164).
Part of this ability to act is going to come from AI's information delivery services, but part of Suleyman's argument is that AI is going to reduce the cost of getting access to robotics and other tools that will enable this action.
How will we govern ourselves in this new age?
Finally, Suleyman argues, all of this means that the stability and reliability of the nation-state is under severe threat. We are already in an age where democracies are losing ground to authoritarian states. Democracies move more slowly than highly centralized and controlled dictatorships or authoritarian regimes. That lack of nimbleness is crippling at a time when there is a need to respond quickly to these technological advances.
And on top of all this, "an influential minority in the tech industry not only believes that new technologies pose a threat to our ordered world of nation-states; this group actively welcomes its demise" (151).
Next week, I'll go over Suleyman's argument for how the nation-state is under threat, from whom, and why he thinks that despite its many flaws, the nation-state is still our best hope for the future.