Let's Not Keep the Bad Parts of AI While Killing the Good Parts
The following is the transcript of my troop deployment (i.e. ending rant) from episode 89 of my news podcast The Mind Killer. If you like it, please consider subscribing to the podcast.
On the Mind Killer, we bitch a lot about how governments all over the world have essentially outlawed nuclear power. Few have outright banned it, of course, but almost all have let foolish alarmism about the supposed dangers of nuclear plants create insurmountable regulatory barriers. Just last week, the first new reactor in over 30 years - Unit 3 of Georgia’s Plant Vogtle - entered commercial operation. And thanks to supposed “safety” regulations, the whole thing has been a disaster. It went $17 billion over budget. It was supposed to take five years once construction began. It took twelve. The cost overruns wiped out any energy savings that were expected. The experience has led other utilities to abandon plans for 24 other reactors which had been proposed over a decade ago. In South Carolina, two partially-built reactors have been shut down.
At the same time, the recent Oppenheimer movie reminds us that the true danger of nuclear energy was never power plants - it’s weapons. And we have plenty of those! While governments have been happy to restrict the creation of nuclear power, they’ve been all too quick to build more, and more destructive, bombs.
I heard a take a couple of weeks ago that I can’t find right now, but the gist was that we’ve kept and proliferated the world-destroying promise of nuclear science while essentially banning the utopian promise of unlimited clean energy. And that this is a cautionary tale for artificial intelligence.
Since GPT-3 was released, nearly everyone in the rationalist and EA movements has thrown their support behind some kind of government restrictions on the development of AI. I have been consistently against such restrictions, not because I don’t think AI is dangerous, but because I don’t see any way this doesn’t go the same way as nuclear. If AI can be used as a weapon, governments are not going to be able to stop themselves from developing a weaponized version. All regulations are going to do is prevent us, the plebs, from getting anything out of it. Assuming it doesn’t kill us all, AI has the potential to create a true utopia. I won’t even describe what that could look like, because it will probably be way different than I expect, but it could be great!
But if we regulate the shit out of it, we’ll never get there. Instead we’ll just get the part that will probably kill us all without the part that could actually improve our lives. Let’s not do that.