Discussion about this post

Phil Tanny

It's hard to have much confidence in governance schemes when, as best I can tell, such discussions never seem to mention what could be the most important threat presented by AI: further acceleration of the knowledge explosion.

Let's imagine for a moment that AI is made perfectly safe. Safe AI would still accelerate the knowledge explosion, just as computers and the Internet have. The real threat may come less from AI itself than from the ever more numerous, ever larger powers that emerge from an AI-accelerated knowledge explosion.

The knowledge explosion has already produced at least three powers of vast scale which we have little to no idea how to make safe:

1) Nuclear weapons

2) Artificial intelligence

3) Genetic engineering

Instead of learning from this, we're using tools like AI to pour even more fuel on the knowledge explosion, which will almost certainly result in even more powers of significant scale that we will also struggle to make safe. As the emergence of AI illustrates, this process is feeding back on itself, leading to ever further acceleration.

Experts are playing a losing game in trying to address emerging threats one by one as they come off the knowledge explosion assembly line, because that accelerating process is going to produce new threats faster than we can figure out how to defeat existing ones. Nuclear weapons were invented in 1945, before almost all of us were born, and we still have no clue how to get rid of them.

What we need are experts who are holistic thinkers. We need experts who will focus on the knowledge explosion assembly line which is producing all the emerging threats.

Taking control of the knowledge explosion so that it produces new powers at a rate which we can successfully manage is not optional. It's a do or die mission. Experts can declare this goal impossible all they want, but it will still remain a do or die mission.

The knowledge explosion has created a revolutionary new environment. Nature's primary rule is that creatures who can't adapt to changing conditions must die.

Tom Dietterich

This is an exciting initiative! Building the technical basis for auditing and regulation is the highest priority. We are starting to see some papers on this, but we have a long way to go.
