Discussion about this post

Gerben Wierda:

In related news: https://www.bloodinthemachine.com/p/how-a-bill-meant-to-save-journalism

Here, Big Tech (in this case Google) was able to completely turn around the proposed checks and balances (on its somewhat predatory business model) in a California bill, with some 'AI' thrown in for good measure.

Money is power. Power corrupts. Hence money corrupts.

Paul Topping:

While I applaud the efforts to corral AI before the beasts escape, I am resigned to the fact that it isn't going to happen. As is virtually always the case with regulating industry, bad stuff needs to happen before any preventative regulations can be passed. There are several reasons for this:

- No one is really sure what the bad things look like or how the scenarios will play out. This makes it hard to write effective regulations and there's nothing worse than ineffective regulations.

- Regulators are deathly afraid of restricting a possible economic powerhouse. After all, no one gives out awards for bad stuff avoided.

- When there are, say, 10 potential bad things predicted, it is hard to take the predictors seriously. They are hard to distinguish from people who simply want to thwart the technology. Gary Marcus constantly gets accused of this. The accusations aren't justified, but it's still a problem.

- There's the feeling that even if US companies play by some new set of rules, other countries or rogue agents will not and the bad stuff will happen anyway.

