11 Comments
Dec 9, 2023 · Liked by Gary Marcus

When you move under the cover of darkness because the light may shine on your true intentions, things usually come into focus eventually. And this won’t end well for OpenAI/Altman. But ironically, this self-destructive behaviour might end up being in the best interest of humanity. 🙏🏾

Dec 9, 2023 · edited Dec 9, 2023 · Liked by Gary Marcus

Well, maybe not "the key example," but *a* key example certainly, and one we know about - one with evidence on paper, so to speak.

He took the easy way out, pulling strings behind people’s backs instead of doing the bold and courageous thing. Sometimes it takes balls to be honest and face the (possible) consequences directly. You have to learn that it’s not as bad as you think, and that the easier option is, dare I say, “weak”? Sometimes tough love means saying things people don’t want to hear, but in the long run and in the bigger picture, it’s the best course, even if in the short run it doesn’t seem that way for you personally. Maybe I’m digressing here, but I see this situation a lot with myself and others, and even in the politics of cities....

This saga is filling the Succession-sized space in my head

There is no grand failure of trust, villains, or 4D chess games here. Just people playing their cards as they could.

It was a culture clash. Pragmatics won.

I know people like Sam Altman. He controls information without giving it a thought. He represents other people’s opinions, modifying them to match his own. And he subtly manipulates events to his own benefit. And when being subtle doesn’t work, he gets angry. I know the type.

When ethics gets in the way of snake oil sales.

This was essentially in the New Yorker piece.
