30 Comments

Implausible deniability. “It wasn’t welded down, so it can’t be theft.”

It is sad that most people do not turn the pages on the half-truths and lies that they tell openly. The media companies are eating it up and are likely in cahoots with them!

Thank you so much for exposing their BULLSHIT!

I remember being told about them when they started as a possibly good place to work. I looked at them and said no. That non-profit stuff was bogus from the start. I expected legal trouble. Well, I was wrong. They avoided that, and people became rich. I'm glad I said no regardless. I have a life.

OpenAI pretended to be "open" and "research-friendly" until they realized they had found something incredibly valuable. Then, with Microsoft's help, it all became about ROI.

I get it. But they should stop pretending...

Reading this, and the linked convo between Gates and Altman, I can't help rephrasing Conrad's exclamation, "The hubris! The hubris!"

tl;dr:

Our system won't work, and more importantly we won't make any money, if we can't steal other people's work; but we can't admit that, because we'd be sued into oblivion.

This largely reflects the hopeless inefficiency of these learning models. They have to hoover up enormous quantities of data because nobody has yet figured out how to learn this stuff efficiently.

"Other than that, Mrs. Lincoln, how was the play?"

Would be interesting to see BitTorrent traffic into OpenAI's IP addresses.

No, we can't tell you what we trained it on, because we know we weren't allowed to take it. And besides, that would make it easier for you to know that the emergent capabilities we are claiming are really just data leakage. Look, shiny!

Is this the worst example of misdirection and misleading "openness" from modern companies? It certainly seems to be one of the worst for some years, other than perhaps FTX and Theranos.

I've been wondering a lot lately about which products I use are training these models. So many of their licenses allow my data/usage to be used for "internal purposes". For example, are all our Google Meet calls being used to train Gemini, or when will they be, and would Google even have to notify us if that were the case? Would training Gemini on them be equivalent to recording a meeting without consent, or do they (likely) see this as different?

Follow the money, grasshopper. Because it's a learning curve until it's an invoice, then it's a mistake.

This emperor, on the other hand, is almost *all* clothes.

I thought you had a positive one coming next, @Gary Marcus, or was that the best one can say at this point? 😁

The positive one got preempted by the OpenAI news!

No news, no money. Have to crank the handle.

Looking forward to the bigger piece.

"Publicly available" is a totally legit answer. A Google search result provides me an answer to my query without my needing to click on the source (if it is a legit enough source). How is this different (except that there could be some hallucination thrown in, making click-and-verify even more important)?
