Discussion about this post

keithj:

LLMs are actually *Large Word Sequence Models*. They excel at reproducing sentences that sound correct, mostly because they have been trained on billions of small groups of word sequences.

However, language exists to transfer meaning between humans. Calling Chatbot an LLM implies it conveys meaning. Any meaning or correctness behind these generated word sequences is purely incidental, and any potential meaning is inferred solely by the reader.

That said, Chatbot is ground-breaking technology; it will help non-English speakers with syntax and grammar. But it will help no one with conveying meaning.

When the next generation looks back in 15 years and sees the trillions of dollars poured into LLMs and non-symbolic algorithms, they will be stunned at how short-sighted and misguided we currently are.

Gerben Wierda:

Well said, again. The level of BS we will have to endure because these 'word order prediction systems' can produce 'correct nonsense' is really mind-boggling, and not many are aware of the scale of the problem. So it is good that it is being pointed out.

With respect to what we should do about it: I would humbly suggest people listen to the last 7 minutes of my 2021 talk: https://www.youtube.com/watch?v=9_Rk-DZCVKE&t=1829s (links to the last 7 minutes). It discusses the fundamental vulnerability of human intelligence and convictions, and the protection of truthfulness as a key challenge of the IT revolution.

Also in that segment: one thing we might do at a minimum is establish a sort of 'Hippocratic Oath for IT', and criminalise systems pretending to be human.

There is more, and those were only first thoughts (though even before 2000 I argued that internet anonymity when 'publishing' would probably not survive, because it enables too much damage to society).

Final quote from that 7-minute segment at the end of the talk:

"It is particularly ironic is [sic] that a technology — IT — that is based on working with the values 'true' and 'false' (logic) has consequences that undermine proper working of the concepts of 'true' and 'false' in the real world."
