Discussion about this post

Srini Pagidyala:

Yes, this is an inherent limitation of LLMs’ frozen architecture.

LLMs can’t learn incrementally in real time and update their model; they’re batch-trained periodically on snapshots of internet data.

Between training cycles they’re stuck with whatever was in the prior batch, so by definition they’re not current.
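To make the point concrete, here is a minimal PyTorch sketch (my own illustration, with a toy linear layer standing in for an LLM): a deployed model runs inference with gradients disabled, so nothing it sees at serving time changes its weights.

```python
# A minimal sketch (illustrative, not any vendor's code) of why a
# deployed LLM is "frozen": inference runs with gradients disabled,
# so serving-time inputs never update the weights.
import torch
import torch.nn as nn

# Toy stand-in for a trained LLM; the real thing has billions of parameters.
model = nn.Linear(16, 16)
model.eval()                      # inference mode
for p in model.parameters():
    p.requires_grad_(False)       # weights are read-only at serving time

snapshot = [p.clone() for p in model.parameters()]

with torch.no_grad():             # no gradient tracking during inference
    for _ in range(1000):         # serve 1000 "queries"
        _ = model(torch.randn(1, 16))

# Weights are bit-for-bit identical: serving taught the model nothing.
assert all(torch.equal(a, b) for a, b in zip(snapshot, model.parameters()))
```

Updating the model requires a separate offline training run on a new data batch, after which a new checkpoint is deployed; that gap between runs is exactly the staleness described above.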

Great timing in posting this, Gary; it makes the limitation obvious to everyone, without any argument needed. Thanks.

Aaron Turner:

The AI field's primary achievement over the last decade has been to build a trillion-dollar Chinese Room possessing at most trivial machine cognition (and all the hype has been mere kabuki theatre!).

