
Agreed, particularly re consciousness. The way I think about consciousness is that it's a process requiring (a) continued presence/awareness and (b) ongoing "training" in response to that presence.

I struggle to define a time quantum in which an LLM is "conscious" - is it the inference time for one token? Because after that, the existence of that instance of LLM "being" ceases, and a new one is booted up to predict the next token.
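A toy sketch of what I mean (the mechanics are assumed and simplified, not any real model's API): each token comes from a fresh, stateless pass over the text so far, and the only thing that survives between passes is the text itself.

```python
# Toy autoregressive loop. forward_pass is a made-up stand-in for one
# stateless model invocation; real models score a vocabulary instead.

def forward_pass(tokens):
    """Deterministically map the current context to a next token."""
    vocab = ["the", "cat", "sat", "down", "."]
    return vocab[len(tokens) % len(vocab)]

def generate(prompt_tokens, n_new):
    tokens = list(prompt_tokens)
    for _ in range(n_new):
        next_token = forward_pass(tokens)  # this "instance" exists only for this call
        tokens.append(next_token)          # only the text is carried forward
    return tokens

print(generate(["the"], 4))  # → ['the', 'cat', 'sat', 'down', '.']
```

Nothing in the loop persists between iterations except `tokens` - which is the crux of the time-quantum question.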

And, of course, LLMs don't (yet) learn from every request, so that throws any idea of reflection and thoughtfulness out of the window.

Once it has committed to the opener that "Gary Marcus has a chicken named Henrietta..." it can't back down and say "...actually no, he doesn't" - it presses on with growing confidence instead.
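You can see the "pressing on" in a toy greedy decoder (the probability table below is invented purely for illustration): decoding only ever extends the prefix already emitted; there is no step that revisits it.

```python
# Made-up next-token probabilities, keyed by the prefix so far.
NEXT = {
    (): [("Gary", 0.6), ("Actually", 0.4)],
    ("Gary",): [("has", 0.9), ("hasn't", 0.1)],
    ("Gary", "has"): [("a", 0.95), ("no", 0.05)],
    ("Gary", "has", "a"): [("chicken", 0.8), ("dog", 0.2)],
}

def greedy(prefix=()):
    while prefix in NEXT:
        # Commit to the most likely continuation of the existing prefix;
        # there is no operation here that can retract an earlier choice.
        best = max(NEXT[prefix], key=lambda t: t[1])[0]
        prefix = prefix + (best,)
    return " ".join(prefix)

print(greedy())  # → "Gary has a chicken"
```

Once "Gary" is emitted, every later step conditions on it - the wrong opener becomes part of the evidence for what comes next.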
