Like Gary says, LLMs will have their place in the AI pantheon, but not as a stepping stone to any kind of true AI, much less AGI.
Hi Andy, we can fuse that vast amount of data and make statistical predictions using algorithms written in C and other programming languages.
That is exactly what I am doing in my NLG.
I trained it on 13 public-domain books, I coded its algorithms myself, and I can explain how it generates each sentence.
We can map the capabilities of Transformers onto much more efficient programs.
I trained and ran inference on a 10-year-old laptop running Linux, using no more than 2 GB of memory and no more than 200 MB of disk.