[Comment removed, Nov 10, edited]

Thanks for your input, Andy.

Transformer-based LLMs are definitely worth remembering as we move forward with AI R&D, but they should not be treated as a stepping stone; that is my opinion, and it is what I practice in my own work.

Learning about Transformers gave me the idea of algorithmically correlating words with each other.

The simplest useful relationship can be expressed as a trigram: previous word, current word, and next word.

Using those trigrams, a Natural Language Generator (NLG) that I wrote produces grammatically correct sentences that are not in the training set, such as the ones below:

====== NLG-generated ==========

I want to speak more.

I want to change her tune.

================
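
In code, the core idea looks roughly like this. This is a toy sketch, not my actual program: the fixed-size tables, the whitespace tokenizer, and the tiny two-sentence corpus are simplifications for this comment only.

====== C sketch (illustrative) ======

/* Toy trigram generator: collect (prev, cur, next) triples from a
   token stream, then extend a seed pair by picking a stored next word. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

#define MAX_TRIGRAMS 100000
#define MAX_WORD 32

struct trigram { char prev[MAX_WORD], cur[MAX_WORD], next[MAX_WORD]; };

static struct trigram table[MAX_TRIGRAMS];
static int ntri = 0;

/* Record every consecutive word triple in a whitespace-tokenized text. */
static void train(char *text)
{
    char *w, *prev = NULL, *cur = NULL;
    for (w = strtok(text, " \n"); w != NULL; w = strtok(NULL, " \n")) {
        if (prev && cur && ntri < MAX_TRIGRAMS) {
            strncpy(table[ntri].prev, prev, MAX_WORD - 1);
            strncpy(table[ntri].cur,  cur,  MAX_WORD - 1);
            strncpy(table[ntri].next, w,    MAX_WORD - 1);
            ntri++;
        }
        prev = cur;
        cur = w;
    }
}

/* Starting from a seed (prev, cur), repeatedly pick a random trigram
   whose first two words match and emit its third word. */
static void generate(const char *prev, const char *cur, int max_words)
{
    printf("%s %s", prev, cur);
    for (int i = 0; i < max_words; i++) {
        int candidates[64], nc = 0;
        for (int t = 0; t < ntri && nc < 64; t++)
            if (!strcmp(table[t].prev, prev) && !strcmp(table[t].cur, cur))
                candidates[nc++] = t;
        if (nc == 0)
            break;                      /* dead end: no continuation seen */
        const char *next = table[candidates[rand() % nc]].next;
        printf(" %s", next);
        prev = cur;
        cur = next;
    }
    printf("\n");
}

int main(void)
{
    char corpus[] = "i want to speak more .\n"
                    "i want to change her tune .";
    srand((unsigned)time(NULL));
    train(corpus);
    generate("i", "want", 8);           /* e.g. "i want to change her tune ." */
    return 0;
}

================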

There is too much overhead and uncertainty in GenAI's Transformers to make them a foundation for future AI R&D.

[Comment removed, Nov 10]

As Gary says, LLMs will have their place in the AI pantheon, but not as a stepping stone to any kind of true AI, much less AGI.

Expand full comment
Comment removed
Nov 10
Comment removed
Expand full comment

Hi Andy, we can fuse that vast amount of data and make statistical predictions using algorithms written in C and other programming languages.

That is exactly what I am doing in my NLG.
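
By "statistical predictions" I mean nothing more exotic than relative frequencies: given the pair (previous word, current word), each candidate next word is scored by how often it followed that pair in the training text. A toy illustration with made-up counts, not numbers from my actual training set:

====== C sketch (toy probabilities) ======

/* "Statistical prediction" as relative frequency: score each seen next
   word by how often it followed the pair (prev, cur) in training. */
#include <stdio.h>

int main(void)
{
    /* Made-up counts for the pair ("want", "to"), purely illustrative. */
    const char *next_word[] = { "speak", "change", "go" };
    int counts[] = { 4, 3, 1 };
    int total = 0;

    for (int i = 0; i < 3; i++)
        total += counts[i];
    for (int i = 0; i < 3; i++)
        printf("P(%s | want, to) = %d/%d = %.2f\n",
               next_word[i], counts[i], total, (double)counts[i] / total);
    return 0;
}

================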

I trained it on 13 public-domain books, I coded its algorithms myself, and I can explain how it generates each sentence.

We can map the capabilities of Transformers into much more efficient programs.

Both training and inference run on a 10-year-old laptop running Linux, using no more than 2 GB of memory and no more than 200 MB of disk.
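
A rough back-of-envelope suggests why that is enough. The per-book word count below is an assumption for illustration, not a measurement: 13 books at roughly 100,000 words each is about 1.3 million running words, so at most about 1.3 million trigram records before de-duplication.

====== C sketch (illustrative sizing only) ======

/* Back-of-envelope: assumed corpus size, not a measurement of my system. */
#include <stdio.h>

struct record {            /* one stored trigram, fixed-width for simplicity */
    char prev[32], cur[32], next[32];
    unsigned count;        /* how many times this triple was seen */
};

int main(void)
{
    unsigned long records = 13UL * 100000UL;    /* assumed ~100k words/book */
    unsigned long bytes = records * sizeof(struct record);
    printf("%lu records, about %.0f MB\n", records, bytes / 1e6);
    /* ~1.3 million x 100 bytes = ~130 MB: under 200 MB of disk and far
       under 2 GB of memory, even before de-duplicating repeated trigrams. */
    return 0;
}

================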


Plus, I don't need to start over; I've been working on my AI R&D for a year now.
