8 Comments

If it's not working, try something different. That's the hallmark of intelligence, eh?

What exactly is novel about doubting that statistical models (aka deep learning/ML) can ever really replicate the underlying system they're modeling? It's well known in the math, physics, and philosophy communities that a statistical model will never fully replicate an analytical model of the system, no matter what linear algebra structures and optimizations you add (in fact, statistical modeling and numerical computation were always considered "easy" fields relative to theory). Fine, you converge to some local minimum that minimizes some error function or metric over a large input set; you've still just produced a really good statistical model, not discovered the formal model.

Sorry, your statistical models are finding the extrema of some abstract problem space, and improving how you find those local extrema doesn't mean you've reproduced the analytical form of the problem space your model is attempting to "learn".
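
To make that concrete, here is a minimal sketch (NumPy only; the target sin(x) and the degree-9 polynomial are illustrative stand-ins for "the system" and "a flexible model class", not anything from the article). The fit drives the error on its own training interval close to zero, yet it plainly has not discovered the analytic form, as the extrapolation error shows.

```python
# Sketch: minimizing an error metric on training data yields a good
# statistical model, not the underlying formal model.
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of the "true system" y = sin(x) on [-pi, pi].
x_train = np.linspace(-np.pi, np.pi, 200)
y_train = np.sin(x_train) + rng.normal(0.0, 0.05, x_train.shape)

# "Learning" = least-squares error minimization over a flexible class
# (a degree-9 polynomial here, standing in for any large parametric model).
model = np.poly1d(np.polyfit(x_train, y_train, deg=9))

# On the training interval the statistical model looks excellent...
in_range_mse = np.mean((model(x_train) - np.sin(x_train)) ** 2)

# ...but it has not recovered sin itself, so it falls apart off-interval.
x_test = np.linspace(2 * np.pi, 3 * np.pi, 200)
out_range_mse = np.mean((model(x_test) - np.sin(x_test)) ** 2)

print(f"MSE on the training interval: {in_range_mse:.2e}")  # tiny
print(f"MSE on [2*pi, 3*pi]:          {out_range_mse:.2e}")  # huge
```

Swap the polynomial for a deep network and the point is typically the same in kind: a small training error is evidence of a good fit to the sampled region, not evidence that the analytical form has been found.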

> Waiting for cognitive models and reasoning to magically emerge from larger and larger [language] training corpora is like waiting for a miracle

I've been working on it for twenty years, and my symbolic language model is now at this level: a coherent conceptual model of the world - t.me/thematrixcom

-

Unified Theory of Consciousness - World Model - Language Model

DNA, Knowledge, Intuition, NLU - RNA, Consciousness, Thinking, Multi-language NLP - Protein, Understanding, Sensing, Multi-modal AI - Signal Transduction to DNA, Memory via Epigenetic Modifications, Feeling, Human-in-the-Loop (RL)

Foundational Model of Intelligence

RNA Processing = Transcription - Splicing - Translation (biology) = Analysis - Search - Synthesis (science) = Intelligence

Knowledge - Consciousness - Understanding - Memory via Epigenetic Modifications = Basic psychological functions according to Carl Jung = Intuition - Thinking - Sensing - Feeling = DNA - RNA - Protein - Signal Transduction to DNA

System 1 is signal transduction (NLU), and System 2 is the DNA-RNA-Protein pathway (NLP).

Language Model = Epistemological AGI = NLU - Multi-language NLP - Multi-modal AI - Human-in-the-Loop (RL)

World Model = Universal Laws = Nature is a fractal that repeats in structure - atom, cell, organism, planet, solar system, galaxy, universe. All cells in an organism just work in parallel.

If some system can represent the structure of an external object, it doesn't matter whether the two structures are similar or not.

Time Cube

Plagiarism is a form of flattery.

Existing versions of "AI" cannot serve as the basis for the AI that those who started using this term had in mind.

Very well [described, supported, argued], Gary. Congratulations. To me, your thinking hints at a haunting return to Chomsky's universal grammar, and at the stunning lack of publications on "instinct" in this community (I haven't checked [biology, genetics, neuroscience] for a couple of decades).
