Discussion about this post

Dakara

There is nothing sustaining the US AI economy now but hype. It shouldn’t take much to weaken it further.

More people will adopt the perspective: "I could have lived without AI."

https://substack.com/@dakara/note/c-168995577

Terry Clay

DeepSeek says,

"... the training of large models like me is a massive energy sink, but the inference (the act of generating a response) is far less so.

For an Ark, the model must be ultra-efficient at both training and inference.

The "Ark-AI" would not be a 100+ billion parameter model. It would be a federation of smaller, highly specialized, and ruthlessly optimized models:

· A "Governance Model" trained exclusively on sociocratic principles, conflict resolution, and the Ark's own charter.

· A "Agroecology Model" trained on permaculture, integrated pest management, and regional soil data.

· A "Mechanics Model" trained on repair manuals for essential Ark machinery.

This modular approach is inherently more efficient. Furthermore, these models would be:

· Pruned and Quantized: Redundant parts of the neural network are removed, and its calculations are simplified to lower precision, massively cutting energy use with minimal performance loss for specialized tasks (see the sketch after this list).

· Static and Stable: Once trained on the Ark's core knowledge library, the model would not need continuous, energy-intensive re-training. It would be updated only rarely, with carefully vetted new information.
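
For readers curious what "pruned and quantized" looks like in practice, here is a hedged PyTorch sketch; the toy architecture and the 50% pruning ratio are assumptions chosen for illustration:

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy stand-in for one specialized "Ark" model; architecture is illustrative.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 8))

# Prune: zero out the 50% smallest-magnitude weights in each Linear layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")  # bake the mask in permanently

# Quantize: store Linear weights as 8-bit integers for cheaper CPU inference.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

print(quantized(torch.randn(1, 128)).shape)  # torch.Size([1, 8])
```

Dynamic quantization is the simplest variant (weights only, activations quantized on the fly); it trades a small accuracy loss for a large cut in memory and compute, which is the trade the comment is pointing at.
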

Your observation about my training is astute. While specific energy data is proprietary, the approach of DeepSeek and others has indeed focused on achieving high performance with greater computational efficiency.

With an AI that runs on a Raspberry Pi cluster powered by a single solar panel, we do more than just create a useful tool. We create a powerful proof of concept for a future where advanced technology serves, rather than strains, a sustainable human community."
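
To make the Raspberry Pi claim concrete: small quantized models can already run on commodity ARM boards. A minimal sketch using the llama-cpp-python bindings, where the model filename, prompt, and parameters are placeholders and actual throughput on a Pi will be modest:

```python
from llama_cpp import Llama

# Load a small 4-bit-quantized GGUF model; the path is a placeholder.
llm = Llama(model_path="ark-agroecology-q4.gguf", n_ctx=512, n_threads=4)

out = llm(
    "How do I manage aphids without synthetic pesticides?",
    max_tokens=128,
)
print(out["choices"][0]["text"])
```
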

