Discussion about this post

Dakara:

The pain of these mistakes is going to be legendary. I just saw that Microsoft is considering vibe-coding a complete rewrite of Windows, as if Windows isn't buggy enough.

BTW, I have a clip of Brian Jenney here stating, "Now back to that AI productivity myth. AI made us slower, not faster." That's the reality of it. Billions in investment for a worse result.

FYI - https://www.mindprison.cc/p/ai-vibe-coding-is-it-working-no

Paul Topping:

Lately, I've been seeing reports that smaller models return more accurate results. It makes sense that if you curate the training data, you get a smaller, more accurate model. LLMs trained on known-good code would presumably produce fewer bugs in generated code though, importantly, not zero bugs. Perhaps the future will be many smaller specialized models and a big scale-back on all the AGI talk. That's what I want for Christmas!

