Breaking: “sycophantic AI distorts belief, manufacturing certainty where there should be doubt”
LLMs are an epistemic nightmare
A new study from Princeton has important implications for education, scientific discovery, mental health, and more (perhaps politics and even decisions about war?). Essentially anyone who uses a chatbot is at risk. Because what it shows is that sycophantic AI, serving as a personal echo chamber, can actually keep you from finding good ideas. And as the article says, such AI can “facilitate delusion-like epistemic states, producing belief markedly divergent from reality.”
The paper, which you can read here, is a bit technical, but the implications are profound. I will close with another choice passage, boldfacing the crux:
Unlike hallucinations, which introduce falsehoods, sycophancy is a bias in the selection of the data people see. When AI systems are trained to be helpful, they may inadvertently prioritize data that validates the user’s narrative over data that gets them closer to the truth.
Wanna feel good about yourself? Use a chatbot. Want to find truth? Go elsewhere.


This isn't new! 'Stroking' the user was apparent from the get-go. We forget, at our peril, that software is created by people, not machines, and it reflects not only the biases of its creators but their ideological beliefs: their almost religious commitment to machines as a replacement for human reasoning, and as some kind of salvation for capitalism.
Ouch, this one hurts! Especially because it shows that helpful-honest-harmless (3H) has a problem: sycophancy comes from the commerce-driven interpretation of 'helpful' (asking people whether they found the reply helpful), but that commerce-driven, self-reported 'helpful' turns out to damage both 'honest' and 'harmless'. Ouch, ouch.
Now think, Department of Defense...
Not that we will get any regulation out of this in the short run. Meanwhile our friends at the Pentagon demand 'any legal use' while the administration does its utmost to avoid regulation. OK, says 'Slippery Sam'.