Actually, let me say at the outset that my title is a literary stretch. Although this is a Dickensian tale of two who don’t tell the truth, only one of them lies; the other doesn’t.
The first is George Santos. His lies are legendary. His lies just put him in the jackpot, with the House Ethics Committee saying there is clear evidence he committed serious crimes. (Two of his aides have pleaded guilty to fraud.)
The other is … GPT. GPT’s “lies” are also legendary, and prolific, but they’re not really lies, because real lies intend to deceive, and GPT has no intent. Some people have drinking problems; GPT has a truthiness problem. But it’s not deliberate; it’s because it doesn’t know any better.
§
New York Magazine has a fun story on Santos’ origin stories, portraying him as having perfected the art of the lie while working at a call center, selling services to customers who maybe didn’t really need what he was selling. It’s a tale of avarice.
GPT’s origin story is different; GPT isn’t trying to make money, nor to become popular. It just is. It autocompletes sentences; some of those completions are true, given that the training data are so vast, and some aren’t, given that the training data aren’t infinite.
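The autocomplete point can be made concrete with a deliberately tiny sketch. The bigram model below is a toy of my own devising, not how GPT actually works (GPT is a neural network over tokens, not a word-pair counter), but it shows the essential property: the model completes a prompt with whatever is statistically plausible in its training text, with no machinery anywhere for checking whether the completion is true.

```python
import random
from collections import defaultdict

# Toy "autocomplete": a bigram model that predicts the next word purely
# from word-pair frequencies in its training text. Nothing here checks
# truth; the model only knows what strings tend to follow other strings.
# (Hypothetical illustration only; real LLMs are neural nets over tokens.)

training_text = (
    "the capital of france is paris . "
    "the capital of france is lyon . "   # a falsehood present in the data
    "the capital of france is paris . "
)

# Count how often each word follows each other word.
counts = defaultdict(lambda: defaultdict(int))
words = training_text.split()
for w1, w2 in zip(words, words[1:]):
    counts[w1][w2] += 1

def complete(prompt, n_words=1, seed=0):
    """Extend the prompt by sampling next words proportional to bigram counts."""
    rng = random.Random(seed)
    out = prompt.split()
    for _ in range(n_words):
        options = counts[out[-1]]
        nxt = rng.choices(list(options), weights=list(options.values()))[0]
        out.append(nxt)
    return " ".join(out)

print(complete("france is"))   # "paris" or "lyon": plausible, not verified
print(complete("capital of"))  # always "france", because the data say so
```

Because “lyon” appears once after “is” in the training text, the toy model will sometimes emit the falsehood with exactly the same confidence as the truth. Scale that up by a few trillion tokens and you have GPT’s truthiness problem.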
It’s fun to compare the two—they have so many differences—even though neither can be trusted. All of which serves as a firm reminder of one of the most important adages in psychology (a field in which I worked for decades): any one behavior can develop in many different ways. GPT’s mistruths can look a lot like Santos’ lies, but that doesn’t mean the machinery in Santos’ brain has anything to do with GPT’s GPUs.
§
But, man, I stopped cold at this little bit:
To a man recently steeped in GPT, that part—the excuses—sounds incredibly familiar.
Gary Marcus has been studying both natural and artificial intelligence for decades, and always gets a kick out of comparing the two.
To communicate, interlocutors need to share some context. To lie, the liar needs a private context unknown to the other party. If the lie is about the shared context and can easily be called out, it is not a lie; it is stupidity.
When someone says GPT is lying, they are granting it a distinctly human trait. A trait it clearly does not have.
I'd suggest that both Santos and GPT had faulty training data. Maybe Santos believes that American business is based on lies, so he feels no guilt when he lies. I won't deal with companies that lie to me, but that is another subject.
GPT seems capable of producing good code; perhaps that’s because there is less trash code on the Internet.
For other subjects, the amount of careless data, incompetent ramblings, and outright lies is much greater. GPT does not have the ability to discern the validity of that data, and that is the source of its less-than-accurate results.