Actually, let me say at the outset that my title is a literary stretch. Although this is a Dickensian tale of two who don’t tell the truth, only one of them lies; the other just doesn’t tell the truth.
The first is George Santos. His lies are legendary. His lies just put him in the jackpot, with a House Ethics Committee report saying there is clear evidence he committed serious crimes. (Two of his aides have pleaded guilty to fraud.)
The other is … GPT. GPT’s “lies” are also legendary, and prolific, but they aren’t really lies, because real lies intend to deceive, and GPT has no intent. Some people have drinking problems; GPT has a truthiness problem. But it’s not deliberate; it’s because it doesn’t know any better.
§
New York Magazine has a fun story on Santos’ origin stories, portraying him as having perfected the art of the lie while working at a call center, selling services to customers who maybe didn’t really need what he was selling. It’s a tale of avarice.
GPT’s origin story is different; GPT isn’t trying to make money, nor to become popular. It just is. It autocompletes sentences; some of those completions are true, because the training data are so vast, and some aren’t, because the training data aren’t infinite.
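The autocompletion point can be sketched with a toy model. (This is my own construction for illustration, nothing like GPT’s actual architecture: a bigram completer that picks each next word purely by training-data frequency. True and false continuations come out of exactly the same mechanism, because truth never enters the process.)

```python
import random
from collections import Counter, defaultdict

# Tiny "training data" containing both a true and a false statement.
corpus = (
    "the moon orbits the earth . "      # true
    "the moon is made of cheese . "     # false, but present in the data
).split()

# Count how often each word follows each other word.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def complete(word, steps=5, seed=0):
    """Autocomplete by sampling next words in proportion to frequency."""
    rng = random.Random(seed)
    out = [word]
    for _ in range(steps):
        options = counts[out[-1]]
        if not options:
            break
        words, weights = zip(*options.items())
        out.append(rng.choices(words, weights=weights)[0])
    return " ".join(out)

# Depending on the sample, the completion may be the true continuation
# ("orbits the earth") or the false one ("is made of cheese") — the
# model has no way to prefer one over the other.
print(complete("moon"))
```

The point of the sketch: nothing in the sampling loop checks facts; the model simply reproduces patterns in whatever it was trained on, which is why “mistruths” fall out of it without any intent.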
It’s fun to compare the two—they have so many differences—even though neither can be trusted. All of which serves as a firm reminder of one of the most important adages in psychology (a field in which I worked for decades): any one behavior can develop in many different ways. GPT’s mistruths can look a lot like Santos’ lies, but that doesn’t mean the machinery in Santos’s brain has anything to do with GPT’s GPUs.
§
But, man, I stopped cold at this little bit:
To a man recently steeped in GPT, that part—the excuses—sounds incredibly familiar.
Gary Marcus has been studying both natural and artificial intelligence for decades, and always gets a kick out of comparing the two.
To communicate, interlocutors need to share some context. To lie, the liar needs a private context unknown to the other party. If the lie is about the shared context and can easily be called out, it is not a lie; it is stupidity.
Yes, GPT is a liar and is dumb. Bard is worse. Still, it is very good at getting work done. For some intricate things I’d otherwise have to do manually, I now just ask the chatbot, and it gives good, detailed solutions, even with code.
This is the path forward: useful but imperfect tools that get smarter as people use them more and companies reinvest profits. Bottom-up to AGI.