Death of a Chatbot, body blow to an industry
What happens to chatbots if the companies that use them for customer service are held responsible for the things those bots say?
Breaking news:
In the tech industry, this should be the top story.
In brief, an Air Canada chatbot confabulated (“hallucinated”) an answer – telling a customer that bereavement trips would be refunded. The customer kept a screenshot, and demanded the refund that had been promised.
Air Canada tried to throw the chatbot under the bus, arguing that "the chatbot is a separate legal entity that is responsible for its own actions."
The tribunal judge didn’t buy it.
Nor should he have.
If other judges rule the same way and hold companies responsible for the nonsense their chatbots say, one of the biggest alleged use cases for large language models could dry up, fast.
Gary Marcus has been warning about the fundamental untrustworthiness of LLM-powered chatbots for quite some time.
“the chatbot is a separate legal entity that is responsible for its own actions” 😂
That's rich. Thank you for my laugh of the morning.
(Has the chatbot hired its own lawyer? 😆).
This is priceless. Just wait until the Supreme Court rules that chatbots can donate to politicians and buy elections, maybe even run for Congress!
Wait a minute, I'm pretty sure some of those folks now in congress ARE chatbots!