A chatbot used by Air Canada hallucinated an inaccurate bereavement discount policy, and a customer who flew based on this information sued the company over being misled. The small claims court sided with the deceived customer, ruling that the chatbot was acting as an official agent of Air Canada, and that there was no reason a customer should have to double-check information from one part of the Air Canada website against other parts of the same website.
HAHAHAHAHAH fucking amazing
Oh please, I really hope we get more stuff like this. Nothing will kill this fad faster than companies realising they’ve been swindled by techbros threatening them with FOMO, and that this algorithm bullshit won’t actually do anything useful for them.
They’ll kill the ability to sue for damages before sacrificing the cash cow
Then I’m not going to talk to them. If the information they give me may be incorrect and not binding, then what’s the point?
It’s not a “hallucination”, you dorks, it’s a random number generator that you used to replace labor.
I like to call it a hallucination, because while, yes, the thing isn’t smart enough to experience thoughts, it really gets at how absolutely unreliable these things are. People are talking to the thing and taking it seriously, and it’s just watching the pink dragons circle around.
Yeah, why is it a “hallucination” when the AI just makes shit up, but when a person does it they’re either lying or just plain wrong?
According to Air Canada, Moffatt never should have trusted the chatbot and the airline should not be liable for the chatbot’s misleading information because Air Canada essentially argued that “the chatbot is a separate legal entity that is responsible for its own actions,” a court order said.
Prepare for more of that, applied to weaponized drones