Article updated - apparently Whisper uses a transformer model closely related to GPT-2. Hence the hallucinations.
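For anyone curious why a GPT-2-style decoder "makes stuff up": it generates the transcript one token at a time, and each token is drawn from a probability distribution shaped partly by the audio and partly by a language-model prior. When the audio is silence or noise, the prior is all that's left, so the model still emits fluent text. Here's a deliberately toy sketch of that failure mode (this is not Whisper's actual code; the bigram "language model" and token names are made up for illustration):

```python
# Toy sketch (NOT Whisper's real implementation): why a GPT-2-style
# autoregressive decoder can "hallucinate" when the audio gives it nothing.
# The decoder greedily picks the most likely next token; with no acoustic
# evidence, that choice comes entirely from the language-model prior,
# so fluent text comes out anyway.

LM_PRIOR = {  # hypothetical bigram language model learned from text
    "<start>": "take",
    "take": "two",
    "two": "tablets",
    "tablets": "daily",
    "daily": "<end>",
}

def decode(acoustic_evidence, max_tokens=10):
    """Greedy autoregressive decoding: prefer the audio-conditioned
    prediction when one exists, otherwise fall back to the LM prior."""
    tokens, prev = [], "<start>"
    for _ in range(max_tokens):
        # With real audio, the acoustic model overrides the prior;
        # with silence/noise, only the prior is left to follow.
        nxt = acoustic_evidence.get(prev) or LM_PRIOR.get(prev)
        if nxt is None or nxt == "<end>":
            break
        tokens.append(nxt)
        prev = nxt
    return " ".join(tokens)

# Silence (empty evidence): a confident, fluent, entirely invented sentence.
print(decode({}))  # -> "take two tablets daily"

# Clear audio: the acoustic evidence drives the output instead.
print(decode({"<start>": "patient", "patient": "is",
              "is": "stable", "stable": "<end>"}))  # -> "patient is stable"
```

Real Whisper is vastly more complicated (cross-attention over audio features, beam search, temperature fallback), but the basic pressure is the same: the decoder is built to keep producing plausible text, not to say "I heard nothing."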
I will never get tired of that saltman pic.
Computers hallucinating medicine into your diagnosis! Tonight on Sick Sad World!
That’s what you get when you demand results regardless of whether there is a result to share.
Wonder how that might apply to the rest of the economy…
This is a malpractice lawsuit waiting to happen. And probably a product liability lawsuit, if this LLM’s hallucinations lead to someone getting hurt.