Google returns sources that you can evaluate for accuracy.
ChatGPT just says things.
Every output of ChatGPT should end with “source: just trust me bro”.
ChatGPT said things you can evaluate, which I did by googling them. And when I could not find the event in question, I went back into Lemmy and asked for more information. So tell me where I erred? Was it not taking the poster’s word on it? Or trying to get context in the first place?
You fucked up twice.
Once when you flat out failed to find anything using Google, when other people clearly had no trouble at all. If you’re telling the truth, this just means you suck at Google. There’s no reason to be googling ChatGPT’s hallucinations instead of searching for the stuff an actual human told you about.
The second time was when you took ChatGPT seriously. Just don’t. It’s a very expensive toy that occasionally does something cool. We’re still trying to figure out whether it’s actually useful for anything, or just really good at appearing useful.
OK, one: chill the fuck out.
Two: when Google did not return anything useful, for WHATEVER REASON, I didn’t come back and assume the event didn’t happen, I asked for MORE info, like a good little netizen.
Three: the event ChatGPT referenced was NOT a hallucination: https://slate.com/news-and-politics/2002/10/paul-wellstone-s-memorial-service-turns-into-a-pep-rally.html Surprise! When I looked up Paul Wellstone and filming at a memorial, this is the event I found on the first page.
Four: I only brought up ChatGPT because of just how uncharacteristically it shut down my query. So I did my due diligence. Chill out.
You can retrieve sources from ChatGPT. And that is beside the point that I didn’t simply rely on GPT. Even without prompting, I did my own digging on Google, found his wiki page, looked up articles about Paul and filming at a memorial, and only found the incident from 2002. That’s two more paths to sources that failed me.
ChatGPT is a tool that is useful if used right, but even I did not take its word for it.