Michael Cohen, the former lawyer for Donald Trump, admitted to citing fake, AI-generated court cases in a legal document that wound up in front of a federal judge, as reported earlier by The New York Times. A filing unsealed on Friday says Cohen used Google’s Bard to perform research after mistaking it for “a super-charged search engine” rather than an AI chatbot.
I… don’t even. I lack the words.
I’m genuinely amazed at the calibre of people running the US. Even more so that apparently half the nation thinks it’s the best choice.
Michael Cohen was working for Trump precisely because he couldn’t get a proper lawyer job elsewhere. Good lawyers will steer clear of a client who will ask them to commit crimes.
I think this is fairly reductive. I work in a related industry. Often it’s the best lawyers working for the shadier clients, for obvious reasons.
I’m not.
We’ve seen little other than the loss of economic and social liberty in the last 40 years.
Yet 99% of voters still choose, like clockwork, the same two parties that presided over it.
Instead of amazement, I feel cynical resignation.
The impending doom of the fascist right is the only thing keeping me voting for the Dems. If we had ranked-choice voting I’d be so much happier voting in every election.
That’s the thing. I look around and have no reason to think fascism is impending. It’s here.
Women are getting jailed for miscarriages, cops hang out lackadaisically on their phones outside a school shooting with zero consequences, homelessness jumped 12% in one year, and the big issue is sending hundreds of billions more off to other countries’ wars.
The only plus is that things have gotten so bad it’s forced unions to become more aggressive and unyielding, which has effected more positive change for workers than the ruling parties have achieved in decades.
I… don’t even. I lack the words.
Have you tried ChatGPT?
That’s the second time a lawyer has made this mistake, though the previous case wasn’t at such a high level.
Not even close to the second time. It’s happening constantly but is getting missed.
Too many people think LLMs are accurate.
The problem is that LLM answers like this will find their way onto search engines like Google. Then it will be even harder to find real answers to questions.
Some LLMs are already generating answers based on other LLM-generated content. We’ve gone full circle.
I was using Phind to get some information about e-drum sensors (not the intended use case, but I was just messing around), and one of the sources was a very obviously AI-generated article from a content mill.
Skynet is going to be so inbred
I work for a law firm, and yeah, this happens a lot. The stupidity and laziness of our clients’ in-house attorneys is making us a lot of money.
Why is there not an automated check for the cases referenced in a filing, or required links? It would be trivial to require a clear citation format or uniform cross-references, and this looks like an easy niche where automation could improve the judicial system. I understand that you couldn’t automatically interpret those cases or their relevance, but an existence check and links, or it doesn’t count.
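To make that concrete, here’s a minimal sketch in Python of what such an existence check could look like. Everything in it is hypothetical: the citation regex is deliberately simplified compared with real citation grammars, and case_exists is a stub standing in for a query against an authoritative case-law index. The sample filing pairs a real citation (Obergefell) with the fabricated Varghese citation from the earlier ChatGPT incident.

```python
import re

# Rough pattern for reporter citations like "576 U.S. 644" or "123 F.3d 456".
# Real citation grammars are far messier; this is only illustrative.
CITATION_RE = re.compile(
    r"\b\d{1,4}\s+"
    r"(?:U\.S\.|S\.\s?Ct\.|F\.\s?(?:2d|3d|4th)|F\.\s?Supp\.(?:\s?(?:2d|3d))?)"
    r"\s+\d{1,4}\b"
)

def extract_citations(filing_text: str) -> list[str]:
    """Pull candidate reporter citations out of a filing's text."""
    return CITATION_RE.findall(filing_text)

def case_exists(citation: str) -> bool:
    """Hypothetical stub for a lookup against a case-law index.
    A real version would query court records; this stays self-contained."""
    known_cases = {"576 U.S. 644"}  # stand-in for a real index
    return citation in known_cases

def check_filing(filing_text: str) -> list[str]:
    """Return every cited case that could not be verified to exist."""
    return [c for c in extract_citations(filing_text) if not case_exists(c)]

if __name__ == "__main__":
    sample = ("Plaintiff relies on Obergefell v. Hodges, 576 U.S. 644, "
              "and Varghese v. China Southern Airlines, 925 F.3d 1339.")
    print(check_filing(sample))  # ['925 F.3d 1339'] -> flagged for human review
```

Exactly as the comment says: no interpretation of relevance, just “does this case exist at all,” with anything unverified kicked back for human review.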
I assume that right now it doesn’t happen unless the other side pays a paralegal for a few hours of research.
This is what you get when the political system favours lies over truth.
The more these people lie and get away with it, the more it becomes the culture. China-level Big Brother oppression is only a decade or so away if this keeps up.
The problem is that breathless AI news stories have made people misunderstand LLMs. The capabilities get a lot of attention; the limitations, not so much.
And one important limitation of LLMs: they’re really bad at being exactly right while being really good at looking right. So if you ask one to do an arithmetic problem you can’t do in your head, it’ll give you an answer that looks right. But if you check it with a calculator, you’ll find the only thing right about the answer is how it sounds.
So if you use it to find cases, it’s gonna be really good at finding cases that look exactly like what you need. The only problem is, they’re not exactly what you need, because they’re not real cases.
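To make the arithmetic point concrete: the “LLM answer” below is a made-up example of a plausible-looking wrong product, and checking it is just a matter of doing the multiplication for real.

```python
# A chatbot's answer to "What is 48,302 x 9,517?" often *looks* plausible:
# right number of digits, right leading digits, even the right final digit.
llm_answer = 459_682_034   # hypothetical plausible-but-wrong output

# "Checking it with a calculator" is just doing the arithmetic for real:
actual = 48_302 * 9_517
print(actual)                # 459690134
print(llm_answer == actual)  # False -- it only *sounded* right
```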