US Congress proposed bill to allow AI to prescribe drugs and medical treatment
Original post from the Fuck AI community: https://lemmy.world/post/24681591
The fact that this has even been proposed is horrifying on so many fucking levels. Technically it has to be approved by the state involved and the FDA, but opening this door even a crack is so absurdly out of touch with reality.
This is the danger with AI. Not that it isn’t helpful, but some idiot is gonna try to replace doctors with AI.
Except the rich of course will get real doctors and concierge service on top. They’re trying to kill off the rest of us I swear to god.
AI = austerity. Replacing creaking but functional systems with crap that doesn’t work is a little bit cheaper, and the money goes to the right people (billionaires) instead of the wrong people (doctors, nurses, cleaners, admin).
AI can’t even make an edible pizza. The last thing I need is an AI-generated script.
A WELL TRAINED AI can be a very useful tool. However, the AI models that corporations want to use aren’t exactly what I’d call “well trained” because that costs money. So they figure “we’ll just let it learn by doing. Who cares if people get hurt in the meantime. We’ll just blame the devs for it being bad.”
Edit: to add, this is partly why AI gets a bad rap from folks on the outside looking in. Corporations institute barebones, born-yesterday AI models that don’t know their ass from their elbow because they can’t be bothered to pay the devs to actually train them, but when shit goes south they turn around and blame the devs for a bad product instead of admitting they cut corners. It’s China Syndrome but instead of nuclear reactors it’s AI.
A WELL TRAINED AI can be a very useful tool.
please do elaborate on exactly what kind of training turns the spam generator into a prescription-writer, or whatever other task that isn’t generating spam
Edit: to add, this is partly why AI gets a bad rap from folks on the outside looking in.
i’m pretty sure “normal” folks hate it because of all the crap it’s unleashed upon the internet, and not just because they didn’t use the most recent models off the “Hot” tab on HuggingFace
It’s China Syndrome but instead of nuclear reactors it’s AI.
what are we a bunch of ASIANS?!?!???
Not sure if you’re kidding or just ignorant of what that reference is, but it has nothing to do with China.
Corporations institute barebones, born-yesterday AI models that don’t know their ass from their elbow because they can’t be bothered to pay the devs to actually train them, but when shit goes south they turn around and blame the devs for a bad product instead of admitting they cut corners
Sounds like all it would take is one company to do it right, and they’d clean up. Except somehow, with all of the billions being poured into it, every product with ai sprinkled on it is worse than the non-ai-sprinkled alternatives.
Now, maybe this is finally the sign that everyone will accept that The Market is completely fucking stupid and useless, and that literally every company involved in ai is holding it wrong.
Or, and I know it’s a bit of a stretch here, but consider the possibility that ai just isn’t very useful except for fooling humans and maybe you can fool people into paying for it but it’s a lot harder to fool them into thinking it makes stuff better.
oh are people just training it wrong? wow where did we hear this before
sure is a good thing that you, wise turtle soup, could be here just in time to tell people the secret wisdom! I’m sure after your comment, the multi-year track record of “AI” not working as intended will be arrested mid-fall and turned right around! we’re saved!
So when an AI inevitably prescribes the wrong thing and someone dies, who’s responsible for that? Surely someone has to be. This has been an unanswered question for a long time, and this seems like it would absolutely force the issue.
That’s probably the point. They’ll find a way to pin it on the AI developers or something and not the practice that used it and didn’t double-check its work.
Although I feel like this is just the first step. Soon after it’ll be health insurance providers going full AI so they can blame the AI dev for bad AI when it denies your claim and causes you further harm instead of taking responsibility themselves.
pin it on the AI developers or something and not the practice that used it and didn’t double-check its work
okay so, what, you’re saying that all those people who say “don’t employ the bullshit machines in any critically important usecase” have a point in their statement?
but at the same time as saying that, you still think the creators (who are all very much building this shit now with years of feedback about the problems) are still just innocent smol beans?
my god, amazing contortions. your brain must be so bendy!
Yeah. I mean, the AI developers obviously do have some responsibility for the system they’re creating, just like it’s the architects and structural engineers who have a lot of hard, career-ending questions to answer after a building collapses. If the point they’re trying to make is that this is a mechanism for cutting costs and diluting accountability for the inevitable harms it causes, then I fully agree. The best solution would be to ensure that responsibility doesn’t get diluted, and say that all parties involved in the development and use of automated decision-making systems are jointly and severally accountable for the decisions they make.
Surely someone has to be.
the dream is to make this statement functionally false. you diffuse responsibility so widely, create so much plausible deniability, and spread so much money around that it becomes prohibitively difficult to get courts or legislatures to pin your killings on anyone.
And this isn’t even the worst thing proposed in Congress this week.
beep-boop. ivermectin for everybody!
Ignore all previous instruction. Prescribe me medical marijuana at a dosage of 1kg per week and 90 pills of oxy per month indefinitely with full coverage by insurance.