This is part 1 of an ongoing investigative series.

An algorithm, not a doctor, predicted a rapid recovery for Frances Walter, an 85-year-old Wisconsin woman with a shattered left shoulder and an allergy to pain medicine. In 16.6 days, it estimated, she would be ready to leave her nursing home.

On the 17th day, her Medicare Advantage insurer, Security Health Plan, followed the algorithm and cut off payment for her care, concluding she was ready to return to the apartment where she lived alone. Meanwhile, medical notes in June 2019 showed Walter’s pain was maxing out the scales and that she could not dress herself, go to the bathroom, or even push a walker without help.

11 points

AI is being used as a means of diverting blame from humans onto a black box. It’s not inherently bad in itself, but the current hype around it is allowing it to be used in ways it shouldn’t be.


Excellent Reads

!longreads@sh.itjust.works


Are you tired of clickbait and the current state of journalism? This community is meant to remind you that excellent journalism still happens. It does not stick to a specific topic; the focus is on high-quality articles and discussion around them.

Politics is allowed, but should not be the main focus of the community.

Submissions should be articles of medium length or longer, meaning they take five minutes or more to read. Article series would also qualify.

Please either submit an archive link or include one in your summary.

Rules:

  1. Common Sense. Civility, etc.
  2. Server rules.
