43 points

Garbage headline. This isn’t “AI” doing this, it’s hiring managers and companies. It’s policy. If I put all my applicants into a Microsoft Excel spreadsheet, use the sorting function to sort by race, and then only hire the ones of a particular skin tone, is Excel keeping millions of qualified candidates out of the workforce? No, of course not. Neither is AI. Replace “AI” with “company policy” in every one of these articles and you get at what’s actually occurring.
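You can write the spreadsheet version down in a few lines (toy data, made-up names) and the point becomes obvious: the discriminatory step is a line of policy, not anything in the tool:

```python
# A spreadsheet "sort and filter by race" sketched in Python. The tool has
# no intent; the selection rule is entirely the operator's policy.
# All names and fields here are invented for illustration.

applicants = [
    {"name": "A", "race": "X", "qualified": True},
    {"name": "B", "race": "Y", "qualified": True},
    {"name": "C", "race": "X", "qualified": False},
]

# The discriminatory step is this line of *policy*, not the filter function:
hired = [a for a in applicants if a["race"] == "X"]

print([a["name"] for a in hired])  # ['A', 'C'] -- qualification never consulted
```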

Same reason we don’t need to “regulate AI”. We need to regulate its deployment, just like we regulated whatever technology we used for this previously. In other words, we don’t need new rules, we just need enforcement of existing ones. You can’t have a hiring process that’s discriminatory. What tool you use to arrive at that end doesn’t matter.

15 points

I completely disagree. It absolutely is AI doing this. The point the article is trying to make is that the data used to train the AI is full of exclusionary hiring practices. AI learns this and carries it forward.

Using your metaphor, it would be like training AI on hundreds of excel spreadsheets that were sorted by race. The AI learns this and starts doing it too.
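That mechanism is easy to sketch (toy, made-up data): a screener that only learns hire rates from historical decisions will reproduce whatever bias those decisions contained, without anyone programming the bias in:

```python
# Toy sketch: a "model" that learns per-group hire rates from historical
# decisions. If the history was biased, the learned preference is too --
# no one has to write the discrimination explicitly.
# Groups and decisions are invented for illustration.

history = [
    ("group_a", "hired"), ("group_a", "hired"), ("group_a", "rejected"),
    ("group_b", "rejected"), ("group_b", "rejected"), ("group_b", "hired"),
]

hire_rate = {}
for group in ("group_a", "group_b"):
    decisions = [d for g, d in history if g == group]
    hire_rate[group] = decisions.count("hired") / len(decisions)

# The "model" now prefers group_a purely because the past did:
print({g: round(r, 2) for g, r in hire_rate.items()})
# {'group_a': 0.67, 'group_b': 0.33}
```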

This touches on one of the huge ethical questions with regulating AI. If you are discriminated against in a job hunt by an AI, whose fault is that? The AI is just doing what it’s taught. The company is just doing what the AI said. The AI developers are just giving it previous hiring data. If the previous hiring data is racist or sexist or whatever, you can’t retroactively correct that. This is exactly why we need to regulate AI, not just its deployment.

7 points

“This touches on one of the huge ethical questions with regulating AI. If you are discriminated against in a job hunt by an AI, whose fault is that?” It is the fault of the company’s hiring practices, which are to blindly trust an AI without testing whether or not it is discriminatory. It is also the fault of the producer of the AI software (or service) sold to the company for screening candidates. No new laws are needed to hold either of them accountable; existing laws cover the ground well. That company selling AI screening services could have been called “crystal ball hiring” before AI and would be equally liable if it discriminated in its hiring suggestions. The tool isn’t the thing that needs regulating; the actions people and companies take based on the tool are. And those are already well regulated.

Make an AI in the privacy of your own home that does ____ literally anything? Fine. Collaborate on an OSS AI to do whatever with some of your friends? Also fine. Sell that AI as an employment screener app? Better make sure you’ve tested it for discriminatory outcomes. Use that AI to screen employees? Same deal.
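For what such testing can look like: the US EEOC’s “four-fifths rule” is the standard first-pass disparate-impact check — flag any group whose selection rate falls below 80% of the highest group’s. A minimal sketch (function and variable names are mine, not from any real auditing tool):

```python
# Minimal disparate-impact audit based on the EEOC four-fifths rule:
# a group is flagged when its selection rate is under 80% of the best
# group's rate. Rates here are invented for illustration.

def four_fifths_check(selection_rates):
    """Return groups whose selection rate is below 80% of the highest rate."""
    best = max(selection_rates.values())
    return [g for g, r in selection_rates.items() if r < 0.8 * best]

rates = {"group_a": 0.50, "group_b": 0.30}   # hired / applied, per group
print(four_fifths_check(rates))  # ['group_b'] -- 0.30 < 0.8 * 0.50
```

A screening vendor or employer that never runs even this kind of check is exactly the “blindly trust the AI” scenario described above.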

2 points

The “AI” is simply a program used to filter data. Whose fault is it if using it causes problems? The nitwits who choose to use these programs and trust the results without understanding the limitations.

2 points

The “guns don’t kill people, people kill people” argument, eh?

4 points

AI is the new scapegoat, taking over from immigrants, for who to blame for jobs being lost (even though it’s always the capitalist owners who drive these decisions)

4 points

Oh, you mean those immigrants who won’t work but steal our jobs?

1 point

It’s a brave new world; we need brave new laws

14 points

This is the best summary I could come up with:


According to Manjari Raman - one of the researchers behind that study and Senior Program Director of the Managing the Future of Work Project at Harvard Business School - companies turn to these automated systems because they are sometimes flooded with applications.

The AI was trained on the company’s previous hiring track record, and since men dominate the tech industry, it decided that male candidates were preferable to female ones.

That same year, auditors of another screening tool found that the software ranked people with the name Jared and a history of playing lacrosse in high school more favourably than other applicants.

But one journalist was ranked as a poor match because the job asked for international experience, and the ATS screener decided they didn’t meet this requirement despite their previously having worked in five different countries.
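That failure mode is what you get from naive keyword matching. A hypothetical sketch (real ATS products are more complex, and the résumé text here is invented):

```python
# Sketch of the false negative described above: the résumé shows work in
# five countries, but the screener only looks for the literal phrase
# "international experience". Hypothetical logic, not any real ATS.

resume = (
    "Reporter with postings in Kenya, Brazil, Japan, Germany and India, "
    "covering cross-border trade and migration."
)

def meets_requirement(text, required_phrase="international experience"):
    """Naive check: does the literal phrase appear in the résumé?"""
    return required_phrase in text.lower()

print(meets_requirement(resume))  # False -- screened out despite qualifying
```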

European Union officials are working on groundbreaking rules to regulate AI that could become the de facto global standard because of the size of the 27-nation bloc and its market.

China is also drafting regulations requiring security assessments for any products using AI, while the UK’s competition watchdog has opened a review of the market.


I’m a bot and I’m open source!

8 points

How depressing. Makes me wonder if that’s part of the reason I’m struggling to switch careers: though I’m well qualified for my desired role, I don’t fit the stereotypical career history.

2 points

I have never once gotten a job from applying directly. It seems like applying directly just gets your application immediately deleted. I’ve always gotten my jobs from referrals or a recruiter reaching out.

1 point
Deleted by creator

World News

!worldnews@lemmy.ml
