21 points

I can imagine that in the future there will be gridlock in front of the police station, with AV cars full of Black people, whenever the cops send out an APB with the description of a Black suspect.

We’ve seen plenty of racist AI programs in the past because the programmers, intentionally or not, baked their own bias into the training data.

9 points

Any dataset sourced from human activity (e.g. internet text, as with ChatGPT) will always contain current societal biases.

1 point

The AIs are not racist themselves; the bias is a side effect of the full technology stack. Cameras have lower dynamic range for darker tones, and images get encoded with a gamma that leaves less information in darker areas. So AIs that work fine on images of light-skinned faces don’t get the same amount of information from images of dark-skinned faces, leading to higher uncertainty and more false positives.

The bias starts with the cameras themselves. Security cameras in particular should have an even higher dynamic range than the human eye, but instead they’re often a cheap afterthought, and then good luck making sense of what they’ve recorded.
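The dynamic-range point can be made concrete with a small sketch (mine, not from the thread): a sensor responds roughly linearly to light, so an N-bit readout gives each successively darker stop (each halving of light) half as many distinct levels. The darkest parts of a scene end up with almost no tonal resolution.

```python
def levels_per_stop(bits=8, stops=8):
    """Count the distinct readout levels a linear N-bit sensor
    assigns to each photographic stop, brightest first.

    Stop 0 covers the top half of the code range, stop 1 the next
    quarter, and so on -- each darker stop gets half as many levels.
    """
    full = 2 ** bits
    result = []
    hi = full
    for stop in range(stops):
        lo = hi // 2
        result.append((stop, hi - lo))  # levels available in this stop
        hi = lo
    return result

for stop, levels in levels_per_stop():
    print(f"stop {stop}: {levels} levels")
# stop 0 gets 128 of the 256 levels; stop 7 gets just 1.
```

With only one or two code values left for the deepest shadows, a downstream recognition model simply has less signal to work with there, which is the asymmetry the comment describes.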
