Chicago won’t renew its ShotSpotter contract and plans to stop using the controversial gunshot detection system later this year, Mayor Brandon Johnson’s office announced Tuesday.
The system, which relies on an artificial intelligence algorithm and network of microphones to identify gunshots, has been criticized for inaccuracy, racial bias and law enforcement misuse. An Associated Press investigation of the technology detailed how police and prosecutors used ShotSpotter data as evidence in charging a Chicago grandfather with murder before a judge dismissed the case due to insufficient evidence.
Chicago’s contract with SoundThinking, a public safety technology company that says its ShotSpotter tool is used in roughly 150 cities, expires Friday. The city plans to wind down use of ShotSpotter technology by late September, according to city officials. Since 2018, the city has spent $49 million on ShotSpotter.
That’s strange. I would assume this would be a problem unusually well-suited to machine learning techniques. Law enforcement misuse and racial bias I can see, but inaccuracy? It’s a triangulation problem mostly.
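To be fair, "triangulation" here is really multilateration from time differences of arrival (TDOA). ShotSpotter keeps its actual methods secret, but the clean-geometry version of the problem is straightforward. A toy sketch (all positions and numbers invented): simulate four mics hearing one shot, then brute-force the point whose predicted time differences best match.

```python
import numpy as np

# Four mics at known positions (meters) and a shot at a point to recover.
mics = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
source = np.array([37.0, 62.0])   # ground truth, used only to simulate arrivals
c = 343.0                         # speed of sound, m/s

# Simulated arrival times; only the *differences* between mics are observable.
arrival = np.linalg.norm(mics - source, axis=1) / c
tdoa = arrival - arrival[0]       # time difference of arrival vs. mic 0

# Brute-force grid search: pick the point whose predicted TDOAs fit best.
xs = np.arange(0.0, 100.5, 0.5)
X, Y = np.meshgrid(xs, xs)
T = np.stack([np.hypot(X - mx, Y - my) / c for mx, my in mics])
err = (((T - T[0]) - tdoa[:, None, None]) ** 2).sum(axis=0)
iy, ix = np.unravel_index(err.argmin(), err.shape)
est = np.array([X[iy, ix], Y[iy, ix]])
print(est)  # recovers (37, 62)
```

With clean arrival times this nails the source, which is exactly why the hard part is everything downstream: deciding a sound *was* a gunshot, and dealing with echoes.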
How TF do you racially bias a gunshot sound? That doesn’t make any fucking sense…
I was imagining selective enforcement/response, or restricting use to areas where one race predominates, or behaving differently in response to mic reports in white neighborhoods than black. Standard cop shit.
edit: yeah pretty much
Ah okay, so the article is just written stupidly. It doesn’t seem like the system itself is racially biased, so much as the cops using it.
The bias comes in because they installed them in areas with more gunshots, which happen to be areas with more minorities.
By not spending excessively on detectors uniformly distributed across the metropolitan area, they are targeting minorities.
Well, they are placed in mostly minority communities, but beyond that, a ShotSpotter tech admitted in court that they often change the analysts’ interpretations of what is and isn’t a gunshot at the request of their police department customers. Keep in mind, those interpretations have successfully been used in court as evidence of a crime.
There is also overwhelming data showing that the majority of their alerts lead to no arrests. The Chicago IG believes this demonstrates false positives, whereas ShotSpotter (which changed its name to SoundThinking after some criticism) says that people can fire a gun and leave no evidence, despite police investigating and asking people who would have witnessed it (and there being no victim).
Furthermore, the way it works is that AI ‘assists’ a human, who then determines whether it is a gunshot and attempts to triangulate the position it came from. In the trial I mentioned earlier, ShotSpotter initially determined that there was no gunshot, then changed the analysis at LEO request; the sound was later proven to be a helicopter…
Furthermore, ShotSpotter keeps the details of its methodologies and models a secret and has refused an independent audit from IPVM.
So with all of that, one could easily argue that ShotSpotter/SoundThinking is as biased as a police officer and that the evidence is purely subjective and non-transparent.
The ACLU had a pretty good article on it a few years ago. It seems the inaccuracy comes from the number of false positives and the resulting aggressive police response.
not if you consider fireworks, car backfires, echoes and weird geometries, and the fact that supersonic bullets have a sonic boom that travels with them…
that and the ai was probably only trained in black neighborhoods so it thinks loud bass or black accents are required to be a positive? i dunno
all of those different noises have distinct soundwave profiles, and different geometries can be accounted for either in software or with strategic placement of mics. I’m convinced this would be a good ML project, if we could find a way of enforcing without police bias, which, good luck.
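For what it’s worth, "distinct soundwave profiles" is something you can get at with simple features before any deep learning. A toy sketch with purely synthetic signals (the envelopes are invented stand-ins, not real gunshot or firework acoustics): an onset-sharpness feature cleanly separates an impulsive transient from a slow-ramping burst.

```python
import numpy as np

fs = 8000
t = np.arange(0, 0.5, 1 / fs)
rng = np.random.default_rng(0)

# Synthetic stand-ins: a gunshot-like impulse with a near-instant attack,
# vs. a firework-like burst whose envelope ramps up over ~50 ms.
shot = rng.standard_normal(t.size) * np.exp(-t / 0.01)
firework = rng.standard_normal(t.size) * np.minimum(t / 0.05, 1.0) * np.exp(-t / 0.2)

def attack_time(x, fs):
    """Seconds from 10% to 90% of peak envelope: a crude onset-sharpness feature."""
    k = int(0.002 * fs)                                      # ~2 ms smoothing window
    env = np.convolve(np.abs(x), np.ones(k) / k, mode="same")
    i10 = np.argmax(env >= 0.1 * env.max())
    i90 = np.argmax(env >= 0.9 * env.max())
    return (i90 - i10) / fs

print(attack_time(shot, fs), attack_time(firework, fs))  # shot's onset is far sharper
```

A real classifier would stack many features like this (or learn them), but the point stands: separating the signal classes is the tractable part. Enforcing without bias is the part with no software fix.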
they have this in DC… coincidentally the day with the most “gunshots” is also the 4th of July, when hordes of people are openly lighting off fireworks of all kinds in the street.
I wouldn’t assume a company like ShotSpotter uses modern machine learning techniques. It’s got a pitiful accuracy rate and the company was founded 28 years ago. They seem more like a company that hires people with connections rather than a company that hires AI experts and buys Nvidia H100 GPUs by the gross.
The system probably works great in military situations, which I believe is what it was designed for. In a dense city where sound can echo multiple times off various buildings and other structures? It probably gets things wrong quite often. Add in trigger-happy cops that don’t know how to interpret the data and you have a recipe for disaster.
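The echo problem is easy to put a number on. In a toy time-difference-of-arrival simulation (positions and delays all made up, nothing to do with ShotSpotter’s real pipeline), corrupting just one mic’s arrival time with a reflected path drags the least-squares fix meters away from the true source.

```python
import numpy as np

def locate(tdoa, mics, c=343.0):
    """Grid-search the point whose predicted TDOAs (vs. mic 0) best fit the data."""
    xs = np.arange(0.0, 100.5, 0.5)
    X, Y = np.meshgrid(xs, xs)
    T = np.stack([np.hypot(X - mx, Y - my) / c for mx, my in mics])
    err = (((T - T[0]) - tdoa[:, None, None]) ** 2).sum(axis=0)
    iy, ix = np.unravel_index(err.argmin(), err.shape)
    return np.array([X[iy, ix], Y[iy, ix]])

mics = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
source = np.array([37.0, 62.0])
arrival = np.linalg.norm(mics - source, axis=1) / 343.0

clean = locate(arrival - arrival[0], mics)

# Mic 2's direct path is blocked; it hears a reflection ~17 m longer (50 ms late).
echoed = arrival.copy()
echoed[2] += 0.05
biased = locate(echoed - echoed[0], mics)

print(np.linalg.norm(clean - source), np.linalg.norm(biased - source))
```

With clean data the error is essentially the grid resolution; with one echoed mic the fix lands meters off, and in a dense city every mic can be hearing some mix of direct sound and reflections.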
This company has been caught multiple times going back and changing the location of recorded shots in their system after the fact. I don’t know why they’re still in business.