ShotSpotter Accuracy Debate Examined

By IPVM, Published Jun 25, 2021, 08:40am EDT

Is ShotSpotter, the widely used and widely criticized gunshot detection system, accurate or not? This has become a heated debate in Chicago, with Northwestern Law School finding that nearly 90% of alerts are not gun-related incidents while ShotSpotter claims 97% aggregated accuracy. Who is right?

IPVM Image

IPVM examines both sides, explains how ShotSpotter calculates its accuracy, and analyzes the limitations and possible flaws in each, along with how this can impact city usage of a gunshot detection system.

Update June 30th, 2021: ShotSpotter sent IPVM responses to this article that we have added.

Executive Summary

ShotSpotter claims 97% accuracy and guarantees 90% accuracy in a contract with Springfield MO, but ignores false alerts when counting accuracy for contracts. They also assume that all unverified alerts are true. The MacArthur Justice Center reports that 86% of ShotSpotter alerts in Chicago from 2019-2021 did not result in a case report, alleging that "ShotSpotter creates thousands of dead-end police deployments that find no evidence of actual gunfire". Despite this, Chicago continues to be ShotSpotter's biggest customer and Chicago Mayor Lori Lightfoot said "ShotSpotter Plays an Important Role" in public safety.

ShotSpotter Gunshot Detection Overview

ShotSpotter reports using 20 to 25 sensors per square mile for gunshot detection. After a sensor picks up noise, analytics along with human review are used to classify it as a gunshot or not.

US law enforcement accounts for 90%+ of ShotSpotter's 2020 revenue. The company reports having 120+ cities as customers. For more on ShotSpotter's business, see ShotSpotter Financials and Business Examined.

ShotSpotter Comments To IPVM On Accuracy

IPVM question to ShotSpotter: "Does ShotSpotter keep track of false alerts in any way or have any metrics for how common they are in typical installations?"

It’s important to point out that 88% of all gunshots across the nation are not reported to 9-1-1. ShotSpotter alerts help to build community trust, helps save lives, enhances officer safety, and increases evidence collection.

It’s also important to point out that, based upon customer feedback, only 0.5% of ShotSpotter alerts as aggregated across the country are false positive.

In the ShotSpotter system, an alert is deemed a “false positive” alert when ShotSpotter alerts law enforcement to what it determines is a gunshot, police are deployed to the location of the alert, and they confirm the sound was something else – seeing evidence of fireworks for example.

ShotSpotter’s best practices include steps to ensure that if police officers conclude the ShotSpotter alert was indeed a “false positive,” they report that to our team so we can keep track and take remedial steps to minimize false positives in the future.

IPVM question to ShotSpotter: "Does ShotSpotter work on bullets smaller than .25 caliber? Is there a minimum accuracy similar to the guaranteed 90% on larger bullets?"

ShotSpotter guarantees that 90% of unsuppressed, outdoor gunfire incidents, using standard, commercially available rounds greater than .25 caliber, inside the Coverage Area will be detected and located within 25 meters of the actual gunshot.

IPVM question to ShotSpotter: "Does ShotSpotter have anything to say in response to the MacArthur Justice Center report on their accuracy? I believe ShotSpotter contends a lack of a police report does not mean a gun was not fired, but is there anything else?"

We know from our customers that the ShotSpotter system exceeds our guarantee of at least 90% detection accuracy because customers report false positives or negatives directly to the company. This enables us to keep track of and honor our service level guarantee as well as continually improve the system. Over the last two years ShotSpotter has maintained a 97% accuracy rate including a 0.5% false positive rate aggregated across all customers.

The report draws erroneous conclusions from their interpretation of police report categorizations, falsely equating them to no shots fired and therefore false positives. ShotSpotter has been in operation for 25 years, serves more than 110 cities and has earned trust and high renewal rates from many police departments because the system is effective in helping save lives, reduce gun violence, and make communities safer. 911 call center data alone provides an incomplete and misleading picture of ShotSpotter’s accuracy and effectiveness.

Updates from ShotSpotter on Accuracy

ShotSpotter told IPVM that they rigorously train and test the individuals who review gunfire alerts, and do not deploy them unless they can detect gunfire with 90% accuracy or higher:

For your background, ShotSpotter rigorously trains and tests every individual reviewing gunfire alerts at the incident review center. The detection technicians are only deployed if they are able to detect gunfire with 90% accuracy or higher. Most of ShotSpotter’s employees outperform the 90% accuracy requirement.

ShotSpotter also confirmed they rely on customers to report false positives:

The issue of false positives comes up quite a bit. For your background, ShotSpotter relies on its customers to report false positives, and that number is so small that it is not a concern for most customers.

Contract Accuracy Claims

ShotSpotter guarantees 90% of outdoor gunfire with rounds greater than .25 caliber will be detected, from a public Springfield MO contract with ShotSpotter:

90% of unsuppressed, outdoor gunfire incidents, using standard, commercially available rounds greater than .25 caliber, inside the Coverage Area will be detected and located within 25 meters (82 feet) of the actual gunshot location.

This metric does not include false positives, such as fireworks or other noises that the system might misclassify as gunfire, which is a core complaint from the MacArthur Justice Center.

The contract's own equation makes the omission explicit: false positives appear nowhere in the accuracy calculation:

The ShotSpotter Respond System will provide data for correct detection and accurate location for ninety percent (90%) of detectable (outdoor, unsuppressed) community gunfire which occurs within a coverage area, the “Coverage Area”, provided the measurement is Statistically Significant, as defined below. This performance rate shall be calculated as a percentage as follows:

Performance Rate = NumberAccuratelyLocated / (NumberAccuratelyLocated + NumberMislocated + NumberNotDetected) × 100%

where the “Performance Rate” is a number expressed as a percentage, “NumberAccuratelyLocated” is the number of “Gunfire Incidents” occurring within the Coverage Area during the specified period for which the ShotSpotter produced an Accurate Location, NumberMislocated is the number of Verified Incidents (a “Verified Incident” is an incident where Customer has physical or other credible evidence that gunfire took place) for which the ShotSpotter produced an inaccurate location (i.e., a Mislocated Incident), and NumberNotDetected is the number of Verified Incidents for which the ShotSpotter failed to report a location at all (i.e., Missed Incidents).

The ShotSpotter performance rate only penalizes ShotSpotter if they fail to identify an actual gunshot (NumberNotDetected) or they detect it in the wrong location (NumberMislocated).
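The effect of this formula is easy to demonstrate. Below is a minimal Python sketch of the contract's calculation; the incident counts are hypothetical, chosen only to show that false positives cannot lower the performance rate:

```python
def performance_rate(accurately_located: int, mislocated: int,
                     not_detected: int) -> float:
    """Performance rate as defined in the Springfield MO contract.

    Every term counts verified gunfire incidents only. False positives
    (e.g. alerts on fireworks) appear in no term, so they cannot lower
    the rate.
    """
    total = accurately_located + mislocated + not_detected
    return 100.0 * accurately_located / total

# Hypothetical counts: 95 gunshots accurately located, 3 mislocated,
# 2 missed entirely -- plus, say, 500 false alerts on fireworks.
rate = performance_rate(accurately_located=95, mislocated=3, not_detected=2)
print(f"{rate:.0f}%")  # prints "95%" -- the 500 false alerts change nothing
```

Under this formula, a deployment that alerted on hundreds of fireworks displays would still report the same performance rate, because only verified gunfire incidents appear in any term.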

Only Counts As False If Noise Source Is Verified

ShotSpotter has a questionable method for counting false positives that only counts them if the police verify what the actual noise source was, as they explained to us:

In the ShotSpotter system, an alert is deemed a “false positive” alert when ShotSpotter alerts law enforcement to what it determines is a gunshot, police are deployed to the location of the alert, and they confirm the sound was something else – seeing evidence of fireworks for example.

The problem is that a significant, though undetermined, number of alerts are never verified as to the source of the noise. Excluding all of these unverified alerts from the false positive count is misleading and leads to overstated accuracy.
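A back-of-the-envelope calculation illustrates the sensitivity. All figures below are hypothetical, not ShotSpotter data; the point is only that a "verified-false-only" count is insensitive to what the unverified alerts actually were:

```python
def fp_rates(total_alerts: int, verified_false: int,
             unverified: int) -> tuple[float, float]:
    """Contrast a 'verified-false-only' count with a worst case.

    All figures are hypothetical, for illustration only. Counting an
    alert as false only when police verify the noise source implicitly
    treats every unverified alert as true gunfire.
    """
    reported = verified_false / total_alerts
    # If every unverified alert had actually been a false positive:
    worst_case = (verified_false + unverified) / total_alerts
    return reported, worst_case

reported, worst = fp_rates(total_alerts=10_000, verified_false=50,
                           unverified=6_000)
print(f"reported false positive rate: {reported:.1%}")  # 0.5%
print(f"if all unverified were false: {worst:.1%}")     # 60.5%
```

The true false positive rate lies somewhere between these extremes, but the reporting method alone cannot distinguish a 0.5% problem from a far larger one.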

"Thousands of Dead-End Police Deployments"

In Chicago, 86% of ~40,000 ShotSpotter alerts led to no crime report, according to the MacArthur Justice Center at Northwestern University’s School of Law. They calculated this based on data from the City of Chicago, ShotSpotter's biggest customer. The MacArthur Justice Center says the lack of case reports shows that "ShotSpotter creates thousands of dead-end police deployments that find no evidence of actual gunfire".

IPVM Image

ShotSpotter told IPVM that the report is misleading because it is equating a lack of a police report with a false positive:

The report draws erroneous conclusions from their interpretation of police report categorizations, falsely equating them to no shots fired and therefore false positives. [...] 911 call center data alone provides an incomplete and misleading picture of ShotSpotter’s accuracy and effectiveness.

While it is true that many alerts could be genuine even if a case report was not filed, the lack of a case report signals that the public received no benefit while still bearing the cost of police responding to these calls.

The Center emphasizes that this leads to discriminatory deployments:

The City of Chicago has deployed ShotSpotter in 12 police districts. Those districts are the ones with the highest proportion of Black and Latinx residents in the city. ShotSpotter burdens residents on the South and West sides with thousands of high-intensity deployments where police are hunting for supposed gunfire in vain.

ShotSpotter's Response on Deployment:

ShotSpotter says deployments are based on where gun-related crime rates are highest:

Using data from fatal and non-fatal shootings, city governments and police departments determine where gun-related crime rates are the highest. This information is used to identify the parameters where ShotSpotter sensors will be most helpful in driving down gun-related violence – making neighborhoods safer.

Update: Chicago Inspector General's Report

The City of Chicago Office of Inspector General (OIG) released a report on August 24, 2021 analyzing the Chicago Police Department’s (CPD) use of ShotSpotter between January 1, 2020 and May 31, 2021, and found that it rarely led to evidence of gun crimes or investigatory stops:

From quantitative analysis of ShotSpotter data and other records, OIG concludes that CPD responses to ShotSpotter alerts rarely produce evidence of a gun-related crime, rarely give rise to investigatory stops, and even less frequently lead to the recovery of gun crime-related evidence during an investigatory stop.

In response, ShotSpotter told IPVM the study did not analyze ShotSpotter’s accuracy and that CPD describes ShotSpotter “as an important part of their operations”:

It is important to point out that the Chicago Police Department continually describes ShotSpotter as an important part of their operations. The OIG report does not negatively reflect on ShotSpotter’s accuracy which has been independently audited at 97 percent based on feedback from more than 120 customers. Nor does the OIG propose that ShotSpotter alerts are not indicative of actual gunfire whether or not physical evidence is recovered. We would defer to the Chicago Police Department to respond to the value the department gets from being able to precisely respond to criminal incidents of gunfire. We work very closely with our agency customers to ensure they get maximum value out of our service.

Scientific Studies

The Center also alleges that ShotSpotter has no scientific studies backing its performance:

ShotSpotter has never done a scientifically valid study to determine whether its system can reliably tell the difference between the sound of gunfire and other loud noises like firecrackers, cars backfiring, construction noises, helicopters, and other harmless sounds.

IPVM has not found any such study supporting ShotSpotter.

"ShotSpotter Plays an Important Role" - Chicago Mayor Lori Lightfoot

However, Chicago's mayor has praised ShotSpotter and the city remains its largest customer, signaling that Chicago is satisfied with the level of accuracy it receives. Chicago Mayor Lori Lightfoot said "ShotSpotter plays an important role" in saving lives, according to AP News, and Chicago continues to pay ~$8 million a year for the service.

Accuracy Debate

While the public data does not enable a definitive estimate of false alerts, the false alerting problem is likely significantly greater than ShotSpotter suggests. ShotSpotter uses misleading assumptions and a misleading accuracy calculation for its advertised and guaranteed accuracy rates. This overstates accuracy and undercounts false positives, which can waste police resources and limit the usefulness of ShotSpotter's software. On the MacArthur Justice Center's side, there will certainly be some cases of actual gunfire that do not lead to a police report; the question is how many.

Value Even If Inaccurate

While large cities with significant gunfire like Chicago may justify inaccurate alerting, smaller cities with lower crime rates may not. Because gunshots are such an important problem in cities like Chicago, even if 5 out of 6 alerts are false, ShotSpotter may still be beneficial. But cities where gun violence is not as big an issue as in Chicago will likely see a higher percentage of false alerts, because there are fewer real gunshots, and will therefore see less benefit and greater operational costs as police officers respond to overwhelmingly false alerts.
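This base-rate effect can be sketched numerically. Assuming, purely for illustration, that false alerts occur at a roughly constant volume regardless of how much real gunfire a city has, the share of alerts that are false grows as real gunfire shrinks:

```python
def false_alert_share(true_gunshots: int, false_alerts: int) -> float:
    """Fraction of all alerts that are false, assuming (purely for
    illustration) that the volume of false alerts is independent of
    how much real gunfire a city has. All numbers are hypothetical."""
    return false_alerts / (true_gunshots + false_alerts)

# Same assumed false-alert volume, very different gunfire volumes:
big_city = false_alert_share(true_gunshots=10_000, false_alerts=2_000)
small_city = false_alert_share(true_gunshots=500, false_alerts=2_000)
print(f"high-gunfire city: {big_city:.0%} of alerts false")   # 17%
print(f"low-gunfire city: {small_city:.0%} of alerts false")  # 80%
```

With identical detection performance, the low-gunfire city spends far more of its police responses on false alerts simply because real gunshots are rarer.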

Update: ShotSpotter Says They are Beneficial to Small Cities

In response to the statement above that smaller cities "will likely have a higher percentage of false alerts because there are fewer real gunshots and will see less benefit and greater operational costs as police officers respond to overwhelmingly false alerts," ShotSpotter says smaller cities' data is more easily skewed because they have fewer gunfire incidents, and that a growing number of smaller cities are seeing the value of ShotSpotter:

This is misleading. It’s important to recognize that smaller cities have less incidents of gunfire and therefore the data is more easily impacted. ShotSpotter has seen a growing group of smaller cities seeing the value of ShotSpotter. ShotSpotter is helping them identify gunshots that they would otherwise not have known about which helps police to respond and save lives.

2 reports cite this report:

Flock Expands Into Gunshot And Audio Analytics on Oct 20, 2021
Flock Safety is expanding from its core market of LPR into "audio evidence...
MacArthur Justice Center vs ShotSpotter Commissioned Report on Aug 13, 2021
Amidst bad press, ShotSpotter commissioned Edgeworth Analytics to review...

Comments (12)

According to my friends in Chicago, the city is an absolute mess of gangs and corruption. Just going to put it out there that police may report things "differently" if they aren't able to deal with crimes and if somebody higher up wants the numbers low to keep their job. By the same token, there could be less than pure reasons the mayor likes ShotSpotter so much.

ShotSpotter has been in Chicago for nearly a decade now, Lightfoot has only been mayor since 2019.

I didn't see this mentioned in the article, but at the scale of alerts 14% leading to a case is 5,600 cases annually. It may also be a useful investigative tool for them, providing more reliable date and time information than an eyewitness account even when ShotSpotter isn't the sole initiator for a call.

I think most importantly: do ShotSpotter false alerts detract from an officer's ability to respond to other calls/issues? If not, then it's a question of what value the above two benefits provide, how quickly accuracy is improving, and whether it's worth the price. Difficult to say, though it certainly doesn't look great for ShotSpotter from where I stand.

IPVM needs someone dedicated to proofreading. The bar graph alleging 86% of reports are NOT filed says that reports ARE filed on the bar itself.

IPVM Image

And the other day, I saw an IPVM video ad on Facebook where a bottom title said "pixels per foor".

Thank you for bringing that to our attention. I have updated the image to make it clear that 86% of alerts did not result in a case report being filed by the police, according to the MacArthur Justice Center.

It’s important to point out that 88% of all gunshots across the nation are not reported to 9-1-1. ShotSpotter alerts help to build community trust, helps save lives, enhances officer safety, and increases evidence collection.

their very 1st sentence in their reply to IPVM is the foundation of how they sell their value proposition to Chiefs of Police in U.S. cities - and it was their own data they used to "validate" that claim (more on that below).

after years of sales stagnation, they started hiring former Chiefs of Police years ago to sell their wares to current Chiefs of Police - and it seems to have worked well as a strategy, because their growth took off.

probable reason: Chiefs of Police love quotable numbers that show how effective they are. and Shotspotter makes them up... er, provides them.

check out their 'results' page... it's an absolute assault of unverifiable numbers and percentages with little to no evidence presented to back up these claims.

they even claim the percentage of "Lives Saved' and "Reductions in Shootings" as if these numbers are verifiable facts that can be tied to the use of their products - which of course, they are not.

back to that '88% of all gunshots across the nation are not reported to 9-1-1' claim that I have always thought was complete bullshit.

Here is what they show as evidence to back up their claim: an actual Brookings Institute study from 2016 (I underlined the best part)

IPVM Image

the last statistic they list at the bottom of their results page is my favorite:

IPVM Image

this is textbook 'emperors new clothes' marketing flim-flam.

I wonder why there is a delta on the percentages... are 6% of "communities" on the take?

Update: We have updated this report to include feedback from ShotSpotter.

For your background, ShotSpotter rigorously trains and tests every individual reviewing gunfire alerts at the incident review center. The detection technicians are only deployed if they are able to detect gunfire with 90% accuracy or higher. Most of ShotSpotter’s employees outperform the 90% accuracy requirement.

what does the Shotspotter 'detection technicians' training certification process look like?

is there a 'lab' somewhere in NoCal they bring recruits to that has stockpiles of bricks and firecrackers to test serious-minded people seeking to tell the difference between two bricks clapped together and a gunshot?

this is flim-flam.

Interesting update around this - super inaccurate if ShotSpotter is actually altering evidence... Police Are Telling ShotSpotter to Alter Evidence From Gunshot-Detecting AI

wow.

from that story link:

But after the 11:46 p.m. alert came in, a ShotSpotter analyst manually overrode the algorithms and “reclassified” the sound as a gunshot. Then, months later and after “post-processing,” another ShotSpotter analyst changed the alert’s coordinates to a location on South Stony Island Drive near where Williams’ car was seen on camera.

so now we know what kind of content is taught in their detection technician training certification course - when to alter data when the cops think it's wrong - or doesn't fit the prosecutorial narrative. how nice.

Rather than defend ShotSpotter’s technology and its employees' actions in a Frye hearing, the prosecutors withdrew all ShotSpotter evidence against Williams.

the Emperor does not wish to have any kind of empirical scrutiny of the new clothes.

That sounds like a system I would want used in determining the fate of a fellow human...

Thankfully there is some log file documenting the changes! Hopefully ALL the changes, and not just the ones convenient for the corporate narrative.
