
The NYPD's newest technology may be recording conversations

NYPD police surveillance. Andrew Burton/Getty Images

The NYPD's latest anti-crime program may be more invasive than it seems. Called ShotSpotter, the technology is pitched as a way to help police reduce gun violence by monitoring public streets for loud noises. The problem: ShotSpotter may also be able to record conversations, Fusion reports.


The company has adamantly maintained that ShotSpotter is not a voice-surveillance device, but others aren't so sure.

"There is clear evidence that ShotSpotter can record conversations,” Electronic Frontier Foundation activist Nadia Kayyali told Business Insider. 

While ShotSpotter admits that such cases happen, it emphasizes that they are exceptional. The system only collects audio from a narrow window around the blast, according to CEO Ralph A. Clark.

"The system basically truncates the noise; two seconds before, maybe three seconds after," Clark explained to Business Insider. He went on to emphasize that ShotSpotter's technology does not live-stream.


"The technology is not capable of doing any online real-time streaming," he said.

Privacy concerns over ShotSpotter, however, may still start to bubble up in New York, which is piloting a $1.5 million project to test the on-the-street equipment.

“This gunshot detection system is going to do a world of good in terms of going after the bad guys,” New York City Mayor Bill de Blasio said when announcing the initiative. Beginning this week, the NYPD will install 300 ShotSpotter microphones in both Brooklyn and the Bronx.

ShotSpotter has been used by public authorities for over a decade and is deployed in more than 90 US cities, including Oakland, Newark, Miami, and Worcester. It works by installing sensors that activate when loud noises are registered. If ShotSpotter hears what it thinks is a gunshot (a loud noise registered by three nearby sensors and then confirmed by an internal team), it sends the location to the police and logs it in its database. The problem, as Fusion reports, is that ShotSpotter's program may infringe on some citizens' Fourth Amendment rights.
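As a rough illustration of that detection flow, the sketch below requires an impulse to be registered by at least three sensors before estimating a location and flagging it for review. The loudness threshold, function names, and the simple centroid estimate are assumptions made for illustration; real acoustic systems locate shots with time-difference-of-arrival techniques, and ShotSpotter's actual pipeline is not public.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

# Hypothetical sketch of the reported flow: a loud impulse must be registered
# by at least three nearby sensors before it is treated as a possible gunshot
# and queued for human confirmation, then sent to police with a location.

LOUDNESS_THRESHOLD_DB = 120   # assumed trigger level for "impulsive" sounds
MIN_SENSORS = 3               # per the article, three sensors must agree


@dataclass
class SensorReading:
    sensor_id: str
    latitude: float
    longitude: float
    loudness_db: float


def possible_gunshot(readings: List[SensorReading]) -> Optional[Tuple[float, float]]:
    """Return an estimated (lat, lon) only if enough sensors heard the impulse."""
    triggered = [r for r in readings if r.loudness_db >= LOUDNESS_THRESHOLD_DB]
    if len(triggered) < MIN_SENSORS:
        return None  # not enough corroboration; the noise is ignored
    # Centroid of the triggered sensors stands in for a real multilateration step.
    lat = sum(r.latitude for r in triggered) / len(triggered)
    lon = sum(r.longitude for r in triggered) / len(triggered)
    return (lat, lon)  # would then go to a human reviewer, then to police
```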


The cases in which ShotSpotter data has been used in court present a unique privacy conundrum. Thirty-seven-year-old Oakland resident Tyrone Lyles's last words were, “Why you done me like that R? R, why you do me like that dude?” Those last moments were recorded and sent to the authorities via ShotSpotter. The recordings were used as evidence to convict Arliton Johnson, Lyles's shooter, of first-degree murder.

Similarly, in New Bedford, Massachusetts, a man named Michael Pina yelled “No, Jason! No, Jason!” in 2012 after being shot. Nearby sensors heard his cries, recorded them, and spurred authorities into action. Police arrested two men, who were later convicted in connection with Pina’s murder.

ShotSpotter detects the sound of gunfire and sends the location to local authorities. Shotspotter.com

While the merits of determining the location of gunfire seem obvious, ShotSpotter has led to some unintended consequences. In both of the aforementioned cases, ShotSpotter picked up the dying words of these men, and those recordings were subsequently used as evidence in a court of law. This indicates that the company’s technology, while intended merely to record loud blasts, has the potential to be a privacy nightmare. Oakland, which has had a contract with the company since 2006, has been debating its use for years now. The Northern California city alone has used the sensors’ recordings of human voices as evidence twice.

ShotSpotter claims that its sensors are “specifically designed to be triggered by loud explosive or ‘impulsive’ sounds only” and that recordings of voices are purely incidental. “The entire system is internally designed not to allow ‘live listening’ of any sort,” the company writes in its privacy policy. And when the devices do record audio, the clips last only a few seconds. This does not constitute surveillance, in its eyes.


In his conversation with Business Insider, ShotSpotter CEO Clark elaborated on the specific cases in which his company's audio has been used. "Conversations were not recorded," he said. "[There were] people shouting just before or just after a felony." And these cases, he emphasized, have happened fewer than five times out of the millions of impulses that ShotSpotter's sensors have picked up over the past 20 years.

At the same time, the duration of the sensors’ recordings was deemed long enough to help identify suspects. This could create a Fourth Amendment problem. Is it enough that ShotSpotter’s intention isn’t to record voices when it has been used to do just that?

“Of course there are audio sensors that are constantly recording,” Oakland Police Captain Ersie M. Joyner III said to the Oakland Public Safety Committee last year. “It takes a loud bang to activate the sensors to start recording.”

ShotSpotter picks up voices as well as gunshot noises, which could violate Fourth Amendment rights. Shotspotter.com

ShotSpotter’s critics see a gap between what the company says the technology does and what it has actually done. Because there is evidence that it can record conversations, communities installing the program need to know precisely what it does, according to Kayyali.


“The representations [to communities] need to be accurate,” she added. 

Another criticism of ShotSpotter is that it may not even be effective. New York City Public Advocate Letitia James cites the National Institute of Justice, which claims ShotSpotter “accurately detected 80% of test shots.” That statistic, however, dates back to 1999. Newer data from Newark suggest it is less accurate. The New Jersey city has been installing sensors since 2007, and false positives have reportedly been rampant. Between 2010 and 2013, “75% of the gunshot alerts have been false alarms,” WNYC reports.

Regarding the false-positive statistics, Clark said that his company has spent the last five years improving its technology to send more accurate reports to authorities.

"We're amazingly good at what we do," he said to Business Insider. "We're not perfect but we strive for perfection."


New York, for its part, is trying to be transparent with how it uses ShotSpotter. Public Advocate James introduced legislation last week that would require the NYPD to provide detailed reports about data collected throughout the year.

“The information that can be gathered from ShotSpotter is critical in helping police respond faster to incidents involving gunshots,” said James in her office’s press release. “We must also ensure that this technology is used transparently.” 

Business Insider reached out to James’s office about the program's privacy concerns. "ShotSpotter has proven to detect gunfire at impressive rates and I support any measure that helps reduce violence in our communities," she said in an emailed statement. "My office has been in contact with ShotSpotter, and they have assured us they are working to address the issues raised regarding privacy concerns."

Still, regulatory oversight remains woefully behind the times as shiny new devices roll out. Kayyali put it plainly: “the law has not caught up.”


Clark, though, said he's happy that the conversation about privacy is happening. He maintains that ShotSpotter is anything but a surveillance device but acknowledges that new technologies come with certain trade-offs.

"People should be asking [these questions]," he said. 
