New York City areas with more closed-circuit TV cameras also have higher rates of “stop-and-frisk” police stops, according to a new report from Amnesty International’s Decode Surveillance NYC project.
Beginning in April 2021, more than 7,000 volunteers surveyed the streets of New York City through Google Street View to document the locations of the cameras. Volunteers evaluated 45,000 intersections three times and identified more than 25,500 cameras; the report estimates that about 3,300 of these cameras are publicly owned and used by government and law enforcement. With the help of BetaNYC, a technology-focused civic organization, and contracted data scientists, the project used the data to create a map marking the coordinates of all 25,500 cameras.
Analysis of these data showed that in the Bronx, Brooklyn, and Queens, census tracts with higher concentrations of people of color had more publicly owned cameras.
To find out how the camera network correlates with police stops, Amnesty researchers and partner data scientists used street-address data from the NYPD to determine the frequency of stop-and-frisk incidents per 1,000 residents in each census tract (a geographic unit smaller than a zip code) in 2019. “Stop-and-frisk” policies allow officers to detain and search people on the basis of “reasonable suspicion.” NYPD data cited in the report show more than 5 million stop-and-frisk incidents in New York City since 2002, with most of the searches conducted on people of color. According to the New York Civil Liberties Union, most people subjected to these stops are innocent.
Each census tract was assigned a “surveillance level” based on the number of publicly owned cameras per 1,000 residents within 200 meters of its boundaries. Areas with a higher frequency of stop-and-frisk incidents also had higher surveillance levels. For example, one half-mile route in East Flatbush, Brooklyn, saw six such stops in 2019 and had 60 percent coverage by public cameras.
Experts fear that law enforcement will use face recognition technology on these camera feeds, disproportionately targeting people of color in the process. The New York Police Department used facial recognition, including the controversial Clearview AI system, in at least 22,000 cases between 2016 and 2019, according to documents obtained by the Surveillance Technology Oversight Project (STOP) through public records requests.
“Our analysis shows that the NYPD’s use of facial recognition technology helps reinforce discriminatory policing against minority communities in New York City,” said Matt Mahmoudi, a researcher at Amnesty International who worked on the report.
The report also details protesters’ potential exposure to facial recognition during last year’s Black Lives Matter protests, overlaying the surveillance map on march routes. According to Mahmoudi, the result was “almost complete surveillance coverage.” Although it is not clear how facial recognition technology was used during the protests, the NYPD has used it in at least one protest-related investigation.
On August 7, 2020, dozens of New York City police officers, some in riot gear, showed up at the door of 28-year-old Black Lives Matter activist Derrick Ingram. During one of the marches, Ingram was suspected of assaulting a police officer by shouting into the officer’s ear with a bullhorn. Police at the scene were seen examining a document titled “Facial Identification Section Informational Lead Report,” which contained a social media photo of Ingram. The NYPD confirmed that it used facial recognition to identify him.
Eric Adams, the city’s new mayor, is considering expanding the use of facial recognition technology, even though many U.S. cities have banned it over concerns about accuracy and bias.
Jameson Spivack, an associate at the Center on Privacy and Technology at Georgetown Law, says Amnesty’s project “gives us an idea of how widespread this surveillance is, especially in majority non-white neighborhoods, and how many public places are captured on camera footage that police can run facial recognition on.”