A new array of digital tools allows law enforcement agencies to conduct surveillance “persistently, universally, at an unimaginable scale” and “with no special permission, no oversight and no advance planning.” Notable examples include:
Ring, the smart home security company owned by Amazon, is reintroducing features that allow police to request footage directly from Ring users and adding a new feature that would let police request live-stream access to people’s home security devices. This is a reversal of course for the company, which spent the last few years making privacy reforms that reduced police access to camera footage. Ring footage has already been used in ways that violate civil liberties, including the surveillance of protestors and warrantless searches and seizures. The Electronic Frontier Foundation warns that “it is easy to imagine that law enforcement officials will use their renewed access to Ring information to find people who have had abortions or track down people for immigration enforcement.” Compounding this risk, Ring is pushing aggressive internal AI adoption, even telling employees that they must show proof that they use AI in order to be promoted. Per EFF, the most likely reason for Ring’s about-face is the opportunity to cash in on rising authoritarianism, which relies on surveillance technology.
Police surveillance company Flock, which operates an enormous nationwide license plate tracking system, is now analyzing civilian driving patterns to flag people as “suspicious,” i.e., people whose movement patterns an algorithm has decided suggest criminality. The ACLU warns that this feature is a significant expansion of the company’s surveillance infrastructure: it uses Flock’s camera network not just to investigate on the basis of suspicion, but to generate suspicion itself. Because Flock is a private company, little is known about the algorithm in question, including the data it was trained on or the frequency and nature of its errors.
Palantir’s Gotham platform enables law enforcement and government analysts to connect disparate datasets and build intelligence profiles of individuals based on characteristics “as granular as a tattoo.” Government agencies can use the platform to map an individual’s social networks, track their movements, identify their physical characteristics, and review their criminal history. Palantir now holds contracts with the Department of Defense, the CDC, the IRS, and the NYPD. The company is no longer just a software vendor but a partner in how the federal government conducts investigations, prioritizes targets, and deploys algorithms. Because Gotham is proprietary, not even government officials can see how its algorithms assess information.
The rise of these tools raises the specter of what some scholars call “preemptive security,” i.e., the use of potential future risks to justify present action. In addition to violating numerous civil liberties and eroding due process, this approach signals a creeping authoritarianism that increases the risks of armed conflict, corruption, and economic volatility.
Questions to consider
How are companies that develop surveillance tools marketing predictive or preemptive security features? How are they assessing risks to human rights? Are they conducting end-user due diligence? What steps are they taking to prevent misuse of their products?