Up for discussion in this week’s newsletter: Years after shuttering a similarly controversial scheme, the LAPD wants to use tech to profile potential felons
In 2011, the Los Angeles police department rolled out a novel approach to policing called Operation Laser. Laser – which stood for Los Angeles Strategic Extraction and Restoration – was the first predictive policing programme of its kind in the US, allowing the LAPD to use historical data to predict with laser precision (hence the name) where future crimes might be committed and who might commit them.
But it was anything but precise. The programme used historical crime data – arrests, calls for service, field interview cards (which police filled out with identifying information every time they stopped someone, regardless of the reason) and more – to map out “problem areas” for officers to concentrate on and to assign criminal risk scores to individuals. Information collected during these policing efforts was fed back into computer software that further automated the department’s crime-prediction efforts. The picture of crime the software presented, activist groups like the Stop LAPD Spying Coalition argue, simply validated existing policing patterns and decisions, inherently criminalising locations and people based on a contested hypothesis: that where crimes have occurred once, they will occur again. The data the LAPD used to predict the future was rife with bias, experts argue, leading to the over-policing and disproportionate targeting of Black and brown communities – often the same ones the department had been targeting for years.
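To make that dynamic concrete, here is a minimal Python sketch of hotspot-style scoring. It is purely illustrative – the records, weights and scoring rule below are invented, not Operation Laser’s actual methodology – but it shows how ranking areas by past police records largely ranks them by where police have already been looking:

```python
# Toy illustration of hotspot-style scoring, NOT Operation Laser's actual
# methodology: the records and weights below are entirely hypothetical.
from collections import Counter

# Hypothetical historical records, each tagged with the area it came from.
arrests = ["area_a", "area_a", "area_b"]
calls_for_service = ["area_a", "area_c"]
field_interview_cards = ["area_a", "area_a", "area_a", "area_b"]  # stops, for any reason

def hotspot_scores(arrests, calls, fi_cards):
    """Score each area by a weighted count of its historical records."""
    scores = Counter()
    for area in arrests:
        scores[area] += 3.0   # hypothetical weight per arrest
    for area in calls:
        scores[area] += 1.0   # hypothetical weight per call
    for area in fi_cards:
        scores[area] += 0.5   # every stop raises the score, justified or not
    return scores

# The areas stopped most in the past rank highest, so they draw the most
# future attention -- regardless of the underlying crime rate.
print(hotspot_scores(arrests, calls_for_service, field_interview_cards).most_common())
```

Note that the field interview cards – generated whenever officers chose to stop someone – feed directly into the score, so the ranking partly measures past policing activity rather than crime.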
About five years into the programme, the LAPD focused on an intersection in a south LA neighbourhood that the late rapper Nipsey Hussle was known to frequent, according to documents my colleague Sam Levin and I reviewed and first reported on in November. It was the intersection where he grew up, and where he later opened a flagship clothing store as an ode to his neighbourhood and a means to move the community forward economically. There, in search of a robbery suspect described only as a Black man between 16 and 18 years old, the LAPD stopped 161 people in the space of two weeks. Nipsey Hussle had complained of constant police harassment before then, too, saying as early as 2013 that LAPD officers “come hop out, ask you questions, take your name, your address, your cell phone number, your social, when you ain’t done nothing. Just so they know everybody in the hood.” In an interview with Sam Levin, Nipsey’s brother Samiel Asghedom said nobody could go to the store without being stopped by police. The brothers, co-owners of The Marathon Clothing store, even considered relocating it to avoid the harassment.
Ultimately, the LAPD was forced to shutter the programme in 2019, conceding that the data did not paint a complete picture.
Fast-forward nearly 10 years: the LAPD is working with a company called Voyager Analytics on a trial basis. Documents the Guardian reviewed and wrote about in November show that Voyager claimed it could use AI to analyse social media profiles and detect emerging threats based on a person’s friends, groups, posts and more. It was essentially Operation Laser for the digital world. Instead of focusing on physical places or people, Voyager looked at the digital worlds of people of interest to determine whether they were involved in crime rings or planning to commit future crimes, based on whom they interacted with, what they posted, and even their friends of friends. “It’s a ‘guilt by association’ system,” said Meredith Broussard, a New York University data journalism professor.
Voyager claims all of this information on individuals, groups and pages allows its software to conduct real-time “sentiment analysis” and find new leads when investigating “ideological solidarity”. “We don’t just connect existing dots,” a Voyager promotional document read. “We create new dots. What seem like random and inconsequential interactions, behaviours or interests, suddenly become clear and comprehensible.”
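To see what a “guilt by association” system can mean mechanically, here is a minimal sketch. Voyager’s actual models are proprietary and undisclosed; the graph, the seed flag and the decay factor below are all invented for illustration:

```python
# Hypothetical "guilt by association" scoring over a social graph.
# Purely illustrative -- Voyager's real system is proprietary; the graph,
# seed score and decay factor are invented for this example.

friends = {
    "alice": ["bob"],
    "bob": ["alice", "carol"],
    "carol": ["bob", "dave"],
    "dave": ["carol"],
}
seed_risk = {"alice": 1.0}  # someone investigators have already flagged

def propagate(friends, seed_risk, hops=2, decay=0.5):
    """Spread a risk score outward to friends and friends of friends."""
    risk = dict(seed_risk)
    frontier = dict(seed_risk)
    for _ in range(hops):
        nxt = {}
        for person, score in frontier.items():
            for friend in friends.get(person, []):
                inherited = score * decay
                if inherited > risk.get(friend, 0.0):
                    nxt[friend] = risk[friend] = inherited
        frontier = nxt
    return risk

# Carol is flagged purely for knowing Bob, who knows Alice -- she has
# done nothing herself.
print(propagate(friends, seed_risk))  # {'alice': 1.0, 'bob': 0.5, 'carol': 0.25}
```

The point of the toy example is that the score measures proximity in a social graph, not conduct – which is exactly the “guilt by association” objection.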
But systems like Voyager’s and Operation Laser are only as good as the data they’re built on – and biased data produces biased results.
In a case study showing how Voyager’s software could be used to detect people who “most fully identify with a stance or any given topic”, the company looked at how it would have analysed the social media presence of Adam Alsahli, who was killed last year while attempting to attack the Corpus Christi naval base in Texas. Voyager said its software deemed that Alsahli’s profile showed a high proclivity toward fundamentalism. The evidence the company pointed to included the fact that 29 of Alsahli’s 31 Facebook posts were pictures with Islamic themes, and that one of his Instagram handles, which was redacted in the documents, reflected “his pride in and identification with his Arab heritage”. The company also pointed out that, of the accounts he followed on Instagram, “most are in Arabic” and “generally appear” to be accounts posting religious content. On Twitter, Voyager wrote, Alsahli mostly tweeted about Islam.
Though the case study was redacted, many aspects of what Voyager viewed as signals of fundamentalism could also qualify as free speech or other protected activity. The case study, at least the parts that we could see, reads like the social media profiles of your average Muslim dad.
While the applications may seem different, what the two cases show is the ongoing desire among law enforcement to use technology to extend their policing, and the limitations – and in some cases the bias – deeply embedded in the data these systems rely on. Some activists say police employ systems purporting to use artificial intelligence and other advanced technologies to do what the technology really isn’t capable of doing: analysing human behaviour to predict future crime. In doing so, they often create a vicious feedback loop.
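Here is a minimal simulation of that feedback loop, under invented assumptions: two areas with identical true crime rates, police who can only record crime where they patrol, and patrols allocated according to past records:

```python
# Toy simulation of a predictive-policing feedback loop. All numbers are
# invented: both areas have IDENTICAL true crime rates by construction.
import random

random.seed(0)
true_crime_rate = {"area_a": 0.1, "area_b": 0.1}   # identical
recorded = {"area_a": 5, "area_b": 1}              # area_a starts over-policed

for year in range(10):
    total = sum(recorded.values())
    # Patrol hours are allocated in proportion to past records...
    patrols = {area: 100 * n / total for area, n in recorded.items()}
    # ...and crime can only be recorded where patrols are sent.
    for area, hours in patrols.items():
        recorded[area] += sum(
            random.random() < true_crime_rate[area] for _ in range(int(hours))
        )

print(recorded)  # area_a ends up with far more recorded crime
```

Although the two areas are identical by construction, the one that started with a few extra records ends up with far more recorded crime – which then appears to vindicate the original allocation.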
The main difference is that there’s now an entire sector of tech clamouring to answer law enforcement’s call for more advanced systems. It is not only companies that build overt surveillance or policing products: consumer tech companies that the average person interacts with daily, such as Amazon, are answering the call too. Amazon, for its part, worked with the LAPD specifically to give its officers access to its network of Ring cameras. For police, the motivation for such partnerships is clear: the technology lends credence to their policing decisions and can potentially make their jobs easier or more effective. For tech companies, the motivation is to tap into revenue streams with growth potential. A lucrative government contract with seemingly endless funding is a hard prospect to resist, especially as many other avenues for growth have started to dry up. It’s why internal employee opposition has not deterred companies like Google, which continues to pursue military contracts in spite of years of employee strife.
From the New York Times: “In 2018, thousands of Google employees signed a letter protesting the company’s involvement in Project Maven, a military program that uses artificial intelligence to interpret video images and could be used to refine the targeting of drone strikes. Google management caved and agreed to not renew the contract once it expired.
“The outcry led Google to create guidelines for the ethical use of artificial intelligence, which prohibit the use of its technology for weapons or surveillance, and hastened a shake-up of its cloud computing business. Now, as Google positions cloud computing as a key part of its future, the bid for the new Pentagon contract could test the boundaries of those AI principles, which have set it apart from other tech giants that routinely seek military and intelligence work.”
Where does a company like Google – which has expanded until its tentacles reach into virtually every industry – go to keep growing its business? Right now, the answer appears to be working with the government.
Readers, I’d love to hear how you feel about tech companies working with law enforcement to equip them with predictive policing or other surveillance technology.
If you want to read the complete version of the newsletter, please subscribe to receive TechScape in your inbox every Wednesday.