DALLAS OBSERVER | June 12–18, 2025 | dallasobserver.com

"the offenses that we could use it for reside within the CAPERS division," Major Brian Lamberson with DPD's criminal intelligence unit said. "[Requests] were kind of slow coming in … but over time I think we're starting to see them pick up a lot more."

The Public Safety Committee approved the adoption of the facial recognition technology last May, with then-police Chief Eddie Garcia assuring the council that the system would be a "game changer" for detectives. While other North Texas cities such as Arlington and Fort Worth beat Dallas to the facial recognition punch, officials said the delay offered DPD the opportunity to build a privacy safety net into the program.

The software was developed by a company called Clearview AI and is used by the department only for violent offenses or in cases of imminent public safety threats. A detective has to request that an image be run through the software; if the request is approved, an FBI-trained analyst is responsible for running the search, and a second analyst is charged with combing through the results.

According to the Dallas Police Department, four requests for facial recognition have been denied, either because the case involved an offense that did not meet the severity threshold of the program's intended use or because a supervisor had not approved the request.

"I have always had a lot of concerns about privacy, whether it is data or other things. This feels very comfortable for me. This feels like efficiency and just the next step," Council member Cara Mendelsohn said last spring when the program was approved.

Since that initial approval, though, new concerns have emerged surrounding Clearview AI's ethics and privacy protections.
Misidentifications and Political Targeting

Last fall, the Netherlands fined Clearview AI $33.7 million, accusing the company of building an "illegal database" that uses images from the internet and social media to create a collection of faces that can be matched against images submitted by law enforcement. The Netherlands' Data Protection Agency warned that it is illegal for Dutch companies to use the service.

This decision, which Clearview AI officials described to the Associated Press as "unlawful, devoid of due process and is unenforceable," came just weeks after the company settled a lawsuit in an Illinois court that consolidated complaints from across the U.S. The settlement was estimated to cost as much as $50 million and was made in response to complaints that the social media scraping on which the facial recognition software relies amounts to a privacy violation.

"The use of dragnet surveillance is not consensual. They're not anything anyone's opting into," Will Owen, communications director of the Surveillance Technology Oversight Project, told the Observer. "Companies like Clearview AI are just taking our images and building their databases."

Dragnet surveillance refers to the practice of surveilling through broad, widespread data collection rather than focusing on a specific suspect.

Clearview AI has become increasingly popular across police forces in recent years, Owen said. He finds that concerning from a basic surveillance perspective, but also in light of recent reporting that the founders of Clearview were aware their technology could be used to mark immigrants or political targets. Government records show that since 2020, U.S. Immigration and Customs Enforcement has paid millions in contracts to Clearview AI, and a recent Mother Jones article outlined the ways in which Clearview is helping the Trump administration carry out its crackdown on immigration.
That being said, the software also aided federal investigations following the Jan. 6 Capitol insurrection.

Even with all that aside, Owen worries that facial recognition isn't where it needs to be in terms of accuracy for policing. While the Dallas Police Department says image analysts receive training to avoid misidentification and bias, both are common issues on the technology's end. Facial recognition systems have routinely been shown to be less accurate when identifying Black or brown faces than when looking at white ones. Other nuances, such as whether or not an image is of a cisgender person, can increase the artificial intelligence's likelihood of making mistakes.

"Facial recognition, at large, has expanded greatly across the United States, and it is very unregulated and highly biased in the way it's deployed. Facial recognition in law enforcement drives overpolicing of immigrant communities," Owen said. "[Clearview AI's] founder is very explicit in how the technology can be used to drive the anti-immigrant policies of the Trump administration. So I fear that its use is only going to expand further in the current political climate."

Photo caption (Adobe Stock): The software scrapes the internet and social media platforms for images that are then compiled into a database that can be cross-referenced with images from law enforcement officials.