ICE buying eye-scanning tech to deport and remove people from a foot away (9news.com)
The mobile software from BI2 Technologies can identify individuals from 10 to 15 inches away using a smartphone app, according to the Massachusetts-based company. It then connects with a second product that includes a database.
ICE posted a Wednesday announcement for a sole source purchase order to BI2 Technologies for licenses to both BI2's Inmate Recognition & Identification System and the Mobile Offender Recognition & Identification System for "enforcement and removal operations."
Steve Beaty, a computer science professor at Metropolitan State University of Denver, explained iris biometric capabilities.
"In general, it's quite accurate," Beaty said. "The iris is the part of your eye that everybody sees — the color has stripes in it and they are unique to an individual."
Beaty said recent technological advances have made iris scanning more accessible and affordable.
"The innovation now is that it's so much less expensive that it can be done on devices such as phones," Beaty said. "In the past it was kind of a big standalone machine that these sorts of things could be used on."
The system compares iris scans to existing databases of photos. Beaty says that can come from a criminal database or even from photos scraped from social media profiles.
"Facial recognition companies have scraped the internet for photos," Beaty said.
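Iris matching of the kind Beaty describes is typically done by encoding each iris as a fixed-length bit string and comparing codes by normalized Hamming distance (the fraction of bits that differ), accepting a match only below a threshold. The sketch below is a hypothetical illustration of that general technique, not BI2's actual software; the toy 8-bit codes and the threshold are assumptions for demonstration (real iris codes run to roughly 2,000 bits).

```python
def hamming_distance(code_a: list[int], code_b: list[int]) -> float:
    """Fraction of differing bits between two equal-length iris codes."""
    if len(code_a) != len(code_b):
        raise ValueError("iris codes must be the same length")
    differing = sum(a != b for a, b in zip(code_a, code_b))
    return differing / len(code_a)

def best_match(probe: list[int], database: dict, threshold: float = 0.32):
    """Return the identity whose stored code is closest to the probe,
    or None if no stored code falls within the accept threshold."""
    best_id, best_dist = None, threshold
    for identity, stored in database.items():
        dist = hamming_distance(probe, stored)
        if dist < best_dist:
            best_id, best_dist = identity, dist
    return best_id

# Toy 8-bit codes for illustration only.
db = {
    "person_a": [1, 0, 1, 1, 0, 0, 1, 0],
    "person_b": [0, 1, 0, 0, 1, 1, 0, 1],
}
probe = [1, 0, 1, 1, 0, 1, 1, 0]  # differs from person_a in 1 of 8 bits
print(best_match(probe, db))  # person_a
```

Note the tension this highlights: the threshold trades false matches against false non-matches, which is exactly where the misidentification risks raised later in the article come from.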
But in Colorado, law enforcement agencies couldn't use the technology the same way.
Democratic state Rep. Jennifer Bacon co-sponsored a 2022 Colorado law requiring police agencies to disclose their facial recognition plans and prohibiting its use as the sole basis for arrests or investigations.
"The way that we saw facial recognition working was with one to identify and match, versus profiling," Bacon said. "That's two different things."
She expressed concerns about ICE's intended use.
"The notion that ICE is going to use it to do some of those things actually scares me a little bit because that's what we were, in fact, trying to get ahead of," Bacon said.
Bacon outlined specific worries about potential civil rights violations.
"We had a lot of conversations about how law enforcement cannot use it to profile, how law enforcement cannot use it to circumvent due process, how law enforcement cannot use it to circumvent First Amendment rights," she said.
She emphasized the need for safeguards given the high stakes involved.
"When you get it wrong, people's due process are violated," Bacon said. "We're talking about jail time, we're talking about how much one earns. We're talking about if someone can rent an apartment, and so we want to be sure that we can protect our communities from bad decisions."
She also questioned the underlying assumptions and biases built into artificial intelligence systems.
"How does one determine what an illegal immigrant looks like or is?" Bacon said. "In America we believe in innocence before proven guilty, and so the tools that we have need to also act upon those values as well."
Federal regulation of facial recognition technology, though, differs significantly from state oversight.
"That's why the states are worried about it," Beaty said.
Beaty noted that this particular software has been used by sheriff's departments elsewhere in the country, primarily as a way to help run jails.
But he raised questions about data handling and privacy protections.
"Let's say my iris is taken and I haven't committed a felony, which I have not," he said. "Where does the data go? Does it stay on the phone? And how long will it be on the phone?"
He highlighted a key concern with biometric data collection.
"Another concern about all biometrics is it's something we cannot change," Beaty said. "Our fingerprints, our faces in general, certainly irises, retinas, we can't change. If it is misreported, then we have a huge problem," he said.