Police Claim Right to Use AI Facial Recognition Despite Restrictions

March 22, 2021 - 8 minutes read


As a whole, artificial intelligence (AI) applications can be incredibly controversial. AI systems have repeatedly been shown to produce racist, sexist, and otherwise biased results, and in the wrong hands, AI tools like facial recognition can become a matter of life or death. When the Capitol Building was attacked in January, members of the public worked alongside police departments all over the country to help the FBI identify rioters.

Although facial recognition technology has been shown to be inaccurate and racially biased, it has been used widely in both the public and private sectors in the past few months. The contentious technology has been banned for law enforcement use in several major metropolitan areas, but police departments say there are loopholes to get around these rules.

Finding a Way Through the Loopholes

In Pittsburgh, Alameda (California), Madison (Wisconsin), Boston, Northampton (Massachusetts), and Easthampton (Massachusetts), officials have publicly stated that law enforcement bans of facial recognition have loopholes. These loopholes allow police to use facial recognition technology to access information and take action on it.


Some experts say these loopholes aren’t bad: the technology helped the public and local police departments track down rioters for the FBI in recent weeks. But other experts say that loopholes allow law enforcement to continue its behavior without facing any consequences. According to Mohammad Tajsar, a senior staff attorney for the American Civil Liberties Union in Southern California, “If you create a carve-out for the cops, they will take it.”

In Pittsburgh, the loophole lies in a provision stating that police departments can use software produced or shared by other police departments. Specifically, the law “shall not affect activities related to databases, programs, and technology regulated, operated, maintained, and published by another government entity.” Madison, Boston, and Alameda have very similar language in their loopholes.


In Madison, police officers can use facial recognition technology that was supplied by a business, even if it’s banned from government usage. In Easthampton, police officers can use the technology as evidence if it was supplied by another police department, but not if it was supplied by a business. In Northampton, law enforcement can use the technology when provided by other police agencies and by businesses.

Ultimately, according to the director of the Technology for Liberty program at the ACLU of Massachusetts, Kade Crockford, a federal ban or restriction on facial recognition would be the most effective way to rein in the technology’s usage.

Tracking Usage of the Technology

When so many local and state laws allow facial recognition as long as it comes from another police agency, Crockford says, it becomes very difficult to track whether and when facial recognition was used during evidence gathering.


Quite often, however, police officers use the technology knowing that they’re not allowed to, against citizens who were never informed that facial recognition was involved. For example, in Miami, police arrested protestors using facial recognition, but even the protestors’ defense attorneys didn’t know that facial recognition, rather than other “investigative means”, was used to track down their clients. In another incident, Jacksonville police used facial recognition to arrest a citizen accused of selling $50 worth of cocaine, but this wasn’t disclosed in the police report.

Law enforcement purposefully hiding when facial recognition is used in an investigation isn’t the only problem: the technology was banned in the first place because it is deeply biased against people of color and women. So even when a police officer uses the technology as provided by a business, as in the cases of Home Depot, Rite Aid, and Walmart, it’s still highly possible that the technology isn’t working correctly. Jake Laperruque, a senior counsel at the Constitution Project, says, “If this is something that’s going to lead to a store calling the police on a person, that to me creates a lot of the same risks if you worry about facial recognition misidentifying someone by the police.”


Following Portland’s Lead

It seems that one of the only cities showing real forethought and concern for its citizens is Portland, Oregon. Last September, the city passed the most comprehensive ban on facial recognition technology to date. The law prohibits law enforcement, as well as public places and businesses, from using the technology. This includes restaurants, brick-and-mortar stores, and anywhere else the public would visit.

Hector Dominguez is the Open Data Coordinator with Portland’s Smart City PDX. He says that as the department did its due diligence in developing Portland’s facial recognition ban, it started “getting a lot of community feedback and recognizing the role that private businesses are having in connecting people’s information.” Even more worrisome were businesses that appeared out of thin air to lobby against these tight regulations.

Amazon, for example, lobbied Portland for the first time ever and spent $12,000. The Oregon Bankers Association asked for an exception allowing use of the technology when providing law enforcement with video of robberies. And the Portland Business Alliance asked for exemptions for retailers, airlines, banks, hotels, concert venues, and amusement parks. But Portland stayed strong and allowed only one exemption: a business may use the technology if it must do so to comply with federal, state, or local laws. This covers agencies like U.S. Customs and Border Protection, which operates at the airport.


According to Lia Holland, an organizer in Portland with Fight for the Future, police departments may use facial recognition maliciously, but private businesses use the technology for similarly troubling reasons. One use is more hidden (like a business linking, monitoring, and tracking its customers’ faces against their purchase behaviors or intent), while the other is more in-your-face. In everyday circumstances, says Holland, businesses have more reason to surveil than law enforcement does.

Policing the Public Today

Although facial recognition technology is an example of an advanced machine learning application, it is ingrained with biases that could negatively impact someone’s life for decades. In that sense, the technology is still in its infancy and needs a lot of fixing, testing, and training before it can meet even Portland’s strict guidelines. Until then, facial recognition is not suitable for use against the public, especially by businesses that will quietly profit from it and by law enforcement that will act violently on its findings.

Would you buy stealth clothing that confuses facial recognition algorithms? Let us know in the comments below!
