Civil Rights Commission Raises Concerns Over Law Enforcement’s Adoption of AI Technologies

Concerns Raised About AI Tools Used by Law Enforcement


A recent government report finds that an artificial intelligence tool used in law enforcement, airport security, and public housing surveillance disproportionately affects people of color and women.

Facial recognition technology is under scrutiny from civil rights advocates and some legislators due to concerns about privacy violations and lack of accuracy. The U.S. Commission on Civil Rights has noted that its use is increasing within federal agencies without sufficient oversight.

Commission Chair Rochelle Garza stated, “The unregulated deployment of facial recognition technology threatens civil rights, particularly for marginalized communities who have been affected by discriminatory practices.” Garza said AI systems must be thoroughly tested for fairness, and any disparities found should either be addressed promptly or the systems taken out of use until they are resolved.

Despite the swift advancement of facial recognition technology, no federal regulations currently oversee its application.

According to the Government Accountability Office, at least 18 federal agencies use facial recognition technology. Public records also show that, since 2007, the Justice Department has provided $4.2 million to local law enforcement agencies for programs that incorporate facial recognition.

FBI’s Extensive Database and Facial Recognition Software

The recently released 184-page report outlines the discreet application of facial recognition technology by federal agencies and the civil rights implications associated with it. The investigation focused on the Justice Department, Department of Homeland Security, and Department of Housing and Urban Development.

“Although there is an ongoing discussion regarding the advantages and disadvantages of federal facial recognition technology (FRT) usage, many agencies have already started employing this tech,” the report notes, stressing that it can lead to severe outcomes such as wrongful arrests, unjust surveillance, and discrimination.

Facial recognition systems employ biometric software to analyze a person’s facial features from an image. These systems attempt to match the analyzed face to a database in order to identify the individual. The accuracy can vary significantly based on factors like image quality and algorithm performance. The Commission’s findings reveal that even the best algorithms tend to produce more false matches for certain demographics, including older adults, women, and people of color.
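
To make the matching step concrete, the sketch below shows, in simplified form, how a one-to-many search might compare a probe image’s “embedding” (a numeric summary of facial features) against a database and accept the best match only above a similarity threshold. It is an illustration with made-up vectors and a hypothetical threshold, not any agency’s actual system; real deployments derive embeddings from trained neural networks.

```python
# Simplified illustration of one-to-many facial recognition matching.
# The embeddings and threshold below are invented for demonstration only.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, ranging from -1 to 1."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.8):
    """Return the best-matching identity, or None if no score clears the threshold."""
    best_name, best_score = None, -1.0
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    # The threshold governs the trade-off the report describes: set it too low
    # (for example, to cope with poor image quality) and false matches rise.
    return (best_name if best_score >= threshold else None, best_score)

# Hypothetical enrolled database and probe-image embedding.
gallery = {"record_a": np.array([0.9, 0.1, 0.3]), "record_b": np.array([0.2, 0.8, 0.5])}
print(identify(np.array([0.85, 0.15, 0.35]), gallery))
```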

The U.S. Marshals Service has utilized facial recognition technologies for investigations into fugitives, missing children, serious crimes, and protective security operations, as detailed in the report, citing the Justice Department. The Marshals have partnered with facial recognition software provider Clearview AI for several years. Some members of Congress raised concerns over the use of Clearview AI and other facial recognition systems in February 2022, highlighting potential violations of civil rights and privacy issues.

The FBI has employed facial recognition technology since at least 2011. The Justice Department informed the commission that the FBI can analyze a variety of images, including booking photos, driver’s licenses, public social media accounts, images from security footage, and pictures from other law enforcement agencies.

The U.S. Government Accountability Office has been assessing the FBI’s facial recognition technology usage since 2016, concluding in an earlier report that the FBI “should better ensure privacy and accuracy.”

The Justice Department, which oversees both the FBI and the Marshals Service, introduced an interim policy in December 2023 stating that facial recognition technology should be used only to generate investigative leads, according to the report. The Commission noted that there is not enough data to verify whether this policy is being effectively enforced.

When contacted by YSL News, the FBI opted not to provide any comments regarding the report. The Justice Department and the U.S. Marshals Service also did not respond to requests for comments.

Usage of AI in Border Control and Immigration Investigations

The Department of Homeland Security, which manages immigration control and airport security, has rolled out facial recognition technology across various agencies, as discovered by the commission.

The U.S. Immigration and Customs Enforcement has been utilizing facial recognition technology since 2008 through a contract with the biometrics defense firm L-1 Identity Solutions, according to the report.

This agreement allowed ICE access to the facial recognition database of the Rhode Island Division of Motor Vehicles to track down undocumented immigrants involved in criminal activities, as stated in a 2022 study by the Georgetown Law Center on Privacy & Technology.

Facial recognition technology is also deployed at airports, seaports, and within pedestrian pathways at the southern and northern border checkpoints for identity verification. The report highlighted that civil rights organizations in 2023 noted that the U.S. Customs and Border Protection mobile app faced difficulties in recognizing Black asylum seekers attempting to schedule appointments. However, this year, CBP claimed an accuracy rate exceeding 99% across various ethnic groups, according to the commission’s report.

DHS representative Dana Gallagher told YSL News that the department appreciates the commission’s feedback and emphasized that DHS has proactively tested its systems for bias.

The department opened a 24,000-square-foot lab in 2014 to test biometric systems, as detailed in the report. Gallagher noted that the Maryland Test Facility, which the commission visited and reviewed, has been recognized as a “model for testing facial recognition systems in real-life scenarios.”

“The Department of Homeland Security (DHS) is dedicated to safeguarding the privacy and civil liberties of everyone we engage with as part of our mission to ensure the safety of our homeland and the security of travelers,” Gallagher stated.

Facial Recognition Technology in Public Housing

Some surveillance cameras in public housing are equipped with facial recognition capabilities, and the footage has been used to pursue evictions over minor infractions, the commission reported. Lawmakers have raised concerns about the practice since 2019.

The U.S. Department of Housing and Urban Development (HUD) has not created this technology itself, according to the report. However, it has provided funding to public housing agencies that have utilized the grants to buy cameras with these capabilities, thereby allowing facial recognition technology to be used without proper regulation or oversight.

Since a significant number of public housing residents are women and people of color, the commission cautioned that the use of this technology could violate Title VI of the Civil Rights Act of 1964, which bars discrimination in federally funded programs. In April 2023, HUD stated that its Emergency Safety and Security Grants could not be used to purchase such technology, but the report pointed out that it did not impose restrictions on recipients who had already acquired the tool.

The commission referenced a May 2023 investigation by the Washington Post, which revealed that these cameras had been instrumental in penalizing residents and documenting minor infractions to facilitate evictions, such as smoking in prohibited areas or removing carts from laundry facilities. Lawyers representing evicted tenants noted a rise in cases where surveillance videos were used as evidence leading to evictions, as reported by the Post.

The Department of Housing and Urban Development did not respond to a request for comments from YSL News.

Civil Rights Advocates Push for Policy Reform

Tierra Bradford, a senior program manager for justice reform at the Leadership Conference on Civil and Human Rights, expressed her enthusiasm about the report, hoping it would inspire further action.

“This report highlights many concerns we in the justice sector have raised for quite some time,” Bradford remarked.

She further pointed out that the U.S. justice system has a track record of unfairly targeting marginalized groups, and the utilization of facial recognition technology seems to be a continuation of that issue.

“There should be a halt on technologies proven to be biased and that disproportionately impact communities.”

Ongoing National Debate on Facial Recognition Technology

The findings from the commission come amid ongoing discussions concerning facial recognition tools in both public and private settings.

For instance, in June, the Detroit Police Department announced it would adjust its policies regarding the technology used for crime-solving, following a federal settlement with a Black man wrongfully arrested for theft in 2020 based on facial recognition data.

Last year, the Federal Trade Commission barred Rite Aid from using AI facial recognition technology after finding that the system subjected customers, particularly people of color and women, to unwarranted searches. The FTC said the alerts were generated from low-quality images, producing thousands of false matches that led to customers being searched or expelled from stores for offenses they did not commit.

In Texas, a man who was wrongfully arrested and imprisoned for nearly two weeks filed a lawsuit in January, blaming facial recognition software for incorrectly identifying him as the suspect in a store robbery. The lawsuit states that low-quality surveillance footage from a Sunglass Hut in Houston falsely tagged Harvey Murphy Jr. as the suspect, resulting in a warrant for his arrest.

On a broader scale, members of the Commission on Civil Rights hope that the report will guide lawmakers in understanding the impact of rapidly changing technology. The agency is advocating for a testing protocol agencies can use to assess the effectiveness, equity, and accuracy of their software. It also calls for Congress to develop a legal mechanism for individuals harmed by facial recognition technology to seek justice.
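
As a rough illustration of what such an assessment could measure (a hypothetical sketch, not the Commission’s proposed protocol), an equity check might compare false match rates across demographic groups in a labeled evaluation set:

```python
# Hypothetical sketch of an equity check: compare false match rates by group.
# The evaluation records below are invented purely for demonstration.
from collections import defaultdict

# Each record: (demographic group, system reported a match, pair was truly the same person)
results = [
    ("group_a", True, True), ("group_a", False, False), ("group_a", False, False),
    ("group_b", True, False), ("group_b", True, False), ("group_b", False, False),
]

counts = defaultdict(lambda: {"false_matches": 0, "non_mated_pairs": 0})
for group, reported_match, truly_same in results:
    if not truly_same:  # only non-mated pairs can yield false matches
        counts[group]["non_mated_pairs"] += 1
        if reported_match:
            counts[group]["false_matches"] += 1

for group, c in counts.items():
    rate = c["false_matches"] / c["non_mated_pairs"]
    # Large gaps between groups would indicate the kind of disparity the report warns about.
    print(f"{group}: false match rate = {rate:.2f}")
```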

“I hope this bipartisan report will contribute to public policy that addresses numerous issues regarding artificial intelligence in general and specifically facial recognition technology,” Commissioner Stephen Gilchrist stated. “Our nation has a duty, both morally and legally, to protect the civil rights and civil liberties of all Americans.”