IBM has quit the facial recognition technology business, citing concerns that it can be used for mass surveillance and racial profiling.

The move comes amid ongoing protests over the death of George Floyd, who died May 25 while in police custody in Minneapolis, which have thrust racial injustice and police monitoring technology into the spotlight.

The tech giant’s CEO Arvind Krishna explained IBM’s decision in a letter sent to U.S. lawmakers Monday.

“IBM no longer offers general purpose IBM facial recognition or analysis software,” he wrote. “IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and Principles of Trust and Transparency.”

In this April 26, 2017, file photo, the IBM logo is displayed on the IBM building in Midtown Manhattan, New York. (AP Photo/Mary Altaffer, File)

IBM's decision to stop building and selling facial recognition software is unlikely to affect its bottom line, since the tech giant is increasingly focused on cloud computing while an array of lesser-known firms have cornered the market for government facial recognition contracts.

The Armonk, N.Y.-based IBM is one of several big tech firms that had earlier sought to improve the accuracy of their face-scanning software after research found racial and gender disparities. But Krishna, the company’s new CEO, is now questioning whether it should be used by police at all.

“We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies,” he wrote. “Artificial Intelligence is a powerful tool that can help law enforcement keep citizens safe. But vendors and users of AI systems have a shared responsibility to ensure that AI is tested for bias, particularly when used in law enforcement, and that such bias testing is audited and reported.”

IBM had previously tested its facial recognition software with the New York Police Department, although the department has more recently used other vendors. It's not clear if IBM has existing contracts with other government agencies.

Many U.S. law enforcement agencies rely on facial recognition software built by companies less well known to the public, such as Tokyo-based NEC or the European companies Idemia and Cognitec, according to Clare Garvie, a researcher at Georgetown University's Center on Privacy and Technology.

A smaller number have partnered with Amazon, which has attracted the most opposition from privacy advocates since it introduced its Rekognition software in 2016.

IBM’s decision was welcomed by California Democratic Rep. Jimmy Gomez.

“Smart move by @IBM as we KNOW #FacialRecognition tech as a law enforcement tool is NOT ready for primetime. Back in ‘18, @amazon’s software misidentified me & other Members of Congress — mostly people of color — during an @ACLU test,” he tweeted Monday. “This tech needs legislative guardrails NOW.”

The ACLU test found Amazon’s facial recognition software falsely matched 26 California state lawmakers, or more than 1 in 5, to images from a set of 25,000 public arrest photographs. The ACLU said that over half of the false positives were people of color.

In a blog post on Wednesday, Mike Leone, a senior analyst at ESG Market Research, said that IBM’s move will not stop facial recognition from being used by law enforcement and government agencies.

“Facial recognition will continue on its gray area trajectory with or without IBM,” he explained. “But what IBM has done, specifically Arvind Krishna, is bring attention to a growing concern that needs far more national and global attention.

“The use of facial recognition needs to be scrutinized for bias and privacy concerns. It needs oversight,” he added. “It needs guardrails. Usage, especially from law enforcement and governing entities, needs to be transparent.”

Fox News’ Christopher Carbone and the Associated Press contributed to this article. Follow James Rogers on Twitter @jamesjrogers