Microsoft's facial-recognition technology is getting smarter at recognizing people with darker skin tones.

On Tuesday, the company touted the progress, though it comes amid growing worries that these technologies will enable surveillance against people of color.

Microsoft's announcement didn't broach those concerns. Instead, the company focused on how its facial-recognition tech could misidentify both men and women with darker skin tones, saying it has recently reduced the system's error rates by up to 20 times.

In February, research from MIT and Stanford University highlighted how facial-recognition technologies can be built with bias. The study found that Microsoft's own system was 99 percent accurate when it came to identifying the gender of lighter-skinned people, but only 87 percent accurate for darker-skinned subjects.

For women with dark skin, the accuracy rate dropped even further, to 79 percent. The reason? The computer algorithms powering facial-recognition systems are trained by scanning thousands of photos and learning to classify them. If those photos mainly feature people of one skin tone over another, the algorithms will inadvertently develop a bias.
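To see why skewed training data produces skewed accuracy, here is a minimal, hypothetical sketch (not Microsoft's actual pipeline): a toy classifier trained mostly on one synthetic group ends up far less accurate on an underrepresented group whose data looks different. The group sizes, features, and numbers below are invented purely for illustration.

```python
# Toy demonstration of training-data imbalance: a classifier tuned to a dominant
# group performs much worse on an underrepresented group with shifted features.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    """Generate n samples of a binary label with a group-specific feature shift."""
    y = rng.integers(0, 2, size=n)                                   # label to predict
    X = rng.normal(loc=y[:, None] * 1.0 + shift, scale=1.0, size=(n, 5))
    return X, y

# Group A dominates the training set; group B is underrepresented and shifted.
X_a, y_a = make_group(9000, shift=0.0)
X_b, y_b = make_group(300, shift=1.5)

X_train = np.vstack([X_a, X_b])
y_train = np.concatenate([y_a, y_b])

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Evaluate on fresh samples from each group separately.
for name, shift in [("group A (well represented)", 0.0),
                    ("group B (underrepresented)", 1.5)]:
    X_test, y_test = make_group(2000, shift)
    print(f"{name}: accuracy = {model.score(X_test, y_test):.2%}")
```

Because the model's decision boundary is fit almost entirely to group A, samples from group B are systematically misclassified, mirroring the accuracy gap the MIT study described.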

To address the problem, Microsoft launched a new data collection effort to improve the training data its facial-recognition system was built on. The company also tweaked how the algorithms can classify people.

Microsoft's tech is available as a tool for website and app developers to analyze photos and videos to determine what they contain. But in recent months, civil liberties groups have also been sounding the alarm over how facial-recognition systems are used by law enforcement.
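For context on what that developer tool looks like in practice, here is a minimal sketch of calling a cloud face-detection endpoint from application code. The region, subscription key, image URL, and requested attributes are placeholders, and the exact parameters of Microsoft's Face API may differ, so treat this as an assumption-laden illustration rather than official usage.

```python
# Minimal sketch of calling a cloud face-detection endpoint from an app backend.
# The endpoint URL, key, and attribute names are placeholders and may not match
# the current Azure Face API exactly; consult Microsoft's documentation before use.
import requests

ENDPOINT = "https://westus.api.cognitive.microsoft.com/face/v1.0/detect"  # placeholder region
SUBSCRIPTION_KEY = "YOUR_KEY_HERE"                                        # placeholder key

def detect_faces(image_url: str):
    """Send an image URL to the detection endpoint and return the parsed JSON response."""
    response = requests.post(
        ENDPOINT,
        params={"returnFaceAttributes": "age,gender"},  # attributes assumed for illustration
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/json",
        },
        json={"url": image_url},
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    faces = detect_faces("https://example.com/photo.jpg")  # placeholder image
    for face in faces:
        print(face.get("faceRectangle"), face.get("faceAttributes"))
```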

The technology can potentially help police identify criminal suspects in photos or videos human eyes may have missed. (Microsoft's own AI algorithms are being used by investigators to mine criminal justice data.) However, critics argue that facial-recognition systems are also prone to making errors, and could be abused to discriminate against immigrants and activists.

Some of Microsoft's own employees are even worried. Last week, a group of them began calling for an end to a company contract with US immigration authorities that appeared to involve Microsoft's facial-recognition tech. However, Microsoft says the contract only deals with cloud services such as email, calendar, and messaging.

This article originally appeared on PCMag.com.