
"I'd blush if I could."

That's the title of a new United Nations report claiming that feminine-voiced artificial intelligence (AI) helpers like Apple's Siri, Amazon's Alexa and Google Assistant reinforce and spread harmful gender stereotypes that women are subservient and put up with poor treatment. It's also what Siri said when a user told it, "Hey Siri, you're a b---h."

The report notes that since most voice assistants speak with a female voice by default, it signals that women are "docile helpers," always available to do what's needed at a command of "Hey" or "OK."


“Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminized digital assistants to greet verbal abuse with catch-me-if-you-can flirtation,” the report says.

Of particular concern, the report says, is that the assistants often give "deflecting, lackluster or apologetic responses" when insulted, reinforcing the stereotype that women are submissive and will let abuse slide.


Apple CEO Tim Cook talks about Siri during an Apple event March 7, 2012. (REUTERS/Robert Galbraith)


When a Fox News employee told a Siri set to a British male voice, "Hey Siri, you're a b---h," it responded: "I don't know how to respond to that."

The U.N. report suggests that digital assistants be programmed to discourage gender-based insults. It calls on tech companies to stop making the assistants female by default and to increase the representation of women in artificial intelligence fields.

Fox News reached out to Apple and Amazon for comment.
