Congress demanded answers from tech giants including Facebook and Twitter on Wednesday about whether they’re doing enough to keep terror networks off social media, suggesting that their reports about thwarting such efforts show they’re falling short.

Even Twitter executive Carlos Monje’s report that the so-called microblogging service, with an estimated 330 million active users, has suspended more than 1.1 million terrorist-related accounts since mid-2015 wasn’t enough for Sen. Brian Schatz, D-Hawaii.

“Based on your results, you are not where you need to be,” said Schatz. He expressed fear of election-year chaos like the kind America’s enemies created in 2016.

In response, Monje said at the Senate Commerce Committee hearing that Twitter gets better “every day” at detecting terror-related and other dangerous messages. But he also acknowledged that company officials routinely “ask themselves the same question” about how they can improve.

The executives from Facebook, YouTube and Twitter also conceded at the hearing that they have yet to agree on a “shared standard” of what constitutes extremist or terrorist content. That came in response to questioning about the matter by the committee chairman, Sen. John Thune, R-S.D.

“That’s right, Mr. Chairman,” replied Facebook executive Monika Bickert.

However, she also argued that the popular social media network does not allow anybody involved with the radical Islamic group Boko Haram, for example, to have a Facebook page “even if you are simply talking about the lovely weather.”

She also said Facebook does not allow “any praise for such groups or their actions.”

Thune raised concerns about a YouTube video that helped the so-called “Manchester bomber” create his explosive device. He cited a recent report that found the video had been put up and taken down 11 times on YouTube, yet resurfaced on the site again this month.

“How is that possible?” Thune asked. The homemade bomb was detonated at Manchester Arena in the United Kingdom in May 2017, killing some two dozen people and injuring more than 500 others. Radical Islamists have claimed responsibility for the attack.

YouTube executive Juniper Downs said the company was quickly catching “re-uploads” of the video, then removing them.

She also said the company was sharing such information as part of the coalition formed with Facebook, Twitter and Microsoft to “better identify and remove” offensive content.

Still, Monje, a Twitter public policy director, acknowledged that keeping up with artificial intelligence and other forces that spread dangerous content had elements of a “cat-and-mouse game.”

He also said that his company, like Facebook, was trying to identify and notify users who might have been exposed to Russian internet trolls spreading misinformation during the 2016 election cycle.