Is Google doing enough to protect kids from disturbing YouTube videos?

“Paw Patrol” is all the rage with kids.

They know how to type those two little words into YouTube and find a boatload of free, funny videos. Most are age-appropriate, but a few years ago, one video showed knockoff characters, drawn to resemble those in “Paw Patrol,” dying by suicide. The video is no longer available, but it proved one thing:

The Internet is a free-for-all, and disturbing YouTube content is not hard to find.

In a cursory search, you can find videos showing people chasing one another with chain saws and horrifying car accidents. In 2015, the Google-owned service released the YouTube Kids app for iPhone and Android, but some experts say “age-gating” and filtering content is not enough.

“It is not a matter of if, it is a matter of when a child dies because of something they see on YouTube,” says Brenda Bisner, the senior vice president of content at Kidoodle TV.

“In the worst cases, small children may have accidentally streamed pornography, bestiality, extremely violent and terrifying content, drug use and/or just weird stuff,” adds Leilani Carver, Ph.D., a professor at Maryville University who teaches strategic communication. “Most of these things were not initially posted on YouTube Kids but slipped through the filters.”

Google, however, told Fox News that bestiality was never found on YouTube.

Dr. Carver told Fox News that the landing page for YouTube Kids includes an explanation about content filtering -- it’s partly automated with bots, partly controlled by humans, and partly depends on users flagging inappropriate content.

The question is whether that’s really enough.

“Even if I watch with my child, I do not know the video is disturbing until we have watched it together (unless I screen everything that my child watches, which is often not practical),” says Carver. “Once something horrible has been seen by a child, it cannot be unseen.”

Bisner notes that YouTube’s terms of service state plainly that the videos are not intended for anyone under 13. Whether it’s YouTube or YouTube Kids, a policy of flagging or blocking content may be (mostly) adequate for the free-for-all on Twitter or Reddit, but all of the experts say the issue is that kids watch more YouTube than any other video service.

Carver says about 500 hours of video are uploaded to YouTube every minute. No one can possibly police that many videos, so the experts’ call to action is for Google and YouTube to monitor videos much more closely and ensure child safety, especially on the YouTube Kids app.

Susan Merrill, the digital media director of Family First, says another solution is for parents to become much more aware of which content is available and to monitor apps and services more closely. Some parents are not fluent in technology, she says, and they even rely on their kids to teach them how a new app works, but that’s asking for trouble.

“The average parent of a 10-year-old today did not have YouTube as a child,” says Merrill, speaking to Fox News. “The parents underestimate children (and trust the Internet) and think they aren't able to access content that they can easily.”

She says “full immersion” into the lives of our kids will help. Knowing what they are doing, coming alongside them and watching content together -- that’s the key.

“Most parents hesitate to use parental controls because their kids get angry with them,” she says. “If we love our children with the future in mind we will protect them fearlessly.”

“We were unable to find the videos described on YouTube Kids, our standalone app for kids to more safely explore their interests," said YouTube spokesperson Ivy Choi, in a statement emailed to Fox News. "Sometimes, content on YouTube’s main experience that don’t violate our policies may not be appropriate for all audiences. We have built in policies and protections to age-restrict or remove mature content that explicitly targets minors and families. As a reminder, YouTube has never been for people under 13 and we terminate thousands of these accounts per week.”