Google-owned YouTube is dealing with an exodus of some of its major advertisers in the wake of disturbing claims that pedophiles have been infiltrating the comments sections of videos featuring children, seemingly without consequence, until now. And YouTube's own technology may be perpetuating the problem.
On Feb. 17, a YouTube user by the name of Matt Watson published a video titled "Youtube is Facilitating the Sexual Exploitation of Children, and it's Being Monetized (2019)." The video itself, which has racked up nearly 2.5 million views as of this writing, is difficult to watch, and what Watson found is even more chilling. Watch at your own risk.
Watson discovered a myriad of YouTube videos featuring children engaged in otherwise innocuous activities, like doing gymnastics or dancing with friends, that were filled with sickening comments from apparent pedophiles. The users were interacting with each other, sexually objectifying minors and even reportedly trading links to actual child pornography.
Watson eventually found that some of these videos were accompanied by advertisements from companies large and small, meaning the videos where these disturbing exchanges were unfolding had been "monetized."
"You're in the car and you've got your kids, and you're busy, and you hand them a tablet or they've got a phone and they're watching YouTube videos - kids videos - and nothing wrong until you go down in the comments section," explained Kurt "The Cyberguy" Knutson.
"As it turns out nobody but nobody is stopping these disgusting human beings from doing perverted things online," Knutson added in an interview with Fox & Friends.
Responding to a request for comment for this story, YouTube said: “Any content - including comments - that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube." The video streaming giant added it "took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling comments on tens of millions of videos that include minors."
YouTube acknowledged "[t]here's more to be done," and explained to Fox News that it has taken a more aggressive response in this particular situation than normal. That apparently includes removing more than 400 channels, disabling comments on tens of millions of videos, removing thousands more and even reporting some comments to the National Center for Missing & Exploited Children over the course of 48 hours.
YouTube has faced an exodus of major advertisers in the wake of Watson's report. In statements to Fox News, AT&T, Nestle, GNC, Canada Goose, RB (Lysol) and Epic Games, the company behind the hugely popular video game "Fortnite," all confirmed that they were pausing their advertising as a result of the issue.
AT&T had only resumed advertising on YouTube earlier this year, after pulling its ads in 2017 when a similar controversy erupted over advertisements appearing alongside videos promoting terrorism, hate speech and other unsavory content. It took YouTube nearly two years to convince AT&T and other major brands to return to its platform.
“Until Google can protect our brand from offensive content of any kind, we are removing all advertising from YouTube,” an AT&T spokesperson told Fox News.
Unfortunately for YouTube, the disturbing comments being traded between apparent pedophiles are just part of the problem. And Fox News can confirm that another issue Watson exposed in his investigation is still plaguing the site to this day.
"What makes what I've discovered worse is the fact that I can find this wormhole from a fresh, never-before-used YouTube account via innocuous videos within about five clicks," Watson said in his video.
What Watson is referring to is the fact that YouTube “recommends” videos to its users using an algorithm that makes determinations based on your previous search history. These "recommended for you" videos typically appear alongside whatever video one is watching on the platform.
What Watson found (and Fox News was able to confirm) is that YouTube's algorithm is "recommending" videos that show children in these innocuous but potentially compromising situations to people who are searching for videos that are clearly more adult in nature. It took just three clicks to get from a "bikini" video showing an adult woman trying on new clothes to a "wormhole," as Watson puts it, where the "recommended for you" section becomes almost entirely populated with videos of minors.
"It doesn't matter that they flag videos and turn off the comments," Watson said. "These videos are still being monetized, and more importantly they are still available for users to watch."
Fox Business producer Daniel Hillsdon contributed to this report.