A 20-year-old man streamed the murder of his 11-month-old daughter on Facebook Live Monday, before committing suicide. Wuttisan Wongtalay filmed the gruesome act on the rooftop of a deserted hotel in Thailand.
This is the latest in a string of violent incidents that have been posted on the social media site just this year.
January: Four people were arrested in Chicago on allegations they kidnapped and assaulted a disabled man, an attack that was broadcast on Facebook Live.
March: A 15-year-old girl, also in Chicago, was allegedly sexually assaulted by five or six males while as many as 40 people watched the attack on Facebook.
April: The murder of Robert Godwin, a 74-year-old Cleveland man, was posted by his killer, Steve Stephens, on Facebook. This led to a highly publicized nationwide manhunt for the shooter.
Reuters reports that, in this latest incident, two videos of the child’s killing were posted on the father’s Facebook page and removed about 24 hours later. They say the first video had about 112,000 views by mid-afternoon local time Tuesday, and the second video had 258,000 views.
A spokesperson for Facebook released this statement to Fox News: “This is an appalling incident and our hearts go out to the family of the victim. There is absolutely no place for acts of this kind on Facebook and the footage has now been removed.”
But why, especially after this growing history of streaming violence, did it take so long and have so many views before it was taken down?
“The problem is, in the online setting, users will tell friends, share with others and repost. That's the opposite of what you would expect in the real world, where someone would immediately call 911,” Hemu Nigam, CEO of online safety consultancy SSP Blue and former Chief Security Officer of MySpace when it was a News Corporation property, told Fox News in a phone interview. “That leads to a longer time before it is reported to an organization like Facebook.”
And it seems that if these videos are not reported, companies still have no good way to locate and remove them quickly. Wongtalay’s video was also posted to YouTube, but was reportedly taken down within 15 minutes of the BBC alerting YouTube to its presence.
“Technology is amazing at identifying people’s likes, habits, dislikes, preferences. They have a platform to analyze that behavior,” Nigam says. “Using that data to protect citizens is quite possible. There is not an engineer out there who would say that is not true.”
He added, “Companies are in a place where they have the ability to solve these real challenges created by the way people use technology. They just have to put their mind to it.”
Facebook says that is just what they are doing.
“We are reviewing our reporting flows to be sure people can report videos and other material that violates our standards as easily and quickly as possible,” Facebook’s VP of Global Operations, Justin Osofsky, wrote last week.
He says the company has thousands of people around the world, working 24 hours a day, seven days a week, in more than 40 languages, reviewing the millions of items that are reported every week.
“We are constantly exploring ways that new technologies can help us make sure Facebook is a safe environment. Artificial intelligence, for example, plays an important part in this work, helping us prevent the videos from being re-shared in their entirety.”
As social media sites figure out the best way to manage this problem, it seems videos like the heinous act that took place in Thailand Monday will continue to slip through the cracks and remain online, visible to the world, for long stretches of time.
Simply put, Nigam told us, “That video should not have been seen that many times.”
If you see a disturbing video on Facebook, you can report it through the company's Community Standards page.