SAN FRANCISCO — A distraught father in Thailand killed his 11-month-old child on Facebook Live, the latest in a string of disturbing incidents including suicide, torture and sexual assault that have reached millions on the live-streaming service, raising questions about Facebook’s ability to monitor violence on its platform.
Before committing suicide Tuesday, Wuttisan Wongtalay, 20, filmed the murder of his daughter on the rooftop of a hotel in two videos streamed on Facebook. The videos remained accessible on Facebook for 24 hours. Viewers also uploaded the videos to Google's YouTube, which took them down.
Wuttisan’s suicide was not broadcast on Facebook. His body was found next to his daughter, Jullaus Suvannin, the Thai police officer in charge of the case, told the Guardian newspaper.
“This is an appalling incident and our hearts go out to the family of the victim. There is absolutely no place for acts of this kind on Facebook and the footage has now been removed,” Facebook said in a statement.
The videotaped execution of 74-year-old grandfather Robert Godwin Sr. in Cleveland earlier this month prompted CEO Mark Zuckerberg to address growing tensions over violence on Facebook at the company’s annual conference for software developers. The suspect in the murder, Steve Stephens, killed himself that morning as he was being hunted for Godwin’s murder.
“We have a lot of work and we will keep doing all we can to prevent tragedies like this from happening,” Zuckerberg said from the keynote stage.
Thai police said Wuttisan’s broadcast may have been influenced by the Cleveland Facebook killing, according to The Guardian.
Facebook has pledged to improve how users can flag violent content. Facebook prohibits content that glorifies or promotes violence, permitting violent content only when it is considered to be in the public interest.
Facebook eventually removed the video of Godwin’s murder that Stephens uploaded to Facebook, but not before it had been viewed millions of times and posted elsewhere. The victim’s grandson, Ryan Godwin, begged people to stop sharing the footage, writing on Twitter: “That is my grandfather, show some respect.”
Stephens’ ex-girlfriend, a 42-year-old social worker, told NBC News that Stephens was solely responsible for his actions, but that the publicity he gained from the viral Facebook videos escalated the crisis.
Facebook’s video features were “something that was meant to be good [but] was perverted bad,” she said.
“I’m a clinician,” she said. “I’m in the mental health field and I teach, and we have a session in my class where we talk about media and how the use of media has almost been perverted.
“What it’s being used for is not what it was intended to be used for,” she said.
Last week Rev. Jesse Jackson and officials in the Chicago area called on Facebook to drop the Live service for 30 days in the wake of Godwin’s murder.
In an interview earlier this month at Facebook’s Silicon Valley headquarters, Zuckerberg told USA TODAY his company has a responsibility “to continue to get better at making sure we are not a tool for spreading” video of violent acts.
The Facebook CEO was referring to incidents in which people commit violent crimes and post them on the giant social network and, with growing frequency, stream them live on Facebook Live.
“If it happens in Live or if it happens in comments, it’s the same,” Zuckerberg told USA TODAY. “If someone’s getting hurt, you want to be able to identify what’s going to happen and help the right people intervene sooner, and I view that as our responsibility.”
Live lets Facebook users share their lives publicly in real time. It’s often used to celebrate joyful occasions, such as a child’s first steps or the birth of a giraffe. But it’s also been used to capture traumatic, sometimes graphic, events as they unfold, from the police shooting of motorist Philando Castile last summer to the torture of a mentally challenged teenager in Chicago in January.
The social media giant deploys teams of content moderators who are trained to remove content that violates the company’s policies. It has also begun to use artificial intelligence software to detect prohibited content.
Facebook is within a few years of being able to reduce the amount of violent content on its platform with the help of artificial intelligence that can detect what’s happening in a video, Zuckerberg told USA TODAY.
“In the near term, the system we have is based on people reporting it and us going through and reviewing the reports. There are some things that I think we can speed up there. But the long-term solution is going to be having better artificial intelligence tools to give context of what’s going on,” Zuckerberg said. “That won’t be this year, but I also don’t think that’s 10 years from now. I do think over a few-year period, this will get better.”