Will Steve Stephens' Facebook Killing Video Have A Copycat Effect On Viewers Prone To Violence?

People are silhouetted as they pose with mobile devices in front of a screen projected with a Facebook logo, in this picture illustration taken in Zenica October 29, 2014. Reuters

There was a reason Steve Stephens recorded himself killing a man and uploaded it to Facebook for the world to see, instead of sending the video in a text message to ex-girlfriend Joy Lane.

In Stephens' mind, the horrifying moment when he killed Robert Godwin Sr., a random passerby, would have the most impact if it was shared with his friends on the social media platform—some of whom had been following along as he claimed to embark on a days-long killing spree. Experts say this is an all-too-common theme in the digital age: people committing acts of senseless violence are treating social media sites as tools to gain attention and spread whatever their misguided cause may be.


Stephens knew that publishing the moment he took a man's life to Facebook would provoke a response. What he may not have known, however, was the potential contagion effect the video could have on the countless viewers who have since seen it, some of whom may themselves be prone to committing horrific and violent crimes.

The accused killer died of a self-inflicted gunshot wound Tuesday after police spent days tracking him down. His video, which was prerecorded and then posted online, remained on Facebook for three hours before it was reported and taken down, but former New Jersey Attorney General Anne Milgram tells Newsweek that window of time could easily have been enough to incite the same acts in another vengeful person. Moreover, virtually any video with such global significance lives on indefinitely across the web; Stephens' video has already been viewed hundreds of thousands of times on numerous sites.

"It seems likely that someone more inclined to commit violence already might see the murderer getting notoriety and attention in something like this public video and potentially commit a copycat act," Milgram said. "When looking at studies on copycat suicides, it's clear excessive reporting and a great deal of attention, along with descriptions, imagery and specific details are correlated with suicide contagion... Research also shows significant correlation between mass killings and school shooting contagion with traditional media coverage. I would imagine social media would have an equal or even greater effect."

Facebook defended itself in January after a video showed a group of teenagers kidnapping a mentally handicapped man and repeatedly beating him, saying in a statement to the Associated Press, "We do not allow people to celebrate or glorify crimes on Facebook and have removed the original video for this reason. In many instances, though, when people share this type of content, they are doing so to condemn violence or raise awareness about it. In that case, the video would be allowed."

But for Milgram, who implemented the use of "smart data" to prevent crime across the state of New Jersey, the question isn't about whether to maintain the right for users to denounce violence by sharing depictions of crimes recorded for the web.

"The question is, can Facebook and other platforms create an algorithm—or does a technological method already exist—capable of identifying violent crimes and preventing them from being posted to the web, while continuing to provide its billions of users the same ease of access they've become accustomed to," Milgram said. "I think after this tragedy, that's their responsibility."

Facebook launched an internal review Monday of its procedures for removing violent videos, though Justin Osofsky, vice president for global operations and media partnerships, said the platform would continue relying on users to report objectionable material.

"We prioritize reports with serious safety implications for our community, and are working on making that review process go even faster," Osofsky said. "We disabled the suspect's account within 23 minutes of receiving the first report about the murder video, and two hours after receiving a report of any kind. But we know we need to do better."