Why Facebook Can't Fix Fake News

Facebook is caught in a catch-22: the company drives revenue and traffic by showing users posts they're more likely to "like," which keeps them on Facebook and reading, but it also risks driving an exodus as fake news and political bitterness create a toxic environment. Jeff Chiu/AP

Imagine going on Facebook and finding no political posts—just your friends and their updates. It would feel like pulling on clean underwear after wearing a single pair for a week on a desert hike.

Fake news? That's only the start of the tempest about to roll through Facebook headquarters. The site is turning into a septic tank of polluted politics. It's becoming a party you want to leave because everybody got drunk and obnoxious. No new social network is going to beat Facebook by copying Facebook, but we might get fed up with all the politics and fake crap on Facebook and turn to something refreshingly different. Right there you can see the leak in Facebook's tire: the left glove that drops and leaves an opening for a rival to punch it in the face.

"There's a real risk this is doing great harm to the brand," I was told by a Facebook insider who has been part of recent conversations at the top of that behemoth but asked not to be identified because the person didn't want to alienate the company. This source said the election aftermath might be Facebook's "Tylenol moment"—a reference to the 1982 poisoning deaths of people who ingested Tylenol capsules laced with cyanide. That crisis nearly crippled its maker, medical giant Johnson & Johnson.

Think back just a couple of years, before the 2016 election cycle and before Facebook set itself up as the world's newswire. Facebook grew to a billion users by being a social network. It's where you found old friends and kept up with family. I just looked back at my 2014 Facebook timeline. Almost zero politics! And that's how most people liked it. Many users back then even beseeched friends to avoid political posts, or muted the violators if they persisted. In real life, most of us don't want to argue politics with our friends and family, so why would we want to do it online?


Then, over the past two years, Facebook aggressively morphed into a media site. It set up deals with publishers to populate all our timelines with stories. It subtly encouraged users to post stories and to "like" and comment on them. Facebook, of course, did this with its own goals in mind. To maximize profit, Facebook needs to keep users engaged and on the site as long as possible, and to get those users to create or interact with all the content in their feeds. That thrum of activity helps Facebook's algorithms more deftly target ads to more people, which makes Facebook even more attractive to advertisers.

Since politics is traditionally news, of course that topic started to slip into our feeds, and Facebook's setup encouraged sinister practices. As users zip through their news feeds, scanning only the headlines, they are more likely to click on and share stories that are outrageous or stir emotions. In other words, Facebook—unwittingly, from what I hear—incentivized clickbait "news" over more serious news, and the success of clickbait opened the way for fake news. "We're more likely to share inflammatory posts than non-inflammatory ones, which means that each Facebook session is a process by which we double down on the most radical beliefs in our feed," writes Mike Caulfield, an expert in learning environments. "Marketers figured this out and realized that to get you to click, they had to up the ante. So they produced conspiracy sites that have carefully designed, fictional stories that are inflammatory enough that you will click."

It's hard to say whether Facebook is the chicken or the egg in this wave of political propaganda—whether it helped create the acidic and divided politics around the world or if the ugly political environment merely found an accommodating home on Facebook. No doubt it was some of both, and the result is that our feeds are now overwhelmed with wingnut political content that gets amplified even if it's crazy. During the election, a lot of Facebook users just didn't care if something was true, says Paul Mihailidis, a media literacy professor at Emerson College. "They saw it as a way to advocate," he says. "They see a catchy headline, and the default is to share." If you look globally—the U.S., the U.K., France, Colombia, the Philippines—politics are getting more caustic, not less. In this kind of environment, all the media outlets that now rely on Facebook's audience are driven to flood us with click-worthy headlines that play to our fears and anger. Every trend line points to more of what we're growing to hate on Facebook.

Paradegoers pass cutouts of Trump and the pope during an Easter Sunday parade on March 27 in New York City. During the last days of the U.S. presidential election, a popular fake news article on Facebook reported that the pope had endorsed Trump. He hadn't, but that didn't stop people from posting, sharing and liking the fake article. Dennis Van Tine/STAR MAX/IPx

And what's Facebook going to do about it? It can't ban political posts; it would lose its position as a media outlet and blow out its bottom line. This conundrum is Facebook's version of the innovator's dilemma, as described in the book of the same name. The company is making too much money with this product to make changes that would bring in less money, even if it knows that's what its users really want. And the more money the company makes as a news outlet, the harder it will be, as a public company with demanding shareholders, for Facebook to change course. The news industry has spent the past couple of years lamenting Facebook's power, yet news might turn out to be Facebook's OxyContin: it felt great for a while, but ingest too much and the company could wind up in a ditch with a tattoo on its face.

Despite recent statements by Facebook CEO Mark Zuckerberg about his efforts to rein in fake news, he won't be able to do that easily. Zuckerberg hit on the reason when he said it would be problematic to set up Facebook editors or algorithms as "arbiters of truth." Because—what's truth? Centuries ago, it was true that the world was flat. When I was a kid, a mom would sit in a car's front seat and put her baby on her lap and not wear a seat belt. If someone said that was insanely unsafe, you probably would've blinked quizzically and said, "That's not true."

Facebook apparently is working on software that would flag or block fake news. Last year, Google published research on a knowledge-based trust algorithm that would sort for truth. Some college kids recently got attention for creating a Google Chrome extension they called FiB that automatically labels allegedly iffy sources. British technologist Peter Cochrane recently talked to me about developing software he called a truth engine. These might succeed in banning certain sites or identifying stories likely to be fake because they come from a single source, and yet software solutions can probably never overcome the problem that truth to me might not be truth to you, and truth today won't necessarily be truth tomorrow.

If all its political garbage can't be filtered or eliminated, Facebook will become vulnerable to a challenger. Let's say a new social network defines the category in a new way—make it more about our connections and our lives while banning media. Maybe it would have some new twists built around artificial intelligence or virtual reality. Such a competitor could disrupt Facebook the way the personal computer chewed at the market for IBM's expensive mainframes, or the way Airbnb has cut into Marriott's business. If users devote even some of their limited attention to a new social network, Facebook's momentum will stall.

Competitors may already be kicking at that door. Snapchat's parent will go public at a valuation of more than $25 billion, in part because it's starting to eat up some of the time people used to spend on Facebook. At least Facebook has been smart enough to buy properties that so far remain free of news pollution. Its Instagram and WhatsApp are more purely for social sharing and messaging. It bought virtual reality company Oculus VR, which could usher in a new way to socialize in a parallel cyberworld. Still, none of Facebook's other properties likely come close to pumping out the profits of the mother site. (Facebook doesn't break out results for its different properties.)

One constant about the technology industry is that every seemingly bulletproof superpower eventually meets its Waterloo. It happened to IBM, AOL, Microsoft and Intel, and it will happen to Apple, Amazon and Google. You might be witnessing Facebook's moment of truth, in a very literal sense. If Facebook turns into a bottomless cesspool of competing political "truths," a lot of us are going to soil ourselves and escape to something else.
