Facebook's Arrogance Crisis: Can Mark Zuckerberg Claw Back Control? | Opinion

Facebook’s catastrophic privacy fail put its stock in free fall, with shares falling more than 10 percent over the past three weeks, destroying tens of billions of dollars of market capitalization. CEO Mark Zuckerberg recently announced that pretty much all users should “assume” their profiles have been scraped.

That’s more than two billion people.

A leaked memo from a top Facebook executive extolled “growth at any cost.” Employees were reportedly upset not by the corrosion of values the memo revealed, but by the idea that someone had “leaked” their cultural norms to the world. Facebook’s troubles can be traced all the way up to the boardroom, and to the institutional arrogance that has taken hold there.

Self-assured in the blind belief that a technology company is automatically a force for good, Facebook’s executive leadership has failed to set appropriate parameters around societal and governance risks. A culture of loyalty has overtaken a culture of responsibility: the Chief Information Security Officer (CISO) appears to have been constructively terminated because, it is suggested, he advocated for more disclosure and thus questioned the party line.

At a minimum, Facebook should have proactively assessed the societal risks of its activities. Numerous Fortune 500 companies have instituted programs to examine and report on their risks along environmental, social, and governance dimensions. Twitter, a similarly situated company, issues a twice-yearly transparency report that includes how many troll accounts it deactivates. Tech and data companies like Bloomberg LP and Symantec actively use SASB standards to report on governance and similar factors.

Facebook CEO Mark Zuckerberg testifies before two U.S. Senate committees on April 10. Alex Wong/Getty Images

One could even make the argument for negligence on the part of Mark Zuckerberg and the board for failing to proactively engage in, and quantify, positive societal behaviors. This is hardly a novel concept: companies should have an appreciation for their environmental, social, and governance risks in addition to other kinds of operational and financial risks. Beyond social responsibility, the more progressive organizations are “leaning in” to generate long-term returns through active societal engagement for positive change.

Facebook can’t simply assume, a priori, that it is a white-hat actor incapable of ill. It must monitor the effects of its actions, engage with the ecosystem around it to promote and leverage the good, and minimize the bad.

Interestingly, a number of the AI techniques that Facebook has used to optimize its business are also useful in assessing societal benefit or harm. Refined by MIT as “social physics,” this computational social science is being tested by organizations like the United Nations and the European Union to manage social change for good.

By failing to measure social and governance risk appropriately, Facebook has endangered not only its market capitalization and shareholder return but also its future cash flows (the #deletefacebook movement, for example), and, it is suggested, has even invited government regulation.

Ethical guidelines could also have helped moderate Facebook’s risk profile. Mark Zuckerberg’s fateful early words, calling Facebook users who share data with the platform “dumb fucks,” will haunt him for years to come. Former Facebook executives, such as Sean Parker and Chamath Palihapitiya, have openly lamented the force they have unleashed on the world, saying it is “ripping apart society.”

Facebook founder and CEO Mark Zuckerberg arrives to testify following a break during a Senate Commerce, Science and Transportation Committee and Senate Judiciary Committee joint hearing about Facebook on Capitol Hill, on April 10. Saul Loeb/AFP/Getty Images

Mark Zuckerberg’s testimony to the U.S. Senate on Tuesday (April 10) left more questions than answers, as 44 senators lined up to grill him. The most interesting part was the near-universal acceptance of the principle that people should “own their own data,” although Zuckerberg dodged the question of whether GDPR (the European privacy regulation) should be implemented in the U.S.

It was also notable that he reversed a longstanding stance and finally acknowledged that platforms, including Facebook, are responsible for the content on them. He was also positive about the possibility of “information fiduciaries” helping people control their data in the best possible manner.

Wall Street largely applauded, sending Facebook stock up more than $6 a share, apparently judging that the CEO’s performance was adequate to head off some of the more drastic corrective actions that could be imposed on the company. Many senators, however, were deeply dissatisfied with the pro forma apologies. Senator Blumenthal told Zuckerberg: “We’ve seen the apology tours before. You have refused to acknowledge even an ethical violation.”

Given Zuckerberg’s vise-like grip on Facebook, we don’t anticipate serious change in the long term unless the company is closely monitored. That would not necessarily require creating a new government bureaucracy. What if Facebook were required to turn its powerful tools of social analysis on itself, to measure and report its effect on social outcomes? Could Facebook be required to regularly update its policies in order to genuinely improve society as a whole?

Alex Pentland is a professor at MIT. David Shrier is CEO of Distilled Analytics. This column reflects their individual opinions and not necessarily those of MIT or Distilled Analytics.