This story originally appeared in The Washington Post Oct. 25, 2021
https://www.washingtonpost.com/technology/2021/10/25/mark-zuckerberg-facebook-whistleblower/

The case against Mark Zuckerberg: Insiders say Facebook’s CEO chose growth over safety

The SEC has been asked to probe whether his iron-fisted management style, described in newly released documents and by insiders, led to disastrous outcomes.


(Washington Post illustration; Erin Scott/Reuters; Facebook screenshots; iStock)

By Elizabeth Dwoskin, Tory Newmyer, and Shibani Mahtani
October 25, 2021, updated at 3:34 p.m. EDT

CORRECTION
A previous version of this article incorrectly described the content of a blog post by Guy Rosen, Facebook’s vice president for integrity, and of congressional testimony by the firm's CEO, Mark Zuckerberg. Rosen wrote in the blog post that the White House had missed its vaccination goals, not that Facebook had missed its own goals. And Zuckerberg testified that the company removes 94 percent of the hate speech it finds before a human reports it, not just that it removes 94 percent of the hate speech it finds. The article has been corrected.

Late last year, Mark Zuckerberg faced a choice: Comply with demands from Vietnam’s ruling Communist Party to censor anti-government dissidents or risk getting knocked offline in one of Facebook’s most lucrative Asian markets.

In America, the tech CEO is a champion of free speech, reluctant to remove even malicious and misleading content from the platform. But in Vietnam, upholding the free-speech rights of people who question government leaders could have come with a significant cost in a country where the social network earns more than $1 billion in annual revenue, according to a 2018 estimate by Amnesty International.


The Facebook Papers show what its employees knew about how the website fostered polarization and how it contrasted with CEO Mark Zuckerberg's public comments. (JM Rieger/The Washington Post)

So Zuckerberg personally decided that Facebook would comply with Hanoi’s demands, according to three people familiar with the decision, speaking on the condition of anonymity to describe internal company discussions. Ahead of Vietnam’s party congress in January, Facebook significantly increased censorship of “anti-state” posts, giving the government near-total control over the platform, according to local activists and free-speech advocates.

Zuckerberg’s role in the Vietnam decision, which has not been previously reported, exemplifies his relentless determination to ensure Facebook’s dominance, sometimes at the expense of his stated values, according to interviews with more than a dozen former employees. That ethos has come under fire in a series of whistleblower complaints filed with the U.S. Securities and Exchange Commission by former Facebook product manager Frances Haugen.

While it’s unclear whether the SEC will take the case or pursue action against the CEO personally, the allegations made by the whistleblower represent arguably the most profound challenge to Zuckerberg’s leadership of the most powerful social media company on Earth. Experts said the SEC — which has the power to seek depositions, fine him and even remove him as chairman — is likely to dig more deeply into what he knew and when. Though his direct perspective is rarely reflected in the documents, the people who worked with him say his fingerprints are everywhere in them.


In particular, Zuckerberg made countless decisions and remarks that demonstrated a hard-line devotion to free speech. Even in Vietnam, the company says that the choice to censor is justified “to ensure our services remain available for millions of people who rely on them every day,” according to a statement provided to The Post.


Nguyen Quoc Duc Vuong, center, was sentenced by a Vietnamese court on July 7, 2020, to eight years in prison for live-streaming videos “humiliating” the country’s leaders on social media. (Vietnam News Agency/AFP/Getty Images)

Haugen references Zuckerberg’s public statements at least 20 times in her SEC complaints, asserting that the CEO’s singular power and unique level of control over Facebook mean he bears ultimate responsibility for a litany of societal harms. Her documents appear to contradict the CEO on a host of issues, including the platform’s impact on children’s mental health, whether its algorithms contribute to polarization and how much hate speech it detects around the world.


For example, Zuckerberg testified last year before Congress that the company removes 94 percent of the hate speech it finds before a human reports it — but internal documents show that its researchers estimated that the company was removing less than 5 percent of hate speech on Facebook. In March, Zuckerberg told Congress that it was “not at all clear” that social networks polarize people, when Facebook’s own researchers had repeatedly found that they do.

The documents — disclosures made to the SEC and provided to Congress in redacted form by Haugen’s legal counsel — were obtained and reviewed by a consortium of news organizations, including The Washington Post.

In her congressional testimony, Haugen repeatedly accused Zuckerberg of choosing growth over the public good, an allegation echoed in interviews with the former employees.


The former Facebook employee alleged that the company has misled the public about internal research showing that some of its products have harmful effects. (Reuters)

“The specter of Zuckerberg looms in everything the company does,” said Brian Boland, a former vice president of partnerships and marketing who left in 2020 after coming to believe that the platform was polarizing society. “It is entirely driven by him.”


A Facebook spokeswoman, Dani Lever, denied that decisions made by Zuckerberg “cause harm,” saying the claim was based on “selected documents that are mischaracterized and devoid of any context.”

“We have no commercial or moral incentive to do anything other than give the maximum number of people as much of a positive experience as possible,” she said. “Like every platform, we are constantly making difficult decisions between free expression and harmful speech, security and other issues, and we don’t make these decisions inside a vacuum — we rely on the input of our teams, as well as external subject matter experts, to navigate them. But drawing these societal lines is always better left to elected leaders, which is why we’ve spent many years advocating for Congress to pass updated Internet regulations.”

Facebook has previously fought efforts to hold Zuckerberg personally accountable. In 2019, as the company was facing a record-breaking $5 billion fine from the Federal Trade Commission for privacy violations related to Cambridge Analytica, a political consultancy that abused profile data from tens of millions of Facebook users, Facebook negotiated to protect Zuckerberg from direct liability. Internal Facebook briefing materials revealed the tech giant was willing to abandon settlement talks and duke it out in court if the agency insisted on pursuing the CEO.

The current chair of the SEC, Gary Gensler, has said he wants to go much harder on white-collar crime. Experts said Gensler is likely to weigh the Haugen complaint as he looks toward a new era of corporate accountability.

Zuckerberg “has to be the driver of these decisions,” said Sean McKessy, the first chief of the SEC’s whistleblower office, now representing whistleblowers in private practice at Phillips & Cohen. “This is not a typical public company with checks and balances. This is not a democracy, it’s an authoritarian state. … And although the SEC doesn’t have the strongest track record of holding individuals accountable, I certainly could see this case as being a poster child for doing so.”


Rep. Frank Pallone Jr. (D-N.J.) pressed Mark Zuckerberg about reports that Facebook executives reduced staff efforts to make its platform less divisive. (The Washington Post)

Zuckerberg, who is 37, founded Facebook 17 years ago in his college dorm room, envisioning a new way for classmates to connect with one another. Today, Facebook has become a conglomerate encompassing WhatsApp, Instagram and a hardware business. Zuckerberg is chairman of the board and controls 58 percent of the company’s voting shares, rendering his power virtually unchecked, both within the company and on its board.

An ownership structure that gives a single leader a lock on the board’s decision-making is “unprecedented at a company of this scale,” said Marc Goldstein, head of U.S. research for the proxy adviser Institutional Shareholder Services. “Facebook at this point is by far the largest company to have all this power concentrated in one person’s hands.”

Zuckerberg has long been obsessed with metrics, growth and neutralizing competitive threats, according to numerous people who have worked with him. The company’s use of “growth-hacking” tactics, such as tagging people in photos and buying lists of email addresses, was key to achieving its remarkable size — 3.51 billion monthly users, nearly half the planet. In Facebook’s early years, Zuckerberg set annual targets for the number of users the company wanted to gain. In 2014, he ordered teams at Facebook to grow “time spent,” or each user’s minutes spent on the service, by 10 percent a year, according to the documents and interviews.


In 2018, Zuckerberg defined a new metric that became his “north star,” according to a former executive. That metric was MSI — “meaningful social interactions” — named because the company wanted to emphasize the idea that engagement was more valuable than time spent passively scrolling through videos or other content. For example, the company’s algorithm would now weight posts that drew a large number of comments as more “meaningful” than posts that merely drew likes, and would use that information to inject the comment-filled posts into the news feeds of many more people who were not friends with the original poster, the documents said.
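To make the reported ranking logic concrete, here is a minimal sketch of an MSI-style scorer. The weights, the Post class and the function names are hypothetical illustrations, not Facebook’s actual system; the documents describe only the general principle that comments and reshares were treated as more “meaningful” signals than passive likes.

```python
# Illustrative sketch only: a toy "meaningful social interactions"-style scorer.
# The weights and names below are hypothetical -- the documents describe only the
# general idea that comments were weighted more heavily than likes -- and are not
# Facebook's actual values.
from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    likes: int
    comments: int
    reshares: int


# Hypothetical weights: active engagement (comments, reshares) counts for more
# than a passive like.
MSI_WEIGHTS = {"likes": 1.0, "comments": 15.0, "reshares": 30.0}


def msi_score(post: Post) -> float:
    """Return a toy engagement score used to rank a post in the feed."""
    return (MSI_WEIGHTS["likes"] * post.likes
            + MSI_WEIGHTS["comments"] * post.comments
            + MSI_WEIGHTS["reshares"] * post.reshares)


def rank_feed(candidates: list[Post]) -> list[Post]:
    """Order candidate posts so comment-heavy posts surface first."""
    return sorted(candidates, key=msi_score, reverse=True)


if __name__ == "__main__":
    feed = [
        Post("quiet_photo", likes=500, comments=2, reshares=1),
        Post("heated_thread", likes=40, comments=120, reshares=25),
    ]
    for post in rank_feed(feed):
        print(post.post_id, msi_score(post))
```

Run as written, the comment-heavy post outranks the one with far more likes, which is the dynamic the documents describe.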

Even as the company has grown into a large conglomerate, Zuckerberg has maintained a reputation as a hands-on manager who goes deep on product and policy decisions, particularly when they involve critical trade-offs between preserving speech and protecting users from harm — or between safety and growth.


When Sen. Lindsey Graham (R-S.C.) asked Facebook CEO Mark Zuckerberg whether he felt Facebook is a monopoly in the social media industry, Zuckerberg said no. (The Washington Post)

Politically, he has developed hard-line positions on free speech, announcing that he would allow politicians to lie in ads and at one time defending the rights of Holocaust denialists. He has publicly stated that he made the final call in the company’s most sensitive content decisions to date, including allowing President Donald Trump’s violence-inciting post during the George Floyd protests to stay up, despite objections from thousands of employees.

And his capacity for micromanagement is vast: He personally chose the colors and layout of the company’s “I got vaccinated” frames for user profile pictures, according to two of the people.

But the former employees who spoke with The Post said his influence goes far beyond what he has stated publicly, and is most felt in countless lesser-known decisions that shaped Facebook’s products to match Zuckerberg’s values — sometimes, critics say, at the expense of the personal safety of billions of users.

Ahead of the 2020 U.S. election, Facebook built a “voting information center” that promoted factual information about how to register to vote or sign up to be a poll worker. Teams at WhatsApp wanted to create a version of it in Spanish, pushing the information proactively through a chat bot or embedded link to millions of marginalized voters who communicate regularly through WhatsApp. But Zuckerberg objected, saying the idea was not “politically neutral” and could make the company appear partisan, according to a person familiar with the project who spoke on the condition of anonymity to discuss internal matters, as well as documents reviewed by The Post.


Ultimately, the company implemented a whittled-down version: a partnership with outside groups that allowed WhatsApp users to text a chat bot if they saw potential misinformation or to text a bot built by the organization Vote.org to get voting info.

“WhatsApp did not propose pushing information to all users, which is not how WhatsApp works,” said spokeswoman Christina LoNigro.

When considering whether to permit increased censorship in Vietnam, one former employee said, Zuckerberg’s line in the sand regarding free speech seemed to be constantly shifting. Warned that catering to a repressive regime could harm Facebook’s global reputation, according to one of the people, Zuckerberg argued that going offline entirely in Vietnam would cause even greater harm to free speech in the country.


Facebook co-founder and CEO Mark Zuckerberg in a Senate hearing on April 10, 2018. Zuckerberg was called to testify after it was reported that 87 million Facebook users had their personal information harvested by Cambridge Analytica, a British political consulting firm linked to the Trump campaign. (The Asahi Shimbun/Getty Images)

After Zuckerberg agreed to increase censorship of anti-government posts, Facebook’s transparency report shows that more than 2,200 posts by Vietnamese users were blocked between July and December 2020, compared with 834 in the previous six months. Pro-democracy and environmental groups, meanwhile, have become a target of government-led mass reporting campaigns, the documents and interviews show, landing people in jail for even mildly critical posts.

In April 2020, Zuckerberg appeared to shoot down or express reservations about researchers’ proposals to cut down on hate speech, nudity, graphic violence and misinformation, according to one of the documents. The pandemic was in its early days and coronavirus-related misinformation was spreading. The researchers proposed limiting the boost the news-feed algorithm gives to content it predicts will be reshared, because long chains of “reshares” tended to correlate with misinformation. Early tests showed that the limit could reduce coronavirus-related misinformation by as much as 38 percent, according to the document.
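A minimal sketch of what such a limit could look like in practice follows; the threshold and demotion factor are hypothetical assumptions, since the documents report only that restricting boosts for deeply reshared content reduced coronavirus-related misinformation by up to 38 percent in early tests.

```python
# Illustrative sketch only: one way a ranking system could limit the boost given
# to content deep in a reshare chain. The threshold and demotion factor are
# hypothetical, not values taken from the Facebook Papers.
def demoted_score(base_score: float, reshare_depth: int,
                  max_depth: int = 2, demotion: float = 0.5) -> float:
    """Halve a post's ranking score once it sits more than `max_depth`
    reshares away from the original poster."""
    if reshare_depth > max_depth:
        return base_score * demotion
    return base_score


# Example: a post three reshares removed from its author ranks at half
# the score it would otherwise receive.
print(demoted_score(100.0, reshare_depth=3))  # 50.0
```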

“Mark doesn’t think we could go broad,” said Anna Stepanov, the director giving the readout from the Zuckerberg meeting, about the CEO’s response to the proposal to change the algorithm. “We wouldn’t launch if there was a material trade-off with MSI.”


"Mark doesn't think we could go broad" (Facebook Papers)

Zuckerberg was a bit more open to a proposal to allow algorithms to be slightly less precise in what the software deemed to be hate speech, nudity and other banned categories — enabling it to delete a broader array of “probable violating content” and potentially reducing such harmful material by as much as 17 percent. But he only supported it as a “break the glass” measure, to be used in emergency situations such as the Jan. 6 insurrection, the documents said. Account demotions — which would have preemptively limited accounts that algorithms predicted were most likely to promote misinformation or hate — were off the table.

Facebook’s Lever said the “probable violating” proposals were not “break the glass” measures and that the company did implement them across categories such as graphic violence, nudity and porn, and hostile speech. It later also implemented the algorithm change fully for political and health categories, limits that remain in place today.

The Wall Street Journal first reported on the document’s existence.


The document that finally reached Zuckerberg was carefully tailored to address objections that researchers anticipated he would raise. For each of the nine suggestions that made their way up the chain, the data scientists added one row to list how the proposals would affect three areas he was known to care about: free speech, how Facebook is viewed publicly and how the algorithm change might affect MSI.

One former employee involved in that proposal process said those who worked on it were deflated by Zuckerberg’s response. The researchers had gone back and forth with leadership for months on it, changing it many times to address concerns about clamping down on free speech.


"In sum, the political whitelist violates multiple core company principles by treating politicians' speech as more valuable than ordinary citizens, and knowingly exposing users to harms in the form of misinformation." (Facebook Papers)

Zuckerberg, said a former executive, “is extremely inquisitive about anything that impacts how content gets ranked in the feed — because that’s the secret sauce, that’s the way this whole thing keeps spinning and working and making profits.”

“People felt, it was Mark’s thing, so he needs it to be successful. It needs to work,” the person added.


In 2019, those in the company’s civic integrity division, a roughly 200-person team that focused on how to mitigate harms caused by the platform, began to hear that Zuckerberg himself was becoming very worried about “false positives” — or legitimate speech being taken down by mistake. They were soon asked to justify their work by providing estimates of how many “false positives” any integrity-related project was producing, according to one of the people.

“Our very existence is fundamentally opposed to the goals of the company, the goals of Mark Zuckerberg,” said another person who quit. “And it made it so we had to justify our existence when other teams didn’t.”


Zuckerberg, left, and Joel Kaplan, Facebook's vice president of global public policy, chat after leaving a meeting on Capitol Hill in September 2019. (Samuel Corum/Getty Images)

“Founder-CEOs have superpowers that allow them to do courageous things. Mark has done that time and again,” Samidh Chakrabarti, the former head of the company’s civic integrity unit, who quit recently, tweeted this month. “But the trust deficit is real and the FB family may now better prosper under distributed leadership.”

Even as Facebook faces perhaps its most existential crisis to date over the whistleblower documents, Zuckerberg’s attention has lately been elsewhere, focused on a push toward virtual-reality hardware in what former executives said was an attempt to distance himself from the problems of the core Facebook platform, known internally as the Big Blue app. The company is reportedly even considering changing its name to align better with his vision of a virtual-reality-driven “metaverse.” Facebook has said it doesn’t comment on rumors or speculation.


The former employees said it was also not surprising that the document trove contains so few references to Zuckerberg’s thoughts. He has become more isolated in recent years, in the face of mounting scandals and leaks (Facebook disputes his isolation). He primarily communicates decisions through a small inner circle, known as the Small Team, and a slightly larger group of company leaders known as the M-Team, or Mark’s team. Information that reaches him is tightly controlled, as is information about him.

Even criticizing Zuckerberg personally can come with costs. An engineer who spoke with The Post, and whose story is reflected in the documents, says he was fired in 2020 after penning an open letter to Zuckerberg on the company’s chat system accusing the CEO of being responsible for protecting conservatives whose accounts had been escalated for misinformation.

One document, a 2020 proposal over whether to hide like counts on Instagram and Facebook, indicates it was sent to Zuckerberg for review and strongly suggests that he was directly aware of some of the research into the service’s harmful effects. It included internal research from 2018 that found 37 percent of teenagers said one reason they stopped posting content was that wanting to get enough likes caused them “stress or anxiety.”

(The like-hiding study, named Project Daisy, was also reported by the Journal. In 2021, the company ultimately did offer an option to hide likes on Instagram, but not on Facebook. Facebook says it didn’t implement Project Daisy because a test showed mixed results for people’s well-being and that the 2018 study used in the presentation “cannot be used to show that Instagram causes harm because the survey wasn’t designed to test that, nor does the data show it.”)

Over the summer, executives in Facebook’s Washington office heard that Zuckerberg was angry about President Biden’s charge that coronavirus misinformation on Facebook was “killing people.” Zuckerberg felt Biden had unfairly targeted the company and wanted to fight back, according to people who heard a key Zuckerberg adviser, Facebook Vice President for Global Affairs Nick Clegg, express the CEO’s viewpoint.

Zuckerberg is married to a physician, runs a foundation focused on health issues and had hoped that Facebook’s ability to help people during the pandemic would be legacy-making. Instead, the plan was going south.

In July, Guy Rosen, Facebook’s vice president for integrity, wrote a blog post noting that the White House had missed its own vaccine goals, and asserting that Facebook wasn’t to blame for the large number of Americans who refused to get vaccinated.

Though Biden later backed off his comment, some former executives saw Facebook’s attack on the White House as unnecessary self-sabotage, an example of the company exercising poor judgment in an effort to please Zuckerberg.

But complaints about the brash action were met with a familiar response, three people said: It was meant to please the “audience of one.”

More coverage: Facebook under fire

The Facebook Papers are a set of internal documents that were provided to Congress in redacted form by Frances Haugen’s legal counsel. The redacted versions were reviewed by a consortium of news organizations, including The Washington Post.

The trove of documents shows how Facebook CEO Mark Zuckerberg has, at times, contradicted, downplayed or failed to disclose company findings on the impact of its products and platforms.

The documents also provided new details of the social media platform’s role in fomenting the storming of the U.S. Capitol.

Facebook engineers gave extra value to emoji reactions, including ‘angry,’ pushing more emotional and provocative content into users’ news feeds.

Read more from The Post’s investigation:

Key takeaways from the Facebook Papers

How Facebook neglected the rest of the world, fueling hate speech and violence in India

How Facebook shapes your feed


By Elizabeth Dwoskin Lizza joined The Washington Post as Silicon Valley correspondent in 2016, becoming the paper's eyes and ears in the region. She focuses on social media and the power of the tech industry in a democratic society. Before that, she was the Wall Street Journal's first full-time beat reporter covering AI and the impact of algorithms on people's lives.

By Tory Newmyer Tory Newmyer covers economic policy and the intersection of Wall Street and Washington as the anchor of PowerPost's daily tipsheet The Finance 202. He previously worked at Fortune, where he spent seven years as the magazine's Washington correspondent.

By Shibani Mahtani Shibani Mahtani is the Southeast Asia bureau chief for The Washington Post, covering countries that include the Philippines, Myanmar, Thailand and Indonesia. She joined The Post's foreign desk in 2018 after seven years as a correspondent for the Wall Street Journal in Southeast Asia and later in Chicago, where she covered the Midwest.