Mark Zuckerberg has bent Facebook’s guiding principles in order to keep Donald Trump’s posts on the site, according to documents and interviews obtained by The Washington Post.
The Facebook CEO is confronting a company in crisis, as employees publicly protest against his failure to challenge Trump and advertisers boycott the firm in response.
On Sunday Starbucks became the latest to announce it was no longer advertising with the tech giant, following a lead set by Verizon, The North Face, Ben & Jerry’s and Patagonia.
‘We will pause advertising on all social media platforms while we continue discussions internally, with our media partners and with civil rights organizations in the effort to stop the spread of hate speech,’ the company said.
The boycott has cost the company $7.2 billion, Bloomberg reported.
Zuckerberg’s woes increased further with the publication of a damning report by The Washington Post, which detailed how since 2015 he had changed Facebook’s policies to accommodate Trump’s controversial statements.
As a candidate, Trump posted a video in December 2015 calling for a ban on Muslims entering the United States.
Anger among employees at the video led to a companywide town hall, in which staff members told executives that they viewed the video as hate speech – a violation of the company’s policies.
In meetings about the issue, senior leaders and policy experts overwhelmingly said they felt that the video was hate speech, three former employees told The Post.
Zuckerberg himself said he was personally disgusted by it and wanted it removed, the people said.
At one of the meetings, Monika Bickert, Facebook’s vice president for policy, drafted a document to address the video and shared it with leaders including Zuckerberg’s top deputy, COO Sheryl Sandberg, and Joel Kaplan, the vice president of global policy and the company’s most prominent Republican.
The document, obtained by The Post, weighed four options.
They included removing the post for hate speech violations, making a one-time exception for it, creating a broad exemption for political discourse, and weakening the company’s community guidelines for everyone, allowing comments such as ‘No blacks allowed’ and ‘Get the gays out of San Francisco.’
The document also listed possible ‘PR Risks’ for each.
For example, lowering the standards overall would raise questions such as: ‘Would Facebook have provided a platform for Hitler?’ Bickert wrote.
She said that giving total free rein, on the other hand, risked opening the floodgates for even more hateful ‘copycat’ comments.
In the end, Kaplan was among those who talked Zuckerberg out of his desire to remove the post, according to the people.
Instead, the executives created an allowance that newsworthy political discourse would be taken into account when making decisions about whether posts violated community guidelines.
Bickert said the company ultimately made a call to maintain Trump’s Muslim ban video because executives interpreted Trump’s comment to mean that the then-candidate was not speaking about all Muslims, but rather advocating for a policy position on immigration as part of a newsworthy political debate.
A formal newsworthiness policy was announced in October 2016.
A similar internal debate played out in the spring of 2016, when Trump began campaigning hard on his policy of building a U.S.-Mexico border wall.
Zuckerberg wanted to write a blog post condemning it, but was talked out of it by advisers in Washington DC, who said it would look like he was taking sides ahead of the November election.
Critics of Facebook say that its early assessment of Trump, and of whether to label his proclamations as hate speech, has had a lasting impact – allowing figures such as Brazil’s Jair Bolsonaro and the Philippines’ Rodrigo Duterte to speak with impunity.
Facebook’s policies were also slanted in the president’s favor when it came to news reports.
In the aftermath of the November 2016 election, rooting out fake news promulgated by Russian trolls became a priority for liberals.
Facebook’s security engineers in December 2016 presented findings from a broad internal investigation, known as Project P, to senior leadership on how false and misleading news reports spread so virally during the election.
When Facebook’s security team highlighted dozens of pages that had peddled false news reports, according to The Post, senior leaders in Washington, including Kaplan, opposed shutting them down immediately, arguing that doing so would disproportionately impact conservatives.
A year later, when Facebook considered changing its news feed algorithm to focus more on posts by friends and family versus publishers, Kaplan asked whether the change would hurt conservative news outlets, the paper reported.
When the data showed it would – conservative-leaning outlets were pushing more content that violated its policies, the company had found – Kaplan successfully pushed for changes to make the new algorithm what he considered more evenhanded in its impact, the people said.
In September 2019 Nick Clegg, Facebook’s new head of global affairs and communications and a former British deputy prime minister, announced that Facebook would allow politicians to express themselves virtually unchecked on social media, unless they urged violence.
Facebook’s network of independent fact-checkers, which had been established as a key part of the company’s response to disinformation, would not evaluate their claims and the community guidelines would largely not apply to politicians.
He echoed Zuckerberg’s long-held view that Facebook should not want to be an arbiter of truth in political debate.
The speech angered some employees, prompting more than 250 of them to sign a petition disagreeing with the decision because they thought it gave politicians a pass.
In May Twitter began to distance itself from Facebook by marking some of Trump’s tweets as inciting violence or factually wrong.
Zuckerberg again resisted.
On Friday, however, Zuckerberg told employees in a live-streamed town hall that he was changing the company’s policy to be more in line with Twitter’s.
Facebook, he said, would label problematic newsworthy content that violated the company’s policies as Twitter does.
‘There are no exceptions for politicians in any of the policies that I’m announcing today,’ Zuckerberg said.