Zuckerberg’s about-faces

Damian Collins enjoys an expert examination of Facebook’s murky relationship with free speech


The “ugly truth” about Facebook was set out in a now-infamous internal memo, titled “The Ugly”, written in June 2016 by senior executive Andrew “Boz” Bosworth. “So we connect people,” Boz wrote. “That can be bad if they make it negative. Maybe it costs someone a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And yet still we connect people. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good.”

After the memo leaked to the media in 2018, Facebook founder Mark Zuckerberg and Bosworth both denied that they agreed with its contents or that it reflected the company’s values. Well, they would, wouldn’t they? In An Ugly Truth, however, Sheera Frenkel and Cecilia Kang, both leading reporters on technology and cyber security for the New York Times, challenge that denial and examine the choices Facebook makes between removing harmful content from its platform and driving user engagement.

Facebook is an advertising business: the greater the level of engagement from users, the more money the company makes. The algorithms that drive this business model treat all engagement as good. Chris Cox, Facebook’s chief product officer, launched the “news feed” tool so that the platform could promote to users the content it predicted they would most enjoy. Cox’s decision in 2016 to rank content from users’ friends and family above known and trusted sources of information was one of the changes that helped fuel an explosion of fake news on the platform.
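To make the mechanics concrete, here is a minimal, purely illustrative sketch of engagement-weighted feed ranking of the kind the book describes. The weights, field names and scoring formula are invented for illustration; this is not Facebook’s actual algorithm, only a toy model of the incentive it embodies.

```python
from dataclasses import dataclass

# Illustrative only: invented fields and weights, not Facebook's real system.
@dataclass
class Post:
    source: str              # "friend", "family" or "publisher"
    predicted_clicks: float
    predicted_comments: float
    predicted_shares: float

# Hypothetical source multipliers reflecting the 2016-style change the
# review describes: friends and family outrank trusted publishers.
SOURCE_WEIGHT = {"friend": 1.5, "family": 1.5, "publisher": 1.0}

def engagement_score(post: Post) -> float:
    """Score a post by predicted engagement, boosted by source type.

    Note the model's blind spot: every form of engagement counts as
    positive, so outrage and misinformation rank as well as anything else.
    """
    raw = (post.predicted_clicks
           + 2 * post.predicted_comments
           + 3 * post.predicted_shares)
    return raw * SOURCE_WEIGHT[post.source]

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order the feed purely by predicted engagement."""
    return sorted(posts, key=engagement_score, reverse=True)
```

The point of the sketch is the objective function: nothing in it distinguishes good engagement from bad, which is precisely the design choice Frenkel and Kang interrogate.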

In this new book, Frenkel and Kang use the testimony of current and former Facebook employees to show how the company has often looked the other way when presented with evidence that its systems were being abused. As early as March 2016, Alex Stamos, then head of Facebook’s security team, identified Russian networks spreading disinformation aimed at American voters ahead of that year’s presidential election, yet it would be another 18 months before the company publicly acknowledged this. Senior executives at Facebook were repeatedly warned about how the platform was driving engagement with extremist content, including the promotion of Holocaust denial, hate speech, antisemitism and vaccine hesitancy.

Mark Zuckerberg has consistently stated his belief in freedom of speech, but, as Renee DiResta from the Stanford Internet Observatory has pointed out, “there is no right to algorithmic amplification. In fact, that’s the very problem that needs fixing”. Criticism of Facebook’s policy of not fact-checking political advertising has also come from hundreds of Zuckerberg’s own employees. In an open letter published in October 2019, they stated: “Free speech and paid speech are not the same thing. Misinformation affects us all. Our current policies on fact-checking people in political office, or those running for office, are a threat to what Facebook stands for.”

One of the most striking examples in An Ugly Truth relates to the night of the US presidential election in November 2020. Concerned that Facebook could be used to spread disinformation claiming that fraud at the polls had denied Donald Trump victory, the company changed the news feed algorithm to put more emphasis on “news ecosystem quality”.

This was a secret internal quality ranking that Facebook assigned to news publishers, and the change meant people were more likely to see reporting about the election from trusted sources. It had an immediate calming effect, and one member of the Facebook elections team said that people started to call it the “nicer news feed”; it was “this brief glimpse of what Facebook could be”.
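Since the “news ecosystem quality” (NEQ) scores are secret, any concrete numbers are guesswork, but a hedged sketch can show the dial the company was turning: blending raw engagement with a publisher-quality score, where the blend weight is the policy choice. The scores and formula below are invented for illustration.

```python
# Illustrative only: NEQ is a secret internal ranking, so these publisher
# scores and the blending formula are assumptions, not Facebook's system.
NEQ_SCORE = {"trusted_outlet": 0.9, "partisan_blog": 0.2, "unknown": 0.5}

def blended_score(engagement: float, publisher: str, neq_weight: float) -> float:
    """Blend raw engagement with a publisher-quality score.

    neq_weight = 0.0 reproduces pure engagement-first ranking; raising it
    towards 1.0 corresponds to the "nicer news feed" the book describes,
    favouring trusted sources at the cost of engagement.
    """
    quality = NEQ_SCORE.get(publisher, NEQ_SCORE["unknown"])
    return (1 - neq_weight) * engagement + neq_weight * quality
```

On this toy model, the election-night change and the later reversal are simply movements of a single parameter, which is what makes the episode that follows so telling.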

However, by the end of the month the company had reverted to the old news feed, amid evidence that its “nicer” equivalent had led to a significant decline in engagement from Facebook users. Summing up the situation, one data scientist at the company said: “The bottom line was that we couldn’t hurt our bottom line. Mark still wanted people using Facebook as much as possible, as often as possible.”

Damian Collins is MP for Folkestone and Hythe and chaired the DCMS select committee on disinformation and fake news

15th November 2021