Hurting people at scale

Selected passages and quotes from Ryan Mac and Craig Silverman’s outstanding piece in BuzzFeed News, Hurting People At Scale: Facebook’s Employees Reckon With The Social Network They’ve Built

On July 1, Max Wang, a Boston-based software engineer who was leaving Facebook after more than seven years, shared a video on the company’s internal discussion board that was meant to serve as a warning.

“I think Facebook is hurting people at scale,” he wrote in a note accompanying the video. “If you think so too, maybe give this a watch.”

Employees on their way out of the “Mark Zuckerberg production” typically post photos of their company badges along with farewell notes thanking their colleagues. Wang opted for a clip of himself speaking directly to the camera. What followed was a 24-minute, clear-eyed hammering of Facebook’s leadership and decision-making over the previous year.

What the departing engineer said echoed what civil rights groups such as Color of Change have been saying since at least 2015: Facebook is more concerned with appearing unbiased than making internal adjustments or correcting policies that permit or enable real-world harm.

Yaël Eisenstat, Facebook's former election ads integrity lead, said the employees’ concerns reflect her experience at the company, which she believes is on a dangerous path heading into the election.

“All of these steps are leading up to a situation where, come November, a portion of Facebook users will not trust the outcome of the election because they have been bombarded with messages on Facebook preparing them to not trust it,” she told BuzzFeed News.

She said the company’s policy team in Washington, DC, led by Joel Kaplan, sought to unduly influence decisions made by her team, and the company’s recent failure to take appropriate action on posts from President Trump shows employees are right to be upset and concerned.

“These were very clear examples that didn't just upset me, they upset Facebook’s employees, they upset the entire civil rights community, they upset Facebook’s advertisers. If you still refuse to listen to all those voices, then you're proving that your decision-making is being guided by some other voice,” she said.

“[Zuckerberg] uses ‘diverse perspective’ as essentially a cover for right-wing thinking when the real problem is dangerous ideologies,” Brandi Collins-Dexter, a senior campaign director at Color of Change, told BuzzFeed News after reading excerpts of Zuckerberg’s comments. “If you are conflating conservatives with white nationalists, that seems like a far deeper problem because that’s what we’re talking about. We’re talking about hate groups and really specific dangerous ideologies and behavior.”
“Facebook is getting trapped by the ideology of free expression. It causes us to lose sight of other important premises, like how free expression is supposed to serve human needs.” — Max Wang

Replying to Wang’s video and comments, Facebook’s head of artificial intelligence, Yann LeCun, wrote:

“American Democracy is threatened and closer to collapse than most people realize. I would submit that a better underlying principle to content policy is the promotion and defense of liberal democracy.”

Other employees, like engineer Dan Abramov, have seized the moment to argue that Facebook has never been neutral, despite leadership’s repeated attempts to convince employees otherwise, and as such needs to make decisions to limit harm. Facebook has proactively taken down nudity, hate speech, and extremist content, while also encouraging people to participate in elections — an act that favors democracy, he wrote.

“As employees, we can’t entertain this illusion,” he said in his June 26 memo titled “Facebook Is Not Neutral.” “There is nothing neutral about connecting people together. It’s literally the opposite of the status quo.”

Zuckerberg seems to disagree. On June 5, he wrote that Facebook errs on the “side of free expression” and made a series of promises that his company would push for racial justice and fight for voter engagement.

The sentiment, while encouraging, arrived unaccompanied by any concrete plans. On Facebook’s internal discussion board, the replies rolled in.

As Zuck prattles on

“As Zuck prattles on in revisionist blog posts about how he intended [Facebook] to ‘Give people a voice’, he consistently misses this point: harassment of this sort *silences* voices. It deters counterspeech to terrible ideas by making the reputational, time, and sanity cost too high…”
From a thread by Renée DiResta (@noUpside) of the Stanford Internet Observatory, regarding anti-vaxxers harassing and threatening physicians over vaccination-related content.

A lot of control

Facebook’s decisions can fundamentally alter the speech ecosystem in a nation. The company does not only end up governing individuals; it ends up governing governments, too. The norms Facebook or its court choose for their pseudo-constitution will apply everywhere, and though the company will strive for sensitivity to local context and concerns, those norms will affect how the whole world talks to one another.

That’s a lot of control. […] Democracy, at least in theory, allows us to change things we do not like. We can vote out legislators who pass policy we disagree with, or who fail to pass policy at all. We cannot vote out Facebook.

Facebook has declared sovereignty, by Molly Roberts, Washington Post, 31 January 2019

Different ground

The dinners demonstrated a commitment from Zuckerberg to solve the hard problems that Facebook has created for itself through its relentless quest for growth. But several people who attended the dinners said they believe that they were starting the conversation on fundamentally different ground: Zuckerberg believes that Facebook’s problems can be solved. Many experts do not.
The Impossible Job: Inside Facebook’s Struggle to Moderate Two Billion People by Jason Koebler and Joseph Cox, Motherboard, 23 August 2018

Facebook

The East India Company was the spearhead of the British Empire. It controlled the two most important commodities you needed for economic growth: one was labor, and the other was actual commodities. In the post-internet age, attention and connectivity are the two limiting factors to economic growth. By controlling those points, what you’re looking at is Facebook trying to control what will happen in the future.
Om Malik re: Facebook’s Internet.org effort, on This Week in Tech #546 (starts at 13:45)

Om continues:

I think people look at it and just say, “no, this is just about the Internet.” But if the Facebook algorithm can decide who wins the elections in Egypt or India or wherever, that is not a good situation to have. They have never addressed those issues directly. They don’t address the insidious nature of Internet.org and its impact on local governments and local populations. So, I understand that they are trying to do good by giving access to people, but I don’t think it’s as simple as that. This is all about advertising, making money, and decisions which will be made based on how Facebook makes more money in the future. […] I draw the parallel with what the East India Company was, and I think Facebook is trying to do somewhat the same thing.