Breaking the world
“I’ll be honest with you: I’m terrified… There’s a good chance the internet will help break the world this year, and I’m not confident we have the tools to stop it.”
In a Facebook experiment published in Nature that was conducted on a whopping 61 million people, one randomly selected portion of this group received a neutral message to “go vote,” while another, also randomly selected, saw a slightly more social version of the encouragement: small thumbnail pictures of a few of their friends who reported having voted were shown within the “go vote” pop-up.
The researchers found that this slight tweak — entirely within Facebook's control and conducted without the consent or notification of any of the millions of Facebook users — caused about 340,000 additional people to turn out to vote in the 2010 U.S. congressional elections.
(The true number may even be higher since the method of matching voter files to Facebook names only works for exact matches.)
That significant effect, from a single, one-time tweak, is more than four times the number of votes that decided the 2016 U.S. presidential election in Donald Trump's favor.
On the evening of August 13 [2014], the police appeared on the streets of Ferguson in armored vehicles and wearing military gear, with snipers poised in position and pointing guns at the protesters. That is when I first noticed the news of Ferguson on Twitter—and was startled at such a massive overuse of police force in a suburban area in the United States.
On Twitter, whose feed was still sorted chronologically at the time, the topic became dominant among the roughly one thousand people around the world that I follow.
On Facebook's algorithmically controlled news feed, however, it was as if nothing had happened.
As I inquired more broadly, it appeared that Facebook’s algorithm may have decided that the Ferguson stories were lower priority to show to many users than other, more algorithm-friendly ones.
Instead of news of the Ferguson protests, my own Facebook news feed was dominated by the “Ice Bucket Challenge,” a worthy cause in which people poured buckets of cold water over their heads and, in some cases, donated to an amyotrophic lateral sclerosis (ALS) charity. Many other people were reporting a similar phenomenon.
Facebook's algorithm was not prioritizing “Ice Bucket Challenge” posts over Ferguson posts because of a nefarious plot by Facebook's programmers or marketing department to bury the nascent social movement. Rather, the algorithm they designed, the priorities they set, and the signals they allowed users on the platform to send combined to create that result.
“There aren’t many comparisons in American history for Thursday’s press conference in which Donald Trump suggested that the coronavirus might be defeated by shining lights inside human beings or injecting people with disinfectant. But there is the song ‘Miracles’ by Insane Clown Posse.”
Selected passages and quotes from Ryan Mac and Craig Silverman’s outstanding piece in BuzzFeed News, “Hurting People at Scale: Facebook’s Employees Reckon with the Social Network They’ve Built”
On July 1, Max Wang, a Boston-based software engineer who was leaving Facebook after more than seven years, shared a video on the company’s internal discussion board that was meant to serve as a warning.
“I think Facebook is hurting people at scale,” he wrote in a note accompanying the video. “If you think so too, maybe give this a watch.”
Most employees on their way out of the “Mark Zuckerberg production” typically post photos of their company badges along with farewell notes thanking their colleagues. Wang opted for a clip of himself speaking directly to the camera. What followed was a 24-minute clear-eyed hammering of Facebook’s leadership and decision-making over the previous year.
Yaël Eisenstat, Facebook's former election ads integrity lead, said the employees’ concerns reflect her experience at the company, which she believes is on a dangerous path heading into the election.
“All of these steps are leading up to a situation where, come November, a portion of Facebook users will not trust the outcome of the election because they have been bombarded with messages on Facebook preparing them to not trust it,” she told BuzzFeed News.
She said the company’s policy team in Washington, DC, led by Joel Kaplan, sought to unduly influence decisions made by her team, and the company’s recent failure to take appropriate action on posts from President Trump shows employees are right to be upset and concerned.
“These were very clear examples that didn't just upset me, they upset Facebook’s employees, they upset the entire civil rights community, they upset Facebook’s advertisers. If you still refuse to listen to all those voices, then you're proving that your decision-making is being guided by some other voice,” she said.
Replying to Wang’s video and comments, Facebook’s head of artificial intelligence Yann LeCun wrote,
Other employees, like engineer Dan Abramov, have seized the moment to argue that Facebook has never been neutral, despite leadership’s repeated attempts to convince employees otherwise, and as such needed to make decisions to limit harm. Facebook has proactively taken down nudity, hate speech, and extremist content, while also encouraging people to participate in elections — an act that favors democracy, he wrote.
“As employees, we can’t entertain this illusion,” he said in his June 26 memo titled “Facebook Is Not Neutral.” “There is nothing neutral about connecting people together. It’s literally the opposite of the status quo.”
Zuckerberg seems to disagree. On June 5, he wrote that Facebook errs on the “side of free expression” and made a series of promises that his company would push for racial justice and fight for voter engagement.
The sentiment, while encouraging, arrived unaccompanied by any concrete plans. On Facebook’s internal discussion board, the replies rolled in.
Stelter was reacting to dismissive statements on Fox & Friends by William Bennett, former Secretary of Education in the Reagan administration, about the severity of the coronavirus.
Bennett smugly stated,
At the time William Bennett made those statements — April 13, 2020 — 22,000 Americans had already died of COVID-19.
“Social media is a nuance destruction machine…”
The full quote, in response to a question about so-called “cancel culture”, was, “What I find a little discouraging is that it appears to me that social media is a nuance destruction machine, and I don’t think that’s helpful for a democracy.”
India:
“I’m old enough to remember when the Internet wasn’t a group of five websites, each consisting of screenshots of text from the other four.”
1980:
The essay ends with, “In a world where everything and everyone is treated as an object to be bought and sold, the new technologies — and most of the old ones for that matter — will inevitably create hardship and human misery. […] The ease with which computers are used as instruments of social control cannot be allowed to obscure their liberatory potential.”
“We are living in a global public health crisis moving at a speed and scale never witnessed by living generations. The cracks in our medical and financial systems are being splayed open like a gashing wound.”
“Welcome to 2020, time travelers, where white grandads fighting for racial equity mid pandemic are equipped with n95s and super charged leaf blowers to ‘blow the tear gas away.’”
This quote was a little hard to track down, but I found this in Kathryn Kish Sklar’s essay in Revisiting the Origins of Human Rights: "Roosevelt's remarks were extemporaneous and no document of them survives… [She] was speaking at the UN on the occasion of presenting a pamphlet co-authored with Ethel Philips, In Your Hands: a Guide for Community Action (New York: Church Peace Union, 1958).”