A referendum on reality itself
There is perhaps no better place to witness what the culture of disinformation has already wrought in America than a Trump campaign rally.
Tony Willnow, a 34-year-old maintenance worker who had an American flag wrapped around his head, observed that Trump had won because he said things no other politician would say. When I asked him if it mattered whether those things were true, he thought for a moment before answering. “He tells you what you want to hear,” Willnow said. “And I don’t know if it’s true or not — but it sounds good, so fuck it.”
The political theorist Hannah Arendt once wrote that the most successful totalitarian leaders of the 20th century instilled in their followers “a mixture of gullibility and cynicism.” When they were lied to, they chose to believe it. When a lie was debunked, they claimed they’d known all along — and would then “admire the leaders for their superior tactical cleverness.” Over time, Arendt wrote, the onslaught of propaganda conditioned people to “believe everything and nothing, think that everything was possible and that nothing was true.”
Leaving the rally, I thought about Arendt, and the swaths of the country that are already gripped by the ethos she described. Should it prevail in 2020, the election’s legacy will be clear — not a choice between parties or candidates or policy platforms, but a referendum on reality itself.
Facebook’s Frankenstein Moment
Four times the number of votes
In a Facebook experiment published in Nature that was conducted on a whopping 61 million people, some randomly selected portion of this group received a neutral message to “go vote,” while others, also randomly selected, saw a slightly more social version of the encouragement: small thumbnail pictures of a few of their friends who reported having voted were shown within the “go vote” pop-up.
The researchers measured that this slight tweak — completely within Facebook's control and conducted without the consent or notification of any of the millions of Facebook users — caused about 340,000 additional people to turn out to vote in the 2010 U.S. congressional elections.
(The true number may even be higher since the method of matching voter files to Facebook names only works for exact matches.)
That significant effect—from a one-time, single tweak—is more than four times the number of votes that determined that Donald Trump would win the 2016 U.S. presidential election.
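As a rough check of that comparison: assuming the commonly cited combined margin of about 77,744 votes across Michigan, Pennsylvania, and Wisconsin that delivered the Electoral College (a figure I am supplying here, not one from the study),

```latex
% Back-of-the-envelope check; the 77,744-vote combined margin
% (MI + PA + WI) is an assumed, commonly cited figure, not taken
% from the study described above.
\[
\frac{340{,}000 \ \text{additional voters}}{77{,}744 \ \text{decisive votes}} \approx 4.4
\]
```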
Facebook, Ferguson, and the Ice Bucket Challenge
On the evening of August 13 [2014], the police appeared on the streets of Ferguson in armored vehicles and wearing military gear, with snipers poised in position and pointing guns at the protesters. That is when I first noticed the news of Ferguson on Twitter—and was startled at such a massive overuse of police force in a suburban area in the United States.
On Twitter, whose feed was still sorted chronologically at the time, the topic became dominant among the roughly one thousand people around the world whom I follow.
On Facebook's algorithmically controlled news feed, however, it was as if nothing had happened.
As I inquired more broadly, it appeared that Facebook’s algorithm may have decided that the Ferguson stories were lower priority to show to many users than other, more algorithm-friendly ones.
Instead of news of the Ferguson protests, my own Facebook news feed was dominated by the “ice-bucket challenge,” a worthy cause in which people poured buckets of cold water over their heads and, in some cases, donated to an amyotrophic lateral sclerosis (ALS) charity. Many other people were reporting a similar phenomenon.
Facebook's algorithm was not prioritizing posts about the “Ice Bucket Challenge” over Ferguson posts because of a nefarious plot by Facebook's programmers or marketing department to bury the nascent social movement. The algorithm they designed and whose priorities they set, combined with the signals they allowed users on the platform to send, created that result.
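Facebook has never published its News Feed ranking code, but a minimal toy sketch (every post, signal, and weight below is invented for illustration) shows how an engagement-weighted feed can produce exactly this outcome without anyone intending it.

```python
# Toy sketch of an engagement-weighted feed; not Facebook's actual
# ranking code. Posts, signals, and weights are invented to show how
# "likeable" content can outrank contentious news with no editorial
# intent at all.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int     # the main "positive" signal users could send in 2014
    shares: int
    comments: int  # hard news often draws argument rather than likes

def engagement_score(post: Post) -> float:
    # Hypothetical weights that favor frictionless positive reactions.
    return 3.0 * post.likes + 1.5 * post.shares + 0.5 * post.comments

feed = [
    Post("Friend's ice-bucket challenge video", likes=400, shares=60, comments=30),
    Post("Live reports from the Ferguson protests", likes=50, shares=40, comments=300),
]

for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):7.1f}  {post.title}")
```

Nothing in the sketch singles out Ferguson; the like-heavy post simply outscores the comment-heavy one, which is the structural point.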
Hurting people at scale
Selected passages and quotes from Ryan Mac and Craig Silverman’s outstanding piece in BuzzFeed News, “Hurting People At Scale: Facebook’s Employees Reckon With The Social Network They’ve Built.”
On July 1, Max Wang, a Boston-based software engineer who was leaving Facebook after more than seven years, shared a video on the company’s internal discussion board that was meant to serve as a warning.
“I think Facebook is hurting people at scale,” he wrote in a note accompanying the video. “If you think so too, maybe give this a watch.”
Most employees on their way out of the “Mark Zuckerberg production” typically post photos of their company badges along with farewell notes thanking their colleagues. Wang opted for a clip of himself speaking directly to the camera. What followed was a 24-minute clear-eyed hammering of Facebook’s leadership and decision-making over the previous year.
Yaël Eisenstat, Facebook's former election ads integrity lead, said the employees’ concerns reflect her experience at the company, which she believes is on a dangerous path heading into the election.
“All of these steps are leading up to a situation where, come November, a portion of Facebook users will not trust the outcome of the election because they have been bombarded with messages on Facebook preparing them to not trust it,” she told BuzzFeed News.
She said the company’s policy team in Washington, DC, led by Joel Kaplan, sought to unduly influence decisions made by her team, and the company’s recent failure to take appropriate action on posts from President Trump shows employees are right to be upset and concerned.
“These were very clear examples that didn't just upset me, they upset Facebook’s employees, they upset the entire civil rights community, they upset Facebook’s advertisers. If you still refuse to listen to all those voices, then you're proving that your decision-making is being guided by some other voice,” she said.
Facebook’s head of artificial intelligence, Yann LeCun, also replied to Wang’s video and comments.
Other employees, like [engineer Dan Abramov], have seized the moment to argue that Facebook has never been neutral, despite leadership’s repeated attempts to convince employees otherwise, and as such needed to make decisions to limit harm. Facebook has proactively taken down nudity, hate speech, and extremist content, while also encouraging people to participate in elections — an act that favors democracy, he wrote.
“As employees, we can’t entertain this illusion,” he said in his June 26 memo titled “Facebook Is Not Neutral.” “There is nothing neutral about connecting people together. It’s literally the opposite of the status quo.”
Zuckerberg seems to disagree. On June 5, he wrote that Facebook errs on the “side of free expression” and made a series of promises that his company would push for racial justice and fight for voter engagement.
The sentiment, while encouraging, arrived unaccompanied by any concrete plans. On Facebook’s internal discussion board, the replies rolled in.
“Social media is a nuance destruction machine…”
The full quote, in response to a question about so-called “cancel culture”, was, “What I find a little discouraging is that it appears to me that social media is a nuance destruction machine, and I don’t think that’s helpful for a democracy.”
“I’m old enough to remember when the Internet wasn’t a group of five websites, each consisting of screenshots of text from the other four.”
Brave New Workplace
1980:
The essay ends with, “In a world where everything and everyone is treated as an object to be bought and sold, the new technologies — and most of the old ones for that matter — will inevitably create hardship and human misery. […] The ease with which computers are used as instruments of social control cannot be allowed to obscure their liberatory potential.”
A distribution channel for speech acts
“On Twitter, people almost never do the thing that makes for real conversation: ask specific questions of other people about their lives.”
Trooly unctuous
Trooly (a play on ‘truly’, ugh) crawls social media, news sites, police and court registries, credit bureaus, and similar sources, then uses AI to determine whether, say, an Airbnb renter is likely to be trustworthy, in its estimation.
It does this on-demand in about 30 seconds, for a cost of about $1.
The quote in full context, below.
Trooly — [now used by] Airbnb — is combining social credit scores with predictive policing. Tools like PredPol use AI that combines data points and historical events, factors like race and location, digital footprints and crime statistics, to predict likelihood of when and where crimes will occur (as well as victims and perpetrators). It’s no secret that predictive policing replicates and perpetuates discrimination.
Combine this with companies like Instagram, Facebook, YouTube, and yes, Airbnb deciding what legal behaviors are acceptable for service, and now we’re looking at groups of historically marginalized people being denied involvement in mainstream economic, political, cultural and social activities — at scale.
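To make the mechanism concrete, here is a deliberately crude Python sketch of proxy-signal scoring. It is not Trooly’s or PredPol’s actual code; every feature, weight, and threshold is invented. What it illustrates is structural: several of the seemingly neutral inputs are stand-ins for race, place, and class.

```python
# Deliberately crude sketch of proxy-signal "trust" scoring; not
# Trooly's or PredPol's actual code. Every feature name, weight, and
# threshold below is invented for illustration only.

def toy_trust_score(signals: dict) -> float:
    """Combine scraped signals into a single score between 0 and 1."""
    score = 1.0
    # Arrest/court mentions mirror over-policing, not behavior.
    score -= 0.4 * signals.get("police_or_court_mentions", 0)
    # Home location acts as a proxy for race and income.
    score -= 0.2 * signals.get("neighborhood_crime_index", 0)
    # A thin credit file penalizes the unbanked.
    score -= 0.1 * signals.get("thin_credit_file", 0)
    # A long, verified digital footprint rewards the already-included.
    score += 0.1 * signals.get("verified_accounts", 0)
    return max(0.0, min(1.0, score))

ACCEPT_THRESHOLD = 0.7  # hypothetical cutoff a platform might apply

# Two people with identical actual behavior but different circumstances:
guests = {
    "guest_a": {"neighborhood_crime_index": 2, "thin_credit_file": 1},
    "guest_b": {"verified_accounts": 3},
}

for name, signals in guests.items():
    s = toy_trust_score(signals)
    print(f"{name}: score={s:.2f}, accepted={s >= ACCEPT_THRESHOLD}")
```

Run it and guest_a lands below the cutoff while guest_b sails through, even though neither record says anything about what either person has actually done.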
“Our ads are always accurate so it’s good that Facebook won’t limit political messages because it encourages more Americans to be involved in the process. This is much better than the approaches from Twitter and Google, which will lead to voter suppression.”
Election 2020
Twitter was made for trouble
On Twitter…teens saw the street code in the workings of the site. “Whoever made Twitter,” said Tiana, in September 2010, “designed Twitter for trouble.”
She explained that she could see her friends’ confrontations with people she didn’t follow. Tiana was prepared to “jump into” these conflicts and expected her friends to do the same. In the context of the [street] code, Twitter seemed provocative. It placed users before a stream of other people’s conversations, with the prompt “What’s happening?”
The Sony hack, 2014
Et tu, Instagram?
"Society is more than a bazaar"
Some articles and references I’ve collected regarding the dark side of “social media.”
1. “Privacy findings.” Pew Research Center. Nov 2019. https://www.pewresearch.org/fact-tank/2019/11/15/key-takeaways-on-americans-views-about-privacy-surveillance-and-data-sharing/
Americans are concerned about how much data is being collected about them, and many feel their information is less secure than it used to be.
Very few Americans believe they understand what is being done with the data collected about them.
Most Americans see more risks than benefits from personal data collection.
Americans say they have very little understanding of current data protection laws, and most are in favor of more government regulation.
2. “Women are harassed every 30 seconds on Twitter, major study finds.” Amnesty International, Mashable.
By Rachel Thompson. 18 December 2019.
https://mashable.com/article/amnesty-study-twitter-abuse-women/
3. “TikTok's local moderation guidelines ban pro-LGBT content.” The Guardian. By Alex Hern. 26 September 2019
https://www.theguardian.com/technology/2019/sep/26/tiktoks-local-moderation-guidelines-ban-pro-lgbt-content
4. “This is how we radicalized the world.” BuzzFeed News. By Ryan Broderick. October 28-29, 2018
https://www.buzzfeednews.com/article/ryanhatesthis/brazil-jair-bolsonaro-facebook-elections
(On the Bolsonaro election in Brazil.)
5. “How YouTube Radicalized Brazil.” New York Times. By Max Fisher and Amanda Taub. 11 August 2019.
https://www.nytimes.com/2019/08/11/world/americas/youtube-brazil.html
6. “Living in a sea of false signals: Are we being pushed from ‘trust, but verify’ to ‘verify, then trust’?” Nieman Lab. March 2018.
https://www.niemanlab.org/2018/03/living-in-a-sea-of-false-signals-are-we-being-pushed-from-trust-but-verify-to-verify-then-trust/
(Recap of a speech by Craig Silverman.)
7. Sarah Thompson’s work, pointed to by Craig Silverman (above), investigating the harm done to First Nations peoples by fake Facebook accounts. Thompson argues that fake accounts do harm precisely because they interfere with the real good being done by First Nations people on Facebook. April 2019. https://exploitingtheniche.wordpress.com/2019/04/09/implants-and-extractions-part-ii/
8. “Rebalancing Regulation of Speech: Hyper-Local Content on Global Web-Based Platforms.” By Chinmayi Arun. Berkman Center. 28 March 2018.
https://medium.com/berkman-klein-center/rebalancing-regulation-of-speech-hyper-local-content-on-global-web-based-platforms-1-386d65d86e32
9. “Internal Documents Show Facebook Has Never Deserved Our Trust or Our Data.” By Jason Koebler and Joseph Cox. Motherboard/Vice News. 5 December 2019.
https://www.vice.com/en_us/article/7xyenz/internal-documents-show-facebook-has-never-deserved-our-trust-or-our-data
10. “The Art of Eyeball Harvesting.” By Shengwu Li. Logic Magazine, vol 6. January 2019.
https://logicmag.io/play/shengwu-li-on-online-advertising/
11. “Temporal Limits of Privacy in Human Behavior.” Sekara et al., 2018. Arxiv.org (https://arxiv.org/pdf/1806.03615.pdf) via https://twitter.com/SarahJamieLewis/status/1041646238280679424, @SarahJamieLewis. 17 September 2018.
(This speaks to how much “risk” to privacy can arise from very little information)
12. “Freedom on the Net 2018: The Rise of Digital Authoritarianism.” Freedom House. 2018. https://freedomhouse.org/report/freedom-net/freedom-net-2018/rise-digital-authoritarianism [New link as of April 2020: https://freedomhouse.org/report/freedom-net/2018/rise-digital-authoritarianism ]
[Note: A reader from Comparitech suggested their map of Internet censorship in 181 countries might be useful in this context, https://www.comparitech.com/blog/vpn-privacy/internet-censorship-map/ —MPE April 27, 2020]
(From the introduction by Adrian Shahbaz.)
13. “Freedom on the Net 2019, The Crisis of Social Media.” By Adrian Shahbaz and Allie Funk. Freedom House. November 2019. https://www.freedomonthenet.org/sites/default/files/2019-11/11042019_Report_FH_FOTN_2019_final_Public_Download.pdf
14. “Goodbye, Chrome: Google’s Web browser has become spy software.” By Geoffrey Fowler. Washington Post. 21 June 2019.
https://www.washingtonpost.com/technology/2019/06/21/google-chrome-has-become-surveillance-software-its-time-switch/
15. Carole Cadwalladr and Emma Graham-Harrison’s reporting on Cambridge Analytica for the Guardian, such as https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election (17 March 2018)
16. “Content or Context Moderation? Artisanal, Community-Reliant, and Industrial Approaches.” Robyn Caplan. Data & Society. 14 November 2018. https://datasociety.net/output/content-or-context-moderation/
17. “6 ways social media has become a direct threat to democracy.” By Pierre Omidyar. Washington Post. 9 October 2017.
https://www.washingtonpost.com/news/theworldpost/wp/2017/10/09/pierre-omidyar-6-ways-social-media-has-become-a-direct-threat-to-democracy/
18. “Delay, Deny and Deflect: How Facebook’s Leaders Fought Through Crisis.” By Sheera Frenkel, Nicholas Confessore, Cecilia Kang, Matthew Rosenberg and Jack Nicas. New York Times. 14 November 2018.
https://www.nytimes.com/2018/11/14/technology/facebook-data-russia-election-racism.html
19. “Two years after #Pizzagate showed the dangers of hateful conspiracies, they’re still rampant on YouTube.” By Craig Timberg, Elizabeth Dwoskin, and Andrew Ba Tran. Washington Post. 10 December 2018.
https://pb-impact.washingtonpost.com/business/technology/hateful-conspiracies-thrive-on-youtube-despite-pledge-to-clean-up-problematic-videos/2018/12/10/625730a8-f3f8-11e8-9240-e8028a62c722_story.html
20. “On YouTube’s Digital Playground, an Open Gate for Pedophiles.” New York Times. By Max Fisher and Amanda Taub. 3 June 2019.
https://www.nytimes.com/2019/06/03/world/americas/youtube-pedophiles.html
21. Letter to Jack Dorsey, CEO of Twitter. David Kaye, United Nations Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression. 10 December 2018 (posted to Twitter 21 December 2018).
https://twitter.com/davidakaye/status/1076018548378497024?lang=en
22. “Russian Meddling Is a Symptom, Not the Disease.” By Zeynep Tufekci. New York Times. 3 October 2018. https://www.nytimes.com/2018/10/03/opinion/midterms-facebook-foreign-meddling.html
23. “The Expensive Education of Mark Zuckerberg and Silicon Valley.” By Kara Swisher. New York Times. 2 August 2018.
https://www.nytimes.com/2018/08/02/opinion/the-expensive-education-of-mark-zuckerberg-and-silicon-valley.html
24. “Robotrolling 3.” NATO Strategic Communications Centre of Excellence. 2018. https://www.stratcomcoe.org/robotrolling-20183
25. “Where countries are tinderboxes and Facebook is a match.” By Amanda Taub and Max Fisher. New York Times. 21 April 2018.
https://www.nytimes.com/2018/04/21/world/asia/facebook-sri-lanka-riots.html
26. “A Genocide Incited on Facebook, With Posts From Myanmar’s Military.” By Paul Mozur. New York Times. 15 October 2018.
https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html
27. “How Facebook and YouTube help spread anti-vaxxer propaganda.” By Julia Carrie Wong. The Guardian. 1 February 2019
https://www.theguardian.com/media/2019/feb/01/facebook-youtube-anti-vaccination-misinformation-social-media
28. “Tim Berners-Lee unveils global plan to save the web.” By Ian Sample. The Guardian. 24 November 2019.
https://www.theguardian.com/technology/2019/nov/24/tim-berners-lee-unveils-global-plan-to-save-the-internet?CMP=share_btn_tw
29. “YouTube’s Privacy Problem.” By Natasha Singer, in the New York Times “Kids” section (print only). 12 November 2019.
30. “Why Tech is Starting to Make Me Uneasy.” By Farhad Manjoo. New York Times. 11 October 2017.
https://www.nytimes.com/2017/10/11/insider/tech-column-dread.html
And finally (for now), Ethan Zuckerman, https://twitter.com/EthanZ/status/1009838622449766400 (Commenting on NYT “Want to understand what ails the modern Internet? Look at eBay.” https://www.nytimes.com/2018/06/20/magazine/want-to-understand-what-ails-the-modern-internet-look-at-ebay.html )
42 billion times
The Web We Want
I was supposed to be in San Diego this week for the Museum Computer Network conference, but business called me away. Here’s my Ignite talk, The Web We Want, composed with both the news of the day (fake news, propaganda on Facebook) and the rhythm of N.W.A.’s Express Yourself stuck in my head. I produced this video facsimile as a self-contained all-in-one production to try to give a sense of the moment — of being on stage with and for my people.
It’s a bit of a sequel to the last MCN Ignite talk, Jack the Museum, given in 2012.
Jack the Museum asked us to reach outside the constrictive idioms of traditional museum practice to seek greater impact in the world. Now, seven years later, with the humanistic vision of the Internet and the Web under threat, the Web We Want asks us to fight to reclaim the positive values of a digitally connected world.
Good luck tonight Nathan, Effie, Alison, Andrew, Beth, Koven and Nik — I’ll be with you in spirit, sending you all good vibes from somewhere over the Atlantic! Cheers!! https://conference.mcn.edu/2019/Ignite.cfm
Here’s the script (and an annotated version is here as a .pdf).
Hey I’m Mike; Cheers! Tonight I’m in absentia.
Talking to you across a digital connection.
Broadcasting from my trusty cyber station,
My code, copper, glass, and silicon creation.
Yeah I love the web — and it’s our baby.
And many-of-us are insider spiders that can ride her, maybe.
Or are we flies that come around … get stuck and eaten?
That web’s a sticky place now, and that smilin’ spider we be greetin.
What am I talkinabout? Well you may know me, Willis.
May know the things I care about I think will kill us.
May know the scope, scale, and speed-lovin man I am.
May know the green-eggs-and-ham lovin Sam-I-am.
I look out my windows and I see a shit-show.
If you log-in, or blogging with your noggin there’s a quid-pro-quo.
So let me take you through the categoric history.
To sweep the evils of these platforms to the dustbin of our history.
Facebook, Insta, YouTube, and Twitter.
I loved them, but these days I’d use them for kitty litter.
Flush them right down the pot — yeah, I’m pretty bitter.
They stealin’ from us like a Ben Franklin counterfeiter.
Facebook? It aided a genocide. A genocide. A genocide.
Facebook? It aided a genocide.
Yeah, they did that, and people died.
‘But we’re too big to mod-er-ate on-line activity.’
That’s a laugh! Facebook you could have the proclivity
To care about people and values and civility,
And redirect some of your vaunted corporate creativity
To make common sense solutions that work for all humanity.
Hate is not a fair choice ‘tween free speech, profit, and inanity.
Grow a spine and learn from those who’ve learned to love their community.
Weave a web of love and trust like RPG and Ravelry.
YouTube, oh, don’t get me started.
I love this platform but then they departed
The land of common, objective civic decency,
When they give aid and comfort to those who spread conspiracy.
Twitter, hell, it’s a travesty,
How harassment and abuse is right there for all to see.
It’s not convenient to care when your mistress is a business model,
That makes you into every troll and dictator’s mollycoddle.
Google, aw, where to start?
They’ve turned exploitation of privacy into an art.
Micro-targeting, tracking, and ruthless data aggregation,
Reduce life-changing choices to an algorithmic calculation,
Blurring our lives into a smear of ruthless averages.
When they work they work but when they don’t who pays the damages?
Not people like me, white, straight, schooled, and privileged.
The grievous harm they cause to the powerless and poor can be unlimited.
So don’t tell me it don’t affect you.
Don’t affect those you met and those who beget you.
“Come into my parlor” say we spiders to the flies outside.
Hey, everybody goes there, why not? Don’t worry ‘bout the sticky side.
I think it’s a matter of owning up to consequence.
We all ask our global family to play here, and at great expense,
We burnish the street cred’ of dot coms with our edifice,
And risk harm to our community while we’re being generous.
Microsoft, Apple, and the Amazon crew.
ISPs and the mobiles are part of this too.
They claim public good, civic virtue, in their soundbites,
But when push comes to shove will they shove the Benjamins or human rights?
Oh bruh and sis, I almost forgot.
Elections and fake news are what we begot.
Remember that thing with Cambridge Analytica?
Well how’ you feelin’ about the current situation politica’?
Not so good? Huh — well me neither.
Catastrophic atmospheric carbon’s rising in the ether,
And just when we all must be connected, fast, and democratic,
The web we need is rotting, unacceptable-ly problematic.
Are we going to let 7 billion people live and love on a Web that’s autocratic?
Where the values of decency and common good make the dot-com’s panic?
Where the captains of Silicon Valley are running manic?
Piloting our commons to an iceberg like their own Titanic?
To transcend greed. Avarice. The fecklessness of feckless pricks,
We’re going to have to work as one, renegotiate some politics.
Boycott, cajole, write those letters band together,
Take a stand, take a risk, take the streets hell bent for leather.
This heart, these beats from this spider-web practitioner.
This time, these rhymes bustin’ from this long-distance exhibitioner
This urgency this planet this community can get get it done.
The web we want’s the dream we got if we spin our silk together, connected and strong.
[Updated 29 November 2019 to include link to annotated notes and link to official MCN version of the video.]
“It really doesn’t matter what country you’re in. The dance is the same everywhere you go.”
Chances are, by now, your country has some, if not all, of the following.
First off, you probably have some kind of local internet troll problem, like the MAGAsphere in the US, the Netto-uyoku in Japan, Fujitrolls in Peru, or AK-trolls in Turkey.
Your trolls will probably have been radicalized online via some kind of community for young men like Gamergate, Jeuxvideo.com ("videogames.com") in France, ForoCoches ("Cars Forum") in Spain, Ilbe Storehouse in South Korea, 2chan in Japan, or banter Facebook pages in the UK.
…Far-right influencers start appearing, aided by algorithms recommending content that increases user watch time. They will use Facebook, Twitter, and YouTube to transmit and amplify content and organize harassment and intimidation campaigns.
If these influencers become sophisticated enough, they will try to organize protests or rallies. The mini fascist comic cons they organize will be livestreamed and operate as an augmented reality game for the people watching at home. Violence and doxxing will follow them.
Some of these trolls and influencers will create more sophisticated far-right groups within the larger movement, like the Proud Boys, Generation Identity, or Movimento Brasil Livre. Or some will reinvigorate older, more established far-right or nationalist institutions like the Nordic Resistance Movement, the Football Lads Alliance, United Patriots Front, or PEGIDA.
While a far-right community is building in your country, a fake news blitz is usually raging online. It could be a rumor-based culture of misinformation, like the localized hoaxes that circulate in countries like India, Myanmar, or Brazil. Or it could be the more traditional “fake news” or hyperpartisan propaganda we see in predominantly English-speaking countries like the US, Australia, or the UK.
Typically, large right-wing news channels or conservative tabloids will then take these stories going viral on Facebook and repackage them for older, mainstream audiences. Depending on your country’s media landscape, the far-right trolls and influencers may try to hijack this social-media-to-newspaper-to-television pipeline. Which then creates more content to screenshot, meme, and share. It’s a feedback loop.
Populist leaders and the legions of influencers riding their wave […] create filter bubbles inside of platforms like Facebook or YouTube that promise a safer time, one that never existed in the first place, before the protests, the violence, the cascading crises, and endless news cycles. Donald Trump wants to Make America Great Again; Bolsonaro wants to bring back Brazil’s military dictatorship; Shinzo Abe wants to recapture Japan’s imperial past; Germany’s AfD performed the best with older East German voters longing for the days of authoritarianism. All of these leaders promise to close borders, to make things safe. Which will, of course, usually exacerbate the problems they’re promising to disappear. Another feedback loop.
…It really doesn’t matter what country you’re in. The dance is the same everywhere you go.