It just requires that Twitter care

Imagine if signing up to read Twitter was free, but posting required you to spend a week doing moderation first.

Everyone who came into the community would have to learn the rules before they violated them.

Then, when you were tempted to break the rules, you’d remember that real people would read what you wrote, just as you had for others, and that you’d lose your account and have to do another week of moderation before posting again.

This is not too hard to implement. It’s certainly easier than inventing a magic AI that will solve all your problems. It just requires that Twitter care enough about their community to do it.
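To show how little machinery the idea needs, here is a minimal, hypothetical Python sketch of the gating rule described above (every name in it is invented for illustration, not anything Twitter actually has): reading is always free, posting unlocks only after a completed week of moderation duty, and a rule violation resets the requirement.

```python
from datetime import datetime, timedelta

MODERATION_STINT = timedelta(days=7)  # one week of moderation before posting

class Account:
    """Hypothetical model: posting is gated behind moderation service."""

    def __init__(self):
        self.moderation_started_at = None    # current stint, if any
        self.moderation_completed_at = None  # set once a full stint is served

    def start_moderation(self, now: datetime) -> None:
        """Begin a stint of moderation duty."""
        self.moderation_started_at = now

    def finish_moderation_if_due(self, now: datetime) -> None:
        """Mark the stint complete once a full week has been served."""
        if (self.moderation_started_at is not None
                and now - self.moderation_started_at >= MODERATION_STINT):
            self.moderation_completed_at = now
            self.moderation_started_at = None

    def can_post(self) -> bool:
        """Reading is always allowed; posting requires a completed stint."""
        return self.moderation_completed_at is not None

    def record_violation(self) -> None:
        """Breaking the rules revokes posting until another stint is served."""
        self.moderation_completed_at = None
```

The point of the sketch is that the hard part is not the code, which is a handful of timestamps, but the institutional will to run it.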

The Hottest Chat App for Teens

"When everyone logs on to do homework at night, Google Docs chats come alive. Groups of kids will all collaborate on a document, while their parents believe they’re working on a school project. As a Reddit thread revealed in February, chatting via Google Docs is also a great way to circumvent a parental social-media ban."
The Hottest Chat App for Teens Is … Google Docs, by Taylor Lorenz, The Atlantic, 14 March 2019

To which Christina Xu (@xuhulk) replied: “I wrote my undergrad thesis on the history of instant messengers and learned that teenagers misusing productivity tools to flirt is truly one of the driving forces of the internet.” (14 March 2019)

We want voters to be aware of who is trying to influence them. That’s the reason we have disclosure requirements on our campaign ads. We’ve known, at least since Aristotle in Western culture, that the source is judged as part of the message.
— Kathleen Hall Jamieson, author of Cyberwar: How Russian Hackers and Trolls Helped Elect a President — What We Don’t, Can’t, and Do Know (Oxford University Press, 2018), as quoted in How Russia Helped Swing the Election for Trump, by Jane Mayer, New Yorker, 24 September 2018

Discourse Saboteurs

[Kathleen Hall Jamieson's] case is based on a growing body of knowledge about the electronic warfare waged by Russian trolls and hackers — whom she terms “discourse saboteurs” — and on five decades’ worth of academic studies about what kinds of persuasion can influence voters, and under what circumstances. Democracies around the world, she told me, have begun to realize that subverting an election doesn’t require tampering with voting machines. Extensive studies of past campaigns, Jamieson said, have demonstrated that “you can affect people, who then change their decision, and that alters the outcome.” She continued, “I’m not arguing that Russians pulled the voting levers. I’m arguing that they persuaded enough people to either vote a certain way or not vote at all.”
How Russia Helped Swing the Election for Trump, by Jane Mayer, New Yorker, 24 September 2018. The article is a profile of Kathleen Hall Jamieson’s forensic analysis of the 2016 election: Cyberwar: How Russian Hackers and Trolls Helped Elect a President — What We Don’t, Can’t, and Do Know (Oxford University Press, 2018)

A pre-Newtonian moment

“Social media is in a pre-Newtonian moment, where we all understand that it works, but not how it works,” Mr. Systrom told me, comparing this moment in the tech world to the time before man could explain gravity. “There are certain rules that govern it and we have to make it our priority to understand the rules, or we cannot control it.”

Leo Laporte: Maybe what he's thinking is Mark Zuckerberg created Facebook to connect and everything like that. It was used against us in our elections by the Russians particularly to convince people not to vote or to stay at home mostly or to vote for somebody in particular. To me that was the come-to-Jesus moment where somebody figured out how to use social media in a very powerful way and they understood it but Zuckerberg did not and it took Facebook off guard, and at first they denied it even happened. Finally of late they've admitted yeah that's what happened.

Larry Magid: I think part of the problem for consumers is that most of us don't know how it works. We know that there are algorithms…

Leo Laporte: But do you think Zuck [Mark Zuckerberg] does is the question?

Larry Magid: That's what I'm saying, I assume that Zuck does, but maybe he doesn't fully understand it.

This Week in Tech (TWIT) 606, 19 March 2019, [at 43:51]


Obscurity makes meaningful and intimate relationships possible, ones that offer solidarity, loyalty and love. It allows us to choose with whom we want to share different kinds of information. It protects us from having everyone know the different roles we play in the different parts of our lives. We need to be able to play one role with our co-workers while revealing other parts of ourselves with friends and family. Indeed, obscurity is one reason we feel safe bonding with others over our shared vulnerabilities, our mutual hopes, dreams and fears.
Why You Can No Longer Get Lost in the Crowd, by Woodrow Hartzog and Evan Selinger, New York Times, 17 April 2019. This article by Dr. Hartzog, a professor of law and computer science, and Dr. Selinger, a professor of philosophy, is part of the New York Times’ Privacy Project

WhatsApp — half of Zimbabwe

The crackdown on social media [in Zimbabwe], in part, is a demonstration of how the WhatsApp corner of the internet has become a powerful space for Zimbabweans. WhatsApp facilitated the spread of misinformation during elections in Brazil and has contributed to caste-based violence and mob killings in India. But it can also serve as a platform for democratized distribution of news in a country with a storied history of oppressing government critics.

Independent media in Zimbabwe are turning to WhatsApp as a primary distributor of news in the midst of an information landscape that is shifting to social platforms. State broadcasters and newspapers have long dominated the media, but alternative platforms began to exist and gain increasing traction in the last years of the Mugabe era.

Zimbabwe, with a population of 16.7 million, has a mobile penetration rate of close to 100 percent, and an internet penetration rate of about 50 percent. WhatsApp connections comprise almost half of all Internet usage in the country. Mobile network operators such as the country’s largest, Econet Wireless, provide data bundles specific to WhatsApp and Twitter, or Facebook and Instagram, for as low as $0.50 or $1 per week, making it more affordable for people to freely communicate in an economy where a significant majority are unemployed.

“When someone talks about internet access in Zimbabwe, they basically are talking about WhatsApp access,” said Thulani Thabango, a Ph.D. candidate in media at Stellenbosch University in South Africa.

It was so easy

“On November 5, 2016, Jestin Coler, founder of the fake newspaper Denver Guardian, posted a ‘news story’ saying an FBI agent involved in leaking Hillary Clinton’s emails was found dead in an ‘apparent murder-suicide.’”
“Everything about it was fictional: the town, the people, the sheriff, the FBI guy,” Coler told NPR. “Our social media guys kind of go out and do a little dropping it throughout Trump groups and Trump forums, and boy, it spread like wildfire.” The made-up tale went viral on Facebook before the 2016 election—and was probably seen by tens of millions. “It was so easy,” Coler told me once.
Zeynep Tufekci, The Imperfect Truth About Finding Facts In A World Of Fakes, Wired, 18 February 2019

A great challenge to society

Netflix does seem to be pushing cultural boundaries and sparking new conversations all over the world. After it plastered Bangkok with billboards advertising “Sex Education” last month, a conservative Thai political party filed a complaint against the company for airing the racy British comedy, which the party called “a great challenge to Thai society.” The young, progressive Thai internet responded in fury, and in the outrage, people started talking about actual problems in Thai society, like the lack of sex education and the high rates of teenage pregnancy.
Netflix Is the Most Intoxicating Portal to Planet Earth by Farhad Manjoo, New York Times, 22 February 2019

Happiness, anger, sadness, disgust, surprise, and fear

But there’s a problem. While the technology is cutting-edge, it’s using an outdated scientific concept stating that all humans, everywhere, experience six basic emotions, and that we each express those emotions in the same way. By building a world filled with gadgets and surveillance systems that take this model as gospel, this obsolete view of emotion could end up becoming a self-fulfilling prophecy, as a vast range of human expressions around the world is forced into a narrow set of definable, machine-readable boxes.
Silicon Valley thinks everyone feels the same six emotions by Dr. Rich Firth-Godbehere, Quartz, 17 September 2018. Dr. Firth-Godbehere says that those emotions are happiness, anger, sadness, disgust, surprise, and fear.

We were looking in the wrong place

I love this article, The ‘Future Book’ Is Here, But It's Not What We Expected by Craig Mod (Wired, 20 December 2018), for how it opens up a new way of thinking about, and looking for, change.

Mod looks at the case of the venerable printed book and argues that while we’ve all been waiting for the physical platform of the book to change — and wondering why it hasn’t — everything else in the stack around, under, and on top of funding, writing, printing, distributing, and promoting books has changed dramatically.

We were looking for the Future Book in the wrong place. It’s not the form, necessarily, that needed to evolve […] Instead, technology changed everything that enables a book, fomenting a quiet revolution. Funding, printing, fulfillment, community-building — everything leading up to and supporting a book has shifted meaningfully, even if the containers haven’t.
Our Future Book is composed of email, tweets, YouTube videos, mailing lists, crowdfunding campaigns, PDF to .mobi converters, Amazon warehouses, and a surge of hyper-affordable offset printers in places like Hong Kong. For a “book” is just the endpoint of a latticework of complex infrastructure, made increasingly accessible. Even if the endpoint stays stubbornly the same—either as an unchanging Kindle edition or simple paperback—the universe that produces, breathes life into, and supports books is changing in positive, inclusive ways, year by year.

Mod’s observations seem to me to be a kind of ninja move for understanding the ways in which the most obvious and highly scrutinized components of an ecosystem or piece of infrastructure can seem to remain stubbornly stagnant while in fact all of the unconsidered enabling elements around them are being transformed.

We tend to look at the surface of things, the indicator species, show stoppers, and divas, at the expense of the rest of the ecosystem — and those ecosystems can be fascinating.

Disruption for Thee, But More for Me

That’s where the trouble starts. Tech law is a minefield of overly broad, superannuated rules that have been systematically distorted by companies that used ‘disruption’ to batter their way into old industries, but now use these laws to shield themselves from any pressure from upstarts who seek to disrupt them.
Disruption for Thee, But More for Me, Cory Doctorow's takedown of the Digital Millennium Copyright Act and the Computer Fraud and Abuse Act, Locus Magazine, 7 January 2019

The article continues,

[The DMCA] is used for “business model enforcement,” to ensure that disruptive, but legal, ways of using a product or service are made illegal – from refilling your printer’s ink cartridge to getting your car or phone serviced by an independent neighborhood repair shop.

Together, the CFAA and DMCA have given digital businesses access to a shadowy legal doctrine that was never written by Congress but is nevertheless routinely enforced by the courts: Felony Contempt of Business-Model.


How do you keep bigots off social media?

FACEBOOK: Wow, it's tough but we're trying! ❤️👍
TWITTER: What is "bigotry", really?
RAVELRY: Our grandmothers wrote code into scarves that sunk Nazi submarines. We carry razor sharp needles everywhere and we will fuck. you. up.

@bentev28, 30 June 2019

In the midst of a withering year or two of big platforms abdicating (or outright denying) their civic responsibilities, fiber arts website/community Ravelry announced a new content policy that bans support for President Donald Trump on its platform.

New Policy: Do Not Post In Support of Trump or his Administration

Sunday, June 23rd 2019

We are banning support of Donald Trump and his administration on Ravelry.

This includes support in the form of forum posts, projects, patterns, profiles, and all other content. Note that your project data will never be deleted. We will never delete your Ravelry project data for any reason and if a project needs to be removed from the site, we will make sure that you have access to your data. Even if you are permanently banned from Ravelry, you will still be able to access any patterns that you purchased. Also, we will make sure that you receive a copy of your data.

We cannot provide a space that is inclusive of all and also allow support for open white supremacy. Support of the Trump administration is undeniably support for white supremacy.

The Community Guidelines have been updated with the following language: “Note that support of President Trump, his administration, or individual policies that harm marginalized groups, all constitute hate speech.”

“Everyone uses Ravelry”: why a popular knitting website’s anti-Trump stance is so significant, by Aja Romano (@ajaromano), is an excellent, long, detailed article in Vox about the new Ravelry policy.

I was curious about the reference to grandmothers who “wrote code into scarves that sunk Nazi submarines” in @bentev28’s tweet. Vox/Romano linked to this fascinating Atlas Obscura article by Natalie Zarrelli, 1 June 2017, The Wartime Spies Who Used Knitting as an Espionage Tool.

During World War I, a grandmother in Belgium knitted at her window, watching the passing trains. As one train chugged by, she made a bumpy stitch in the fabric with her two needles. Another passed, and she dropped a stitch from the fabric, making an intentional hole. Later, she would risk her life by handing the fabric to a soldier—a fellow spy in the Belgian resistance, working to defeat the occupying German force.

Also in the Vox article I learned that the gaming site RPGnet banned Trump support in October 2018.

The following policy announcement is the result of over a year of serious debate by the moderation team. The decision is as close to unanimous as we ever get. It will not be the subject of further debate. We have fully considered the downsides and ultimately decided we have to stay true to our values. We will not pretend that evil isn’t evil, or that it becomes a legitimate difference of political opinion if you put a suit and tie on it.

We are banning support of Donald Trump or his administration on the RPGnet forums. This is because his public comments, policies, and the makeup of his administration are so wholly incompatible with our values that formal political neutrality is not tenable. We can be welcoming to (for example) persons of every ethnicity who want to talk about games, or we can allow support for open white supremacy. Not both. Below will be an outline of the policy and a very incomplete set of citations.

We have a community here that we’ve built carefully over time, and support for elected hate groups aren’t welcome here. We can't save the world, but we can protect and care for the small patch that is this board.

But there are so many of them

Former YouTube engineer Guillaume Chaslot, an artificial intelligence expert who once worked to develop the platform’s recommendation algorithm, says he discovered the severity of the problem, which he believes he helped create, on a long bus ride through his native France in 2014, the year after he left the company. A man sitting on the seat next to him was watching a succession of videos claiming that the government had a secret plan to kill one-quarter of the population. Right after one video finished, another started automatically, making roughly the same claim.

Chaslot tried to explain to the man that the conspiracy was obviously untrue and that YouTube’s recommendation engine was simply serving up more of what it thought he wanted. The man at first appeared to understand, Chaslot said, but then concluded: “But there are so many of them.”

Two years after #Pizzagate showed the dangers of hateful conspiracies, they’re still rampant on YouTube, by Craig Timberg, Elizabeth Dwoskin, Tony Romm and Andrew Ba Tran, Washington Post, 10 December 2018

Regardless of who you are

Today, the risks exist regardless of who you are, what platform you write or speak for, what language you choose to use. The bounds of right and wrong now extend beyond the parameters of a political system, to what is deemed to be moral for the culture and conscientious for the nation. This is a moment of crisis, when new forms of expression and resistance must emerge.