Imagine a world in which

We don’t know exactly what this new future looks like, of course. But one can imagine a world in which, to get on a flight, perhaps you’ll have to be signed up to a service that tracks your movements via your phone. The airline wouldn’t be able to see where you’d gone, but it would get an alert if you’d been close to known infected people or disease hot spots. There’d be similar requirements at the entrance to large venues, government buildings, or public transport hubs. There would be temperature scanners everywhere, and your workplace might demand you wear a monitor that tracks your temperature or other vital signs. Where nightclubs ask for proof of age, in future they might ask for proof of immunity—an identity card or some kind of digital verification via your phone, showing you’ve already recovered from or been vaccinated against the latest virus strains.

All of us will have to adapt to a new way of living, working, and forging relationships. But as with all change, there will be some who lose more than most, and they will be the ones who have lost far too much already. The best we can hope for is that the depth of this crisis will finally force countries—the US, in particular—to fix the yawning social inequities that make large swaths of their populations so intensely vulnerable.
We’re not going back to normal, by Gideon Lichfield, MIT Technology Review, 17 March 2020. With light edits.

20 cents off a can of corn

Most large companies doing business in California are required by the state’s new privacy law to disclose what they know about customers and how that information is used.

This resulted in fairly straightforward announcements by many businesses.

Then there’s Ralphs, the supermarket chain owned by Kroger.

…As part of signing up for a rewards card, Ralphs “may collect” information such as “your level of education, type of employment, information about your health and information about insurance coverage you might carry.”

It says Ralphs may pry into “financial and payment information like your bank account, credit and debit card numbers, and your credit history.” […]

Ralphs says it’s gathering “behavioral information” such as “your purchase and transaction histories” and “geolocation data,” which could mean the specific Ralphs aisles you browse or could mean the places you go when not shopping for groceries, thanks to the tracking capability of your smartphone.

Ralphs also reserves the right to go after “information about what you do online” and says it will make “inferences” about your interests “based on analysis of other information we have collected.”

Other information? This can include files from “consumer research firms” — read: professional data brokers — and “public databases,” such as property records and bankruptcy filings.

[The article also notes that Ralphs' parent company, Kroger, owns a company 'devoted solely to using customer data as a business resource', which aggregates data about its customers and sells it on the open market.]

“This level of intrusiveness seems like a very unfair bargain in return for, say, 20 cents off a can of corn,” Fordham law professor Joel Reidenberg said.

Is a supermarket discount coupon worth giving away your privacy?, by David Lazarus, Los Angeles Times, 21 January 2020

Trooly unctuous

“…and now we’re looking at groups of historically marginalized people being denied involvement in mainstream economic, political, cultural and social activities — at scale.”

Trooly (a play on ‘truly’, ugh) crawls social media, news sites, police and court registries, credit bureaus and similar sources, and uses AI to determine whether, say, an Airbnb renter is likely to be trustworthy, in its opinion.

It does this on demand in about 30 seconds, at a cost of about $1.

The quote in full context, below.

Trooly — [now used by] Airbnb — is combining social credit scores with predictive policing. Tools like PredPol use AI that combines data points and historical events, factors like race and location, digital footprints and crime statistics, to predict likelihood of when and where crimes will occur (as well as victims and perpetrators). It’s no secret that predictive policing replicates and perpetuates discrimination.

Combine this with companies like Instagram, Facebook, YouTube, and yes, Airbnb deciding what legal behaviors are acceptable for service, and now we’re looking at groups of historically marginalized people being denied involvement in mainstream economic, political, cultural and social activities — at scale.

Micro-targeted

One difference that springs to mind is the sheer individualization of it. There are some auctions where you can even bid for an individual human impression. For example, there’s a startup that will let you target a particular person with an ad campaign…

Maybe you want your partner to stop smoking. This startup will generate a special link for you that looks like it’s an e-commerce site. You send it to your partner and when they click it, they get a cookie secretly loaded into their browser. This cookie enables the company to track your partner across the web. You write up an anti-smoking ad, and the company will ensure that your partner sees that ad everywhere. Now your partner’s entire internet experience is permeated with pressures to stop smoking.

You can design a similar campaign for a coworker you don't like. You can show them ads for job-hunting websites, to encourage them to get another job.

From The Art of Eyeball Harvesting, by Shengwu Li, Assistant Professor of Economics, Harvard University. Logic Magazine, Issue #6, Play
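
To make the mechanism concrete, here is a minimal sketch of the tracking-link trick Li describes, assuming a hypothetical ad-tech service. Flask, the route, the cookie name, and the destination URL are all illustrative choices, not details from the article.

```python
# Hypothetical sketch: a "campaign link" that quietly tags a browser.
# Flask, the route, and every name here are assumptions for illustration.
import uuid

from flask import Flask, make_response, redirect

app = Flask(__name__)

@app.route("/c/<campaign_id>")
def campaign_link(campaign_id):
    # The link that looks like an e-commerce site lands here first,
    # then immediately forwards to a real-looking storefront.
    response = make_response(redirect("https://example-shop.invalid/"))
    # Set a long-lived unique ID on the tracker's own domain. When that
    # domain is later loaded inside ad slots on other sites, the cookie
    # lets the service recognize the same browser and serve the campaign.
    response.set_cookie(
        "visitor_id",
        f"{campaign_id}:{uuid.uuid4().hex}",
        max_age=60 * 60 * 24 * 365,  # one year
    )
    return response
```

The unsettling part is how little infrastructure this takes: one redirect, one cookie, and access to the same real-time ad auctions Li describes. (Browsers' growing restrictions on third-party cookies blunt exactly this pattern.)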

Impossible until it's not

Political power is a malleable thing, Mactaggart had learned, an elaborate calculation of artifice and argument, votes and money. People and institutions — in politics, in Silicon Valley — can seem all-powerful right up to the moment they are not. And sometimes, Mactaggart discovered, a thing that can’t possibly happen suddenly becomes a thing that cannot be stopped.
The Unlikely Activists Who Took On Silicon Valley — and Won, by Nicholas Confessore, New York Times Magazine, 14 August 2018. The article details Alastair Mactaggart's work to develop a California ballot initiative to protect consumers' online privacy.

Privacy is hard

“Privacy is Hard, Volume 18283833.

Knowing of 5 downloaded, non-vendor specific apps installed on a device is enough to uniquely identify over 90% of individuals in a sample of a million+ people.”
Privacy researcher and Director of Open Privacy Sarah Jamie Lewis (@SarahJamieLewis), 17 September 2018. Jamie Lewis cites Temporal Limits of Privacy in Human Behavior by Sekara et al., 2018.

Jamie Lewis continues, “This result is not surprising but it’s a nice illustration of how even seemingly ‘useless’/‘unimportant’ information like ‘what apps do you have installed’ can impact total privacy.”
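
The arithmetic behind that result is worth seeing. A back-of-the-envelope sketch follows (not the paper's method; the app-catalogue size is an assumed round number):

```python
# Back-of-the-envelope only: the catalogue size is an assumption,
# and this is not the method used by Sekara et al.
from math import comb

CATALOGUE = 2_000_000  # rough count of apps across the major stores (assumed)
SAMPLE = 1_000_000     # order of the study's sample size

possible_sets = comb(CATALOGUE, 5)  # distinct 5-app combinations
print(f"possible 5-app sets: {possible_sets:.3e}")           # ~2.667e+29
print(f"sets per person:     {possible_sets / SAMPLE:.3e}")  # ~2.667e+23
```

With vastly more possible app sets than people, almost every observed set belongs to exactly one person. Real installs cluster heavily on popular apps, so the raw count overstates the effect, but the paper's empirical finding is that the fingerprint survives that clustering.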

A time-shifted risk

“People always ask me: where's the harm?

Privacy harms are a time-shifted risk.

The moment you are protesting against your government, a seamless cashless public transport system can turn into a data trove for surveillance crowd control.”
Reporter Mary Hui (@maryhui), 13 June 2019, on witnessing Hong Kong protesters using cash transactions to board the subway instead of using their trackable/traceable metro cards. Also see Hui's CASH IS KING: Why Hong Kong’s protesters were afraid to use their metro cards, Quartz, 13 June 2019

Obscurity

Obscurity makes meaningful and intimate relationships possible, ones that offer solidarity, loyalty and love. It allows us to choose with whom we want to share different kinds of information. It protects us from having everyone know the different roles we play in the different parts of our lives. We need to be able to play one role with our co-workers while revealing other parts of ourselves with friends and family. Indeed, obscurity is one reason we feel safe bonding with others over our shared vulnerabilities, our mutual hopes, dreams and fears.
Why You Can No Longer Get Lost in the Crowd, by Woodrow Hartzog and Evan Selinger, New York Times, 17 April 2019. This article by Dr. Hartzog, a professor of law and computer science, and Dr. Selinger, a professor of philosophy, is part of the New York Times’ Privacy Project