The Living Thing / Notebooks:

The surveillance society

All-seeing like a state

Usefulness: 🔧
Novelty: 💡
Uncertainty: 🤪 🤪 🤪
Incompleteness: 🚧 🚧 🚧

Corporate surveillance

Microsoft knowing you

Popular Mechanics details Microsoft’s lens upon the petri dish you live in.

The Quantified Other or How I help Facebook learn enough to replace me with Danbots

Big data, pre-existing conditions, the panopticon, and the messy politics of monetising the confidential information of the masses for the benefit of the powerful. This is mostly opinion pieces; for practical info see Confidentiality, a guide to having it.

Corporate Surveillance in Everyday Life:

Report: How thousands of companies monitor, analyze, and influence the lives of billions. Who are the main players in today’s digital tracking? What can they infer from our purchases, phone calls, web searches, and Facebook likes? How do online platforms, tech companies, and data brokers collect, trade, and make use of personal data?

Chris Stucchio, Why you can’t have privacy on the internet:

A special case of fraud which also relates to the problem of paying for services with advertising is display network fraud. Here’s how it works. I run “My Cool Awesome Website About Celebrities”, and engage in all the trappings of a legitimate website – creating content, hiring editors, etc. Then I pay some kids in Ukraine to build bots that browse the site and click the ads. Instant money, at the expense of the advertisers. To prevent this, the ad network demands the ability to spy on users in order to distinguish between bots and humans.

Facebook will engineer your social life.

Can you even get off Facebook without getting all your friends off it?

Connections like these seem inexplicable if you assume Facebook only knows what you’ve told it about yourself. They’re less mysterious if you know about the other file Facebook keeps on you–one that you can’t see or control.

Behind the Facebook profile you’ve built for yourself is another one, a shadow profile, built from the inboxes and smartphones of other Facebook users. Contact information you’ve never given the network gets associated with your account, making it easier for Facebook to more completely map your social connections.

Vicki Boykis, Facebook is collecting this:

Facebook data collection potentially begins before you press “POST”. As you are crafting your message, Facebook collects your keystrokes.

Facebook has previously used this data to study self-censorship […]

Meaning, that if you posted something like, “I just HATE my boss. He drives me NUTS,” and at the last minute demurred and wrote something like, “Man, work is crazy right now,” Facebook still knows what you typed before you hit delete.
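The mechanism is easy to sketch: a page script subscribes to input events and retains every revision of the text field, so drafts deleted before submission still reach the logger. A minimal illustration follows — this is not Facebook’s actual code; the `DraftRecorder` class and its method names are invented for the sketch.

```javascript
// Sketch of pre-submit draft capture: every revision of a text field
// is retained, so text deleted before the user presses "post" still
// survives in the recorded telemetry.
class DraftRecorder {
  constructor() {
    this.revisions = []; // full history of the field's contents
  }

  // Call on every input event with the field's current value.
  record(value) {
    this.revisions.push({ value, at: Date.now() });
  }

  // What the user actually submitted (the last recorded value).
  submitted() {
    return this.revisions.length
      ? this.revisions[this.revisions.length - 1].value
      : "";
  }

  // Drafts the user typed but then deleted or rewrote: any recorded
  // value that is not simply a prefix of the final submission.
  abandonedDrafts() {
    const final = this.submitted();
    return this.revisions
      .map((r) => r.value)
      .filter((v) => v !== final && !final.startsWith(v));
  }
}

// In a browser this would be wired up roughly as:
//   field.addEventListener("input", () => recorder.record(field.value));
const recorder = new DraftRecorder();
["I just HATE my boss", "Man, work is", "Man, work is crazy right now"]
  .forEach((draft) => recorder.record(draft));
console.log(recorder.abandonedDrafts()); // → ["I just HATE my boss"]
```

The point of the sketch is that nothing here requires pressing “POST”: the `record` calls fire on every keystroke, and the abandoned draft is already on the wire (or queued to be) before the user decides what to publish.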

For a newer and more poetic backgrounder on this kind of stuff, read Amanda K. Green on Data Sweat, a.k.a. digital exhaust.

This diffuse, somewhat enigmatic subset of the digital footprint is composed primarily of metadata about seemingly minor and passive online interactions. As Viktor Mayer-Schönberger and Kenneth Cukier write, it includes “where [users] click, how long they look at a page, where the mouse-cursor hovers, what they type, and more.” While at first glance this type of data may appear divorced from our interior, personal lives, it is actually profoundly embodied in even deeper ways than many of the things we intentionally publish – it is inadvertently “shed as a byproduct of people’s actions and movements in the world,” as opposed to being intentionally broadcast.

In other words, digital exhaust is shaped by unconscious, embodied affects – the lethargy of depression seeping into slow cursor movements, frustration in rapid swipes past repeated advertisements, or a brief moment of pleasure spent lingering over a striking image.

Or, more concretely, Camilla Cannon on the vocal surveillance in the call centre industry:

But vocal-tone-analysis systems claim to move beyond words to the emotional quality of the customer-agent interaction itself.

It has long been a management goal to rationalize and exploit this relationship, and extract a kind of affective surplus. In The Managed Heart, sociologist Arlie Hochschild described this quest as “the commercialization of human feeling,” documenting how “emotional labor” is demanded from employees and the toll that demand takes on them. Sometimes also referred to as the “smile economy,” this quintessentially American demand for unwavering friendliness, deference, and self-imposed tone regulation from workers is rooted in an almost religious conviction that positive experiences are a customer’s inalienable right.

The Trust Engineers is a chin-stroking public radio show about how Facebook researches people. If you project it forward 10 years, this should evoke pants-shitting grade dystopia, when epistemic communities are manufactured to order by an unaccountable corporation in the interests of whomever.

Anand Giridharadas, Deleting Facebook Won’t Fix the Problem:

When we tell people to get off the platform, we recast a political issue as a willpower issue.

State surveillance

Do You Want to Know a Secret by Dorothy Gambrell

Somewhere between lawlessness and a Stasi reign of terror there might exist a sustainable respect of citizen privacy by the state. Are we having the discourse we need at the moment to discover that?

For practical info see Confidentiality, a guide to having it.

Steven Feldstein’s report The Global Expansion of AI Surveillance:

presents an AI Global Surveillance (AIGS) Index—representing one of the first research efforts of its kind. The index compiles empirical data on AI surveillance use for 176 countries around the world.

Alexa O’Brien summarises: Retired NSA Technical Director Explains Snowden Docs.

Peer surveillance

Keli Gabinelli, in Digital alarm systems increase homeowners’ sense of security by fomenting fear, argues that our peer surveillance systems may not police the norms we would like them to:

Fear — mediated and regulated through moral norms — acts as a “cultural metaphor to express claims, concerns, values, moral outrage, and condemnation,” argues sociologist and contemporary fear theorist Frank Furedi. We process fear through a prevailing system of meaning that mediates and informs people about what is expected of them when confronted with a threat. It also poses a threat in and of itself, where simply feeling fear, or feeling insecure, equates to threat and insecurity. Ring, then, is a performative strategy to manage fears related to this sense of vulnerability. Despite posing itself as a tool to “bring communities together to create safer neighborhoods,” it offers hardly any concrete opportunities to connect and bridge those divisions through anything but increased fear. Anonymous neighbors sit behind a screen with their own personal sets of anxieties, and map those beliefs onto common enemies defined by structural prejudices, siloing themselves into a worldview that conveniently attends to their own biases.

Robin Hanson’s hypocralypse is a characteristically nice rebranding of affective computing. It considers how surveillance-as-transparency might be a problem between peers too, as opposed to the classic asymmetrical problem of the elites monitoring the proles.

within a few decades, we may see something of a “hypocrisy apocalypse”, or “hypocralypse”, wherein familiar ways to manage hypocrisy become no longer feasible, and collide with common norms, rules, and laws.[…]

Masked feelings also help us avoid conflict with rivals at work and in other social circles. […] Tech that unmasks feelings threatens to weaken the protections that masked feelings provide. That big guy in a rowdy bar may use new tech to see that everyone else there can see that you despise him, and take offense. Your bosses might see your disrespect for them, or your skepticism regarding their new initiatives. Your church could see that you aren’t feeling very religious at church service. Your school and nation might see that your pledge of allegiance was not heart-felt.

Awaiting filing

Why we live in a dystopia even Orwell couldn’t have envisioned.
