News media and public shared reality. Fake news, incomplete news, alternative facts, strategic inference, kompromat, agnotology. Basic media literacy and whether it helps. As seen in elections.
…newspapers such as the Scarfolk Mail realised that they no longer needed to provide actual content: Readers only saw what they wanted to see and comprehended what they wanted to comprehend.
“Data journalism” has interesting tools. But do people care about data? Are facts persuasive? As Gilad Lotan anecdotally illustrates, merely selecting facts can get you your own little reality, without even bothering to lie. And as Gwern points out in Littlewood’s Law of Media, the anecdotes we can produce grow increasingly… odd.
[This] illustrates a version of Littlewood’s Law of Miracles: in a world with ~8 billion people, one which is increasingly networked and mobile and wealthy at that, a one-in-billion event will happen 8 times a month.
Human extremes are not only weirder than we suppose, they are weirder than we can suppose.
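The arithmetic behind that quote is worth making explicit. A minimal sketch, assuming (as the quote implies) one potentially reportable “event” per person per month — the rate is my assumption, not Gwern’s exact model:

```python
# Littlewood's Law of Media, back of the envelope:
# many people x rare event probability = a steady supply of freak anecdotes.
N_PEOPLE = 8_000_000_000           # world population, roughly
EVENTS_PER_PERSON_PER_MONTH = 1    # assumed rate of reportable "events"
P_FREAK = 1e-9                     # a "one in a billion" event

expected = N_PEOPLE * EVENTS_PER_PERSON_PER_MONTH * P_FREAK
print(expected)  # 8.0 — the "8 times a month" in the quote
```

Double the assumed event rate and you double the monthly miracle count; the point is that an always-on global media pipeline never runs short of outliers.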
But let’s, for a moment, assume that people actually intend to come to a shared understanding of reality, writ large and systemic. Do they even have the skills? I don’t know, but it is hard to work out when you are being fed bullshit, and we don’t do well at teaching that. Here is a course on identifying the lazier type of bullshit. Will all the billions of humans on earth take such a course?
And, given that society is complex and counterintuitive even when we are doing simple correlational analysis, how about more complex causation, such as feedback loops? Nicky Case has a diagrammatic account of how “systems journalism” might work.
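To see why feedback loops defeat simple correlational analysis, here is a toy reinforcing loop of the kind systems journalism might diagram — coverage drives outrage, outrage drives coverage. The variables, parameters, and values are all invented for illustration:

```python
# Toy reinforcing feedback loop: a small shock compounds instead of fading.
def simulate(steps=20, gain=0.3, decay=0.2, shock=1.0):
    """Coverage and outrage each decay, but each also feeds the other."""
    coverage, outrage = shock, 0.0
    history = []
    for _ in range(steps):
        outrage = (1 - decay) * outrage + gain * coverage
        coverage = (1 - decay) * coverage + gain * outrage
        history.append((coverage, outrage))
    return history

trajectory = simulate()
# With gain large relative to decay, both variables grow from a one-off shock.
# The coverage-outrage correlation is near-perfect, yet says nothing about
# which one "causes" the other: the loop itself is the cause.
```

A correlational story (“outraged places get more coverage”) and its reverse are both true here; only the loop structure explains the dynamics, which is the case for systems journalism.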
Let’s get real here; for the moment, reasoned engagement with a shared rational enlightenment doesn’t dominate the media. Bread and circuses and kompromat and gut-instinct do.
Renee DiResta, Social Network Algorithms Are Distorting Reality By Boosting Conspiracy Theories:
Seeking news from traditional sources—newspapers and magazines—has been replaced with a new model: getting all of one’s news from trending stories on social networks. The people that we know best are most likely to influence us because we trust them. Their ideas and beliefs shape ours. And the tech behind social networks is built to enhance this[…]
Once a user joins a single group on Facebook, the social network will suggest dozens of others on that topic, as well as groups focused on tangential topics that people with similar profiles also joined. That is smart business. However, with unchecked content, it means that once people join a single conspiracy-minded group, they are algorithmically routed to a plethora of others. Join an anti-vaccine group, and your suggestions will include anti-GMO, chemtrail watch, flat Earther (yes, really), and “curing cancer naturally” groups. Rather than pulling a user out of the rabbit hole, the recommendation engine pushes them further in. We are long past merely partisan filter bubbles and well into the realm of siloed communities that experience their own reality and operate with their own facts.
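The mechanism DiResta describes is mundane collaborative filtering. A minimal sketch — hypothetical groups, members, and a simple overlap rule, not Facebook’s actual algorithm — shows why joining one conspiracy group surfaces only more of the same:

```python
# Suggest groups whose membership overlaps with groups already joined.
GROUP_MEMBERS = {  # hypothetical group -> member ids
    "anti-vaccine":    {1, 2, 3, 4},
    "anti-GMO":        {2, 3, 4, 5},
    "chemtrail-watch": {3, 4, 5, 6},
    "gardening":       {7, 8, 9},
}

def jaccard(a, b):
    """Overlap between two member sets, 0..1."""
    return len(a & b) / len(a | b)

def suggest(joined, k=2):
    """Rank unjoined groups by member overlap with any joined group."""
    scores = {
        g: max(jaccard(members, GROUP_MEMBERS[j]) for j in joined)
        for g, members in GROUP_MEMBERS.items() if g not in joined
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(suggest({"anti-vaccine"}))  # ['anti-GMO', 'chemtrail-watch']
```

The gardening group, sharing no members, never appears: the engine optimises for similarity, and similarity to a conspiracy community is more conspiracy. Nothing in the scoring rule knows or cares whether the content is true.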
Nick Cohen, Trump’s lies are not the problem. It’s the millions who swallow them who really matter:
Compulsive believers are not just rednecks. They include figures as elevated as the British prime minister and her cabinet. […]
Mainstream journalists are almost as credulous. After decades of imitating Jeremy Paxman and seizing on the trivial gaffes and small lies of largely harmless politicians, they are unable to cope with the fantastic lies of the new authoritarian movements. When confronted with men who lie so instinctively they believe their lies as they tell them, they can only insist on a fair hearing for the sake of “balance”. Their acceptance signals to the audience the unbelievable is worthy of belief.
This is a shallow causal analysis to my mind; after all, we create the systems that make it easier to never “get it”. However, thinking about the credulity of people in power is interesting.
- Full Fact is a full-time fact-checking agency in the UK, which produces fact checks and reports such as Tackling Misinformation in an Open Society
- I found the previous organisation via the Data Skeptic podcast’s fake-news series
- An amusing portrait of Snopes
- How much of the internet is fake?
Unfiltered news does not share well at all:
It can be emotional, but in the worst sense; no one is willing to spread a gruesome account from Mosul among their peers.
Most likely, unfiltered news will convey a negative aspect of society. Again, another revelation from The Intercept or ProPublica won’t get many clicks.
Unfiltered news can upset users’ views, beliefs, or opinions.
Tim Harford, The Problem With Facts:
[…]will this sudden focus on facts actually lead to a more informed electorate, better decisions, a renewed respect for the truth? The history of tobacco suggests not. The link between cigarettes and cancer was supported by the world’s leading medical scientists and, in 1964, the US surgeon general himself. The story was covered by well-trained journalists committed to the values of objectivity. Yet the tobacco lobbyists ran rings round them.
In the 1950s and 1960s, journalists had an excuse for their stumbles: the tobacco industry’s tactics were clever, complex and new. First, the industry appeared to engage, promising high-quality research into the issue. The public were assured that the best people were on the case. The second stage was to complicate the question and sow doubt: lung cancer might have any number of causes, after all. And wasn’t lung cancer, not cigarettes, what really mattered? Stage three was to undermine serious research and expertise. Autopsy reports would be dismissed as anecdotal, epidemiological work as merely statistical, and animal studies as irrelevant. Finally came normalisation: the industry would point out that the tobacco-cancer story was stale news. Couldn’t journalists find something new and interesting to say?
[…] In 1995, Robert Proctor, a historian at Stanford University who has studied the tobacco case closely, coined the word “agnotology”. This is the study of how ignorance is deliberately produced; the entire field was started by Proctor’s observation of the tobacco industry. The facts about smoking — indisputable facts, from unquestionable sources — did not carry the day. The indisputable facts were disputed. The unquestionable sources were questioned. Facts, it turns out, are important, but facts are not enough to win this kind of argument.
Dead cat strategy.
A few years ago, a cynical commentator described the “dead cat” strategy, to be deployed when losing an argument at a dinner party: throw a dead cat on the table. The awkward argument will instantly cease, and everyone will start losing their minds about the cat. The cynic’s name was Boris Johnson.
The tactic worked perfectly in the Brexit referendum campaign. Instead of a discussion of the merits and disadvantages of EU membership, we had a frenzied dead-cat debate over the true scale of EU membership fees.
For more hot tips like that, try The Alt-Right Playbook.
Alexis Madrigal, What Facebook Did to American Democracy; and Buzzfeed, Inside the partisan fight for your newsfeed:
The most comprehensive study to date of the growing universe of partisan websites and Facebook pages about US politics reveals that in 2016 alone at least 187 new websites launched, and that the candidacy and election of Donald Trump has unleashed a golden age of aggressive, divisive political content that reaches a massive amount of people on Facebook.
Thanks to a trinity of the internet, Facebook, and online advertising, partisan news websites and their associated Facebook pages are almost certainly making more money for more people and reaching more Americans than at any time in history. In some cases, publishers are generating hundreds of thousands of dollars a month in revenue, with small operations easily earning five figures thanks to one website and at least one associated Facebook page.
At its root, the analysis of 667 websites and 452 associated Facebook pages reveals the extent to which American online political discourse is powered by a mix of money and outrage.
- ShCh17: (2017) Association of Facebook Use With Compromised Well-Being: A Longitudinal Study. American Journal of Epidemiology, 185(3). DOI
- HoKo16: (2016) Bots, #StrongerIn, and #Brexit: computational propaganda during the UK-EU Referendum. Working paper.
- WeBe11: (2011) Can Ignorance Promote Democracy? Science, 334(6062), 1503–1504. DOI
- OlWo14: (2014) Conspiracy Theories and the Paranoid Style(s) of Mass Opinion. American Journal of Political Science, 58(4), 952–966. DOI
- OlWo14: (2014) Larger than life. New Scientist, 224(3000), 36–37. DOI
- MaLe17: (2017) Media Manipulation and Disinformation Online. Data & Society Research Institute
- DiSe15: (2015) Media, Markets, and Radical Ideas: Evidence from the Protestant Reformation. Centre for Economic Performance Working Paper.
- SoBe17: (2017) Sensory Metrics of Neuromechanical Trust. Neural Computation, 29(9), 2293–2351. DOI
- JoSe94: (1994) Sources of the Continued Influence Effect: When Misinformation in Memory Affects Later Inferences. Journal of Experimental Psychology: Learning, Memory, and Cognition, 20(6), 1420–1436.
- ReCE10: (2010) The Affective Tipping Point: Do Motivated Reasoners Ever “Get It”? Political Psychology, 31(4), 563–593. DOI
- Evan17: (2017) The Economics of Attention Markets (SSRN Scholarly Paper No. ID 3044858). Rochester, NY: Social Science Research Network
- Cadw17: (2017, May 7) The great British Brexit robbery: how our democracy was hijacked. The Guardian.
- VoRA18: (2018) The spread of true and false news online. Science, 359(6380), 1146–1151. DOI