The Living Thing / Notebooks

Risk perception


Heuristics and biases. Micromorts.

Risk, perception of: how humans make judgments under uncertainty, as actors able to form beliefs about the future.

Trying to work out how people make decisions, and whether that process is

  1. probabilistic
  2. rational
  3. optimal
  4. manipulable

and what definitions of each of those words would be required to make this work.

People really want this to be simple, in the sense of mathematically simple, rather than simple in the evolutionarily plausible sense that it is just messy: our lumpy Pleistocene brains worry about great white sharks more than loan sharks because the former have teeth.

Assigned reading 1: The Beast Upon Your Shoulder, The Price Upon Your Head.

Assigned reading 2: Visualising risk in the NHS Breast cancer screening leaflet

Health information leaflets are designed for those with a reading age of 11, and similar levels of numeracy. But there is reasonable evidence that people of low reading age and numeracy are less likely to take advantage of options for informed choice and shared decision-making. So we are left with the conclusion:

Health information leaflets are designed for people who do not want to read the leaflets.
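One remedy proposed in the Gigerenzer literature cited below is to present screening statistics as natural frequencies rather than conditional probabilities. A minimal sketch, using the classic mammography teaching numbers for illustration (not the actual NHS leaflet figures):

```python
# Natural-frequency presentation of a screening result, Gigerenzer-style.
# Illustrative numbers only: assume 1% prevalence, 80% sensitivity,
# and a 9.6% false-positive rate.

def positive_predictive_value(prevalence, sensitivity, false_positive_rate):
    """P(disease | positive test), via Bayes' rule."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# The same calculation phrased as natural frequencies, which low-numeracy
# readers handle far better than percentages: of 1000 women, 10 have
# cancer and 8 of them test positive; of the 990 without cancer, about
# 95 also test positive. So about 8 of ~103 positives have cancer.
ppv = positive_predictive_value(0.01, 0.80, 0.096)
print(f"P(cancer | positive mammogram) ≈ {ppv:.1%}")  # roughly 8%
```

The point of the frequency format is that the final fraction (8 out of roughly 103) can be read straight off the counts, with no explicit Bayes' rule required.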

David Spiegelhalter discusses microlives:

many risks we take don’t kill you straight away: think of all the lifestyle frailties we get warned about, such as smoking, drinking, eating badly, not exercising and so on. The microlife aims to make all these chronic risks comparable by showing how much life we lose on average when we’re exposed to them:

a microlife is 30 minutes of your life expectancy
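The arithmetic behind the unit is simple. A sketch of the conversion; the example habit cost is an assumption for illustration, not one of Spiegelhalter's published figures:

```python
# A microlife is 30 minutes of life expectancy.
MINUTES_PER_MICROLIFE = 30

def life_lost_per_year(microlives_per_day):
    """Days of life expectancy lost, per year of a daily habit."""
    minutes = microlives_per_day * MINUTES_PER_MICROLIFE * 365
    return minutes / (60 * 24)

# e.g. a hypothetical habit costing 1 microlife per day:
print(f"{life_lost_per_year(1):.1f} days/year")  # ≈ 7.6 days
```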

The Ellsberg paradox

The setup: Urn 1 contains 50 black balls and 50 white balls; Urn 2 contains 100 balls, black and white in an unknown proportion. A bet pays $100 if you draw the named colour: B1 is a bet on black from Urn 1, W1 on white from Urn 1, and B2 and W2 are the corresponding bets on Urn 2.

If you’re like most people, you don’t have a preference between B1 and W1, nor between B2 and W2. But most people prefer B1 to B2 and W1 to W2. That is, they prefer “the devil they know”: they’d rather choose the urn with the measurable risk than the one with unmeasurable risk.

This is surprising. The expected payoff of a bet on Urn 1 is $50. The fact that most people favor B1 over B2 implies that they believe Urn 2 contains fewer black balls than Urn 1. But most of the same people also favor W1 over W2, implying that they believe Urn 2 also contains fewer white balls: a contradiction.

Ellsberg offered this as evidence of “ambiguity aversion,” a preference in general for known risks over unknown risks. Why people exhibit this preference isn’t clear. Perhaps they associate ambiguity with ignorance, incompetence, or deceit, or possibly they judge that Urn 1 would serve them better over a series of repeated draws.
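The contradiction can be made mechanical: if choices reveal subjective beliefs about the composition of Urn 2, no composition rationalises both of the typical preferences. A small sketch:

```python
# A bet pays out on drawing the named colour, so its expected payoff is
# proportional to the believed probability of that colour. Preferring
# B1 to B2 requires a believed composition of Urn 2 with fewer than 50
# black balls; preferring W1 to W2 requires fewer than 50 white balls.

N = 100  # balls in Urn 2: k black, N - k white

rationalisable = [
    k for k in range(N + 1)
    if k / N < 0.5 and (N - k) / N < 0.5  # both strict preferences
]
print(rationalisable)  # [] — no belief supports the typical pattern
```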

The paradox is named for RAND Corporation economist Daniel Ellsberg, of Pentagon Papers fame, who popularized it.


Compare the calibration critique of expected utility (Rabin, 2000b; Rabin & Thaler, 2001):

Using expected utility to explain anything more than economically negligible risk aversion over moderate stakes such as $10, $100, and even $1,000 requires a utility-of-wealth function that predicts absurdly severe risk aversion over very large stakes. Conventional expected utility theory is simply not a plausible explanation for many instances of risk aversion that economists study.
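A numerical sketch of this calibration point, under the assumption of CRRA utility u(w) = w^(1−γ)/(1−γ); the wealth level and gamble sizes are illustrative, not Rabin's own calibration:

```python
# How much curvature does it take to reject a small favourable gamble,
# and what does that curvature imply about large gambles?

def crra(w, g):
    """CRRA utility of wealth; assumes g > 1."""
    return w ** (1 - g) / (1 - g)

def eu(wealth, gamble, g):
    """Expected utility of wealth plus a 50-50 (loss, gain) gamble."""
    loss, gain = gamble
    return 0.5 * crra(wealth - loss, g) + 0.5 * crra(wealth + gain, g)

WEALTH = 20_000
small = (100, 110)  # 50-50: lose $100 / gain $110

# Bisect on g to find the risk aversion at which the agent is exactly
# indifferent to the small gamble (accepts below, rejects above).
lo, hi = 1.01, 100.0
for _ in range(100):
    mid = (lo + hi) / 2
    if eu(WEALTH, small, mid) > crra(WEALTH, mid):
        lo = mid  # still accepts: needs more curvature
    else:
        hi = mid
g_star = (lo + hi) / 2
print(f"g needed to reject lose-$100/gain-$110 at $20k wealth: {g_star:.1f}")

# The same g makes the agent reject an absurdly favourable big gamble:
big = (2_000, 1_000_000)  # 50-50: lose $2,000 / gain $1,000,000
assert eu(WEALTH, big, g_star) < crra(WEALTH, g_star)
```

The final assertion is Rabin's point in miniature: curvature steep enough to explain turning down the small bet forces the agent to turn down a 50-50 shot at a million dollars against a $2,000 loss.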


Chase, V. M., Hertwig, R., & Gigerenzer, G. (1998) Visions of rationality. Trends in Cognitive Sciences, 2, 206–214. DOI.
Gigerenzer, G. (2000) Adaptive Thinking: Rationality in the Real World (Evolution and Cognition Series). Oxford University Press, USA
Gigerenzer, G. (2003) Where do New Ideas Come From? a Heuristics of Discovery in the Cognitive Sciences. In M. C. Galavotti (Ed.), Observation and Experiment in the Natural and Social Sciences (pp. 99–139). Springer Netherlands
Gigerenzer, G., & Goldstein, D. G.(1996) Reasoning the fast and frugal way: Models of bounded rationality. Psychological Review, 103(4), 650–669. DOI.
Gigerenzer, G., & Hoffrage, U. (1995) How to improve Bayesian reasoning without instruction: Frequency formats. Psychological Review, 102(4), 684–704. DOI.
Gigerenzer, G., & Selten, R. (2002) Bounded Rationality: The Adaptive Toolbox. The MIT Press
Gigerenzer, G., Todd, P. M., & ABC Research Group. (1999) Simple heuristics that make us smart. Oxford University Press, USA
Golman, R., & Loewenstein, G. (2015) Curiosity, Information Gaps, and the Utility of Knowledge (SSRN Scholarly Paper No. ID 2149362). Rochester, NY: Social Science Research Network
Howell, J. L., & Shepperd, J. A.(2012) Behavioral Obligation and Information Avoidance. Annals of Behavioral Medicine, 45(2), 258–263. DOI.
Hoy, M., Peter, R., & Richter, A. (2014) Take-up for genetic tests and ambiguity. Journal of Risk and Uncertainty, 48(2), 111–133. DOI.
Kahneman, D., & Tversky, A. (1979) Prospect Theory: An Analysis of Decision under Risk. Econometrica, 47(2), 263–292.
Kahneman, D., & Tversky, A. (1984) Choices, values, and frames. American Psychologist, 39(4), 341–350. DOI.
Köszegi, B., & Rabin, M. (2006) A Model of Reference-Dependent Preferences. Quarterly Journal of Economics, 121(4), 1133–1165. DOI.
Rabin, M. (2000a) Diminishing marginal utility of wealth cannot explain risk aversion.
Rabin, M. (2000b) Risk Aversion and Expected-utility Theory: A Calibration Theorem. Econometrica: Journal of the Econometric Society, 68(5), 1281–1292.
Rabin, M., & Thaler, R. H.(2001) Anomalies: Risk Aversion. The Journal of Economic Perspectives, 15(1), 219–232. DOI.
Sedlmeier, P., & Gigerenzer, G. (2001) Teaching Bayesian reasoning in less than two hours. Journal of Experimental Psychology: General, 130, 380–400.
Sharpe, K. (2015) On the Ellsberg Paradox and its Extension by Machina (SSRN Scholarly Paper No. ID 2630471). Rochester, NY: Social Science Research Network
Simon, H. (1956) Rational choice and the structure of the environment. Psychological Review, 63(2), 129–138. DOI.
Simon, H. A.(1958) “The Decision-Making Schema”: A Reply. Public Administration Review, 18(1), 60–63. DOI.
Simon, H. A.(1975) Style in design. Spatial Synthesis in Computer-Aided Building Design, 9, 287–309.
Tversky, A. (n.d.) Features of Similarity.
Tversky, A., & Gati, I. (1982) Similarity, separability, and the triangle inequality. Psychological Review, 89(2), 123–154. DOI.
Tversky, A., & Kahneman, D. (1973) Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207–232. DOI.
Tversky, A., & Kahneman, D. (1974) Judgment under Uncertainty: Heuristics and Biases. Science, 185(4157), 1124–1131. DOI.
Tversky, A., & Kahneman, D. (1981) The framing of decisions and the psychology of choice. Science, 211(4481), 453–458. DOI.
Wolpert, D. H.(2010) Why income comparison is rational. Ecological Economics, 69(2), 458–474. DOI.
Wolpert, D. M., Miall, R. C., & Kawato, M. (1998) Internal models in the cerebellum. Trends in Cognitive Sciences, 2, 338–347. DOI.