I have been thinking about the value of powerful anecdotes. An easy argument for why we should be very cautious of personal experience and anecdotal evidence is that they have the potential to cause us to over-update. E.g. somebody who hears a few harrowing stories from friends about gun violence in Chicago is likely to end up with an overly high estimate of how dangerous Chicago is.
Maybe the best way to formulate our worldview is in a cold and impersonal manner, disregarding most anecdotes in favor of hard data. This is the type of thing I might have once said, but I now think that this approach is likely just as flawed.
First of all, I think it’s an unrealistic demand on most people’s cognitive systems that they toss out the personal and compelling in their worldview.
And second of all, just like personal experience and anecdotal evidence have the potential to cause over-updating, statistical data and dry studies have the potential to cause under-updating.
Reading studies about the seriousness of the psychological harms of extended solitary confinement is no match for witnessing, or personally experiencing, the effects of being locked in a tiny room alone for years. There's a real and important difference between abstractly comprehending a fact and really understanding it. Other terms for this second thing include internalizing the fact, embodying it, and making it salient to yourself.
This difference is not easy to capture in a one-dimensional model of epistemology where beliefs are represented as simple real numbers. I'm not even sure there would be any good reason to build this distinction into artificial intelligences we might eventually construct. But it is there in us, and it has a powerful influence.
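To make the contrast concrete, here's a minimal sketch in Python. Everything in it is an illustrative assumption of mine: the Belief class, the made-up salience field, and the behavior threshold are toy choices rather than a serious model of cognition. The only point is that two beliefs with identical credences can differ in whether they actually drive behavior.

```python
from dataclasses import dataclass

# Toy sketch only: "salience" is an invented second dimension,
# not a standard epistemological quantity.
@dataclass
class Belief:
    credence: float  # the one-dimensional picture: a number in [0, 1]
    salience: float  # how internalized the belief is, also in [0, 1]

def drives_behavior(belief: Belief, threshold: float = 0.5) -> bool:
    # On the one-dimensional model, credence alone would decide this.
    # The distinction in the text: equal credences, unequal influence.
    return belief.credence * belief.salience > threshold

abstract = Belief(credence=0.95, salience=0.1)  # comprehended, not internalized
embodied = Belief(credence=0.95, salience=0.9)  # fully internalized

print(drives_behavior(abstract))  # False: the belief stays inert
print(drives_behavior(embodied))  # True: the belief shapes action
```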
How do we know whether somebody has really internalized a belief or not? I'm not sure, but here's a gesture toward what I think is the right direction.
We can conceive of somebody's view of the world as a massive web of beliefs, where the connections between beliefs indicate dependencies and logical relations. To have fully internalized a belief is to have a web that is fully consistent with the truth of that belief. On the other hand, if somebody verbally reports that they believe A, but also seems to believe B, C, and D, each of which is inconsistent with A, then they have not really internalized A.
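Here's a toy rendering of that picture, under heavy simplifying assumptions of my own: beliefs are bare labels, and the inconsistencies between them are stipulated up front (in reality, detecting inconsistency is the hard part). The data and the internalization_failures helper are hypothetical.

```python
# Toy web of beliefs: nodes are propositions, and the table below marks
# pairs that cannot consistently be held together. All of this data is
# stipulated purely for illustration.
stated = {"A", "B", "C", "D"}  # what the person reports believing
inconsistent_with = {
    "A": {"B", "C", "D"},      # A clashes with each of B, C, and D
}

def internalization_failures(stated, inconsistent_with):
    """Return pairs of stated beliefs that conflict with each other."""
    failures = []
    for belief, clashes in inconsistent_with.items():
        if belief in stated:
            failures.extend((belief, other) for other in sorted(clashes & stated))
    return failures

print(internalization_failures(stated, inconsistent_with))
# [('A', 'B'), ('A', 'C'), ('A', 'D')] -> A is reported but not internalized
```

On this rendering, internalizing A would mean revising the web until no such conflicting pairs remain.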
The worry is that a cold and impersonal approach to forming your worldview is exactly the kind of thing that produces this sort of inconsistency and disconnectedness in your web of beliefs, through the failure to internalize important facts.
Such failures become most obvious when you have a good sense of somebody's values and can simply observe their behavior to see what it reveals about their beliefs. If somebody is a pure act utilitarian (I know that nobody actually has a value system this simple, but play along for a moment), then they should be sending their money wherever it would best maximize utility. If they are not doing so, this reveals an implicit belief that there is no better way to maximize utility than by keeping their own money.
This is sort of an attempt to uncover somebody’s revealed beliefs, to steal the concept of revealed preferences from economics. Conflicts between revealed beliefs and stated beliefs indicate a lack of internalization.
“First of all, I think it’s an unrealistic demand on most people’s cognitive systems that they toss out the personal and compelling in their worldview.
And second of all, just like personal experience and anecdotal evidence have the potential to cause over-updating, statistical data and dry studies have the potential to cause under-updating.”
These are both so true, and it's primarily the second one that I'm going to be heavily emphasizing (to myself and others). Ironically, it's a strain on my cognitive system (and that of many others, I think) to continually take the outside view on my own need to be emotionally charmed, and to go about my information-gathering accordingly. It's much easier to feel that my brain is, in fact, a simpler data-processing machine, and then just think about where and how I'm over-updating or under-updating.