I find urns to be a fruitful source for metaphors regarding rationality. For example, here’s a question that I’ve recently been thinking about: What does it mean for somebody to be biased?
Imagine that there is an urn containing black and white balls that you don’t have direct access to. You want to know the ratio of white to black balls in the urn, and you know somebody that does have direct access to it. This person will remove some number of balls from the urn and show them to you, thus giving you some evidence as to the contents of the urn.
So, for instance, if this person shows you 100 black balls in a row, that is strong evidence that the urn contains many more black balls than white ones. Or is it?
In fact, this is only strong evidence if you have good reason to think that the person presenting you with the evidence is unbiased. We can exactly formulate what unbiased means in this example. The procedure your acquaintance is running has two steps: first they remove some balls from the urn, and second they show you some of the balls they removed. Thus there are two sources of bias. I’ll call the first type of bias knowledge bias and the second presentation bias.
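The strength of that "100 black balls" evidence can be sketched with a toy Bayesian model. The uniform prior and the Beta-Binomial setup here are my own assumptions, not anything specified by the urn story; the point is that the update is only valid under the unbiased-sampling assumption baked into the likelihood.

```python
from fractions import Fraction

def posterior_mean_black(black_seen, white_seen):
    """Posterior mean of the urn's black-ball fraction under a uniform
    (Beta(1, 1)) prior, assuming each shown ball is a fair random draw."""
    # The posterior is Beta(1 + black_seen, 1 + white_seen);
    # its mean is alpha / (alpha + beta).
    return Fraction(1 + black_seen, 2 + black_seen + white_seen)

# Before seeing anything, the model expects an even split.
print(posterior_mean_black(0, 0))    # 1/2

# After 100 black balls in a row, it is nearly certain the urn is
# mostly black -- but only because it assumed unbiased sampling.
print(posterior_mean_black(100, 0))  # 101/102
```

If either bias is present, this calculation is meaningless: the likelihood no longer describes how the balls you saw were actually produced.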
Knowledge bias is what happens if the person is not randomly sampling balls from the urn. Maybe they are fishing through the urn until they find a black ball and then removing it. Or maybe for some complicated reason that they are unaware of, their sampling is unrepresentative of the true ratio in the urn. The first of these corresponds to things like motivated reasoning and confirmation bias. The second is more subtle; it corresponds to a bias in terms of the information that they are exposed to. This could come as a result of living in a culture in which certain views are taken for granted and never questioned, or as a result of the information that reaches them being subject to selection pressures that distort the ratio of information on one side to the other. Scott Alexander’s toxoplasma of rage seems like a good example of this.
In short, knowledge bias refers to a state of knowledge where the information that you have is not representative of the information you would get from a random sampling procedure.
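As a toy illustration of the contrast (the urn composition and sample sizes are invented), compare an unbiased draw with a "fishing" procedure that only ever pulls out black balls:

```python
import random

random.seed(0)  # reproducibility only

# Hypothetical urn: 30 black balls, 70 white.
urn = ["black"] * 30 + ["white"] * 70

def random_sample(urn, n):
    # Unbiased: n balls drawn without replacement.
    return random.sample(urn, n)

def fish_for_black(urn, n):
    # Knowledge bias: rummage until a black ball turns up, n times over.
    blacks = [ball for ball in urn if ball == "black"]
    return blacks[:n]

fair = random_sample(urn, 10)
biased = fish_for_black(urn, 10)
print(fair.count("black"))    # tracks the true 30% ratio in expectation
print(biased.count("black"))  # always 10
```

Note that the second sampler's state of knowledge is already distorted before anything is presented to anyone.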
Presentation bias is what happens when the balls you are being shown are not a representative sample of the balls that were removed. For example, somebody could have a totally random sampling procedure, end up removing 10 black balls and 100 white balls, but then show you only the 10 black balls. On the more explicit side, this corresponds to deliberately omitting information or arguments that you know about. On the less explicit side, it could correspond to doing a better job of presenting arguments with favorable conclusions than those with unfavorable conclusions. This is hard to avoid in general; it is not easy to present arguments you dislike as well as you present arguments you like.
In short, presentation bias is where the information that is being presented is unrepresentative of a random sampling of the information that the presenter has.
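The two-step structure can be made explicit in code (again with made-up numbers): the removal step is perfectly fair, and the bias enters only at the showing step.

```python
import random

random.seed(1)  # reproducibility only

# Hypothetical urn: 10 black balls, 90 white.
urn = ["black"] * 10 + ["white"] * 90

# Step 1 -- unbiased removal: a genuinely random sample.
removed = random.sample(urn, 50)

# Step 2 -- biased presentation: show only the black balls that came out.
shown = [ball for ball in removed if ball == "black"]

# The sample reflects the urn; what you are shown does not.
print(removed.count("white") > 0)  # True: white balls were removed...
print(shown.count("white"))        # 0: ...but none were shown
```

From the viewer's side, this is indistinguishable from an urn full of black balls, which is exactly the problem.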
What if all of the good arguments for one side are really complicated and all of the arguments on the other side are dead simple? If you’re talking to a dumb person, you’ll have a hard time conveying the relative strengths of the arguments on either side. In this case, the bias arises not through the information being presented but through the information being received. This is not a presentation bias, but a knowledge bias on the part of the person listening. An educator in this position faces a choice: either withhold the complicated information that their student won’t understand anyway (a presentation bias), or present it and watch it fail to land (a knowledge bias).
Notice that intention is not emphasized in this way of thinking about bias. While intending to present biased information certainly makes it easier to be biased, it is not necessary. Somebody might be biased as a result of not being smart enough, or being surrounded by a biased culture, or being better at making the case for their side than the other.
Bias can get complicated really quickly. Person A, who gets all of their political information from Fox News, probably has a significant knowledge bias. This knowledge bias arises from a presentation bias on the part of Fox News. If Person A presents some arguments they heard on Fox to a friend, and the friend accepts and updates on those arguments, then the friend will have unwittingly acquired a knowledge bias. This is the case even if there is no presentation bias on the part of Person A!
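One way to see this contagion is a toy relay (the drop rate and sample sizes are invented): an upstream presenter filters out white balls, a downstream presenter honestly passes along everything they received, and the skew survives the honest hop anyway.

```python
import random

random.seed(2)  # reproducibility only

# True urn: an even split of black and white.
urn = ["black"] * 50 + ["white"] * 50

def present(balls, white_drop_prob):
    # A presenter who drops each white ball with probability
    # white_drop_prob. A rate of 0 means no presentation bias at all.
    return [b for b in balls
            if b == "black" or random.random() >= white_drop_prob]

sample = random.sample(urn, 40)      # the upstream source samples fairly...
upstream = present(sample, 0.8)      # ...but drops most white balls it saw
downstream = present(upstream, 0.0)  # an honest downstream hop

# Black balls survive every hop; the skew is inherited, not re-created.
print(downstream == upstream)                            # True
print(upstream.count("black") == sample.count("black"))  # True
```

The downstream audience receives a distorted ratio even though the last presenter did nothing wrong, which is the sense in which bias propagates.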
Basically, bias is contagious. Enter one Super Persuader, somebody who is a master presenter of biased arguments, and bias can propagate like mad throughout a society, to the point that it is unclear who and what can be trusted. I’m not sure to what degree it makes sense to say that this is the state of our society today, but it certainly gives reason to be very careful about the way that information is acquired and disseminated.