(The title is a reference to this post by Scott Alexander, though the actual content of this post not so much.)
I’ve been talking with Pozorvlak and Paul Crowley about this on Twitter. I wanted somewhere slightly longer than 140 characters to express my argument, so here you go.
My thesis is this: given that someone holds known, extremely false beliefs (homeopathy, young earth creationism, global warming denial, opposition to social healthcare, etc.), the fact that they believe something should in many cases be taken as weak evidence against it.
Certainly not strong evidence against it, and certainly not enough to override other more compelling evidence (if they tell me the sky is blue I will nevertheless continue to believe that the sky is blue despite their having provided me with weak evidence against it), but evidence nevertheless.
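The "weak evidence, but not enough to flip a strong prior" claim can be made concrete with Bayes' rule. Here is a minimal sketch; all of the probabilities are illustrative assumptions of mine, not numbers from the argument itself:

```python
# A Bayes'-rule sketch of "an unreliable believer is weak evidence against".
# All numbers are illustrative assumptions, not claims from the post.

def posterior(prior, p_believe_given_true, p_believe_given_false):
    """P(claim is true | the unreliable person believes it), via Bayes' rule."""
    p_believe = (p_believe_given_true * prior
                 + p_believe_given_false * (1 - prior))
    return p_believe_given_true * prior / p_believe

# Suppose the unreliable reasoner is slightly *more* likely to adopt a
# claim when it is false (60%) than when it is true (50%).
print(round(posterior(0.5, 0.5, 0.6), 3))   # 0.455 — nudged below the 0.5 prior

# The "sky is blue" case: a strong prior barely moves under weak evidence.
print(round(posterior(0.99, 0.5, 0.6), 3))  # 0.988 — still near certainty
```

The likelihood ratio (0.5/0.6) is what makes this *weak* evidence: it shifts a coin-flip prior by a few points, but leaves a near-certain belief essentially untouched.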
Two reasons: First, they have demonstrated that their judgment is suspect. Therefore their beliefs should at most be taken as extremely weak positive evidence for something, depending on how much judgment is required to form the belief (I am more likely to believe “it was quite warm yesterday” than “the current pattern of warming is part of a natural cycle”, for example).
Second: they have a process for producing false beliefs and eliminating true ones. Thus one false belief tends to seed many.
What is this process? Well, there are two actually, which both feed on each other.
The first is simple: People want to have consistent belief systems and want to be able to argue for their beliefs. Therefore if they believe one false thing, they will look for facts to justify it. Through a process of motivated reasoning and cherry picking of data they will happily find all sorts of “facts” that support their opinions. So a belief held by such a person may well have been selected by this process in order to support more overtly false beliefs. Similarly, true beliefs are likely to be rejected where they disagree with false beliefs, so as well as being more likely to believe false things they are less likely to believe true ones.
The second: people with false beliefs tend to congregate. Sometimes around the specific area of their false belief, sometimes around related areas, sometimes just because they’re “anti-establishment” and want to show some solidarity. Unfortunately this means that our people with false beliefs, who as previously mentioned have poor judgment in discerning truth, are awash in a lovely warm bath of false memes. Some of these will catch and they’ll acquire a whole new set of false beliefs, which will then feed back into their process of building a self-consistent belief system and generate new ones. Wash, rinse, repeat.
This process tends to focus on areas related to their existing false beliefs, so you should definitely treat proximity as a signal for how heavily to weight another belief as evidence, but the social aspect means that it’s not confined to that exclusively.
In general I would probably not apply this heuristic too aggressively – it’s easier to just not believe what they say one way or the other – but I find it sometimes useful to bear in mind.
It’s also important not to apply this heuristic to people you merely disagree with, or just think are a bit dim. It’s entirely possible for reasonable people to disagree and it is unproductive to react to this by disagreeing with everything your opponent is saying. This heuristic is reserved for people whose beliefs are so out there that you cannot see a reasonable way to believe them.