Let’s start with an apology: This blog post will not contain any concrete examples of what I want to talk about. Please don’t ask me to give examples. I will also moderate out any concrete examples in the comments. Sorry.
Hopefully the reasons for this will become clear and you can fill in the blanks with examples from your own experience.
There’s a pattern I’ve been noticing for a while, but it happens that three separate examples of it came up recently (only one of which involved me directly).
Suppose there are two groups. Let’s call them the Eagles and the Rattlers. Suppose further that the two groups are roughly evenly split.
Now suppose there is some action, or fact, on which people disagree. Let’s call the two positions blue and orange.
One thing is clear: If you are a Rattler, you prefer orange.
If you are an Eagle, however, opinions are somewhat divided. Maybe due to differing values, different experiences, or different levels of having thought about the problem. It doesn’t matter. All that matters is that there is a split of opinions, and it doesn’t skew too heavily orange. Let’s say it’s 50/50 to start off with.
Now, suppose you encounter someone you don’t know and they are advocating for orange. What do you assume?
Well, it’s pretty likely that they’re a Rattler, right? 100% of Rattlers like orange, and 50% of Eagles do, so there’s a two-thirds chance that a randomly picked orange advocate is a Rattler. Bayes’ theorem in action, but most people are capable of doing this one intuitively.
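Spelling out that Bayes step, using only the numbers already assumed above (a 50/50 population split, all Rattlers preferring orange, half of Eagles preferring it):

$$P(\text{Rattler} \mid \text{orange}) = \frac{P(\text{orange} \mid \text{Rattler})\,P(\text{Rattler})}{P(\text{orange})} = \frac{1 \times 0.5}{1 \times 0.5 + 0.5 \times 0.5} = \frac{2}{3}$$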
And thus if you happen to be an Eagle who likes orange, you have to put in extra effort every time the subject comes up to demonstrate that you’re not a Rattler. It’ll usually work – the evidence against you isn’t that strong – but sometimes you’ll run into someone who feels really strongly about the blue/orange divide and be unable to convince them that you want orange for purely virtuous reasons. Even when it’s not that bad, it adds extra friction to the interaction.
And that means that if you don’t care that much about the blue/orange split you’ll just… stop talking about it. It’s not worth the extra effort, so when the subject comes up you’ll just smile and nod, or change the subject.
Which, of course, brings down the percentage of Eagles you hear advocating for orange.
So now if you encounter an orange advocate, they’re more likely to be a Rattler. Say a 70% chance.
Which in turn raises the amount of effort required to demonstrate that you, the virtuous orange advocate, are not in fact a Rattler. Which raises the threshold of how much you have to care about the issue, which reduces the fraction of Eagles who talk in favour of orange, which raises the chance that an orange advocate is actually a Rattler, and so on.
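To see how quickly this feedback loop runs away, here is a toy simulation – a sketch under invented modelling assumptions, not anything the argument depends on. Suppose each orange-preferring Eagle has a “caring” level uniform on [0, 1] and only speaks up when that caring exceeds the current suspicion level, i.e. the posterior probability that an orange advocate is a Rattler. The function name and the update rule are purely illustrative.

```python
# Toy model of the spiral: as fewer Eagles visibly advocate orange,
# the posterior that an orange advocate is a Rattler rises, which
# in turn silences more Eagles. All parameters are invented.

def rattler_posterior(visible_eagle_rate: float) -> float:
    """P(Rattler | advocates orange) via Bayes, with a 50/50 population.

    All Rattlers advocate orange; `visible_eagle_rate` is the fraction
    of all Eagles who still do.
    """
    return 0.5 / (0.5 + 0.5 * visible_eagle_rate)

rate = 0.5  # initially half of all Eagles prefer orange, and all of them say so
for step in range(6):
    suspicion = rattler_posterior(rate)
    print(f"round {step}: visible Eagle advocacy {rate:.3f}, "
          f"P(Rattler | orange) = {suspicion:.3f}")
    # Of the orange-preferring half of Eagles, only those whose caring
    # (uniform on [0, 1]) exceeds the suspicion cost keep speaking.
    rate = 0.5 * (1.0 - suspicion)
```

Under this (arbitrary) model the visible fraction of orange-advocating Eagles collapses within a few rounds – 50% to 17% to 7% to 3% – while the suspicion climbs from two-thirds towards certainty.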
The result is that when the other side is united on an issue and your side is divided, you effectively cede that option to the other side: Eventually the evidence that someone advocating for that option is a Rattler is so overwhelming that only weird niche people, who have some particularly strong reason for advocating orange despite being Eagles, will continue to argue the cause.
And they’re weird and niche, so we don’t mind ostracising them and considering them honorary Rattlers (the real Rattlers hate them too, of course, because they still look like Eagles by other criteria).
As you can probably infer from the fact that I’m writing this post, I think this scenario is bad.
It’s bad for a number of reasons, but one very simple reason dominates for me: Sometimes Rattlers are right (usually, but not always, for the wrong reasons).
I think this most often happens when the groups are divided over values: Eagles care strongly about some value that Rattlers don’t care about either way, and vice versa. Thus the disagreement between Rattlers and Eagles is of a fundamentally different character: Blue is obviously detrimental to the Rattlers’ values, so they’re in favour of orange. Meanwhile the Eagles have a legitimate disagreement, not over whether those values are good, but over the empirical claim of whether blue or orange will be better according to those values.
Reality is complicated, and complex systems behave in non-obvious ways. Often the obviously virtuous action has unfortunate perverse side effects that you didn’t anticipate. If you have ceded the ground to your opponent before you discover those side effects, you have bound your own hands: you are unable to take what now turns out to be the correct path, because only a Rattler would suggest it.
I do not have a good suggestion for how to solve this problem, except maybe to spend less time having conversations about controversial subjects with people whose virtue you are unsure of and to treat those you do have with more charity. A secondary implication of this suggestion is to spend less time on Twitter.
But I do think a good start is to be aware of this problem, notice when it is starting to happen, and explicitly call it out, pointing out that this is an issue on which Eagles can disagree. It won’t solve the problem of ground already ceded, but it will at least help to stop things getting worse.
Like my writing? Why not support me on Patreon! Especially if you want more posts like this one, because I mostly don’t write this sort of thing any more if I can possibly help it, but might start doing so again given a nudge from patrons.