This is a technique that is by now so ingrained in how I think about things that it’s sometimes hard for me to remember that not only do normal people not do this constantly, but it took me the better part of 30 years to figure out how to do it.
Do you frequently catch yourself treating your political enemies as if they are basically bogeymen who want to eat kittens? Do you frequently find yourself saying “I literally can’t imagine how anyone could think this way”?
You may be suffering from demonization: The tendency to believe that people who disagree with you are inhuman and fundamentally incomprehensible monsters.
You can keep doing this if you want, but personally I don’t recommend it. It’s not a useful model of the world, and given how large a proportion of the world probably disagrees with you, I imagine it’s quite stressful going around thinking they’re all fundamentally stupid and/or evil and are barely being restrained by society from chowing down on a nice bowl of kitten pops.
So if you find yourself unable to comprehend how someone could possibly hold a position, consider the following technique:
Take a set of things you care about. These can be things you value, things you fear, or any mixture of the two. Now exaggerate some of them and downplay others.
I often find moral foundations theory useful here (I don’t know enough experimental moral philosophy to comment on its truth, but that’s not actually a required feature for this). For example, in order to understand conservative thinking I dial down care/harm a bit and dial up the other five axes.
Moral qualities are not the only dials you have to twiddle. Trust in a particular group is often a good one too. e.g. anti-vaxxers become much more comprehensible when you consider that there were more than a few instances in the 20th century of “We’re totes vaccinating you, honest” medical experiments, and vaccination programs have not proven to be free of ulterior motives in the 21st century either. It’s not hard to imagine distrusting the people who tell you that vaccines are OK, and dialling up the fear of harming your kids (sanctity/degradation helps here too).
Generally speaking I rarely find a position so alien that I could not imagine myself holding it if my priorities were very different. Sometimes I have to distort my priorities quite drastically (I can just about stretch to understanding people who are against late term abortion, but for people who are against early term abortion I basically have to start saying “Well, if I believed this entirely wrong thing…”), but even then most positions are usually reachable.
It’s worth noting that the purpose of these mental gymnastics is neither to provide an accurate model of people’s beliefs, nor to come up with a reason that they’re OK. The fact that I have a somewhat better understanding of conservative morality than I used to does not make me significantly more inclined to be conservative, and the fact that I can somewhat understand the position of anti-vaxxers does not make me any less inclined to think that they’re child-murdering scum who should be sent to jail (it turns out that even once you’ve thought of someone as entirely human with a coherent set of motivations you can still passionately hate them).
The purpose is to give you a mental picture you can work with, so that you can start treating people as individuals to be engaged with, and if necessary dealt with, rather than as caricatures. It’s a useful working principle for getting things done, not for acquiring a perfect understanding of someone’s motivations. Once you have engaged with them, you will probably find you acquire a more nuanced view of their actual motivations.
It’s also helpful for making me feel better about the world. I don’t know about you, but I find it nice to know that it’s not actually full of moustache-twirling villains who are basically out to cause harm, but instead full of people with coherent sets of motivations that are different from my own.
Obviously you should feel under no compulsion to actually do this. This is intended as a useful technique, not a moral obligation. If you don’t feel comfortable or able to do this, I’m pretty sure I can understand your position.
I wish everyone thought like this. The “some people are just bad” worldview causes so much trouble.
To be clear: This does not necessarily stop you from thinking that some people are just bad. e.g. I would find it very hard to consider someone who put all their morality points into authority/subversion and sanctity/degradation and treated care/harm as a dump stat to be anything other than a truly awful human being.
What it does is get rid of the idea of someone being incomprehensibly awful. Sometimes this is useful because it helps you interact with them productively and feel less bad about the world. Sometimes this is useful because it gives you helpful tools with which to destroy them and everything they hold dear.
Somewhat related: http://www.snell-pym.org.uk/archives/2012/08/02/differentophobia/
To follow up on the important point that understanding people doesn’t mean agreeing with them or being oneself a “good” person: sociopaths are particularly good at understanding what makes people tick. So there is a bit of a paradox in human nature: the ability to step back and strategically assess how someone else thinks might be less common than we would like, if, for whatever evolutionary reason, this ability is correlated with sociopathy.
Speculating wildly: I think a certain amount of this is about emotional distance. It’s much harder to perform abstract reasoning about someone’s motivations when you’re very angry at them, for example. It’s much easier if you don’t actually care about them one way or the other, so sociopaths have something of a head start.
One way I personally experience this is that I find it much easier to do this sort of mental juggling act for groups rather than individuals. I’ve enough distance from anti-vaxxers as a group that I can sit down and calmly think about where they might be coming from, but it’s much harder with someone I’m currently in a heated argument with, where I’m more likely to just conclude they’re a clueless asshole.