I coined a term on Twitter earlier and almost immediately found it useful, so I thought I’d register it somewhere a bit more permanent.
The “Clarke Gap” of a technology is the degree to which it is distinguishable from magic.
(The term of course comes from Clarke’s third law, that any sufficiently advanced technology is indistinguishable from magic, or more specifically from its corollary: any piece of technology distinguishable from magic is insufficiently advanced.)
The Clarke gap is characterised mostly by how likely something is to go wrong in a way that requires you to understand what the system is actually doing in order to fix it. The more likely it is, the larger the Clarke gap is. If something always works and you never need to understand it, it might as well be magic. If you have to open the hood every five minutes and poke the bits that have got misaligned, it’s obviously technology. Most things are somewhere in between.
It’s not that things with a low Clarke gap don’t go wrong, but when they do, they tend to fail in ways that are either easy or extremely hard to fix. A computer, for example, has a fairly low Clarke gap: when it goes seriously wrong you need a lot of technical expertise to figure out what has happened and why, but computers go wrong all the time, and mostly you can just turn the misbehaving program (or the whole computer) off and on again and it will work.
A low Clarke gap is neither wholly good nor wholly bad. It’s mostly about which cases you’re optimising for: systems with a low Clarke gap tend to be much easier to learn at a casual usage level, but much harder to acquire expertise in.
For any given class of technology, the Clarke gap tends to go down over time. If you look at search engines now versus search engines twenty years ago, the Clarke gap has gone way down – a modern search engine is very much trying to guess your intent and thinks it knows better than you, while one from twenty years ago was basically a boolean query engine with maybe some spelling correction if you were lucky. I suspect this has a correspondingly negative impact on people’s ability to pick up search skills – we’re so used to search being magically good that we don’t pick up the necessary knowledge about what to do when it’s not.
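For a sense of just how unmagical those older engines were, here’s a toy sketch of the kind of boolean query engine they resembled: an inverted index mapping each word to the set of documents containing it, with AND and OR as plain set operations. (The documents and function names here are made up for illustration; real engines added ranking, stemming, and much more.)

```python
# A tiny inverted index: word -> set of ids of documents containing it.
docs = {
    1: "cheap flights to paris",
    2: "paris travel guide",
    3: "cheap hotels in rome",
}

index = {}
for doc_id, text in docs.items():
    for word in text.split():
        index.setdefault(word, set()).add(doc_id)

def query_and(*terms):
    """Boolean AND: every term must appear in the document."""
    sets = [index.get(t, set()) for t in terms]
    return set.intersection(*sets) if sets else set()

def query_or(*terms):
    """Boolean OR: at least one term must appear."""
    result = set()
    for t in terms:
        result |= index.get(t, set())
    return result

print(query_and("cheap", "paris"))  # -> {1}
print(query_or("paris", "rome"))    # -> {1, 2, 3}
```

The point is that there is nothing to guess at: the engine does exactly what the query says, which makes it easy to understand when it returns something unexpected – and correspondingly easy to build up expertise in.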
There are a number of factors that tend to push it this way. If users commonly experience a problem, the natural thing to do is to try to make that problem non-existent or easy to solve. This is good development practice! You’re making people’s lives easier. But by doing this you’re also removing a lot of the intermediate ways in which the system can go wrong, while making the system more complex and thus increasing the number of subtle ways it can fail. This lowers the Clarke gap.
I think this is a useful phenomenon to pay attention to, and consequently to have a term for, so I’m probably going to keep using the term until a better one comes around.
A more serious example of this phenomenon: the better autopilots get, the less practice pilots have at dealing with slightly-weird situations, so they’re less well-prepared for dealing with the most serious situations (this was a factor in the crash of Air France Flight 447). In safety engineering, this is discussed as part of the Paradox of Automation.
Right, absolutely. The difficulty of building up expertise through normal usage can be a major problem!
I read a not-very-good book called “The Shallows: What the Internet is Doing to our Brains” recently. One point it made that struck me as good was that, when we interact with Google in the normal way, we do not “search”, we merely “ask”.
Yeah, absolutely! And I think this is something they actively encourage these days. My guess is it’s because there’s a long history of people doing that on search engines where it absolutely didn’t work, and Google found it easier to make it work than to train their users out of it.