Author Archives: david

A time and a place for work

For an Inclusive Culture, Try Working Less is currently doing the rounds. It makes the argument that having a rigid 8:30-5:00 working day creates a more diverse environment by including people whom a so-called “more flexible” set of working hours would exclude.

I think he’s probably right. Certainly I’ve seen a lot of implicit judgement towards people who had to leave early to e.g. pick up the kids from school (or just because they wanted to have an evening to themselves), even when they arrived at work early (which was of course invisible to the people who rocked in at 11).

But at the same time I will never work at a company which enforces those hours. It’s simply not going to happen, even though I fully support the right of other people to work those hours if that’s what works best for them.

I have a low grade sleeping disorder. Many people have far worse experiences than me, but mine are still bad enough that I’m going to take a hard line that if you’re going to make my sleeping experience worse then I’m not going to work for you. Forcing me to conform to your schedule in the mornings definitely counts as making my sleeping experience worse.

What’s interesting here is that I don’t think I’m unusual in this.

I’ve been noticing for a while that tech seems to have a disproportionately high number of people with sleeping disorders in it – I definitely know people with sleep apnea at a higher rate than the background rate would suggest is likely, and it often seems like most of the people I know in tech have some sort of problem with sleep.

Some of this is probably caused by tech – high caffeine consumption, sitting all day, lots of blue light, and a cultural encouragement for obsession are all things that can cause sleep problems. But enough of it (including mine) has a physical root cause that it’s definitely not all caused by tech.

My working theory is simply that flexible hours mean that a job works a lot better for people with sleeping disorders, so people with sleeping disorders will tend to gravitate towards jobs with flexible hours. For better or for worse, that currently includes tech.

I don’t have data to prove that sleep disorders are atypically common in tech. It sure looks like they are, but that could just be selection bias at work.

But regardless of whether it’s more common than usual, it’s certainly common enough that I and a lot of people I know are in this situation, and any work environment which demands we turn up to work at 8:30 is going to be throwing us under a bus.

It’s not just people with sleeping disorders either, there’s a broader ableism issue. Have you ever tried taking a wheelchair on public transport in rush hour? I haven’t, but I’ve helped take someone on the London underground not at rush hour and even that wasn’t much fun. Demanding people all arrive at the same time is a great way to seriously disadvantage people who need a wheelchair (I imagine it’s not great for people with any other sort of mobility issue either).

Sure, you could make exceptions for all of us who have sufficiently convincing reasons. That would be better, but it means we now need to be singled out as special cases, which inevitably makes other people annoyed about our special treatment – I have seen a huge amount of ill will towards developers from coworkers whose jobs require them to be in the office during a particular time range, and I can only imagine it would be worse from people you work more closely with.

And what about those for whom it doesn’t really stretch as far as a disorder or a disability, but is merely a really strong preference? e.g. even if I didn’t have sleep problems (here’s hoping for that future) I really hate crowds and as a result I would very strongly prefer not to travel during rush hour even if I’m awake. I’m not saying I can’t take a rush hour tube, but I’m still going to preferentially select for companies that give me the ability to come in an hour later, and I’m going to be really unhappy if I don’t have that option.

People like me in this regard are sufficiently common in tech now that it’s really unlikely that you’ll ever get a situation where an early start is the norm – we’ll just avoid those companies in preference for ones that don’t require us to do something really unpleasant and harmful to our mental and physical well-being, and the result will be a tech industry divided into two distinct groups of companies with a relatively small intersection moving between them. That’s not a great situation.

Fundamentally the problem is that there is no one-size-fits-all solution. No single set of office hours is going to work for everyone, so what we need is a diversity of options where people can work whichever hours they want or need.

But that’s what we have now and it doesn’t actually work. Everything I’ve seen suggests that flexible working hours don’t really mean flexible: you just converge on a new cultural norm of working later – people tend to gradually conform to a later (and longer) schedule, because when you leave work too early you feel subtly or not-so-subtly judged by your coworkers (whether or not you actually are being judged, though you usually are), which creates a strong pressure to conform or leave.

Even if the original article about the diversity implications is wrong (I do not think it is, but would like to see data before I believe it wholeheartedly) and this isn’t excluding women, it still means we’re creating an environment that is just as bad for early birds as an 8:30 start would be for night owls and others with sleep problems.

So what’s the solution?

Well, I think it’s probably remote work. By separating out the need to physically be there, and allowing a lot of work to be asynchronous, you remove a lot of the implied social pressure to conform to a particular set of hours.

And as a bonus, by opening yourself up to remote work you potentially open up a whole bunch of other opportunities for diversity – even without rush hour, commuting in a wheelchair is hard, and for other disabilities (e.g. people who are immunocompromised) it might not be safe for them to come into the office at all, but they might still be perfectly able to work.

A lot of diversity problems (I’d guess most diversity problems that don’t stem from up front flat out bigotry? I don’t know) come from trying to pretend we can fit everyone into a single mould, and thus silently excluding all the people who can’t fit into that mould. Rigid working hours don’t fix that, they just choose a different shaped mould. I’d rather break that mould altogether instead.

Of course, remote work is itself another mould to try to fit everyone into. It doesn’t work for everyone (though some people who think it doesn’t work for them can learn to love it – I did), but you can fix that to a large extent by e.g. renting them desks in a co-working space or having an office people can come into if they want. You can also probably get a lot of the benefits by allowing partial remote work – e.g. I personally would be mostly happy if I could do a couple of hours of work in the morning and then come in. Other peoples’ needs will differ, and that definitely won’t be enough for everyone, but even small accommodations help to include more people.

So it’s not perfect, and it certainly won’t fix everything, but nothing is and nothing will. I still suspect that starting from the principle that presence isn’t required and then fixing the problems that causes is going to be a much easier path to diversity than trying to force everyone to be in the same place at the same time and then trying to fix all the problems that causes.


Talking to people at conferences

I’m currently at the Halmstad Summer School on Testing, where I know literally nobody. This means that I’m having to exercise one of my most useful and hardest won conference skills: Going up to new people and talking to them.

I can’t claim any special ability at doing this. If anything, I’m bad at it. But I started out terrible at it, so I thought I’d offer some advice for other people who are terrible at it and want to become less terrible (which, based on observational evidence at conferences and talking to friends, is a lot of us).

The big thing to know is that it’s not complicated (which is not the same as saying it’s easy). The following procedure works for me basically 100% of the time:

  1. Go up to somebody who isn’t currently talking to someone and doesn’t look like they’re busy.
  2. Say “Hi, I’m David” (you may wish to substitute your own name here if it is not also David).
  3. Make conference appropriate small talk.
  4. Part ways at a suitable juncture (e.g. beginning of next talk), and if you enjoyed each other’s company you can naturally say hi again later, and if not you won’t.

If you’re like me, that probably sounds impossible, but it’s actually surprisingly doable once you manage to suppress the associated feeling of mortal dread.

The thing that helped me the most was understanding what caused me stress (going up to groups where I didn’t know anyone) and just not doing that, which is why it’s about finding single people to go up and talk to. I generally don’t approach groups unless I already know some of the people in the group.

The second thing that helps is understanding that this behaviour is appropriate, socially acceptable, and often outright welcome.

You are at an event where a large part of the purpose is to meet people. Therefore introducing yourself to strangers is a thing that is part of the event and does not need an excuse. Also, the people around you are probably also struggling to do the same. By picking someone else who is also not talking to people, there’s a good chance you’ve found someone who is struggling the same way you are and have done them a massive favour by removing that struggle.

Is it sometimes a bit awkward? Yeah. Is it the perfect approach? No. But it works reliably, I am able to do it, and it does not rely on flawless execution to go smoothly. It is very unlikely to go terribly, and it will probably go well.

It’s still anxiety inducing, but for me the knowledge that this is acceptable behaviour and nothing bad is going to happen is enough to take it from terrifying to merely intimidating, at which point it’s fairly feasible to just force myself to do it.

Specific tips:

Picking who to talk to is tricky, but the nice thing about this just being a brief introductory conversation is that you don’t have to do it well. I don’t have a particularly good algorithm, but I vaguely use the following guidelines:

  • People you’ve met in passing but not really properly talked to are an easy place to start.
  • If I see a speaker or someone I vaguely know something about, I’ll tend to default to them (as someone who regularly speaks at conferences, I can confirm speakers are just as socially awkward about doing this as the rest of us and will appreciate you talking to them).
  • I often preferentially try to talk to women or other people who are in a minority for the conference (obviously at some conferences women won’t be a minority, but I work in tech where sadly they usually are). This advice works better if you are yourself in a minority at the conference, but I figure that if people are feeling isolated it’s still better to have someone to talk to who isn’t going to be a jerk (which I’m told I’m mostly not), and they’re at least as likely (probably more) to be interesting people to talk to as anyone else.
  • Other than that, I just pick a random person nearby.

Once you’ve picked a person and introduced yourself, it’s time for the dread small talk. Fortunately, although small talk in general is hard, conference small talk is much easier. There are two reasons for this:

  • When you ask “What do you do?” the chances are good that it’s something relevant to the conference, and thus you have common ground to talk about.
  • You can always talk about the talks at the conference – which they have enjoyed, if there are any they are particularly looking forward to, etc.

The parting ways aspect is important largely to avoid the problem of finding one person to talk to and then latching on to them. It’s doubly important for me because of a moderate amount of insecurity about seeming to do that even when I’m probably not. Fortunately conferences come with a natural rhythm, so it’s fairly easy to do.

Another reason why it helps is that it keeps the entire interaction fairly low cost – you’re not committing to a new best friend for the entire conference, you’re just meeting someone new and having a brief chat with them.

So that’s how I introduce myself to new people. After that, I try to “pay it forward” in a couple of ways:

  • I try to introduce people I’ve talked to to each other. e.g. if I’m talking to someone and someone I’ve previously interacted with wanders past I say hi to them and ask “Have you two met?”
  • If I’m in a group (or even just talking to one other person) and see someone awkwardly standing around, I try to bring them into it (a “Hi, I’m David. Come join us” is usually sufficient).

Other people are also struggling with this, and helping them out is a good deed, which is the main reason I do it, but conveniently it’s also a good way to meet people. It’s much easier to meet someone by bringing them into a group than it is to approach them on your own, and by forming a group you’ll tend to get other people whom members of the group know agglomerating onto it. Even if you don’t talk to them now, talking to them later becomes easier.

 


Thoughts on the election

Beeminder is demanding that I blog today, but I’m not really feeling it, so I’m going to phone this one in, sorry.

I was trying to write something intelligent. Maybe something about delta debugging, maybe something about voting systems. When the dust settles and we have some data maybe I’ll do an analysis of what this election would have looked like under Random Ballot or something.

But, well, right now I’m too depressed, so here are some very ill-formed and ill-informed thoughts on the UK general election that is causing that depression.

This was a lot better than I feared, slightly worse than I expected, and a lot worse than I’d hoped it would be. I never expected a Labour majority (and I’m not the biggest Corbyn fan so would have felt only modestly positive if we’d got one), but I did think Labour might have been able to form a coalition.

Instead we get a Conservative + Democratic Unionist Party (think a more right-wing Northern Irish version of the Conservatives. This is probably not a very fair description but I’m not very inclined to provide a fairer one) not-quite-a-coalition.

I confess I forgot completely about the DUP as a factor (English centric bias, sorry), and the Tory wins in Scotland were a complete surprise to me (Somewhat English centric bias, mostly that the Scots I know very strongly conform to the stereotype of Scotland being very left wing even though I know the reality is different), but I’d be lying if I said I ever really had a very firm sense of how the political landscape was going to go. I was mostly going on a mix of general knowledge and dread.

The dread turned out to be pretty warranted. Although I’m enjoying the schadenfreude of May losing her majority, this isn’t really much better than we started with. The DUP are terrible, and a Conservative/DUP alliance is going to be an improvement on the Conservative majority it replaces in only three ways:

  • Their majority is smaller
  • They will be less able to get things done due to internal disagreements
  • They might go for a softer Brexit than they otherwise would have.

There’s also the argument that Brexit is going to be a disaster for whichever party deals with it, so in the long run this might be better by making the Conservative government pay the consequences. I’m not entirely sure I buy the calculus here, but it’s at least a small glimmer of hope.

Mostly I feel like as usual this election underlines the need for a better electoral system. The popular vote is so close between Labour and Conservative, with neither of them that close to a majority.

It is of course invalid to project how people would vote under a different voting system from this, but counting up the minority parties it is at least suggestive that if we’d had a more proportional system then we’d have likely been in the territory of the progressive alliance many people were hoping for – Labour + SNP + Lib Dem comes to 50.4% of the vote. Add the greens in and you’re up to 52%. Of course, in reality, that 52% of the vote came to 47.5% of the seats, so a small win became a small loss instead.

Oh well, so it goes. Another five years of something resembling this government.

Unless someone calls another general election I guess. So give it six months?


Adaptive delta debugging

Note: This blog post is mostly intended to be expository about a nice technique, so I’m omitting a lot of the evidence for my claims. Sorry. I hope to write those up better at some later date.

As I explained previously, I’m quite interested in the problem of faster test case reduction.

In my previous post I explained how to improve on the basic greedy algorithm for test case reduction by better choices of where to try to delete. Now I want to show how to improve on delta debugging itself by better choices of deletion size.

Delta debugging can be thought of as the following code:

def ddmin(test_case, predicate):
    # Start by trying to delete half of the test case at a time.
    block_size = len(test_case) // 2
    while block_size > 0:
        prev = test_case
        i = 0
        while i < len(test_case):
            # Try deleting the block of block_size elements starting at i.
            attempt = list(test_case)
            del attempt[i:i+block_size]
            if predicate(attempt):
                # The deletion worked, so keep it and retry at the same index.
                test_case = attempt
            else:
                i += block_size
        # If a whole pass made no progress, halve the block size and go again.
        if test_case == prev:
            block_size //= 2
    return test_case

(this is not how delta debugging is described in the original paper, which is in terms of partition size, but it is compatible with their description and is more straightforward to adapt to the linear time version of greedy search)

This is the same as the greedy search I outlined previously, but instead of always deleting ranges of size one we try to delete long ranges. When this doesn’t work, we decrease the size of the range and try again.

For many cases this produces a dramatic speedup. e.g. it can reduce a long list to a single element list in log(n) steps.
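
To make that concrete, here is a toy example (the predicate is made up purely for illustration, it isn’t from the original post):

# A test case is "interesting" iff it still contains 17. ddmin repeatedly
# deletes large blocks, halving the block size whenever a pass makes no
# progress, so it reaches the one-element answer in a handful of passes
# rather than a thousand single-element deletions.
print(ddmin(list(range(1000)), lambda xs: 17 in xs))  # prints [17]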

Unfortunately for many other cases this ends up being substantially slower than the naive greedy algorithm. In simulation I’ve found that if the final test case is about 10-15% of the size of the original test case then delta debugging starts to be more expensive than just running the greedy search. For cases where the test case is already minimal, delta debugging pays an additional cost between 50% and 100% on top of the greedy algorithm.

The problem is basically that delta debugging is not making good choices of block size for these cases, and doesn’t have a better way to choose a block size than just try it and see what happens. In order to improve on this behaviour we need to get a block size that is more appropriate for the particular target we’re shrinking.

There turns out to be an easy way to do this. Instead of trying to choose the block size globally, we choose it locally, finding a block size for the current index that produces a large delete. The key tool for doing this will be the binary probe and search algorithm I outlined before.

At each index in our greedy search instead of just trying to delete one element, we search for a long block that we can delete starting from there and delete that:

def adaptive_ddmin(test_case, criterion):
    prev = None
    # Keep making passes over the test case until a pass deletes nothing.
    while prev != test_case:
        prev = test_case
        i = 0
        while i < len(test_case):
            def is_block_deletable(k):
                # Is the test case still interesting with the k elements
                # starting at index i deleted?
                return criterion(test_case[:i] + test_case[i+k:])
            # Find a large block starting at i that can be deleted in one go.
            block_size = find_change_point(is_block_deletable, 0, len(test_case) - i)
            if block_size > 0:
                test_case = test_case[:i] + test_case[i + block_size:]
            i += 1
    return test_case

This has all the advantages of ddmin, but is able to automatically figure out the appropriate block size. In almost all circumstances it will perform far fewer shrinks than ddmin. The counterexamples are basically when there is a great deal that can be deleted, where in some cases it will perform an additional O(log(n)) shrinks to discover that fact.

It also performs much more closely to greedy search in the cases where greedy search beats delta debugging. It doesn’t always get that close to greedy search – each block deletion deletes block_size elements and checks that you can’t delete an additional element. This takes O(log(block_size)) calls where the greedy algorithm would take block_size + 1 calls, so as block_size gets large this algorithm starts to win big. For small block sizes though it’s not a win, and you can see it making up to 50% more calls (as bad as classic delta debugging! Although in very different scenarios) in some cases.

This can be improved by special casing the algorithm to do a linear probe for very small values. We replace find_change_point with:

def find_change_point(f, lo, hi):
    base = f(lo)
    linear_cap = min(hi + 1, lo + 6)
    for i in range(lo + 1, linear_cap):
        if f(i) != base:
            return i - 1
    return binary_search(f, *exponential_probe(f, linear_cap - 1, hi))
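
binary_search and exponential_probe here are the probe-and-search helpers from the earlier post, which I haven’t reproduced. For readers without that context, a minimal sketch of plausible implementations – my reconstruction of something consistent with how find_change_point calls them, not the exact code from that post – looks like this:

def exponential_probe(f, lo, hi):
    # Assumes f(lo) is the "base" value. Probes lo+1, lo+2, lo+4, ...
    # (clamped to hi) until f changes, returning an interval suitable for
    # binary_search. If f never appears to change, returns (hi, hi) so
    # that the final answer is hi.
    base = f(lo)
    gap = 1
    while lo < hi:
        probe = min(lo + gap, hi)
        if f(probe) == base:
            lo = probe
            gap *= 2
        else:
            return (lo, probe)
    return (lo, lo)

def binary_search(f, lo, hi):
    # Precondition: f(lo) is the base value and, if hi > lo, f(hi) is not.
    # Returns the largest i in [lo, hi) with f(i) == f(lo) (or lo itself
    # when lo == hi), assuming the points where f equals the base value
    # form a prefix of the range.
    base = f(lo)
    while lo + 1 < hi:
        mid = (lo + hi) // 2
        if f(mid) == base:
            lo = mid
        else:
            hi = mid
    return lo

As written these re-evaluate f at points find_change_point has already checked; to get call counts like the ones below you would want to memoise the predicate (it only takes an integer here, so that’s easy) rather than pay for the redundant calls.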

This special case removes some of the worst cases, and leaves us with the following ratios of calls made per deleted block, relative to the greedy algorithm:

block_size ratio
1 1.000000
2 1.000000
3 1.000000
4 1.000000
5 1.000000
6 1.142857
7 1.000000
8 1.111111
9 1.000000
10 0.909091
11 0.833333

Above this the ratio consistently remains better than 1. So the worst case scenario is that this algorithm ends up making < 15% more calls than the greedy algorithm.

It also consistently beats ddmin in simulation. I tested them both on a model of reducing a sequence to a unique subset (the predicate was that the example was interesting if and only if it contained a particular set of indices), with subsets chosen at various fixed expected fractions of the sequence. It was essentially always better, over a range from 1% to 99% of the set. It fairly consistently makes about 75% as many calls to get the same result, but I don’t have a theoretical argument that that should be the case.
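
For the curious, the simulation harness was roughly of this shape (a hypothetical reconstruction – the function name, parameters and defaults are mine, not the actual benchmark code):

import random

def simulated_calls(n=1000, fraction=0.1, seed=0, reducer=adaptive_ddmin):
    # A test case is "interesting" iff it still contains every index in a
    # randomly chosen target subset covering (in expectation) `fraction`
    # of the sequence. Returns how many predicate calls the reducer made.
    rnd = random.Random(seed)
    target = {i for i in range(n) if rnd.random() < fraction}
    calls = 0

    def predicate(attempt):
        nonlocal calls
        calls += 1
        return target.issubset(attempt)

    result = reducer(list(range(n)), predicate)
    assert set(result) == target  # both reducers should find exactly the subset
    return calls

Running this with reducer=ddmin and reducer=adaptive_ddmin over a range of fractions and comparing the two call counts gives the sort of comparison described above.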

This may not seem like a big deal, but this sort of technique is often applied in cases where individual calls can take multiple seconds (e.g. because you need to compile and run some code), so anything that gives you a speedup like this is greatly appreciated.

It also appears to be a major improvement in practice as well as in simulation. I’m not currently using this technique in hypothesis (though I do plan to), but I have a branch of structureshrink using this and it’s proving to be a major win.


Older women as protagonists in SFF

I asked on Twitter the other day for examples of science fiction and fantasy where the protagonists were middle aged and upwards women.

Why? No amazingly good reason. Mostly just I was rereading “A Key, An Egg, An Unfortunate Remark” by Harry Connolly, which I really like, and I was thinking that most of the examples of such books I’ve read I’ve enjoyed. There’s a broader political point about representation, and I do care about that, but I was mostly just looking for good and slightly different books.

Here are the results from that thread:

Ones I can personally recommend

These are the ones I’ve read and what I think about them.

A Key, An Egg, An Unfortunate Remark by Harry Connolly

This is the one that started the thought process. It’s good. It’s a mystery novel about a 60 year old high-society lady who used to be a monster hunter but has turned pacifist. She now basically dominates the magical scene of Seattle and forces people to play nice.

It’s stylistically a bit odd (at least partly due to a conceit you find out about a third of the way in that I enjoyed as a one off but think would get old fast), but a lot of fun and I definitely recommend it.

Gentleman Jole and the Red Queen by Lois McMaster Bujold

Probably doesn’t make a lot of sense if you haven’t read the previous books in the Vorkosigan Saga (which is very good but mostly does not count for this list) – it’s a book about Cordelia Vorkosigan, the mother of the eponymous protagonist of most of the rest of the series. It’s also a book about how much of her life her son has been completely oblivious to.

I think everything Bujold does is great, but I really enjoyed this one in particular.

Paladin of Souls by Lois McMaster Bujold

I confess I remember literally nothing about this book except that it was by Bujold and didn’t change my opinion that I like almost everything she writes. So recommended on that strength but not much else.

The Ninth Rain by Jen Williams

Only partially counts and only partially recommended. The book arguably has three protagonists – a middle aged black lesbian, a young woman persecuted for witchcraft, and an immortalish man who is somewhere between a vampire and an elf and acts a lot like he was a young man.

I like Jen Williams but I only weakly recommend the book. The protagonist who gets it onto this list is great, but I felt like the story overall had a bit too much going on and didn’t quite live up to its promise.

The Fifth Season by N. K. Jemisin

A middle-aged mother tries to find her daughter, who has been abducted by her husband. A rare instance of the genre of post-apocalyptic high fantasy (which I enjoy in general).

N. K. Jemisin is very good and I recommend most of her books, including this one.

A Crown for Cold Silver by Alex Marshall

The protagonist is a retired revolutionary who conquered the known world and then deliberately faded into obscurity, but her past catches up with her and she’s forced out of retirement.

I actually remember very little about this book other than that I enjoyed it.

My Real Children by Jo Walton

A sort of sliding doors style exploration of a woman’s life as she lives it in two alternate histories of the world based on the differences caused by a single choice she made that turned out to have far-reaching ramifications. It covers most of her life from a young age but primarily the middle of it.

This novel is very good but absolutely heart breaking.

The Annihilation Score by Charles Stross

One of the many books in Charles Stross’s Laundry Files, this one focusing on Mo, the wife of the protagonist from earlier books.

Doesn’t really stand on its own without the rest of the series, and I’m kinda tiring of the Laundry Files a bit, but they’re still worth reading.

Rule 34 by Charles Stross

Not 100% sure this one counts, but the protagonist is a fairly senior police officer so is at the very least not young.

Good book, although it’s been long enough since I’ve read it that I’m hazy on the details. Would nevertheless recommend.

The witches books by Terry Pratchett

  • Equal Rites
  • Wyrd Sisters
  • Witches Abroad
  • Lords and Ladies
  • Maskerade
  • Carpe Jugulum

Nanny Ogg and Granny Weatherwax are some of my favourite of Pratchett’s characters. Can definitely recommend.

Jackalope Wives by T. Kingfisher

T. Kingfisher (secretly Ursula Vernon) is great and you should read everything she writes. The protagonist of the titular “Jackalope Wives” and a few other stories in that collection (including “The Tomato Thief”, where she reappears) is a grandmother who likes tomatoes and magic and puts up with very little nonsense.

Recommendations from Others

I haven’t read these yet, but I aim to fix this.

  • The Marq’ssan cycle by L. Timmel Duchamp
  • Tea With the Black Dragon by R. A. MacAvoy
  • Holy Fire by Bruce Sterling
  • The Gaia Series (Titan / Wizard / Demon) by John Varley
  • Tehanu by Ursula Le Guin (I think I actually have read this one but it was ages ago and I have no memory either way of it)
  • Woman on the Edge of Time by Marge Piercy
  • The Year of the Flood by Margaret Atwood
  • Fair Peril by Nancy Springer
  • The New Moon’s Arms by Nalo Hopkinson
  • Remnant Population by Elizabeth Moon
  • Sassinak by Anne McCaffrey and Elizabeth Moon
  • The Baba Yaga by Una McCormack and Eric Brown (this is the third in a series where the first two don’t seem to qualify)

I’m currently reading through The Memoirs of Lady Trent series by Marie Brennan (which will probably count in some of the later books I haven’t read yet – the protagonist is writing her memoirs from an age where she would definitely qualify, but I’m so far only on book three where she’s only just in her 30s, which I am determined not to count as middle age while I’m still in them). After I’ve finished this series I’ll start making inroads into the above list and report back.

In the meantime, feel free to suggest more books, or point out ones I missed from the thread, in the comments.
