A cute lower bound for weighted FAS tournaments

As mentioned, for my random hack project du jour I am currently working on an optimiser for weighted feedback arc sets.

This problem can largely be summarised as follows: I have a non-negative weight matrix \(A\) and I want to find a permutation \(\pi\) that maximizes the score

$$w(\pi) = \sum_{\pi(i) < \pi(j)} A_{ij}$$

(It's more normal to phrase this in terms of minimizing the sum going the other way, but the two problems are equivalent.)

The thing I was wondering about recently: what sort of scores am I guaranteed to be able to achieve? There's the trivial lower bound

$$w(\pi) \geq \sum_{i < j} \min(A_{ij}, A_{ji})$$

but, equally trivially, this is easy to exceed whenever \(A\) is not a constant matrix. My question is: what value are we guaranteed to be able to exceed with some choice of \(\pi\)?

I realized earlier that there's a really nice approach to this. Let \(\chi\) be a permutation picked uniformly at random. Then we can write

$$w(\chi) = \sum_{i < j} S_{ij}$$

where \(S_{ij}\) is the random variable that takes the value \(A_{ij}\) if \(\chi(i) < \chi(j)\) and the value \(A_{ji}\) otherwise. It's easy to see that \(S_{ij}\) takes its two values with equal probability, so

$$E(S_{ij}) = \frac{1}{2}(A_{ij} + A_{ji})$$

But then we have

$$E(w(\chi)) = E\left(\sum_{i < j} S_{ij}\right) = \sum_{i < j} E(S_{ij}) = \frac{1}{2} \sum_{i \neq j} A_{ij}$$

So, given that the expected weight of a random permutation is this value, there certainly exists a permutation with weight at least this value.

This is nice and all, but it turns out to be relatively easy to do better by looking at the variance as well. It's a simple calculation to show that

$$Var(S_{ij}) = \frac{1}{4} (A_{ij} - A_{ji})^2$$

The \(S_{ij}\) are not independent, though, so the variances don't simply add. Write \(D_{ij} = A_{ij} - A_{ji}\). If \(\{i, j\}\) and \(\{k, l\}\) are disjoint then \(S_{ij}\) and \(S_{kl}\) are independent (you can prove this by inspection of cases), but two terms sharing an index are correlated: for example \(P(\chi(i) < \chi(j) \text{ and } \chi(j) < \chi(k)) = \frac{1}{6} \neq \frac{1}{4}\), and in general the covariance of two overlapping terms works out as \(\pm\frac{1}{12} D_{ij} D_{kl}\), with the sign depending on how the pairs overlap. Adding up all the variances and covariances and simplifying, we get

$$Var(w(\chi)) = \frac{1}{12}\left(\sum_{i < j} D_{ij}^2 + \sum_i \left(\sum_j D_{ij}\right)^2\right)$$

(As a sanity check: for a single pair this recovers \(\frac{1}{4} D_{12}^2\), and for a constant matrix it's zero, as it must be, since every permutation then has the same weight.)

Why is this important? Well, writing \(\mu = E(w(\chi))\), the variance \(E\left((w(\chi) - \mu)^2\right)\) can't exceed the maximum value of \((w(\pi) - \mu)^2\), so we are guaranteed some permutation at least one standard deviation away from the mean. Moreover, the distribution of \(w(\chi)\) is symmetric about its mean: composing with the order-reversing permutation swaps the contributions of \(A_{ij}\) and \(A_{ji}\) for every pair, so it sends a permutation of weight \(\mu + t\) to one of weight \(\mu - t\). Therefore there must be a permutation at least one standard deviation above the mean. That is, we can always find a permutation \(\pi\) with

$$w(\pi) \geq \frac{1}{2} \sum_{i \neq j} A_{ij} + \sqrt{\frac{1}{12}\left(\sum_{i < j} D_{ij}^2 + \sum_i \left(\sum_j D_{ij}\right)^2\right)}$$
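If you want to check this numerically, here's a quick self-contained sketch. It has nothing to do with my actual optimiser; `score`, `lower_bound` and the brute-force loop are purely illustrative. It computes the bound and verifies it against exhaustive search on small random matrices:

```python
import itertools
import math
import random

def score(A, perm):
    """w(pi): the sum of A[i][j] over pairs that perm orders with i before j."""
    n = len(A)
    return sum(
        A[i][j]
        for i in range(n)
        for j in range(n)
        if i != j and perm[i] < perm[j]
    )

def lower_bound(A):
    """Mean plus one standard deviation of the score of a random permutation."""
    n = len(A)
    mean = sum(A[i][j] for i in range(n) for j in range(n) if i != j) / 2
    # D is the antisymmetric part: D[i][j] = A[i][j] - A[j][i].
    D = [[A[i][j] - A[j][i] for j in range(n)] for i in range(n)]
    pair_term = sum(D[i][j] ** 2 for i in range(n) for j in range(i + 1, n))
    row_term = sum(sum(row) ** 2 for row in D)
    return mean + math.sqrt((pair_term + row_term) / 12)

# Exhaustive check on small random instances: the best permutation
# should always meet the bound (up to floating point noise).
for _ in range(100):
    n = 5
    A = [[random.random() if i != j else 0.0 for j in range(n)] for i in range(n)]
    best = max(score(A, p) for p in itertools.permutations(range(n)))
    assert best >= lower_bound(A) - 1e-9
```

Some closing notes: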

  1. Yay for MathJax. This is my first post using it.
  2. My optimiser pretty consistently exceeds this bound anyway (it would be sad if it were doing worse than moderately determined random sampling), but I’ve now added a first pass which shuffles until it does (roughly the first sketch after these notes).
  3. I doubt this bound is tight very often. When plotting graphs of the distribution of \(w(\chi)\) (the second sketch after these notes shows the sort of sampling I mean), the result has a distinctly bell-curved shape (although obviously it’s actually discrete) which drops off smoothly rather than cutting off at one standard deviation. However, calculating more about the distribution of random sampling is hard, and I suspect it may not be that useful, so for now I’m leaving it at this.
  4. I have no idea if this is a known result, but it’s pretty trivial to calculate once you have the basic idea, so it’s likely one of those things that has been discovered several times but wasn’t regarded as interesting enough to publish. Yay for blogs.
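For what it's worth, the shuffle-based first pass from note 2 amounts to something like the following sketch. It reuses `score` and `lower_bound` from the earlier snippet, and the attempt cap is arbitrary rather than anything the optimiser actually does:

```python
def shuffle_first_pass(A, max_attempts=10000):
    """Keep the best of repeated random shuffles, stopping once the
    mean-plus-one-SD bound is beaten (or we give up)."""
    n = len(A)
    target = lower_bound(A)
    perm = list(range(n))
    best, best_score = tuple(perm), score(A, perm)
    for _ in range(max_attempts):
        if best_score >= target:
            break
        random.shuffle(perm)
        s = score(A, perm)
        if s > best_score:
            best, best_score = tuple(perm), s
    return best, best_score
```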
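As for note 3, the plots are just Monte Carlo sampling of \(w(\chi)\) followed by a histogram. A minimal sketch, again reusing `score` from above (the rounding is only there to bucket floating point scores):

```python
from collections import Counter

def sample_distribution(A, samples=100000, decimals=6):
    """Empirical distribution of w(chi) for a uniformly random permutation chi."""
    n = len(A)
    perm = list(range(n))
    counts = Counter()
    for _ in range(samples):
        random.shuffle(perm)
        counts[round(score(A, perm), decimals)] += 1
    return counts  # feed this to whatever plotting library you like
```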