Distance-based voting rules

After a long gestation period in which I seemed to be publishing nothing, a few projects have reached maturity. With Benjamin Hadjibeyli, I have a preprint studying so-called distance rationalizable voting rules, which we recently submitted. These are voting rules in which we specify some notion of consensus winner, set up a distance measure on elections, and then choose the winner(s) based on minimizing the distance to a consensus election. This framework has been used by other authors over the years, particularly by Edith Elkind, Piotr Faliszewski and Arkadii Slinko in a series of papers.

We deal with a lot of foundational issues and give some insight into the way Minkowski geometry relates to decisiveness of such rules. This includes both positive and negative results.

This work was inspired by discussions with Elchanan Mossel and Miklos Racz in Berkeley in 2013, about whether the Dodgson rule is a so-called hyperplane rule. That question is still not answered, but now that the underbrush has been cleared, it makes sense to go back to it.

Experimental research

When I was a PhD student, stretching my horizons meant thinking about commutative ring theory instead of general rings. Over my career I have gradually stretched further, taking in mathematical parts of computer science and social choice theory. However, in recent years the stretching has become much larger. In addition to supervising PhD students in network science, my first (joint) work on experimental social science has been uploaded to the world.

In this work, ultimately inspired by a logical model developed by our colleague Patrick Girard and coauthors to describe belief changes in social networks, Valery Pavlov and I have conducted a laboratory experiment with human participants, designed to measure influence and social learning of factual information. A novelty was the way we allowed and incentivized participants to truthfully report “I don’t know” – this seemingly small change has large effects on the dynamics.

Almost everything about this was new to me, but I now feel confident about taking this work much further. Threshold-type diffusion models, as opposed to the infection-type models so common in network science, seem to be much more relevant to this kind of situation. Our work suggests a different model from the usual threshold model.

Who knows what the next decade will bring? Perhaps an art installation or a musical performance? There are still faculties of the university I haven’t been much involved with.

Canadam 2015

Last week I made a flying visit to Canada for this annual meeting (it’s hard to imagine any other type given the distance, but it was particularly short – less than 72 hours in the country). I was invited by Steve Melczer, an impressive PhD student who seems to have read my coauthored book better than anyone alive, and with whom I hope to collaborate this year. Despite the arduous travel, it was a worthwhile experience and should lead to some nice joint publications.

I noticed that many of the buildings at the beautiful campus of the University of Saskatchewan are connected underground, giving a clue to the winter weather. Even though it was early June, the weather was not particularly warm. On my return via Calgary Airport, I noticed that it was 12 degrees and raining at 4pm (in summer, or close enough). The next morning in Auckland at 4am, it was 13 degrees and raining (in winter).

Conferences: SSCW and COMSOC

I recently (late June, but other things got in the way of this report) attended the Society for Social Choice and Welfare meeting in Boston and the Computational Social Choice meeting in Pittsburgh. Here follows a short report.

SSCW was held at Boston College with around 300 attendees.

Positives: The meeting was overall well organized, and the open-air conference “clambake” dinner was a particular highlight. There were several famous speakers from the Boston area, including Amartya Sen and Daron Acemoglu.

Negatives: Holding parallel sessions in different buildings made switching sessions difficult, which was a problem because there were many sessions that clashed (at least for me). The schedule was very tough, with talks starting at 0800 most days, and the lack of accommodation on campus meant that, with travel time, each day was about 11 hours long. Several of the invited speakers were very hard to track down afterwards, and I suspect they only came in for their talk.

COMSOC 2014 was held at Carnegie Mellon University with around 70 attendees. It was nicely hosted by Ariel Procaccia, and the conference dinner among the dinosaurs at the Carnegie Museum of Natural History was even better than the SSCW one. While less gruelling than the week before, this meeting was certainly a test of stamina for me. Carnegie Mellon Computer Science was an impressive place, with the architecture of the Gates building, and the conference auditorium came complete with individually sponsored seats, including some reserved for Manuel Blum and relatives. (How many other university departments worldwide have 3 members of the same family as professors?)

I came away with some new ideas, but not as many as I had hoped, from these two meetings.

Perhaps it is time to rethink conferences of this sort. I would like to see more time for discussion, which probably means reducing the talk length even more. The main purpose of such meetings is to meet people and talk to them, but if you are sitting in talks all the time, this is not easy. Maybe everyone should give a 5-minute (or shorter) talk to advertise, and then have an enormous “poster session” where questions can be asked in detail. Speed dating for researchers has a certain appeal, perhaps.

Paradoxes of runoff voting

The New Zealand Labour party will soon have an election for leader of its Parliamentary caucus. The voting system is a weighted form of instant runoff using the single seat version of Hare’s method (instant runoff/IRV/alternative vote). IRV works as follows. Each voter submits a full preference order of the candidates (I am not sure what happens if a voter doesn’t rank all candidates, but presumably the method can still work). In each round, the candidate with the smallest number of first preferences (the plurality loser) is eliminated and removed from the preference orders, keeping the relative order of the remaining candidates the same. If there is a tie for the plurality loser in a round, this must be broken somehow.
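A minimal, unweighted sketch of the elimination loop just described (the `tiebreak` argument is my own placeholder for whatever tie-breaking procedure the actual rules specify):

```python
from collections import Counter

def irv_winner(ballots, tiebreak=min):
    """Run instant runoff on a list of full preference orders.

    Each ballot is a sequence of candidates, most preferred first.
    `tiebreak` selects which of the tied plurality losers to
    eliminate (here: alphabetically first, purely as a placeholder).
    """
    ballots = [list(b) for b in ballots]
    candidates = set(ballots[0])
    while len(candidates) > 1:
        firsts = Counter(b[0] for b in ballots)
        fewest = min(firsts[c] for c in candidates)
        losers = [c for c in candidates if firsts[c] == fewest]
        eliminated = tiebreak(losers)
        candidates.remove(eliminated)
        for b in ballots:
            b.remove(eliminated)  # later preferences shift up
    return candidates.pop()
```

Weights can be accommodated by repeating a ballot in proportion to its weight, or by replacing the `Counter` with a weighted sum.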

The NZLP variant differs from the above only in that not all voters have the same weight. In fact, the caucus (34 members) has a total weight of 40%, the party members (tens of thousands, presumably) have total weight 40%, and the 6 affiliated trade unions have total weight 20%, the weight being proportional to their size. It is not completely clear to me how the unions vote, but it seems that most of them will give all their weight to a single preference order, decided by union leaders with some level of consultation with members. Thus in effect there are 34 voters each with weight 20/17, 6 with total weight 20, and the rest of the weight (total 40) is distributed equally among tens of thousands of voters. Note that the total weight of the unions is half the total weight of the caucus, which equals the total weight of the individual members.
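A quick sanity check of the weight arithmetic described above, using exact fractions (the variable names are mine):

```python
from fractions import Fraction

# Caucus: 34 MPs share a total weight of 40 equally.
caucus = [Fraction(40, 34)] * 34
# Unions: 6 unions share a total weight of 20, proportional to size.
unions_total = Fraction(20)
# Party members: the remaining weight of 40, split among all members.
members_total = Fraction(40)

assert Fraction(40, 34) == Fraction(20, 17)   # each MP's weight
assert sum(caucus) == 40
assert unions_total * 2 == sum(caucus) == members_total
```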

IRV is known to be susceptible to several paradoxes. Of course essentially all voting rules are, but the particular ones for IRV include the participation paradoxes, which have always seemed to me to be particularly bad. It is possible, for example, for a candidate to win when some of his supporters fail to vote, but lose when they come out to vote for him, without any change in other voters’ behaviour (Positive Participation Paradox). This can’t happen with three candidates, which is the situation we are interested in (we denote the candidates C, J and R). But the Negative Participation Paradox can occur: a losing candidate becomes a winner when new voters ranking him last turn out to vote.

The particular election is interesting because there is no clear front-runner and the three groups of voters apparently have quite different opinions. Recent polling suggests that the unions mostly will vote CJR. In the caucus, more than half have R as first choice, and many apparently have C as last. Less information is available about the party members but it seems likely that C has most first preferences, followed by J and R.

The following scenario on preference orders is consistent with this data: RCJ 25%, RJC 7%, CRJ 10%, CJR 30%, JRC 20%, JCR 8%. In this case, J is eliminated in the first round and R wins over C in the final round by 52% to 48%. Suppose now that instead of abstaining, enough previously unmotivated voters decide to vote JRC (perhaps because of positive media coverage for J and a deep dislike of C). Here “enough” means “more than 4% of the original turnout, but less than 30%”. Then R is eliminated in the first round, and C wins easily over J. So by trying to support J and signal displeasure with C, these extra voters help to achieve a worse outcome than if they had stayed at home.
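The arithmetic can be checked directly. The helper below (a name I made up) runs the two IRV rounds by hand for the three-candidate scenario, with x extra JRC voters measured as a percentage of the original turnout:

```python
def winner_with_extra_jrc(x):
    """Two-round IRV count for the scenario above, plus x extra JRC votes."""
    votes = {"RCJ": 25, "RJC": 7, "CRJ": 10,
             "CJR": 30, "JRC": 20 + x, "JCR": 8}
    # Round 1: eliminate the plurality loser among first preferences.
    firsts = {c: sum(v for b, v in votes.items() if b[0] == c) for c in "RCJ"}
    out = min(firsts, key=firsts.get)
    # Round 2: recount with the eliminated candidate deleted from each ballot.
    rest = [c for c in "RCJ" if c != out]
    finals = {c: sum(v for b, v in votes.items()
                     if b.replace(out, "")[0] == c) for c in rest}
    return max(finals, key=finals.get)
```

With no extra voters, J is eliminated and R beats C 52 to 48; with, say, 10 extra JRC percentage points, R is eliminated and C beats J 65 to 45; beyond 30 extra points, J wins outright.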

The result of the election will be announced within a week, and I may perhaps say more then.

Terry Tao visit

We were privileged to have a visit from Terry Tao today. He gave a nice talk on work with Ben Green on the orchard planting problem. For me even the Sylvester-Gallai theorem was news. His public lecture on the “Cosmic distance ladder” drew a huge crowd. I very much enjoyed the historical discussion of how astronomical distances were measured. It is interesting to think about why Aristarchus’s ideas were rejected by peer review. I have a renewed respect for Kepler and learned that Io orbits Jupiter every 42.5 hours.

He is touring the country as the 2013 Maclaurin Lecturer for the next several days. Definitely worth seeing!

AofA 2013

The invitation-only conference was held in Menorca 27-31 May 2013. I gave a talk there on diagonal asymptotics of combinatorial classes (paper available from my research outputs page). After missing 5 of these meetings in a row, it was good to return. The name of Philippe Flajolet was mentioned many times, and it is clear that this research community still misses him very much.

There were many very interesting talks including the longer invited ones, although the schedule was gruelling with too much time sitting down listening. Highlights for me, in no particular rank order, were:

  • Bob Sedgewick’s talk about his MOOC experiences. He urged us all to give it a try, both as producer and consumer of content.
  • Basile Morcrette showing that generating function methods can work for studying even unbalanced urn models, a nice tribute to the vision of Flajolet.
  • The survey talk of Mihyun Kang on phase transition results in random graphs.
  • Philippe Jacquet on green leader election algorithms (standard methods use too much energy in wireless networks).
  • Michael Drmota on singularity analysis of positive algebraic functions.
  • Konstantinos Panagiotou’s survey of random k-SAT including his recent results with Coja-Oghlan.
  • The excellent organization of Conrado Martinez.

Lowlights: the hotel was isolated and, although it had some good features, was not completely suited to the conference. It was filled with English tourists many of whom, unfortunately, didn’t really mix well with the intellectual nature of the conference and didn’t understand how to use sunscreen. The weather was cool and the beach under attack from jellyfish, which stung at least two conference participants. The talks were held in the piano bar, which had really good seats, but poor acoustics and visibility. The travel to and from Menorca was really arduous, even though I only came from San Francisco.

From the mathematical point of view, there were some interesting topics. The “Algorithms” part of AofA seemed to be even less prominent than in previous years, and this may be a problem in future. A talk by Markus Nebel on Yaroslavskiy’s dual pivot quicksort showed that the old models used since the time of Knuth are not very good at predicting actual performance, and some hard work is desperately needed there. The notion of a tradeoff between accuracy and other performance characteristics versus energy use was mentioned at least twice, and seems a promising approach.

Many community activities are planned. In particular, AofA2014 will be in Paris 16-20 June, with Donald Knuth as the Flajolet memorial lecturer.

Flajolet memorial volume of CPC

The journal Combinatorics, Probability and Computing (obviously, since the publisher is Cambridge University Press, it doesn’t use the “Oxford comma”) will publish a special issue dedicated to Philippe Flajolet, whose early death in 2011 shocked, although perhaps didn’t surprise, the Analysis of Algorithms community.

I have submitted a paper, and judging from one I have been asked to referee, the standard will be high. Philippe’s own research work and encouragement of others were an inspiration to the entire community, and I am sure everyone wants to submit something approaching his standards.