Sunday, April 11, 2010

Stop Publishing!

For the last few months, I have felt that I have an endless queue of reviewing tasks to complete: WWW, followed by DBRank, followed by EC, followed by KDD, followed by VLDB, followed by WebDB, plus an NSF panel, plus some journal reviews; and I have declined invitations for several additional conferences, including SIGIR and SIGMOD. That puts my count at at least 40 reviews over the last 4-5 months. (Just to break even, I will need to submit 10-12 papers.)

Needless to say, carrying such a reviewing load means that I cannot really do a good job reviewing. The quality of my reviews has been declining, a signal that I need to learn to say no.

At the same time, I notice that the other reviews being submitted are not that great either. On the one hand, I feel relieved ("OK, I am not that bad"); on the other hand, I feel that this cannot be good. If nobody has time to review thoroughly, what is the point of peer reviewing?

One solution is to accept fewer PC invitations, allowing more time per paper. However, I know that without people volunteering time and effort for reviewing, the system cannot work: there simply are not that many reviewers available!

Part of the problem is, of course, the increased pressure to publish: for tenure, for getting a job, even for being admitted to a PhD program! I feel that something is wrong when, to be admitted to a PhD program, you already need to have research experience. This ever-increasing demand for publications overloads a reviewing system that unfortunately has limited capacity.

Unfortunately, it is not easy to reverse this trend. The incentives are set up to encourage quantity of publications, preferably in good venues. Once a paper gets accepted at a good venue, the goal is achieved. This encourages papers that are "good enough" to pass the reviewing process, not papers of stellar quality. And with the increased noise in the reviewing process, the distinction between "good enough to be published" and "what the hell, send it, we may get lucky" is getting blurrier and blurrier. In fact, for some of my own papers the reviewers did such a poor job that I never figured out, in the end, whether the paper deserved to be published or I just got lucky.

I did notice a positive development, though! Through the Greek University Reform Forum, I learned that:

The German Research Foundation (DFG) has introduced new guidelines for applications and evaluations of proposals, which will take effect on July 1, 2010.

A rough-and-ready translation of the main points:
  • Applicants should cite in their CV only up to FIVE publications, the ones most relevant to the proposal at hand;
  • In reports about running projects, a maximum of TWO publications PER YEAR; for projects with more than one PI, a maximum of THREE publications PER YEAR.
  • The goal of the new guidelines is to put emphasis on quality instead of quantity and to stop the flood of publishing for the sake of the numbers.

It has caused quite a stir here in Germany, and the voices calling to enforce such rules also for decisions on faculty positions are getting louder.

While some of these ideas are already in place (e.g., NSF also allows only five publications in the CV), the idea of "counting" only two publications per year for each project is definitely a step in the right direction. It will not be trivial to reverse the "get as many publications in top venues as possible" trend, but every step towards de-emphasizing quantity counts.

After all, Pollock was also getting paid by the piece when he worked for the Federal Art Project, but none of his famous paintings come from that period.