The problem with unidimensional research

Yesterday, I listened to some of the BDA’08 talks. One common issue I noticed is that most work is unidimensional. That is, researchers tell us “my technique is 4 times faster”. The trade-off, if any, is not presented. Here is what I would like to hear: “I use twice the RAM, but I am 4 times faster”.

This is a side-effect of our quest for positive results. People hide the trade-offs because the negative points make their papers less likely to be accepted. I believe this is the main reason why practitioners ignore researchers. Researchers paint a severely distorted view of the world.

Reviewers need to wise up and give extra points to researchers who describe the negative and positive points of a method. We need to stop believing that the one ideal technique is just around the corner.

In my experience as a researcher, there is no silver bullet. Impressive techniques often fall flat when viewed from a different angle. What appears deep and thoughtful may sound trivial a few years later once it is well understood. What appears childish may be the best option in most cases.

Paint a rosy picture of your work and I will distrust you. Paint the shadows and I start believing in you.

Published by Daniel Lemire, a computer science professor at the University of Quebec (TELUQ).

3 thoughts on “The problem with unidimensional research”

  1. I totally agree. My favorite papers always have something negative to say about their results.

    I also like totally negative results. If something doesn’t work the way your intuition led you to expect, you can learn much more from it.

  2. This is perhaps the best post I have read on your blog. “I believe this is the main reason why practitioners ignore researchers.” You hit the nail on the head.

    Is there any way to fix this situation? Medical researchers can’t hide negative evidence. Students in all disciplines can be instructed to maintain academic honesty (e.g., to report all joint work on exercises) and they seem to comply. Is there a good way to encourage academic honesty in our community as well? Maybe include a mandatory section in referee reports, titled “criticism/drawbacks of results”?

  3. Nice post. But I wonder if the problem is deeper: it’s not that researchers are hiding the negative evidence, but that they are oblivious to it. I’ve seen information retrieval researchers who don’t consider efficiency an interesting metric, and who therefore see no cost when their improvements lead to much higher computational expense. And I suspect that many of those same researchers also serve as reviewers.
