A “Measure of Transaction Processing” 20 Years Later

I just read an interesting short report by Jim Gray. The gist is that, since 2000, the rate of improvement in computer performance per dollar has slowed. Growth is still exponential, but at a much lower rate. Gray blames memory latency.

As a side note, how fast can you sort 16 GB of data on a typical PC? The answer is about 16 minutes.
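The report does not spell out the benchmark details. Assuming 32-bit keys (see the comment below), here is a rough C++ sketch that times std::sort on a smaller in-memory array and extrapolates linearly to 16 GB; the array size and the extrapolation are my own assumptions, not the report's methodology.

    // Sketch only: time std::sort on 32-bit unsigned keys and extrapolate.
    #include <algorithm>
    #include <chrono>
    #include <cstdint>
    #include <iostream>
    #include <random>
    #include <vector>

    int main() {
      // 64M values = 256 MB of 32-bit keys (arbitrary size that fits in RAM)
      const std::size_t n = 64 * 1024 * 1024;
      std::vector<uint32_t> data(n);
      std::mt19937 gen(42);
      std::uniform_int_distribution<uint32_t> dist;
      for (auto &x : data) x = dist(gen);

      auto start = std::chrono::steady_clock::now();
      std::sort(data.begin(), data.end());
      auto stop = std::chrono::steady_clock::now();
      double seconds = std::chrono::duration<double>(stop - start).count();

      std::cout << "sorted " << n << " 32-bit keys in " << seconds << " s\n";
      // Naive linear extrapolation to 16 GB (4 billion 32-bit keys).
      const double keys_in_16gb = 16.0 * 1024 * 1024 * 1024 / 4.0;
      std::cout << "linear extrapolation to 16 GB: "
                << seconds * (keys_in_16gb / n) / 60.0 << " minutes\n";
      return 0;
    }

Keep in mind that 16 GB would not fit in RAM on a typical PC of that era, so a real run would need an external (disk-based) sort; the extrapolation above is only a lower bound.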

Published by Daniel Lemire, a computer science professor at the University of Quebec (TELUQ).

3 thoughts on “A “Measure of Transaction Processing” 20 Years Later”

  1. Seb: Right. That ought to be specified, but the paper doesn’t say. These are standard tests and there must be strict definitions somewhere, but I would assume they mean sorting 32-bit blocks.
