A “Measure of Transaction Processing” 20 Years Later

I just read an interesting short report by Jim Gray, “A Measure of Transaction Processing 20 Years Later.” The gist is that, since 2000, computer performance per dollar has kept improving exponentially, but at a much slower rate than before. Gray blames memory latency.

As a side note, how fast can you sort 16 GB of data on a typical PC? The answer is about 16 minutes.
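
As a back-of-the-envelope check, 16 GB in 16 minutes works out to roughly 17 MB/s of sorted output (16 × 10⁹ bytes over 960 seconds). For a rough feel of how fast sorting goes when the data fits in memory, here is a minimal benchmark sketch of my own, not from the post or the report: it times std::sort on random 32-bit values at a scaled-down size, since a true 16 GB sort on a typical PC would be disk-bound and handled by an external sort.

    // Benchmark sketch (illustrative, not from the post or Gray's report):
    // sort a scaled-down array of random 32-bit values and report throughput.
    #include <algorithm>
    #include <chrono>
    #include <cstddef>
    #include <cstdint>
    #include <iostream>
    #include <random>
    #include <vector>

    int main() {
        // 100 million 32-bit values = 400 MB; adjust to taste.
        const std::size_t n = 100000000;
        std::vector<std::uint32_t> data(n);

        // Fill with pseudo-random 32-bit values.
        std::mt19937 gen(42);
        std::uniform_int_distribution<std::uint32_t> dist;
        for (auto &x : data) x = dist(gen);

        auto start = std::chrono::steady_clock::now();
        std::sort(data.begin(), data.end());
        auto stop = std::chrono::steady_clock::now();

        double secs = std::chrono::duration<double>(stop - start).count();
        double gb = double(n) * sizeof(std::uint32_t) / 1e9;
        std::cout << "sorted " << gb << " GB in " << secs << " s ("
                  << gb / secs << " GB/s)" << std::endl;
        return 0;
    }

Extrapolating an in-memory figure like this to 16 GB understates the true cost: once the data exceeds RAM, disk I/O dominates, which is consistent with Gray's point about the memory hierarchy being the bottleneck.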

Published by Daniel Lemire, a computer science professor at the University of Quebec (TELUQ).

3 thoughts on “A ‘Measure of Transaction Processing’ 20 Years Later”

  1. Seb: Right. That ought to be specified, but the paper doesn’t say. These are standard tests and there must be strict definitions somewhere, but I would assume that they mean sorting 32-bit blocks.
