A “Measure of Transaction Processing” 20 Years Later

I just read an interesting short report by Jim Gray. The gist is that, since 2000, computer performance per dollar has kept growing exponentially, but at a much lower rate than before. Gray blames memory latency.

As a side note: how fast can you sort 16 GB of data on a typical PC? The answer is about 16 minutes.
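A quick way to sanity-check a figure like that is to time an in-memory sort of a much smaller array and extrapolate with an n log n cost model. This is only a sketch: the array size, the 32-bit record assumption, and the cost model are mine, not the report's, and a naive extrapolation ignores the cache and paging effects that dominate at 16 GB.

```python
import math
import random
import time

# Time a sort of 1M random 32-bit integers (about 4 MB of payload).
n_small = 1_000_000
data = [random.getrandbits(32) for _ in range(n_small)]

t0 = time.perf_counter()
data.sort()
elapsed = time.perf_counter() - t0

# Extrapolate to 16 GB of 32-bit records using the comparison-sort
# n log n model. This deliberately ignores memory-hierarchy effects,
# which is precisely where the real time goes at that scale.
n_big = 16 * 2**30 // 4
scale = (n_big * math.log2(n_big)) / (n_small * math.log2(n_small))
estimate_min = elapsed * scale / 60

print(f"sorted {n_small:,} ints in {elapsed:.2f}s; "
      f"naive extrapolation to 16 GB: ~{estimate_min:.0f} min")
```

The gap between this optimistic extrapolation and the observed ~16 minutes is one way to see Gray's point about memory latency.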

3 thoughts on “A “Measure of Transaction Processing” 20 Years Later”

  1. Seb: Right. That ought to be specified, but the paper doesn’t say. These are standard tests, so there must be strict definitions somewhere, but I would assume they mean sorting 32-bit blocks.
