If human population grew at the pace of computer storage…

Between 1990 and 2010, the cost of one megabyte of disk storage went from $9 to $0.00015, a factor of 60,000. Had the human population grown by the same factor, there would be about 300 trillion people on Earth. (For simplicity, I am not taking inflation into account. It would make the result even more impressive.)

Between 1990 and 2010, the cost of one megabyte of RAM went from $110 to $0.01, a factor of 11,000. Had the population grown by the same factor, there would be about 55 trillion people on Earth. (Again, I am not taking inflation into account.)
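
For concreteness, here is the arithmetic as a short Python sketch. The prices per megabyte come from the sources listed below; the 1990 world population baseline of roughly 5.3 billion is my assumption, and the exact figure only shifts the totals slightly.

    pop_1990 = 5.3e9  # assumed world population in 1990 (roughly 5.3 billion)

    # Price of one megabyte, 1990 vs. 2010, from the sources listed below.
    for name, cost_1990, cost_2010 in [("disk", 9.0, 0.00015), ("RAM", 110.0, 0.01)]:
        factor = cost_1990 / cost_2010  # how many times cheaper a megabyte became
        print(f"{name}: {factor:,.0f}x cheaper -> {pop_1990 * factor:.2e} people")

    # disk: 60,000x cheaper -> 3.18e+14 people (about 300 trillion)
    # RAM: 11,000x cheaper -> 5.83e+13 people (about 55 trillion)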

Many information systems have storage costs that are proportional to the number of individuals; I call them sapien-bound systems. They include most employee, customer, and student databases, as well as blog engines and email systems. Soon, all sapien-bound systems will fit in RAM cheaply, as the sketch below suggests.
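
To see why, consider a hypothetical sapien-bound system: the customer count and record size below are illustrative assumptions, with RAM at the 2010 price of $0.01 per megabyte.

    # Hypothetical sapien-bound system: a customer database held entirely in RAM.
    customers = 10_000_000      # assumed: 10 million customers
    bytes_per_customer = 1_000  # assumed: 1 KB per record
    ram_cost_per_mb = 0.01      # 2010 RAM price from the figures above

    size_mb = customers * bytes_per_customer / 1e6
    cost = size_mb * ram_cost_per_mb
    print(f"{size_mb:,.0f} MB of data, roughly ${cost:,.0f} of RAM")
    # 10,000 MB of data, roughly $100 of RAM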

Further reading: Examples and definition of machine-generated data by Curt Monash. You may also want to check my older post What is infinite storage?

Sources: Cost of Hard Drive Storage Space, Memory Prices (1957-2010), and the World population article on Wikipedia.

4 thoughts on “If human population grew at the pace of computer storage…”

  1. @Mugizi

    By extrapolation, within 20 years, there will be cheap computers with more than 700 TB of RAM. So not only will you be able to store the data, you’ll also be able to process it very quickly.

  2. This is a great point that I have often thought about:

    Soon it will be possible (and arguably it already is) to store DETAILED INFORMATION about every human being alive on an average computer.

    Let’s say 7 billion people and 100,000 bytes each for biographical data, employment history, and a small picture; that comes to 700 terabytes. If you just look at Americans (300 million), it’s just 30 terabytes. Very feasible (see the sketch after the comments).

    What are the implications going to be when storing and accessing such vast personal information is much easier than it is now? What are governments and corporations going to do with it?

    I guess we’re going to find out.

  3. @Daniel, you know that the “processing it quickly” part is the rub. Fact is, processors and memory bandwidth are not getting faster at the same pace that memory is growing. So memory will become free, and all our money will go into backplanes, CPUs and support chips, and power supplies.

    And nobody knows how to effectively use large numbers of cores for problems that aren’t trivially parallelizable, at least not without months or years of work on each individual problem.

    Question: is the web a sapien-bound system?

  4. @Mike

    It does look like Google can keep indexing a good fraction of the Web and keep the index in RAM. So some important part of the Web is sapien-bound(ed).

    It would be very interesting to do a more scholarly analysis of what can be considered sapien-bound(ed)… with hard numbers and so on… Are you interested?

    I share your concerns regarding our limitations, but we have many good years to go. Having 1024 cores per main CPU will certainly help… though you have to worry about heat and power.
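
To check the arithmetic in the comments above, here is a quick sketch. The 100,000 bytes per person is the commenter’s figure; the 64 GB present-day machine used in the extrapolation is my own assumption.

    bytes_per_person = 100_000  # the commenter's estimate per individual

    print(7_000_000_000 * bytes_per_person / 1e12, "TB for everyone")  # 700.0
    print(300_000_000 * bytes_per_person / 1e12, "TB for Americans")   # 30.0

    # RAM got ~11,000x cheaper per 20 years (1990-2010). A 64 GB machine
    # today (an assumption) scales to about 700 TB in 20 years:
    print(64e9 * 11_000 / 1e12, "TB")  # 704.0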
