Pac-Man running at 1 million frames per second

In What Technology Wants, Kevin Kelly argued that technology is on an evolutionary path. In some real sense, technology is alive and growing. It seeks to improve itself at an ever faster rate. And it seems that technology currently loves software.

The difference between a recent processor and a processor from 10 years ago is roughly a 10x speedup. Yet it is not like Google is the same as it was, except 10x faster. Without better software, our video games would be like old-school Pac-Man, except that they would run at 1 million frames per second. If that sounds ridiculous, that’s because it is: software has improved a thousandfold.

I don’t think we are going to see an evil artificial intelligence trying to enslave us in the coming decades. However, we are going to get seriously insane software. It is not going to look like what we have today, only faster. It is probably unimaginable right now.

8 thoughts on “Pac-Man running at 1 million frames per second”

  1. Perhaps incidentally, for the research I do it is vitally important to be able to run Pac-Man (and similar games) at a million frames per second (or faster). This is because we need to run the game tens of thousands of times to learn to play it, or even run thousands of simulations of the whole game each frame in order to figure out which action to take next. In general, modern AI techniques for games require insane amounts of simulation, which is why it’s great for me that it’s possible to run these old games so much faster now.

    Something similar can be done with other types of software, not just games. When the basic functionality is so simple that it could be run a million times faster than realtime, we can run lots of simulations of how it will behave under various conditions. We can learn, predict, adapt.

    And this is one way in which old software can give rise to qualitatively different new software. By running a million times faster.
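The simulation-heavy approach described in the comment above can be sketched in a few lines. This is a toy illustration, not actual Pac-Man code: `simulate` is a hypothetical stand-in for a fast game emulator, and the scoring rule is invented for the example.

```python
import random

def simulate(state, action, steps=50, rng=random):
    """Toy stand-in for a fast game simulator: apply `action`, then take
    random steps, and return the final score. A real system would call a
    high-speed emulator of the actual game here."""
    score = state + action  # the action's effect, invented for this example
    for _ in range(steps):
        score += rng.choice([-1, 0, 1])  # random noise from the rest of the game
    return score

def pick_action(state, actions, rollouts=1000):
    """Monte Carlo action selection: run many cheap simulations per candidate
    action and keep the one with the best average outcome."""
    best_action, best_avg = None, float("-inf")
    for a in actions:
        avg = sum(simulate(state, a) for _ in range(rollouts)) / rollouts
        if avg > best_avg:
            best_action, best_avg = a, avg
    return best_action

# With a deterministic +2 bonus for action 2, the rollouts favor it.
print(pick_action(state=0, actions=[0, 2]))
```

Real systems (Monte Carlo tree search, for instance) are far more elaborate, but the principle is the same: thousands of cheap rollouts per decision, which is only practical when the game runs orders of magnitude faster than realtime.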

  2. You almost touch on something quite interesting to me as a programmer with a Comp.Sci. background. It’s that despite the hardware speed improvements, user interfaces still operate at a rate that is within our ability to perceive the operation. We often must wait for something to “open”, “start” or “complete”; sometimes the periods are so long we wonder if it’s working. Wouldn’t it be incredible if every interaction with software occurred within a blink of the eye?

    1. Wouldn’t it be incredible if every interaction with software occurred within a blink of the eye?

      We have working virtual reality coming in a few months on PlayStations. A PS4 is hardly a super powerful machine. Yet for virtual reality to work, everything must happen faster than the blink of an eye. So I am curious what you mean by “user interfaces still operate at a rate that is within our ability to perceive the operation”.

  3. Well, for example, when a desktop app’s button is pressed to perform some operation or action, we expect there to be some response within a second. For a web link, we are more forgiving if the response comes within 10 to 20 seconds. The thing is, no matter how fast our technology becomes, the response times of “today’s” software seem to remain within our ability to notice time elapsing. As the million-fps game demonstrates, older software on newer hardware can run at speeds well beyond our ability to process. I wonder if we’ll ever get to the point where every interaction or query for information might occur instantaneously (as we perceive it).

    1. I think we have had instantaneous response at the press of a button for quite some time.

      Let us set, somewhat arbitrarily, the latency threshold we can perceive at 20 ms. We can argue about what the right number is… but it does not change the argument. For a computer, that’s at least 20 million cycles. Our computers are multicore, vectorized and superscalar… There is just very little that your CPU cannot do within this delay. Admittedly, you can easily waste thousands of cycles in various latencies… but you have millions of them to spare!
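The arithmetic in the comment above is easy to check. The 20 ms threshold and the 1 GHz clock below are the commenter's assumed figures (modern cores run faster, so the budget is if anything larger):

```python
THRESHOLD_S = 0.020       # assumed perceptible-latency threshold: 20 ms
CLOCK_HZ = 1_000_000_000  # conservative 1 GHz clock; modern chips exceed this

budget_cycles = THRESHOLD_S * CLOCK_HZ
wasted = 50_000  # suppose sloppy code burns 50,000 cycles on overhead

print(f"{budget_cycles:,.0f} cycles in the perception window")  # 20,000,000
print(f"{wasted / budget_cycles:.2%} of the budget wasted")     # 0.25%
```

Even wasting tens of thousands of cycles consumes a fraction of a percent of the cycles available before a user could notice.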

      Developers do take shortcuts that force high delays at times, for example when applications start or when they shut down. But they work on the assumption that these delays are not disruptive.

      Network latency is something else. Because of the speed of light, you cannot expect much less than 50 ms of latency over long distances: information simply cannot travel any faster. And though 50 ms is small, it is probably already noticeable. This means that we probably have no chance of ever having instantaneous access to all the servers in the world… Physics gets in the way. Of course, we can cheat… and give the illusion that we can… but I don’t think we can achieve the real thing.
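The 50 ms figure can be checked with a back-of-the-envelope calculation. The distance below is an illustrative assumption; light in optical fiber travels at roughly two thirds of its vacuum speed, and real routes add routing and switching delays on top of this floor:

```python
C_VACUUM = 299_792_458       # speed of light in vacuum, m/s
C_FIBER = C_VACUUM * 2 / 3   # approximate speed of light in optical fiber

def round_trip_ms(distance_km):
    """Minimum round-trip time over fiber, ignoring routing and switching."""
    return 2 * distance_km * 1000 / C_FIBER * 1000

# New York to Paris is roughly 5,800 km as the crow flies (approximate figure)
print(f"{round_trip_ms(5800):.0f} ms")  # ~58 ms
```

So a transatlantic round trip is already near 50 ms before any software runs at all, which is why no amount of faster hardware can make remote interactions truly instantaneous.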

      1. Daniel, I think you are not understanding Terry’s point (either that or I am).

        To clarify as an example: If I am on my computer, and I launch Microsoft Word, there is a small, but noticeable delay between when I clicked and when the Word application is visible on the screen. I believe Terry’s point is that despite vast improvements in hardware over the years, we still see this small delay. The program is not before our eyes the instant we click the icon.

        Part of this, I believe, is because while our hardware has improved, our software has also increased in complexity, requiring more time to be copied from the hard drive to RAM for use, and more time to process the increased number of lines of code.

        1. If I am on my computer, and I launch Microsoft Word, there is a small, but noticeable delay between when I clicked and when the Word application is visible on the screen.

          Launching big applications does take some time, true.

          However, on a fast PC, every interaction with Microsoft Word ought to be instantaneous after that.
