Most people have a mental model of computation based on the Turing machine. The computer does one operation at a time. For example, maybe it adds two numbers and outputs the result.

In truth, most modern processor cores are superscalar. They execute several instructions per CPU cycle (e.g., 4 instructions). That is above and beyond the fact that many processors have several cores.

Programmers should care about superscalarity because it impacts performance significantly. For example, consider an array of integers. You can compute the gaps between the integers, y[i+1]=x[i+1]-x[i], faster than you can recover the original values from the gaps, x[i+1]=y[i+1]+x[i]. That is because the processor can compute several gaps at once whereas it needs to recover the values in sequence (e.g., x[i] before x[i+1]).
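To make the dependency difference concrete, here is a minimal C++ sketch (the function names are mine). In the first loop, each iteration is independent, so a superscalar processor can work on several gaps at once; in the second, each iteration needs the previous result, so the additions happen one after the other.

```cpp
#include <cstddef>
#include <vector>

// Compute gaps: iterations are independent, so the processor
// can execute several of them per cycle.
std::vector<int> compute_gaps(const std::vector<int> &x) {
    std::vector<int> y(x.size());
    if (x.empty()) return y;
    y[0] = x[0];
    for (std::size_t i = 1; i < x.size(); ++i)
        y[i] = x[i] - x[i - 1];
    return y;
}

// Recover values: each iteration depends on the previous one,
// forcing the additions into a sequential chain.
std::vector<int> recover_values(const std::vector<int> &y) {
    std::vector<int> x(y.size());
    if (y.empty()) return x;
    x[0] = y[0];
    for (std::size_t i = 1; i < y.size(); ++i)
        x[i] = y[i] + x[i - 1];
    return x;
}
```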

Superscalar execution is truly a wonderful piece of technology. It is amazing that our processors can reorder and regroup instructions without causing any bugs. And though you should be aware of it, it is mostly transparent: there is no need to rewrite your code to benefit from it.

There is another great modern feature that programmers need to be aware of: most modern processors support SIMD instructions. Instead of, say, adding two numbers, they can add two vectors of integers together. Recent Intel processors can add eight 32-bit integers using one instruction (vpaddd).

It is even better than it sounds: SIMD instructions are superscalar too… so your processor could possibly add, say, sixteen 32-bit integers in one CPU cycle by executing two instructions at once. And it might squeeze a couple of other instructions into the same CPU cycle!

Vectorization is handy to process images, graphics, arrays of data, and so on. However, unlike superscalar execution, vectorization does not come for free. The processor will not vectorize the computation for you. Thankfully, compilers and interpreters do their best to leverage SIMD instructions.
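For instance, a loop like the following one is a typical candidate for compiler auto-vectorization: with optimizations enabled (e.g., -O2 or -O3 in GCC or Clang), mainstream compilers will usually emit SIMD additions such as vpaddd for it on their own. The function name is mine.

```cpp
#include <cstddef>

// Independent element-wise additions: a prime candidate for
// compiler auto-vectorization.
void add_arrays(const int *a, const int *b, int *out, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        out[i] = a[i] + b[i];
}
```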

However, we are not yet at the point where compilers will rewrite your algorithms for you. If your algorithm does not take vectorization into account, it may not be possible for the compiler to help you in this regard.

An important problem when working with databases or search engines is the computation of the intersection between sorted arrays. For example, given {1, 2, 10, 32} and {2, 3, 32}, you want {2, 32}.

If you assume that you are interested in arrays having about the same length, there are clever SIMD algorithms to compute the intersection. Ilya Katsov describes an elegant approach for 32-bit integers. If your integers fit in 16 bits, Schlegel et al. have similar algorithms using special string comparison functions available on Intel processors.

These algorithms are efficient, as long as the two input arrays have similar length… But life is not so easy. In many typical applications, you frequently need to compute the intersection between arrays having vastly different lengths. Maybe one array contains a hundred integers and the other one thousand. In such cases, you should fall back on a standard intersection algorithm based on a binary search (a technique sometimes called “galloping”).
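The standard approach can be sketched in a few lines of C++: search for each element of the short array inside the long one. (True "galloping" uses an exponential search that resumes from the last match; I use std::binary_search here to keep the sketch short, and the function name is mine.)

```cpp
#include <algorithm>
#include <vector>

// Intersect two sorted arrays when one is much shorter than the
// other: look up each element of the short array in the long one.
std::vector<int> intersect_small_large(const std::vector<int> &small_arr,
                                       const std::vector<int> &large_arr) {
    std::vector<int> out;
    for (int v : small_arr)
        if (std::binary_search(large_arr.begin(), large_arr.end(), v))
            out.push_back(v);
    return out;
}
```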

Or should you fall back? In a recent paper, SIMD Compression and the Intersection of Sorted Integers, we demonstrate the power of a very simple idea to design better intersection algorithms. Suppose that you are given the number 5 and you want to know whether it appears in the list {1,2,4,6,7,8,15,16}. You can try to do it by binary search, or do a sequential scan… or better yet, you can do it with a simple vectorized algorithm:

  • First represent your single number as a vector made entirely of this value: 5 becomes {5,5,5,5,5,5,5,5}. Intel processors can do this operation very quickly with one instruction.
  • Compare the two vectors {5,5,5,5,5,5,5,5} and {1,2,4,6,7,8,15,16} using one instruction. That is, you can check eight equalities at once cheaply. In this instance, I would get {false,false,false,false,false,false,false,false}. It remains to check whether the resulting vector contains a true value, which can be done using yet another instruction.
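Here is a portable scalar sketch of those steps in C++; on an AVX2 machine each commented step maps to a single instruction (the intrinsic names in the comments are my assumption about how one would typically write it, and the function name is mine).

```cpp
#include <cstdint>

// Scalar sketch of the vectorized membership test. On AVX2:
//   step 1 (broadcast) -> _mm256_set1_epi32  (vpbroadcastd)
//   step 2 (compare)   -> _mm256_cmpeq_epi32 (vpcmpeqd)
//   step 3 (any true?) -> _mm256_movemask_epi8 (vpmovmskb) != 0
bool contains8(const int32_t block[8], int32_t key) {
    int32_t broadcast[8];
    for (int i = 0; i < 8; ++i)   // step 1: replicate the key
        broadcast[i] = key;
    uint32_t mask = 0;
    for (int i = 0; i < 8; ++i)   // step 2: eight equality tests
        mask |= (broadcast[i] == block[i]) ? (1u << i) : 0u;
    return mask != 0;             // step 3: any match?
}
```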

With this simple idea, we can accelerate a range of intersection algorithms with SIMD instructions. In our paper, we show that, on practical and realistic problems, you can double the speed of the state-of-the-art.

To learn more, you can grab our paper and check out our C++ code.

Reference:

  • Daniel Lemire, Nathan Kurz, Leonid Boytsov, SIMD Compression and the Intersection of Sorted Integers, Software: Practice and Experience, 2015. (arXiv:1401.6399)
  • Further reading: Efficient Intersections of compressed posting lists thanks to SIMD instructions by Leonid Boytsov.

    As a college student, I was convinced that the most important part of science and engineering was to have good and original ideas. If you contemplate Einstein, it is not hard to come to this conclusion… The man had a never-ending stream of great ideas. Notice however that we never read Einstein himself, or examine his workbooks… instead, we just hear about his ideas.

    The same might be said about Google. Brin and Page came up with the idea of using hyperlinks to rank web pages. And they went on to become billionaires, just like that. We rarely talk about how the then-dominant player (AltaVista) failed to keep its index updated for months and tried hard to milk every penny out of visitors by plastering its web site with ads… while Google just offered a working search engine kept up-to-date.

    Matt Might, a computer science professor, recently wrote that “the limiting reagent for scientific progress is good ideas”.

    Matt presents the scientific model in a way that is familiar to many, but contrary to my own experience. One imagines scientists idle, waiting for good ideas to come to them. Maybe we can imagine Einstein who sits by his window receiving all these great ideas, as if they were telegraphed from an Oracle directly to his brain. And then, once the idea hits, it is all over in an instant. Einstein grabs his pen and writes the paper in a minute. Brin and Page imagine PageRank and then build their search engine in a weekend, taking the world by storm.

    When I supervise graduate students, I often see them struggling with this model. They have ideas, but they do not know which ones are good. Some of them choose to wait… maybe a great idea will come eventually? It never does.

    My model is different. Instead of imagining Einstein as someone receiving these deep insights magically, I imagine an obsessive thinker who keeps asking… “yes, but what does it mean?”… or “yes, but how does it work?”. Whereas others are satisfied with recopying the accepted answers, or accepting the unknowns, he redoes the work, more carefully. I imagine Brin and Page not as people who surfed on a brilliant idea… but rather as people who took a problem that might have been considered solved, or too technical, that is “how do you build a search engine”… and they revisited it, putting a lot more care into it.

    When I look around, I see a deeply flawed world with thousands of superior ideas waiting to be tried out. I am literally never running out of good ideas and I doubt others are. I am not exceptionally smart, but give me any interesting problem, in science or engineering, and I am sure I can come up with five approachable questions that have not been answered yet. In fact, I think that almost anybody can do this… Give me a few years to become really knowledgeable, and I can probably prototype a solution in a few weeks.

    I have two young boys. Almost every day they come up with a science question that I cannot answer. Once a week, they come up with a question that I cannot answer with Wikipedia. Many of these questions are genuinely interesting and could make research projects of their own.

    What is genuinely scarce is interest and motivation.

    Let me put it another way. Imagine that I can go back in time to 1996. I know that the next critical piece of infrastructure is going to be a search engine. I know this, but there are already well entrenched leaders. I am in Montreal, and nobody is going to fund me, or join me to work on this problem… So I have to move to Silicon Valley where I will live in my car for a time. Then I have to find the resources to build the new Google. It is really hard work. It is also very frustrating work because I know that Brin and Page are around… and maybe they will still come out on top.

    Or maybe I could go back to 1900. I basically have a major in Physics from the 1990s. I know a lot of Physics that even Einstein cannot yet imagine. I could find a job to feed myself while I hack away at brilliant research papers. Or maybe I could try to rewrite one of Stephen King’s famous novels and get it published before he could… by going back in time.

    Would I succeed? Maybe I would, maybe I would get discouraged. What is clear is that, even with perfect hindsight, success would be far from easy. Even with a perfect knowledge of all the great ideas… success would be really hard work.

    Experienced software programmers know that, often, it is just as hard to understand someone’s code as to rewrite it from scratch. There are many ways to re-express this same idea. Some might say that execution is everything. But I think that the truth is even harsher: the concept of good idea is ill-defined. In hindsight, we try to explain success as “having a good idea”, but even if you had received this “good idea” on a silver platter, you might not have done anything with it in the end.

    If you imagine the world as being two-dimensional, then there are many dead-ends, and few correct paths. The truth is that there are far more than two dimensions. You are not stuck in a dead-end. You are not out of good ideas. You are just bored.

    I have two young boys and I have decided to pay attention to what they are learning in school. Beside basic writing and reading skills, mathematics feels like the next most important academic topic. I realize that relatively little has changed in 30 years regarding how mathematics is taught.

    As a young kid, I liked mathematics but it became less enjoyable in high school. After a time, I learned to love algebra as it made it possible to solve difficult word problems more easily. To this day, I still do algebra the way I was initially taught, and I often get the right answers.

    Later in my schooling, I came across the quadratic formula. That was quite a disappointment. I will not even reproduce it here because it is an ugly hack. Algebra all made sense up to this point, and then we had to memorize a complicated formula. I was not impressed, and many other students complained about it too. I attended a Catholic school at the time, and I remember that the teacher told us to take it on faith.

    Relying on faith to do mathematics did not sit well with me. Maybe you can guess that I never did memorize the quadratic formula. Indeed, it is almost trivial and much more satisfying to derive it from first principles. Our teacher was kind enough to rush through the derivation on the blackboard once: it was too fast for any of us to understand, but sufficient for me to figure it out on my own later. How could he derive it at first and then ask us to take it on faith? I guess he thought that having us learn to figure it out on our own would take too long.

    I would argue that there is little sense in having people who cannot derive the quadratic formula, memorize it. If they cannot derive it, it is very unlikely that they will grow up to become adults who use algebra in a non-trivial manner. And if they are never going to do algebra, why would they need the quadratic formula?

    I think that most people at ease with algebra can derive the quadratic formula quickly. The first thing to note is that there is no reason to find the roots of ax² + bx + c directly. You should divide throughout by a.

    So, really, all you need to do is to figure out the roots of x² + bx + c. It is good enough. It is already a more elegant problem.

    Then you just need one more trick… it is called “completing the square” and it says that x² + bx = (x + b/2)² − b²/4. Of course, there is no need to memorize the completion of the square formula…

    From this, solving for the quadratic formula is easy… you start from…

    x² + bx = -c

    and you complete the square…

    (x + b/2)² = -c + b²/4

    Take the square root of both sides, and you get x = -b/2 ± √(b²/4 - c). And that is all!
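    If you want to check the derivation numerically, a few lines of C++ suffice (the function name is mine). Solving the completed square gives x = -b/2 ± √(b²/4 - c), assuming b²/4 - c ≥ 0 (real roots):

```cpp
#include <cmath>

// Roots of x^2 + bx + c, by completing the square:
// (x + b/2)^2 = -c + b^2/4  =>  x = -b/2 ± sqrt(b^2/4 - c).
// Assumes b^2/4 - c >= 0, i.e., real roots.
double quadratic_root(double b, double c, int sign) {
    return -b / 2.0 + sign * std::sqrt(b * b / 4.0 - c);
}
```

    For example, x² - 3x + 2 (b = -3, c = 2) yields the roots 1 and 2.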

    Admittedly, it can take me longer to solve for the roots of a quadratic polynomial using this derivation than someone who has memorized the formula. I would guess that it takes me an extra 5 seconds. But how often do you have to solve for the roots of such a polynomial without the assistance of a computer?

    I will also reiterate my argument against memorizing multiplication tables… Reasoning out that 6 times 8 is 48 is more important than the fact itself. Similarly, figuring out the binomial formula from first principles might help you approach a wider range of problems with confidence…

    To be fair to contemporary education, the American Common Core seems to recommend doing exactly as I did as a kid regarding the quadratic formula.

    Of course, the quadratic formula was only the first of many disappointments to come. Next we learned trigonometry, and then we had to suffer through analytic geometry…

    The pattern would repeat itself endlessly: “You have to learn that the square of the secant minus the square of the tangent is 1 [replace by your favorite piece of trivia].” Then I would find a way to “route” around the problem. For example, in trigonometry, I eventually figured out that most things could be derived from the fact that the square of the sine plus the square of the cosine is 1. That fact itself was intuitive enough if you could remember that the cosine and sine give you the coordinates of a point on the unit circle given an angle. In fact, almost all of trigonometry can be derived from this simple observation.
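    For instance, the secant–tangent identity mentioned above falls out in one line: divide both sides of the Pythagorean identity by the square of the cosine.

```latex
\sin^2\theta + \cos^2\theta = 1
\;\xrightarrow{\;\div\,\cos^2\theta\;}\;
\tan^2\theta + 1 = \sec^2\theta
\;\Longleftrightarrow\;
\sec^2\theta - \tan^2\theta = 1
```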

    We still insist on presenting mathematics as a collection of definitions, facts and routines to be memorized. Thankfully, the emphasis is somewhat lesser than in my days… but if you scratch beyond the surface, too little has changed.

    Evidently, the purpose of such mathematics in schools is primarily to rank students. It is not very different from the Imperial examination of the Song dynasty. We are teaching to the test because only the test results matter… the test only serves to separate the “good students” from the bad… under a pretence of “merit”.

    And then, wait for it, we find out that many of our graduates lack skills that would make them employable! It does not matter because employers only look for degrees and diplomas… except when they do not or cannot.

    My boys love to play Minecraft. They build all sorts of crazy devices. By my estimation, this is probably a better preparation for the “real world” they will face in 10 years than the math that they are learning in school. Sadly, I am quite serious.

    Further reading: Several people have pointed out that many of my comments are related to a Mathematician’s Lament by Lockhart.

    When I write that we should focus on teaching useful skills, and avoid deliberate rote memorization… I get far more opposition than I expect. To my surprise, there is an abundant supply of teachers and parents openly supporting rote memorization and antiquated skills.

    • When I object to having kids being drilled in doing complicated but useless operations like the long division… I am told that “you need to do it the long way in your early education so you understand what’s going on under the hood.”

      Take a pen and some paper and compute 123.422 divided by 12.4. This is a real problem that grade 6 or grade 7 kids are being asked to solve. If you are anything like me, you will probably pause and realize that you have forgotten how to do it. Of course, it will come back… but my point is that it is a dead skill that only a tiny minority of adults use. And most of these adults are teachers.

      If we are trying to teach what is under the hood… what is the hood?

      How processors do divisions is not quite like pen-and-paper long division. In fact, most mathematicians have no idea how a processor does a division. Though it is an interesting process that might be worth teaching to some high school kids… it never gets taught…

      Let me repeat that: these educators telling you that kids need to know “what is under the hood” most probably could not design a circuit to compute a division without access to the Internet.

      “Knowing what is under the hood” is overrated. Honestly, I barely know what is under the hood of my car. I do not need to. And even though I have been programming for over 30 years, and I am a published computer scientist… there is much that I do not know or understand about modern microprocessor technology. At a lower level, I know a bit about circuits, but I do not really understand how a transistor works. It is not that I could not understand… it just never proved useful. I could look it up on Wikipedia, I guess, but I am not even motivated to do that.

      How come millions of people use computers, but they do not know how transistors work? Well, because knowing how transistors work is not useful to them. Useful facts are relative: to a few people in the world, knowing about the inner workings of transistors is really important, but for most of us, it is irrelevant knowledge. Irrelevant knowledge does not automatically become useful by some magical process.

      So what makes us so certain that all kids worldwide, millions of them, need to know how to divide 12.45 by 45.1 by hand using pen and paper…?

      For a long time, in Europe and North America, privileged kids had to learn Latin. The intuition was that since European languages were derived from Latin, learning Latin made you better at these languages. Though this is true, the effect is small. You are much better off learning useful skills (like learning Spanish or German or French) than learning Latin. And that is why few kids learn Latin today.

      Just like we stopped teaching Latin, we should stop teaching long division and related antiquities.

      As far as I can tell, kids are probably better off playing Minecraft.

    • When I object to rote memorization for its own sake, people tell me that what is being memorized is useful. This, somehow, justifies forcing kids into rote memorization routines.

      Let me restate my objection: rote memorization on its own is theory divorced from practice. If you work a lot with numbers, you will learn to multiply and divide them reasonably quickly.

      Deliberate rote memorization is an attempt to take a shortcut in the learning process… Instead of having people learn important facts by themselves through practice, we decide once and for all what the important facts are, we delay practice, and start with the memorization of the “important facts”.

      And, of all the important facts we could teach kids, we decide that 6 times 9 is one of them. In this world of ubiquitous computers, we have somehow concluded that doing arithmetic super quickly in one’s head is critical.

      Let me repeat that I have never memorized what 6 times 9 is… instead, I do 6 times 10 is 60, minus 6 is 54. It takes me a few extra milliseconds I suppose… but my approach scales… to do 15 times 9… I do 10 times 9 is 90, plus half that is 135…

      In a country like Canada where we are supposed to leave 15% tips, I am often amazed at how few people can compute the tip quickly. I suppose that is because the multiplication tables only go up to 12… but why 12 and not 15 or 25 or 50? Of course, multiplying by 15 is trivial once you stop trying to memorize and just apply simple algorithms. Maybe more people could do it if we stressed computational thinking instead of rote memorization.

      I pity the kids who were forced by their parents into memorizing these tables… I will not discourage my kids from using algorithms to solve problems.

      Is it any wonder that kids worldwide conclude that mathematics is boring?

      And what kind of model of learning does this rely upon? That the brain is some sort of crude data bank to be filled?

      Our brain memorizes many important facts because they are useful to us. I know the names of my kids. I can also recall when they were born. I know two natural languages: French and English. I know about the fundamental theorem of algebra. I know about Galois fields. I know how to make excellent bread. I know the exact recipe to make good yogourt. My two sons appear to know a lot about Minecraft. But none of this memorization is deliberate rote memorization. Rather, it is natural or organic memorization… by repeated practice, you memorize important facts. Babies do not learn languages by deliberate rote memorization.

      Yes, learning mathematics is hard (for everyone!), but it simply requires practice. Nobody ever became a great mathematician, or even a good one, by relying on rote memorization.

      I have learned many computer languages throughout my life… The latest one is Go. I think I must have learned the basics of Go in a week or so. How did I do it? I started programming. I absolutely did not start by rote memorizing the grammar of the language. I just started programming and when I got stuck, I looked up the answer. And some facts just stuck with me over time because I kept looking them up. For example, I have memorized that fmt.Println("…") will print a string on screen. It is just something that anyone programming in Go will get to know. The objective is never to memorize fmt.Println per se. That string is not important… we just end up memorizing it, as a side-effect of our practice.

      Now, of course, if you enjoy rote memorization, then there is nothing wrong with it. There are entire competitions dedicated to rote memorization. It is a fine sport, but a lousy general-purpose learning strategy… especially when you apply it to kids.

    Math education has made progress compared to my school days in specific areas. For example, I am really happy that statistics and probabilities are fully integrated with how we teach mathematics today. I cannot remember learning any notion of probabilities in class until I reached college. I do remember computing the probability that I would get a 7 if I were to throw the dice as a teenager, but this was never part of my schooling.

    However, it is not clear that we are moving forward as fast as we need to.

    • As I have criticized before, we seem to stress too much the importance of rote memorization, in particular by pressuring students to memorize the addition and multiplication tables. I have never memorized these tables. To this day, to compute 6 times 9, I do… 6 times 10 minus 6. Not only do I feel no shame, I am convinced that rote memorization for its own sake can be detrimental. For one thing, it annoys students, making them hate the topic. For another, it encourages rote memorization as a way of doing mathematics.

      If you think it is odd for someone trained in mathematics like myself not to know his multiplication tables, note that Benoit Mandelbrot, arguably one of the greatest mathematicians of the XXth century, never memorized his multiplication tables.

    • There is a strong emphasis on naming the concepts and memorizing the names. My oldest son was given the following test: in 5 times 10 is 50, what is 5? The correct answer was “a factor”. I am quite sure I would not have known what was demanded. The way geometry is taught is almost entirely about memorizing the names of various figures. Who cares about what a scalene triangle is? Math is not about learning definitions.
    • Generally, we stress trivial but complicated algorithms. As a professional scientist, I rarely have to add fractions, or to do long divisions. To be honest, I had to pause for a few seconds before I could explain how one goes about computing 12.53 times 13.21. Really, you never compute such a thing the long way in the real world. Though I read that it has been de-emphasized, and this makes me happy… I still do not know why you would drill anyone by asking them to divide 115.5 by 24.21 by hand. It is useless and boring.
    • Computational thinking is not encouraged. The students have to be able to divide 115.5 by 24.21 by hand, but they are not made aware that they are using an algorithm. In fact, computational thinking appears to be almost entirely absent while rote memorization and mindless drilling rule.
    • The students are asked to do “real-world” problems, that are designed to be encouraging and relevant. So instead of an openly arbitrary 4-line word problem, we get a 3-page story about Amy who is trying to organize a party and she has 12 friends who each need to drink 120mL of juice, but she can only buy juice in units of 1.2L. Except that she can go across town where she can buy cheaper juice in units of 1.5L, but then she has to pay for her cab. Except that she can ask her sister to drive her, but then she also needs to pick up blue curtains for her mother and they are across town anyhow. Or she could go to the local store and buy red drapes, but it is out of her way… You think I am joking, but I barely exaggerate… I guess that they are also trying to train kids in the art of filling out complicated tax forms or doing some corporate job, so they need to read through long instructions with many details and plan ahead… Except that if I am turned off by these problems as a trained mathematician, I can only guess how people who proclaim to hate math must feel. No, I do not think that they are fooled into thinking that this is relevant because of how it is phrased.

    How would I train kids in mathematics? Good old word problems combined with an introduction to programming.
