How to learn efficiently

I am convinced that much of the gap between the best college students and the worst is explained by study habits. Frankly, most students study poorly. To make matters worse, most teachers are incapable of teaching good study habits.

Learning is proportional to effort

Sitting in a classroom listening to a professor feels like learning… Reading a book on a new topic feels like learning… but because they are overwhelmingly passive activities, they are inefficient. Worse than inefficient, they are counterproductive, because they give you the false impression that you know the material. You can sit through lecture after lecture on quantum mechanics. At some point, you will become familiar with the topics and the terminology. Alas, you are fooling yourself, which is worse than not learning anything.

Instead, you should always seek to challenge yourself. If some learning activity feels easy, it means that it is too easy. You should be constantly reminded of how little you know. Great lectures make it feel like the material is easy: it probably is not. Test yourself constantly: you will find that you know less than you think.

Some students blame the instructors when they feel confused. They are insistent that a course should be structured in such a way that it is always easy so that they rarely make mistakes. The opposite is true: a good course is one where you always feel that you will barely make it. It might not be a pleasant course, but it is one where you are learning. It is by struggling that we learn.

On this note, learning-styles theory is junk: while it is true that some students have an easier time doing things a certain way, having it easier is not the goal.

There are many ways to challenge yourself and learn more efficiently:

  • Seek the most difficult problems, the most difficult questions, and try to address them. It is useless to read page after page of textbook material, but it becomes meaningful if you are doing it to solve a hard problem. This is not news to physics students, who have always learned by solving problems. Always work on the toughest problems you can address.
  • Reflect on what you have supposedly learned. As an undergraduate student, I found that writing a summary of everything I had learned in a class was one of the best ways to study for an exam. I would just sit down with a blank piece of paper and try to summarize everything as precisely as possible. Ultimately, writing your own textbook would be a very effective way to learn the material. Teaching is a great way to learn because it challenges you.
  • Avoid learning from a single source. Studying from a single textbook is counterproductive. Instead, seek multiple sources. Yes, it is confusing to pick up a different textbook where the terminology might be different, but this confusion is good for you.

If sitting docilely in a classroom is inefficient and even counterproductive, then why is it so common a practice? Why indeed!

Interleaved study trumps mass study

When studying, many people do not want to mix topics “so as not to get confused”. So if they need to learn to apply one particular idea, they study to the exclusion of everything else. That is called mass (or block) practice.

Course material and textbooks do not help: they are often neatly organized into distinct chapters, distinct sections… each one covering one specific topic.

What researchers have found is that interleaved practice is far superior. In interleaved practice, you intentionally mix up topics. Want to become a better mathematician? Do not spend one month studying combinatorics, one month studying calculus and so on. Instead, work on various mathematical topics, mixing them randomly.

Interleaved practice feels much harder (e.g., “you feel confused”), and it feels discouraging because progress appears to be slow. However, this confusion you feel… that is your brain learning.

Interleaved practice is exactly what a real project forces you to do. This means that real-world experience where you get to solve hard problems is probably a much more efficient learning strategy than college. Given a choice between doing challenging real work, and taking classes, you should always take the challenging work instead.

Further reading: Make It Stick: The Science of Successful Learning by Peter C. Brown et al. and Improving Students’ Learning With Effective Learning Techniques by Dunlosky et al.

MOOCs are closed platforms… and probably doomed

Colleges and universities, left and right, are launching Massive Open Online Courses (MOOCs). Colleges that fail to follow suit are deemed “behind the times”.

Do not be fooled by how savvy MOOC advocates sound. They do not understand what they are doing.

Let us start with how they do not even understand what a MOOC is, or should be. MOOCs are supposed to be open platforms. It is right there in the name. Downes’ original MOOCs were indeed open. Yet the actual MOOCs that colleges publish are closed platforms, as per Wikipedia’s definition:

A closed platform is a software system where the carrier or service provider has control over applications, content, and media, and restricts convenient access to non-approved applications or content. This is in contrast to an open platform, where consumers have unrestricted access to applications, content, and much more.

The word “open” has been perverted beyond belief, but let us be clear: Facebook is not an open platform. It is public, certainly, in the sense that everyone can join… but it is a closed platform. The content is locked up. If search engines cannot index the content, then it is closed. It is that simple. If your course requires that prospective students “register” to access the content, then it is not an open course. It might be an online course, it might even be massive, but it is not open.

There is nothing wrong with closed platforms per se. The ancient Greek philosophers made a living by selling their lectures to paying customers. But most modern college campuses are remarkably open in contrast. In all likelihood, I can just show up for class on campus in most colleges in North America and attend lectures, for free. I do not need to provide an email address or a password. If there is room in the class, I can generally sneak in. Nobody will care. Why is that? Because we have learned that selling lectures is a tough business. It was different for the Greeks because so little was written down… but we live in an era where Amazon can deliver a textbook on any topic directly to your door within 48 hours. In this era, it is much better to sell diplomas and degrees. Unlike lectures, they have tangible financial value for the students. Some colleges also serve as meeting places, others provide an experience.

What colleges do not do, at least on campus, is make money off course content. As it is, you can easily order all the textbooks you could possibly read on Amazon. You can join discussion groups about them. You can sneak into lectures, or find tons of them online. There is simply little value in the course content.

Do not believe me? Run the following experiment. Make all courses tuition free. Students can enrol for free and, if they pass the exam, they get the credit. However, they must pay $20 for each hour of lecture they choose to attend. You know what is going to happen? Nobody but the instructor will show up. How do I know? Because, as it is, with lectures free once you have enrolled in a class, most students never show up unless they are compelled to do so. Why would anyone think that it is going to be somehow different with pre-recorded lectures online? You know, the lectures colleges like so much? The truth is that there is only value at the margin for course content.

It is probably harder to make a living selling lectures than it is as a journalist, and it has become nearly impossible to live off journalism. The volume of great free stuff is just too high.

Colleges that try to lock down course content, let alone the content of their MOOCs, are signalling that they have no clue about the business that they are in.

Optimizing polymorphic code in Java

Oracle’s Java is a fast language… sometimes just as fast as C++. In Java, we commonly use polymorphism through interfaces, inheritance or wrapper classes to make our software more flexible. Unfortunately, when polymorphism is involved in lots of function calls, Java’s performance can suffer. Part of the problem is that Java is shy about fully inlining code, even when it would be entirely safe to do so. (Though this problem might be alleviated in the latest Java revisions; see my update at the end of this blog post.)

Consider the case where we want to abstract out integer arrays with an interface:

public interface Array {
    public int get(int i);
    public void set(int i, int x);
    public int size();
}

Why would you want to do that? Maybe because your data can be in a database, on a network, on disk or in some other data structure. You want to write your code once, and not have to worry about how the array is implemented.
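To make the point concrete, here is a hypothetical second implementation of the same interface: a view over a slice of a larger backing array. Code that only knows about Array works with it unchanged. (The SlicedArray class and its names are my own sketch, not from the original benchmark.)

```java
// Hypothetical illustration: another Array implementation exposing a
// window over a larger backing array, without copying anything.
interface Array {
    int get(int i);
    void set(int i, int x);
    int size();
}

final class SlicedArray implements Array {
    private final int[] data;
    private final int offset;
    private final int length;

    SlicedArray(int[] data, int offset, int length) {
        this.data = data;
        this.offset = offset;
        this.length = length;
    }

    public int get(int i) { return data[offset + i]; }
    public void set(int i, int x) { data[offset + i] = x; }
    public int size() { return length; }
}

public class SliceDemo {
    public static void main(String[] args) {
        int[] backing = {10, 20, 30, 40, 50};
        Array slice = new SlicedArray(backing, 1, 3); // views 20, 30, 40
        int sum = 0;
        for (int k = 0; k < slice.size(); ++k)
            sum += slice.get(k);
        System.out.println(sum); // prints 90
    }
}
```

Any code written against Array, such as a generic sum, never needs to know whether it is looking at a plain array, a slice, or data fetched from elsewhere.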

It is not difficult to produce a class that is effectively equivalent to a standard Java array, except that it implements this interface:

public final class NaiveArray implements Array {
    protected int[] array;

    public NaiveArray(int cap) {
        array = new int[cap];
    }

    public int get(int i) {
        return array[i];
    }

    public void set(int i, int x) {
        array[i] = x;
    }

    public int size() {
        return array.length;
    }
}

At least in theory, this NaiveArray class should not cause any performance problem. The class is final, and all its methods are short.

Unfortunately, on a simple benchmark, you should expect NaiveArray to be over 5 times slower than a standard array when used as an Array instance, as in this example:

public int compute() {
    for (int k = 0; k < array.size(); ++k)
        array.set(k, k);
    int sum = 0;
    for (int k = 0; k < array.size(); ++k)
        sum += array.get(k);
    return sum;
}

You can alleviate the problem somewhat by using NaiveArray as an instance of NaiveArray (avoiding polymorphism). Unfortunately, the result is still going to be more than 3 times slower, and you just lost the benefit of polymorphism.
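For illustration, here is a minimal sketch of what "using NaiveArray as an instance of NaiveArray" means: the variable is declared with the concrete type, so the calls no longer go through the interface. The harness class name is my own; Array and NaiveArray are condensed from the code above.

```java
// Sketch of the monomorphic variant: the parameter is typed as
// NaiveArray rather than Array, so calls do not go through the
// interface.
interface Array {
    int get(int i);
    void set(int i, int x);
    int size();
}

final class NaiveArray implements Array {
    int[] array;
    NaiveArray(int cap) { array = new int[cap]; }
    public int get(int i) { return array[i]; }
    public void set(int i, int x) { array[i] = x; }
    public int size() { return array.length; }
}

public class MonomorphicSum {
    // Same fill-and-sum computation, but on the concrete type.
    static int compute(NaiveArray array) {
        for (int k = 0; k < array.size(); ++k)
            array.set(k, k);
        int sum = 0;
        for (int k = 0; k < array.size(); ++k)
            sum += array.get(k);
        return sum;
    }

    public static void main(String[] args) {
        System.out.println(compute(new NaiveArray(10))); // prints 45
    }
}
```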

So how do you force Java to inline function calls?

A viable workaround is to inline the functions by hand. You can use the instanceof keyword to provide optimized implementations, falling back on a (slower) generic implementation otherwise. For example, if you use the following code, NaiveArray does become just as fast as a standard array:

public int compute() {
    if (array instanceof NaiveArray) {
        // Fast path: work directly on the backing array.
        int[] back = ((NaiveArray) array).array;
        for (int k = 0; k < back.length; ++k)
            back[k] = k;
        int sum = 0;
        for (int k = 0; k < back.length; ++k)
            sum += back[k];
        return sum;
    }
    // Generic (slower) fallback through the interface.
    for (int k = 0; k < array.size(); ++k)
        array.set(k, k);
    int sum = 0;
    for (int k = 0; k < array.size(); ++k)
        sum += array.get(k);
    return sum;
}

Of course, I also introduce a maintenance problem as the same algorithm needs to be implemented more than once… but when performance matters, this is an acceptable alternative.

As usual, my benchmarking code is available online.
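The benchmark linked above is the authoritative version; as a rough sketch of the kind of harness involved (array size, warm-up count and class names are my own choices, not the published benchmark), one could time the interface-based fill-and-sum with System.nanoTime:

```java
// Rough timing sketch (not the actual benchmark): time the
// interface-based fill-and-sum and report nanoseconds per element.
interface Array {
    int get(int i);
    void set(int i, int x);
    int size();
}

final class NaiveArray implements Array {
    int[] array;
    NaiveArray(int cap) { array = new int[cap]; }
    public int get(int i) { return array[i]; }
    public void set(int i, int x) { array[i] = x; }
    public int size() { return array.length; }
}

public class TimingSketch {
    static int compute(Array array) {
        for (int k = 0; k < array.size(); ++k)
            array.set(k, k);
        int sum = 0; // may overflow; we only keep it so the JIT
                     // cannot discard the work as dead code
        for (int k = 0; k < array.size(); ++k)
            sum += array.get(k);
        return sum;
    }

    public static void main(String[] args) {
        Array a = new NaiveArray(10_000_000);
        for (int i = 0; i < 10; ++i)
            compute(a); // warm up so the JIT compiler kicks in
        long start = System.nanoTime();
        int sum = compute(a);
        long elapsed = System.nanoTime() - start;
        System.out.println((double) elapsed / a.size()
                + " ns per element (sum=" + sum + ")");
    }
}
```

A hand-rolled harness like this is only a sketch: for trustworthy numbers you would want many repetitions and a proper benchmarking framework.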

To summarize:

  • Some Java versions may fail to fully inline frequent function calls even when it could and should. This can become a serious performance problem.
  • Declaring classes as final does not seem to alleviate the problem.
  • A viable workaround for expensive functions is to optimize the polymorphic code by hand, inlining the function calls yourself. Using the instanceof keyword, you can write code for specific classes and, thus, preserve the flexibility of polymorphism.


Erich Schubert ran a similar benchmark with double arrays that appeared to contradict my results. In effect, he sees no difference between the various implementations. I was able to confirm his results by updating to the very latest OpenJDK revision. The next table gives the number of nanoseconds per integer when processing 10 million integers:

Function               Oracle JDK 8u11   OpenJDK 1.8.0_40   OpenJDK 1.7.0_65
straight arrays             0.92              0.71               0.87
with interface              5.9               0.70               6.3
with manual inlining        0.98              0.71               0.93

As one can see, the latest OpenJDK (1.8.0_40) is smarter and makes the performance overhead of polymorphism go away. If you are lucky enough to be using this JDK, you do not need to worry about the problems described in this blog post. However, the general idea is still worthwhile: in more complicated scenarios, the JDK might still fail to give you the performance you expect.

The Smartest Kids in the World: stories from Finland, Poland and South Korea

I have always been interested in what makes us smart. So I read Amanda Ripley’s The Smartest Kids in the World in almost a single sitting. She is a good writer.

The core message of the book is simple and powerful. Entire countries can change how their kids rank in international academic competitions within a few short years. For example, Poland (a relatively poor country) ranked in 25th position in mathematics in 2000, but in 13th position in 2009. Finland was the strongest democratic country in 2006, but Japan and South Korea had surpassed it by 2012. Meanwhile, the United States always does poorly despite outspending everyone on a per-student basis. Some Asian countries (e.g., Singapore and selected parts of China) put everyone else to shame, but she dismisses them as being too far removed from democratic countries.

She covers three education superpowers: Finland, South Korea and Poland. Beside the fact that they are all democracies, there is very little in common between these three countries, except that their 15-year-olds fare well in academic tests. It is not clear whether there is any lesson to be learned from these countries.

To make things worse, her choice is somewhat arbitrary. For example, Canada (my home country) also does very well, but she somehow decided that comparing Canada with the United States would not make a good story. Maybe Canada is not exotic enough by itself, but she could have considered the French province of Quebec. It is one of the poorest places in North America, but in the 2012 PISA test, Quebec scored 536 in mathematics, which is as good as Japan and better than Finland (519), Poland (518), and a lot better than the United States (482).

The book is very critical of the Asian mindset. Kids in South Korea are drilled insanely hard, starting their school day at 8am and often ending it at 11pm. Finland is a much nicer place in the book. Even Poland appears pleasant compared to South Korea.

To be fair, if half the things she writes about how kids are drilled in South Korea are true, I would never send my boys to school there.

Implicit in the book is the belief that the United States will pay a price in the new economy for its weak schools. American kids spend too much time playing football, and not enough time studying mathematics. Or so the book seems to imply.

This seems a bit simplistic.

My impression is that in some cultures, like South Korea, much of your life depends on how well you are doing at 15. So, unsurprisingly, 15-year-old kids do well academically. In the United States, people will easily forgive a poor high school record. You can compensate later on. So maybe American teenagers spend more time playing video games than doing calculus: who could blame them?

What is a lot more important for a country is how well your best middle-age workers do. The bulk of your companies are run by 40-something or 50-something managers and engineers. Only a select few do important work in their 20s. In my experience, you learn much of what you know by the time you are 40 “on the job”.

So I am not willing to predict bad times ahead for the United States based solely on the academic aptitude of their kids. I think we should be a lot more worried about the high unemployment rates among young people in Europe. Sure, French kids may earn lots of degrees… but if you do not have 10 years of solid work experience by the time you are 40, you are probably not contributing to your country as much as you could.

It is important to measure things. I am really happy that my kids are going to go to school in Quebec, a mathematics superpower at least as far as teenagers are concerned. But there may be trade-offs. For example, by drilling kids very hard, very early, you may drain their natural love of learning. This can lead to employees who will not be learning on their own, for the sake of it. Or you may discourage entrepreneurship.

As a general rule, we should proceed with care and avoid hubris because we may not know nearly as much as we think about producing smart kids.

Academia as an ‘anxiety machine’

We learned recently of the suicide of Stefan Grimm, a successful professor at the prestigious Imperial College London. Professor Grimm regularly published highly cited articles in the best journals. He was highly productive. Unfortunately, some of his colleagues felt that he did not secure sufficiently large research grants. So he was to be fired.

It is not that he did not try. He was told that he was the most aggressive grant seeker in his school. He worked himself to death. I am willing to bet that he was failing my week-end freedom test. But he still failed to secure large and prestigious grants because others were luckier, harder working, or even smarter.

It is not remarkable that he felt a lot of pressure at work. It is not remarkable that he was fired despite being smart and hard-working. These things happen all the time. What is fascinating is the contrast between how most people view an academic job and Grimm’s reality.

Other academics (starting maybe with Richard Hall) described academia as an ‘anxiety machine’:

Throw together a crowd of smart, driven individuals who’ve been rewarded throughout their entire lives for being ranked well, for being top of the class, and through a mixture of threat and reward you can coerce self-harming behaviour out of them to the extent that you can run a knowledge economy on the fumes of their freely given labour.


I know plenty of professors and star researchers who eat, sleep and breathe research, and can’t understand why their junior colleagues (try to) insist on playing with their children on a Sunday afternoon or going home at 6. ‘You can’t do a PhD and have a social life’, my predecessor told me.

It is simply not very hard to find overly anxious professors. I know many who are remarkably smart and who have done brilliant work… but they remain convinced that they are something of a failure.

Successful academics have been trained to compete, and compete hard… and even when you put them in what might appear like cushy conditions, they still work 7 days a week to outdo others… and then, when they are told by colleagues that it is not yet enough… they take such nonsensical comments at face value… because it is hard to ignore what you fear most…

And it is all seen as a good thing… without harsh competition, how are you going to get the best out of people?

Did you just silently agree with my last sentence? It is entirely bogus. There is no trace of evidence that you can get the best out of people at high-level tasks through pressure and competition. The opposite is true. Worried people get dumber. They may be faster at carrying rocks… but they do not get smarter.

Stressing out academics, students, engineers or any modern-day worker… makes them less effective. If we had any sense, we would minimize competition to optimize our performance.

The problem is not that Grimm was fired despite his stellar performance, the problem is that he was schooled to believe that his worth was lowered to zero because others gave him a failing grade…

Source: Thanks to P. Beaudoin for the pointer.

When bad ideas will not die: from classical AI to Linked Data

Back in the 1970s, researchers astutely convinced governments that we could build intelligent systems out of reasoning engines. Pure logic would rule the day. These researchers received billions for their neat ideas. Nothing much came out of it.

Of course, what we now call artificial intelligence works. Google is maybe the most tangible proof. Ask any question and Google will answer. But what lies behind Google is not a giant reasoning engine crunching facts. Rather it uses a mix of different techniques including machine learning and heuristics. It is messy and not founded on pure logic.

Collecting, curating and interpreting billions of predicates is a fundamentally intractable problem. So our AI researchers failed to solve real problems, time and time again. And their funding was cut, severely. We had what has become known as the AI winter.

But bad ideas will not die so easily. The AI researchers who started out in the 70s have trained PhD students with their generous funding. These PhD students have grown up to become professors themselves, and the wheel keeps on turning. As long as there are academics finding these ideas appealing, there will be grants and professorships. The ivory tower will not let the reality intrude.

Berners-Lee, inspired by these misguided researchers, proposed the Web as a way to revive the AI dream. For reasons having nothing to do with the original intent, the Web took root and became ubiquitous.

Using his newfound fame, Berners-Lee went back to his original dream and promoted the Semantic Web. The Web, he argued, should embrace the original vision from the 70s: collect and maintain lots of facts, and use a query engine to create intelligence.

Ten years went by and hardly anything (outside academic conferences) came out of this Semantic Web. Tens of thousands of research papers were written. Hardly anyone ever used the technology produced, and those who did achieved few useful results. Not to worry: Berners-Lee rebranded whatever remained as Linked Data.

And the wheel keeps on turning…

So when I am asked about the possibilities offered by Linked Data, I feel a great sorrow.

Further reading: Nova Spivack on a New Era in Semantic Web History

Perfectionism is not the same as having high standards

A lot of what I do is… quite frankly… not very good. To put it differently, almost everything I do fails to meet my quality standards. So I am constantly fighting to do better.

It is not perfectionism. Perfectionism is the refusal to do anything unless it meets your high standards. To put it differently, perfectionism is synonymous with a lack of humility. It is a belief that flawed results are below you.

Let me be blunt: perfectionists are self-centered pretentious pricks.

Why competitive people are often dumb and boring

People who work hard are typically motivated by either their performance (i.e., they want to look good) or their mastery (i.e., they like being good at their craft). Most of us pursue a mix of different goals.

It would seem like performance goals are harmless. What is wrong with wanting to get good grades in school, or having a good salary? Nothing is wrong with these goals, except that they can backfire.

  • Performance-oriented people often develop performance-avoidance goals: they want to avoid looking bad.

    When I was a student at the University of Toronto, many competitive students looked for the easiest classes they could find. If you focus on looking good, you will avoid challenges where you might look bad.

    A professional suffering from performance-avoidance goals may avoid taking on risky projects or jobs. Scientists preoccupied by their performance will often avoid challenging projects, preferring to follow the same tracks over decades. A programmer worried about looking bad might avoid trying out a new programming language.

    In short, performance-avoidance goals may limit your ambition.

    Performance-avoidance goals may also lead you to focus narrowly. Why waste time learning about calligraphy when you could practice for your calculus test?

    You effectively narrow down your life to whatever is most boring or safest.

  • Performance goals are hard on morale. At some point, you will fail. Maybe you wanted to enter this highly competitive school, or you wanted to get this prestigious job… and instead you will have to be satisfied with less than you hoped for.

    It is almost unavoidable because there will always be pressure to set the bar higher, and higher.

    People with performance goals are more likely to crash.

    I have never been to South Korea, but I hear that kids kill themselves over bad results at school. That is one extreme. Most crashes are not so intense or visible… but they are common nevertheless. They sometimes take unexplained forms.

    A favorite example of mine is mathematics. I genuinely believe that the overwhelming majority of the population can be good at mathematics. Of course, not everyone starts on an equal footing, and some people need to work harder. But mathematics is fun. All young kids like mathematics. What turns people away from mathematics is the fear of failure.

I believe we should be especially careful about setting performance goals for kids. It is especially damaging for kids to limit their ambition, play it safe and burn out. You want kids who are unafraid to try their luck at many things… you want kids to have high morale.

What is the alternative to performance goals? You can focus on growing your skills, mostly forgetting about performance. Ignore selective venues. Forget about beating others… avoid competition if you can… Focus all your attention on doing better and more interesting work.

People who are obsessive about honing their skills are never boring. They also tend to be generous. If your goal is to get better… you have no reason not to help others… especially if it can serve as an excuse to improve your own skills further.

Burning out is less likely when you are focused on mastery… maybe because setbacks are much less likely.

Some will object that performance and competition matter a great deal. If you are a martial arts expert and someone is trying to kill you, focusing on improving your skills might not be optimal. But throughout most of your life, you will not be in grave danger. You can afford a few bad grades. You can afford to be passed over for promotion. The truth is that if you are really good at what you do, you will probably do fine most of the time without ever having to compete. Oh! And you will probably become a more interesting person.

The hubris of teachers

Today, kids left and right carry the label of some learning disability. Instead of telling kids that they are dumb or lazy, we narrow it down to some problem. It is clearly progress on the face of it. However, when I see that, in some schools, over 10% of all kids have received some kind of disability label by the time they graduate… I worry.

There might be some hubris at work. Do the experts know as much as they claim to know?

A pet peeve of mine is the importance we put on grades as predictors of success. I have spent a great deal of time reviewing graduate students for scholarships in national competitions. I had a nearly perfect GPA myself. I was expecting the undergraduate GPA of students to be strongly correlated with success at the doctoral level. What I found, time and time again, was that the correlation was weaker than I expected. Students who do very well as undergraduates often fail to shine as graduate students, and students who disappoint as undergraduates can sometimes do remarkably well as PhD students.

Given a choice, schools should prefer students who got better GPAs. However, I would abstain from predicting the performance of a given student in graduate school given his GPA.

It is not that grades and tests do not matter, it is that we should use caution and humility when interpreting them. It is relatively easy to make statistical predictions, but it is very hard to translate these statistical predictions into reliable individual predictions.

In my opinion, the greatest mathematician of all time was probably Galois. Coming out of nowhere, he created a deep, useful and engaging mathematical theory that is still, today, viewed as highly original. You are using technology directly derived from Galois’ work, even though he died in 1832 at the age of 20. However, we find that teachers regularly complained about Galois’ uneven results, lack of application, and so on. It seems that he could not focus for long on what his teachers wanted him to do. He was a pain as a student.

This is not uncommon. Gurdon, winner of the 2012 Nobel Prize in Physiology or Medicine, was similarly a difficult student:

“His work has been far from satisfactory… he will not listen, but will insist on doing his work in his own way… I believe he has ideas about becoming a Scientist; on his present showing this is quite ridiculous, if he can’t learn simple Biological facts he would have no chance of doing the work of a Specialist, and it would be a sheer waste of time on his part, and of those who have to teach him.” (source)

Everyone should use caution when judging others, but I believe that educators should be especially careful. They may not understand nearly as much as they think about the mind of their students.

Forcefully boring young people is necessary…

In many schools, a fifth of all boys are prescribed amphetamine-related drugs because they have been diagnosed with an attention deficit. But these pills are not intelligence-in-a-bottle. To put it differently, taking Adderall may not make you smarter at all:

Although there is a perception among students that stimulant medication may improve academic performance, studies in adults without ADHD suggest that stimulants do not promote learning and may in fact impair performance in tasks that require adaptation, flexibility, and planning. Stimulant-induced improvements in cognition in individuals without ADHD were mainly evident in those with low cognitive performance, suggesting that stimulants may be more effective at correcting deficits rather than enhancing academic performance. (Nugent and Smart, 2014)

To be clear: with psychostimulants, you might do better at basic arithmetic if you cannot normally do arithmetic, but you could do worse at higher-level tasks.

The problem is that students diagnosed with ADHD commonly have other learning disabilities. And it might be these other disabilities that are helped by the drugs:

The efficacy of psychostimulants was documented on specific areas of achievement for the ADHD+ [ADHD with learning disabilities] group, but this review did not support the administration of psychostimulants for students with ADHD- [ADHD without learning disabilities]. (Zentall et al., 2013)

There is a larger issue… what do kids think of these drugs? Many teenagers do not like them, at all:

Overall, adolescents reported very low satisfaction with stimulant medication. (Pelham et al., 2013)

I should be clear that I am pro-medication. If we do find intelligence-in-a-bottle, I want the first bottle. If taking amphetamines makes you better at what you care about, then please take it.

I am concerned, however, that these pills might just be the system pushing the blame onto the biology of students. Imagine if the movie industry decided that people who cannot enjoy its movies should take pills instead.

My impression is that schools are unable to face the truth: they are boring and unpleasant to a lot of students.

As a kid, I was diagnosed with a learning disability. I failed kindergarten and was put in a special class. This came about because I would not learn my phone number, or to count up to ten. Throughout my primary education, I refused to learn my multiplication tables. Yet I went on to get a PhD. I am pretty confident that I do not, nor did I ever, have a crippling cognitive defect. Yet I fear that a young version of myself would be prescribed pills today.

If my life depended on it, I could listen to a teacher for 50 minutes without losing a word. I could memorize long tables of numbers. But as a kid, I refused to do it because it was boring and unnecessary.

As a middle-aged tenured professor with dozens of published research papers, I can say with confidence that rote memorization of the multiplication tables would have been useless to me. I think it is, at best, of very limited use to a very limited number of people…

Now, there are kids that simply cannot learn to read or to multiply because they have a disability. It seems that drugs can help them. They should certainly take them. But if your little boy cannot be bothered to do rote memorization or other boring school-related tasks, is having him take pills the real solution? Is it fair to force generations of kids to do boring unpaid work because we say so?

There were things that I really disliked as a student. Rote memorization was one of them. Another was the lecture. Even to this day, I cannot listen to most lectures without getting bored in the first 5 minutes. I think I might even be an extremist in this respect: in college, I skipped most of my lectures, or just attended them to know when the assignments and exams were. I then worked on my own at the library, or with close friends.

Simply put, even in college, the ability to listen to boring people for extended periods of time is not a necessary skill. Rote memorization is also not very important: it might help you on some tests, but you are not going to win a Nobel prize by taking tests.

I also hated the always-on social component of school. You are always with lots of other people you barely know. I find this very distracting and exhausting. Say what you will, I believe that human beings are geared toward working within a small tribe. Classrooms far exceed the size of a tribe. How natural is it to be forced into an oversized tribe?

Some will reply that school is not meant to be pleasant. It should be boring. But why? Where is the evidence that boring young people by force is necessary? Where is the evidence that rote memorization makes you smarter? What is the real purpose here?

Credit: Greg Linden pointed us to this New York Times article on ADHD.

Bricolage by smart people

Scientific research is fundamentally about learning, about trial and error. Luck and unplanned interactions are a central part of it. Thus research cannot be planned and managed like, say, teaching duties or a Walmart store. If you could manage it, then it would not be research.

Research is usually greedy, in the sense of a greedy algorithm. At each point in time, you try to take the next best move, without knowing anything about the future. Maybe working on this new cloud computing algorithm will open the door to fame and fortune. Maybe you will meet a brilliant student next week who has great ideas on how to advance the field. Or maybe it is a dead end; maybe the problem was already solved by a famous California professor last year. Typically, you do not know.
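The greedy analogy can be made concrete: a greedy algorithm commits, at each step, to the locally best option, with no lookahead. A minimal sketch (the project names and scores are purely hypothetical illustrations, not anything from the text):

```python
def greedy_path(options, score, steps):
    """Follow a greedy strategy: at each step, commit to the option
    that currently scores highest, without any lookahead."""
    path = []
    for _ in range(steps):
        if not options:
            break
        best = max(options, key=score)  # the locally best next move
        path.append(best)
        options = options - {best}      # cannot revisit a finished project
    return path

# Hypothetical research projects scored by apparent short-term promise.
projects = {"cloud algorithm", "new student's idea", "old dead end"}
promise = {"cloud algorithm": 0.9, "new student's idea": 0.7, "old dead end": 0.1}
print(greedy_path(projects, promise.get, 2))
# ['cloud algorithm', "new student's idea"]
```

The point of the analogy is that the scoring function is only a guess about the present; nothing in the procedure looks ahead, which is exactly why the future (the dead end, the already-solved problem) can invalidate the choice.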

If you work hard and you are extremely clever, you might be able to make better guesses. But very, very few researchers can foresee the future 5 or 10 years ahead. Even Einstein got stuck in dead ends.

Fernando Pereira is a leading computer scientist and an ACM Fellow who works at Google. He describes his view of research in similar terms:

Most successful projects I know, and certainly all that I have been involved in (…) started bottom-up, with zero to minor management involvement, and grew through repeated successful interactions with their environment. Pretty much like all the successful projects I’ve been involved in both in academia and industry.

However, much of research today is supposedly based on 5-year plans and funded by the government. These are the sort of plans that Soviet Russia liked so much. It should come as no surprise, as Soviet Russia literally invented our government-sponsored research model. It is worth repeating that it is an absurd model, as Pereira puts it:

(…) in government-funded work I had to go through the ritual of pretending to know where the proposed work would get to in several years before doing the actual experiments. Which over time, as competition for funding increased, became the self-contradictory process of claiming the work was novel and required new funding to carry out while having already enough results to convince reviewers that the project was a sure thing.

It is fascinating how we have a hard time dealing with the fact that R&D is, in fact, nothing but bricolage done by smart people.

Having your cat declawed means having its fingers amputated

There are many simple facts that escaped me for years. For example, though I took biology in college and knew that plants acquire their carbon through photosynthesis, I only realized a few years ago that plants grow by absorbing CO2 from the air. I knew that mass had to be preserved, but I stupidly assumed that plants took the bulk of their mass from the soil. I should have realized right away that the assumption was false by thinking it through.

A few years ago, my wife told me what declawing means. I have spent much of my life assuming the veterinarian had some magic and painless way to remove just the claws of the cat. But once you start thinking it through, it makes no sense.

Many people have their cats “declawed” to protect their furniture. The procedure is actually called onychectomy and involves the amputation of the “fingers” of the cat (the phalanges).

It is banned as animal cruelty in 20 or so countries. The procedure appears to cause pain: “Regardless of the analgesic regimen, limb function was still significantly reduced 12 days after surgery, suggesting that long-term analgesic treatment should be considered for cats undergoing onychectomy.” (Romans et al. 2005).

We have non-surgical alternatives to declawing such as Soft Paws or regular claw trimming.

The argument in favour of onychectomy is that it is a relatively benign surgical procedure that satisfies pet owners most of the time. Some veterinarians further justify the procedure by stating that without it, the owners would have the cat euthanized or would just abandon it.

I know many cat lovers who have had their cats declawed. I am almost sure that, had I offered to amputate the fingers of their cats, they would have thought me cruel…

We live in a complex world and even the best of us operate on very limited knowledge.

Coffee is probably not killing your productivity

A recent Slate article warns that coffee makes you less productive. The main claim is that coffee has no cognition-enhancing ability but, instead, a range of negative side-effects. Unfortunately, it is not backed by serious research.

There is much we do not know yet about coffee. However, on the whole, the research is clearly positive. Here are a few quotes from research papers:

Aged rats supplemented with a 0.55 % coffee diet, equivalent to ten cups of coffee, performed better in psychomotor testing (rotarod) and in a working memory task (Morris water maze) compared to aged rats fed a control diet. (Shukitt-Hale et al., 2013)

The results demonstrated that consuming caffeine significantly reduced the number of errors and time spent for tracing the star, and also the MMSE [cognitive test] score was significantly higher (…) (Nadji and Baniasad, 2011)

Coffee is associated with a reduction in the incidence of diabetes and liver disease. Protection seems to exist also for Parkinson’s disease among the neurological disorders, while its potential as an osteoporosis risk factor is under debate. Its effect on cancer risk depends on the tissue concerned, although it appears to favor risk reduction. Coffee consumption seems to reduce mortality. (Cano-Marquina et al., 2013)

If your intellectual productivity is low, stopping coffee is almost surely not going to help.

The week-end freedom test

In an earlier post, I compared life in academia with life in industry. One recurring argument to favour an academic job over an industry job is freedom.

Of course, the feeling of freedom is fundamentally subjective. And we have a strong incentive to feel free, and to present ourselves as free. Freedom is strongly linked with social status. Telling people that you are free to do whatever you want in your job is signalling your high status.

So how can you tell how much freedom you have?

I have long proposed the retirement freedom test. If you were really free in your job, you would continue it into your retirement. Another test is the lottery ticket test: would you keep your job if you won the lottery? But these tests are, again, somewhat subjective. Most people only retire once and they usually cannot tell ahead of time what retirement will be like.

For something more immediate, more measurable, I propose the week-end test. I conjecture that, given a choice, most people with a family would want to be free on week-ends to spend all their time with their kids. (Admittedly, others might want to dedicate their week-ends to unbridled and continuous kinky sex. But you get my point.)

So anyone who works on week-ends fails the week-end freedom test. If you are checking emails from work on week-ends, you fail.

So how do professors do? In my experience, many of them fail the week-end freedom test. Of course, most of the professors I know are in computer science… and a large fraction of them are active researchers. So my sample is not representative. Nevertheless, many professors who claim to love their freedom fail the week-end test miserably. I know because I got emails from them on week-ends.

Of course, there is no arguing with subjective experience. You can fail the week-end test and claim that it is by choice. But what does it mean objectively?

You pity the poor lawyer at a big law firm who has to prepare his files every Saturday instead of playing baseball with his son. But your case is different: you love your job and that is why you work 60 hours a week. Your decision is entirely yours and it has nothing to do with the professional pressure you are feeling. You genuinely enjoy preparing this new research grant on Sunday instead of teaching your kid to swim. Sure, all professors in your department work on week-ends, except this weirdo who will never get promoted (does he love his job?), but they all do it out of love. It is a love that is so powerful that it beats the alternatives (such as spending time with your kids, or with your sex partner).

Appendix: I pass the week-end test. Mostly. For the last few years, I have stopped checking emails on week-ends. But I fail the retirement and lottery tests.

Academia or industry?

I have done three things since my Ph.D.:

  • I have been a (permanent/regular) researcher in a major government laboratory;
  • I have been an entrepreneur in industry (making deals, paying other people);
  • I have been a professor, in two different schools. I am now tenured and promoted.

My conclusions so far:

  • At least in my case, the difference in income has not been excessively large. While I did take a pay cut to join academia, you tend to make up some of the lost income in later years. Overall, it looks like things average out. However, if money is really important to you, there is no question that you can earn more in industry where your income is basically unlimited.
  • “Everyone” says that you have more freedom in academia. But whenever I hear someone say that, they are invariably an academic. I think that this is a form of rationalization. Overall, freedom is something you earn. You can enjoy a lot of freedom in industry, in government or in academia… but it is something that you have to constantly fight for. It is quite easy in academia to get stuck in a routine: teach, apply for grants, meet with students, teach, apply for grants, sit in meetings, teach… If you want to have a lot of time alone pondering, you are going to have to fight for it. It will probably not come during the first few years… it might take a decade or two (or you could get lucky earlier).

    A real test of freedom is to look at what people do when they retire. Do they keep doing whatever they were doing? The first thing that most academics will do when they retire is to drop the grant applications, the graduate students, the teaching… in effect, they’ll drop the bulk of their job. So how free were they?

    I consider that I have an excellent job as far as freedom goes. Yet much of my freedom comes from my ability to work on Monday night (as I did today) on my favorite research projects. If I chose to work a fixed 35 hours a week… I would be busy with meetings, teaching, grading, reviewing… almost all the time. Freedom is definitely something I earn every day.

    But another question is: how free do you want to be in your job? It is not uncommon for people wealthy enough to retire in luxury to keep working in high pressure jobs under difficult constraints. The fact is: it is often more satisfying to serve others than to cultivate your own egotistical freedom.

    It is not that exciting to write obscure research papers that nobody will ever read. Most of us want to feel useful. Being useful is hard. It means accepting people’s requirements.

  • Tenure is overrated. Most folks in industry who have worked just as hard as tenured professors have savings, a reputation, and skills that are in demand. But if you are risk averse, then a government job is also quite safe even if you don’t formally have tenure. And academics with tenure lose their jobs all the time. There is always a clause saying that under “financial hardship” management can dismiss professors. And even with tenure, you still have to justify your job, constantly. If you create trouble, people can make your life hell. If you fail, people can humiliate you publicly. If you get into a fight with a tenured colleague, the fight can last decades and be unpleasant.
  • It is a lot easier to move back and forth between these occupations than people make it out to be. So while you cannot go back in time per se, professors move to industry all the time, and vice versa. To a point, you can even do both. It is not difficult to get some kind of honorary position with a research institute when you work in industry.
  • Academic and government positions require you to work in a bureaucratic setting, maybe for the rest of your life. In industry, you can be a lone wolf if you want. In this sense, there is greater freedom in industry.

Note: this post first appeared on Quora.

Referendums and sovereignty

Next week, the Scots will get to vote to determine whether Scotland becomes its own country.

As a middle-aged Quebecker, I spent much of my youth hearing about the separation of Quebec from Canada. We had two referendums. The first one, in 1980, was defeated decisively. The second one, in 1995, was a close call. Because of these two failures, I get to live in Canada, one of the richest countries in the world, instead of an independent (and poorer) Quebec.

There is much to be said about the United Kingdom. It is a fine country. But if I were in Scotland, I’d vote for the independence of Scotland the same way I could be convinced to vote for the independence of Quebec.

Why? Because a fragmented Europe took over the world.

Let me explain. If you go back a few centuries… you had huge empires in China and India. The Qing dynasty ruled over 300 million individuals while the Maratha Empire counted 150 million individuals. In Europe, you had a giant mess. Lots of small and weak states. Instead of modern-day Germany, we had a collection of smaller kingdoms, the largest being Prussia. Italy (and the Italian language) only came about in the second half of the XIXth century. France was a collection of culturally distinct provinces, with the French language becoming a standard only after the French revolution. Scotland joined England only in 1707.

This patchwork of weak states enabled great prosperity, at least locally. First in Venice, then in the Dutch Republic and then in England. Venice counted fewer than 200,000 people, the Dutch Republic had fewer than 2 million people, while England had 5 million people.

There was so much prosperity that the Dutch and then the British took over the world. They could afford it.

People look at Europe and think that the lack of unification is the problem. One united Europe would be stronger. But that is like saying that by putting all your eggs in the same basket, you can carry eggs more efficiently.

Europe contributed most as a political laboratory. It gave us democracy, the industrial revolution and modern science.

It is entirely possible that the United States works better as a giant unified country… But then you get things like an all-powerful spy agency and a federal government that can arm the local police forces for war. If you broke up the USA into small states, at least some of them would be free from this nonsense. Some of them would not have gone to war in Iraq. Small countries tend to trade more with other countries than large countries, and trade discourages war. And as an individual, you would have more choices. You could move more easily if you disagreed with the current policies.

Of course, small countries do not have large open markets. But the only 6 countries rated as economically free in 2014 are Hong Kong, Switzerland, Singapore, New Zealand, Australia and Canada. They are all relatively small countries in terms of population. The largest ones (Canada, Australia) are modest countries on an international scale… So some small countries can be nice places to run a business despite their size.

How far would I go? I think that the idea of the city state has a lot of merit. The 5 richest countries in the world right now on a per capita basis are Qatar (2 million people), Luxembourg (500,000 people), Singapore (5 million people), Norway (5 million people) and Brunei (400,000 people). Let cities compete for talent and industries. One screwed up city will not harm us much, while one great one can make all the difference. Have Montreal compete against Toronto and New York City. Singapore proves that it can work.

Transemployment: creating jobs out of thin air

Back in the eighties, half of the 16-year-old teenagers were licensed drivers in the US. Evidently, things have changed. Driving is still important, but other activities have become even more important. I am guessing that it is hard to get a date without a mobile phone today.

My point is that we create needs. These needs feel very real. And, in a sense, they are.

Employers and governments are no different. Once your automotive corporation has a social-media specialist, you cannot imagine not having one. Once you have compelled all colleges to have officers in charge of research grants, you cannot imagine how it could ever be otherwise. I call this transemployment.

So, irrespective of primary needs (food, lodging and reproduction), we create jobs that have abstract purposes. Sure, maybe we can automate the job of the social-media specialist. Maybe we can buy software that automatically represents the corporation on Twitter and Facebook. But since we do not really know what the social-media specialist does, how do we know that automation will work? And if you automate, who do you blame when things go bad?

What happens if someone asks what the social-media specialist does and whether it is needed, or whether it can be automated? Nobody is going to ask. The position might be terminated, but it would be too rude to say that the job was not real.

It is easy to question the value of concrete work, like carpentry or plumbing. You either need a plumber or you do not. But how do you know whether you need this particular program manager?

If you work in an office, much of the work you do is not real. The forms you fill are usually there as part of some process. This process feels absolutely essential… except that, not long ago, it did not exist and things worked nevertheless.

We need medical doctors so badly that we can afford to have them spend half their time filling out forms and satisfying regulations.

We know that many jobs are not real. But we need to believe that they are. And if you ask too many questions, someone could ask whether your own job is real. My conjecture is that useless jobs have become a cultural blind spot. Even just asking whether something needs doing has become a major faux pas. Anyhow, most times, we do not understand what others do for a living.

If there was a war, and half of the working age men were sent to fight… maybe fight some aliens in outer space… we would still have food, houses, cars, colleges… We would make do with far fewer people.

But, surely, the free market would take care of these inefficiencies and make people unemployed? It would, but we are getting so wealthy that the inefficiencies brought forth by transemployment are often insignificant.

I know, it does not feel like we are massively wealthy, but we are. These days, people will go down in the streets if you suggest that you might lower their retirement benefits. A century ago, people would have celebrated at the thought that they might have any retirement benefit whatsoever. It used to be that people would go on strike and see their families go hungry… today, the people who occupied Wall Street had mobile phones, the Internet, and an abundance of food… They were not asking for bread, they were not hungry… they were upset because some were much wealthier than they were.

Some will accuse me of being dismissive of poverty. Yes, there is real poverty in the world. Nothing is perfect. Even in a Star Trek universe, there will be misery. But that only encourages us to sustain transemployment. Trying to put everyone who is not absolutely needed out of a job would be considered very harsh. It is much better to add new jobs.

And that is the future I imagine. In fact, this future is already here. Robots have replaced us. We just choose to ignore that fact.

Paper books are the new vinyl records

I have always loved reading. But it is a love that has been constantly frustrated. As a young teenager, I would spend days in the library, but I quickly exhausted my interests. If you wanted to know about Einstein, you were lucky to find one biography. If you wanted to teach yourself calculus, you might find one boring reference, if that.

When I attended junior college (we call it cegep here), one of my teachers encouraged me to read Feynman’s lectures. (They are now freely available online.) Though they were available at the library, I could not borrow them for the summer. So I tried to purchase them. There was no Amazon, no world wide web at the time. I went to a local bookstore and I asked the owner to order them for me. I said I was willing to pay upfront for the books. She refused to order them. Too much trouble to take in special requests. I think that, many years later, I did buy them in Toronto in a large bookstore, but it was just to show that I could. It felt like an empty victory.

As someone who spends almost all of his free time reading, you would think that I would be at ease in a bookstore. And while I have spent a lot of time in bookstores, they never satisfied me, even when I acquired the means to buy whatever books I wanted.

Libraries are more interesting. I could have been a librarian. The problem with libraries is that they are finite. There is always only so much room. I read about 25 science-fiction novels a year and about as many non-fiction books… all of them chosen with care… and none of them likely to be stocked in any given library. Here is what I read lately: Echopraxia by Watts (follow-up to Blindsight), Make It Stick by Brown and Finite and Infinite Games by Carse. None of these books are likely to be at my local library. And if they are available, I am likely to have to wait days or weeks to get them. In contrast, I bought these books online in seconds.

Back in 2010, I got rid of most of my paper books. Though I have, on occasion, bought paper books, I do not think I have set foot in a bookstore ever since. I also cancelled my newspaper subscription in 2012.

It may sound like an empty gesture, especially in 2014, but I still have colleagues commenting on the fact that my office is almost empty. Last week, one of my Ph.D. students came into my office for a chat carrying more paper books than I ever had in my current office.

To say that I am still ahead of the curve is an understatement. In Quebec, most local publishers have shied away from ebooks. Though you can get the latest novels as ebooks, they are priced like paper books, making their purchase unreasonable.

Similarly, academia is still very much grounded in paper books. In many departments, you cannot get tenure without a (paper!) book by your name. A genuine book that lies flat in the bookstore for at least 3 months before being returned to the publisher. The fact that nobody would willingly buy your book is irrelevant. Paper has magic.

Do not get me wrong: I love paper. If you have ever seen me in person, you probably know that I always carry a paper notebook. I have always carried paper with me ever since I was a teenager—except for a few years when I believed that PDAs (the ancestor of modern-day smart phones) could replace notebooks. I am sure paper will be obsolete in 10 or 20 years, but we are not there yet. My sons still use paper books. We bring them to the library every three weeks or so. At school, they only use paper books. At home, for their studies, I buy them paper books.

I am a nerd. I like the feel of books. But I embrace ebooks because I cannot stand the thought of being limited by whatever I can carry physically with me, or by whatever space there is on the shelves. I also do not want to be limited by what gatekeepers believe I should be reading.

When you start doing something new, you can usually tell that others are joining in. I know others are embracing ebooks because I see people reading them on the bus every week. But it is harder to tell whether others have stopped doing something… like going to bookstores. Thankfully, Stephen Downes’ recent account of his visit to a bookstore comforts me in my belief that the days of paper books are numbered:

It occurred to me that you would never come to the bookstore to learn anything. The stuff that’s there is mostly superficial and survey literature.


It should not be surprising that the computer book section has been absolutely devastated. Again, you would not go to this section to learn anything – at best, the books could be considered references. Today, if you’re studying computer science – programming, design, concepts – you’re studying online. This section reflects that, and there was nothing for me to even browse.

Downes sums up my frustration with bookstores. They are not, and might never have been, for the intellectually curious folks. They are all about the mass production of cookie-cutter content.

Without profitable bookstores, I do not think that libraries will last very long. Once people have formed the habit of using ebooks or whatever superior alternatives are coming down the line… they will stop coming to the libraries. Libraries will be killed by the likes of Amazon just like Netflix killed Blockbuster.

Like any major change, this will not be painless. Already, the disappearance of bookstores means that many lose fine jobs. I will be the first to admit that ebooks are not quite as good as paper books for learning. But this only means that there are great business opportunities ahead. Ebooks could be so much more than zipped HTML files.

There will always be a market for paper books, the same way there will always be a market for vinyl records. But that is all.

Though unrefereed, arXiv has a better h-index than most journals…

Google provides a ranking of research venues by domain. For databases and information systems, they provide the top 20 venues according to their h-index.

As part of their assessment, they chose to include arXiv: a repository of freely available research papers. Almost anyone can post a paper on arXiv. There is some filtering, but there is no scientific review of the papers. This means that if you download a paper from arXiv and it is complete junk, you have nobody to complain to (except the authors).

Nevertheless, it appears that people are willing to post their great papers on arXiv. On a ranking per h-index, the databases section of arXiv ranks in 11th place, outranking prestigious journals like the VLDB Journal, Data & Knowledge Engineering, Information Systems and Knowledge and Information Systems… not to mention all the journals and conferences that do not appear in the top-20 list provided by Google.

One could argue that the good ranking can be explained by the fact that arXiv includes everything. However, this is far from true. There are typically fewer than 30 new database papers every month on arXiv, whereas big conferences often have more than 100 articles (150 at SIGMOD 2013 and over 200 at VLDB 2013). Roughly speaking, the database section of arXiv is equivalent to two big conferences, while there are dozens of conferences and journals.
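For context, the h-index used in this ranking is the largest h such that the venue has h papers with at least h citations each. A minimal sketch of the computation (the citation counts below are made up for illustration):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:  # the i-th most-cited paper still has >= i citations
            h = i
        else:
            break
    return h

# Hypothetical citation counts for a venue's five papers.
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

Because the metric rewards a core of well-cited papers rather than sheer volume, a small venue like the database section of arXiv can outrank much larger journals.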

You can subscribe to arXiv on Twitter. All papers are freely downloadable.

Credit: A post by Rasmus Pagh on Google+ pointed out the good ranking of arXiv in theoretical computer science.

Expert performance and training: what we really know

Movies such as Good Will Hunting tell beautiful stories about young people able to instantly master difficult topics, without any effort on their part.

That performance is unrelated to effort is an appealing belief. Whether you perform well or poorly is not your fault. Some go further and conclude that success and skill levels are primarily about genetics. That is an even more convenient observation: the quality of your parenting or education becomes irrelevant. If kids raised in the ghetto do poorly, it is because they inherited the genes of their parents! I personally believe that poor kids tend to do poorly in school primarily because they work less at it (e.g., kids from the ghetto will tend to pass on their homework assignments for various reasons).

A recent study by Macnamara et al. suggests that practice explained less than 1% of the variance in performance within professions, and generally less than 25% of the variance in other activities.

It is one of several similar studies attempting to debunk the claim popularized by Gladwell that expert performance requires 10,000 hours of deliberate training.

Let us get one source of objection out of the way: merely practicing is insufficient to reach world-expert levels of performance. You have to practice the right way, you have to put in the mental effort, and you have to have the basic dispositions. (I can never be a star basketball player.) You also need to live in the right context. Meeting the right people at the right time can have a determining effect on your performance.

But it is easy to underestimate the value of hard work and motivation. We all know that Kenyans and Ethiopians make superb long-distance runners. Right? This is all about genetics, right? Actually, though their body type predisposes them to good performance, factors like high motivation and much training in the right conditions are likely much more important than any one specific gene.

Time and time again, I have heard people claim that mathematics and abstract thinking were just beyond them. I also believe these people when they point out that they have put in many hours of effort… However, in my experience, most students do not know how to study properly. You should never, ever, cram the night before an exam. You should not do your homework in one pass: you should do it once, set it aside, and then revise it. You absolutely need to work hard at learning the material, forget it for a time, and then work at it again. That is how you retain the material in the long run. You also need to have multiple references, repeatedly train on many problems and so on.
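The learn-forget-relearn pattern described above resembles spaced repetition. A purely illustrative sketch of a review schedule with doubling gaps (the specific intervals are an assumption, a common rule of thumb, not a prescription from the text):

```python
from datetime import date, timedelta

def review_schedule(start, sessions=5, first_gap=1, factor=2):
    """Spaced review dates: each gap grows by `factor` (1, 2, 4, 8... days).
    Illustrative only; the exact intervals are a rule of thumb."""
    dates, gap, d = [], first_gap, start
    for _ in range(sessions):
        d = d + timedelta(days=gap)  # revisit after the current gap
        dates.append(d)
        gap *= factor                # wait longer before the next review
    return dates

# Study on Sept 1, then revisit the material at increasing intervals.
for day in review_schedule(date(2014, 9, 1)):
    print(day)
```

The growing gaps force you to partially forget the material before each review, which is the struggle the text argues is essential for retention.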

I believe that poor study habits probably explain much of the cultural differences in school results. Some cultures seem to do a lot more to show their kids how to be intellectually efficient.

I also believe that most people overestimate the amount of time and effort they put into skills they do not yet master. For example, whenever I face someone who failed to master the basics of programming, they are typically at a loss to describe the work they did before giving up. Have they been practicing programming problems every few days for months? Or did they just try for a few weeks before giving up? The latter appears much more likely, as they are unable to document how they spent hundreds of hours. Where is all the software that they wrote?

Luck is certainly required to reach the highest spheres, but without practice and hard work, top level performance is unlikely. Some simple observations should convince you:

  • There are few people who make world-class contributions in several fields at once… there are few polymaths. It is virtually impossible for someone to become a world expert in several distinct activities. This indicates that much effort is required for world-class performance in any one activity. This is in contrast with a movie like Good Will Hunting, where the main character appears to have effortlessly acquired top-level skills in history, economics, and mathematics.

    A superb scientist like von Neumann was able to make lasting contributions in several fields, but this tells us more about his strategies than the breadth of his knowledge:

    Von Neumann was not satisfied with seeing things quickly and clearly; he also worked very hard. His wife said “he had always done his writing at home during the night or at dawn. His capacity for work was practically unlimited.” In addition to his work at home, he worked hard at his office. He arrived early, he stayed late, and he never wasted any time. (…) He wasn’t afraid of anything. He knew a lot of mathematics, but there were also gaps in his knowledge, most notably number theory and algebraic topology. Once when he saw some of us at a blackboard staring at a rectangle that had arrows marked on each of its sides, he wanted to know what that was. “Oh just the torus, you know – the usual identification convention.” No, he didn’t know. The subject is elementary, but some of it just never crossed his path, and even though most graduate students knew about it, he didn’t. (Halmos, 1973)

  • In the arts and sciences, world experts are consistently in their 30s and 40s, or older. This suggests that about 10 years of hard work are needed to reach world-expert levels of performance. There are certainly exceptions. Einstein and Galois were in their 20s when they did their best work. However, these exceptions are very uncommon. And even Einstein, despite being probably the smartest scientist of his century, only got his PhD at 26. We know little about Galois except that he was passionate, even obsessive, about mathematics as a teenager and that he was homeschooled.
  • Even the very best improve their skills only gradually. Musicians or athletes do not suddenly become measurably better from one performance to the next. We see them improve over months. This suggests that they need to train and practice.

    When you search in the past of people who burst on the scene, you often find that they have been training for years. In interviews with young mathematical prodigies, you typically find that they have been teaching themselves mathematics with a passion for many years.

A common counterpoint is to cite studies on identical twins showing that twins raised apart exhibit striking similarities in terms of skills. If you are doing well in school, and you have an identical twin raised apart, he is probably doing well in school too. This would tend to show that skills are genetically determined. There are two key weaknesses to this point. Firstly, separated twins tend to live in similar (solidly middle class) homes. Is it any wonder that people who are genetically identical and live in similar environments end up with similar non-extraordinary abilities? Secondly, we have virtually no reported case of twins raised apart reaching world-class levels. It would be fascinating if twins, raised apart, simultaneously and independently reached Einstein-level abilities… Unfortunately, we have no such evidence.

As far as we know, if you are a world-class surgeon or programmer, you have had to work hard for many years.

Credit: Thanks to Peter Turney for telling me to go read Carse.