Most researchers are convinced that their current work is important. Otherwise, they wouldn’t do it. Yet, few of them work on obviously important things like curing cancer or solving world hunger. Rather, they do silly things like proving the Poincaré conjecture. A century to figure out some theoretical nonsense? Please!
So, why won’t researchers work on the important problems of our era?
The conventional explanation is that working directly on the major problems is like staring at the Sun. Instead, researchers must do routine work until an opening toward greatness appears. So real researchers…
- survey existing work,
- comment on special cases,
- provide theoretical justifications for empirical observations,
- validate new theory experimentally, and so on.
That is, researchers are not architects. They use greedy algorithms:
- Look at the problems and results you can grasp with your current expertise.
- Select an interesting problem that is a good fit for your expertise.
- Rinse and repeat.
- Wait! You are close to solving a major problem? Jump on it!
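The steps above can be sketched as a literal greedy loop. This is only a toy illustration: the names (`pick_next_problem`, `career`) and the numeric "difficulty", "interest" and "expertise" values are my own assumptions, not anything from the post.

```python
# Toy sketch of the "greedy researcher" loop: at each step, pick the most
# interesting problem within reach of your current expertise, and let each
# result slightly expand what you can grasp.

def pick_next_problem(problems, expertise):
    """Greedily select the most interesting problem you can grasp today."""
    reachable = [p for p in problems if p["difficulty"] <= expertise]
    if not reachable:
        return None
    # Greedy choice: maximize interest among reachable problems.
    return max(reachable, key=lambda p: p["interest"])

def career(problems, expertise=1.0, steps=3):
    """Rinse and repeat: solve problems one at a time, growing expertise."""
    solved = []
    for _ in range(steps):
        remaining = [q for q in problems if q not in solved]
        p = pick_next_problem(remaining, expertise)
        if p is None:
            break
        solved.append(p)
        expertise += 0.5  # each result expands what is graspable
    return [p["name"] for p in solved]

problems = [
    {"name": "survey", "difficulty": 0.5, "interest": 1},
    {"name": "special case", "difficulty": 1.0, "interest": 2},
    {"name": "major problem", "difficulty": 2.0, "interest": 10},
]
print(career(problems))  # → ['special case', 'survey', 'major problem']
```

Note how the "major problem" only becomes reachable after two routine results: the loop never plans for it, yet jumps on it as soon as it is within grasp.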
Should scientists feel guilty that they can’t prove the importance of each increment? I think not. I think scientists are inefficient, but there is no better way known to man. Indeed, consider how real innovation is typically unpredictable:
- The greatest difference between my Honda Civic and the car I drove as a teenager is that I can lock and unlock all doors with a remote. This single function made all the difference in the world for me. I drive my wife nuts as I keep playing with the remote: lock, unlock, lock, unlock… And people thought we would have flying cars!
- I am sure that Google offered better search results than AltaVista. Yet, the real reason I switched to Google and never looked back is that they did away with the big annoying ads. Understanding that you didn’t need to annoy your users to make a profit was Google’s greatest innovation. (Don’t let them fool you into thinking PageRank had something to do with it.)
- Amazon.com is by far the best e-commerce site on the planet. But what is different? In fact, a lot of little things. On the surface, Amazon.com is just an HTML view of a database. But they have collected many small innovations that, when put together, make a huge difference.
My point is that innovation in the little things adds up to important and practical results. That is why academic researchers spend so much time writing surveys or studying to death a detail. They don’t think their own work will change the world, but they count on others doing the same thing. They hope that when they put it back together, the result is great. For the last two hundred years or so, they have fared extremely well.
To put it another way: greedy algorithms can be pretty good. They can certainly beat 5-year plans.
Further reading: Innovative ideas are indistinguishable from crackpot ones
“For the last two hundred years or so, they have fared extremely well.”
I think this is an illusion due to the large increase in the number of scientists and the enormous expenditure on resources dedicated to Science.
I would bet that the overall return of research has actually declined per head and per dollar (cf. Joseph Tainter).
Where are the Newtons, Leibnizes and even Aristotles of our time?
@Kevembuangga
The reference to Joseph Tainter is interesting. I didn’t know about his work.
He may have a point: as the research “industry” becomes more complex, it eventually gets bogged down by overheads and corruption.
Nevertheless, we *should* be spending more than ever on Science. So, the mere fact that modern science is much more expensive is not a sign of decline.
As for the search for a modern-day Newton, I don’t subscribe to the Heroic Theory:
http://en.wikipedia.org/wiki/Heroic_theory_of_invention_and_scientific_development
“That is why academic researchers spend so much time writing surveys or studying to death a detail. They don’t think their own work will change the world, but they count on others doing the same thing.” Maybe we are living in a changing world where the value of academic research has changed. I still believe that we could “change the world” with some of the research being done, especially on Web 2.0 and its impact on education and learning. The Wikipedia, Google and Amazon examples are also of great significance in changing the way we learn through such ICT tools. However, these were all made possible by the introduction of the Internet and advances in computer technology.

So, I think we need research (as part of education) to ensure that we gain a better understanding of those principles on a scientific basis, rather than relying on mere application (which could lead to disastrous results). The current financial crisis is just one example of financial institutions failing to take responsibility for complying with regulations. Factors like greed might have led to the over-selling and purchasing of mortgages. Would education and scientific research (risk management research) be able to reveal, anticipate or prevent such problems?

Another example is the spread of viruses and spam nowadays. One of my computers has just been hacked and is now full of Trojans and spyware, and I have to return it to IT to have it rebooted. This is a by-product of the technology, and I would like such problems to be solved as soon as possible. Isn’t it also of great concern to us as computer users? That may also explain why institutions have to build strong firewalls to prevent disastrous attacks by spammers or intruders. These are all pressing issues that require solutions.
@Berenguel
I already link to the You and Your Research page on Paul Graham’s web site in my post.
I was being ironic regarding Perelman’s contribution. I have stated several times here that what Perelman did was of historical importance.
Richard Hamming addressed this in his famous speech “You and Your Research” (Paul Graham has it on his page; you can Google for it in any other format you like). Obviously, as a researcher, you research what you can. If not, you make a living doing X and research as a hobby, but that won’t get you anywhere in the modern era.
Btw, Poincaré’s conjecture has implications in deep physics, like string theory and such; it is not just “nonsense”.
Sorry! My fault for not checking the links in the post, I just read it straight and commented.
I was not a follower of your blog before, so I missed your previous Perelman statements, and my irony detector has been faulty all my life 😉
Thanks for the prompt response, and sorry for my late answer, I forgot to check back!
Ruben
“This single function made all the difference in the world for me. I drive my wife nuts as I keep playing with the remote: lock, unlock, lock, unlock… And people thought we would have flying cars!”
Well, modern cars may actually have many more gadgets. Not to mention hybrids and electric cars.
Overall, I agree that progress is a series of small steps, which take longer than originally thought.
Therefore, I value people who think that the devil is in the details. They are usually the most practical and useful ones!
“the devil is in the details.”
You are right: the more details, the more devilish. Complexity will eat you alive and bring no side benefits.
(currently struggling with cross-browser Javascript compatibility…)
Of course, Kevembuangga, I hear you. One part of the issue is the ability to simplify things. That, too, is part of dealing with details. That is why it makes me infinitely sad when people say: let’s add this and that, it MUST be easy. They don’t understand that almost nothing is easy.