A megabyte is a mebibyte, and a kilobyte is a kibibyte

If you’ve been annoyed about the fact that a kilobyte has 1024 bytes and not 1000 bytes, well, you were right all along! What people call a kilobyte is really a kibibyte. (Thanks to Owen for pointing it out to me!)

Examples and comparisons with SI prefixes
one kibibit: 1 Kibit = 2^10 bit = 1 024 bit
one kilobit: 1 kbit = 10^3 bit = 1 000 bit
one mebibyte: 1 MiB = 2^20 B = 1 048 576 B
one megabyte: 1 MB = 10^6 B = 1 000 000 B
one gibibyte: 1 GiB = 2^30 B = 1 073 741 824 B
one gigabyte: 1 GB = 10^9 B = 1 000 000 000 B

Source: Definitions of the SI units: The binary prefixes
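The distinction is easy to see in code. Here is a minimal sketch (the function names are my own, for illustration) that formats the same byte count under decimal (SI) prefixes and under binary (IEC) prefixes:

```python
def si_format(n_bytes):
    """Format a byte count using decimal (SI) prefixes: kB, MB, GB."""
    for prefix, power in (("GB", 9), ("MB", 6), ("kB", 3)):
        if n_bytes >= 10 ** power:
            return f"{n_bytes / 10 ** power:.2f} {prefix}"
    return f"{n_bytes} B"

def iec_format(n_bytes):
    """Format a byte count using binary (IEC) prefixes: KiB, MiB, GiB."""
    for prefix, power in (("GiB", 30), ("MiB", 20), ("KiB", 10)):
        if n_bytes >= 2 ** power:
            return f"{n_bytes / 2 ** power:.2f} {prefix}"
    return f"{n_bytes} B"

n = 1_048_576  # one mebibyte
print(si_format(n))   # 1.05 MB
print(iec_format(n))  # 1.00 MiB
```

The roughly 5% gap between the two readings of the same number is exactly why a "kilobyte" of 1024 bytes was never quite honest.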

Michael Nielsen: Principles of Effective Research

Michael just finished his essay: Principles of Effective Research. I think it is a must-read for all Ph.D. students, young researchers, and even people like me who always get it wrong. Michael takes a very refreshing view of what research is all about. He is not cynical, yet he is true to what research really is. You may never win the Nobel prize if you follow his guidelines, and you may never be a guru researcher, but I think you’ll be a good or even excellent researcher. As he explains, being an influential researcher is not a subset of being a good researcher, and that’s a very important statement. In any case, Michael did all of us a favor, and I hope his essay is read by a lot of people. (The power of the network?) I implore you all: link to his essay!

Collaborative Filtering Java Learning Objects

Through Downes’ site, I found an interesting paper in ITDL on the application of collaborative filtering to e-Learning (by Jinan A. W. Fiaidhi).

It makes the point quite well that we must differentiate heterogeneous settings from sane laboratory conditions:

Searching for LOs within heterogeneous repositories as well as within collaborative repositories is far more complicated problem. In searching for such LOs we must first decide on appropriate metadata schema, but which one!

The Three Dijkstra Rules for Successful Scientific Research

Through Didier and Nielsen, I found a list of Golden Rules for Successful Scientific Research attributed to Dijkstra.

  • “Raise your quality standards as high as you can live with, avoid wasting your time on routine problems, and always try to work as closely as possible at the boundary of your abilities. Do this, because it is the only way of discovering how that boundary should be moved forward.”
  • “We all like our work to be socially relevant and scientifically sound. If we can find a topic satisfying both desires, we are lucky; if the two targets are in conflict with each other, let the requirement of scientific soundness prevail.”
  • “Never tackle a problem of which you can be pretty sure that (now or in the near future) it will be tackled by others who are, in relation to that problem, at least as competent and well-equipped as you.”

Of the three rules, only the last one seems important to me. The second one appears self-evident: you want to be socially relevant, but not to the point of producing low-quality work. That being said, most researchers go to the other extreme, ignore social relevance, and their work loses its motivation. If you tackle a problem that only you care about, don’t expect much recognition. I actually disagree with the first rule: small problems and technical issues often hide interesting problems. Always focusing on the top-level issues is, I think, a bad idea. Michelangelo was painting a church! In research, do not be so quick to think that there are noble and not-so-noble problems. All problems can be interesting, and knowledge of technical issues can bring much insight.

Nielsen’s Extreme Thinking

Blogging is a fascinating pastime. Who would have thought? I just read bits and pieces of an essay on Extreme Thinking.

Here’s a fascinating quote:

The key to keeping this independence of solitude is to develop a long-term vision so compelling and well-internalized, that it can override behaviours for which the short-term rewards are significant, but which may be damaging in the long run.

Update: “Independence of solitude”: I didn’t know this expression. I found 600 or so hits on Google. It seems the expression may come from Ralph Waldo Emerson:

What I must do is all that concerns me, not what the people think. This rule, equally arduous in actual and intellectual life, may serve for the whole distinction between greatness and meanness. It is the harder, because you will always find those who think they know what is your duty better than you know it. It is easy in the world to live after the world’s opinion; it is easy in solitude to live after our own; but the great person is one who in the midst of the crowd keeps with perfect sweetness the independence of solitude.

Michael Nielsen: Principles of Effective Research: Part VII

Didier reminded me to check Nielsen’s last post on Principles of Effective Research. I take a quote out of it…

The foundation is a plan for the development of research strengths. What are you interested in? Given your interests, what are you going to try to learn? The plan needs to be driven by your research goals, but should balance short-term and long-term considerations. Some time should be spent on things that appear very likely to lead to short-term research payoff. Equally well, some time needs to be allocated to the development of strengths that may not have much immediate pay-off, but over the longer-term will have a considerable payoff.

This is a refreshing view.

Freedom in networked research: what does it mean?

When I started out as a researcher, as a young Ph.D. student, I thought research was about “having ideas”. Then, it occurred to me that it was about “having ideas and ‘selling’ them”, because “having ideas” is easy and too many people have too many ideas already. But marketing experts sell ideas all the time… surely, they don’t do “research”. Then, I changed my mind and decided research was about “taking ideas, validating them, putting them in practice, and building tools out of them”, where “tools” is to be interpreted in a very wide sense. It turns out that is not a bad definition of what research is. But the part about “taking ideas and validating them” is a networking problem. Where do your ideas come from, and how do you know how good they are? Ultimately, “validating” an idea means putting it in front of a community and getting the community to say “this is a good idea”. “Validating” is not the same as selling, though it might be hard to tell what a person is really trying to do.

But to be blunt, I don’t yet have a satisfying definition of what “research” is, and I’m not looking very hard… though networking is a necessary condition, for sure. Scientists on desert islands without telecommunications can’t do research. That’s the part I did not understand until a few years after my Ph.D. Well, maybe I’m being hard on myself: maybe I understood it on the surface, but I didn’t internalize it until much later.

Michael Nielsen pointed me to an interesting Web page very useful for Ph.D. students and novice researchers: Networking on the Network.

In Networking on the Network, Philip E. Agre accurately describes the world of research as a network. A network isn’t good or bad… so some nodes will suck energy out of the network, and others will contribute much to it. The network is somewhat self-regulating, but it is possible, nevertheless, for bad leaders to emerge… He has this to say about the relationship between students and supervisors, which rings very true to me:

It is good to be powerful, but only in the correct sense of the term. People with the right kind of power, in my view, do not need to manipulate or control others. To the contrary, they are (sic) know that they are well-served when others grow and find their own directions, so they happily support everyone in their growth. They don’t take responsibility for others’ growth, which is a different question. They speak to the healthy part of a person, and they are concerned to draw out and articulate the brilliant ideas and worthy vision that lie beneath the surface of whatever anyone is saying. For example, they don’t try to enroll students as acolytes in their empire-building strategies, but honestly ask what’s best for each student’s own development, confident that their knowledge, vision, and connections will have an important influence on the student’s development in any case.

As you can see, he talks a lot about “empire building”. Indeed, because research is all about networking, one can, to a large extent, build an empire out of thin air, with no substance.

It seems you can either build an empire for the purpose of building an empire, because that’s your definition of success, or else you can aim to remain “free”. That’s a very powerful idea:

You build networks around the issues you care about, you grow and change through the relationships that result, you articulate the themes that are emerging in the community’s work, and through community-building and leadership you get the resources to do the things that you most care about doing. It’s true that this method will never give you arbitrary power. But the desire for arbitrary power is not freedom — it is a particularly abject form of slavery. If you can let go of preconceived plans then you are free: you can choose whom to associate with, and as you build your network you multiply the further directions that you can choose to go. You also multiply the unexpected opportunities that open up, the places you can turn for assistance with your projects, the flows of useful information that keep you in contact with reality, the surveillance of the horizon that keeps you from getting cornered by unanticipated developments, and the public persona that ensures that people keep coming to you with offers that you can take or leave. That is what freedom is, and it is yours if you will do the work.

I give Agre a lot of credit for bringing the concept of “freedom” into research. University professors will often talk about “academic freedom”. I think that freedom in research is a stronger form of freedom. You can have “academic freedom” but be a slave to the “publish-or-perish” paradigm for the power it brings you. Or else, you can “do the work”, that is, do your research as a network node, and leverage the strength of the network to make the research you wanted to do anyhow much better, much stronger.

Michael Nielsen: Principles of Effective Research: Part IV

I’ve been reading Michael Nielsen’s Principles of Effective Research, he is up to Part IV now.

He makes a very important point about research. When I started out doing research, I thought that research was about sitting in your office thinking up new ideas. God! Was I wrong!

Now, don’t get me wrong: research is not about having meetings with other researchers, or spending time chatting, or drawing UML diagrams of what is to be done, or spending weeks on funding proposals. We might do these things, but they don’t make us good researchers. But neither will sitting in your office thinking up new ideas. That’s not effective research.

On quasi-desert islands with no telecommunications, you’ll find very few great researchers. The social network doesn’t need to be physically immediate: I think you can be a great researcher even at a tiny school. And I don’t think your network should be made up mostly of students, especially not your own students.

I believe the secret to being a good researcher is to belong to a tightly knit group of solid researchers. Research is about networking. By tightly knit, I don’t necessarily mean “military-like”: I mean that you feel peer pressure all the time to do good research. This can be achieved through email, blogging, the phone… whatever the means…

A must-read paper in the Chronicle

There is a must-read paper in the Chronicle: Is There a Science Crisis? Maybe Not. The paper is about the oversupply of graduate students in science, brought about by universities that have a vested interest in producing more and more science Ph.D.s but don’t necessarily need to adjust to the job market.

It brings back memories. At the end of the eighties, they were predicting a severe shortage of science Ph.D.s. As it turns out, that was totally false, and the paper documents very well the fact that life after a science Ph.D. has gotten tremendously worse and that there is an ever-increasing number of science Ph.D.s chasing fewer and fewer jobs.

The truth is that universities are being irresponsible (and so are professors). Training highly specialized students who know how to solve only one type of technical problem has no value for society. Whatever you do, train students to have a wide range of skills. This means we need to drastically reduce the number of science Ph.D.s and focus on well-rounded students.

I’m convinced governments will soon wake up and stop listening to universities. They’ll soon be forced to look at the numbers and figure out that generously paying universities to produce more science Ph.D.s is a waste of taxpayer money.

Some beautiful quotes:

An editorial in Science this year argued: “We’ve arranged to produce more knowledge workers than we can employ, creating a labor-excess economy that keeps labor costs down and productivity high. Maybe we keep doing this because in our heart of hearts, we really prefer it this way.”

Mr. Freeman, like other economists, looks to dollars to make sense of the trends among graduate students. “They’re not studying science,” he says, “because they look and say, ‘Do I want to be a postdoc paid $35,000 or $40,000 at age 35, with extreme uncertainty working in somebody else’s lab, and maybe getting credit for my work and maybe not getting full credit? Or would I rather be an M.B.A. and making $150,000 and hiring Ph.D.’s?'”

With wages stagnant and too few jobs for engineers, adding to the work force will only make those careers less attractive, says one of the authors, George F. McClure, a retired aerospace engineer who studies employment issues for the Institute of Electrical and Electronics Engineers. “The problem is that everybody has focused on the supply side, and very few have focused on the demand side,” he says. “People in colleges and universities are concerned with maintaining the pipeline and throughput.”

In a case study, Ms. Stephan, the Georgia State economist, has analyzed the growth of the bioinformatics field, generally regarded as one of the hottest areas in science. The number of degree programs blossomed from 21 in 1999 to 74 in 2003. “There’s been a tremendous increase in the number of students in these programs,” she says. But, she adds, “we also track job announcements in bioinformatics, and they’ve been declining.” She sees parallels to other leading fields. “Everybody is talking right now that there’ll be lots and lots of jobs in nanotechnology,” she says. “I’ve not seen a convincing case that that is happening, or that it will happen.”