More on IBM versus Essbase

I wrote earlier that IBM announced it would no longer sell its DB2 OLAP Server. It looks like the move might mean that IBM plans to focus on its own OLAP product:

in fact it’s more to do with their current focus on their Cube Views product, which in his opinion is more likely to be IBM’s future OLAP direction.

So DB2 Cube Views will be the main IBM OLAP product?

IBM killed its DB2 OLAP Server

According to COMPUTERWOCHE ONLINE, IBM is killing its DB2 OLAP Server by breaking its deal with Hyperion. This somewhat surprising move raises questions as to what IBM will do in the Business Intelligence arena… partner with Oracle or Microsoft, or do something else? Maybe get out of the OLAP business altogether?

Expert Opinion: An open letter to Bill Gates

Michael wrote an open letter to Bill Gates. Michael is a smart guy.

(…), unless things have changed drastically in Redmond while I’ve been away this past year, your technical employees (and those of other companies; this is not unique to Microsoft) put in far more than 40 hours per week. It doesn’t matter how interesting that work is; I submit that there is something wrong with an industry that expects its workers, as a permanent state of affairs, to work more than the accepted standard work week. And I think students agree with this and are voting with their feet.

That’s about it. I think Michael is right that long hours are probably the main reason why women won’t go into IT.

How are we going to fix it? It seems the best way is to do exactly what students are doing: just don’t enter the profession. As fewer people come in, there will be more pressure to offer better jobs.

Outsourcing won’t fix it. Massive temporary visa programs? We don’t have those in Canada, and the USA should probably do away with them; it probably will, if only for security reasons (or paranoia).

I’m really hopeful that in 10 years, IT jobs will be much better. High-end jobs will be 40-hour jobs at very good salaries. IT is simply too important to society, and the challenges are too great, not to have our best people working there.

Why don’t women study Computer Science?

It is the fault of TV shows!!! Who knew?

Maria Klawe, Princeton University’s Dean of Engineering and Applied Science, said so.

In contrast, Klawe said the number of women in law and medicine has reached parity with men. Why? “I think there is a correlation with TV shows,” that even when Klawe was a teenager, showed women happily at work in those fields. “I think computer science is a lot more creative than the jobs doctors and lawyers have,” she said, asking why Hollywood doesn’t do more with the IT field.

Of course, some will want to distinguish Computer Science from IT. I won’t go there, because a large number of Computer Science graduates (as well as graduates from other fields, such as Physics) go into IT. It stands to reason that if you have no interest in IT, you might choose a law or medicine major instead.

Ok. But I still don’t buy the TV show explanation. What about relatively modest salaries, long hours, and the macho caffeine-induced buzz? I’m sure some women don’t mind working 90 hours a week, but if you want to attract women to the IT profession, you had better come up with better working conditions: higher salaries, a smaller outsourcing threat, more flexible hours and so on.

Maybe having fewer graduates will help working conditions. Of course, we could always object that outsourcing will quickly compensate and keep wages low. Maybe. It is hard to predict where Information Technology will take us.

However, it is true that Information Technology and Computer Science are the current drivers, or as Bill Gates put it:

Computer science is the change agent of the time.

Currently, the profession is simply not very appealing to women.

Thanks to Fred for pointing out the article.

Update: Scott vouched for Maria, so maybe the TV show idea is not so silly. Still, higher salaries would help!

Programming and college CS education

Moving up the skill ladder, embracing higher-level skills and discarding lower-level ones (where “higher” means “more abstract”), doesn’t necessarily lead to a better education; it can lead to a worse one. You should not discard lower-level skills, you should value them: they are our foundation. If you can’t use a broom, don’t use a computer.

Here are a few things you may hear on your campus about CS education:

  • Since this is not a community college, we should not teach more than one OOP language.

    Yes, of course. But even community colleges probably choose either Java or C++ or C# or (gasp!) VB. I have no problem with a school teaching only rudimentary Java, as long as the students really know Java. I don’t mean knowing the syntax or the API well… I mean being able to do non-trivial programming in it, and generally being fluent with programming: if things go wrong, knowing how to debug them even when a debugger can’t be used; knowing how to search newsgroups for help; knowing how to file a proper bug report.

    Either we are saying that a student who knows Java can easily pick up C++ on his own, or else there is something fundamentally different about C++. You can’t get around it: it is one or the other. So, can the students who graduate from your program learn C++ easily on their own? If not, you failed to teach them about modern class-based OO. Can they recognize the STL data structures and understand their characteristics immediately, or are they stuck trying to reinvent the wheel?

    In short, teaching only one language is fine, as long as you do it because picking up other languages will be easy for your students, not because programming in various languages is not important.

  • Now that we are using Java, there is no obvious excuse why we have tons of students who cannot program well… before, I thought I knew why!

    Part of the answer is what you value and what society values.

    University professors, generally, don’t know how to write industrial-strength software. They don’t know because they never had to do it and were never involved in real projects. So, they cannot teach it. Period. Note to self: I just made a lot of enemies.

    They have the same problem in the humanities or in business. Many managers can’t write 10 lines of English or French without filling them with childish sentences. We don’t notice because these people never really write anything beyond a note. Why should it matter? All they need to do is sign paperwork and attend meetings.

    So, if CS graduates are just supposed to attend meetings and sign papers, then why should they know how to program or how to write in English for that matter?

    The next logical step is… why do you need a degree at all? Oh… you need the degree for the resume… but why do you need the education that comes with the degree?

    We are back to what society values… If all that matters is to direct and manage, then fine, but I don’t think this is a safe road. It will certainly lead to a commercialization of university degrees.

    Of course, the really good students already know programming by the time they get to university, or at least they can pick it up on their own. Others will never learn programming because it is too hard. But most students won’t learn about data structures and algorithms on their own, so a university degree can really take the best students to a higher level. What to do about the students who can’t pick up programming (and in some places, that seems to be a very sizeable fraction)? Please don’t water down the education for their sake. Help them as best you can and then let them sink.

High demand for storage

In What was Sun thinking? (CNET News.com), Charles Cooper tells us that storage is now in high demand:

What with some of the confusing–make that idiotic–federal regulations governing corporate behavior that have appeared the last couple years, there’s a near bottomless demand for big storage systems. After the passage of Sarbanes-Oxley and HIPAA, CEOs are so keen on covering their posteriors these days that there’s no such thing as too much documentation. Identity and management access is the hot ticket these days as every management team worth its salt wants to tout how tough it now is on compliance.

Wow. Seems like it is going to be a nice era for data warehousing and OLAP, no?

The Geomblog: Do we really need more students in CS?

Suresh jumps into the Do we really need more students in CS? debate. He concludes:

On the one hand, you can make a degree program produce graduates that are more employable, but you veer dangerously close to the ‘vocational training’ edge of the cliff, or you make a degree program more grounded in rigorous training, (essentially what we have now), and continue to lose students to other programs because the CS degree they could get is not ‘marketable’.

Actually, Suresh, I think this has already happened: the CS programs might not be getting more marketable, but new programs are being created.

Solution 1: Offer engineering degrees

The engineering side of computer science has been growing stronger. At least in Canada, there is now a large number of software engineering degrees. Even tiny schools now offer both CS and software engineering degrees. UQÀM has a dual degree (CS and software engineering).

Solution 2: Offer IT degrees

The other solution, at the other end of the spectrum, is to push IT degrees. Companies will not outsource all critical IT functions: companies will always want to get a competitive edge by using some home-grown solution, even if it is built almost entirely from existing software. Even if the grunt work is done in Asia, you need people who can draw a database schema and understand where the data is at all times. You need people who can hook web services together. You need people who can talk effectively about IT to the rest of the company. The guy who aced algorithms but can’t give a good talk or listen to a user is useless for such a job.

This is where I haven’t seen much growth. It is plagued by many problems in universities… who wants to say he is an “IT professor”? And what does the phrase mean? I don’t know. There are many IT programs out there, some of them very good, but most were built out of scrap from other programs (CS and business), or at least they feel like it.

So, Suresh, I think you are right. We are at the end of the beginning. New programs, such as IT and software engineering, will grow stronger in the coming years. It seems likely that CS degrees will evolve much like mathematics… attracting students interested in teaching CS or doing research in CS… but I don’t think CS will ever grow back to where it was. CS courses will be service courses for IT and software engineering programs.

(My predictions, as always, are worth the paper they are printed on.)

Do we really want more students interested in CS?

Expert Opinion talks about whether the current drop in enrollment is a tragedy or not:

From an employee’s perspective, fewer people seeking jobs is a good thing. And, frankly, while having lots of graduate students to shovel code may help some university research, I’m not convinced that most of the system building that results is truly significant. Interesting, yes, even neat. But not likely to have any significant impact. University faculty should have their students’ interests in mind when talking or writing about the job market, and I’m not sure we do when we talk of declining enrollment as a bad thing, or, even worse, a crisis. Declining enrollment is a rational response on the part of students to a significant drop in the job market.

There are many interesting bits in his posts. For one thing, he shows that the decline comes, in large part, because women have now almost completely left the field. I think he nails the real reason why they have left:

I have a radical idea: how about Microsoft leading the way in instituting a real 40-hour work week? How about Microsoft getting rid of the practice of hiring “temporary” technical staff?

Yes, that’s right. Jobs where you are required to work 90 hours a week for an average salary are not going to attract women. Men are macho and stupid: we don’t mind dying at 50 of a heart attack, away from our loved ones; women aren’t so silly and they require time with their families.

Second, he clearly states that having fewer graduate students is not really going to hurt research. This is very important. I’ve said it again and again on this blog: stop claiming that we urgently need more graduate students. The lie is everywhere around us, so much so that we can’t see it anymore.

ACM Queue – A Conversation with Tim Bray

This is brilliant! ACM Queue is publishing an interview with Tim Bray (of XML fame) done by Jim Gray (of data cube and database transactions fame). Tim now runs Web technologies for Sun Microsystems. Tim Bray basically says that RDF and the Semantic Web are a no-go, but we already knew that’s what he thought.

However, there are many cool quotes. Try to find the pattern in these:

My CEO, Tom Jenkins, agreed to turn me loose to work on it myself, and I spent six months basically doing nothing else and built the crawler and the interfaces. (…) I lost weeks and weeks and weeks of sleep, hacking and patching and kludging to keep this thing on the air under the pressure of the load.

Lark was the first XML processor, implemented in Java. I wrote it myself. I used it also as a vehicle to learn Java. It shipped in January 1997 and actually got used by a bunch of people. (…) So, I let Lark go. It was fun to write and I think it was helpful, but it hasn’t been maintained since 1998.

Some of the people working in syndication were extremely upset about XML’s strictness, saying, “Well, you know, people just can’t be expected to generate well-formed data.” And I said, “Yes they can.” I went looking around and found that there are some quite decent libraries capable of doing that for Java and Perl and Python, but there didn’t seem to be one for C.

So sitting on the beach in Australia I wrote this little library in C called Genx that generates XML efficiently and guarantees that it is well-formed and canonical.

See the pattern? Tim Bray is a hacker with a degree in mathematics and computer science. [Tim doesn’t have a graduate degree.] And he changed the world.

But his life was not always easy:

Microsoft really went insane. There was a major meltdown and a war, and I was temporarily fired as XML coeditor. There was an aggressive attempt to destroy my career over that.

(Note that the interviewer, Jim Gray, works for Microsoft!)

The rise of social consciousness in cyberspace

In an earlier post, I tried to predict what the next Gutenberg printing press or the next Web would be like. I predicted that ubiquitous massive storage would be the next big technological advance and that it would bring three new challenges: the need to bring data warehousing to the masses, the need to bring security to the masses, and the need to move all social software to the Wikipedia level and beyond.

Scott had this comment which is worth repeating here:

I think the third — the rise of social consciousness in cyberspace — is right on. Of course, it’s hard to be more specific. The one thing I’m fairly sure of is that the Next Big Thing will be familiar — it won’t be part of some alien new world. It will be a reflection of what people have been for a long time. What is important to people? Mainly, we communicate with each other. We communicate useful get-through-the-day facts and longer range planning. We gossip and small-talk to maintain or strengthen social relationships. And we produce and consume art to fulfill some deeply ingrained need to find resonance with other people. (Oh yes, and pornography, which is kind of in a class by itself.) So far, the big things in IT have all been direct reflections of those social needs: the Web, e-mail, instant messaging, cell phones, Napster/KaZaA, Skype, “social networking”, iTunes, video-on-demand, etc. I expect this web of communication to mature into something in which reputation and recommendation are pervasive — in a way that mirrors practices that we are already comfortable with, but with dramatically increased efficiency and/or accessibility. The open question for me is whether the increase in efficiency or accessibility will be sufficient to have an impact approaching that of Gutenberg’s press.

Inexpensive ubiquitous mass storage is closer than you think!

I started using Google Mail (GMail) last year because I wanted to be able to read my mail from everywhere, all the time. Google offered me 1000 MB of free storage and one of the greatest user interfaces for a mail client. Oh! Did I mention it has nearly perfect spam filtering, without any effort on my part?

I wondered what would happen when I reached 1000 MB of mail. For me, that’s about 2 years of incoming mail, maybe a bit less.

Well, my account now has 2057 MB of storage. That’s about 3 years’ worth of storage. It seems that Google increases your limit as the need arises.

Technological singularity

I just read the Technological singularity entry on Wikipedia. It is interesting, first, because Wikipedia is a free-for-all encyclopedia yet the entry is of really high quality and, second, because the topic of a technological singularity is fascinating in itself.

In futurism, a technological singularity is a predicted point in the development of a civilization at which technological progress accelerates beyond the ability of present-day humans to fully comprehend or predict. The Singularity can more specifically refer to the advent of smarter-than-human intelligence, and the cascading technological progress assumed to follow. Whether a singularity will actually occur is a matter of debate.

Of course, I doubt anyone understands what intelligence is, despite centuries of research and the billions of dollars invested in AI research over the last 30 years. So, I’m not expecting a big breakthrough in AI research soon. However, the Wikipedia entry correctly points out that the singularity could occur through other means: nanotechnology or some other technological breakthrough which allows us to greatly accelerate our rate of technological progress.

I expect the Internet to boost technological progress. To me, it feels like it enhances my intelligence greatly. (I define intelligence as my capacity to solve problems.) However, in order to reach a singularity, you need a much more profound breakthrough: you need to enter a tight loop where technological breakthroughs lead to faster R&D, which in turn leads to more R&D-enhancing technological breakthroughs, and so on. The problem is that right now, R&D is done by humans, and humans are limited: we can only adapt so fast to change. Hence, you need to either improve human beings or create new intelligent beings.
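To make the “tight loop” concrete, here is a standard toy model (my own back-of-the-envelope illustration, not something from the Wikipedia entry). If progress is merely proportional to the current technology level, dT/dt = k·T, you get ordinary exponential growth: fast, but never infinite. If better technology also speeds up the improvement process itself, you get something like

dT/dt = k·T², whose solution T(t) = T₀ / (1 − k·T₀·t) blows up at the finite time t* = 1 / (k·T₀).

That finite-time blow-up is the mathematical caricature of a singularity, and it only appears when the feedback loop closes. Tools that make us somewhat more productive, like the Internet, mostly increase the constant k in the first equation; they speed things up without ever producing the blow-up.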

I think that AI is currently out of reach, and probably not desirable: do we really want to create intelligence beyond our own? Books, IT, Google and Wikipedia help make us smarter, quite a bit so, but I just don’t see the exponential growth in intelligence that we would need to reach a technological singularity. Or maybe it is simply hard to see because we are living through the last few years before the singularity?

Best Degree to Pair w/ a B.Sc. in Computer Science?

Slashdot is running a discussion on the Best Degree to Pair w/ a B.Sc. in Computer Science. The answers are interesting and range from an MBA to Mathematics. These two seem to be the most frequent answers. I was not surprised to see the MBA there, but Mathematics was a bit more surprising. Here are two quotes from the posts:

If you want to be a tech for the long haul, perhaps a degree in mathematics.

Most pairable degree with Computer Science: Mathematics. Affinity for math tells employers you’re capable of high level, abstract thought.

This being said, I have a lot of respect for management people, but if everyone becomes a manager, who will be left to manage? The dumb people? Also, with increasingly flat hierarchies, the need for managers should not be so great. Of course, an MBA teaches you more than just management skills (I hope).

Job Market for CS Students

Yuhong worries about CS students. She points to two recent articles on the CS job market.

Here is what she has to say based on the people she talked to:

I recently talked to some master CS graduates. (…) They both said programming jobs are no more and many new hires are master graduates.

Here are two quotes from the second article she cites:

There are certain areas in the technology sector that are thriving. Demand is high for those who specialize in network and IT security.

Technology services companies like IBM and BearingPoint are hiring in the United States, though they are increasingly looking for employees who can combine technology chops with business savvy.

The message is quite clear, I believe. If you want to train yourself or your students to produce software (programming or software engineering), you had better be damn good, because the job market is not there anymore. Will the jobs come back? Automobile workers in North America are still waiting for the jobs sent to Mexico or elsewhere to come back. Now, programming and software engineering are not useless skills, far from it, but it might be a better strategy to aim for a business job where your programming or computer networking skills can be put to good use, for example. It seems that the job market is moving toward information technology (security, networking, using the right technology at the right time, understanding the implications of a given technology for business).

From British Columbia comes Open Source Academic Publishing Software

It seems like a bunch of schools in British Columbia got together to develop open source academic publishing software.

The University of British Columbia’s Public Knowledge Project (PKP), the Simon Fraser University Library and SFU’s Canadian Center for Studies in Publishing (CCSP) have formed a partnership to support the maintenance and ongoing development of the internationally acclaimed open source software developed by PKP.
(…)

At the heart of the partnership are three major software programs. Open Journal Systems (OJS) provides online management for journal submissions, peer reviewing, editing, and online publishing and indexing. Open Conference Systems (OCS) manages conference registration, programming and paper submission and publication. The PKP Harvester (PKPH) is used to automatically create an online index of materials from a variety of online sites including journals and repositories such as those housed at the Canadian Association of Research Libraries, which are harvested and reside on an SFU Library server.

(I got this through Downes’.)

Die trackback, die!

From now on, all trackbacks to this blog are moderated thanks to the moderate-trackbacks plugin. Spammers really have a lot of time to waste. Good thing the WordPress community is very strong and fighting back.

Now, the simplest thing is: do not use trackback. It is a weak protocol (in a spam-infested world) and I will probably not moderate the queue very often, especially if it gets very long. Ping my blog instead (the pingback spec makes spamming more difficult).
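For what it is worth, here is why pingback is harder to spam than trackback: a trackback is just a plain HTTP POST with a few form fields, which anyone can forge in bulk, whereas a pingback is an XML-RPC call (pingback.ping) and the receiving blog is expected to fetch the source page and verify that it really links to the target before accepting the ping. Below is a minimal sketch of the client side in Python; the URLs and the endpoint are made-up examples, and a real client would first discover the endpoint from the target page’s X-Pingback header or its <link rel="pingback"> tag.

    # Minimal pingback client sketch (Python 3, standard library only).
    # All URLs below are hypothetical examples.
    import xmlrpc.client

    source = "http://example.org/a-post-that-links-here"    # page containing the link
    target = "http://example.org/blog/archives/some-post"   # page being linked to
    endpoint = "http://example.org/blog/xmlrpc.php"         # normally discovered via X-Pingback

    server = xmlrpc.client.ServerProxy(endpoint)
    try:
        # The receiving blog is supposed to download `source` and check that it
        # actually contains a link to `target` before recording the pingback;
        # that verification step is what makes blind spamming harder than with trackback.
        print(server.pingback.ping(source, target))
    except xmlrpc.client.Fault as fault:
        print("Pingback refused:", fault.faultCode, fault.faultString)

A determined spammer can still host a real page that links to you, but he can no longer forge thousands of pings without putting up real pages, which raises the cost considerably.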

Update: Downes has a recent post on a related topic: Trackback is Dead, Use PubSub (though I published this post before he published his!).

Update 2: about 2 hours after installing the plugin, I have already deleted 18 trackbacks. That is 9 spams an hour. And mine is a low-traffic web site…

The Google Browser?

Through Harold, I found this quote by Seth Godin regarding recent events (Google hired a key Firefox developer):

1. Running a successful open source effort is a great idea. I can’t think of an individual who has invested the time and not had a great personal outcome as well.

2. Google understands what I failed to persuade Yahoo! of a long time ago–owning the browser is a home run. Microsoft has botched their ownership of IE, because they think like bullies, and you can’t bully consumers into doing what they don’t want to do. The idea of a Google browser is powerful from both a user and a commercial perspective, mainly because Google’s culture will make it work.

JAWS Screenreader Adaptation for Mozilla Firefox

From Catherine Roy, I learned that there is now a screenreader adaptation for Mozilla Firefox. This is an essential tool for visually impaired Web surfers. The adaptation to Firefox is under the GPL, but JAWS itself is a commercial (Windows-only?) tool.

What do visually impaired Linux users do? I know KDE has an accessibility initiative, but how does it compare with the Windows or Mac universe? Are there screenreaders for Firefox under Linux? I suspect that Linux (or even the Mac) is probably behind in this respect.

Update: it looks like Fire Vox could be a better alternative.