All Bits Considered: data to information to knowledge


Would you recommend programming as a career to your son?

For me it is a bit of a rhetorical question - I am raising two sons who have no inclination to follow in dad’s footsteps (no surprise there – I did not want to be an electrician like my dad either; instead I went on to study physics and chemistry).

But here is an interesting blog post pondering the issue and offering a piece of advice on where the sweet spot might be: Next-Gen Jobs by Jim Ericson.

He believes that analytics and metadata will become the next big thing, and sound career choices for the next generation.

I have my doubts; I still remember a piece of advice from a book on VBX programming I read in the early 1990s: “learn C++ and Visual Basic and you will not have to worry about where the next meal is coming from”. Well, good luck to all the programmers out there who followed it. It did seem like good advice at the time: VBX was not easy (arcane C/C++ code), and demand seemed inexhaustible (VBX controls let developers break through the limitations of Visual Basic while keeping an easy development environment for programming Windows GUIs)… Then the OCX standard came along, which morphed into OLE/COM (and was later supplanted by the .NET model), and on top of that one could use VB itself to implement the spec - no need for an additional programming language....

Now, one might argue that this was merely a technology, while analytics is a Profession.... Admittedly, there is something to it. Still, I believe that the specialties of the future are not spelled out yet, and the most promising ones span more than one domain (there was a short SF story in which career counselors randomly matched specialties and the student was supposed to figure out what the profession might actually be, e.g. "forensic linguistics" or "linguistic taxidermy"... I don't remember the author; it might have been Robert Sheckley).

Who had ever heard of an SEO professional before the advent of Google?


Caveat emptor: the value of non-transferable skills

One piece of advice to computer programmers making their living in the business world that I've heard (and given out) over and over again, and which - with rare exceptions - has proved true so far, is this: "stay away from custom enterprise applications".

The logic is as follows: a system designed and implemented in-house provides an opportunity to learn invaluable lessons in design and implementation - if you are lucky enough to be part of the team tasked with it, that is. Going through the start-to-finish cycle of an enterprise system endeavor will give you everything you need to further your programming career – designing and implementing a custom application and integrating it into the hodge-podge of other systems usually found in a corporate environment. So far so good.

Now the system is put in production, and you are charged with maintaining and enhancing it. Herein lies the trap: becoming so specialized that any skill you master is irrelevant to the rest of the world. It might be the technology (the programming landscape is full of the charred remains of once-hot technologies); it might be a business model that was happily left behind by everyone but the company you happen to work for... Whatever the reason, you are stuck with non-transferable skills.

One consolation could be that you might become indispensable to your employer: no one in the entire world would be able to replace you. But the fallacy of this argument becomes apparent as soon as you consider the possibility that the arcane custom system you know so intimately might be scrapped and replaced with a shiny new third-party offering, or that the business process the system supports becomes obsolete and gets replaced with a new one. Nor would it help you if your company goes bankrupt, or is acquired, or the entire system is outsourced to the Moon... This is illusory job security.

Now, it could be argued that one can pick up bits and pieces of transferable skills in problem solving, code maintenance, system integration, etc. But let's face it: you are in danger of becoming a one-trick pony, with all that this implies.

These considerations also apply to those working in software development companies, though to a somewhat lesser degree. No matter how hot the technology your company is using right now, it too will become obsolete. You might derive immense satisfaction from working with some bleeding-edge technology, but if you also make your living with it, would it not make sense to hedge your bets? Mainstream technology also undergoes changes and becomes niche before disappearing into oblivion (have you checked the demand for Sun Solaris administrators lately? Informix DBAs? dBase programmers?...), but it usually gives ample signs of decline, and more often than not it is picked up by some other mainstream company. Bleeding-edge fads tend to disappear overnight, because they rely on small teams of dedicated professionals; once those people move on, the odds are that the technology will wither and die...

This reflects a more general question that goes beyond programming: generalist vs. specialist. It goes back to my previous post... Where do you draw the line?

I do have some ideas on the topic, and will offer my meandering advice in one of the follow-up postings.