All Bits Considered: data to information to knowledge

10 Feb 2011

10 Commandments of Error Handling

  1. No method shall return a “success/failure” code.
  2. All methods shall return results (“void” where appropriate).
  3. All operations that could potentially result in an exception must be organized into [try…catch…finally] blocks in the method’s body.
  4. An exception should be thrown to the caller in case of a run-time error.
  5. An exception must be specific to the error; no generic exceptions should be raised (and, where unavoidable, they should be wrapped into actionable exceptions of a specific type).
  6. All exceptions must be logged.
  7. Exceptions should be caught only if they are of a specific type and can be handled in a meaningful way.
  8. Any exception that could potentially be thrown by a method must be documented on that method.
  9. Any exception not caught at the top caller level must be logged and considered fatal. Normally this results in termination of the execution, though it should be dealt with on a case-by-case basis.
  10. Use error handling and logging frameworks (a Java sketch illustrating several of these rules follows below).
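
To make these commandments concrete, here is a minimal Java sketch. The class and exception names are illustrative, invented for this example, and the standard java.util.logging logger stands in for a full logging framework. It shows a risky operation organized into a try/catch block, a generic low-level exception logged and wrapped into a specific, actionable, documented exception, and no success/failure return code in sight:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.logging.Level;
    import java.util.logging.Logger;

    // A specific, actionable exception type (commandment 5).
    class ConfigurationLoadException extends Exception {
        ConfigurationLoadException(String message, Throwable cause) {
            super(message, cause);
        }
    }

    class ConfigurationLoader {
        private static final Logger LOG =
                Logger.getLogger(ConfigurationLoader.class.getName());

        /**
         * Returns the raw bytes of the named configuration file.
         *
         * @throws ConfigurationLoadException if the file cannot be read
         *         (commandment 8: the exception is documented on the method).
         */
        byte[] load(String path) throws ConfigurationLoadException {
            try {
                // The operation that may fail sits inside try/catch (commandment 3).
                return Files.readAllBytes(Paths.get(path));
            } catch (IOException e) {
                // Commandments 4-6: log the error, wrap the generic low-level
                // exception into an actionable, specific one, and throw it to
                // the caller instead of returning a failure code (commandment 1).
                LOG.log(Level.SEVERE, "Failed to read configuration: " + path, e);
                throw new ConfigurationLoadException(
                        "Cannot load configuration from " + path, e);
            }
        }
    }

A caller would then catch ConfigurationLoadException specifically (commandment 7), for example by falling back to default settings, and let anything it cannot handle meaningfully propagate to the top level, where it is logged and treated as fatal (commandment 9).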

 

Microsoft Error Handling Guidelines for C#

Three Rules for Effective Exception Handling in Java

15 Oct 2010

Fishing in tide pools

One couldn't help noticing that the computer technology job market has improved dramatically: job postings on sites such as Dice.com and Monster.com have surged by more than 50% since the beginning of the year, and we are all fishing for “talent” (see my previous post) in the same pool of candidates. The problem is that the pool is shrinking, which brings about my analogy of tides and, by extension, tide pools.

In the lean times – the “high tide” times – the software developer pools were swamped with developers of all calibers, and employers got used to receiving stacks of resumes from high-quality candidates. This might not be the case in the near future as demand for good developers picks up. The waters have receded, and we are no longer fishing in the ocean; we are fishing in a tide pool. The landscape is changing: while I do not expect the frenzy of the late ’90s, I see the tables being turned on the employers by the developers with the right skill mix.

Occasionally, a low tide leaves some bigger fish stranded in a tiny pool, where they might suffocate and die… But let’s not go there: analogy is a powerful tool and should be used with extreme caution. Used in moderation, it illuminates; taken a notch too far, it leads down a slippery slope of reality substitution.

Update (11/15/2010): there must be something in the air. The Wall Street Journal has just published an article musing on the future defections of overqualified candidates once the economy improves.

15 Mar 2010

Native speakers vs. Phrase-book savvy

It is not uncommon to hear from developers during an interview: “I do not remember all these details; I can always look them up”… And this might be a valid response – up to a point.

It used to be simple: all one had to do was learn the 32 keywords of C, learn the grammar, and then use one's imagination to create applications. Early on it was understood that re-using code would increase productivity, so the concept of a library was born, later augmented by MFC and OWL… The trend continued towards ever greater modularity: the foundation classes were eventually integrated into run-time frameworks, such as the Java Foundation Classes (JFC) and the Common Language Runtime (CLR) in .Net… Suddenly, a developer had thousands of functional libraries built by professional teams, and the incentive to re-invent the wheel diminished accordingly…

Would it be a reasonable expectation for a developer to know every single one of the classes rolled into JFC or the CLR? No. Would it be reasonable to expect them to be familiar with the hierarchy of the classes? Yes. A developer needs to know how and where to find a piece of code to accomplish what is needed.

To be productive in either Java or .Net, a developer must have working knowledge of the most commonly used classes, plus a good understanding of where to find the rest. He (or she) must control the urge to create something that might already exist in the environment, and leverage the existing capabilities to create new ones.
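
A trivial Java illustration of the point (the example is mine, not drawn from any particular codebase): the urge to hand-roll yet another sort routine should lose to the knowledge that the standard library already provides one:

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.Collections;
    import java.util.List;

    public class ReuseExample {
        public static void main(String[] args) {
            List<String> names =
                    new ArrayList<String>(Arrays.asList("Mallory", "Alice", "Bob"));
            // Knowing where to look in the class hierarchy means knowing that
            // java.util.Collections already does this; no need to re-invent the wheel.
            Collections.sort(names);
            System.out.println(names); // prints [Alice, Bob, Mallory]
        }
    }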

When hiring a developer, I expect him to be fluent in the technology; at the same time, I am extremely wary of the “native speakers” – someone so immersed in a technology (Java, C#) that he forgets the rest of the world. On the other side of the spectrum – also to be avoided – are the phrase-book tourists, who know a dozen cookie-cutter recipes and lack either the ability or the curiosity to become fluent in the technology. The former might indicate a technology bigot, so set in his ways that he would never be able to work with any other technology the business might require in the future; the latter might be an indication of a “fly-by-night operator” – someone chasing “hot” technologies to make a quick buck (or suffering from the programming variety of ADHD).

NB: An interesting perspective taking my point ad absurdum can be found in this blog post: How to Find Crappy Programmers.

P.S. I emphasize the word “might” in the above paragraph, as I have met – on rare occasions – “native speakers” who spoke several disparate technologies and were open to learning new ones… and some phrase-book-savvy developers can be made to see the light, given sufficient time and money. All in all, I would consider a “native speaker” a highly specialized “tool” that could be very successfully utilized on certain types of projects, while the “phrase-book savvy” should not be relied upon for tasks that might require even a modicum of creativity.

P.P.S. According to some sources, there are ~988,968 words in the English language; other sources imply that 171,476 words are currently in use.

Yet to be fluent you only need a fraction of that. Here's one bold attempt at classification:

3,000-4,000 words: sufficient for reading newspapers and magazines fluently.

8,000 words: all you will ever need; more words are not necessary in order to communicate freely and read all types of literature.

10,000-20,000 words: the active vocabulary in the mother tongue for cultivated Europeans.

19 Jan 2010

Would you recommend programming as a career to your son?

For me it is a bit of a rhetorical question: I am raising two sons who have no inclination to follow in dad’s footsteps (no surprise there – I did not want to be an electrician like my dad either; instead I went on to study physics and chemistry).

But here is an interesting blog post pondering the issue and offering a piece of advice on where the sweet spot might be: Next-Gen Jobs by Jim Ericson.

He believes that analytics and metadata will become the next big thing, and a sound career choice for the next generation.

I have my doubts; I still remember a piece of advice from a book on VBX programming I read in the early 1990s: “learn C++ and Visual Basic and you do not have to worry about where the next meal would come from”. Well, good luck to all the programmers out there who followed it. It did seem like a good piece of advice at the time: VBX was not easy (arcane C/C++ code), and demand seemed inexhaustible (VBX controls allowed one to break through the limitations of Visual Basic while keeping an easy development environment for programming the Windows GUI)… Then the OCX standard came along and morphed into OLE/COM (later supplanted by the .Net model), and on top of that one could use VB itself to implement the spec, with no need for an additional programming language…

Now, one might argue that this was merely a technology, while analytics is a Profession… Admittedly, there is something to it. Still, I believe that the specialties of the future are not spelled out yet, and the most promising ones span more than one domain. (There was a short SF story in which career counselors randomly matched specialties, and the student was supposed to figure out what the profession might actually be, e.g. “forensic linguistics” or “linguistic taxidermy”… I don’t remember the author; it might have been Robert Sheckley.)

Who had ever heard of an SEO professional before the advent of Google?

26 Dec 2009

Profession vs. Technology

Whenever I hear a statement linking one’s profession to a specific technology (“a .Net programmer”, “a Java programmer”, “a SharePoint developer”, “a Ruby-on-Rails specialist”…), I cringe and recall this story: Profession by Isaac Asimov, first published in 1957 (and which I read for the first time in the late 1970s…).

I recommend this story as required reading for every aspiring programmer, budding scientist, or high school student.

The story is about a boy who aspires to be a Programmer, and who discovers along the way that there is more to it than just a technology to master… The full text of the story (in pdf format) is available here; for those allergic to science fiction, here’s a synopsis of the story on Wikipedia.

To summarize: a technology is too narrow to define a profession; maybe this is why technology certifications have become virtually worthless. I do not want a “Java Programmer” on my team; I want a programmer who can use Java – or whatever technology we might be using at the moment – to create a solution.

Technologies come and go, no matter how hot they may seem today. What stays is a refined body of knowledge: patterns and paradigms, rules to follow and rules to be broken… Nothing stays the same; all axioms have to be constantly re-evaluated.

The quest is not to achieve eternal bliss where everything is set once and for all, but to move forward with imperfect solutions optimal for that particular moment in time… The catapult might have been the pinnacle of military technology in its time, and it has been a historical curiosity for several hundred years now; but the lessons learned percolated into ballistics, structural engineering, and rocket science…

15 Dec 2009

Speeding up a 32-bit SharePoint development environment

SharePoint 2007 (MOSS) requires a Windows Server platform to run on (yes, I am aware of the Bamboo hack). This means either setting up a central server for the developers, equipping the developers with Windows 2003 Server machines, or resorting to virtualization.

 

It is the latter that I am talking about speeding up here. 32-bit Windows environments have a hard limit: they can address only up to 4GB of physical memory (2^32 bytes).

If you are using virtualization to run Windows 2003 Server on 32-bit Windows XP, your machine slows to a crawl. If for some reason (and I have tons of those right now) you cannot switch to a 64-bit development environment, you can go for a 64-bit HOST environment (Windows XP or 7) and run a 32-bit virtual GUEST development environment (Windows 2003 Server).

Using VMware Server (free), you’d be able to allocate up to 3600MB of RAM just to your virtual machine (and leave some for the host system, say, another 4GB). My non-scientific profiling of 64-bit Windows 7 (6GB of RAM total) running a 32-bit virtual Windows 2003 Server shows an improvement in performance by a factor of 10 (in some operations) over 32-bit Windows XP running a 32-bit virtual Windows 2003 Server.
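
For reference, the memory allocation lives in the virtual machine’s .vmx configuration file. A sketch, assuming VMware Server’s configuration format (the display name is made up, and guestOS is shown with the identifier for a 32-bit Windows Server 2003 Standard guest; adjust both to your setup):

    displayName = "MOSS-dev"
    guestOS = "winnetstandard"
    memsize = "3600"

memsize is specified in megabytes; the same value can also be set through the VMware Server console UI instead of editing the file by hand.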

To sum it up: if you must do 32-bit SharePoint development, do it in a 64-bit host environment to work around the 4GB RAM limitation (read: incessant paging).

6 Dec 2009

Caveat emptor: value of non-transferrable skills

One piece of advice to computer programmers making their living in the business world that I have heard (and given out) over and over again, and which – with rare exceptions – has proved to be true so far, is this: "stay away from custom enterprise applications".

The logic is as follows: a system designed and implemented in-house provides an opportunity to learn invaluable lessons in design and implementation – if you are lucky enough to be part of the team tasked with it, that is. Going through the start-to-finish cycle of an enterprise system endeavor will give you everything you need to further your programming career: designing and implementing a custom application and integrating it into the hodge-podge of other systems usually found in a corporate environment. So far, so good.

Now the system is put in production, and you are charged with maintaining and enhancing it. Herein lies the trap: becoming so specialized that the skills you master are irrelevant to the rest of the world. It might be the technology (and the programming landscape is full of the charred remains of once-hot technologies); it might be a business model that was happily left behind by everyone but the company you happen to work for… Whatever the reason, you are stuck with non-transferrable skills.

One consolation could be that you might become indispensable to your employer: no one in the entire world would be able to replace you. But the fallacy of this argument becomes apparent as soon as you consider the possibility that the arcane custom system you know so intimately could be scrapped and replaced with a shiny new third-party offering, or that the business process the system supports could become obsolete and be replaced with a new one; nor would it help you if your company went bankrupt, got acquired, or had the entire system outsourced to the Moon… This is illusory job security.

Now, it could be argued that one can pick up bits and pieces of transferrable skills in problem solving, code maintenance, system integration, etc. But let's face it: you are in danger of becoming a one-trick pony, with all that this implies.

These considerations also apply to those working in software development companies, though to a somewhat lesser degree. No matter how hot the technology your company is using right now, it too will become obsolete. You might derive immense satisfaction from working with some bleeding-edge technology, but if you also make a living with it, would it not make sense to hedge your bets? Mainstream technology also undergoes changes and becomes niche before disappearing into oblivion (have you checked the demand for Sun Solaris administrators lately? Informix DBAs? dBase programmers?…), but it usually gives ample signs of decline, and more often than not it is picked up by some other mainstream company. Bleeding-edge fads tend to disappear overnight, because they rely on small teams of dedicated professionals; once they move on, the odds are that the technology will wither and die…

This is a reflection of a more general dilemma that goes beyond programming: generalist vs. specialist. It goes back to my previous post… Where do you draw the line?

I do have some ideas on the topic, and will offer my meandering advice in one of the follow-up postings.