all bits considered – data to information to knowledge

15Aug/11

In praise of sloppiness

Alexander Pope once remarked: “To err is human; to forgive, divine.”

He might just as well have been speaking about the ultimate solution to the uneasy relationship between computer programmers and end users: the former are engaged in the arcane art of software, the latter just want to get their job done.

Here are two quotes that at first blush appear to be at odds with each other, yet are fully reconciled by Pope’s quotation:

- Adam Bosworth

“That software which is flexible, simple, sloppy, tolerant, and altogether forgiving of human foibles and weaknesses turns out to be actually the most steel-cored, able to survive and grow, while that software which is demanding, abstract, rich but systematized turns out to collapse in on itself in a slow and grim implosion.”

- Niklaus Wirth

"Good enough software" is rarely good enough. It is a sad manifestation of the spirit of modern times, in which an individual's pride in his or her work has become rare."

The above quotes are two sides of the same coin.

Software must be designed for humans; it must be forgiving, easy, abuse-tolerant... Designing to these standards requires the highest level of rigor and craftsmanship. Programmer-oriented software that requires the user to adapt to the idiosyncrasies of the programmer's thinking is the result of sloppy programming, and it is never good enough, let alone divine.

7Nov/09

In the beginning there was HTML…

In the beginning there was HTML, and Marc Andreessen saw that HTML was good, and created the Netscape browser. And while the browser was buggy and slow, it changed the world, and begat (spiritually) Firefox, Safari, IE, Opera, Chrome and others. Somewhere along the road the geeks were expelled from paradise, and business moved in to cash in on the Internet…

It used to be that one could make a living coding HTML by hand, and a number of companies were created to automate the task – injecting standard headers, visualizing website structure, propagating changes, editing JavaScript, etc.

Meanwhile, dynamic content generation went from CGI scripts (Unix shells, C, Perl, etc.), through server-side scripting languages, all the way to ASP, PHP and the like… Java made its debut on the web via applets, followed by Microsoft's knee-jerk reaction of embedding ActiveX controls in IE (opening the floodgates for security threats in the process), before finding its way into servlets/portlets and other *lets, and their Microsoft counterparts – ASP.NET (with code-behind), web controls and web parts…

At every step the community opted for ever more tightly controlled development (naming conventions, standardized logging, error handling, continuous integration), ever greater levels of abstraction (reusable libraries, IDEs, tools, patterns, frameworks), and ever greater automation of every stage of software development – from coding to building to testing to deployment to maintenance… Enterprise-level software moved from monolithic, tightly coupled monstrosities to distributed, component-based systems, from tightly coupled binary contracts to loosely coupled Service Oriented Architectures…

I believe we are entering a phase in which ECM – Enterprise Content Management – systems are becoming platforms within which data and applications come together, where the composability of components allows functionality to be strung into ad-hoc applications enabled by all of the above: a technology-independent, loosely coupled, location-agnostic nirvana!…

Now, back to Earth. All these concepts have to live somewhere. A number of ECM systems have sprouted in recent times: one from every heavyweight (Microsoft, Adobe, IBM, Oracle), plus upstarts (Alfresco, for instance). None is perfect; all have a dizzying array of features, are based on different technologies (.Net, Java, Flash/C, etc.) and come with arcane programming models…
Yet I’d argue that they are a step in the direction I’ve just plotted in my ramblings above, and are here to stay. This is just the beginning, where all these trends converge to produce a qualitative leap forward; one can tell by how the forerunners are trying to subvert it by fencing you in, trying to be everything at once… But the ship has already sailed: ultimate distributed computing (SOA) has found its frameworks, moving software development away from craftsmanship and into the industrial age.

23Sep/09

Who needs a Software Factory?

Software [Development] Factory – a concept most eloquently formulated by Jack Greenfield – did not generate much of a following. The first book, Software Factories: Assembling Applications with Patterns, Models, Frameworks, and Tools, authored by Jack Greenfield (also see the MSDN article), Keith Short, Steve Cook, and Stuart Kent, was published in 2004, followed by Practical Software Factories in .Net in 2006, with but a single title advertised for release in 2010 (Agile Software Factories by Damon Wilder Carr).

Apparently it did not strike a chord with the majority of software architects and developers out there. Part of the reason could be developers' resistance to commercializing and commoditizing the high art of software development; some of it, I believe, could be attributed to a misunderstanding of its applicability domain. Simply put, techies do not need it, and business folks do not understand it.

The promise of the SDF is to deliver “economy of scope”, where “assembly lines” produce variations of an archetypal product/template; it is therefore inherently not applicable to one-off projects. I see the SDF as closely aligned with Software Product Lines (e.g. Software Product Lines: Practices and Patterns), where business domain knowledge meets “patterns, models, frameworks, and tools”.

There can be no discussion about the value of patterns, models, frameworks, and tools; these are a given for any software project worth the name, and they are firmly established in software development practice. Move up one notch on the abstraction ladder and you have all of these wrapped up into an archetypal template from which you can generate a variation of a domain-specific product. One way to use such a “wrap-up” could be through a DSL – a Domain-Specific Language used by business domain experts. Say you – a domain expert – are assembling a financial application that processes loan applications: would it not make sense to use a language that finance people understand, with keywords such as “loan”, “mortgage”, “originate”, “fee”, etc.? Just as most developers shudder at learning the intricacies of APY (Annual Percentage Yield) versus APR (Annual Percentage Rate), the finance folks would not want to be caught dead talking about threads, mutexes, arrays and data types... The SDF paradigm allows both to coexist peacefully: technology aficionados take care of the underlying machinery, and the business can assemble the applications it needs without having to translate the requirements – time and again – into techspeak.
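To make the idea a bit more concrete, here is a minimal sketch – my own illustration, not anything taken from the Software Factories books – of how such a loan-origination vocabulary might be exposed as a fluent, builder-style mini-DSL in plain Java; every name in it (LoanApplication, originate(), originationFee(), and so on) is hypothetical:

    import java.math.BigDecimal;

    public class LoanDsl {

        // The "vocabulary" a finance-minded user would recognize: mortgage, principal, fee, originate.
        public static class LoanApplication {
            private String product;
            private BigDecimal principal = BigDecimal.ZERO;
            private BigDecimal originationFee = BigDecimal.ZERO;
            private double annualRate;   // APR expressed as a fraction, e.g. 0.065
            private int termYears;

            public static LoanApplication mortgage() {
                LoanApplication app = new LoanApplication();
                app.product = "mortgage";
                return app;
            }

            public LoanApplication principal(String amount)   { this.principal = new BigDecimal(amount); return this; }
            public LoanApplication apr(double rate)           { this.annualRate = rate; return this; }
            public LoanApplication termInYears(int years)     { this.termYears = years; return this; }
            public LoanApplication originationFee(String fee) { this.originationFee = new BigDecimal(fee); return this; }

            // originate() is where the underlying machinery (validation, persistence,
            // calls to back-end services) would be hidden from the domain expert.
            public String originate() {
                return String.format("Originated %s: %s at %.2f%% APR over %d years (origination fee %s)",
                        product, principal, annualRate * 100, termYears, originationFee);
            }
        }

        public static void main(String[] args) {
            // Reads almost like a sentence a loan officer might write:
            String confirmation = LoanApplication.mortgage()
                    .principal("250000.00")
                    .apr(0.065)
                    .termInYears(30)
                    .originationFee("1500.00")
                    .originate();
            System.out.println(confirmation);
        }
    }

A full-blown SDF would push this much further – an external DSL with its own editor and code generators, in the spirit of Microsoft's DSL Tools – but even this humble fluent interface shows the division of labor: the programmer builds the vocabulary once, and the domain expert strings it together.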

Now, how is this different from the promises of CASE? Here’s my answer: SOA + DSL. The SDF transcends CASE by operating at an ever-increasing level of abstraction, where models do not have to be translated into software modules but rather into software services wired together to produce the desired functionality, where patterns are built into the language the domain experts use (think BPEL+), and where software engineering best practices are applied at a subconscious level…

I can see these trends at work as IT folks are increasingly urged to “speak business”, as mash-ups and Google Gears allow a reasonably functional web application to be created with minimal understanding of the technology behind all these gadgets, in the proliferation of ECM – Enterprise Content Management – systems such as Drupal, Alfresco and SharePoint, and with DSLs taking a more prominent role (check out Microsoft's Domain-Specific Language Tools for .Net).