Thursday, March 16, 2006
I recently discussed variability as being software development's nemesis.  One item I mentioned was that from a single vendor, Microsoft, there are seven user interface technologies to choose from.  My point is that there should be only one. Really, really.
 
The software game is a whacky biz to be sure.  It runs the gamut from iPods being dispensed by vending machines to the FBI scrapping the development of $170 million worth of software.  The recipe for success in the software world is just as variable as programmer productivity: trying to stay on top of the ever-changing world of technology, let alone developing expert-level skill sets in any one area.  As BarryV points out in his comment to my last post, I need a longer ferry ride :-)
 
This variability becomes (much) greater when executives are making decisions about software projects/products without any real idea of how software is designed and constructed, or how it works as a finished product. It is a complete mystery to them, yet they are in charge.  Ok, this is a blanket statement, but more often than not I have found it to hold true in our industry.  I am not laying blame, just making an observation.
 
So what's the issue?  Education is a major factor.  Education about software development, which in most cases is truly an exercise in trial and error, given the newness of our industry and the variability in everything that is software.  Btw, our trial-and-error software development process is a key reality for anyone in our industry to fully understand.  And I don't mean trial and error in the traditional sense of just guessing at what to do.  It is more like guessing at the best way to accomplish any given set of tasks, because there is so much overlapping technology to choose from, all of it constantly changing.  For any given technology, especially programming languages, there are hundreds of ways to solve the same problem; some are better than others, but all are valid, with no single right or wrong way.
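To make that concrete, here is a minimal C# sketch (the program and the names in it are mine, purely for illustration) showing three valid ways to build the same comma-separated string.  All three are correct; they just differ in readability and performance:

    using System;
    using System.Text;

    class WaysToSolveIt
    {
        static void Main()
        {
            string[] names = { "Alice", "Bob", "Carol" };

            // Way 1: naive concatenation -- valid, but allocates a new
            // string on every pass through the loop.
            string result1 = "";
            for (int i = 0; i < names.Length; i++)
            {
                if (i > 0)
                    result1 += ", ";
                result1 += names[i];
            }

            // Way 2: StringBuilder -- more code, but avoids the repeated
            // allocations of Way 1.
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < names.Length; i++)
            {
                if (i > 0)
                    sb.Append(", ");
                sb.Append(names[i]);
            }
            string result2 = sb.ToString();

            // Way 3: let the framework do it -- the shortest, and arguably
            // the best, but all three produce the same output.
            string result3 = String.Join(", ", names);

            Console.WriteLine(result1);  // Alice, Bob, Carol
            Console.WriteLine(result2);  // Alice, Bob, Carol
            Console.WriteLine(result3);  // Alice, Bob, Carol
        }
    }

Multiply that little choice by every task in a project, and by those seven UI technologies, and the scale of the selection problem starts to come into focus.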
 
The so-called Software Architect is supposed to be the person who can figure this out.  Being employed as one, I say it ain't so easy.  For example, Microsoft has seven user interface technologies to choose from.  How does one become expert in each of these, so that when the task comes to develop a user interface, the right choice can be made based on the requirements?
 
From an executive's point of view, why should they know this or even care?  They should care because it directly impacts the total cost of ownership (TCO) of the software being developed, which the executive is ultimately responsible for.  If the software gets born and is useful to the target user community (an area where our industry's track record is less than stellar), it is usually around for a long time. Bug fixes, enhancements and general maintenance usually make up the bulk of TCO. Therefore, strategic planning in technology selection is just as important as developing the software itself.
 
Millions chose Visual Basic 6 to develop their business applications in.  However, there is no easy upgrade path to .NET.  I know of one large organization that has over 100 VB6 apps developed and running their business.  They are now pondering how to move to .NET, not only from a technology perspective but also from a training perspective.  They are caught between a rock and a hard place, as the TCO here is stratospheric no matter how you cut it.   Btw, the idea for this org is to consolidate much of the VB6 application functionality into .NET shared components, which will reduce the number of apps (and therefore maintenance costs), hence the transition.
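As a rough sketch of what that consolidation could look like (the namespace, class and business rule below are hypothetical, not the organization's actual code), logic that was copied and pasted into many separate VB6 apps gets pulled into a single .NET class library that every consolidated app references:

    using System;

    // Hypothetical shared component: a business rule that previously
    // existed in slightly different forms across dozens of VB6 apps
    // now lives in one .NET class library.
    namespace Shared.Business
    {
        public class OrderValidator
        {
            // One canonical implementation; a bug fix here happens
            // once, instead of once per application.
            public bool IsValidOrderTotal(decimal total)
            {
                return total > 0m && total <= 1000000m;
            }
        }
    }

    // A consolidated application simply references the shared
    // assembly and calls into it.
    class OrderEntryApp
    {
        static void Main()
        {
            Shared.Business.OrderValidator validator =
                new Shared.Business.OrderValidator();

            Console.WriteLine(validator.IsValidOrderTotal(250.00m));  // True
            Console.WriteLine(validator.IsValidOrderTotal(-5.00m));   // False
        }
    }

The payoff is exactly the maintenance-cost reduction mentioned above: fewer apps carrying their own copy of the same logic means fewer places for bugs to hide, and fewer places to fix them.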
 
So what does this have to do with "why don't they get it?"  The "why don't they get it?" question is usually asked of me by fellow programmers who have seen an executive business decision made that makes no business sense at all.  In fact, I have asked that question myself many times while working for various companies that don't see what I (or other technology-savvy people) see.  What we push for on any project is targeting the software development at the latest viable technology.  Immediately executives think the programmers just want to work on the cool new technology and tools.  Yes, that is true, but you know why?  Usually the latest and greatest tools allow me to do my job faster, better and cheaper; in some cases, only the newest technology makes something possible at all.  This also translates into lower TCO as the software moves through its lifecycle.  And most importantly, whatever software gets born has some chance of living a full life, instead of being re-architected in 3 years when we might have gotten 5+ years out of a different technology set.  Even longer if the chosen technology has a roadmap that shows it has a future as well.  This is what most executives don't get when they are in charge.