In
part 1, I discussed the exponential pervasiveness of software in our world, yet the process of creating that software remains a highly skilled, manual, labor-intensive effort by software professionals (i.e. programmers) that is still mostly trial and error. Because of this trial and error, roughly 50% of software projects/products undertaken will fail. How can we have all of this incredible software while the process of creating it is still in the dark ages?
Throughout this blog are several examples of and references to failed software projects, along with the symptoms and reasons why they fail. The bottom line is that software development has yet to be industrialized into a predictable and repeatable process.
How do we industrialize software development? A lot has been written on the subject over the years, yet from where I sit as a 15-year software professional, none of the methods or processes suffice. Why? I think it mostly has to do with not looking at software development strictly from a user's point of view. For example, consider a typical business user of software who comes to work every day and launches their CRM application or some other business application. What is that business user mostly concerned with? Their data represented on the computer screen, usually in a form, graph, spreadsheet, or report of some sort. The business user wants to input, manipulate, search, and see their data in a manner they can understand and work with. The business user assumes that the data is accurate and valid when they are working with it, and that when they are done working with it, it is stored somehow and can later be retrieved exactly the same way it was stored. The business user also wants the software to work the same way every time they use it.
What the business user does not care about is the technology used or how the data gets to their computer screen. They also don't care how many hours, weeks, or months (or even years) it took to make this work. They just want their data now and in the format they want. As
John Crupi says, "Biz folks don't care if you use two cans and a string to help them get to market faster and cheaper."
So what does this mean for industrializing the software development process? It means spending time on the piece that means the most to the business user, which is the user interface. That's it. That's all the business user sees and interacts with. We, as software developers, spend inordinate amounts of time learning and using new programming languages (like C# 2.0), new architectures (like Service Oriented Architectures), new development methodologies (like Agile), and new operating systems (like Windows Vista), and yet little of this has anything to do with what the business user interacts with on the computer screen.
As mentioned above, the business user spends 100% of their time interacting with the user interface on the computer screen. It would therefore make logical sense that, as software developers, we spend more time focused on this area than on anything else. Sadly, in most of the projects I have worked on, user interface design and construction is but a small part of the overall development effort; I would say it represents maybe 10% to 25% of the total effort at most. I sometimes wonder why this is. Further, we usually use words (as in UML use cases) to describe the user interface and the interaction with the user. The funny thing is that the user does not interact with these words at all and couldn't care less. "Where are my screens?" they ask.
I think one of the ways to industrialize the software development process is to focus more effort on the way a business user interacts with the software. As software developers, we could learn a lot from another industry that has figured this out: the film industry. Before any large amount of money is spent on actually making a film, detailed storyboards are developed that lay out the complete film before a single frame is shot. So why don't we use storyboards in our software development process? This will be the subject of my next post.