Thursday, 06 April 2006
There is an excellent article written in 1994 called Software's Chronic Crisis, whose conclusion is: "Despite 50 years of progress, the software industry remains years, perhaps decades, short of the mature engineering discipline needed to meet the demands of an information-age society." That same year, while I was taking Software Engineering classes taught by Karl from Motorola University, I was told the same thing. Motorola shared their own software metrics to validate what the industry was saying.
Again that same year (what was it about 1994?), the CHAOS report came out, stating that the US spends $250 billion per year on software development across roughly 175,000 projects, ranging from $140K to $2.3 million per project. Out of those projects:
- 16% are on time and on budget, but deliver less than planned (avg 42%)
- 53% overrun their estimates (avg 189%)
- 31% are canceled, losing $140B per year
Flash forward 10 years and the Standish Group has this update on our software development track record:
- 29% succeeded
- 53% challenged
- 18% failed
So what does this mean? It means that we in the software development industry (still) have major problems delivering software on time and on budget. Shhh, don't tell our customers. If you read the reports, there are a variety of reasons why we have this record, and not all of it falls on us. However, after 15 years in the software development biz, including running my own software company, I can safely say that these metrics are quite real. In my experience, the only way to deliver on time and budget is to give up features. That's the state of the art today. Even the largest and most successful software company in the world has the same problem making software development a predictable and repeatable process; see "Vista 2007, Fire the Leadership Now!"
Software's Chronic Crisis was right on the money back in 1994 and, in my opinion, still is today. The root cause is that our software development industry has yet to be industrialized, in comparison to industrialized industries like electronics, mechanics, etc. Those industries have matured from one-off custom development to mass customization (i.e. industrialization). Mass customization has the following characteristics:
- Configure, adapt and assemble components to produce variants
- Standardize, integrate and automate production processes
- Develop and configure extensible tools for rote or menial tasks
Our software world does not yet have these characteristics of industrialization. Our software products are designed and constructed one source code line at a time.
So what does this have to do with the notion that IT is a commodity? Some people think IT is a commodity. While I think the hardware side of IT, including some basic firmware applications, may be a commodity, the software side (bought s/w products or custom dev) is still far, far away from being one. Look at our track record. Look at your own products/projects. How many bugs (that you know about) are in them? According to a NIST study on the impact of software bugs conducted in 2002, "Software bugs, or errors, are so prevalent and so detrimental that they cost the U.S. economy an estimated $59.5 billion annually, or about 0.6 percent of the gross domestic product. More than half of the costs are borne by software users, and the remainder by software developers/vendors."
If you could visibly see the thousands of bugs in most commercial software products, would you still buy them? We do it all the time. I like the fact that companies sell "packaged" enterprise application software, then suggest you pay for services to install and configure the software, claiming it will cost less than a custom build. Maybe, maybe not. I personally know of several configurations of ERP systems in the Oil and Gas industry in Western Canada that have cost 10 to 100+ times the cost of the product itself, and the product costs $1 million. If I were a customer of such a service, I would also want my own personal jet (and full-time pilot) for that kind of money. C'mon!
How is IT (i.e. software development) a commodity? Software development is still an incredibly skilled and massively labor-intensive process. Our development tools lag our platform technology. What do I mean by that? While we have platforms that provide the technology to do (most) everything we want, our development tools are so low-level that we are (still!) handcrafting every solution as a custom one-off, one source code line at a time.
How do we industrialize our software development industry? Next post, we will look at how other industries have done it, and, ironically, how they used computer technology as an enabler to industrialize their own business domains.
Friday, 31 March 2006
Some years ago I had the opportunity to participate in a software technology bake-off. Technology selections, or bake-offs, go something like this: "Dear Vendor, you have been selected to show your wares/skills at our esteemed company. No, you don't get paid for this privilege, but you have one week to complete the tasks in this envelope, which you will perform on-site. And if you win, we are not sure what we are going to do anyway. So, dance vendor, dance!"
OK, while I am being a bit tongue-in-cheek, it was exactly like this. I view bake-offs as the easy way out for CXOs that don't know what they are doing or don't want to do their homework themselves. In my experience, it has been the former. Unfortunately for the vendor, it is a very risky proposition to participate if the org does not understand what it is doing or getting involved in. At the time of this particular bake-off, it was all about Enterprise Application Integration (EAI); even though Service Oriented Architecture (SOA) principles applied, that buzzword had not yet been invented. In other words, this org was a very early adopter of SOA and did not know it. This meant that Geoffrey Moore's Technology Adoption Curve was in full play here.
While I am going to name the technologies that were used in the bake-off, I do want to make clear that, years later, I came to the conclusion that it did not really matter which technology was used. The reality is that any of the middleware products would have worked for this org. As an analogy, they were looking for a Cadillac solution, at the time thought BizTalk was an econobox, and all they really needed was a bicycle. It has been my experience that this is the case for most orgs from a technology selection perspective.
Back to the story, but first a little background. The org's CIO and his crew were hardcore Unix fans and therefore were horrified to find that a Microsoft product (BizTalk) had penetrated their domain. Our pro services shop was called in to implement a complicated workflow across a few applications related to one of their core business processes for a particular department, which got us in under the radar. When we presented our prototype (live from our servers) in front of 30-some people, everyone seemed to be quite pleased. At that moment, the CIO stood up and said, "Don't get too excited, as we have to go through a formal selection of which middleware technology our esteemed org will use." Just so you know, this was the beginning of the end for the CIO, who was let go sometime later.
Yours truly got grilled by the CIO and the Unix crew on more than one occasion as to why BizTalk. I was so naive. I dutifully and enthusiastically explained the features and benefits, blah, blah, blah. It reminds me so much of the commercial with a cardboard cutout of a salesman that said, "So, how much software would you like to buy?" As a technical person playing a sales role, I can't help feeling that way. However, I happened to be President of this pro services company and just bit the bullet, all for the company, right? Behind the scenes, the CIO flew in a top salesperson from Vitria (their Unix fav) who sold a bunch of licenses to the org, even before the bake-off began. The other two players in the EAI selection were Tibco and WebMethods.
The funniest part of the bake-off, depending on who you ask, came when we opened the envelopes and quickly realized that the third-party consulting company hired to develop the EAI tasks did not know what EAI meant. The majority of the tasks were what was traditionally known as a data pump: pumping data from one database to another, sometimes indirectly through staging tables and/or file shares. The third-party company that wrote the spec did not know that the A in EAI stands for application, not database, integration.
Depending on who you speak to, maybe there was nothing funny at all. Take my business partner, for example. As it turned out, I happened to be on vacation in the northern regions of Canada at the time (hey, it's practically the North Pole) and got daily phone calls about how choked my partner was. It was complicated further by the fact that we were not getting as much local Microsoft support as we thought we would, and since our business model was built on independent contractors, we could not engage our regular crew because this was not a paying gig. Consequently, my partner had to pull double shifts for that week. The only other time I saw him this mad was when one of our business partners attempted a hostile takeover behind our backs within the first three months of incorporation, but that is another story.
At the end of the selection process, with lots of internal lobbying going on, it was a tie! I have never heard of such a thing in my 15 years in the biz. Totally hilarious. And the way it was rationalized was that all Unix-based projects would use Vitria and all MS-based projects would use BizTalk (the company was already a 50/50 split between Unix and MS applications). Part of the inside joke here is that the whole point of a middleware product is to be programming language and platform agnostic, able to integrate any disparate applications, which was lost on the CIO who called for the bake-off in the first place. Further, the independent third-party consulting company that put together the bake-off tasks also happened to perform Vitria pro services and actually offered up their services. Isn't that a conflict of interest?
As it turns out, Vitria stayed on the shelf and over a dozen BizTalk projects were designed and constructed for the org with our pro services shop getting most of the work. A personal thank you to Bobby Keen!
I learned later that the org spent close to $2 million on the bake-off over a one-year time frame. I could not believe it, but after someone on the inside explained what had happened and how many people were involved, I could see how. The org could have saved that money, or put it directly toward actually solving business problems. So why don't they get it?
There was no joy of software on this one. Ultimately, none of us got it, because the org's entire IT shop got dismantled when a large outsourcing org swooped down and took the business away from all of us. However, as mentioned earlier, Geoffrey Moore's Technology Adoption Curve was in full effect here, and I believe we all fell while Crossing the Chasm, the subject of my next post.
Tuesday, 28 March 2006
I have been employed in the software development biz for 15 years. I used to really like what I did. Now I am not so sure I like what I see in our industry. I see vendors plying their wares with cheesy marketing campaigns like Microsoft's Dinosaur ads or Oracle's "hot pluggable" Fusion middleware (what marketecture!). I am not sure why others aren't disappointed by this, or maybe I am so naive that this is just the way it works in our industry.
When I first started, I was the eager programmer who stayed up all hours to ply my craft. It was fun and I have no regrets. I was fortunate enough to move from C, which I really did not get, to Smalltalk, which I did get. Since then, I have had the opportunity to use many programming languages and frameworks. 15 years later, I see them as all the same: still low-level tools, not much beyond assembly language in some respects. We still laboriously toil over cryptic lines of a foreign language to get it understood by the machine (i.e. compiled), and in the end some human being is going to interact with the code you wrote, which hopefully fulfills whatever function it was supposed to.
Amidst this is the whole world of quality, metrics, process improvement, Agile, CMMI, ISO, Scrum, Extreme Programming, etc. To me, it's all the same, most of it just common sense. It still boils down to functional decomposition, from Knuth's truths written many years ago: make the problem small enough for the humans to understand, and have reasonable expectations when planning a solution for it. I used to work for a division of Kodak (CMM Level 2) and a division of Motorola, where we were a CMM Level 5 shop. My disillusionment with the whole process improvement world was born at these companies, which were not really interested in quality from a product perspective, but more from a marketing perspective, so they could get a sticker that says they have a quality process. Bureaucracy is what it really was.
Over 15 years ago, I came from the electronics engineering world, where most of what we did was fairly predictable and repeatable. I thought when I joined the software world that it was more or less the same. After 5 years in the industry, I came to the conclusion that the software world is not even close to being predictable or repeatable. In 1994, I read the Standish Group's CHAOS Report, which confirmed to me that our software development track record is absolutely abysmal. The latest report says we have improved, but our record still shows that any given project is a crapshoot as to whether it succeeds or not. As a closet perfectionist, this does not sit well with me.
After I got out of the quality game, I dived deeper into programming and buried myself in projects for a number of years. Some were successful and others failed miserably. None of them failed for technical reasons or staff skills, even though individual productivity can vary 20 to 1 according to industry gurus. My experience meter suggests programmer variability is much higher than that. Programmer productivity aside, I was still seeing the truth in the CHAOS report: every project was a crapshoot. How to increase the odds of success?
I know, I will start my own company and I can control everything, muh, ha, ha, ha. So I did. I formed my own company, called 5by5Software, with another like-minded programming buddy, and for the first couple of years we were successful in turning a profit in our little pro services world. As I have written elsewhere, I am extremely fortunate to have worked with several brilliant programmers. It amazes me that some people can listen to a conversation, go away, and within a short period of time produce the solution, which with a little refactoring is solid as a rock. Yet for other people, specs written in infinite detail are only looked at with glassy eyes.
In our company, we did so many projects of a particular type that we saw a repeatable pattern and a business opportunity to become a products company, which we did, called Bridgewerx. After 4 years as President, I resigned because I was disappointed by the direction the company was taking from a business perspective. Of course, when you require investment, other people get involved and the original vision gets diluted along with your shares. I am proud of what we invented because it does raise the level of abstraction of a particular problem domain. That part feels good. However, now what?
I could go to work for Microsoft, yet when I read Mini-Microsoft's blog, I am amazed at the problems this huge company is having. What about Google? JavaScript, you say? Hmmm, where is the challenge in that? Maybe I will start up another company. Anyone who has started their own software venture knows the effort required to be successful; it ain't for the faint of heart.
Yeah, start another company, that's it. My objective is to build tools that help raise the level of abstraction in our industry, which, in my opinion, we are sorely missing. Where are the tools for a Business Analyst to model or draw business processes? And I ain't talking Visio. Domain Specific Languages (DSLs) look like a good candidate to start building tools from. You can read an excellent DSL article by industry guru Martin Fowler.
DSLs, to me, are what is exciting and perhaps the next generation of programming tools for our industry. I am not talking about a new programming language or a framework; I am talking about a fundamentally different way to design and construct software. Rather than build solutions, build tools that build the solutions in an automated way. This is what I mean by software industrialization.
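To make that concrete, here is a minimal sketch of the idea in C#: a tiny internal DSL that describes a business process declaratively, plus a generator that turns the description into code skeletons instead of hand-writing them. Everything here (ProcessModel, the step descriptions) is my own hypothetical illustration of the approach, not any real product's API.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical illustration: a tiny internal DSL for describing a
// business process, plus a "tool that builds the solution" by emitting
// code skeletons from the description.
class ProcessModel
{
    private readonly string name;
    private readonly List<string> steps = new List<string>();

    private ProcessModel(string name)
    {
        this.name = name;
    }

    public static ProcessModel Define(string name)
    {
        return new ProcessModel(name);
    }

    public ProcessModel Step(string description)
    {
        steps.Add(description);
        return this; // fluent chaining, so the model reads like a spec
    }

    public void GenerateSkeleton()
    {
        Console.WriteLine("// Generated workflow: " + name);
        for (int i = 0; i < steps.Count; i++)
        {
            Console.WriteLine("void Step" + (i + 1) + "() { /* " + steps[i] + " */ }");
        }
    }
}

class Program
{
    static void Main()
    {
        ProcessModel.Define("InvoiceApproval")
            .Step("Validate invoice against purchase order")
            .Step("Route to manager when amount exceeds limit")
            .Step("Post approved invoice to the ERP system")
            .GenerateSkeleton();
    }
}
```

The point is not the toy output; it is that the business process lives in a model a Business Analyst could plausibly own, and the code falls out of it.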
It is also the way of other industrialized industries that have matured from one-off, hand-crafted solutions to more of a product line engineering approach. "I don't think the craftsman will be replaced, but the tools he uses will be provided by companies who provide (or are) software factories." This quote comes from Steve Cook's blog, and I happen to agree with it. Just because the drafting table was replaced by a CAD machine doesn't mean one loses his or her craft, right?
Sunday, 26 March 2006
Many moons ago I worked with two brilliant programmers named Steve Langley and Barry Varga. Steve was our boss at the time, with 25 years of programming experience, and Barry became my partner in a software adventure called 5by5, now Bridgewerx. One interesting note is that Barry and I became Steve's boss when he worked for our company a few years later.
We worked for a SCADA company that built RTUs for monitoring electrical substations and sending telemetry to a SCADA application. An RTU is a firmware device with analog-to-digital interface cards that measures substation line voltage, current, load, transformer oil coolant temperature, etc., and feeds that information to a SCADA system. A typical SCADA system monitored many substations that collectively form part of the grid. A substation may need dozens of RTUs, and literally hundreds of different firmware applications run to monitor an electrical grid. We had a catalog of over 300 firmware apps a customer could choose from. The company also built custom firmware apps for our customers.
Each firmware app had hundreds of configurable properties, with no default values, data masks, min and max values, etc. Multiply 20 RTUs per substation by 20 apps running on each, with at least 100 must-be-configured properties per app, and a bunch of poor guys had to hand enter 40,000 property configuration values to make the substation automation work. A software program was developed to allow configuration of these properties and then upload the config file to the RTU's firmware. It would take six months to commission a substation with this approach. The bitch and stitch from the customers and the company was that this was way too long and costly. Our job was to develop a 2nd gen configurator application that would automate most, if not all, of these properties and thereby reduce the commissioning time from 6 months to 6 weeks. That was our mission, Mr. Phelps.
This first-generation configurator was a classic example of 2-tier architecture and over-engineering. Built in Delphi, the UI was a table editor, meaning that grid controls exposed the database schema directly to end users. Tsk, tsk. However, at the time it was considered state of the art. End users either could think like tables (a whole generation of table thinkers and tinkerers was born during the 90s) or could not think that way and wanted a higher-level abstraction. The over-engineering was that the engineering department decided to make every possible firmware parameter, for every firmware app, configurable, never thinking about size or complexity or the future of substation automation - the very business they were in.
Enter our team, whose task it was to build a replacement for the old configurator and make the configuration process simple and repeatable. We were object and component guys then. When we did the UML on the problem set, it turned out to be incredibly hierarchical in nature: an electrical grid contains 1 to many substations, a substation contains 1 to many RTUs, each RTU contains 1 to many firmware applications, each firmware application contains 1 to many configurable properties, and each configurable property has 1 to many attributes. This is a gross oversimplification, but nonetheless it was just a set of hierarchies.
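A minimal C# sketch of that hierarchy follows; the class and field names are my own hypothetical reconstruction, not the actual product's object model.

```csharp
using System.Collections.Generic;

// Hypothetical reconstruction of the hierarchy the UML revealed:
// grid -> substations -> RTUs -> firmware apps -> configurable properties.
class ConfigurableProperty
{
    public string Name;
    public string DataType;
    public string DefaultValue;
    public string MinValue;
    public string MaxValue;
    public bool Required;
}

class FirmwareApp
{
    public string Name;
    public List<ConfigurableProperty> Properties = new List<ConfigurableProperty>();
}

class Rtu
{
    public string Model;
    public List<FirmwareApp> Apps = new List<FirmwareApp>();
}

class Substation
{
    public string Name;
    public List<Rtu> Rtus = new List<Rtu>();
}

class ElectricalGrid
{
    public List<Substation> Substations = new List<Substation>();
}
```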
Our approach was to build a drawing tool that allowed the configurator user to drag and drop different RTU types from a toolbox onto a canvas and connect the RTUs to a network. If you double-clicked on an RTU, a physical picture of the firmware collection was displayed with the name of each application. Double-clicking on an app brought up a configuration wizard with your typical express and advanced modes. Express mode gave you some simple options based on a known working configuration that you could default to, so that the system could be up and running ASAP. Advanced mode exposed all of the properties, plus the default values that had already been set.
It was VB6 days for the UI, and our only real option for the business objects layer was J++, being an MS shop with a real object hierarchy that modeled the real world. Steve gets code automation. He developed an Excel spreadsheet that contained all of the attributes of each property, for every configurable item, for every firmware app. So for each property, we could give it a name, description, default value, data type, min and max values, whether it was required or not, plus a bunch of other attributes, but I think you get the idea. We used Excel to capture the human-readable semantics and syntax of a configuration language (a DSL) for RTU firmware applications. Steve then wrote a VB6 program that read the Excel spreadsheet and code-generated J++ abstract and impl classes, and the next thing you know, we had all of our business objects done. Every time the spreadsheet changed (which initially was often, then settled out), we would regen the abstract classes, but not the impl classes unless we chose to, since the impl classes were where our specific implementation code lived. Now, of course, we have partial classes in the .NET 2.0 Framework, which makes it even easier for DSL applications.
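Here is a minimal sketch of the same trick using today's tools, assuming a CSV export stands in for the Excel sheet (the original read Excel directly from VB6 and emitted J++ abstract/impl class pairs; the file names and format below are hypothetical). The generated half goes into one file of a .NET 2.0 partial class, and hand-written logic lives in the other half, so a regen never clobbers it:

```csharp
using System;
using System.IO;

// Hypothetical sketch: regenerate the "generated" half of a partial
// class from a spreadsheet of property definitions (CSV stand-in).
class ConfigClassGenerator
{
    static void Main()
    {
        using (StreamWriter output = new StreamWriter("RtuConfig.Generated.cs"))
        {
            output.WriteLine("// <auto-generated> Regenerate from the spreadsheet; do not edit.");
            output.WriteLine("partial class RtuConfig");
            output.WriteLine("{");

            // One property definition per line: name,dataType,defaultValue
            foreach (string line in File.ReadAllLines("properties.csv"))
            {
                string[] fields = line.Split(',');
                string name = fields[0];
                string dataType = fields[1];
                string defaultValue = fields[2];
                output.WriteLine("    public " + dataType + " " + name + " = " + defaultValue + ";");
            }

            output.WriteLine("}");
        }

        // The hand-written half (RtuConfig.cs) holds the implementation
        // code, matching the old abstract/impl class split.
        Console.WriteLine("Wrote RtuConfig.Generated.cs");
    }
}
```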
I wrote the configuration wizards, and Barry did a masterful job on the drawing tool, for which we used a third-party drawing component. So far, so good.
However, the Delphi lead programmer did not want to switch to MS. A battle ensued. I wish I could say we won, but we did not. Our product never made it into production, even though there was nothing wrong with the software, and it met the business requirements of both the customers and the company by reducing the time and cost to commission a substation with their products. So what happened? Just soap opera stuff, because we did not have a strong executive team to tell the Delphi programmer that a business decision had already been made and we were phasing (y)our configurator out because it was too costly to us and our clients. Instead, the Delphi programmer quit and then held the company hostage, getting hired back on contract (at a much higher rate) to keep working on the Delphi configurator. Oh man, sideways.
The Joy of Software was how the three of us worked together building software that resulted in a cool product that also met the business requirements. Our test customer, at a real power substation in the US, was thrilled. The "why they don't get it" goes to (weak) executive management for simply not following through on a business decision that they had already made.
So why don't they get it?
Thursday, 23 March 2006
Wow, you can really feel the frustration in the comments at Mini's blog on the delay of Vista. There were over 300 comments the last I looked. And the comments are shocking: asking for the resignation of leaders, why the delay is good and bad, comparing software complexity between Adobe Photoshop and Vista, that the product was 5 years in the making and 5 million lines of code later and she is going to blow, what's better in the OS than XP2, devs vs PMs vs Testers, etc. There is a wildfire at the largest and most successful software company in the world today.
Quite frankly, I am a bit shocked, not at the wildfire, but by the comments. To me, it is no wonder that Vista has been delayed. Why? It's obvious: software development is a non-predictable and non-repeatable process. That's it; that's the reality of our business. In my mind, that is the root cause of why software is not delivered on time and on budget. All the rest of it is noise.
Despite all of the process methodologies we have today, we still can't determine when a software product is going to ship. We in the industry continue to underestimate the size and complexity of virtually any size of software product. The business side of the industry continues to promise customers software products that are all singing and all dancing.
I have a lot of respect for people like Watts S. Humphrey for writing books that apply the most disciplined approach to software engineering I have ever come across (and, unfortunately, I read too many textbooks, mostly about programming; yawn). What's my point? Even with this much rigor, we still can't accurately predict the size of the product, the release date, the total cost, etc. What does that have to do with Mini's blog on the delay of Vista? Everything! Despite any leadership issues, variability in skill sets, stack ranking, sizing and estimating techniques, etc., Vista was delayed for the simple reason that the act of software development is inherently not predictable or repeatable.
There is empirical evidence ad nauseam supporting the fact that software development is a non-repeatable process. We have evidence of thousands of projects not only being late, but failing miserably. I would never have believed it myself when I first came to this industry 15 years ago, because my background was in electronics engineering, where, generally speaking, we did have a predictable and repeatable process. In fact, the design and construction of most physical objects is pretty much a predictable and repeatable process today. Unfortunately, software does not abide by the laws of physics.
Mini, in an earlier post, asked what the value of Vista is that he could explain to his Mom. That's easy: navigation of the file system has been greatly improved. That is a core and key feature of any OS, how well the user can navigate the file system. As explained in an earlier post, I thought MS had done an excellent job here. The usability is indeed better; I have experienced it myself. Because of the shortcut approach in the address bar of Vista Explorer, I can navigate around the file system with fewer mouse clicks. What more can my Mom ask for?
Mini asked another question along the same lines about Office 12 or 2007 or whatever it is going to be called. Have a look at the state of the art in features in Word 2003: it has 32 toolbars. How could this happen? This is a word processor. What should a word processor do? Allow me to write any type of document in the easiest possible way. Fortunately, Word 12 is much better in that regard with its context-switching ribbons.
Microsoft should be focusing on reducing the complexity of its products, not only from an end user standpoint, but also from a developer's point of view. Seven user interface technologies from one company? What's wrong with this picture? Rather than making things simpler, this is going the other way in complexity.
What I like about Office 12, plus servers like SharePoint version 3, is that there are developer tools that can customize a SharePoint site quickly and easily. Sure, it is FrontPage rebranded and upgraded to become SharePoint Designer, but the point is that a Systems Analyst can now modify SharePoint sites easily and even describe fairly complicated workflows, all without writing a single line of code. What does that mean? It means we can deliver customized solutions to our customers quicker, without needing a PhD in programming, meaning the tool appeals to a wider audience. That's the ticket, MS: build more power user tools that allow more people to manipulate your products using less code. The same approach could be applied to the tools you are using to build Vista.
This is called raising the level of abstraction. From where I sit as a 15-year software developer, this is what will help industrialize our software industry so that software development becomes more predictable and repeatable, and then the headline could be that Vista ships on time as predicted. Till then, what a mess our industry is in, all indicated by Mini's blog detailing the inner workings of the largest and most successful software company in the world.
Friday, 17 March 2006
On occasion I am fortunate enough to participate in various vendors' early adopter programs, getting bits of products before they are publicly available. The WIFM (what's in it for me) is that by the time the product goes mainstream, I will have had a number of pilot projects under my belt and be ahead of the technology adoption curve, which is what keeps me employed.
Of course, the vendors want something in return, usually a customer reference, as the idea is to have a partner work with a customer to use the early software bits in a pilot project. The good news is that you and the customer get to see a real early and generally working view of the product. The bad news of being a bloody beta tester is that more often than not you jump through (sometimes massive) hoops, because the product simply ain't ready yet. That's the trade-off. However, vendors could do more for their beta testers, as we will see.
The amount of time it takes to install these products and get them configured right, particularly when various dependent products are also in various beta stages, makes it a necessity to use Virtual PC (VPC). Virtual PC allows you to run a software-emulated computer in your host OS. So I am running XP as my host OS, and the VPC image is Windows 2003 Server configured as an App Server, along with SQL Server 2005, VS2005, WinFX 3.0 Beta 2, and Community Technology Previews (CTPs) of various other bits and pieces. The list goes on and on: something like a dozen items over two days to get it all installed, and it has to be done in a certain order. Dear Mr. Vendor, would it be so hard to have all of the software on one DVD and a single button that says "install this scenario," just like you have in SQL Server and VS2005? How hard can it be? Btw, the VPC lifespan is about one month, as new betas and CTPs will appear, making your image obsolete, since it may (read: will) not be upgradeable, as we shall see.
VPC works great, but you spend a lot of time loading ISO images and restarting the (virtual) computer. Of course, the VPC software emulation (even with 2 gigs of RAM) is much slower than your host OS, but the advantage is that you can take snapshots of your VPC image after each item is installed, and if things go wrong, you can copy your previously working image and be up again very quickly. One problem is that our image is now 16 gigs in size and takes a while to copy across the network. The other major advantage is that all of the developers are working on the same VPC image; if your code does not work on someone else's image, it's your code that's broken, not the config, which keeps our "single unit under test" condition true.
Now enter the beta product I have been testing. We had been working with Beta 1 for a little while, and it took a while to get the image running right. There is also little documentation, which is a bit unusual for this particular vendor, because they are generally very good on docs and code samples. Dear Vendor, please at least document how the tools work so we can try them out and not have to guess.
Then along comes a Beta 1 refresh. This is the vendor's response to the crush of feedback on Beta 1, coming to the realization that we can't wait for Beta 2, so let's give 'em a refresh, which really means: what's the latest known good build? OK, ship it! The documentation says that there is no upgrade path: no upgrade path to this Beta 1 version, nor will there be an upgrade path for future versions. Now I know this is beta software, but I think it is a bit inexcusable when the end user (the developers in this case, including me) needs to go through massive hoops of uninstalling the previous version, including other add-on bits like WinFX, etc., and then installing the newer version, only to be told that you still have not removed enough of the old version to continue, even though according to Add or Remove Programs it looks like everything has already been uninstalled. Argh!
So now what? Back to a previous VPC snapshot, which now needs Windows updates and a restart, plus a newer version of WinFX with another restart, plus I needed virtual DVD software to mount the ISO images inside the virtual PC (wha?), and man, on it goes, and I have not even gotten to the new refresh install, which, by the way, has a list of 20 components!
During the installation process, on a restart, I got a message that said, "Since Windows was first activated on this computer, the hardware on the computer has changed significantly (author's note: no, it has not). Due to these changes, Windows must be reactivated in 3 days. Do you wish to reactivate now?" I said yes, only to find out online that the number of licenses had been exceeded. Then I was given a number to call, where an automated attendant asked me to read a sequence of 40 characters (took forever), and then I was told that this was not right and that I needed to talk to a customer service representative (ooooohh, a human being, how rare). The customer support rep, who was very nice, gave me another 40 characters to enter (after I read my characters back to him) and activated my version of Windows. Btw, this was done before I uninstalled all of the previous Beta 1 software and tried installing the new product's refresh, only to be told I had not uninstalled everything. So I just trashed the entire image. So much for the activation. Double aarrgghh!!
So now I am installing everything on another VPC image that I had saved off, one that definitely does not have any of the original product's Beta 1 bits on it, or the rest of the software I need, for that matter. I hope I don't get that reactivation message again. Dear Vendor, if there is no upgrade process, please provide an uninstaller tool that completely removes the Beta or CTP product, with absolutely nothing lingering around. It would have saved me days of effort. Triple aaarrrggghhh!!!
So what about the refresh install? It is almost growing season here in BC; maybe I will get a job picking grapes, because it will be much easier and far less bloody than this!
Thursday, 16 March 2006
I recently discussed variability as software development's nemesis. One item I mentioned was that from a single vendor, Microsoft, there are seven user interface technologies to choose from. My point is that there should be only one. Really, really.
The software game is a whacky biz, to be sure, all the way from iPods being dispensed by vending machines to the FBI scrapping the development of $170 million worth of software. The recipe for success in the software world is just as variable as programmer productivity: trying to stay on top of the ever-changing world of technology, let alone developing expert-level skill sets in any one area. As BarryV points out in his comment to my last post, I need a longer ferry ride.
This variability becomes (much) greater when executives are making decisions about software projects/products without any real idea of how software is designed and constructed or how it works as a finished product. It is a complete mystery to them, yet they are in charge. OK, this is a blanket statement, but I have found that, more often than not, it is a truism in our industry. I am not laying blame, just making an observation.
So what's the issue? Education is a major factor: education about software development, which in most cases is truly an exercise in trial and error, given the newness of our industry and the variability in everything that is software. Btw, our trial-and-error software development process is a key reality point for anyone in our industry to fully understand. And I don't mean trial and error in the traditional sense of just guessing at what to do. It is more like guessing which way is the best way to accomplish any given set of tasks, because there is so much overlapping technology to choose from, which also happens to be constantly changing. For any given technology, especially programming languages, there are hundreds of ways to solve the same problem, some better than others, but all valid, with no right or wrong way.
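A toy illustration of that point in C# (my own example, not from any report): three equally valid ways to sum a list. None is wrong, all compile and give the same answer, and every programmer will reach for a different one.

```csharp
using System;
using System.Collections.Generic;

// Toy example: the same trivial requirement, three valid implementations.
class Variability
{
    static int SumWithFor(List<int> values)
    {
        int total = 0;
        for (int i = 0; i < values.Count; i++)
        {
            total += values[i];
        }
        return total;
    }

    static int SumWithForeach(List<int> values)
    {
        int total = 0;
        foreach (int v in values)
        {
            total += v;
        }
        return total;
    }

    static int SumRecursively(List<int> values)
    {
        if (values.Count == 0)
        {
            return 0;
        }
        return values[0] + SumRecursively(values.GetRange(1, values.Count - 1));
    }

    static void Main()
    {
        List<int> data = new List<int>(new int[] { 1, 2, 3, 4 });
        Console.WriteLine(SumWithFor(data));      // 10
        Console.WriteLine(SumWithForeach(data));  // 10
        Console.WriteLine(SumRecursively(data));  // 10
    }
}
```

Multiply this by every design decision on a real project, and across whole technology stacks, and you get the variability I am talking about.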
The so-called Software Architect is supposed to be the person who can figure this out. Being employed as one, I say it ain't so easy. For example, Microsoft has seven user interface technologies to choose from. How does one become an expert in each of them, so that when the task comes to develop a user interface, you make the right choice based on the requirements?
From an executive's point of view, why should they know this or even care? They should care because it directly impacts the total cost of ownership (TCO) of the software being developed, for which the executive is ultimately responsible. If the software gets born and is useful to the target user community (an area where our industry's track record is less than stellar), it is usually around for a long time. Bug fixes, enhancements and general maintenance usually make up the bulk of TCO. Therefore, strategic planning in technology selection is just as important as developing the software itself.
Millions chose Visual Basic 6 to develop their business applications in. However, there is no easy upgrade path to .NET. I know of one large organization that has over 100 VB6 apps developed and running its business. They are now pondering how to move to .NET, not only from a technology perspective, but also from a training perspective. They are caught between a rock and a hard place, as the TCO here is stratospheric no matter how you cut it. Btw, the idea for this org is to consolidate much of the VB6 applications' functionality into .NET shared components, which will reduce the number of apps (and therefore maintenance costs), hence the transition.
So what does this have to do with "why don't they get it"? The "why don't they get it" question is usually asked of me by fellow programmers who have seen an executive business decision made that makes no business sense at all. In fact, I have asked that question myself many times while working for various companies that don't see what I (or other technology-savvy people) see. What we see, as part of any project, is targeting the software development at the latest possible technology. Immediately executives think the programmers just want to work on the cool new technology and tools. Yes, that is true, but you know why? Usually the latest and greatest tools allow me to do my job faster, better and cheaper; in some cases, only the next technology makes something possible. This also translates into lower TCO as the software moves through its lifecycle. And most importantly, the software that gets born has some chance of living a full life: instead of being re-architected in 3 years, we may get 5+ years out of a different technology set, and even longer if the chosen technology has a roadmap that shows it has a future. This is what most executives who are in charge of software development, but have not come from a programming background, don't get.
On rare occasions I have seen a planned technology roadmap that goes along with whatever project or product roadmap has been developed. Typically, once the software (if it ever makes it) goes into production, any technology upgrades are extremely low on anyone's list. Then the bug fixes and enhancements ensue. Over time, with no technology upgrades, the software bits that got glued on eventually start dropping off or crushing the architecture, and the cost of bug fixes and/or enhancements goes exponential. And then the project or product gets "re-architected" as a matter of course, more often than not on some new technology with no technology roadmap. The cycle continues. How do we break that cycle?
One way is to demand better and fewer tools/products from vendors. As a software developer, I don't need a new or different hammer every year and an entirely new toolbox of tools or frameworks every couple of years. Know what I mean? This person gets it. Also, if vendors consolidated and simplified their product lines instead of making even more product variations, it would help promote the industrialization of software. Even though I use Microsoft technologies, I am unimpressed by the many overlapping technologies and the different editions or versions of the same. Can you decipher how many editions of Visual Studio 2005 there are from this blog? There are 5 editions of XP and now 5 versions of Vista. This is just the client OS! Ten versions? Ridiculous. Why not just one? Sure, I get the capitalism thing, but capitalism can also be had with efficiency. Our software development world is far, far, far from being efficient. Sounds like a vendor opportunity to me.
In case it is not clear, the "they" in "why don't they get it" are the executive/management people in software development decision-making positions that have no background in software development, meaning neither the process of it nor programming in it. A super-charged topic, even for the largest software company in the world. Next post I will discuss some of those "why they don't get it" decisions from the field. To use Dave Barry's phrase, this really happened!
Thursday, 09 March 2006
I have been using Vista Build 5308 since it came out a week or so ago, along with running VS2005 with Cider, Expression Interactive Designer (a.k.a. Sparkle), the Windows SDK and WinFX.
It has been very interesting. First, Vista is considerably more interactive compared to XP or W2K3 Server. What I mean is that I am more efficient using the OS, as it looks like considerable interaction design occurred in the design phase. For example, in Vista's Windows Explorer for file folder navigation, you will notice a different tree view control and address bar. The address bar allows you to click on any part of the path in the address, which lets you navigate more easily (i.e. with fewer mouse clicks). Given that I spend (too much) time navigating the file system, this really is a major improvement. There are many other improvements, which are not the point of this post, other than to say I think Vista will surprise many people with its usability design, aside from the flash and gas of the graphics.
Second, I have been using Cider (think WinForms designer, but using XAML) and the Windows SDK to evaluate WPF. Of course, on Vista the graphics are outstanding. My Acer 8204 laptop comes with a pretty high-end video card (ATI Radeon X1600) supporting DirectX 9, which allows Vista to fully utilize the GPU. Vista graphics look awesome, aside from the Aero Glass (i.e. opacity) effects and the window fly-ins. The animation and 3D manipulation of objects is ultra smooth and sharp at any level of resizing.
VS2005 Cider is aimed at the developer (or coder, or programmer, or software engineer, or craftsman, or whatever moniker you hang by). You spend the same time on the design surface as you traditionally do on the WinForms 2.0 designer surface, but mostly you spend your time in both XAML and C#. Contrast this with Sparkle, which is really aimed at designers. Btw, the March CTP of Sparkle was just released. The difference is that you spend more time on the design surface than writing code (and XAML). In fact, another one of Steve's blogs points this out: "Sparkle, development teams, and what no code means."
Now this raises an interesting conundrum, which you can read about in Clemens' excellent post, Visual Studio Overload And The Specialization Gamble. Clemens does an excellent job explaining the conundrum of trying to learn too many languages and tools: "The reason is very simple: Capacity. There's a limit to how much information an individual can process." I totally agree, and I described this phenomenon in my previous post as variability instead of capacity. I do believe they are very much interrelated, or synonymous with each other; in fact, maybe we are saying the same thing but differently.
The conundrum is which tool I use (Sparkle or Cider) given the limited amount of time I have to play with this, which amounts to an hour or so per day on my ferry commute to and from Vancouver. As someone who grew up on Visual Studio, I am immediately inclined to stick with it. However, having played in both environments, I find I can do way more, quicker, in Sparkle, once I get past how different it is from Visual Studio. Sure, Sparkle has an IDE, and while not as complicated as VS2005, it does have a funny tool palette with what looks like a dozen objects from Adobe Illustrator or Photoshop, which means I am starting from scratch with these tools, not only in usage, but also because they are more graphic-art-type tools, for which I have no talent or formal training.
Not to complicate matters, but there are other XAML tools available, most notably Mobiform's Aurora designer, of which I got a demo about a month ago. It certainly is a candidate, but where do I find the time to fully qualify it? Finally, for building designers, there is Microsoft's excellent Domain Specific Language (DSL) Toolkit, which code-generates designers, but at the moment it can only be hosted in Visual Studio and does not do XAML. Heck, you can even hand-craft your custom designer in the .NET 2.0 Framework (thanks BarryV!).
I have a vision for my Storyboard Designer application. All the 3D objects and animation interaction play nicely in my head, but how many years is it going to take to program it in any one of these tools? And which one would that be?
Maybe I will ponder that over a few (more) pints of Guinness...
Saturday, 04 March 2006
This blog's topic is software industrialization, which means making software development a predictable and repeatable process. The only other requirement for software industrialization is that the software created meets the end user's requirements.
Why is software development not a predictable and repeatable process? I can partially explain this through a small story. After spending 10 years in the software development business, a friend of mine and I opened up our own consulting company in 2001. Our company was based on a single Microsoft product that we offered consulting services for: BizTalk Server, a message-oriented middleware product for exchanging disparate data and orchestrating business processes across multiple applications.
Over a four-year time frame, we custom designed and constructed twenty-five or so integration solutions using every version of BizTalk Server. Even after that many projects, our process for designing and constructing these solutions was still far from predictable and repeatable. Sure, we got (much) better, but we realized it was the variability that was so difficult to overcome: variability in everything that is software and in the processes used to design and construct it.
For example, there is always large variability in the quantity and quality of software requirements. A very small percentage of customers know exactly what they want; more know exactly what they want but can't articulate it; and at the other extreme, customers have no idea what they want, but still want something built.
For every single "discrete" chunkable requirement, there seem to be at least a dozen ways to design it. For every design, there seem to be almost infinite ways to implement it. When I say design, I mean a particular design for a particular requirement, where the chunkable output of the design is on the order of 40 hours of effort for one developer to complete the requirements, finish a detailed design, code and test the chunk, and be done. The culmination of designs that meet all of the requirements is called the software architecture.
Case studies and industry reports point to inadequate and/or always-changing requirements as one major contributing factor to why software development is not a predictable and repeatable process. Another contributing factor is the size and complexity of software development, which is almost always underestimated. I would be the first to agree with both statements, but I would say that these are more symptomatic than the root cause.
Yet another contributing factor to why software development is not a predictable and repeatable process is programmer productivity. I have worked with over a hundred software developers in my 15 years in the industry, and I can say that programmer variability is just as broad as the other contributing factors discussed above. There are several books that quantitatively put programmer productivity variability in the range of 20 to 1, and even 100 to 1, between programmers assigned the same project to design, construct and test. I have seen the extreme with my own eyes, where some developers can't write the code no matter how much time they are given, while others can write it in two weeks flat. That's off the chart in terms of variability.
One of the reasons for the wide variability in programmers, aside from the skill sets discussed in the previous paragraph, is the tools that are available for programmers to use. The tools themselves are incredibly complex environments, and sometimes they require people to think in ways that they may not be able to grasp, or they are so complicated that no one can figure them out. I can't grasp C, but I grok Smalltalk, from a programming language point of view. When we asked a printer to print the help file that came with BizTalk Server 2004, he called us to say it was likely going to be 10,000 pages and cost $500. That's just one product! And we used a half dozen other products for designing and constructing our integration solutions, including SQL Server, ASP.NET, the Visual Studio IDE, Windows 2003 Server, SharePoint Services, C#, FrontPage, the .NET Framework, and on it goes. While some of these products are not the same size and complexity as BizTalk Server, they require a deep understanding of just what the heck they do and how all the products fit together in order to provide the tools and framework to design and produce the customer's solution in any reasonable time frame (read: cost). Even the .NET Framework Class Library alone has over 5,000 classes to get to know, some very intimately.
And these are tools and technologies from one vendor! What about multiple vendors? Also, every vendor seems to be pumping out the latest and greatest tools and technologies every year. Where does one find the time? Answer: one does not find the time, which means that people's knowledge of these tools and technologies, plus the specialized skills and experience required to use them effectively, varies wildly. This is another major contributing factor to why software development is not a predictable and repeatable process: the programmer never gets a chance to gain years of experience using one tool, or even a small set of tools, so everything is (always) new.
Even within the Microsoft technologies mentioned above, there are many technologies that do more or less the same job (but with totally different tools) in one specific area: user interfaces. There are (at least) five Microsoft technologies for developing user interfaces. To me, it is mind-boggling that within a single vendor, not only are there five different technologies for developing user interfaces (actually 7 if you count InfoPath and SharePoint Designer), but there are multiple tools for each technology. For example, for ASP.NET there are Visual Studio and FrontPage. Both have very deep features, but the tools are completely different.
Some would say introducing standards would alleviate this problem. While I concur, and standards have proved to help industrialize other industries (e.g. electronics), it is still early game in the software world, and technology advances far outpace the speed at which standards can be ratified. Also, believe it or not, Microsoft's latest technologies are all (mostly) standards compliant, all with public specifications. So what's the value of standards? What our industry needs is innovation. What would be truly innovative from Microsoft (and other vendors) is simply one technology and tool that produces any type of user interface you want. From a developer's perspective, this means being able to focus on "one" tool or technology for a specific task, like designing and constructing any type of user interface. With one tool and language (for user interfaces), we might have a hope of industrializing software development.
Let me put it another way: how many people do you know who are fluent in six or more foreign languages? How many of those people are fluent in both the spoken and the written word? Have you ever tried learning a foreign language to the point where you are just as fluent in it as in your native language? Learning and becoming fluent in any foreign language is no easy task. But as software programmers, we must learn multiple foreign languages to design and construct software. It may even be tougher than learning a traditional foreign language, as our programming languages regularly change, including the introduction of brand new ones (e.g. XAML). This gives some insight into one of the major reasons why software development is not a predictable and repeatable process, even for the software programmers.
Sunday, 26 February 2006
I live in a small town of about 5,000 people on the Sunshine Coast in BC, Canada. Meeting my neighbor's Grandpa was an interesting experience. Grandpa grew up in the northern part of the Sunshine Coast and has lived here most of his life. I got to know him a bit when we were launching fireworks (it's big here!) last Halloween with our families.
Over the past 4 months I have had occasion to bump into Grandpa on the ferry, as I take a 40-minute ferry ride into Vancouver every day, as does anyone who needs to make it to Vancouver from the Sunshine Coast.
Yesterday, I was sitting at the local Tim Hortons, having a coffee and working on my new Acer 8204 laptop (which I purchased specifically for WinFX and Vista development), when I met Grandpa, after I had been sitting for an hour, long enough to disappear completely into the computer. He said, "Are you actually doing something, or trying to look intelligent?" I said I was coding. He asked, "What do you do for a living?" "I write software." He said that he used to program assembler way back when. He said, "If just one letter or number is off, it doesn't work!" I said nothing has changed.
First, I was a bit taken aback that he could even remember assembler (I can't), and he does not even remotely look the type. Second, I was surprised by my own comment that nothing has changed since assembler. Now really, we have come a long way since assembler; see Raising the Level of Abstraction. But on the other hand, it is still true today: if one character or number is off in the source code you are hand writing, it won't compile. It is much easier to find the compile error today, I suppose. And with Intellisense, how can you miss? (ha ha). The point is, the computer is excellent at repeating precision instructions and we humans are not. So why not get the computer to do the work of writing precision code based on a higher-level abstraction (i.e. a tool), where we don't concern ourselves with hand writing low-level code?
This makes me wonder how far programming languages have come over the last twenty-five years. Personally, I am a Smalltalk fan. The concepts of everything being an object and message passing made much more sense to me than C. The fact that objects in Smalltalk could do only three things made it easy for my brain to understand what was going on. However, from an employment perspective, I chose and have been working with Microsoft technologies since '91, when VB1 was first introduced.
The only language that has come my way that makes me as excited as Smalltalk did is XAML. I have discussed XAML a bit in my previous posts. Here is a snippet from the WinFX SDK help file for Build 5308:
"WinFX application development raises abstraction to the level of a declarative programming model. To facilitate programming for this model a new XML-based declarative programming language, XAML, has been developed. XAML is based on Extensible Markup Language (XML) and enables developers to specify a hierarchy of common language runtime (CLR) objects with a set of properties and logic"
"XAML is the preferred way to create a UI in the WinFX programming model because it provides a way to separate UI definition from logic, and enables you to integrate code by using code-behind files that are joined to the markup via partial class definitions. XAML enables you to create a UI entirely without using code. You can create quite elaborate documents or pages entirely in markup using controls, text, images, shapes and so forth."
I am impressed with XAML. It is early game, but GDI has done well for 20 years, and Microsoft is betting on XAML, WPF, WinFX and Vista as a worthy successor to GDI, hoping it will last as long. For the computer user, the next generation of XAML-based applications will be like when HTML (and then Flash) first hit the web and everyone, all of a sudden, was a web site designer. What shakes out, however, is the next level of computing interactions based on these Microsoft technologies.
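To ground the partial-class pairing the SDK quote describes, here is a minimal sketch: markup in a .xaml file declares x:Class="MyApp.Window1", the XAML compiler generates one half of the class from it, and the hand-written code-behind below supplies the other half. The names (MyApp, Window1, OnHelloClick) are my own illustrations, not anything from the SDK.

```csharp
using System.Windows;

namespace MyApp
{
    // The hand-written half of the partial class; the XAML compiler
    // generates the other half from Window1.xaml (x:Class="MyApp.Window1").
    public partial class Window1 : Window
    {
        public Window1()
        {
            // Loads and wires up the object tree declared in the markup.
            InitializeComponent();
        }

        // Referenced from the markup, e.g. <Button Click="OnHelloClick">.
        private void OnHelloClick(object sender, RoutedEventArgs e)
        {
            MessageBox.Show("Hello from the code-behind!");
        }
    }
}
```

The appeal is the separation: the UI definition lives entirely in markup, while the logic lives here, and the partial class joins the two without either file touching the other.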
However, to make life a bit confusing, XAML is but one of the five ways, just using Microsoft technologies, to develop user interfaces. Tim Sneath has an interesting angle on explaining these five UI technologies: "Windows Forms, ASP.NET/Atlas, DirectX, WPF, Win32 - that's five UI technologies to choose between. How do I decide?"
Having read his article, I am satisfied that using XAML and WPF to develop my Storyboard Designer application is the right choice. Once I have developed something, I can say to Grandpa that we don't have to worry about each individual character or number anymore; that's what the software tool does!