# Saturday, 12 June 2010
Dear CNN, Web Site Crashes Are Avoidable

Check out this article on CNN: “Why website crashes are unavoidable -- at least for now.”

No one wants to say why that really is.  Money, of course.  Quality costs money.  Same as the car industry, or any other “item”: you get what you pay for.  And good software is expensive.  Why?  Because building it still has not become a “well-known” practice.

So what’s a well-known practice?  Building houses, cars, skyscrapers, bridges, airplanes, etc.  And as always, quality is what you pay for on top of the artwork.  Quality goes way beyond “materials” in our software world.  We rely heavily on the training, skills and experience of our engineering people and their “know-how” to produce “quality” software.

The point being, we are just getting there in terms of software engineering, and today practice varies widely – which is why the CNN article does not fully understand the real reason those web sites are crashing.  It’s all about the design.

A really good design will have been refactored many times as several implementation prototypes, going through iterative improvements, some incremental and some experimental, before a particular variation meets the customer requirements and design goals.  Many people confuse this with “Big Design Up Front.”

What we are doing is iterating the software design through several iterations of x number of software prototypes.  The prototypes are designed not only to meet the customer requirements; we, as professional software engineers, are also making design bets on what several iterations of this project over a multi-year release strategy might look like.  This is called Software Product Line Engineering.  This variation-driven process is a well-known practice used in the design and engineering of cars, cell phones, electronics, airplanes, and damn near any manufactured item, each of which went through several design/prototype iterations before it ever made its way to the customer.
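
To make the “design bets” idea concrete, here is a toy C# sketch of a product-line variation point (the payment example is entirely my own illustration, not from any real project): the product line commits to a contract now, and a later release in the multi-year plan swaps in a new implementation without reworking the rest of the product.

```csharp
using System;

// A variation point: the product line fixes this contract up front.
interface IPaymentProvider
{
    void Charge(decimal amount);
}

// Release 1 ships the simplest variant...
class CreditCardProvider : IPaymentProvider
{
    public void Charge(decimal amount)
    {
        Console.WriteLine("Charging {0:C} to a credit card", amount);
    }
}

// ...and a later release swaps in another variant behind the same contract.
class WalletProvider : IPaymentProvider
{
    public void Charge(decimal amount)
    {
        Console.WriteLine("Charging {0:C} via a digital wallet", amount);
    }
}

class Checkout
{
    static void Main()
    {
        IPaymentProvider provider = new CreditCardProvider(); // chosen per product variant
        provider.Charge(42.50m);
    }
}
```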

Let me contrast that with why businesses’ web sites are making news on CNN.  This is how it really goes, whether the business knows it or not: we develop our first “just get it working” prototype and show it to “management” to demonstrate progress.  They say great!  Let’s use it right now and BAM!  The prototype that I just slapped together, just to get something working, is now a production application, which needs to be supported.  How about re-architected?  A one-off, “just get me a working prototype” does not constitute a production-ready application, Mr. Business.

Well, that was many years ago and I don’t work that way anymore.  Among the pro software engineers out in the world, designing and building quality software is not a Scooby Doo mystery.  It’s engineering.  And just like any quality engineered product, it costs money.  And right now in our business, designing and building software with a high level of quality is expensive.  And most businesses either don’t have access to professional software engineers or do it on the cheap, so, of course: hello, CNN Headline News.

So bottom line, CNN: if there were some investigation into the real root cause of these web sites going down, it would come back to the quality of the software product (i.e. the web site) and the so-called five-nines infrastructure that runs the software, where engineers can operate, monitor quality-of-service levels and take instant corrective action.

So the next time a car careens into a data center and takes out both the power plant and the emergency generators, you’re doing it wrong!  As the saying goes, the design is the thing!  If that is a possible scenario, then design for it, prototype it and verify the design works.  That means there are likely to be several design iterations of several different implementation prototypes.  That costs much more than having your first attempt go into “production.”  But over the planned lifecycle of a designed product versus hacking, well...

It also takes skill.  What I am about to say will be blasphemy to some: I blame the software development methodology (fad?) “Agile” for our current setback in our bid to make software engineering a licensed, professional practice.  That latter statement being another can of worms, served up another day.

The problem with Agile is these two items from the “Manifesto”:

“Individuals and interactions over processes and tools”  As several other engineering disciplines have found out over the decades, the quality of a product is directly related to the quality of the processes (especially design) and tools (especially automation) used to build it.  Try designing an iPhone without computer-aided design and building it with a hammer and chisel.

“Responding to change over following a plan”  You may feel I am interpreting these wrongly, and you could be right.  But my experience working with Agile teams in general has been that there is so much responding to change that there is no plan!  And by the way, if you don’t know where you’re going, any road will do.

If I were the CEO of a company asked to OK an estimated multi-million dollar software project, I damn sure want to not only see the plans, but also read a business case that states exactly what this software is going to do for the business, why, and what the ROI is, before anything happens.

And by the way, I do expect to see detailed business requirements, process maps, use cases, user stories – I don’t really care which, as long as it is very clear to me, in gory detail, exactly what I am “getting” for those millions.  Otherwise, I ain’t spending a dime.  My expectation is that given the salary/options/bonuses, etc., CEOs are getting in today’s market, and how heavily most if not all businesses rely on software, they know this and should go hard-core on not just rubber-stamping someone’s software pipedream.

Of course, there is the obligatory “well, requirements change all the time.”  In reality, ever-changing business requirements = the business does not know what the hell it’s doing.  Ya dig.

So like most professional software engineers, we know what to do and how to do it.  But doing it right = more money.  IMO, that’s where we are in the evolution of software engineering – this from someone who spent 10 years in the electronics design engineering world and the last 19 in the software engineering world.

As an industry, we must resist the urge to jump on any bandwagon that flashes by and stay focused on what other engineering disciplines have come to realize; even the basics, like measure twice and cut once, we are still learning in our software world.  It has not yet become a well-known practice.

It is getting better.  Microsoft has a financial guarantee of 99.9% uptime associated with its paid online services.  That’s pretty damn good if you ask me.
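
For perspective, here is the back-of-the-envelope arithmetic behind those uptime numbers (a quick sketch of my own, not Microsoft’s math):

```csharp
using System;

class UptimeBudget
{
    static void Main()
    {
        const double hoursPerYear = 365.25 * 24.0; // ~8,766 hours

        // Allowed downtime per year at a given availability level.
        Console.WriteLine("99.9%   uptime -> {0:F1} hours of downtime/year",
            hoursPerYear * (1 - 0.999));
        Console.WriteLine("99.999% uptime -> {0:F1} minutes of downtime/year",
            hoursPerYear * (1 - 0.99999) * 60.0);
    }
}
```

That 99.9% guarantee still allows almost nine hours of downtime a year; the five-nines infrastructure mentioned above allows barely five minutes.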

I patiently wait for the business world to realize that if you want a web site that needs to be up 99.9% of the time, you need to pay for it, man!  It requires proper design and engineering.  That takes talent, and good talent is hard to find... and expensive.  In the meantime, we can watch more businesses make the headlines until the people paying for those products deem that level of performance (customer satisfaction) unacceptable and go with a different brand.  It’s just a matter of time before quality becomes a deciding factor in the software world, and that’s when our industry will have become a well-known practice.  When?  I can’t say for sure, but I feel we are at the knee of the exponential curve – hang on, man!

# Friday, 01 January 2010
Witnessing Software Industrialization

In the mid-to-late ’80s, in the audio electronics industry, I recorded several rock bands in a couple of professional recording studios located in Western Canada.  The Northern Pikes were my favorite band for a variety of reasons.  From a software technology standpoint, little software was used in audio recording at the time.  In those days, it was the peak of analog sound equipment, including 24-track (2-inch-wide tape) recorders, large physical mixing consoles, expensive soundproofed and tuned recording studio facilities, real instruments/amplifiers/effects, real drums, vocal booths, etc.

 

The first digital 24-track recorders (from Sony) were just coming on the scene.  The only gear that was really controlled by software at the time was a few digital effects processors (for guitars/vocals) and a SMPTE time code controller that synched the 24-track tape recorder to the software-controlled faders (i.e. the mixing console’s volume and mute controls) on each track.  So when you mixed the sound on the console, you could program the volume, mutes and effects on each individual track during playback – yes, we used all 24 tracks and heard the song repeated several hundred times.  You would build this “program” up over time, sometimes mixing one track at a time and muting everything else, or tweaking while listening to the overall mix.  Point being, that (along with early MIDI recording on Macs) was the limit of software-controlled audio.

 

During recording sessions, the process was called “live off the floor” recording.  Meaning the band was spread out in the studio, sometimes in various rooms, but all hooked together through audio headsets so the band could all play together, live off the floor.  A trick, used by many sound producers before me, was to tell the band that you wanted them to warm up by going through the song a couple of times before we put it “on tape.”  Unbeknownst to the band, I was rolling tape the first time they played the song in the studio (i.e. a first take), and some would argue, myself included, that the first take has a good chance of being the best take.  Meaning you caught the band at their best.  This was called a bed track, with a ghost (guide) vocal track, and then the other tracks would be overdubbed on this bed track over time (i.e. more guitars, solos, vocals, harmonies, embellishments, etc.), until all 24 tracks were used.  And in some cases, never to be recreated again – meaning a moment in time was captured on tape, and no matter how the same players tried to recreate it, they could not come up with the same groove/sound they had the first time they played the song.

 

The thing of it is, this was millions of dollars of gear to record a live band.  Add the operations and maintenance of a professional recording studio, and it shows why the rates can be $200+ per hour to rent the place.

 

Fast forward to 2009.  I purchased Rock Band 2 (on PS3) for my 6-year-old daughter.  The PS3 was $299 CDN, Rock Band 2 was $159, and another controller $59 (for Little Big Planet).  I already have several PCs (being in the software business), an Asus 25” HDMI monitor and an X-Fi sound card with Logitech 5.1 surround sound speakers.  I was blown away by the (software-controlled) sound processing on the drums, bass, guitar and vocals in Rock Band 2.  Sounds just like in the studio!  Well, not quite, but most non-audiophiles could not tell the difference.

 

The point being, the latest (software-driven) digital technology used for sound engineering has increased in capability to a degree not even dreamed of 25 years ago.  Look at Band-in-a-Box – the audiophile edition is 669 bucks (they send you pre-installed software on a 1.5-terabyte hard drive).  Look what it does: a fully equipped, 10-million-dollar professional recording studio, completely virtualized in under $1,000 of software.  Unbelievable!!!

 

To me, this is witnessing an area of software industrialization that I have been a part of for over 25 years.  My conclusion is that over time the technology has exponentially increased in capability and at the same time exponentially decreased in cost (including operating and maintenance costs).  What used to be $10 million 25 years ago is now less than $5K: a professional recording studio virtualized on the computer in your home.

 

The best part, of course, was being able to play Rock Band 2 with my daughter and watching her play the various instruments and sing.

 

Happy New Year!

 

# Saturday, 15 August 2009
Who's Your Daddy?

 

+1 WolframAlpha

# Tuesday, 14 April 2009

Ola Bini wrote an interesting article called “Languages Should Die.”  I was hoping the title meant what it says, but the article argues the opposite.  I believe it is this type of thinking (i.e. language proliferation) that has our software development industry in trouble:

“No existing language will die in a long time. But we need new ideas. New fresh ways of looking at things. And new languages that incorporate what other languages do right. That’s how we will finally get rid of the things we don’t like in our current languages.

You should be ready to abandon your favorite language for something better.”

Define “better.”  Define “do right.”  I feel we reached “good enough” long ago with C or Smalltalk.  Heck, Smalltalk is one of the simplest languages there is – the entire language syntax can fit on a postcard.  What more do we want?  Oh sure, there are some small technical issues, as with all languages, but technical deficiencies are not what did in Smalltalk; it was marketing.  But I digress.

Wikipedia says there are 560+ programming languages.  Sure, they are not all in use today, but how many programming languages do we really need?  How many written and/or spoken languages (e.g. English, Spanish, German) do you know well?  At expert level?  How many people do you know who are expert-level in 2 or more spoken and/or written languages?  How about 5 or more?  How long do you think it would take you to learn 5 spoken/written languages at an expert level?  Apply your CoCoMo model to that!

Yet, as a web developer, you typically need to know one or more core languages like C#, Java, Python, whatever, along with HTML, CSS and JavaScript, and then usually your app is hooked to a database, so you need to know some dialect of SQL.  OK, that’s five languages right there.  How many of us are expert in those five languages?  I mean honestly.  And of course there are variations of each language – for example, look at the difference between C# 3.0 and 4.0.  And for each language, there are one or more frameworks to learn as well – e.g. AJAX frameworks like jQuery, MooTools, Prototype, etc.

It is no wonder to me that we cannot get any real work done, as we are all busy learning existing or new languages, frameworks, etc.  It is no wonder to me that business is getting pissed off at developers in general, because everything takes too long, is too error-prone and, worse, does not meet their requirements.

OK, my point is not to be some old stodgy dinosaur – yeah, I just turned 50, but I have a 6-year-old daughter who can play Lego Star Wars better than I can.  I am not trying to stifle innovation, but let’s be smart about what we are trying to innovate.  Here is an excellent example from my neck of the woods: I have spent a bunch of time learning C# since 2000 and kept up with the versions.  Recently, I went to PDC and checked out all of the new C# 4.0 features.  One of them introduces a dynamic type into what is otherwise a very statically typed language.  I used (Iron)Python for my dynamic needs, and now some of that is in C#.  Is that good, bad, better, worse or what?  Damned if I know; it is yet another thing I have to figure out as a software engineer.  Oh, and I really liked F#, a brand new functional language.  So how many days, months or years do I need to invest in learning F# to become an expert?  And most importantly, why would I do that?  Other than the coolness factor, admittedly really cool to me personally, what possible business value or real-world application does it have that would cause me to use it?  Answer = none.
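
For those who have not seen it, this is the C# 4.0 feature I mean, in a minimal sketch of my own (the ExpandoObject toy is my example, not from the PDC material): member lookup on a dynamic variable is deferred to runtime via the DLR, inside an otherwise statically typed language.

```csharp
using System;
using System.Dynamic;

class DynamicDemo
{
    static void Main()
    {
        // 'dynamic' turns off compile-time member checking for this variable.
        dynamic bag = new ExpandoObject();
        bag.Name = "order-42";   // members invented on the fly
        bag.Total = 19.99;

        Console.WriteLine("{0}: {1}", bag.Name, bag.Total);

        // This would also compile fine -- and fail only at runtime:
        // bag.DoesNotExist();
    }
}
```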

See what I mean?  As Ola says,

“So - you should go out and fork your favorite language. Or your least favorite language. Fix the things that you don’t like with it. Modify it for a specific domain. Remove some of the cruft. Create a new small language. It doesn’t have to be big. It doesn’t have to do everything - one of the really cool things about functional languages is that they are often have a very small core. Use a similar approach. Hack around. See what happens.”

Give me one sound business reason why this would be a good thing to do.  You want a simpler language?  Well, we have that already; see the Smalltalk example above.  You want a better or a “right” programming language?  You had better have some real definitions of what those mean, have identified real shortcomings (not just some syntactic sugar) in the other mainstream languages, and your proposed improvement had better be like 10x – or why do it?

I feel that we already have (more than) enough programming languages to choose from, let alone the frameworks and batteries that come with them.  We software developers/programmers/engineers seem to be our own worst enemies, adding more and more complexity to a domain that is already complex enough.  What are we really doing to reduce the complexity instead of adding more?  Adding another language like F# to my skill set (which I would love to do personally) has absolutely no business value for me or my customers in my world of ecommerce web applications.

In my software engineering world, I am looking at every angle to reduce complexity.  It is simply a matter of numbers: the fewer, the better.  If I can reduce the number of programming languages, frameworks, integration points, executables, assemblies, etc., then the simpler the solution, the lower the cost, the less time to deliver, the easier it is to change and maintain, and therefore the better the business value to the purchaser of the custom-developed software.

However, these days it feels like I am in the minority in our software industry, as we proliferate everything, including software development methodologies, until it becomes insane.  I am concerned that this is happening to programming languages (and everything else in our software development industry) as well.  Just like the financial market situation we are in now – I wonder when that will happen in our industry.  For all of our brain power, we seem to be following the same path.  What is it going to take to divert it?  When will the pendulum swing the other way, to favor economic sense (i.e. proven software engineering principles) instead of the crazy proliferation of anything in the name of continuous improvement?

As Brad Cox says in “Planning the Software Industrial Revolution”: “I only wish that I were as confident that the changes will come quickly or that we, the current software development community, will be the ones who make it happen. Or will we stay busy at our terminals, filing away at software like gunsmiths at iron bars, and leave it to our consumers to find a solution that leaves us sitting there?”

# Wednesday, 08 April 2009

I may be getting old, but I don’t think I am out of touch.  My sanity as a software development professional seems to be tested daily by our industry’s predilection for Silver Bullets.  The latest, it seems, is Scrumban.  My wife can’t stop laughing when I say it to her.

 

I have not read the book, so I have no opinion on it.  I have read some of the author’s blog material.  Seriously, I mean no disrespect to Corey Ladas at all, but when I read the material, I can’t help but think it was written by someone with an MBA.  When I read this article, I think I get what he is saying, but I swear it is written in a different language than the one spoken in the software engineering world I live in.

 

Regardless of what marketing terms are used, the reality is that software development is always: understanding the requirements, creating a design, implementing the design, and testing the design and implementation to ensure it meets the requirements.  Requirements, Design, Code and Test.  Always has been, always will be, no matter what other fancy (marketing) words are used to describe it.

 

With respect to very large software projects, I understand the label Feature Crews, but to me this is nothing more than classic functional decomposition at work, with a new (fancy marketing) label.

 

The questions that the Agile, Scrum, Lean, give-it-a-name practitioners never seem to answer are the two most asked by the people who pay for software development projects: “How much is it going to cost, and how long is it going to take?”  There seems to be no best practice in any of these methodologies for answering these fundamental questions, as the methodologies focus on very limited scopes of the project rather than the entire project.  And quite frankly, that is my main beef with these Silver Bullets, because as Fred Brooks postulated 23 years ago, there is no such thing as a Silver Bullet in software development.  In my opinion and professional experience, he is right.

 

Software development is an incredibly complex, massively labor-intensive manual effort whose primary work product is lines of source code.  People who write code day in and day out know this to be the truth.  There is no hiding from it.  We grind it out as we know how, and love it.  So when I am asked how much a software development project is going to cost and how long it is going to take, I apply a tried-and-true approach to answering what is a very, very difficult question.  This is why you see software estimation models like the Constructive Cost Model, or CoCoMo for short.  You can read the gory details of the model here.  PDF alert!  No surprise to see that counting lines of source code (or equivalent lines of source code) is the way to answer the tough question of how long and how much.  Dr. Barry Boehm had it figured out years ago.  When I took Software Engineering as a postgraduate course at the University of Calgary, this is what we were taught, and it has been consistent with what I have found in the field.  So yes, I use CoCoMo to answer the ugly, gnarly questions.  And there are several automated tools that implement CoCoMo’s model, even ones online.
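
For readers who have never seen the model, this is its basic shape in a minimal C# sketch of my own.  The coefficients are Boehm’s published Basic COCOMO values for “organic” (small, in-house) projects; a real estimate would use the intermediate model with its cost drivers.

```csharp
using System;

class BasicCocomo
{
    // Basic COCOMO, organic mode: effort = a * KLOC^b (person-months),
    // schedule = c * effort^d (calendar months).
    const double A = 2.4, B = 1.05, C = 2.5, D = 0.38;

    static void Main()
    {
        double kloc = 32.0;                          // estimated thousands of lines of code
        double effort = A * Math.Pow(kloc, B);       // ~91 person-months
        double schedule = C * Math.Pow(effort, D);   // ~14 calendar months
        double staff = effort / schedule;            // ~6.6 people on average

        Console.WriteLine("Effort:   {0:F1} person-months", effort);
        Console.WriteLine("Schedule: {0:F1} months", schedule);
        Console.WriteLine("Staff:    {0:F1} people", staff);
    }
}
```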

 

So what has happened to our software development industry that we need to keep reinventing the wheel in the name of continuous improvement?  I think it is indicative of anything that is really tough to do.  Everyone is looking for an easy answer or the next big thing.  But as with most things in life, the answer is already figured out.  You just need to look in the right place, listen to wisdom and apply some common sense.  Congratulations, you just made pro software engineer!

 

What’s my point?  My point is twofold.  Our software development industry is being run by marketing people and has gone insane ;-)  OK, I half jest.  I know over time the pendulum will swing back to the basic fundamentals of what software development really is.  My other point is for the young aspiring software engineers.  Kids, look at some of the real software engineering techniques that hold up to the test of time.  These are the gems.  This is what is real in our industry.  CoCoMo is real, it works, and it is based on facts and historical data.  If you were to read only one book on software engineering, read The Mythical Man-Month, as it embodies what is really happening in our software development world, regardless of when the book was written.  There is a reason why Fred Brooks earned the Turing Award.  Your assignment is to understand why.

# Wednesday, 17 December 2008

I mean it, honestly!  I have been programming for so long, have built so many apps, and have looked at and used hundreds of (web and desktop) apps, that I no longer know what a good user interface design looks like.  I have lost my objectivity.

 

Think of it this way.  As a user, I have been using Outlook and Outlook Web Access (OWA) daily for I don’t know how many years, and both still “feel” like good user interface design to me.  Maybe I have been “programmed” by using them so much that I just feel that way.  Which is maybe why I dislike the ribbon, but that’s another story.

 

As a side comment, Jeff Atwood at Coding Horror wrote an article called “Avoiding The Uncanny Valley of User Interface” saying, or at least I think it says in summary, that web applications should not try to mimic desktop applications.  Clearly, I don’t get it.  I believe the whole point of OWA is to indeed mimic the desktop (i.e. Outlook) as closely as possible so that I, the user, don’t have to “think” about the differences between the two email applications.  In fact, I think I would be quite pissed, as a user, if the desktop email application I use every day and the web mail application were so different that I had to learn two different ways of working with email, calendars, tasks, folders, etc.  Don’t make me “think” about the differences; I just want to “use” the applications.

 

I also read most of the comments from Jeff’s blog, and a few made sense to me (see below), whereas most were offering up personal opinions which, while valid, miss the point: my job, as a programmer, is to give my clients what they want.  Oh sure, I have my own ideas, like many programmers, but after spending a lot of time in the trenches talking to customers over 17 years, it is clear to me that they are paying me money to get what they want – even if they don’t know what they want.  “Do you get me, sweetheart?”

 

I really like Shane’s comment:

 

Yeah, that "Google Maps" web app thing that acts like a desktop app, that'll never work. People only want web apps that act like web apps damn it! And all the cool AJAX features in Facebook/MySpace/Gmail, no one really wants those. They would prefer to wait for the entire page to refresh every time they make any kind of change. Or wait for a huge page to download just to make sure the page contains every bit of data that they could possibly want. Because they would prefer their "web apps" to act just like the crappy 1st generation web apps that were around before they even knew what the Internet was. Pfft.

Shane on December 17, 2008 05:55 AM

 

Or how about Daath’s comment – sounds like a Pragmatic Programmer to me:

 

For a software developer - yes.
For an average user - no. An average user simply doesn't care if it's real, doesn't search for differences, and for him/her it would be the best if the web application and the desktop application would behave the same, because it would be easier to get used to.

For a developer a software is close and personal, like another person for everyone. That's why it feels unnatural for him/her to use web apps that try to mimic desktop apps' behavior. But for an average user, web and desktop apps are nothing but utilities, things that have to be used in order to achieve a goal. They don't care, it's nothing personal for them. (Sorry for breaking your heart, you just have to realize this and move on with your life. :))

Daath on December 17, 2008 08:08 AM

 

And finally, Brian says it all:

 

What are you guys talking about! The purpose and intent of Ajax and RIA technologies is to enable web UI designers with the ability to do things that would be considered "closer" to desktop application operations than "traditional" web applications.

 

That’s because "traditional" stateless web application user interfaces sucked. What are web application expectation anyway? Type stuff... submit(postback)...wait...view result. I say rock on web ui designers! Give me drag-drop, give me background updating panels (event-driven updates). I am still waiting for some of those other crusty old desktop features like great undo/redo functionality and the ability to paste an image directly into an email body, but as they keep working the technology I am sure it's not far off.

 

For users, web apps accel because of collaboration and accessibility. Users dump outlook for web-based email so they can read their mail from home, work, school, wherever. Certainly not because Outlook's desktop user interface sucks. Users have had to trade off rich interaction for those benefits. Today, that trade off isn’t any where near as bad as it once was. Today many web apps simply rock. And that’s because of the energy and effort by many folks to bring rich (or desktop-like) interactions to the web. So let’s dispense with the noise that this is a bad thing.

Brian H on December 17, 2008 08:09 AM

 

Hallelujah, I say.  I just don’t understand why Jeff would write such a blog post.  Which brings me to my point: rather than hypothetically avoiding the uncanny valley, I want to hear about and see pictures of “good” user interface design and, most importantly, why it is good.  I mean convincingly good.

 

I work for an ecommerce company that designs and builds large-scale ecommerce web applications.  When it comes to good user interface design, our customers are always asking for two things:

 

  1. Can our customers easily find the product(s) they want, with the minimum number of mouse clicks?  This is all about navigation and search.

 

  2. Can our customers easily buy the product(s) they want, with the minimum number of mouse clicks?  This is all about conversion, i.e. turning “browsers into buyers” as quickly as possible.

The point I am making is that in my ecommerce web app world, there is a purpose to the user interface design, the key word being design.  The ecommerce web app is “designed” to fit the requirements of the intended end users.  We, as programmers, use a lot of interesting technologies, like AJAX, to fulfill these requirements, but the end user couldn’t care less what technologies or techniques are used to fulfill them.

 

Bad user interface design, to me, is the opposite of fulfilling these requirements in the context of what I do for a living.  And with my customer hat on while shopping online, if I can’t find what I am looking for easily, I give up.  As a programmer, I give up even sooner: with a web application loaded with links, 5 navigation bars and tons of “stickers”, I already know it is going to be really painful to find what I want, so I just give up sooner.  The consequence is that the ecommerce site just lost my revenue, which hits the bottom line.  Everything else after that is irrelevant.

 

Another reason why I think Outlook for the desktop and Outlook Web Access for the web are so successful is that if you use the desktop app and then have to use OWA, the learning curve is almost zero, so it is instantly usable and adopted by the “user.”  I.e. I can find what I want easily and I can transact my email easily – no think-time differences between the two apps.

 

I can give another example of good user interface design, coming from my 5-year-old daughter, who plays web-based games on the internet and local (i.e. “desktop”) games on my computer.  She already knows that it is going to be either the arrow keys or the WASD keys to move around in the game.  I only had to show her once, and it is consistent across the many games she plays.  Adult note – yes, I (or more correctly, my wife) limit the time and type of games she can play on the computer.  Heck, she even knows how to hit the green “play” button in Visual Studio to run some of the XNA games I am working on, but that is another story.

 

One criterion that makes for good user interface design, even if it was arbitrarily chosen, is consistency.  For example, the ubiquitous “File” menu: it is the same in the Outlook desktop application and OWA, plus many, many other desktop applications share the File, Edit, etc. paradigm.  It is a bit strange to me that not many web applications take this same approach.  Why not?  And I mean from a user perspective, not a programming one.

 

Of course, Microsoft broke this paradigm with the ribbon, but I think they were actually trying to solve a different problem than good user interface design; see Office 2007 and the Killer Ribbon.

 

So what makes a good user interface design?  I am genuinely asking.  Like I say, I am so close to it that I feel I have lost objectivity.  Not completely, mind you, but as a firm believer in form following function, I have a sense that our industry has moved off the mark somewhat, and after reading Jeff’s post, and some of the comments, there is a group of software folks who seem to have forgotten the basic “aesthetics versus function” argument.  E.g. if the iPod were not so easy to use from a user perspective, it would not matter how cool it looked.  Some may see it differently, but my ecommerce customers see it the same way.  If the ecommerce site is “flashy” but no one can find or buy anything easily, then it is nothing more than eye candy, as opposed to an ecommerce business producing revenue.

 

So what makes a good user interface design?  Show me one (links, pictures, descriptions, all good) and tell me why.  Thanks!

# Tuesday, 09 December 2008

This post answers several of the comments posted to my blog, Hacker News, Programming Reddit, and unClog – in no particular order.  I tried to answer most if not all questions, but if I missed yours, it was not on purpose.

 

At the end of this post are my own answers to the productivity-lost questions I posed.

 

PJ says, “A bad craftsman blames the tool.”  So PJ, if the tool is broken or is in need of serious repair, would a good craftsman ignore it?

 

Arun says, “…I believe you need to think outside the .Net world. …What's new is not always complex. The productivity in modern languages have actually improved dramatically. People release new features all the time. But then, you need to stop using the old tools.”  Arun, I make my living as a .NET programmer, and I would not call Visual Studio 2010, .NET 4.0 and C# 4.0 “old tools”; in fact, they are so new they have not been released yet, unless you attended PDC.

 

Alecks says, “At age 17 I sit in programming class trying to find my path through the maze of the .NET framework, thinking to myself 'okay, so I can do it . . . but how does it work?”  Alecks, hang in there, buddy, one step at a time.  Pick a few .NET classes (like the collection or array classes, for example) and play with them to see how they work.  Look at the MSDN examples or search online.  Step through the examples in the debugger and inspect the variables, program flow, etc.  Once you see how one or two of these classes work, you can expand to others.  One step at a time and you will be fine.  Keep at it!
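
Something like this little exercise is all I mean by “playing” with a class (my own toy example, Alecks): set a breakpoint, step through it, and watch what each call does to the collection.

```csharp
using System;
using System.Collections.Generic;

class Playground
{
    static void Main()
    {
        // Set a breakpoint here and step through in the debugger.
        List<int> scores = new List<int> { 87, 42, 99 };
        scores.Add(73);     // inspect 'scores' -- Count is now 4
        scores.Sort();      // what order are the elements in now?

        foreach (int s in scores)
            Console.WriteLine(s);

        // Then repeat the exercise with arrays, Dictionary<K,V>, Queue<T>, ...
        int[] copy = scores.ToArray();
        Console.WriteLine("Count={0}, First={1}", copy.Length, copy[0]);
    }
}
```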

 

Sridhar says, “…try Zoho Creator creator.zoho.com which tries to provide on the web the ease of use you get in a VB6 environment.”  Thanks Sridhar, I will give it a try, looks interesting.

 

Jeff says, “…I think it should be as easy to build apps for the web as it is for the desktop…”  Bingo!  Jeff summarized my two posts in basically one sentence.  Good job, Jeff!  Unfortunately, the reality is that we may be years away from making this happen.

 

Andrew says, “We know that code re-use is a wonderful thing, and its just a real surprise to hear someone argue otherwise.  … Lastly, I'm not sure quite why you've got some of the highly passionate responses above, for sure its an opinion that differs from most people I've met, but you're not too far of death threat territory in some cases :P”  Andrew, what makes you think that I am against code reuse?  Where did I say that?  Quite the contrary: almost 80% of my open source project is predicated on code reuse; I would never have been able to complete it otherwise.  With respect to the passionate responses, well, I don’t think I would call them passionate…

 

Roger says, “As a developer you have to keep getting better tools, and get better at using those tools. …When I job interview developers, one of my questions is … if the candidate cares about their own productivity and actively seeks out new tools and techniques.”  Say Roger, I think you missed the part where I am using the latest tools and editors, including a pre-release of Visual Studio 2010.  I think your brain may have frozen when you saw VB6 in my post – don’t worry, I think most of us have the same problem with VB6.

 

Daniel Lyons says, “It's not just you. I don't know what's going on with these other guys but I feel what you're saying just as strongly as you do and I'm only 27.  We're reaching levels of complexity where what we've been doing just doesn't work anymore. Especially with the web.”  Daniel, nice quote, “where what we’ve been doing just doesn’t work anymore.”  I say amen to that.  And as I have said before, it is getting worse, not better, in my opinion.

 

Roe says, “You raise some good points, but the conclusion is weird:  "to me, productivity is everything. Productivity. It is as simple as that."  To me, delivering something of value to users is everything.”  Hey Roe, I think you missed my point.  It is already a given that what is delivered will have value to the “users”; if not, you have bigger problems than productivity.  And if you can’t deliver to your customers in a timely manner, then no matter how much value the software has, it is of no value when the project is canceled because it is way over time and budget.

 

Mark says, “You should try ruby. Or python.”  Yup, I have, and in my article I state that I have tried so many languages over the last 17+ years that I can’t keep track of them.  I have been using IronPython and IronRuby, well, because I am a .NET guy.  And in fact, the open source project I developed embeds IronRuby and IronPython as the scripting engine to program remote computers in real time, but that’s another discussion.

 

Sixbit says, “…I think I've rediscovered your VB 6 experience of 1991. When I got into iPhone programming this year, and got the hang of using Interface Builder to rapidly prototype things, I really got a feeling that, this is how simple things should be in other frameworks, and have rediscovered my love of programming again.”  Wow, sixbit, that sounds cool.  I will have a look at that.  I wonder how that extends, if at all, to general-purpose web development – probably not…

 

Phillip Zedalis says, “I did not comment on your last post because I was still contemplating it - but yes I heard your point loud and clear. I also over time have felt a great discomfort with the growing 'complexity' of the libraries surrounding Visual Studio.”  Phillip, wait till you get a load of Visual Studio 2010 and .NET 4.0 – more and more and more.  I am not sure how it is any better.  I used to be an early adopter of the Visual Studio environment, right back to Visual InterDev in ’97 – anyone remember that environment?  Now, I just cringe at the extracurricular effort required to figure out all of the new things that have been added to the IDE and framework.  I can’t even face going to any more PDCs, which, as a passionate programmer, were my favorite conferences; it is just too much.  By the way, fantastic site, Phillip!

 

Ald says, “I am not an expert, but I think you could halve the list by refusing to use MS technology, and XML.  Are you sure you couldn't do everything in javascript, with may be a few simple scripts on the server serving the asynchronous calls?”  Ald, download my open source project and show me how I could halve the list by using different technologies; seriously, I would be interested.  You will also see why it cannot all be JavaScript with a few simple scripts.

 

Joseph Gutierrez says, “It makes you feel less a craftsman. More like a father at Christmas with the bicycle instructions, trying to put the damn thing together. I've started learning Lisp for this simple reason. Less imperative and more declarative programming. Stay with it.  An increase in velocity is what you're looking for. Take a look at altdotnet news groups on yahoogroups. VB6 with OCX and ActiveX damn good things, but it seems to have dropped out of the mainstream.”  Joseph, thanks for your comments; absolutely true on the “bicycle instructions” – the problem is that each part is from a different manufacturer, and some sizes are in metric and others in imperial.  I hear you on Lisp – I have been there, and may even try IronScheme, but I still cannot see how I could build a web app using this technology.  Btw, I love the snowflakes on your blog!

 

Buford Twain says, “The next big breakthrough in software development will come when someone makes things radically simpler for us poor programmers.”  Buford, I agree, I just hope I get to see that in my lifetime...

 

Itay Maman says, “When you need two or more frameworks in the same application you have a problem: each one is imposing its own structure, and the two structures are usually in a conflict with each other.”  Itay, you are so right: in my own application I have several frameworks and components, and the fight is to structurally fit them together even though each one on its own has the functionality I want.

 

Mark Jones says, “Very interesting. As a hobbyist assembly-language programmer whom has taken courses in the latest Java, C, and VB disciplines, I could not agree with you more Mitch.”  Bless your heart, Mark!  Assembly language was the last thing I thought I would see on my blog about web applications, but I totally get where you are coming from.  I sure am tired of “syntax code”, particularly now that Microsoft has fully embraced XAML; it just keeps getting more and more verbose, making me less and less productive.

 

Evrim Ulu says, "We've faced similar problems in past, and the only solution we've found is to rewrite everything from scratch." Wow Evrim, this is pretty cool.  I will indeed look at this in more detail.  I also happen to agree with your approach by the way.  It sure appears the only solution is to indeed rewrite it all from scratch.  I totally get that, thanks!

 

mycall says,

 

“1. Why does GSB exist?
2. What is wrong with the normal way of using hosting services (e.g. FTP pre-tested application from localhost development to QA to production)?
3. How does this compare to the new Amazon EC2 or Windows Azura?
4. Does GSB actually have the minimum number of components to (a) minimize maintenance and issues?
5. How does Silverlight 2.0 with IronPython running in the browser change GSB? And

 Forgot to mention, here is an idea for you..

 

Wow, great questions, mycall; let me try to answer them.  GSB exists because I wanted to program many remote computers in real time, using dynamic languages, from a web browser.  I may have 30 or 40 web apps that need maintenance around the world.  This tool allows me to geocode the locations in the map program so that I have some physical frame of reference, and it keeps all of the IP addresses, notes, etc. contained in one location.  I can then remote-desktop in and launch the “service component” of GSB, which contains the DLR with IronPython and IronRuby interpreters that I access through my web browser via WCF web services, in both a REPL console window and a code editor.  This lets me make simple changes to code, scripts, etc. much more easily than a) trying to hook up Visual Studio remotely to 30+ web sites (not productive) or b), to answer your question 2, maintaining 40 sites in localhost and promoting them via FTP, etc. the “traditional” way (again, not too productive).  And it is highly experimental, as mentioned here.
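
The hosting pattern at the core of that is roughly the standard DLR hosting API, sketched below (a minimal illustration, not GSB’s actual code; the WCF plumbing and the browser REPL are omitted, and the variable names are made up):

```csharp
using System;
using IronPython.Hosting;
using Microsoft.Scripting.Hosting;

class ScriptHost
{
    static void Main()
    {
        // Host the DLR's Python engine in-process; GSB exposes an engine like
        // this through WCF services so a browser-based REPL can drive it.
        ScriptEngine engine = Python.CreateEngine();
        ScriptScope scope = engine.CreateScope();

        scope.SetVariable("site", "example-store-07"); // hypothetical variable
        string result = engine.Execute<string>("'patched ' + site", scope);

        Console.WriteLine(result); // "patched example-store-07"
    }
}
```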

With respect to Amazon EC2 and Windows Azure, this is something completely different.  My IDE is “in the cloud,” whereas with Azure and EC2, it is the applications that are “in the cloud.”

Yes, I do have the minimum number of components (I don’t need the Agent SDK; it is just used for the Robby the Robot demo), and almost all of the components are open source that I have reused in my application.  My code is the glue code and the programming for the overall functionality, along with the ability to “reflect” assemblies on the remote computers so I can see the custom classes, types and members I have at my disposal.
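
The “reflect” part is plain System.Reflection underneath; here is a minimal sketch of the idea (the DLL path is a made-up placeholder):

```csharp
using System;
using System.Reflection;

class AssemblyInspector
{
    static void Main()
    {
        // Load an assembly from disk and list its public types and members --
        // the "what do I have to work with?" view for a remote machine.
        Assembly asm = Assembly.LoadFrom(@"C:\site\bin\CustomClasses.dll");

        foreach (Type t in asm.GetExportedTypes())
        {
            Console.WriteLine(t.FullName);
            foreach (MemberInfo m in t.GetMembers(
                BindingFlags.Public | BindingFlags.Instance | BindingFlags.DeclaredOnly))
            {
                Console.WriteLine("  {0}: {1}", m.MemberType, m.Name);
            }
        }
    }
}
```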

With respect to Silverlight and IronPython in the browser, the IronPython execution is actually local and not on the remote computer.  I won’t get into the other productivity issues of trying to develop and debug a Silverlight application as that is another whole can of worms.

Finally, to answer the question about Michael Foord’s Silverlight in the browser, have a look at this web-based IDE round-up.

 

Pfedor says on Hacker News, “The article is interesting, but I can't help but think that large part of the guy's problem is that he's not using Unix.”  I am wondering how that would make my application any simpler or me more productive, please do tell.

 

On Hacker News in general, about VB6 – I think the point is being missed.  I was very productive developing desktop applications in VB6, but there is no equivalent for web applications.

 

Thomasmallen on Hacker News says, “Big surprise. They guy's sick of programming because he never liked coding in the first place. This is his ideal development model…”  Dude, the very first line of my article says I love to write code and have been doing it full-time for 17 years.  If I were sick of it, would I still be doing it?  I don’t think so.  I would have opened up that flower shop by now!

 

Munificent on Programming Reddit says, “His core argument is wrong. Frameworks don't exist to make programmers' lives easier. Our job difficulty is essentially fixed: we will push ourselves as hard as we can. What frameworks do is increase the scope of what we can accomplish within that level of effort.”  I disagree, and I think you missed the point.  There are so many languages and frameworks in use that it is a humungous job to stitch anything together easily.  At least in my .NET web application world.  That’s the point.

 

Gregk on Programming Reddit says, “We can't re-invent the wheel all the time. There is a tendency for over-engineering a little these days. But that is avoidable. In general, I can say we have better tools than 10 years ago but we are also more demanding and tackling more complex tasks.”  Again, I think my point may have been missed.  I am not trying to reinvent the wheel or over-engineer anything; I am trying to solve a business problem in the most productive way possible.  Like I said before, VB6 let me solve a business problem simply, and was a better desktop tool almost 10 years ago than the web development tools we have today for solving the same problem, except on the web.

 

Vhold on Programming Reddit says, “Every now and then this sort of thing is said, and there is always a very significant missing piece. The software people are writing now does a lot more, and a lot more reliably (on average) than in the past.”  Hmm, does a lot more, says who?  I think we are solving the same problems over and over again, and it is getting harder to do, as our tools and frameworks are not making it any easier – they keep increasing in size and complexity, yet our business problems stay the same.  An ecommerce application 10 years ago is “basically” the same as it is today.  Sure, there is lots of extra stuff, but the bottom line is to add stuff to the cart and get it transacted; or at least that is what my ecommerce customers really care about, and they care about doing it faster and cheaper than 10 years ago.  That’s the issue.

 

Recursive on Programming Reddit says, “No one's stopping you from doing pure programming.

Write your own web server in a procedural language, and then write your own procedural web app on top of it.  The big scary complicated frameworks will be waiting when you're done banging your head on the desk and crying.”  Dude, who said anything about me wanting to do pure programming?  Show me in my post where it says I want to do that.

 

Ken on unClog says, "I see several markup (HTML, CSS) and programming languages (C#, Javascript, Python, Ruby?), operating systems (Windows and Mac OS X?), system introspection and end-user programmability, and a third-party 3D plugin. It wasn’t too many years ago this would have been impressive in a desktop app. I’m not sure where the “simple” is. If you’re looking for a program to tie all these things together, it’s going to have a lot of inherent complexity. The solution to “too many languages” is never “another language”. I’m curious why one would think that such a complex web app would be easy out-of-the-box with a particular language or framework. Frameworks make common things easier. Uncommon things will always take more work." 

 

If I had a prize to give out, Ken would get it!  My one-page web application is not simple.  It was a trick question.  My one-page web app was “designed” to be “simple” to use; it was not simple to design and build.  I had to use a lot of languages, frameworks and components that were never really designed to work together, and that was the effort involved in reaching my goal.

 

There is real wisdom in the statement, “The solution to ‘too many languages’ is never ‘another language’.”  Maybe that is what I was looking for: a web-based language that can tie HTML, JavaScript, CSS, etc. into one language – a web language.  The point of the VB analogy was that I could indeed use one language, and even with 3rd-party plug-ins, develop a fairly sophisticated desktop application.  Actually, that could be said of many languages when developing a desktop application.  But in the web world, there seems to be no choice but to have multiple languages, scripting languages and mark-up this or that, each with its own strengths and weaknesses, which results in lost productivity.  Or at least it does for me.

 

That is why I liked Evrim's response, “We've faced similar problems in past, and the only solution we've found is to rewrite everything from scratch.”  He even has an example web site written in their own technology.  I have not had a chance to look at it in detail, but from an innovation perspective, it looks promising to me and breaks out of the HTML, JavaScript, CSS, et al. paradigm.  I hope his work continues.

 

My point in all of this, this being a blog about software industrialization, is that we, the software development industry, have not caught up to developing web applications the way we can develop desktop applications.  I would say that had I developed my open source web app as a desktop app, it would have taken me a tenth of the time, meaning I would have been 10x more productive (probably more like 100x), and that is my point.  I hope the future is not too far away where we can develop sophisticated web apps with the same speed and ease as we develop desktop apps today, regardless of the choice of languages or frameworks.

# Friday, 05 December 2008

Wow!  Was I surprised to see the number of page loads reported in StatCounter for my little blog article from the night before.  My PII 600MHz, 384MB RAM web server, circa 2001, was barely able to keep its CPU from melting down.

Double wow on some of the comments.  It seems split between those who got the article and others for whom, giving myself the benefit of the doubt, maybe I was not as clear as I could have been about what I was trying to communicate.

However, having looked at some of the Hacker News comments and Programming Reddit comments, it is clear that some people did not get the point I was trying to make.

Let me try again, using a concrete example this time.  Let’s use the open source web application I wrote as the example, because you are free to download the source code and examine it to your heart’s content and make up your own mind about what my beef is, if you so desire.

My web app is really simple; in fact, the application consists of exactly one page!  Mind you, there is a lot going on, as you will see, but still, it is one page.  For now, who cares what it does; let’s look at the list of technologies, frameworks, languages, components, etc., used (and reused) in this “one page” web application:

  1. ASP.NET 2.0 framework classes
  2. ASP.NET AJAX (.NET 3.5 FCL)
  3. ASP.NET AJAX Control Toolkit (Tab and Hover controls)
  4. .NET 3.5 Windows Communication Foundation (WCF) framework classes
  5. Visual Studio 2008 IDE
  6. C#
  7. JavaScript
  8. VBScript
  9. CSS
  10. HTML
  11. XML
  12. MDbg CLR Managed Debugger (mdbg) Sample application and API framework
  13. .NET 2.0 Winforms framework classes
  14. Microsoft Agent SDK (API framework)
  15. Microsoft Remote Desktop Web Connection Active X component
  16. Google Earth application
  17. Google Earth Airlines ActiveX web plug-in for Google Earth with JavaScript API
  18. Prototype.js open source AJAX framework
  19. IronPython 2.0B2 and Dynamic Language Runtime (DLR) with IronRuby support
  20. IronPythonTextbox – open source IronPython rich client text box and interpreter
  21. Edit Area – open source JavaScript source code editor with syntax highlighting
  22. Color List Box – open source WinForms modified List Control
  23. Joshua – open source interactive JavaScript HTML console window
  24. RealWorldGrid – open source ASP.NET modified GridView control

For those that did read the first post in this series, do you now see what I mean?  No? Let me further illuminate with the following ten points:

  1. Did I say I hated frameworks?  No
  2. Did I say I wanted to reinvent the wheel?  No.
  3. Did I say I wanted to do pure programming?  No.  
  4. Did I say I wanted to code from scratch and roll my own framework?  No.
  5. Did I say I did not want to use frameworks or reuse code?  No.
  6. Did I say that I wanted to return to my beloved VB 6 as my “ideal development model?” No, in fact, I have not used VB6 since 2000 when I converted to C# .NET (I was an early adopter).
  7. Did I say I was sick of programming?  No, first sentence in previous post – I love to write code!
  8. Yes, I have tried several programming languages, OSes, IDEs, etc., over the last 17 years that I have been programming professionally – this is not about programming languages, other than that they are part of a much larger issue.
  9. Am I a hobbyist programmer?  No, and not to sound defensive, but I am a professionally trained software engineer with a 2-year post-grad in Software Engineering Management and 17 years of professional experience.  I have worked for Motorola (CMM Level 5), Kodak (CMM Level 3) and several other shops of various sizes and industries, plus I ran my own successful custom software development company for four years with a staff of 25 people.  Currently I work for one of the best custom software development companies, producing enterprise ecommerce solutions for some of the largest retailers and supply chains in the world – the odds are that you have already used our ecommerce software and don’t even know it.
  10. You want to interview me?  (For those that asked.)  First, pass my test by downloading my open source code and explaining to me why my software design is good or bad, and the reasons for either.  Then we will talk.

Enough tomfoolery.  What I said was that web application frameworks, components and tools are like 10 years behind the tools we are used to for developing desktop applications.  In fact, I stated that it is getting worse, not better.  I used VB6 in a simple two-tier application scenario as the example.  Now let’s compare that to my web app.

Look at how much “stuff” has to be used to make my “one page” web application work.  Imagine developing this as a desktop application.  You could easily cut out 50% of the stuff used in the web application and still have the same functionality in a desktop application, not the least of which would be having a working WYSIWYG design editor.

This is what my one page web app looks like rendered in IE 7.

 

This is what my one page web app looks like rendered in Visual Studio 2008 design time mode.

 

And no, this is not just a Visual Studio problem; I have run into this with a variety of other web development tools.

My point is illustrated by the number of items needed to make a one-page web application, and by the design-time view of that web application.  Way too much stuff, and I can’t even see what I am doing!  How is this helping me be more productive?  It isn’t, and that is my point.  Our frameworks and tools for developing web applications are getting worse, not better.  Vendors keep adding more and more stuff, yet we can’t even get an editor that shows a proper design-time view of what we are trying to build.

How would an engineer design an automobile (or house or electronics or cell phone or game console or name your item) in AutoCAD with a design time view like the one for my web application above?  It would not only be totally unacceptable, it would be ludicrous!  So why do we, the software development industry, accept it?  And worse yet, why do people defend it in some of the comments I have read?  That is what makes no sense to me at all.  Am I the only one with the sunglasses from They Live?  I hope not, cause if I remember the movie correctly, it ends badly ;-)

Updated:  A Programmer’s Dilemma - Productivity Lost - Answers

Friday, 05 December 2008 23:45:15 (Pacific Standard Time, UTC-08:00)  #    Comments [11]
# Thursday, 04 December 2008

I love to write code.  I am 49 years old and have been programming off and on from 1977 to 1990 and as a full-time professional since 1991.  I hesitate to even guess how many programming languages I have used over that time frame.  Since I love to program, not only was I using several different programming languages at various jobs, I was also experimenting with several others after work.

I don’t do as much “on the job” programming as my role has been a “Software Architect” for several years now, but I still do a fair share and even more so in my spare time.  For example, I released an open source project that I have been working on for the last two years called Global System Builder.  It was supposed to be fun, but that is the crux of the issue I am having – it was mostly a lot of really hard work.  Not that it was technically difficult, but seemingly simple things turned out to be extremely hard to do.

Let me digress a moment to illustrate a point.  As the domain name of this blog indicates, I was progressing through programming languages as the level of abstraction was being raised over time.  Meaning, in modern times, not worrying about memory management thanks to garbage collection in languages like C#, enjoying the REPL feel of dynamic programming languages (e.g. IronPython and IronRuby) and marvelling at the power of functional languages (e.g. F#) and then...

I came to the realization that as time marches on, rather than programming becoming easier (read: simpler), it is actually becoming more complex, to the point where today even writing something simple seems to take a monumental effort, with seemingly more configuration effort than programming, and with so many moving parts, it is fraught with errors that are not compile time related.  I felt this not only in the professional software development world that I live in, but even on the open source project I was working on in my spare time for fun.  And it was supposed to be fun, but instead it was a design exercise at every turn, figuring out which of the programming languages, frameworks, components and widgets I could use that were the lesser evil, since none of them did what I wanted them to do.  As you will see, this is the irony.

Sure, we use all sorts of frameworks today that supposedly make our programming lives easier.  The one I am most familiar with is the .NET Framework.  At more than 11,000 types and 100,000 members, I am overwhelmed by the sheer size and complexity of the framework.  I can barely wrap my head around 7 ± 2 items, let alone something three orders of magnitude larger.  I spend more time looking at MSDN documentation trying to figure out how some type works and which members I can use than actually writing code.

The argument is that we become more productive because “it is taken care of in the framework.”  My experience, and that of others, would tend to disagree from a practical perspective, to say nothing of the simple math above.  Our brains are not designed to juggle thousands of types, and so we spend a great deal of time searching, looking up docs, figuring out what to use from a design perspective, looking at the samples from an implementation perspective, looking at how others have used it – only to find that while it is close to what I want, it does not really meet my requirements.  Fine, close enough, and with a few overrides, no problem.  But when you get into low level design and implementation, you run into hard stop limitations and then you a) try and find ways around the limitations or b) go back to the drawing board or c) think about not programming anymore.
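To make that “hard stop” concrete, here is a minimal sketch of option (a) – my illustration, not anything from the framework docs.  System.Text.StringBuilder, for example, is sealed, so you cannot subclass it; when it falls just short of a requirement (say, counting appends), the usual way around the limitation is to wrap it and re-expose only what you need:

```csharp
using System.Text;

// A minimal sketch of "finding a way around the limitation":
// StringBuilder is sealed, so instead of subclassing it, wrap it.
// The CountingStringBuilder name is hypothetical.
public class CountingStringBuilder
{
    private readonly StringBuilder _inner = new StringBuilder();

    public int AppendCount { get; private set; }

    public CountingStringBuilder Append(string text)
    {
        _inner.Append(text);
        AppendCount++;
        return this;
    }

    public override string ToString()
    {
        return _inner.ToString();
    }
}
```

The wrapper works, but notice the cost: boilerplate that exists only to route around a framework decision, which is exactly the kind of effort that feels like configuration rather than programming.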

And so ends the digression.  The point being, as told by Charles Petzold in “Does Visual Studio Rot the Mind?” under the subtitle “The Pure Pleasures of Pure Coding,” where he states, “...but there’s no APIs, there’s no classes, there’s no properties, there’s no forms, there’s no controls, there’s no event handlers, and there’s definitely no Visual Studio. It’s just me and the code, and for awhile, I feel like a real programmer again.”

At last we arrive at why I am possibly deciding not to code anymore.  There is so much in the way and there is so much “of it”, that doing anything simple has become an incredibly complex task and doing anything complex takes teams of software folks to deliver due to the sheer size and complexity of our own programming environments, let alone trying to solve the problem domain we have been given.

Here comes the “I remember a time when...”  Love it or hate it, Visual Basic 6 (or the Delphi equivalent at the time) was the most productive programming language and toolset I have ever worked with (Digitalk Smalltalk takes 2nd place) for building business applications in the 17 years I have been doing software development.  Why?

I used to do storyboards right in the VB6 IDE, asking the business guys what they wanted: “so you need a form, with a list box, and what should be in the list box, and when you clicked OK, now what...”  And on it went.  In a couple of days to a week, we basically had the front end of the application prototyped.  Since it was back in the “two tier” days, all we had to do was hook it up to the database.  And if we got really “fancy” (or had the luxury), we added a third tier of business logic written as VB6 classes, not in the stored procs, and after a bunch of testing, report writing, etc., boom!  Off it goes into production.
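For anyone who never lived the two-tier days, the whole pattern was roughly what follows.  This is a hedged C# translation on my part (the VB6 original would have been even shorter); the connection string and Customers table are invented for illustration:

```csharp
using System;
using System.Data.SqlClient;
using System.Windows.Forms;

// A sketch of the two-tier pattern: the form is tier one, the database
// is tier two, and "hooking it up" is a handful of lines.
public class CustomerForm : Form
{
    private readonly ListBox customerList = new ListBox { Dock = DockStyle.Fill };

    public CustomerForm()
    {
        Text = "Customers";
        Controls.Add(customerList);
        Load += LoadCustomers;
    }

    private void LoadCustomers(object sender, EventArgs e)
    {
        using (var conn = new SqlConnection("Server=.;Database=Sales;Integrated Security=true"))
        using (var cmd = new SqlCommand("SELECT Name FROM Customers ORDER BY Name", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    customerList.Items.Add(reader.GetString(0));
            }
        }
    }

    [STAThread]
    static void Main()
    {
        Application.Run(new CustomerForm());
    }
}
```

Give or take the testing and the report writing, that was the whole application.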

There were a lot of happy people back then.  The business users were happy because they got to sit in and basically design the user interface with me while figuring out the app.  And then weeks later it was delivered and did exactly what they wanted it to do.  I was happy, along with my team, as we got a buzz on from being so productive and delivering what the users wanted.  And the tools, language and database were simple enough back then not to get in anyone’s way.  Everyone happy!  So what happened?

One part of it is web applications happened.  Even in 2008, Visual Studio ASP.NET cannot give you a WYSIWYG view of your web application.  Yet VB1 was able to give us WYSIWYG back in 1991!  What the heck?  Not to just pick on Microsoft, I would say the majority of vendors’ web development tools are a decade or so behind their “traditional” tools for building desktop applications.  Further, the vendors push more and more features, more and more “layers,” which mean more moving parts, which means more complexity, which means making it harder and harder for the programmer to use.  It reminds me of the infamous air-conditioning “duct” scene in the movie Brazil.

Our tools and applications have become so large, so numerous and so complex that they are literally making us less productive.  And to me, productivity is everything.  Productivity.  It is as simple as that.

Updated: A Programmer’s Dilemma - Productivity Lost - Part II

Thursday, 04 December 2008 22:24:03 (Pacific Standard Time, UTC-08:00)  #    Comments [12]
# Sunday, 19 October 2008
Get Off My Cloud

Dare Obasanjo wrote about Cloud Computing and Vendor Lock In, where Dare commented on Tim Bray’s Get In the Cloud, in which Tim described a couple of issues.  One of them, called a tech issue, was: “The small problem is that we haven’t quite figured out the architectural sweet spot for cloud platforms.”  I would say we have not figured out the architectural sweet spot for “any” platform, cloud or otherwise.

 

It seems that every API we program to, whether cloud based or not, is still a custom, one off API.  It has been that way since the dawn of software programming and still continues that way, even when we are in the clouds.

 

What am I talking about?  I know people cringe at analogies to software development, but this one contains the single point I am making.  Do you know what an 8 pin DIP is?  DIP stands for dual inline package, and an 8 pin DIP is one “standard” package (or interface) for integrated circuits (ICs).  The keyword being “standard.”  There are millions of different types of 8 pin DIP ICs out in our world today, used in virtually anything electronic.  Much like our accompanying software – however, there is one crucial distinguishing difference between the two.  Can you guess what it is?

 

Those millions of IC types can fit into “one” package, or put another way, are exposed through “one” interface, and that is what the 8 pin DIP is: an interface.  Through that one interface type or package, I can access millions of features and functions.  Further, by using multiples of those packages/interfaces, I can make almost anything I want, from a motherboard, to a robot servo controller, to circuitry that makes my Blackberry a Crackberry (oops, that’s just me, the carbon unit), to circuitry that guides a rocket to the moon.

 

How come we have (still) not figured this out in our software world, where we continue to hand craft one off interfaces that seemingly are tied to the implementation, even though we don’t think of it that way (i.e. the vendor lock-in described in Tim’s article)?  Brad Cox seemed to have that figured out in his concept of a Software IC years ago and further in his book on Superdistribution.  A man before his time, I would say.

 

What’s my point?  My point is that at some point in time we will reach the inevitable conclusion that “how” you interface with any given piece of software functionality is more important than the functionality itself.  Know what I mean?

 

Imagine you woke up this morning and you could access all software libraries through an operations based API like Query, Create, Update, Delete, Copy and Move.  That was it.  That was your “complete” API, and all software libraries, components, features and functions exposed that API, where the general pattern is: create a request comprising one or more operations on specified items; submit the request to the API service for processing; interpret the result returned by the service.  That’s it.
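As a thought experiment only, here is a minimal C# sketch of what that “complete” API might look like – every name in it is hypothetical, not a real library:

```csharp
using System.Collections.Generic;

// A hypothetical universal operations-based API: six verbs, one interface.
public enum Operation { Query, Create, Update, Delete, Copy, Move }

public class OperationRequest
{
    public Operation Operation { get; set; }                    // which verb
    public string Item { get; set; }                            // what to operate on
    public IDictionary<string, object> Arguments { get; set; }  // verb-specific details
}

public class OperationResult
{
    public bool Succeeded { get; set; }
    public object Value { get; set; }
}

// The *entire* public surface of any library, component or service.
public interface IOperationService
{
    IEnumerable<OperationResult> Submit(IEnumerable<OperationRequest> requests);
}
```

Whether the library behind the interface is a grid control or a message queue, the consumer learns the same six verbs; only the items and arguments vary.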

 

Remember I am just talking about the interface exposed by the software library and not its internal representation.  In other words, a standard API used by everyone.  How would that change the way we would consume libraries (or components or services or features or functions) to develop software today?

 

Don’t get me wrong, I am not saying we are not heading in this direction – we are.  There are several examples of this today, but it is not as ubiquitous, universal or all encompassing as it could be.  As a .NET developer using the .NET framework class library of some 30,000 types, I can tell you it is not that way today.  In fact, it does not matter what framework class library I use; to be clear, I am not picking on Microsoft.  I am sure that at the PDC, with some 38 sessions on cloud services and the announcement of .NET 4.0, we will see some interesting developments there; I am just hoping they are in the direction of a standardized interface for cloud services.

 

Maybe it is just wishful thinking that I will see, in my lifetime, a standard interface or package like the 8 pin DIP for software libraries, where through one “standard” interface I can access thousands of library functions.  Then again, software industrialization is occurring at a dizzying pace and I can’t help but feel that it is just around the corner.  History in the making, as the phrase goes, and I hope to be a part of it.

Sunday, 19 October 2008 22:45:16 (Pacific Daylight Time, UTC-07:00)  #    Comments [2]
# Wednesday, 26 September 2007

In Part 1, I briefly introduced the guide to the Software Engineering Body of Knowledge (SWEBOK) that was published in 2004.  “The IEEE Computer Society establishes for the first time a baseline for the body of knowledge for the field of software engineering…”  We will come back to the SWEBOK in a future post, as this post is about how to qualify for “professional” league play.

 

In Part 1, I discussed software engineering as being a team sport.  This is nothing new as far as I am concerned, but I am still amazed when a multi-million dollar software development project is assigned to, and executed by, a newly assembled team.  This team has never played together, and quite likely consists of everyone from shinny players to NHL stars and everything in-between.  Now imagine that this team is somehow “classified” as an amateur Junior “B” league hockey team and their very first game is to play a real NHL professional hockey team.  What is the likelihood of the B hockey team winning?  Don’t forget, the B team has not played together as a team yet.  Their first game is to play together and play against an NHL pro hockey team.  Did I mention that just before actual game play the team figures out that they are missing their Center, who also happens to be the Captain of the team?  Again, what is the likelihood of success?

 

Of course, this hockey scenario would never happen in real life, but it certainly happens in our software development world, where teams (assembled or not) get assigned software development projects that are way beyond their software engineering capabilities to even have a chance at successful delivery.  I see it every day.  And to varying degrees, I have seen it over the past 16 years of being in the software development business.  I have been fortunate, as some of the dev teams I have worked on are at “professional” league play, where all team members have 10+ years of software development engineering experience.  Aside from work being fun on these teams, our success rate for the design and delivery of software on time and on budget was very high.  Our motto was: give us the biggest, baddest dev project you’ve got – c’mon, bring it on!

 

However, most teams I have worked on have been a mix of everything from shinny players to pros.  Our success rate was highly variable.  And even more variable (chance has better odds) if the model is assembling players from the “resource pool.”  Some of the shinny players have aspirations of becoming pros and take to the work, listening to the guidance of the pros.  Other shinny players are finding out for the first time what it means to try out in the pro league and are questioning whether they are going to pursue a career in software development.  It’s a hard road, being unindustrialized and all.

 

There are shinny players (and pro posers) that try real hard but simply don’t have the knowledge or skills (aptitude?) to play even at the shinny level, for whatever reason.  This is especially difficult to deal with, if one or more of this type of player happens to be on your team.  It is the equivalent of carrying a hockey player on your back in the game because s/he can’t skate.  Never happens in the pro hockey league, but amazingly often in our software development world.  If our software development world was more “visible”, you would see in every organization that develops software of any sort, some people wandering around carrying a person on their back.  It would be kind of amusing to see, unless of course you were the one carrying… or being carried.

 

That is the irony of what we do as software developers.  It is nowhere near as “visible” as a team (or even individual) sport where everyone can visibly see exactly what you are doing.  And even if people could see what you are doing, it still may not matter, because to most people, other than software developers, software engineering is a complete mystery. 

 

So how does one find out what level of league play you are at?  One problem in our software development world is that we do not have levels of team play.  We are not that mature (i.e. industrialized) yet.  Well, some would say that the CMMI has levels of play, and I would agree, but I am talking about the individual level here first; then we will discuss what that means when you qualify to be on a team (or even the remote possibility of finding a team that has the same knowledge and skill level you do).  Another way to determine your knowledge and skill level is through certifications.  There are several ways to get certified.  Some popular ones are the Certified Software Development Professional and the Open Group’s IT Architect Certification Program.  Other certifications are directly related to vendors’ products, such as Microsoft’s Certified Architect and Sun’s Certified Enterprise Architect.

 

For this series of articles, I am looking at level of play as being a “professional software engineer.”  I firmly believe that software development is an engineering discipline, and it appears that there is only one association in the province I live in that recognizes this: the Association of Professional Engineers and Geoscientists of BC (APEGBC).  The designation is called a Professional Engineer or P.Eng.  “The P.Eng. designation is a professional license, allowing you to practice engineering in the province or territory where it was granted.  Only engineers licensed with APEGBC have a legal right to practice engineering in British Columbia.”  Of course, this may be slightly different depending on your geographic location.

 

Note how this is entirely different from any other certification – it is a professional license giving you a legal right to practice software engineering.  The term Professional Engineer and the phrase practice of professional engineering are legally defined and protected both in Canada and the US.  “The earmark that distinguishes a licensed/registered Professional Engineer is the authority to sign and seal or "stamp" engineering documents (reports, drawings, and calculations) for a study, estimate, design or analysis, thus taking legal responsibility for it.”

 

Ponder this for a while: what would it mean in your software development job right now if you were a professional software engineer and were legally responsible for the software that you designed?  How would that change your job today?  I know it would change mine.

 

None of this is news to practicing electrical, electronic and other engineering disciplines, as it has been standard practice for years, even decades.  Yet it is seemingly all new to us software professionals.

 

Let’s take a look at the requirements for applying for P.Eng., specifically related to software engineering.  These two documents are: "2004 SOFTWARE ENGINEERING SYLLABUS and Checklist for Self-Evaluation” and “APEGBC Guide for the Presentation and Evaluation of Software Engineering Knowledge and Experience.”    

 

From the Software Engineering Syllabus:  “Nineteen engineering disciplines are included in the Examination Syllabi issued by the Canadian Engineering Qualifications Board (CEQB) of the Canadian Council of Professional Engineers (CCPE). Each discipline examination syllabus is divided into two examination categories, compulsory and elective. Candidates will be assigned examinations based on an assessment of their academic background. Examinations from discipline syllabi other than those specific to the candidate’s discipline may be assigned at the discretion of the constituent Association/Ordre. Before writing the discipline examinations, candidates must have passed, or have been exempted from, the Basic Studies Examinations.”

 

That’s right – exams.  While I have 16 years of professional software experience, I may end up having to write exams.  I wrote 17 exams in a postgraduate Software Engineering Management program, so what’s a few more?  I say bring it on.  Oh yeah, did I mention I was applying to become a professional software engineer?  I want to work on teams near or at “professional” league play.  Practice what you preach, eh?  I will be documenting the process and my progress through this blog.

 

Open up the Software Engineering Syllabus PDF and have a look for yourself.  Can you take your existing education and map it to the basic studies?  How about Digital Logic Circuits?  Or how about discrete mathematics – remember your binomial theorem?  Now look at Group A.  Unless you are a very recent Comp Sci grad, it is unlikely you took Software Process – so perhaps exams for everyone!  Who’s with me?!

 

While I am having a little fun here, no matter what, it is going to be work.  And that’s the point, you don’t become a professional engineer overnight.

 

In addition to the educational requirements, have a look at the Presentation and Evaluation of Software Engineering Knowledge and Experience PDF.  You need a minimum of 4 years of satisfactory software engineering experience in these 6 main software engineering capabilities:

 

  1. Software Requirements
  2. Software Design and Construction
  3. Software Process Engineering
  4. Software Quality
  5. Software Assets Management
  6. Management of Software Projects

Some capabilities are further decomposed into sub-capabilities.  To each capability area can be associated:

 

  • the fundamental knowledge or theory, indirectly pointing to the model academic curriculum of Software Engineering,
  • the applicable industry standards,
  • the recognized industry practices and tools,
  • and a level of mandatory or optional experience on real projects. 

The document then defines what these capabilities and sub-capabilities mean.  Finally, the document provides a suggested presentation of experience and an example of how to lay out your projects.  Seems straightforward enough, but when you sit down and actually go through it, remembering the projects you worked on and mapping the capabilities used against the sub-capabilities, it can take a while.

 

While I won’t go through all the details of the other requirements, which you can read in this 25 page application guide, two other items stand out: one is writing the Professional Practice Exam (in addition to attending the Law and Ethics Seminar), and the other is references.

 

The Professional Practice Exam. Before being granted registration as a Professional Engineer, you must pass the Professional Practice Examination.  The Professional Practice Examination is a 3-hour examination consisting of a 2-hour multiple-choice section and a 1-hour essay question. The examination tests your knowledge of Canadian professional practice, law and ethics.  There is a (large) book reference that you need to study in order to prepare for the exam.

 

The reference form is interesting in the sense that the Association requires that referees be Professional Engineers with first-hand working knowledge of the applicant’s work, and that the applicant must have been under suitable supervision throughout the qualifying period.  I don’t know what your experience has been, but in my 16 years of being in the software development business, I have only worked with two professional engineers.

 

One more specific aspect I would like to point out is the APEGBC Code of Ethics.  The purpose of the Code of Ethics is to give general statements of the principles of ethical conduct in order that Professional Engineers may fulfill their duty to the public, to the profession and their fellow members.  There are 10 specific tenets and while I understand and appreciate each one, there is one in particular that is very apt to the state of software engineering today and that is:

 

(8) present clearly to employers and clients the possible consequences if professional decisions or judgments are overruled or disregarded;

 

You know what I mean.

 

This sums up the application process for becoming a professional software engineer. As you can see it is considerable effort and can take 3 to 6 months to acquire your license.  However, the main benefit is that it tells employers that they can depend on your proven skills and professionalism.  The main benefit for me is to use it as a qualifier for any new software development teams I will join in the future.  My goal is to work on teams that are at the “NHL” level of play. 

 

In Part 3 we are going to dive into the SWEBOK for an explanation of what the guide to the Software Engineering Body of Knowledge is and how, through this knowledge and practice, we as software professionals can assist in the industrialization of software.

Wednesday, 26 September 2007 20:09:21 (Pacific Daylight Time, UTC-07:00)  #    Comments [3]
# Thursday, 13 September 2007

“In spite of millions of software professionals worldwide and the ubiquitous presence of software in our society, software engineering has only recently reached the status of a legitimate engineering discipline and a recognized profession.”

 

Software Engineering Body of Knowledge (SWEBOK) 2004.

 

 

“Software industrialization will occur when software engineering reaches critical mass in our classrooms and workplace worldwide as standard operating procedure for the development, operation and maintenance of software.”

 

Mitch Barnett 2007.

 

I had the good fortune to have been taught software engineering by a few folks from Motorola University early in my career (1994 – 96).  One of the instructors, Karl Williams, said to us on our first day of class, “we have made 30 years of software engineering mistakes which makes for a good body of knowledge to learn from.”  He wasn’t kidding.  A lot of interesting stories were told over those two years, each of which had an alternate ending once software engineering was applied.

 

Over 16 years, I have worked in many different software organizations, some categorized as start-ups, others as multi-nationals like Eastman Kodak and Motorola, and a few in-between.  I have performed virtually every role imaginable in the software development industry: Business Analyst, Systems Analyst, Designer, Programmer, Developer, Tester, Software/Systems/Technical Architect, Project/Program/Product Manager, QA Manager, Team Lead, Director, Consultant, Contractor, and even President of my own software development company with a staff of 25 people.  These roles encompassed several practice areas including, product development, R&D, maintenance and professional services.

 

Why am I telling you this?  Well, you might consider me to be “one” representative sample in the world of software engineering because of the varied roles, practice areas and industries that I have been in.  For example, in one of the large corporations I worked in, when a multi-million dollar software project gets cancelled, for whatever reason, it does not really impact you.  However, if you happen to be the owner of the company, like I have been, and a software project of any size gets cancelled, it directly affects you, right in the pocket book.

 

I wrote this series on software engineering to show people in our industry what the real world practice of software engineering is, how it might be pragmatically applied in the context of your day to day work, and, if possible in your geographical area, how to become a licensed professional software engineer.  Whether you are a seasoned pro or a newbie, hopefully this series will shed some light on what real world software engineering is from an “in the trenches” perspective.

 

One interesting aspect of real world software engineering is trying to determine the knowledge and skill of the team you are on or maybe joining in the near future if you are looking for work.  While there are various “maturity models” to assist in the evaluation process, currently only a small percentage of organizations use these models and even fewer have been assessed.

 

Did I mention that software engineering is a team sport?  Sure you can get a lot done as “the loner”, and I have been contracted like that on occasion. I am also a team of one on my pet open source project.  However, in most cases you already are, or will be part of a team.  From a software engineering perspective, how mature is the team?  How would you tell?  I mean for sure.  And why would you want to know?

 

Software engineering knowledge and skill levels are what I am talking about.  Software engineering is primarily a team sport, so let’s use a sports team analogy, and since I am from Canada, it’s got to be hockey.  Real “amateur” hockey may be termed “pick up” hockey or, as we call it in Canada, “shinny.”  This is where the temperature has dropped so low that the snow on the streets literally turns to ice – perfect for hockey.  All you need are what look like hockey sticks, a makeshift goal and a sponge puck.  I can tell you that the sponge puck starts out nice and soft at room temperature, but turns into a real hard puck after a few minutes of play.  The first lesson you learn as a potential hockey star is to wear “safety” equipment for the next game.

 

Pick-up hockey is completely ad-hoc with minimal rules and constraints.  At the other end of the spectrum, where the knowledge and skill level is near maximum maturity level, is the NHL professional hockey team.  The team members have been playing their roles (read: positions) for many years and have knowledge and skills that represent the top of their profession.

 

How does one become a professional hockey player?  It usually starts with that game of shinny, and over the years you progress through various leagues/levels until at some point in your growth it is decided that you want to become a professional hockey player.  Oh yes, I know a little bit about talent (having lived the Gretzky years in Edmonton), luck of the draw, the level of competition and the sacrifices required.  The point being, when you join a hockey team it is pretty “obvious” what knowledge and skill level the team is at.  And if it isn’t, it becomes pretty apparent on ice in about 10 minutes whether you are on the right team or not – from both your and the team’s perspective.

 

In the world of software engineering, it is not obvious at all if you are on the right team or not and may take a few months or longer to figure that out.  Why?  Unlike play on the ice where anyone can see you skate, stick handle and score, it is very difficult for someone to observe you think, problem solve, design, code and deliver quality software on time, on budget.  This goes for your teammates as well as the evaluation of… you.

 

Software engineering is much more complicated than hockey for many different reasons, one of them being that the playing field is not the self contained physical world of the hockey rink.  The number of “objects,” from a complexity point of view, is very limited in hockey – in fact, not much more complex than when you first started out playing shinny, other than the need for safety gear, a small rule book and a hockey rink.

 

The world of software engineering is quite a bit more complex and variable, particularly in the knowledge and skills of the team players.  It is likely that your team has a shinny player or two, an NHL player and probably a mix of everything in-between.  Without levels or leagues to progress through, it is more than likely – probably a fact – that the team is comprised of members at all different levels of software engineering knowledge and skill.  Why is this not a good thing?

 

This knowledge and skills mix is further exacerbated by the popularity of the “resource pools” that organizations favor these days.  The idea is to assemble a team for whatever project comes up next with resources that are available from the pool.  As any coach will tell you, at any level of league play, you cannot assemble a team overnight and expect them to win their first game or play like a team that has been playing together for seasons.  This approach just compounds the mixed skill level problem by throwing people together and expecting a successful delivery of the software project on their very first try.  We have no equivalent to “try outs” or training camps.

 

And that’s a big issue.  People like Watts Humphrey, who was instrumental in developing the Software Engineering Institute’s Capability Maturity Model, realized this and developed the Personal Software Process.  Before you can play on a team, there is some expectation that the knowledge and skill levels are at a certain point where team play can actually occur with some certainty of what the outcome is going to be.  I have a lot of respect for Watts.

 

So how do we assess our software engineering knowledge and skills?  In part, that is what the guide to the SWEBOK is about.  It identifies the knowledge areas and provides guidance to book references that cover those knowledge areas for what is accepted as general practice for software engineering.

 

The other assessment part is to compare our knowledge and skills to what is required to become a licensed professional engineer (P.Eng.).  We will do this first, before we look at the SWEBOK in detail.

 

I see licensing professional software engineers as a crucial step towards software industrialization.  I base this on my “in the trenches” experience in the electronics world prior to moving into the software development world.  The professional electronics engineers I worked with had a very similar “body of knowledge”, except in electronics engineering, that was required to be understood and practiced for 4 years in order to even qualify to become a P. Eng. 

 

Most importantly, the process of developing an R&D electronics product and preparing it for mass production, which I participated in for two years a long, long time ago, was simply standard operating procedure.  There was never any question or argument as to what the deliverables were at each phase of the project, who had to sign them off, how the product got certified, why we designed it this way, etc.  Compared to our software industry, that’s what the guide to the Software Engineering Body of Knowledge is all about: a normative standard operating procedure for the development, operation and maintenance of software.  Yes, it is an emerging discipline; yes, it has limitations; but it is a good place to start, don’t you think?

 

In Part 2, we are going to look at the requirements in British Columbia for becoming a licensed software engineer.  We will use these requirements to assess the knowledge and skill level to uncover any gaps that might need to be filled in the way of exams or experience.  If you want to have a look ahead, you can review the basic requirements for becoming licensed as a P. Eng., and the specific educational requirements and experience requirements for becoming a professional software engineer.

 

PS. Part 2 posted

 

PPS.  Happy Unofficial Programmer's Day!

Thursday, 13 September 2007 21:16:31 (Pacific Daylight Time, UTC-07:00)  #    Comments [0]
# Monday, 17 July 2006
Well, to answer the question, I wrote 80 mini articles in just over a year that comprises this blog.  However, if you want to read just one article on the subject, may I suggest, Planning the Software Industrial Revolution 15 Years Later.
 
My blog has been a year long journey of discovery which has led me to a conclusion in the field of software industrialization.  This conclusion is now my main pursuit.
 
Therefore, no more blogging and frankly, after 80 posts on the subject area, I don’t have anything more to say, or at least for a while anyway.
 
The one thing I do have to say is that I wish my blogging software had the blog titles in the Archives list instead of the month/year date labels, including the number of posts beside each month.  Given the subject matter, I find this rather ironic.  To overcome it, I manually cut ’n’ pasted the titles into this post (newest first, in descending order) as a human readable index to the articles.
 
So long, and thanks for all the fish. 

Dynamic Programming using IronPython Part 5 
Dynamic Programming using IronPython Part 4
Dynamic Programming using IronPython Part 3 
Dynamic Programming using IronPython Part 2 
Dynamic Programming using IronPython Part 1 
Software Reuse: Enigma or SOP? 
Software Industrialization using .NET Reflector and IronPython
A Software Design Tool idea using IronPython
Got any CAD Tools for Designing Software? 
Planning the Software Industrial Revolution 15 Years Later
Is IT a commodity or are we still in a Software Crisis? 
The Joy of Software or Why don’t they get it?  Part 3
Our Software Industry is a Sham(bles)
The Joy of Software or Why don’t they get it?  Part 2
Mini-Microsoft’s blog explodes on Vista delay
The Bloody Edge of being a Beta Tester
The Joy of Software or Why don’t they get it?  Part 1
Vista 5308, WPF and XAML way cool but got the Cider and Sparkle blues
Software Development’s nemesis – variability
The people you meet in the software biz
What’s new in software development for 2006?  Part 10 Finish
What’s new in software development for 2006?  Part 9
What’s new in software development for 2006?  Part 8
What’s new in software development for 2006?  Part 7
What’s new in software development for 2006?  Part 6
Why Software Industrialization?  Part III
What’s new in software development for 2006?  Part 5
What’s new in software development for 2006?  Part 4
What’s new in software development for 2006?  Part 3
What’s new in software development for 2006?  Part 2
What’s new in software development for 2006?  Part 1
Top Software Failure for 2005 or The Future of Software 
The Evolution of Software Industrialization Part 7 Conclusion 
The Evolution of Software Industrialization Part 6 
The Evolution of Software Industrialization Part 5 
The Evolution of Software Industrialization Part 4
The Evolution of Software Industrialization Part 3 
The Evolution of Software Industrialization Part 2 
The Evolution of Software Industrialization Part 1 Introduction
Marketecture or why software vendors are not helping industrialize our industry
Hey Software Programmer Stop Coding! 
Raising the level of abstraction Part II Programming with Models 
Book review Software Factories: Assembling Applications with Patterns, Models, Frameworks, and Tools  
SOA details Contract-First Development 
Service Oriented Architecture (SOA) the Truth 
Code is Model Damn Right! 
BRIDGEWERX A Software Factory approach for generating Application Integrations
lesscode and more design 
lesscode using Domain-Specific Languages (DSL) 
Leadership in the Software World 
Future Directions in Modeling Tools
SOA this, SOA that, SOA what!! 
Why Software Industrialization?  Part II 
Dynamic Interrogation SOA magic! 
An on-demand application generator for producing application integrations 
On-Demand Application Generation exposed 
Size does matter 
Generating On-Demand Workflow Applications in SharePoint
Sideways or Groundhog Day or both? 
What is a Software Factory? 
More on code generation 
What is code generation? 
Bringing a software invention into the business world 
A standard for application integration has been invented 
Raising the level of abstraction 
Software Industrialization Arrives 
Stupid Computer Tricks got your secret decoder ring? 
Size and complexity does matter 
The sorry state of our software industry is there hope? 
What is an IT Architect anyway? 
So you wanna be a certified IT Architect do ya? 
Doesn’t ANYONE want to be a programmer?
BRIDGEWERX a full fidelity software modeling tool ala AutoCAD style 
AutoCAD for software development?  Introducing model driven development 
The Dynamics of Software Development 
What we’ve got here is a failure to communicate
Software Engineering, Not Computer Science Part 2 
Why Software Industrialization? 
Software Engineering, Not Computer Science
 
Monday, 17 July 2006 15:24:24 (Pacific Daylight Time, UTC-07:00)  #    Comments [0]
# Thursday, 13 April 2006
I have a great deal of respect for Brad Cox.  In fact, consider this essay a tribute to Brad.  Ten years before he wrote Planning the Software Industrial Revolution, I was living in the integrated circuit world of R&D electronics, where it was already an industrialized industry.  I could order chip, card and rack level components from catalogs and plug them together through a set of standardized interfaces that solved a set of problems I had specified in an electronic schematic diagram.  During that time, I was also a participant in industrializing printed circuit board layouts, which initially took 6 to 8 weeks of hard manual labor on a light board and moved to, ironically, using a computer program to read an electronic schematic diagram (i.e. specification) and produce the printed circuit layout in a matter of minutes.  In other words, I experienced first hand what industrialization was and meant in the electronics industry.
 
When I first got into the software industry in 1990, I thought it was going to be similar to the industrialized electronics industry I was used to.  I could order software components (like integrated circuits) from catalogs and, using a schematic diagram CAD-like software program of some sort, assemble those components to solve a set of problems.  Nothing could be further from the truth.  Then I read Brad Cox’s article and realized what I took for granted in the electronics engineering world may not even happen in my lifetime in the software world.  It was quite a shock to me.  I had no idea the software world was so far behind, and in fact, completely unindustrialized.
 
Flash forward 15 years and I have performed just about every role one could imagine in the software industry, including running my own software company.  I feel like I have gone nowhere in this industry and, in fact, in some respects gone backwards.  I still write the same source level code I did 15 years ago and build most everything from first principles.  Sure, technologies have changed, but we (meaning everyone in this industry) are still building our software world one source code line at a time.  As Brad says, grains of sand where bricks are needed.  What’s wrong with our software industry?  Why don’t we learn from the successes of other industrialized industries?  Are we so arrogant that we have to do everything from first principles over and over again?
 
In Brad’s article he discusses the importance of Blanchard’s pattern lathe and that the real crucial discovery made was that implementation tools were insufficient unless supplemented by specification tools capable of determining whether parts complied with specification within tolerance.  As Brad points out, this discovery had not yet occurred in software.  In my opinion, 15 years later, it still remains undiscovered.  Making software tangible and observable, rather than intangible and speculative, is the first step to making software engineering and computer science a reality.  Brad’s sentence still applies today.
 
Ironically, computers and computer programs have helped other industries become fully industrialized.  In 1982 AutoCAD came to the world and revolutionized the industrial design and manufacturing industries, including my printed circuit board story above.  The craft of drafting has not gone away, but the tools used have raised the level of abstraction to the point where those specifications are now items that I can order out of a catalog.  For example, if I want to design a house, I don’t have to invent the drafting symbols that represent my house; they already exist and, in fact, can be ordered from a catalog.  Further, I can order the entire house specification (funny enough, called an architectural blueprint :-) from a catalog and customize it (using standard symbols) to what I want.  That specification is then used to implement (i.e. build) the house.  Some people would say that in the software world we have that through UML.  When I showed a business person a UML use case diagram, he said, “Stickmen?  That’s all you have is stickmen?”  And then he proceeded to laugh uproariously and just walked away.  At the time I felt angry and thought, another bozo that does not get it.  But as the years went by, I realized he was totally right.  How embarrassing for our software world that the best we can do on the specification side is stickmen.
 
So we need better specification tools, that’s for sure – what else do we need?  As Brad writes, the programmer shortage can be solved as the telephone-operator shortage was solved – by making every computer user a programmer.  I wholeheartedly agree.  This would dramatically increase the chances of industrializing our software world.  At the very least it would help us with the most frustrating part of our world, and that is human machine interface design.  Generally speaking, software engineers simply don’t get it, for whatever reason.  Let’s look at several examples of what I mean, and tell me I am wrong.
 
Take your average DVD player.  Did you ever happen to notice that the hardware (DVD player) does not control the software (DVD disc)?  One of my commenters (thanks Michael K) on my blog said, “I want a DVD player that works like a VCR.  When I press fast forward on a VCR, the recording goes forward.  With a DVD, it is up to the disk whether I can fast forward it or not – total lunacy!  It’s my player, it should do what I tell it, now!”  And just how many DVD players are there in the world?
 
How about automated phone attendants?  A sure way to increase the blood pressure of even the calmest West Coaster, particularly the new and improved attendants that respond to your voice commands instead of key commands.  Insert your favorite voice command here :-)  The fact that companies purport to be customer driven while using automated attendants is a joke – or is it that they don’t get it? (ha!)  When I call a company, I want to hear a human being answer the phone, one that knows the inner workings of the company, to provide me with a layer of abstraction between the company’s goofy organizational structure/behavior and myself.  S/he is the public interface to the company that hides the inner workings from me.  The thing about it is that we used to have it, so what happened?
 
What about ATMs?  Well, it is a matter of convenience.  The fact that I can quickly get money, usually without waiting in line, is what I want from a user’s perspective.  So that part of the abstraction is good, but the human machine interface could do with another tweak.  That is, if I have only one account, then don’t give me account choices.  I get asked for checking, savings, and other, every time I use the machine, no matter where I go in the world, and I have only one account.  How hard can it be?
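It can’t be very hard.  A hypothetical sketch of the fix – only present a choice when one actually exists:

```csharp
using System;
using System.Collections.Generic;

public class Account
{
    public string Name { get; set; }
}

public static class Atm
{
    // Adapt the interface to the user: one account means no menu.
    public static Account SelectAccount(IList<Account> accounts)
    {
        if (accounts.Count == 1)
            return accounts[0];   // skip the checking/savings/other question entirely

        Console.WriteLine("Select an account:");
        for (int i = 0; i < accounts.Count; i++)
            Console.WriteLine("{0}. {1}", i + 1, accounts[i].Name);

        int choice = int.Parse(Console.ReadLine());
        return accounts[choice - 1];
    }
}
```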
 
Even the firmware in my wife’s German precision engineered washing machine is koo koo.  It has lots of intelligence built-in, but the machine does not automatically shut off by itself.  It will beep, beep, beep at you until you come and manually turn it off.  Going through the 27 page manual, I found a way to at least turn off the beeping.  However, months later it came back and I had forgotten the magic key combination to turn it off, and I refuse to read the frickin’ manual again.  How can this be?  How many millions of washing machines have rolled off the production line in the last 25 years?  And we still can’t get down to the one button interface called “wash clothes, and don’t bother me again.”
 
Or how about Windows Vista – some 50 million lines of code, 5 years later, and Hasta La Vista Baby!  As you will see, they have also opted for aesthetics over functionality.  What does the customer really want?  An extremely fast, low memory, transparent to the user and completely reliable operating system.  That’s it!  Again, how hard can it be?  Yet wait till you see it, and by the way, you will need to buy a new computer to run it.  That from the largest and most successful software development company in the world today.
 
Finally, while not really software based, here is an excellent example of poor human to machine interface design.  The building that I work in is having its washrooms renovated and, you guessed it, upgraded to automated faucets, soap dispensers, urinals and toilets.  Yes, the toilet is fully automatic, with no manual flush.  Of course, the renovations are occurring floor by floor, meaning the people on the floor being renovated need to go up or down a floor, which means those washrooms are rather busy.  So what if the automatic toilet does not flush?  And there are people waiting?  I can tell you that one of the toilets (3rd floor, right stall) is batting a .500 average.
 
The first lesson for anyone becoming a software programmer is to become a designer first.  There is an excellent book called “The Design of Everyday Things,” by Donald Norman, written around the same time as Brad’s essay.  This book should be required reading for every designer – software, hardware, industrial, art or otherwise.  One story discusses the design of doors and the visual cues a door presents to the user: does this door push forward or draw backward, and do I need to turn a knob or push on a bar, or what exactly makes it work?  Yes, something as simple as a door we still can’t get right.  Aesthetics win over functionality almost every time, because that’s what the consumer perceives s/he wants.  Me?  I just want the door to open like in old school Star Trek, complete with the cool sound.  Oh yeah, you will recognize the book as it has a picture of a red tea pot on the cover with the spout and the handle on the same side.  Nice design!
 
So what does this have to do with industrializing software?  Everything, actually.  Until consumers demand better software, cheaper and faster, we will continue to handcraft solutions one source code line at a time that can’t be measured for conformity to any user’s specification.  Eventually consumers will revolt or go elsewhere, where someone else has figured out a better way to do it – just like when consumers got fed up with American cars, they went to Japan.
 
I am somewhat encouraged by the fact that Domain Specific Languages (DSLs) are making a resurgence, as I believe this is the right path towards industrializing software development and meets Brad Cox’s vision.  DSLs raise the level of abstraction to a point where a visual model specification is used to specify the implementation and can verify its correctness.  AutoCAD can be classified as a DSL, as it allows a non-draftsperson or non-engineer to still produce a drawing that is accurate enough for someone to implement.  For example, from a catalog I can order a drawing of a piston, put it in a CAD program, modify a dimension, save the electronic file output, take it to a milling shop and get my pistons manufactured – and yet I know nothing about the industry.  Now that’s industrialization.
 
A Challenge to My Software Developer Peers
 
I challenge my software developer peers to help industrialize our software development industry.  Brad Cox has made a major contribution, not only with the article he has written, which I reference here, but also with the Objective-C System Building Environment that he invented.  My very small contribution to software industrialization is as co-inventor of a DSL called Bridgewerx that allows a Business Analyst (not a programmer) to specify (AutoCAD like) an application integration that is then automatically implemented, without any programming knowledge required.  And before you crucify me for plugging a product I no longer have any financial interest in, I did it because I am personally sick and tired of writing the same source level code over and over again for the last 15 years.  I want higher level abstractions in better tools.  What are you doing to raise the level of abstraction in our software industry?
 
I leave you with the last paragraph that Brad Cox wrote 15 years ago in his article.  It is just as applicable today as it was then and probably still applicable when I am no longer around.
 
I only wish that I were as confident that the changes will come quickly or that we, the current software development community, will be the ones who make it happen.  Or will we stay busy at our terminals, filing away at software like gunsmiths at iron bars, and leave it to our consumers to find a solution that leaves us sitting there?
Thursday, 13 April 2006 17:34:34 (Pacific Daylight Time, UTC-07:00)  #    Comments [0]
# Saturday, 04 March 2006
This blog’s topic is software industrialization, which means making software development a predictable and repeatable process.  The only other requirement for software industrialization is that the software created meets the end user’s requirements.
 
Why is software development not a predictable and repeatable process?  I can partially explain this through a small story: after spending 10 years in the software development business, a friend of mine and I opened up our own consulting company in 2001.  Our company was based on one single Microsoft product for which we would offer consulting services.  That product was BizTalk Server, a message oriented middleware product for exchanging disparate data and orchestrating business processes across multiple applications.
 
Over a four year time frame, we custom designed and constructed twenty-five or so integration solutions using every version of BizTalk Server. Even after that many projects, our process for designing and constructing these solutions was still far from being predictable and repeatable.  Sure we got (much) better, but we realized it was the variability that was so difficult to overcome.  I mean variability in everything that is software and the processes used to design and construct it.
 
For example, there is always large variability in the quantity and quality of software requirements.  A very small percentage of customers know exactly what they want; more know exactly what they want but can’t articulate it; and at the other extreme, customers have no idea what they want but still want something built.
 
For every single “discrete” chunkable requirement, there seem to be at least a dozen ways to design it.  For every design, there seem to be almost infinite ways to implement it.  When I say design, I mean a particular design for a particular requirement in which the chunkable output of the design is on the order of 40 hours of effort for one developer to complete the requirement, finish a detailed design, code and test the chunk, and be done.  The culmination of designs meeting all of the requirements is called the software architecture.
 
Case studies and industry reports point to inadequate and/or always changing requirements as one major contributing factor as to why software development is not a predictable and repeatable process.  Another contributing factor is the size and complexity of software development, which is almost always underestimated.  I would be the first one to agree with both statements, but I would say that these are more symptomatic than the root cause.
 
Yet another contributing factor to why software development is not a repeatable and predictable process is programmer productivity.  I have worked with over a hundred software developers in my 15 years in the industry and I can say that programmer variability is just as broad as the other contributing factors discussed above.  There are several books that quantitatively put programmer productivity variability in the range of 20 to 1 and even 100 to 1 between programmers that have been assigned the same project to design, construct and test.  I have seen the extreme with my own eyes, where some developers can’t write the code no matter how much time is given, while others can write it in two weeks flat.  That’s off the chart in terms of variability.
 
One of the reasons for the wide variability in programmers, aside from the skill sets discussed in the previous paragraph, is the tools that are available for programmer use.  The tools themselves are incredibly complex environments and sometimes require people to think in ways that they may not be able to grasp, or are so complicated no one can figure them out.  I can’t grasp C, but I grok Smalltalk from a programming language point of view.  When we asked a printer to print the help file that came with BizTalk Server 2004, he called us to say it was likely going to be 10,000 pages and cost $500.  That’s just one product!  And we use a half dozen other products for designing and constructing our integration solutions including SQL Server, ASP.NET, Visual Studio IDE, Windows 2003 Server, SharePoint Services, C#, FrontPage, the .NET Framework, and on it goes.  While some of these products are not the same size and complexity as BizTalk Server, they require deep understanding of just what the heck they do and how all the products fit together in order to provide the tools and framework to design and produce the customer’s solution in any reasonable time frame (read: cost).  Even the .NET Framework Class Library alone has over 5,000 classes to get to know, some very intimately.
 
And these are tools and technologies from one vendor!  What about multiple vendors?  Every vendor seems to be pumping out the latest and greatest tools and technologies every year.  Where does one find the time?  Answer: one does not find the time, which means people's knowledge of these tools and technologies, plus the specialized skills and experience required to use them effectively, varies wildly.  This is another major contributing factor as to why software development is not a predictable and repeatable process: the programmer never gets a chance to gain years of experience using one tool, or even a small set of tools, so everything is (always) new.
 
Even within the Microsoft technologies mentioned above, there are many technologies that do more or less the same job (but with totally different tools) in one specific area: user interfaces.  There are (at least) five Microsoft technologies for developing user interfaces.  To me, it is mind-boggling that even within a single vendor, not only are there five different technologies for developing user interfaces (actually seven if you count InfoPath and SharePoint Designer), but there are multiple tools for each technology.  For example, for ASP.NET there is both Visual Studio and FrontPage.  Both have very deep features, but the tools are completely different.
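
To illustrate, here is a minimal sketch (mine, not from the original post; the class names are invented) of the same one-label "Hello" screen in two of those technologies.  Same result, completely different APIs and programming models:

```csharp
// 1. Windows Forms: an imperative desktop UI.
public class HelloForm : System.Windows.Forms.Form
{
    public HelloForm()
    {
        System.Windows.Forms.Label helloLabel = new System.Windows.Forms.Label();
        helloLabel.Text = "Hello";
        Controls.Add(helloLabel);
    }
}

// 2. ASP.NET Web Forms: code-behind for a page whose .aspx markup
//    declares <asp:Label ID="HelloLabel" runat="server" />.
public partial class HelloPage : System.Web.UI.Page
{
    protected void Page_Load(object sender, System.EventArgs e)
    {
        HelloLabel.Text = "Hello";
    }
}
```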
 
Some would say introducing standards would alleviate this problem.  While I concur, and standards have helped industrialize other industries (e.g. electronics), it is still the early game in the software world, and technology advances far outpace the speed at which standards can be ratified.  Also, believe it or not, Microsoft's latest technologies are all (mostly) standards compliant, and all with public specifications.  So what's the value of standards?  What our industry needs is innovation.  What would be truly innovative from Microsoft (and other vendors) is simply one technology and tool that produces any type of user interface you want.  From a developer's perspective, this means being able to focus on "one" tool or technology to do a specific task, like designing and constructing any type of user interface.  With one tool and language (for user interfaces), we might have a hope of industrializing software development.
 
Let me put it another way: how many people do you know who are fluent in six or more foreign languages?  How many of those people are fluent in both the spoken and written word?  Have you ever tried learning a foreign language to the point where you are just as fluent in it as in your native language?  Learning and becoming fluent in any foreign language is no easy task.  But as software programmers, we must learn multiple "foreign" languages to design and construct software.  It may even be tougher than learning a traditional foreign language, as our programming languages regularly change, including the introduction of brand new ones (e.g. XAML).  This gives some insight into one of the major reasons why software development is not a predictable and repeatable process, even for the software programmers.
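
As a small illustration of the "foreign language" problem, here is the same simple button expressed in two of the languages a .NET developer is now expected to be fluent in (a sketch of mine; SaveButton and MakeSaveButton are just illustrative names):

```csharp
// In XAML (a declarative markup language) a WPF button is written as:
//
//   <Button Name="SaveButton" Content="Save" Width="80" />
//
// In C#, the equivalent button is constructed imperatively:
public static class ButtonExample
{
    public static System.Windows.Controls.Button MakeSaveButton()
    {
        System.Windows.Controls.Button saveButton = new System.Windows.Controls.Button();
        saveButton.Name = "SaveButton";
        saveButton.Content = "Save";
        saveButton.Width = 80;
        return saveButton;
    }
}
```

Two different grammars, two different mental models, one button.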
# Saturday, 21 January 2006
Let's take a break from what's new in software development for 2006.  In Part 1 and Part 2 of why software industrialization, I discussed some basic concepts around what software industrialization means and why we need it.  Software industrialization is designing and constructing software in a predictable and repeatable manner.  It is also producing software, of any sort, that actually meets the user community's requirements, whatever and whomever they happen to be.  That's it.
 
In our software industry, there is still a tremendous gap between the requirements (i.e. describing the intent of the software) and the actual software deliverable (i.e. executables).  In fact, if you look at and believe the statistics, our numbers ain't so good!  The reality is that we, as software developers or engineers, don't have any predictable and repeatable process for creating software, and we don't get the software right the first time.  In fact, we may never get it all right.
 
I have been involved in the software development industry for 15 years and have worked for several companies in both Canada and the United States, ranging from my own start-up (5by5Software, now Bridgewerx, which I co-founded with Barry Varga) to the very large (Kodak, Motorola and lots in between).  Each company had its own process for designing and constructing software.  These development processes ranged from fly-by-the-seat-of-the-pants to highly process-controlled CMMI, complete with formal assessments.  In these companies I participated in several different development methodologies, including RUP, extreme programming, Agile, old-school waterfall (for those who remember) and a mix of pretty much everything in between.
 
I have read several dozen books on the subject of software engineering, read hundreds of articles and attended many a conference and seminar in the subject area.
 
Based on 15 years' experience, I draw a few observations and maybe a conclusion or two:
 
1. It's still the "Wild West" when it comes to software development.
 
2. None of the software development processes or methodologies I have participated in has proven to be any better than the others.  The harsh reality is that manual-labor software development is still mostly a trial-and-error process, meaning we are still far away from making it predictable and repeatable.
 
3. I have had the good fortune to work with several dozen extremely talented and brilliant software developers/engineers, project managers, BAs, etc., who are directly responsible for the success of the projects/products I have participated in.  It is these people who made the difference between success and failure; it had nothing to do with processes or methodologies or toolsets or programming languages or technologies.  It's all about the people!
 
4. There have been a few critical innovations of late that appear very promising for increasing the level of predictability and repeatability in designing and constructing software.  These innovations are described in a book called "Software Factories."
 
5. Another critical innovation is the renewed interest in Domain Specific Languages (DSLs).  Every programmer should take the time to read Martin Fowler's most excellent paper, "Language Workbenches: The Killer-App for Domain Specific Languages?"  DSLs help raise the level of abstraction so that developers can produce software in a more predictable and repeatable manner; that's the real point behind DSLs (see the first sketch after this list).
 
6. The only thing the business user ever interacts with, 100% of the time, is the user interface: graphical, text or otherwise.  All the business user cares about is that, through the user interface, they can get at, manipulate and save data in an intuitive and functional manner.  Functional means the software does what the business user expects or wants it to do.  In our technical world, this may be written as workflow or business rules or custom C# libraries, or whatever development artifact(s) represent this functionality.  However, we software developers forget that the business user does not care how it is done, or with what: "when I click on this thing, here is what I expect to happen; your software needs to do that."  It's that simple from the business user's perspective (see the second sketch after this list).
 
Point 6 will be the topic of a future post, because after 15 years in the biz, I finally get the business user.  And interestingly enough, from a programmer's perspective, not getting the business user was the number one reason why software projects/products failed in 1994, and it is still why they fail today.
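
First, as promised in point 5, a minimal, self-contained sketch of an internal DSL: a tiny fluent interface in C#.  This is my own illustration, not an example from Fowler's paper, and all the names (Order, OrderRule, OrderRules) are hypothetical:

```csharp
using System;

public class Order
{
    public decimal Total;
}

public delegate bool OrderCondition(Order order);

public class OrderRule
{
    private readonly OrderCondition _condition;
    private decimal _discount;

    public OrderRule(OrderCondition condition)
    {
        _condition = condition;
    }

    // Each method returns the rule itself, so calls chain like a sentence.
    public OrderRule ApplyDiscount(decimal rate)
    {
        _discount = rate;
        return this;
    }

    public decimal PriceFor(Order order)
    {
        return _condition(order) ? order.Total * (1 - _discount) : order.Total;
    }
}

public static class OrderRules
{
    public static OrderRule When(OrderCondition condition)
    {
        return new OrderRule(condition);
    }
}

public static class Demo
{
    public static void Main()
    {
        // The calling code reads almost like the business rule it encodes:
        // orders over $1,000 get a 5% discount.
        OrderRule rule = OrderRules.When(delegate(Order o) { return o.Total > 1000m; })
                                   .ApplyDiscount(0.05m);

        Order order = new Order();
        order.Total = 1500m;
        Console.WriteLine(rule.PriceFor(order));  // prints 1425.00
    }
}
```

And second, a sketch of point 6 from the code side.  IOrderService, SaveOrder and OrderForm are hypothetical names of mine; the point is that everything the business user experiences happens at the user interface, and whether the save is implemented as workflow, business rules or a plain C# library is invisible to them:

```csharp
public class CustomerOrder { }

public interface IOrderService
{
    // Could be backed by workflow, a rules engine or a custom C# library;
    // the business user neither knows nor cares.
    void SaveOrder(CustomerOrder order);
}

public class OrderForm : System.Windows.Forms.Form
{
    private readonly IOrderService _service;
    private readonly CustomerOrder _currentOrder = new CustomerOrder();

    public OrderForm(IOrderService service)
    {
        _service = service;

        System.Windows.Forms.Button saveButton = new System.Windows.Forms.Button();
        saveButton.Text = "Save";
        saveButton.Click += SaveButton_Click;
        Controls.Add(saveButton);
    }

    private void SaveButton_Click(object sender, System.EventArgs e)
    {
        // "When I click on this thing, here is what I expect to happen."
        _service.SaveOrder(_currentOrder);
    }
}
```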
# Monday, 12 September 2005
In Part 1 of why software industrialization, I explained how even the simplest of end-user software products (i.e. washing machine firmware) still can't be designed right.
 
I have been a software professional for 15 years, and I am still amazed at how little we have improved software engineering in our industry.  The software development tools we use still feel like hammers and chisels to me.  The software development process, Agile or otherwise, seems overly complicated, convoluted and hardly predictable or repeatable, in addition to involving way too much effort and depending on people whose skills vary wildly in knowledge and expertise.
 
I gave an example of how AutoCAD revolutionized/industrialized the engineering design world way back in 1982.  We have nothing like it in our software development world.  In some respects, I think we have actually gone backwards: instead of raising the level of abstraction to improve the process of software development, we are writing even more code as products and layers of interfaces are added to our toolsets daily, while vendor framework class libraries grow ever larger (literally thousands of classes), without doing anything to really raise the level of abstraction.
 
Here is another analogy to explain what I mean by the industrialization of software.  Too many years ago, I worked for an electronics company where I designed printed circuit boards by hand.  This meant hanging a circuit diagram (a real blueprint) on a wall and then, on a light table with a Mylar grid four times the size of the physical circuit board, laying down red and blue tape (for a double-sided board) of different thicknesses to represent the electrical connections between components, along with different sticky symbols that represented resistors, capacitors, integrated circuit patterns, etc.  Once the layout was complete, we had to shoot negatives (actually positives) of the layout (reduced by 4X), expose a blank copper-plated printed circuit board in a UV oven with the positive on top, and then etch the unexposed copper away in a caustic chemical soup, which, all told, took two weeks to complete.
 
I would work 8 hours a day for 6 to 8 weeks staring at a light table, and as I made each connection (out of several hundred), I would highlight it on the circuit diagram (i.e. the blueprint) to indicate a completed connection.  There were several design constraints: the layout size of the circuit board, making sure power traces were not beside low-level signal traces, etc.  Kind of like a chess game, except it took 6 weeks or so to see if you won or not.  If you painted yourself into a corner, it invariably meant starting from scratch.  Management did not appreciate starting over, so we got real good at playing chess :-)
 
Then industrialization hit the electronics/printed circuit board industry with the evolution of computer numerical control (CNC): the circuit diagram was digitized with a light pen and fed into an autorouter program that automatically designed and laid out the printed circuit board.
 
Today, of course, this manual, labor-intensive process is fully automated.  The circuit diagram is now fully digital (remember AutoCAD), and the autorouter can simply read the circuit diagram and produce the circuit board directly, with no human intervention or other intermediate steps.  In fact, the electronic circuit diagram is fed directly into the system and out come the finished printed circuit boards (forget double-sided, how about a dozen layers!), all in a matter of minutes to a few hours.  What used to take me 6 to 8 weeks or longer by hand is now reduced to minutes or hours, with no errors.  How is that for industrialization?
 
What can we point to in our software industry, over the same time frame as the analogy above, that has reduced two months of manual, labor-intensive hand-coding effort to a matter of minutes or hours?  And I don't mean marketing words, the latest TLA buzzword (SOA) or the latest programming language; I mean some tool or generator that actually reduces the manual effort and coding required to produce a finished software product.  Can anyone point to anything that has substantially advanced the industrialization of software over the last 15 years?
 
In the software industry, we are still in the dark ages as far as I am concerned, and my little blog here is a small attempt to help push the envelope of software industrialization from an educational perspective.  Hope you enjoy it.
# Tuesday, 02 August 2005
Today, we are on a journey of the industrialization of software, much the same as when Autodesk invented AutoCAD in 1982 and introduced it to the engineering design and manufacturing production world.  AutoCAD offered, for the first time, a full-fidelity drawing tool that enabled a predictable and repeatable way to produce engineering diagrams that could be saved electronically in a (publicly accessible) universal file format and used by all sorts of other CAD/CAM devices.  A standard was born.  In my opinion, AutoCAD's virtual drawing world has done more for the engineering design and manufacturing world than any other invention in the last 20 years.  Without it, we would still be forging hammers by hand at a rate of one a month, at a cost of several thousand dollars apiece.
 
The same revolution is now taking place, over 20 years later, in the software design and code production world.  The only people who really know about it are software programmers from various tool vendors around the world.  Programmers are realizing that the days of hand-crafting ever more complex software solutions are becoming too costly and taking too long in our internet-time business world.  I predict that in the next 5 years, the business world will be using software that is heading up the knee of the exponential curve, offering business services through software automation that will further increase the speed at which business is performed, on a scale that is unfathomable compared to today.  Just as unfathomable as the railroad once was.  Just as unfathomable as electricity once was.
 
The software industry needs tools that can visualize the size and complexity of the software structures to be constructed, much like the (traditional) building industry uses architectural blueprints to visualize the design of building structures, blueprints that are nevertheless detailed enough to cover all aspects of the building to be constructed.  Not only would this give some continuity to the world of software development, it would help non-technical people understand the (equivalent) difference between constructing a 3-bedroom bungalow and the Empire State Building.  Even a layperson with no expertise can look at the blueprints and generally comprehend that one is much bigger and more complex than the other.
 
Right now in the software industry, we have no universal way of showing size and complexity differences, even amongst the programming community.  The current state of the art is still so low-level and specialized that most programmers can look at these crude architectural drawings and not infer any meaning as to what is to be built, let alone its size and complexity.  Sure, we have software modeling languages like the Unified Modeling Language (UML) and the newly introduced Domain Specific Languages (DSLs), but these tools and languages are still very low-level compared to a building's architectural blueprint.
 
I am not trivializing these language inventions; in fact, they have done a world of good for the software world.  However, we need to raise the level of abstraction, using better modeling tools and languages, so we can get ourselves out of the dark ages and into the industrialization age.
 
Raising the level of abstraction is the topic of my next post.
# Thursday, 30 June 2005
Someone asked me the other day why the need for software industrialization, and for an example of what's wrong.  Here is one trivial example.  I have a new washing machine that is run by software.  Fully programmable!  The user interface is interesting, but that is another topic of discussion.  Here is the problem: once it is all finished, it beeps at me, three beeps every minute.  It seems that I need to actually hit a button to tell the machine, "you are all done."  Ridiculous!
 
So I decided to see if, left long enough, it would just shut off by itself.  I let it go, listening to its 3 beeps every minute for an hour.  That's 180 beeps.  OK, maybe I ain't so bright, but I just about took my sledgehammer to the thing to shut it up.
 
I looked in the owner's manual to see if this is normal behavior.  First of all, the owner's manual is 27 pages!  It's a washer; how complicated can it be?  Next, under a section called "Acoustic Signal," was the magic key combination to turn the sound off.  Of course, I would never have guessed this without reading the manual.
 
And that's the point of why software industrialization: it's a frickin' washing machine!  How many millions of washing machines have been designed and produced over the years, and I still need to read a manual for one?  Why can't there be just one button that says "Start" or "Go"?  That's a feature I want in my next washing machine, but it seems it's going to take a software revolution to get this one feature.
 
Oh, by the way, my stove timer does the same thing: it does not automatically turn off.  I actually have to go over and click cancel (with my finger) or it beeps three times every minute.  Now where did I put my sledgehammer...