Monday, 09 February 2009
Man, it seems that every few years a new software development methodology comes out and, while not purported to be the 2nd coming, it certainly has all of the fanfare of the latest and greatest.
Let’s see: there have been CMMI, TDD, Agile, Scrum, Lean, Waterfall, Continuous Integration, Spiral, Extreme Programming, RAD, MDD, YAGNI, Cowboy Programming, and the list goes on. I would like to add an oldie but a goodie to the list: Brute Force Development, or BFD.
I would suggest that BFD is the most widely practiced software development methodology in the world. In fact, I would claim that the majority of organizations and people use this methodology daily and have been since the inception of software development.
How do I know this? In the real world of software development, where the size and complexity of even the smallest projects (e.g. >5000 lines of code) exceed the allocated budget and timeline, almost everyone resorts to brute force development in the end. Why? Because we have to. How else can we do it? When was the last time, as a professional programmer, that you actually finished your project/product on time and on budget? Did you do it working 40 hours per week? Honestly?
Typically we start out with the best intentions, but as the schedule starts to slip and the budget disappears at the rate a 426 Hemi goes through a tank of gas, we drop into BFD mode. We try to do the impossible. Extra hours are burnt, features are slashed, quality goes out the window, and we brute-force our way to meet the impossible schedule.
Now, I am not complaining. This is just an observation having worked in many different shops, large and small, including my own start-up. We end up with BFD in the end.
Heck, even the guru himself (for whom I have nothing but respect), in his own post called "Building a Fort – Lessons in Software Estimation," made some pretty interesting slip-ups. My favorite was: “I dropped a little piece of my laser level down the side of one of the footing holes, between the concrete form and the dirt, after I'd poured the concrete.” Oh Steve, think of all the things we have dropped in the software!
I will go out on a limb and say I have yet to see any evidence that we, as software architects, developers, estimators, etc., are actually getting any better at this. I have been doing this for 18 years professionally, and maybe I am dreaming, but it seemed simpler years ago – not just the requirements, but the technology too. The tools, programming languages and applications that software vendors produce have grown (seemingly) exponentially over that time frame, and any solution I am involved in now has far more moving parts. A lot of these moving parts are new, and unforeseen issues crop up well into the development cycle – one vendor's library interfaces don't match what the documentation says, for example. And then we brute-force it – to make it work.
Part of this is tongue in cheek, as there is another meaning to BFD that can be applied to programming methodologies. You can get a hint by looking at the blog category this was filed under. I sincerely don’t mean any disrespect to the authors and believers of these software development methodologies, but sometimes the “marketing messages” can be a little much and even downright embarrassing. For example, try explaining to your significant other what Agile and Scrum mean. What do you think the “business folks” are thinking when you explain it to them? Do they even care? I would hazard a guess that all they care about is how much it is going to cost and when they can start using the software. Btw, they are also thinking: it had better do what I want it to do for this amount of money... or else...
So the moment that things don’t go as planned, BFD kicks in. Whether you know it or not.
In 1994, before most of these methodologies and marketing names came into effect, I had the good fortune of taking a 2-year post-grad course in Software Engineering Management at the University of Calgary in Alberta, Canada. It was taught by Motorola University, and one of the instructors, with 30 years’ experience, had some awesome stories on how “yer doin it wrong.” The funny thing was, while we learned a great deal about software engineering (that’s the last time I write 17 exams in 2 years!), what we learned most was common sense and communication. In other words, how to tell your customer (ahem, the one paying your rate, salary, contract, or whatever) that we can’t write 100,000 lines of code in 2 weeks. The real methodology here, folks, is just called common sense.
I don’t think much has changed since then, as we are always fighting that battle. Developing software for any decent-sized project (>5,000 lines of code) is really, really hard, maximally labor-intensive and fraught with… well, you name it.
I can hear the Agile folks saying that their methodology is the one that mitigates this risk. While that may be partially true, how do you answer the top two questions asked by the customer: how long will it take, and how much will it cost? And our requirements list is just that – a one-pager with bulleted, high-level feature items, some of which have two words explaining the requirement. Oh yeah, and at a fixed price. Ready to sign up? In the end, in order to make that deadline or not burn through your fixed cost, it's BFD, man. That’s the reality. And btw, could you not have come up with a better name? I mean, did you not know that Agile is Dead?
So what’s my point? Well, aside from having some fun with the BFD acronym, as with most things, there is some truth there for sure. We have all done it, yes? I am sure that anyone who has written code for any length of time has done BFD. Which makes my newly minted TLA marketing buzzword an instant leader in the world of software development methodologies!
All kidding aside, maybe it is time to step back and look at some of the basics for any software development project. The very first, I would think, is answering the two most basic questions of any software development project – how long and how much. Do you have a predictable and repeatable way of doing that? How accurate is it? If you don’t, then you are likely to be doing BFD even before you write a single line of code, and none of those fancy software development methodologies will help you one bit. Know what I mean?
Remember, keep the rubber side down!
Wednesday, 29 October 2008
I have been coding web applications in Visual Studio since Visual InterDev was introduced in 1997. Over that time, I have seen a wild array of error messages, but yesterday, while debugging an ecommerce web application using Commerce Server, I got this interesting message that I have never seen before:
Object is in a zombie state? I wonder what that really means?
Clicking Yes, produced this dialog box:
Huh? How can an object be a “zombie” and how can debugging be stopped, but not yet complete? I know it is Halloween, but... oooooo pretty scary, huh kids!
Wednesday, 30 July 2008
Dell’s Fashionable PCs – Yours Is Not Here
I have poked marketing fun at Microsoft’s Dinosaur ads and Oracle’s “hot pluggable” EAI platform, but Dell just beat them both with, “Get the mobile, fashion-forward student laptop.”
“This personalized laptop reflects your sense of style and keep you connected to fun, friends and assignments, no matter where the school day takes you.”
Wow, check it out man, FREE COLOUR!
I know that after being in this industry for 17 years I have a little bit of the cynic in me, of the Dilbert kind, but honestly, Dell is not only marketing to an ever-younger audience (reminds me of Camel cigarette ads for kids), but it is to the point of trying to make computers as hip as skateboards (psst, hey Dell, never going to happen!). Note the “Street” version above. I wonder, if I got one of those, whether I would develop a bad-boy street attitude. Oh, wait a minute, most of my co-workers already think that about me...
"More You. Inside and Out. Personalize your Dell STUDIO with ‘Designed for Dell’ accessories – the brands you trust, customized to match the colour, fit and style of your system.”
Never would I have believed such branding could be applied to a... computer. Now, for one moment, I will admit that I was always attracted to Alienware computers, being a closet gamer – plus they are cool, and the marketing and branding is slick. But I will always associate Dell with business computers – that’s their brand to me. Why would they jeopardise their business brand to go after the skateboard market? Share value? Pfffttt!
“A cool campus accessory that is ready to move.” Honestly Dell, just what is this marketing message supposed to convey? That a computer is a cool campus accessory for women? That it is the new purse? And what about that locker... Show me one student that has a pink fur-lined shelf for her books. Even my five-year-old daughter feels pink fur is on the outs. What is that picture of? Of her and her Mom when she was little, or her and her daughter, or...? This is so wrong. My wife says, “Who are they kidding? Computers are supposed to be tools to help people and now it has become a fashion statement – an image-conscious thing. F*&! – there is no stopping these marketing people.” OK, that was a quote from my lovely wife when I showed her this. She said a lot more, but none that I can repeat here. It is embarrassing to me, being in the computer industry, to be associated with this. Good thing I don’t have any Dell computers.
“Make Your Dorm Room The Centre Of Fun”
“Whether they’re an aspiring botanist or a fan of film noir, this PC will bring inspiration and entertainment to their dorm room for a fantastic price.”
Oh man, I can tell you that when I was taking computer courses in college, my dorm room was the center of fun and inspiration, but there were no computers in it.
“Handles Whatever Your World Throws At It.”
Dell, what happened to your brand? I picked up a Globe and Mail on Monday and you had this flyer in it. It has changed my view of Dell forever – you have lost all credibility with me. How can I ask my business customers to take your brand seriously when you are trying to be all hip and designer-like for a younger generation? Worse yet, the ads are seemingly designed by someone in marketing who has no clue about that demographic. That’s aside from being pretty money-grubbing, going after an ever-younger audience – pretty soon we will see Dell ads for grade-school kids in summer camp...
Someone else, from the fashion industry, wonders about the same thing, but in reverse: "Why Would Dell Hold a Fashion Show.” I can only hope that this new low in computer marketing is just a total oversight on Dell’s part, that they will say it was an experiment gone awry and turn back to what they do best – building practical home and business computers for the masses. But somehow I doubt it. With all of this advertising comes the sunk cost of designs and tooling to produce all of these free-colour laptops.
Thursday, 26 June 2008
In my best Sam Kinison voice, “ah ahhh ahhhhhhhhhh!!!!” I can’t take it anymore. I am re-installing Office 2003 and forgetting about Office 2007. Why? It’s the ribbon, man! For all of its usability design, I find it unusable. No offense to Jensen Harris or Microsoft, but for me, the consumer of the product, and after trying it for over a year, I just can't get used to it.
First, full disclosure, I am not a “usability designer” or a Microsoft “hater." In fact, I have been making a living as a software architect/programmer type on the Microsoft stack since 1991 and have been fairly happy with the platform (I love VS2008!) – except for the ribbon. But I digress.
The “ribbon.” Jensen says one of the reasons it was invented was that people could not find new features when they were added to the product. Then he goes on to say that there are over 250 menu items and over 30 toolbars in Word 2003, which resulted in this satirical view:
Now, fair enough, but I would suggest that if a “word processing” application has 250+ menu items and over 30 toolbars, then “Toto, we're not in Kansas anymore." Meaning, this is no longer a word processing application.
Honestly, Word should have been “refactored” into perhaps multiple products or features split into a desktop publishing application or a whole other suite of applications. But instead, the UX team went through an honorable and noble design process of solving the wrong problem. Kudos to you Jensen, but I just can’t do it anymore. Every time I look at the ribbon, my brain freezes - I have to think, which means bad usability design.
Why? It boils down to simple math. When I see the Word 2003 menu, I see this:
Ok, I see 9 “objects.” Notice no toolbars. That’s right, simple is better… right? Ok, when I get crazy and add a toolbar, I see:
Even then, it is 19 objects on the toolbar and another 9 objects for the menus. But what do I really use?
Yah, that’s right, 13 objects in total! That’s it. The bullets, numbering and indent/outdent are merely conveniences for me. One complaint already: these are 2 separate toolbars and there is no way for me to put them on one row; even though there is lots of horizontal space, I am forced to use up two vertical rows. That ain't usability.
Oh yeah, and no full menus on the pull-down – who designed that? Yes, I know what you thought, and I know the "fix," but honestly, it does not work. Give me the full menu every time so I do not have to click twice. In my mind, usability is all about minimizing the choices a user has to make and minimizing the number of mouse clicks to make those choices. If you have too many choices, maybe you are trying to solve the wrong problem?
Here is my default Word 2007 "Ribbon":
There are, count them, over 60 possible choices or selections to make. And that is the problem. Too many visible choices! My poor brain needs to parse and process each item to see if it matches what I want to do. Whereas before, I had a pretty good idea that in one of the 9 menus in Word 2003 I would be able to locate and narrow down the “decision tree” to find what I was looking for. In fact, I got really good at it in Word 2003 and did not have to "think" about it. And that's the point of good usability design – no think time. In Word 2007 I have 5 times as many visible choices per "ribbon" × 8 menus, which means exposing ~480 visible objects to the user – way too many! In my mind, this is a classic case of solving the wrong problem – i.e. if a “word processor” has 480 objects, commands, menu items, whatever the heck you want to call them, then it is no longer, by far, a word processing application. Something is really wrong here.
Oh, and some hidden UI gems. When I first fired up Word 2007, I was trying desperately to find the “options” menu item, which has always been Tools/Options – for like 10 years it's been there; if it ain't broke... After several minutes of hunting, I had to ask one of my co-workers: where the heck is the Options option? It is hidden at the bottom of the "magic" Microsoft Office Button. I say magic because a) who knew it was a button? and b) why the heck is it there? I might as well be playing a pinball game for all the pretty widgets!
Funny that there is a “Locations of Word 2003 commands in Word 2007” article... What does that say about the user experience? Ok, I will admit to being totally programmed by the “File” menu approach, but so is the rest of the world, and the vast majority of applications in the world (meaning everything but Office 2007) also operate that way, so what up? As mentioned before, I believe the wrong problem is being solved.
As a related aside, it took me forever to find on the IE7 toolbar where the “find on this page” menu item was. Have a look at the screenshot below. Where would you look?
My first instinct (decision) was to look under the “Page” menu/toolbar for "find on this page" menu item:
Nope, not there. Other related page menu items are there, but not my find on this page menu item. So then of course I looked under each menu, in random desperation, and still no go. WTH? I had to search on the internet to find the “find on this page” menu item and lo and behold it is hidden away here:
Again, I feel the wrong problem is being solved here. We have a menu called Page and if you wanted to find something on the “Page” you would look under the “Page” menu, yes? I know I live and breathe software for a living, but I just don’t get how this is usable. Again, I am not trying to pick on MS, but as someone that uses MS tools daily, there are items that come up that defy any sort of logic. And that can be said for any software products and services company.
What’s my point? While there is a lot of hype around usability and the user experience, it does no good to be solving the wrong problem. Rule #1 in software development, usability or not: make sure the right problem is being solved. And if the software industry moves towards adopting the "ribbon" as a standard user experience widget, I think I will take early retirement!
Friday, 31 March 2006
Some years ago I had the opportunity to participate in a software technology bake-off. Technology selections, or bake-offs, go something like this: Dear Vendor, you have been selected to show your wares/skills at our esteemed company. No, you don't get paid for this privilege, but you have one week to complete the tasks in this envelope, which you will perform on-site. And if you win, we are not sure what we are going to do anyway. So, dance vendor, dance!
OK, while I am being a bit tongue in cheek, it was exactly like this. I view bake-offs as the easy way out for CXOs that don't know what they are doing or don't want to do their homework themselves. In my experience, it has been the former. Unfortunately for the vendor, it is a very risky proposition to participate if the org does not understand what it is doing or getting involved in. At the time of this particular bake-off, it was all about Enterprise Application Integration (EAI); even though Service Oriented Architecture (SOA) principles applied, that buzzword had not yet been invented. In other words, this org was a very early adopter of SOA and did not know it. This meant that Geoffrey Moore's Technology Adoption Curve was in full play here.
While I am going to name the technologies that were used in the bake-off, I do want to make clear that, years later, I came to the conclusion that it did not really matter what technology was used. The reality is that any of the middleware products would have worked for this org. As an analogy, they were looking for a Cadillac solution and at the time thought BizTalk was an econobox, when all they really needed was a bicycle. It has been my experience that this is the case for most orgs from a technology selection perspective.
Back to the story, but first a little background. The org's CIO and his crew were hardcore Unix fans and therefore were horrified to find that a Microsoft product (BizTalk) had penetrated their domain. Our pro services shop was called in to perform a complicated workflow across a few applications related to one of their core business processes for a particular department, which got us in under the radar. When we presented our prototype (live from our servers) in front of 30-some people, everyone seemed to be quite pleased. At that moment, the CIO stood up and said, don't get too excited, as we have to go through a formal selection of which middleware technology our esteemed org will use. Just so you know, this was the beginning of the end for the CIO, who was let go sometime later.
Yours truly got grilled by the CIO and the Unix crew on more than one occasion as to why BizTalk. I was so naive. I dutifully and enthusiastically explained the features and benefits, blah, blah, blah. It reminds me so much of the commercial with a cardboard cutout of a salesman that said, So, how much software would you like to buy? As a technical person playing a sales role, I can't help feeling that way. However, I happened to be President of this pro services company and just bit the bullet – all for the company, right? Behind the scenes, the CIO flew in a top salesperson from Vitria (their Unix fav), who sold a bunch of licenses to the org even before the bake-off began. The other two players in the EAI selection were Tibco and WebMethods.
The funniest part of the bake-off, depending on who you ask, was when we opened the envelopes: we quickly realized that the third-party consulting company hired to develop the EAI tasks did not know what EAI meant. The majority of the tasks were what was traditionally known as a data pump – that is, pumping data from one database to another, sometimes indirectly through staging tables and/or file shares. The third-party company that wrote the spec did not know that the A in EAI stands for application, not database, integration.
Depending on who you speak to, maybe there was nothing funny at all. Take my business partner, for example. As it turned out, I happened to be on vacation in the northern regions of Canada at the time (hey, it's practically the North Pole) and got daily phone calls about how choked my partner was. It was complicated further by the fact that we were not getting as much local Microsoft support as we thought we were, and since our business model was built on independent contractors, we could not engage our regular crew because this was not a paying gig. Consequently, my partner had to pull double shifts for that week. The only other time I saw him this mad was when one of our business partners tried a hostile takeover behind our backs within the first three months of incorporation, but that is another story.
At the end of the selection process, with lots of internal lobbying going on, it was a tie! I have never heard of such a thing in my 15 years in the biz. Totally hilarious. And the way it was rationalized was that all Unix-based projects would use Vitria and all MS-based projects would use BizTalk (the company was already a 50/50 split between Unix and MS applications). Part of the inside joke here is that the whole point of a middleware product is to be programming-language and platform agnostic, able to integrate any disparate applications – which was lost on the CIO who called for the bake-off in the first place. Further, the independent third-party consulting company that put together the bake-off tasks also happened to perform Vitria pro services and actually offered up their services. Isn't that a conflict of interest?
As it turns out, Vitria stayed on the shelf and over a dozen BizTalk projects were designed and constructed for the org with our pro services shop getting most of the work. A personal thank you to Bobby Keen!
I learned later that the org spent close to $2 million on the bake-off over a year-long time frame. I could not believe it, but after someone on the inside explained to me all that had happened and how many people were involved, I could see how it happened. The org could have saved themselves this money or put it directly toward actually solving business problems. So why don't they get it?
There was no joy of software on this one. Ultimately, none of us got it, because the org's entire IT shop got dismantled as a large outsourcing org swooped down and took the business away from all of us. However, as mentioned earlier, Geoffrey Moore's Technology Adoption Curve was in full effect here, and I believe we all fell while Crossing the Chasm. Subject of my next post.
Tuesday, 28 March 2006
I have been employed in the software development biz for 15 years. I used to really like what I did. Now I am not so sure I like what I see in our industry. I see vendors plying their wares with cheesy marketing campaigns like Microsoft's Dinosaur ads or Oracle's "hot pluggable" Fusion middleware (what marketecture). I am not sure why others aren't disappointed by this, or maybe I am so naive that this is just the way it works in our industry.
When I first started, I was the eager programmer who stayed up all hours to ply my craft. It was fun and I have no regrets. I was fortunate enough to move from C, which I really did not get, to Smalltalk, which I did get. Since then, I have had the opportunity to use many programming languages and frameworks. 15 years later, I see them as all the same – still low-level tools, not much beyond assembly language in some respects. We still laboriously toil over cryptic lines of a foreign language to get it understood by the machine (i.e. compiled), and in the end some human being is going to interact with the code you wrote and hopefully it fulfills whatever function it was supposed to.
Amidst this is the whole world of quality, metrics, process improvement, Agile, CMMI, ISO, Scrum, Extreme Programming, etc. To me, it's all the same – most of it just common sense. It still boils down to functional decomposition, from Knuth's truths written many years ago: make the problem small enough for humans to understand, and have reasonable expectations when planning a solution for it. I used to work for a division of Kodak (CMM Level 2) and a division of Motorola, where we were a CMM Level 5 shop. My disillusionment with the whole process improvement world was born at these companies, which were not really interested in quality from a product perspective, but more from a marketing perspective, so they could get a sticker saying they have a quality process. Bureaucracy is what it really was.
Over 15 years ago, I came from the electronics engineering world, where most of what we did was fairly predictable and repeatable. I thought when I joined the software world that it was more or less the same. After 5 years in the industry I came to the conclusion that the software world is not even close to being predictable or repeatable. In 1994, I read the Standish Group's CHAOS Report, which confirmed to me that our software development track record is absolutely abysmal. The latest report says we have improved, but our record still shows that any given project is a crap shoot – it may succeed or it may not. As a closet perfectionist, this does not sit well with me.
After I got out of the quality game, I dove deeper into programming and buried myself in projects for a number of years. Some were successful and others failed miserably. None of them failed for any technical reason or lack of staff skills, even though staff variability in productivity can be 20 to 1 according to industry gurus. My experience meter suggests programmer variability is much higher than that. Programmer productivity aside, I was still seeing the truth in the CHAOS report: every project was a crap shoot. How to increase the odds of success?
I know, I will start my own company and I can control everything, muh, ha, ha, ha. So I did: I formed my own company called 5by5Software with another like-minded programming buddy, and for the first couple of years we were successful in turning a profit in our little pro services world. As I have written elsewhere, I am extremely fortunate to have worked with several brilliant programmers. It amazes me to work with people who can listen to a conversation, go away, and within a short period of time produce the solution – and with a little refactoring, solid as a rock. Yet with other people, specs written in infinite detail are met only with glassy eyes.
In our company, we did so many projects of a particular type that we saw a repeatable pattern and a business opportunity to become a products company, which we did, called Bridgewerx. After 4 years as President, I resigned because I was disappointed with the direction the company was taking from a business perspective. Of course, when you require investment, other people get involved and the original vision gets diluted along with your shares. I am proud of what we invented because it does raise the level of abstraction for a particular problem domain. That part feels good. However, now what?
Yeah, start another company – that's it. My objective is to build tools that help raise the level of abstraction in our industry, which, in my opinion, we are sorely missing. Where are the tools for a Business Analyst to model or draw business processes? And I ain't talking Visio. Domain Specific Languages (DSLs) look like a good candidate to start building tools from. You can read an excellent DSL article by industry guru Martin Fowler.
DSLs to me are what is exciting and perhaps the next generation of programming tools for our industry. I am not talking about a new programming language or a framework, I am talking about a fundamentally different way to design and construct software. Rather than build solutions, build tools that build the solutions in an automated way. This is what I mean by software industrialization.
It is also the way of other industrialized industries that have matured from one-off, hand-crafted solutions to more of a product-line engineering approach. "I don't think the craftsman will be replaced, but the tools he uses will be provided by companies who provide (or are) software factories." This quote comes from Steve Cook's blog, and I happen to agree with it. Just because the drafting table was replaced by a CAD machine doesn't mean one loses his or her craft, right?
Sunday, 26 March 2006
Many moons ago I worked with two brilliant programmers named Steve Langley and Barry Varga. Steve was our boss at the time, with 25 years of programming experience, and Barry became my partner in a software adventure called 5by5, now Bridgewerx. One interesting note is that Barry and I became Steve's boss when he worked for our company a few years later.
We worked for a SCADA company that built RTUs for monitoring electrical substations and sending telemetry to a SCADA application. An RTU is a firmware device with analog-to-digital interface cards to measure substation line voltage, current, load, transformer oil coolant temperature, etc., and it feeds that information to a SCADA system. A typical SCADA system monitored many substations that collectively form part of the grid. A substation may need dozens of RTUs, and literally hundreds of different firmware applications run to monitor an electrical grid. We had a catalog of over 300 firmware apps a customer could choose from. The company also built custom firmware apps for our customers.
Each firmware app had hundreds of configurable properties, with no default values, data masks, min and max values, etc. Multiply that by 20 RTUs per substation, each running 20 apps with at least 100 properties that had to be configured per app, and a bunch of poor guys had to hand-enter 40,000 property configuration values to make the substation automation work. A software program was developed to allow configuration of these properties and then upload the config file to the RTU's firmware. It would take six months to commission a substation based on this approach. The bitch and stitch from the customers and the company was that this was way too long and costly. Our job was to develop a 2nd-gen configurator application that would provide automation for most, if not all, of these properties and thereby reduce the commissioning time from 6 months to 6 weeks. That was our mission, Mr. Phelps.
This first-generation configurator was a classic example of 2-tier architecture and over-engineering. Built in Delphi, the UI was a table editor, meaning that grid controls exposed the database schema directly to end users. Tsk tsk. At the time, however, it was considered state of the art. End users either could think in tables (a whole generation of table thinkers and tinkerers was born during the 90s) or could not think that way and wanted a higher-level abstraction. The over-engineering was that the engineering department decided to make every possible firmware parameter, for every firmware app, configurable, never thinking about size or complexity or the future of substation automation – the very business they were in.
Enter our team, whose task it was to build a replacement for the old configurator and make the configuration process simple and repeatable. We were object and component guys then, so when we did the UML on the problem set, it turned out to be incredibly hierarchical in nature. An electrical grid contains 1 to many substations, a substation contains 1 to many RTUs, each RTU contains 1 to many firmware applications, each firmware application contains 1 to many configurable properties, and each configurable property contains 1 to many attributes. This is a gross oversimplification, but nonetheless it was just a set of hierarchies.
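The containment hierarchy above is easy to sketch in code. Here is a minimal modern sketch in Python (rather than the J++ of the day), with hypothetical class and property names, showing why the hand-entry burden multiplied so quickly:

```python
from dataclasses import dataclass, field

# Hypothetical names; a minimal sketch of the containment hierarchy we found:
# Grid -> Substation -> RTU -> FirmwareApp -> Property.
@dataclass
class Property:
    name: str
    attributes: dict = field(default_factory=dict)  # default, type, min, max, ...

@dataclass
class FirmwareApp:
    name: str
    properties: list = field(default_factory=list)

@dataclass
class RTU:
    model: str
    apps: list = field(default_factory=list)

@dataclass
class Substation:
    name: str
    rtus: list = field(default_factory=list)

@dataclass
class Grid:
    substations: list = field(default_factory=list)

    def property_count(self) -> int:
        # Total number of values a commissioning engineer must supply by hand.
        return sum(len(app.properties)
                   for sub in self.substations
                   for rtu in sub.rtus
                   for app in rtu.apps)
```

With 20 RTUs per substation, 20 apps per RTU, and 100 properties per app, `property_count` lands on the 40,000 values mentioned above.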
Our approach was to build a drawing tool that let the user drag and drop different RTU types from a toolbox onto a canvas and connect the RTUs into a network. Double-clicking an RTU displayed a physical picture of the firmware collection with the name of each application. Double-clicking an app brought up a configuration wizard with your typical express and advanced modes. Express mode gave you a few simple options over a known working configuration you could default to, so that the system could be up and running ASAP. Advanced mode exposed all of the properties, plus the default values that had already been set.
It was the VB6 days for the UI, and our only real option was to use J++ for the business objects layer, being an MS shop with a real object hierarchy that modeled the real world. Steve handled the code automation. He developed an Excel spreadsheet that contained all of the attributes for each property of every configurable item for every firmware app. So for each property we could give it a name, description, default value, data type, min and max values, whether it was required or not, plus a bunch of other attributes, but I think you get the idea. We used Excel to capture a human-readable configuration language (a DSL) for RTU firmware applications. Steve then wrote a VB6 program that read the Excel spreadsheet and code-generated J++ abstract and impl classes, and the next thing you know, we had all of our business objects done. Every time the spreadsheet changed (which initially was often, then settled out), we would regen the abstract classes, but not the impl classes unless we chose to do so, because the impl classes were where our specific implementation code lived. Of course, we now have partial classes in the .NET 2.0 framework, which makes DSL applications like this even easier.
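The abstract/impl split is the heart of that generator: the abstract class is safe to regenerate on every spreadsheet change, while the impl subclass is written once and kept for hand-edited code. Here is a small sketch of the same idea in Python rather than VB6/J++, with plain dicts standing in for the Excel rows and entirely hypothetical property names:

```python
# Rows of property metadata - hypothetical stand-ins for Steve's spreadsheet.
ROWS = [
    {"name": "LineVoltage", "type": "float", "default": 25000.0,
     "min": 0.0, "max": 500000.0, "required": True},
    {"name": "OilTempAlarm", "type": "float", "default": 90.0,
     "min": -40.0, "max": 200.0, "required": False},
]

def gen_abstract(class_name: str, rows) -> str:
    """Emit the abstract class source; regenerated every time the rows change."""
    lines = [f"class Abstract{class_name}:"]
    for r in rows:
        lines.append(f"    # {r['name']}: {r['type']}, "
                     f"range [{r['min']}, {r['max']}], required={r['required']}")
        lines.append(f"    {r['name']} = {r['default']!r}")
    return "\n".join(lines)

def gen_impl_once(class_name: str) -> str:
    """Emit the impl subclass; generated once, then left alone for hand edits."""
    return (f"class {class_name}(Abstract{class_name}):\n"
            f"    pass  # implementation-specific code goes here\n")

source = gen_abstract("FeederMonitor", ROWS) + "\n\n" + gen_impl_once("FeederMonitor")
```

Executing `source` yields a `FeederMonitor` class that inherits all of the defaulted properties, so regenerating the abstract half never clobbers hand-written code in the impl half - the same effect partial classes give you in .NET 2.0.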
I wrote the configuration wizards, and Barry did a masterful job on the drawing tool, using a third-party drawing component. So far, so good.
However, the Delphi lead programmer did not want to switch to MS. Battle ensues. I wish I could say we won, but we did not. Our product never made it into production, even though there was nothing wrong with the software and it met the business requirements of both the customers and the company by reducing the time and cost to commission a substation with their products. So what happened? Just soap opera stuff, because we did not have a strong executive team to tell the Delphi programmer that a business decision had already been made and we were phasing (y)our configurator out because it was too costly for us and our clients. Instead the Delphi programmer quit and then held the company hostage to hire him back on contract (at a much higher rate) to keep working on the Delphi configurator. Oh man, sideways.
The Joy of Software was how the three of us worked together building software that resulted in a cool product that also met the business requirements. Our test customer, at a real power substation in the US, was thrilled. The "why they don't get it" goes to (weak) executive management for simply not following through on a business decision that they had already made.
So why don't they get it?
Thursday, 23 March 2006
Wow, you can really feel the frustration in the comments at Mini's blog on the delay of Vista. There were over 300 comments the last I looked. And the comments are shocking: asking for the resignation of leaders, why the delay is good and bad, comparing software complexity between Adobe Photoshop and Vista, that the product was 5 years in the making and 5 million lines of code later she is going to blow, what's better in the OS than XP SP2, devs vs PMs vs testers, etc. There is a wildfire at the largest and most successful software company in the world today.
Quite frankly I am a bit shocked, not at the wildfire, but by the comments. To me, it is no wonder that Vista has been delayed. Why? It's obvious: software development is a non-predictable and non-repeatable process. That's it, that's the reality of our business. In my mind, that is the root cause of why software is not delivered on time and on budget. All the rest of it is noise.
Despite all of the process methodologies we have today, we still can't determine when a software product is going to ship. We in the industry continue to underestimate the size and complexity of virtually any size of software product. The business side of the industry continues to promise customers software products that are all singing and dancing.
I have a lot of respect for people like Watts S. Humphrey for writing books that apply the most disciplined approach to software engineering that I have ever come across, and unfortunately, I read too many textbooks, mostly about programming. Yawn. What's my point? Even with this much rigor, we still can't accurately predict size of product, release date, total cost, etc. What does that have to do with Mini's blog on the delay of Vista? Everything! Despite any leadership issues, variability in skill sets, stack ranking, sizing and estimating techniques, etc., Vista was delayed for the simple reason that the act of software development is inherently not predictable or repeatable.
There is empirical evidence ad nauseam supporting the fact that software development is a non-repeatable process. We have evidence of thousands of projects not only being late, but failing miserably. I would never have believed it myself when I first came to this industry 15 years ago, because my background was in electronics engineering, where, generally speaking, we did have a predictable and repeatable process. In fact, the design and construction of most physical objects is pretty much a predictable and repeatable process today. Unfortunately, software does not abide by the laws of physics.
Mini, in an earlier post, asked what value of Vista he could explain to his Mom. That's easy: navigation of the file system has been greatly improved. How well the user can navigate the file system is a core and key feature of any OS. As explained in an earlier post, I thought MS had done an excellent job here. The usability is indeed better; I have experienced it myself. Because of the shortcut approach in the address bar of Vista Explorer, I can navigate around the file system with fewer mouse clicks. What more can my Mom ask for?
Another question Mini asked along the same lines was for Office 12 or 2007 or whatever it is going to be called. Have a look at the state of the art in features in Word 2003. It has 32 toolbars. How could this happen? This is a word processor. What should a word processor do? Allow me to write any type of document in the easiest possible way. Fortunately, Word 12 is much better in that regard with its context-switching ribbon.
Microsoft should be focusing on reducing the complexity of their products not only from an end user's standpoint, but also from a developer's point of view. Seven user interface technologies from one company? What's wrong with this? Rather than making things simpler, this is going the other way in complexity.
What I like about Office 12, plus servers like SharePoint version 3, is that there are developer tools that can customize a SharePoint site quickly and easily. Sure, it is FrontPage rebranded and upgraded to become SharePoint Designer, but the point is that a systems analyst can now modify SharePoint sites easily and even describe fairly complicated workflows, all without writing a single line of code. What does that mean? It means we can deliver customized solutions to our customers quicker without needing a PhD in programming, meaning the tool appeals to a wider audience. That's the ticket, MS: build more power-user tools that allow more people to manipulate your products using less code. The same approach could be applied to the tools you are using to build Vista.
This is called raising the level of abstraction. From where I sit as a software developer of 15 years, this is what will help industrialize our software industry so that software development becomes more predictable and repeatable, and then the headline could be that Vista ships on time as predicted. Till then, what a mess our industry is in, all indicative of Mini's blog detailing the inner workings of the largest and most successful software company in the world.
Friday, 17 March 2006
On occasion I am fortunate enough to participate in various vendors' early adopter programs, getting bits of products before they are publicly available. The WIFM is that by the time the product goes mainstream, I will have had a number of pilot projects under my belt and be ahead of the technology adoption curve, which is what keeps me employed.
Of course the vendors want something in return, usually a customer reference, as the idea is to have a partner work with a customer to use the early software bits in a pilot project. The good news is that you and the customer get a real early and generally working view of the product. The bad news about being a bloody beta tester is that more often than not you jump through (sometimes massive) hoops, because the product simply ain't ready yet. That's the trade-off. However, vendors could do more for their beta testers, as we will see.
The amount of time it takes to install these products and get them configured right, particularly when various dependent products are also in various beta stages, makes it a necessity to use Virtual PC (VPC). Virtual PC allows you to run a software-emulated computer in your host OS. So I am running XP as my host OS, and the VPC image is Windows 2003 Server configured as an app server, along with SQL Server 2005, VS2005, WinFX 3.0 Beta 2, and Community Technology Previews (CTPs) of various other bits and pieces. The list goes on and on: something like a dozen items over two days to get it all installed, and it has to be done in a certain order. Dear Mr. Vendor, would it be so hard to have all of the software on one DVD and a single button that says "install this scenario", just like you have in SQL Server and VS 2005? How hard can it be? Btw, the VPC lifespan is about one month, as new Betas and CTPs will appear, making your image obsolete, since it may (read: will) not be upgradeable, as we shall see.
VPC works great, but you spend a lot of time loading ISO images and restarting the (virtual) computer. Of course the VPC software emulation (even with 2 gigs of RAM) is much slower than your host OS, but the advantage is that you can take a snapshot of your VPC image after each item is installed, and if things go wrong, you can copy your previously working image and be up again very quickly. One problem is that our image is now 16 gigs in size and takes a while to copy across the network. The other major advantage is that all of the developers are working on the same VPC image, so if your code does not work on someone else's image, it's your code that's broken, not the config, which satisfies our single-unit-under-test requirement.
Now enter the beta product I have been testing. We have been working on Beta 1 for a little while, and it took a while to get the image running right. There also is little documentation, which is a bit unusual from this particular vendor, because generally they are very good on docs and code samples. Dear Vendor, please at least document how the tools work so we can try them out and not have to guess.
Then along comes a Beta 1 refresh. This is the vendor's response to the crush of feedback on Beta 1, coming to the realization that we can't wait for Beta 2, so let's give 'em a refresh, which really means: what's the latest known good build? OK, ship it! The documentation says that there is no upgrade path, neither to this Beta 1 version nor to future versions. Now I know this is beta software, but I think it is a bit inexcusable when the end user (the developers in this case, including me) has to go through massive hoops of uninstalling the previous version, including other add-on bits like WinFX, etc., and then installing the newer version, only to be told that you still have not removed enough of the old version to continue, even though according to Add or Remove Programs it looks like everything has already been uninstalled. Argh!
So now what? Back to a previous VPC snapshot, which now needs Windows updates and a restart, plus a newer version of WinFX with a restart, plus I needed a virtual DVD software emulator to mount the ISO images in the virtual PC (wha?), and man, on it goes, and I have not even gotten to the new refresh install! Which, by the way, has a list of 20 components!
During the installation process, on a restart, I got a message that said: Since Windows was first activated on this computer, the hardware on the computer has changed significantly (author's note: no it has not). Due to these changes, Windows must be reactivated in 3 days. Do you wish to reactivate now? I said yes, only to find out online that the number of licenses had been exceeded. Then I was given a number to call, where an automated attendant asked me to read a sequence of 40 characters (took forever), and then I was told that this was not right and that I needed to talk to a customer service representative (ooooohh, a human being, how rare). The customer support rep, who was very nice, gave me another 40 characters to enter (after I read my characters back to him) to activate my version of Windows. Btw, this was done before I uninstalled all of the previous Beta 1 software and tried installing the new product's refresh, only to be told I had not uninstalled everything. So I just trashed the entire image. So much for the activation. Double aarrgghh!!
So now I am installing everything under another VPC image that I had saved off, one that for sure does not have any of the original product's Beta 1 bits on it, or the rest of the software I need, for that matter. I hope I don't get that reactivation message again. Dear Vendor, if there is no upgrade process, please provide an uninstaller tool that completely removes the Beta or CTP product, with absolutely nothing lingering around. It would have saved me days of effort. Triple aaarrrggghhh!!!
So what about the refresh install? It is almost growing season here in BC; maybe I will get a job picking grapes, because it will be much easier and far less bloody than this!
Thursday, 16 March 2006
I recently discussed variability as software development's nemesis. One item I mentioned was that from a single vendor, Microsoft, there are seven user interface technologies to choose from. My point is that there should only be one. Really, really.
The software game is a whacky biz to be sure, all the way from iPods being dispensed by vending machines to the FBI scrapping the development of $170 million worth of software. The recipe for success in the software world is just as variable as programmer productivity: trying to stay on top of the ever-changing world of technology, let alone developing expert-level skill sets in any one area. As BarryV points out in his comment to my last post, I need a longer ferry ride.
This variability becomes (much) greater when executives are making decisions about software projects/products without any real idea as to how software is designed and constructed or how it works as a finished product. It is a complete mystery to them, yet, they are in charge. Ok, this is a blanket statement, but I have found more often than not that this is a truism in our industry. I am not laying blame, just making an observation.
So what's the issue? Education is a major factor. Education about software development, which in most cases, is truly an exercise in trial and error, given the newness of our industry and the variability in everything that is software. Btw, our trial and error software development process is a key reality point for anyone in our industry to fully understand. And I don't mean trial and error in the traditional sense of just guessing at what to do. It is more like guessing which way is the best way to accomplish any given set of tasks because there is so much overlapping technology to choose from, which also happens to be constantly changing. For any given technology, especially programming languages, there are hundreds of ways to solve the same problem, some better than others, but all valid with no right or wrong way.
The so-called Software Architect is supposed to be the person that can figure this out. Being employed as one, I say it ain't so easy. For example, Microsoft has seven user interface technologies to choose from. How does one become expert in each one of these so when the task comes to develop a user interface, you make the right choice based on the requirements?
From an executive's point of view, why should they know this or even care? They should care because it directly impacts the total cost of ownership (TCO) of the software being developed, for which the executive is ultimately responsible. If the software gets born and is useful to the target user community (an area where our industry's track record is less than stellar), it is usually around for a long time. Bug fixes, enhancements, and general maintenance usually make up the bulk of TCO. Therefore strategic planning in technology selection is just as important as developing the software itself.
Millions chose Visual Basic 6 to develop their business applications in. However, there is no easy upgrade path to .NET. I know of one large organization that has over 100 VB6 apps developed and running their business. They are now pondering how to move to .NET, not only from a technology perspective, but also from a training perspective. They are caught between a rock and a hard place, as the TCO here is stratospheric no matter how you cut it. Btw, the idea for this org is to consolidate many of the VB6 applications' functionality into .NET shared components, which will reduce the number of apps (and therefore maintenance costs), hence the transition.
So what does this have to do with "why don't they get it"? The "why don't they get it" question is usually asked of me by fellow programmers who have seen an executive business decision made that makes no business sense at all. In fact, I have asked that question myself many times while working for various companies that don't see what I (or other technology-savvy people) see. What we see as part of any project is targeting the software development at the latest possible technology. Immediately executives think the programmers just want to work on the cool new technology and tools. Yes, that is true, but you know why? Usually the latest and greatest tools allow me to do my job faster, better, and cheaper; in some cases, only the newest technology makes something possible. This also translates into lower TCO as the software moves through its lifecycle. And most importantly, whatever software gets born has some chance of living a full life: instead of being re-architected in 3 years, we may get 5+ years using a different technology set, and even longer if the technology chosen has a roadmap that shows it has a future as well. This is what most executives who are in charge of software development, but have not come from a programming background, don't get.
On rare occasions I have seen a planned technology roadmap that goes along with whatever project or product roadmap has been developed. Typically, once the software (if it ever makes it) goes into production, any technology upgrades are extremely low on anyone's list. Then the bug fixes and enhancements ensue. Over time, with no technology upgrades, the software bits that got glued on eventually either start dropping off or crush the architecture, and the cost of bug fixes and/or enhancements goes exponential. And then the project or product gets "re-architected" as a matter of course, more often than not on some new technology with no technology roadmap. The cycle continues. How to break that cycle?
One way is to demand better and fewer tools/products from vendors. As a software developer, I don't need a new or different hammer every year and an entirely new toolbox of tools or frameworks every couple of years. Know what I mean? This person gets it. Also, vendors consolidating and simplifying their product lines, instead of making even more product variations, would help promote the industrialization of software. Even though I use Microsoft technologies, I am unimpressed by the many overlapping technologies and different editions or versions of the same thing. Can you decipher from this blog how many editions of Visual Studio 2005 there are? There are 5 editions of XP and now 5 versions of Vista. This is just the client OS! 10 versions: ridiculous. Why not just one? Sure, I get the capitalistic thing, but capitalism can also be had with efficiency. Our software development world is far, far, far from being efficient. Sounds like a vendor opportunity to me.
In case it is not clear, the "they" in "why don't they get it" are the executive/management people in software development decision-making positions that have no background in software development - neither the process of it nor programming in it. A super-charged topic, even for the largest software company in the world. Next post I will discuss some of those "why they don't get it" decisions from the field. To use Dave Barry's phrase, this really happened!
Sunday, 27 November 2005
While paging through a software industry mag, I came across an ad for Oracle's middleware product. The ad's buy line was that by using their middleware product, application integration was "hot pluggable". Hot pluggable? The inference is that you can plug applications into their integration bus as easily as one can plug a hot-swappable hard drive into a computer, without powering down or affecting other applications. I have been building application integration solutions for 5 years, and my experience can attest that application integration is not hot pluggable. In fact, it is an incredibly complex rat's nest in which you have to make sense of many different vendor applications' proprietary data formats, using (mostly) proprietary APIs. Most vendors' middleware products also work at this complex level of abstraction, i.e. too low. Also, while the middleware vendor's help manual is 10,000 pages (talk about complexity!), nowhere does it tell you in detail how to solve the actual application integration problem, which in itself is just as complex as the vendor's toolset.
No wonder software developers are a cynical bunch when it comes to working with most vendors' products. What the data sheet says and what the marketecture talks about make no sense in the real world. While one could say that of any type of advertising, our software industry seems the most plagued, in my opinion. Why? As discussed in previous posts, software development is a complete mystery to almost everyone but the programmers. So who are the ads targeted at? Decision makers, who are CIOs or purchasing agents, most of whom don't (really) understand software development (a generalization), so all they have to go on is the marketecture.
I am not just picking on Oracle; I saw a Microsoft Office cartoon ad where the people had dinosaur heads on their bodies. Dinosaur heads? Huh? Is MSFT implying that their users are dinosaurs? The buy line is that one dinosaur accidentally forwarded everyone's salary to the entire company, and had the dinosaur evolved to using Information Rights Management in MS Office, then this would not have happened. I suppose it also means that one of the dinosaurs in the ad is about to become extinct.
As an industry professional, I find this dinosaur ad embarrassing, and insulting to the user community it is targeted at. Anyone who knows me personally won't accuse me of not having a sense of humor, but honestly, this is lame. Not only is the ad just plain dumb, but it lays FUD on the target audience. This is unacceptable. Do the marketing people for these companies know no bounds? Why not say that by using our product you can prevent sensitive information from being read by unauthorized readers using Information Rights Management, and here is how easy it is to do? Right now you might be thinking, what a naive software programmer. Maybe so. But then again, "".
Both ads remind me of an old Dudley Moore movie called, where he was an alcoholic adman who came up with his best ads while institutionalized at an insane asylum. Dudley had the inmates create ads that told it like it was, i.e. the truth. How is that for irony? Maybe the marketing people in the software industry can learn a lesson from this? My cynical programmer persona says I doubt it. However, eventually customers will push back, in another cycle of what happened in the dot-com era, and demand software products that do exactly what the marketing materials say they do. This would help the industrialization of software, but until then, would you like fries with your SOA?
Wednesday, 17 August 2005
Have you seen the movie Sideways? It sometimes reminds me of the IT world where everything seems OK and then something goes so sideways that you wonder if you are in the right profession.
Four years ago, my previous company was hired to develop a Contracts Management System (CMS) for an energy utility company based in Western Canada. The utility was going through the process of fitting into a recently deregulated industry. Anyone that has been through this understands the extreme chaos involved. Here was a utility company whose customer base, for over 20 years as a public utility, numbered in the few hundreds, servicing small to medium commercial businesses, and now had to scale to hundreds of thousands for the retail market; hence the need for software automation.
The amount of business change served up so fast was a painful experience for many. Most business processes had to be reengineered (read: discovered), and every software application either had to be replaced or retooled, or a net-new packaged or custom-developed application was put into place to handle deregulated business processes. With respect to CMS, the workflow for processing contracts was incredibly complex and recursive, based on the deregulation rules. We used BizTalk Server and its workflow capabilities to develop the solution, and also used BizTalk from an application integration point of view to send contracts (XML documents) to other applications.
The division we custom developed CMS for thought we did such a good job that an independent ROI case study was produced. We were pretty happy with the work we did and felt like we provided real value to an incredibly difficult business problem. So far so good.
Fast forward 4 years: I am working for a different systems integrator in a different location, but I get a call saying that there is this utility company looking for a document workflow solution for handling contracts. I think to myself, this can't be. But of course it is! It is indeed another Contracts Management System for the same energy utility. What are the odds? Or maybe this is a sure bet? No one knows what happened to the old CMS, as there are all new employees in the IT department.
It kinda feels like Groundhog Day, as every day I wake up and feel like I am still working on CMS, forever and ever. The sad part is that the energy utility has already spent $500,000 on a system that apparently does not exist anymore. What happened to it? No one knows. How do you lose $500,000 in software?
This reminds me of an excellent book by Alan Cooper, father of Visual Basic, called, The Inmates are Running the Asylum. If there ever was a truism in the IT business, I would say this one is it.
Someday, when software industrialization actually makes its way into our world, these types of sideways scenarios will become less prevalent in both the business and IT worlds. In the meantime, it's Groundhog Day!
Friday, 29 July 2005
Ok, we IT people in the software world have been known to take ourselves way too seriously (including myself) on occasion (how about all the time), so today we are going to have some nonsense. The nonsense I am talking about is my washing machine. Now, my long-time friend and co-founder of 5by5Software, Barry Varga, has asked me what I was doing around a washing machine in the first place. Well, I was fixing it for my lovely wife, Lesley. Yah, that's the ticket.
Remember the annoying beep beep beep firmware problem? Well, it's back. It seems that Lesley pressed some magical key combination that caused the infernal machine to resume beeping: three beeps on the minute, every minute, once the machine says it's done. Someone in the house has to physically go over to the machine and manually turn it off. Sledgehammer, anyone?
But there is more, and here is where the annoying crosses the boundary into the stupefying. Lesley likes to add things to the wash whilst the washer is washing. This, according to her, is common practice. So she presses the pause button (no kidding, there is a pause button) and tries to open the door. You guessed it: the door won't open. Her words were, why won't the machine do what I want it to do? I used to do this with my old washer (not software controlled), and not only could I stop and add items at any time, I could even restart, and the washer would already know what level the water was at and not add any more. I would have liked to tell her that once we purchased a computer-controlled machine she was no longer in control, but I am sure she does not want to hear that.
In the end, this means I get enlisted to fix it.
First, I do the unthinkable and consult the 27-page manual (told ya, I am a geek). Searching through the manual, I come across the following statement: Adding laundry is not possible because the door is locked for safety reasons. The very next line says: Laundry may be added after pressing the Start/Stop button. Huh? In fact, these contradictory statements are made in several places. There must be some (stupid computer) trick to make it work, because we still can't open the door, even after pausing the machine and even after following the procedure(s) in the manual. Obviously, as I told Lesley, we must call the manufacturer and request a secret decoder ring to figure out the magic computer logic sequence that unlocks the door.
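For what it's worth, the firmware logic that both the manual and Lesley seem to expect is not hard to express. This is pure speculation about how such a machine might gate the door - a Python sketch with made-up sensor names, not the manufacturer's actual logic:

```python
# Speculative sketch: the door should unlock on pause only when it is safe,
# i.e. the drum has stopped spinning and the water is below the door line.
# All names and the threshold are hypothetical.
def door_may_open(paused: bool, drum_rpm: int, water_level_pct: int,
                  door_line_pct: int = 40) -> bool:
    """Return True if the (hypothetical) firmware should release the door lock."""
    return paused and drum_rpm == 0 and water_level_pct < door_line_pct
```

Three lines of logic, clearly stated, would have satisfied both the safety lawyers and Lesley; instead the manual contradicts itself and the door stays shut.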
Lesley has something to say to the designers of this machine. As far as she is concerned, the old washer let you do this and the new one doesn't. From her perspective, the new washer is a poor design, as it does not meet her requirements, and she is frustrated that the machine and the instruction manual are so complicated that she gives up. Someone at the new washer manufacturer has played a stupid computer trick on her.
And that's the point. Something as simple as a computer-controlled washing machine appears to be too complicated to get right in the way of features and user interface design. How hard can this be? What does that say about software programs that are 100 or 1000 times the size and complexity?
Next week we are continuing with the industrialization of software where the topic will be lessons in abstractions using modeling tools.
Tuesday, 26 July 2005
In my discussion on certifying IT Architects, I came across this quote, But much of the work that architects do today is really an art form, not a certifiable set of practices, said James Barry, vice president of development for payroll and human resources applications at Automatic Data Processing Inc. (ADP) in Roseland, N.J. "The written communication and how they present their architecture would be mainly what we would look for in an architectural certification -- not the methodology that determines what to build," he said. "That would come from experience, not certification."
I am really beginning to wonder if we will ever get out of the dark ages in the software development world. With all due respect to Mr. Barry, I must admit I am dumbfounded by his comments. Art form? How they present their architecture? Let's look up the definition of architect: "One who designs and supervises the construction of buildings or other large structures." I would say that definition applies to an IT architect as well, except that our buildings are software structures. The art in architecture is in the look and feel of the structure, much like that of the user interface in a software structure.
However, in my experience, the look and feel represents less than 10%, or even 5%, of what an IT Architect does. We spend most of our time designing software structures so that they do not fall down! We design software blueprints much like a building architect designs building blueprints for a structure of any size. And we design and construct software architectures based on best practices, like Grady Booch's executable architecture, which has been an industry best practice for over 10 years. A reader of my previous post (thanks Brian Di Croce!) mentioned that there is a Software Engineering Body of Knowledge available, called SWEBOK. Perfect!
In fact, it is indeed the methodology (I prefer "practice" or "body of knowledge") that makes an IT Architect successful, in that s/he can predict and repeat successes in an industry that isn't too successful. That's the premise behind the Software Engineering Institute's Capability Maturity Model: you follow a prescriptive approach for increasing the maturity of your software development process by using sound engineering principles. The mantra is, "the quality of a software system is highly influenced by the quality of the processes used to develop and maintain it."
This is much the same way a building architect follows well-known, defined processes for designing and constructing buildings. Patrick MacLachlan, one of my co-workers at Burntsand, is a real architect. I asked him what it took to get his Architect's degree. Patrick said it takes 8 years minimum and, on average, 10 to 12 years!
Look at what Patrick had to do to obtain his Architect's degree. How can we take certifying our IT Architects seriously when there is no prescribed body of knowledge, no exams, and certification takes only 3 to 6 months? What a sorry state our software industry is in. That sorry state will be the topic of my next post.
Monday, 25 July 2005
In my last post, I discussed the issues around using the title Software Engineer and how our educational system (i.e. Computer Science programs) needs to get into the software engineering game. While perusing this topic, I came across a posting on Grady Booch's site called "Certification of IT Architects."
Reading the Open Group's FAQs, I came across this statement: "2.2 Will there be tests required to obtain the IT Architect certification? Since there is no prescribed body of knowledge for the program, there is no test. Instead we will assess candidates' experience and skills against the requirements of the program by evaluating their written applications and by a Certification Board interview."
The FAQs go on to say how long it will take to obtain certification (3 to 6 months) and how much it costs ($1,250 for the assessment, plus $175 per annum to remain certified, plus recertification every 3 years).
Now, I am all for industrializing our software world, but something bothers me here: the statement that there is no prescribed body of knowledge and therefore no test. No prescribed body of knowledge for an IT Architect? Come on, of course there is. Any seasoned IT Architect can pretty much tell you what body of knowledge is required to be successful at the job. It is the same body of knowledge that has been required since the first commercial software was written some 40 years ago.
It goes something like this: requirements, design, code, and test. Over and over again. In fact, it has never changed, other than the fancy marketing names that have been attached to this process over the years. Oh yes, you also need some project management skills, which you can find in the Project Management Body of Knowledge (around for 20 years). I would also suggest Grady Booch's excellent book (around for 10 years) as a body of knowledge. You will need a body of knowledge on the process of quality as well; the Software Engineering Institute has several bodies of knowledge in this subject area, including software engineering in general. The SEI has been around for over 20 years.
Ten years ago, I took a post-graduate, 2-year certificate program called "Software Engineering Management" at the University of Calgary, taught by Karl Williams on behalf of Motorola University. I can tell you there was a body of knowledge, because I had to write 17 exams over those two years to prove I knew what it was.
With respect to being a certified IT Architect, I am disappointed that our software industry would let an organization be accredited to issue certificates without requiring a prescribed body of knowledge or any written exams. How is this advancing the industrialization of software development?
Thursday, 21 July 2005
Shane Schick wrote this great article on why no one wants to be a programmer. As an old-school programmer, I laughed out loud about being a "cereal inventor," ha ha! Thanks to George Rafael for sending this to me.
I think Shane is bang-on that Microsoft and other vendors "have to take some time out from patting themselves on the back for making great products to show prospective computer science students that it stands to solve still more. They have to talk about what those problems are, and why they will be worth spending hours staring at a screen trying to combat them. They have to explain why the only thing better than living in a software-driven world is programming it in the first place."
I can tell you that we are still in the stone ages with respect to software development compared to other engineering disciplines. The industrial revolution has not yet occurred in the software world. As a programmer, I can't go to a software catalog and order a login component the way I can order an integrated circuit in the electronics world. How many login boxes do you think have been invented over and over again? Millions! And we as software developers continue to code them from scratch even today.
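To make the catalog analogy concrete, here is a minimal sketch of what such an off-the-shelf login component's interface might look like. The class and method names are illustrative inventions, not a real catalog part; the point is the narrow, reusable surface, with the credential handling (salting, key stretching, constant-time comparison) done once inside the component instead of hand-rolled in every application:

```python
import hashlib
import hmac
import os

class LoginComponent:
    """A reusable credential store behind a two-method interface,
    so no one has to reinvent the login box."""

    def __init__(self):
        self._users = {}  # username -> (salt, password hash)

    def register(self, username, password):
        salt = os.urandom(16)  # fresh random salt per user
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        self._users[username] = (salt, digest)

    def authenticate(self, username, password):
        record = self._users.get(username)
        if record is None:
            return False
        salt, digest = record
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        # constant-time comparison to avoid timing leaks
        return hmac.compare_digest(candidate, digest)

auth = LoginComponent()
auth.register("lesley", "s3cret")
assert auth.authenticate("lesley", "s3cret")
assert not auth.authenticate("lesley", "wrong")
```

Like ordering an IC, the consumer sees only `register` and `authenticate`; everything behind that interface is the component vendor's problem.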
It would be great if the young, bright minds of our computer science students helped us old-school programmers develop tools that bring us into modern times, so that we don't keep coding login boxes over and over again. In fact, it would be great if our universities actually offered Software Engineering degrees in addition to Computer Science degrees (aren't these the same? See "Software Engineering, Not Computer Science"), but this idea still seems to be in its infancy in our software world. Unfortunately, in Canada we are still arguing over the title of Software Engineer.
Titles aside, I am hoping that the professional engineering bodies in Canada are trying to help our immature software development world by institutionalizing some solid engineering practices in our computer science programs. Doing so will bring software development out of the dark ages and into the modern world, where our chances of building successful projects on time and on budget will rise above the less-than-20% success rate (The CHAOS Report) we have today.
Monday, 11 July 2005
I would like to thank Jim Bowyer, BizTalk guru extraordinaire, for bringing this to my attention; I had forgotten about the bozo bit. Ten years ago, I read an excellent book by Jim McCarthy. Jim sent me an article called "Un-Dynamics of Software Development, or, Don't Bite the Flip Bozo" by Paul Kimmel that references McCarthy's book, specifically the bozo bit.
Flipping the bozo bit means making a mental note that a particular person is a bozo, and that everything they say in the future should be ignored or looked upon as "the meanderings of a slightly annoying, occasionally amusing child or a drunken uncle."
Paul then goes on to write about Software Development Truisms:
1. It is still the Wild, Wild West out here and anything goes.
2. When someone says the schedule is going to be missed, they are never lying.
3. Change is a constant, but people will seldom thank you for changing their code.
4. A lot of bad software is being written by people who don't read.
5. People believe too much of what they read.
6. Authors are not smarter than the rest of us; they just read more.
7. Managers should not make technical decisions, but do.
8. If a manager says, "I am not technical," be prepared to spend a lot of time explaining things to them so they can make decisions they shouldn't be making.
9. Managers hire experts and ignore them all the time.
10. Messengers get shot more often than not.
11. Leaders have to lead; sometimes you will look behind you and find that someone is actually following.
12. You are the best programmer.
13. Programmers hate to read another programmer's code; if they volunteer to review your code, it is not to do you a favor.
14. There really are programmers ten times faster than everyone else.
15. Every man is in some way my superior, as long as he doesn't keep reminding me.
16. Programmers are emotionally attached to their code, but never say this out loud.
17. Many programmers are intellectual bullies and egomaniacs.
18. Everyone talks about constructive tension but doesn't want you disagreeing with them.
19. Benevolent dictators build the best software.
20. Decisions are, more often than not, emotional.
21. The mean time between the time you start speaking and someone flipping the bozo bit on you is ten seconds.
Some of you may think this is funny in a Dilbert kind of way. Others may think it is cynical. After being in the industry for 15 years, I can tell you it is all true. I am especially fond of this quote from Paul's article: "Ultimately, software engineers will have to be licensed, but I hope to be retired before that happens."
© Copyright 2010 Mitch Barnett - Software Industrialization is the computerization of software design and function.