In 1980 I worked for an advanced R&D electronics company that produced datasets and switches, from initial paper-napkin design, to engineering prototypes, to full-on production manufacturing. One of my roles was to design the printed circuit board layout, commonly referred to in the biz as the artwork. I did this by laying down various decals on a transparent Mylar sheet that represented where the electronic components would go, and then laying down traces (red and blue tape for a double-sided board) that represented the electrical connections between components.
I used an electronic circuit schematic diagram to read the electrical connections and mark off each trace that I put down on the layout design (i.e. artwork). The layout was done at 4 times the size of the physical printed circuit board; the design would eventually become copper traces, with the physical components plugged into the board and soldered.
From a process perspective, I had to shoot a negative (or positive) of the finished Mylar printed circuit board layout design, reduced 4 times back to actual size. This positive was then physically laid on a flat, photosensitive, solid-copper printed circuit board template, so that under a UV light the board was exposed and took on the photographic imprint of the design. The exposed board was submerged in a caustic chemical soup that etched away the copper everywhere except where the positive had exposed it. I ended up with a printed circuit board that was a replica of the positive, the Mylar layout design and ultimately the circuit diagram, with no loss in translation. Next I hand-drilled the holes in the printed circuit board and loaded it up with all of the electronic components, which I soldered in. Finally, I mounted the board in its case and voila, a finished prototype ready for testing.
The printed circuit board layouts I designed had some 200 electronic components, including a microprocessor, many ICs and several analog components, plus hundreds of traces. It took about 8 weeks of 8-hour days to actually lay out a circuit design of this size and complexity. Then it took a few days to get the positive shot, another week or so to get the board etched, and another week or so before you had the board masked, drilled, parts loaded, soldered and in the test rack in the lab.
In 1980, this was a manual, labor-intensive process, working at a light table for almost 2 months straight. Add another month to finish the prototype. Kinda reminds me of the process of developing modern-day software prototypes.
And then in 1982, a specialized computer arrived at my desk with a Computer Aided Design (CAD) program for designing printed circuit board layouts. It did not replace my job. Instead, I became a computer operator whose job required specialized domain knowledge (i.e. printed circuit board layout design). It took a while to figure it all out, but it did reduce my manual effort from 8 weeks down to a couple of days. This was a significant productivity increase that I gladly welcomed.
You programmed the CAD application by entering your electronic components (imported from a Bill of Materials list with some metadata) and then drawing the traces (actually using a light pen to touch one component lead to another on a graphical display). The computer would figure out the width of each trace based on how much current it had to carry (if you loaded the Bill of Materials metadata) and auto-route the trace in the most efficient way, trying hundreds of combinations and rearranging other traces each time a new trace was entered.
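The 1982 tool's exact sizing algorithm is long gone from my memory, but the current-to-width calculation it performed survives today in the IPC-2221 standard. Here is a minimal sketch in Python, assuming the IPC-2221 relationship I = k * dT^0.44 * A^0.725 (the constants are the standard's published values; attributing this exact formula to that old tool is my assumption):

```python
# Sketch: size a PCB trace for a given current using the IPC-2221
# relationship I = k * dT^0.44 * A^0.725, where A is the copper
# cross-section in square mils. Whether the 1982 CAD tool used this
# exact formula is an assumption; the constants below are IPC-2221's.

def trace_width_mils(current_a, temp_rise_c=10.0, copper_mils=1.378,
                     external=True):
    """Return the minimum trace width in mils for a given current."""
    k = 0.048 if external else 0.024          # IPC-2221 layer constant
    area = (current_a / (k * temp_rise_c ** 0.44)) ** (1 / 0.725)
    return area / copper_mils                  # width = area / copper height

# e.g. a 2 A trace on an outer layer of 1 oz copper (~1.378 mils thick):
print(round(trace_width_mils(2.0), 1), "mils")   # roughly 30 mils
```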
What does this mean? It means that the level of abstraction for solving this particular problem was raised significantly with the introduction of a highly specialized CAD program. Measuring my involvement against Geoffrey Moore's Technology Adoption Curve, I would be classified as an early adopter of this innovative and disruptive technology. This technology approach radically changed the way printed circuit board layouts were designed thereafter.
Today, some 25 years later, printed circuit boards are still laid out by humans developing the design, but the level of abstraction and automation is unbelievable. One only has to investigate the ASIC world to see how industrialized this industry has truly become: we have abstracted this specialized process so much that major chunks of designs are reused and assembled from massive libraries built upon standards. That is how mature this industry is. Back to printed circuit board design: the standards-based CAD output file is directly interpreted by a Computer Aided Manufacturing (CAM) device that can output a completely etched, masked, drilled printed circuit board with parts placed by robotics and wave soldered, all in a matter of minutes.
Back in the 80s I saw various parts of the electronics engineering design and manufacturing world go through a small industrial revolution with these highly specialized CAD/CAM technologies, which have now evolved to the point where DVD players are $30. Who would have guessed? It was not that long ago that an HP-41C calculator was $600.
Following opportunities in the electronics design industry, I got into the software development world in 1988. I was exposed to several programming languages. The one that always made the most sense to me from an efficiency point of view was Smalltalk; its entire syntax could be captured in a one-page document. Whereas C just made my head hurt, as did assembler; I just don't think that way. One Smalltalk app was developed for employees at banking institutions where the user interface modeled the physical objects on and in a banker's desk. Besides the calculator on the desk, there were pull-out drawers containing various folders and forms: you could select a drawer with the mouse, drag it open, select a folder and pull up a form ready to be filled in on your computer screen. The graphics were not that slick in those days, but it really looked like a physical replica of the banker's desk and all of the items it contained.
What totally got me was how the user interface design was received when we first tried it on a real banker. I will never forget it: the person sat down and just started to use the software, because it really did model his desk, where all the right documents were in the right folders, etc. He thought it was the coolest thing in the world. I could not believe that this person started using the software with no training; he just sat down and went to work. Really says something about how user interfaces should be presented to people, doesn't it? Hint: think domain specific.
Then I got into regular windowed GUI programming, for both the Mac and Windows (starting with VB1). Man, this was much harder than the Smalltalk programming I did. As a side note, when researching Smalltalk years later, I found out that it was a programming language originally designed for kids; I guess that's why I really liked it.
Back to our regular programming, and did I say how hard it was? I would spend hours trying to figure out how to make something, anything, work. It was a maximally manual, labor-intensive process; it almost felt like I was back in my electronics design world, hand-laying out printed circuit board designs trace by trace. I was stepping back in time from an industrial evolution point of view.
Flash forward 15 years: now I am a so-called Architect in the professional software development world. Over those 15 years, I have worked virtually every position one could have in the software development industry, both in product companies and professional services, including being the President of my own software development company for 4 years with a staff of up to 25 people.
While I think the level of abstraction has risen with tools and techniques in the software design/development world over the last 15 years, it sure has not made the productivity leap from 8 weeks down to 2 days that I experienced with the introduction of a specialized CAD tool in the electronics engineering world. Specialized being the key word.
I am not going to get into analogy arguments comparing the detailed processes and risks of designing software versus designing electronics, but I would suggest that the natural evolution of tools and techniques for designing software will follow a path similar to what occurred (25 years ago) in the electronics engineering design world. Over the last 15 years, I have seen steady progress in raising the level of abstraction in our software technologies (e.g. framework class libraries and runtime execution environments), tools (e.g. sophisticated designers and programming IDEs) and processes (e.g. Agile).
Consider this: all of the tools and applications discussed above could be considered Domain Specific Languages in our software development techno-speak. The key phrase being domain specific, or specialized. I know there is some history of CASE tools not having much success, but one probable downfall was that they were not domain specific (or specialized) enough; CASE tools tried to cover the entire SDLC. In the electronics world, I used a separate circuit diagram CAD tool that had a very specific palette of electronic circuit symbol objects, and the only thing the tool was designed for was drawing electronic circuits. I used a different domain specific tool to lay out the printed circuit board. Note, though, that through standards, the file format from the circuit diagram tool could be read, along with a bill of materials, into the printed circuit board layout design tool, and the layout in turn fed into a CAM tool that automated the production of the finished product.
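That handoff through agreed-upon file formats is the whole trick. Here is a minimal sketch in Python of one tool reading another's output through a shared format; the netlist format below is invented for illustration (real electronics toolchains standardized on formats such as EDIF and Gerber):

```python
# Sketch of the "standards-based handoff" idea: one tool writes a
# netlist, the next tool reads it. This file format is invented for
# illustration; it is not any real EDA interchange standard.

NETLIST = """\
NET VCC  U1.14 C1.1 R1.1
NET GND  U1.7  C1.2
NET CLK  U1.3  R1.2
"""

def parse_netlist(text):
    """Return {net_name: [(component_ref, pin), ...]}."""
    nets = {}
    for line in text.splitlines():
        _, name, *pins = line.split()
        nets[name] = [tuple(p.split(".")) for p in pins]
    return nets

# A downstream layout or CAM tool only needs to agree on the format:
for net, pins in parse_netlist(NETLIST).items():
    print(net, "->", pins)
```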
I see a similar pattern occurring in the software design world with the introduction of Software Factories, which is why this is very encouraging news to me. Jack Greenfield and crew are advancing the state of the art in software development techniques and tools, akin to what John Walker introduced with AutoCAD in 1982 to the engineering design and manufacturing world.
I believe we are at the beginning of an industrial revolution in the software design/development industry. No, I am not talking about robots coding software. I am talking about humans using domain specific, visual design tools to design software in a specialized CAD-like environment (remember my electronics example above) and having its standard output interpreted by code generators to produce the runtime executable. Code generation is similar to the automated manufacturing process, except we call it an automated build process. Note that the manufacturing process in building software costs as little as selecting the build command from the menu in Visual Studio. All of the effort is in the design, i.e. hand-writing source code.
As mentioned above, we already have that today in other industrialized industries. Computer Aided Design software allows designers to draw and specify in full detail (usually a model that is 100% complete) the design; let's use a real example, a piston for a car. The drawing output is saved to a standard file format (i.e. DXF, DWG), which can be interpreted by many devices to produce the physical output of the design. For example, a Computer Numeric Control (CNC) milling machine can interpret the design file as a loaded program of thousands of sequential Cartesian coordinates, and execute the program on a solid block of aluminum. The output? An aluminum piston, milled to a tolerance of one thousandth of an inch or better, that exactly matches the design, with no loss in translation. Fully industrialized and, ironically, driven by software.
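For a flavor of that interpretation step, here is a minimal sketch in Python of turning design coordinates into machine moves. G0 (rapid) and G1 (linear cut) are standard G-code commands, but the input format and translation logic are simplified inventions; real CAM software also handles tool offsets, feed curves, arcs and much more:

```python
# Sketch: turn a list of Cartesian coordinates from a design file into
# G-code moves for a CNC mill. G0 (rapid) and G1 (cut at feed rate) are
# standard G-code; the input format and translation are illustrative only.

def to_gcode(points, feed=120, safe_z=5.0):
    lines = [f"G0 Z{safe_z}"]                       # retract to a safe height
    x0, y0, _ = points[0]
    lines.append(f"G0 X{x0} Y{y0}")                 # rapid to the start point
    for x, y, z in points:
        lines.append(f"G1 X{x} Y{y} Z{z} F{feed}")  # cutting move
    lines.append(f"G0 Z{safe_z}")                   # retract when done
    return "\n".join(lines)

# A few coordinates standing in for the thousands a CAM tool would emit:
print(to_gcode([(0, 0, -1.0), (25.4, 0, -1.0), (25.4, 25.4, -1.0)]))
```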
I mean ironically in the way that software has industrialized other industries (e.g. electronics design and manufacturing), yet our own industry of designing software has not yet been industrialized. We still craft solutions by hand, in source code editors, one line at a time, and have seemingly not benefited from the level of automation our software provides other industries, such as Product Line Engineering in the electronics design world.
However, we are closing the gap with recent innovations in the art of software design with the introduction of Software Factories. When you think of a Software Factory, think of the end-to-end process, from initial design to code generating the solution. Most people think the Factory analogy applies just to manufacturing and completely forget about the design side, which is one of the reasons, I believe, that most of our software design tools are still code (i.e. text) editors.
In Microsoft techno-speak, a Software Factory embodies the AutoCAD process of using a specialized CAD-like tool to fully design (i.e. model) the software problem domain in 100% fidelity, meaning nothing is lost in translation. The standard file format the design is saved in, like the DXF mentioned above, can be exchanged with many other software programs that can interpret it. One example is to reconstruct the model in another design-time tool with no loss of fidelity. Or an interpreter can read the file format to assemble and configure a multi-project Visual Studio solution, then code generate and compile that solution into a runtime executable.
In Microsoft speak, the AutoCAD-like tool is called a Domain Specific Language (DSL), and the Visual Studio SDK provides a toolkit (the DSL Toolkit) for designing DSLs. The toolkit allows you to design and build your own visual domain specific language. Once it is designed, you use the supplied code generation framework to produce a runtime version of your specialized Application Designer. This generated Application Designer (hosted in the Visual Studio shell) is intended for a targeted user (i.e. a Business or Systems Analyst) to model and code generate an application specific to the problem domain the DSL was designed for.
The Business Analyst uses the runtime Application Designer to draw the solution using the domain specific language and, through the use of property sheets, configure all of the drawn objects for a particular solution problem domain (e.g. a CRM application). The output is saved in a known file format, which populates the source and component library artifacts of a multi-project solution (using the Visual Studio shell the designer is running in) and builds the output as a runtime executable of the (domain specific) designed application. This entire scenario is called a Software Factory. Need to make changes to the runtime design? The BA changes the drawing, regenerates, done. Over and over again. Need to make changes to the domain specific language itself? Same process, except the programmer changes the DSL design that describes the end-user designer application and recompiles the design tool. Over and over again. Note the capabilities for predictability and repeatability, which are the cornerstones of an industrialized process.
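To make the generation half of that loop concrete, here is a minimal sketch, in Python rather than the DSL Toolkit's actual .NET-based templating machinery: a toy in-memory model stands in for the designer's saved file, and class source is emitted from it. Every name here is invented for illustration, and regenerating after a model change is just running the generator again:

```python
# Sketch of the "model in, source code out" step of a software factory.
# The model and the generated classes are toy examples; a real factory
# reads the designer's saved model file and drives full project templates.

MODEL = {                       # stand-in for the designer's saved output
    "Customer": ["Name", "Email"],
    "Order":    ["OrderId", "Total"],
}

def generate_class(name, fields):
    """Emit one class definition for an entity in the model."""
    props = "\n".join(f"    public string {f} {{ get; set; }}"
                      for f in fields)
    return f"public class {name}\n{{\n{props}\n}}\n"

# Predictable and repeatable: the same model always yields the same code.
for entity, fields in MODEL.items():
    print(generate_class(entity, fields))
```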
Software Factories are at the innovation stage of the Technology Adoption Curve, with 100,000 early adopters. As mentioned before, I am really excited by this, as it is the type of industrialization I saw begin in the electronics engineering design world some 25 years ago. I wonder if history is about to repeat itself?
Barry Varga and I could be called innovators, having developed a Software Factory for BizTalk Server called Bridgewerx. It works using the same principles and concepts described in the Software Factories book written in 2004, and behaves much the same way as the 4 recently downloadable software factories do. The exception is that we code generate the entire Visual Studio solution, as our visual integration designer is a highly specialized DSL for completely describing application-to-application integration solutions built on top of a middleware product (i.e. BizTalk).
Whether you are someone just getting involved in the software development world or a seasoned pro, you owe it to yourself, at the very least from an educational perspective, to evaluate what Software Factories mean for you and the future of our industry.