Ola Bini wrote an interesting article called “Languages Should Die.” From the title, I was hoping that was really his position, but the article argues the opposite. I believe it is this type of thinking (i.e. language proliferation) that has our software development industry in trouble:
“No existing language will die in a long time. But we need new ideas. New fresh ways of looking at things. And new languages that incorporate what other languages do right. That’s how we will finally get rid of the things we don’t like in our current languages.
You should be ready to abandon your favorite language for something better.”
Define “better.” Define “do right.” I feel we reached good enough long ago with C or Smalltalk. Heck, Smalltalk is one of the simplest languages there is - the entire language syntax can fit on a postcard. What more do we want? Oh sure, there are some small technical issues, as with all languages, but technical deficiencies are not what did Smalltalk in; it was marketing. But I digress.
Wikipedia lists over 560 programming languages. Sure, they are not all in use today, but how many programming languages do we really need? How many spoken and/or written languages (e.g. English, Spanish, German) do you know well? At an expert level? How many people do you know who are expert level in 2 or more spoken and/or written languages? How about 5 or more? How long do you think it would take you to learn 5 spoken/written languages at an expert level? Apply your COCOMO model to that!
And that is before we even get to the AJAX frameworks: jQuery, MooTools, Prototype, etc.
It is no wonder to me that we cannot get any real work done, since we are all busy learning existing or new languages, frameworks, etc. It is no wonder to me that business is getting pissed off at developers in general, because everything takes too long, is too error prone, and, worse, does not meet their requirements.
OK, my point is not to be some old stodgy dinosaur - yeah, I just turned 50, and I have a 6-year-old daughter who can play Lego Star Wars better than I can. I am not trying to stifle innovation, but let’s be smart about what we are trying to innovate. Here is an example from my own neck of the woods: I have spent a lot of time learning C# since 2000 and have kept up with each version. Recently, I went to PDC and checked out all of the new C# 4.0 features. One of them introduces a dynamic type into what is otherwise a very statically typed language. I used (Iron)Python for my dynamic needs, and now some of that is in C#. Is that good, bad, better, worse, or what? Damned if I know; it is yet another thing I have to figure out as a software engineer. Oh, and I really liked F#, a brand-new functional language. So how many days, months, or years do I need to invest in learning F# to become an expert? And most importantly, why would I do that? Other than the coolness factor - admittedly really cool to me personally - what possible business value or real-world application does it have that would cause me to use it? Answer = none.
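For readers who have not worked in a dynamic language, here is a minimal Python sketch (the class and function names are mine, purely for illustration) of the duck typing I leaned on (Iron)Python for - the style of late-bound member lookup that C# 4.0’s `dynamic` keyword now brings into an otherwise statically typed language:

```python
# Duck typing: no declared interface, no declared types.
# Any object with a .total() method works; the lookup
# happens at run time, not at compile time.

class Cart:
    def total(self):
        return 42

class Invoice:
    def total(self):
        return 99

def report(item):
    # Resolved at run time - this is what C# 4.0's
    # `dynamic` defers to the runtime as well.
    return item.total()

print(report(Cart()))     # 42
print(report(Invoice()))  # 99
```

In static C#, `report` would need an interface or base class shared by `Cart` and `Invoice`; with `dynamic`, as in Python, the shared method name alone is enough.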
See what I mean? As Ola says,
“So - you should go out and fork your favorite language. Or your least favorite language. Fix the things that you don’t like with it. Modify it for a specific domain. Remove some of the cruft. Create a new small language. It doesn’t have to be big. It doesn’t have to do everything - one of the really cool things about functional languages is that they often have a very small core. Use a similar approach. Hack around. See what happens.”
Give me one sound business reason why this would be a good thing to do. You want a simpler language? We have that already; see the Smalltalk example above. You want a “better” or a “right” programming language? Then you had better have real definitions of what those mean, have identified real shortcomings (not just some syntactic sugar) in the mainstream languages, and your proposed improvement must be something like 10x, or why do it?
I feel that we already have (more than) enough programming languages to choose from, let alone the frameworks and batteries that come with them. We software developers/programmers/engineers seem to be our own worst enemies, adding more and more complexity to a domain that is already complex enough. What are we really doing to reduce the complexity instead of adding to it? Adding another language like F# to my skill set (which I would love to do personally) has absolutely no business value for me or my customers in my world of ecommerce web applications.
In my software engineering world, I am looking at every angle to reduce complexity. It is simply a matter of numbers: less is better. If I can reduce the number of programming languages, frameworks, integration points, executables, assemblies, etc., then the solution is simpler, the cost is lower, delivery is faster, and the result is easier to change and maintain - in short, the best business value for the purchaser of custom-developed software.
However, these days it feels like I am in the minority in our software industry, as we proliferate everything, including software development methodologies, to the point of insanity. I am concerned that the same thing is happening to programming languages (and everything else in our software development industry) as well. Looking at the financial market situation we are in now, I wonder when a similar reckoning will hit our industry. For all of our brain power, we seem to be following the same path. What is it going to take to divert it? When will the pendulum swing the other way, to favor economic sense (i.e. proven software engineering principles) instead of the crazy proliferation of anything and everything in the name of continuous improvement?
As Brad Cox says in “Planning the Software Industrial Revolution”: “I only wish that I were as confident that the changes will come quickly or that we, the current software development community, will be the ones who make it happen. Or will we stay busy at our terminals, filing away at software like gunsmiths at iron bars, and leave it to our consumers to find a solution that leaves us sitting there?”