Breeding better software

It is often remarked that computer science, and especially its more applied sibling, software engineering, are relatively immature sciences. This is true on many levels, and is of course directly related to their young age. However, the question of how precisely to mature software engineering is still a matter of constant debate. Object-oriented design and development was the last mainstream effort in that direction, and it appears that more agile methods are slowly but surely making inroads into general practice.

There seems to be an evolutionary pattern at work here. In the beginning, it was all about the machine. How to make code most easily digestible by computers, i.e., how to improve runtime performance and keep down the memory footprint, was the absolute imperative. It is easy to see why performance and engineering “close to the metal” became king in that era. Computers at the time had slow processors and severe memory constraints, and simply could not run overly complex, unoptimized code efficiently. In a way, the properties of machines imposed natural “fitness criteria”, which led to a natural selection of the most efficient algorithms and designs.

Once machines became powerful enough, performance gradually took a back seat. Large and complex applications, with massive code bases and large numbers of developers involved, made it ever clearer that optimizing program code solely for machines is a dead end. While a computer will process any code with indiscriminate and unemotional obedience, programmers take a very different approach. Being human, a developer will understand code more easily if it is nicely formatted, logically structured, and commented. It is, of course, a natural human trait to look for art and beauty in any system, and to appreciate it, but this also has serious implications for the efficiency of production. Code that is easier to read takes less time to understand and makes it less likely that developers will introduce errors. Code quality is thus directly related to the effective quality of the software, both in terms of efficiency and the number of bugs.

The center of attention thus turned to developers, and their human nature, as the selection criteria of the software development process. Techniques like structured design and object orientation are all about making it easier for developers to understand the code. Development methodologies and software processes help structure the collaboration of many individuals. Design patterns impose a higher-level language on code, in which developers can communicate more efficiently. The Java language and platform epitomize this approach: they consciously trade runtime performance and memory footprint for an environment in which the developer has to do less work and is less prone to mistakes[1].

It is unlikely that, without this paradigm shift, we would have arrived at the current state of affairs, where software rules the world. Complex applications embedded at critical points of our daily lives (think of communication systems or the control of nuclear power plants) are hard to imagine without these advances[2]. While there is certainly a need for better development methodologies, and much room for improvement, I am convinced that there is another frontier for software development, one that is slowly creeping over the horizon.

The evolution of life on Earth has followed a similar path. The analogy is far from perfect, and it will probably have evolutionary biologists screaming in agony, but bear with me for a moment. In the beginning, the first primitive life forms had to adapt perfectly to their natural environment (i.e., the composition of sea water, soil, the atmosphere, etc.) in order to survive at all. We can roughly compare this initial battle for the survival of life itself with the first algorithms, encoded on punch cards. Then, life entered a long period of development governed by the Darwinian concepts of evolution. Basically, it was a battle of each species for its own, finding suitable niches and adapting to selection drivers: species adapting for their own sake, much as software engineering advanced as a methodological craft.

However, since human civilization began to gain ground on the planet, it has significantly affected the evolution of life in general. By selectively breeding plants and animals for their own use, people have replaced the natural selection criteria with artificial ones, ever further optimizing life forms for their usefulness to us humans. The focus of evolution has thus shifted from a self-sufficient, survival-oriented target towards purpose-driven development steered by selective breeding. It is hard to overestimate the effects of this change on human civilization. Before breeding, we had to make do with whatever was available. Now, we have optimized plants for the size and sweetness of their fruits. We have made animals more docile, optimizing some for consumption (e.g., dumbing down cattle and increasing their yield of meat) and some for their utility (e.g., dogs perfectly suited for hunting ducks[3]). To a large degree, we are now in control of evolution, gradually steering the development of life towards its greatest utility for us.

We are slowly arriving at the insight that we need to perform the same paradigm shift in engineering software. It is no longer sufficient to optimize the quality of code for its own sake, or even for its consumption by computers. We need to steer our applications in the direction of end users. Now, I hear your protests. “We do requirements engineering to steer development towards stakeholders’ interests. We perform usability tests to unveil shortcomings in the interaction interface.” Yes, and this is all well and good. But if we want to advance the state of software, and thus our craft, to the next level, we need to seriously step up these efforts.

Most programs are still largely developed without the end user in mind. For components like network stacks or libraries, this is of course perfectly alright. But looking at the complexity of many end-user-facing applications, like the Microsoft Office suite, I can’t help but get the feeling that the developers are still running their own show. Exposing program internals through arcane settings, just so that the developer can avoid thinking about her end users, simply won’t cut it anymore. People demand, and rightfully so, that the software they use caters and adjusts to them, and they are tired of learning yet another usage paradigm that some software developer has thought up.

The idea of user orientation, usability, and user experience testing and development is certainly not new. Apple was among the first to put usability in the driver’s seat, with the introduction of the Macintosh in 1984. At that time, and specifically within the Macintosh group at Apple, many fundamental principles were defined that are still among the most authoritative today. Of course, most of the other large software companies, like Microsoft or Adobe, also employ large usability departments. But their effect on design and development and, most critically, on the final product is much less pronounced than it was, and still is, at Apple. In recent history, we have seen a lot of the so-called “Web 2.0” companies adopt a similar attitude towards design. And I mean design not in the sense of mere prettiness, but in the Steve Jobs sense:

Most people make the mistake of thinking design is what it looks like… People think it’s this veneer — that the designers are handed this box and told, ‘Make it look good!’ That’s not what we think design is. It’s not just what it looks like and feels like. Design is how it works.

Steve Jobs, The New York Times, 2003

Despite all the criticism of Apple and its practices (some well-deserved, some less so), its wild success in recent history cannot be debated. It continues to increase market share, and to compel people to buy new products they did not even think they needed (iPhone, anyone?), despite its refusal to offer products at widespread discount rates. I am convinced that this success is largely due to Apple’s strong customer and end user orientation[4]. It does not try to please analysts and tech press writers with specs and bullet-point feature lists; it focuses on the value for the end user. Not in an abstract sense, but in the very down-to-earth manner of “how can we design this so that people will actually use it?”.

Many of the Web 2.0 companies, such as 37signals, are getting this as well. It is not the number of features, the framework or language you use and its efficiency, or the brand and marketing prowess pushing your product that matter. It’s the end users, stupid! If they won’t use it, if they don’t get why the product is useful to them, there will be no sale. And no success. And, ultimately, no advancement for software and the art of software engineering from your particular product.

Whatever you may think about Apple, 37signals, and what have you, it is time that we, as an industry and as individual developers, take this approach more seriously: that more of us use it, and use it more than we traditionally have. It is time to lay the model of developing software for its own sake to its well-deserved rest. We need to start breeding our code, so that it becomes what the people using it find most useful and natural. In my opinion, this paradigm shift is not a fad or short-lived fancy, but the next evolutionary step in developing software professionally. Whichever way you think about it, it makes sense to put your users in the driver’s seat. They are the next drivers of the evolution of software development. After all, this is like any evolution: if you don’t adapt to the selection drivers, you are headed for extinction.


  1. The most prominent example is Java’s use of garbage collection instead of explicit memory management. However, the orientation of Java as a developer’s language, rather than a computer’s, obviously runs much deeper. 
  2. Of course, there were also very complex systems in olden times, controlling power plants or financial trading with assembler code on mainframes. But they required the hard work of the best developers to function properly, and bugs were anything but uncommon. Today it is much easier for a programming novice to arrive at solutions of similar complexity, simply by leveraging best practices and ubiquitous frameworks. 
  3. Fun fact: Sebastian adds that dogs were initially domesticated and bred to be mobile, self-attending meat repositories for travel. 
  4. It is noteworthy that end user orientation, somewhat counterintuitively, does not mean catering to all end users. The most sensible and successful approach, pursued by both Apple and 37signals, is to pick a relevant market segment for which to optimize. This kind of more opinionated software will of course turn a large number of users off. However, opinionated software will naturally have some very enthusiastic and vocal supporters (or fans, if you will). And this is something no software developer would complain about.
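
The garbage-collection trade-off mentioned in footnote 1 can be made concrete with a minimal sketch (the class and method names here are my own, purely for illustration). In Java, memory a method allocates is reclaimed automatically once nothing references it anymore, so the developer never writes a counterpart to C’s free() or C++’s delete:

```java
// Minimal sketch of Java's garbage-collected memory model.
// The temporary StringBuilder below is allocated with `new` and never
// explicitly released; once buildGreeting() returns, no reference to it
// remains, and the garbage collector reclaims it on its own schedule.
public class GcSketch {
    static String buildGreeting(String name) {
        StringBuilder sb = new StringBuilder();
        sb.append("Hello, ").append(name).append("!");
        return sb.toString(); // sb becomes garbage here; no free() or delete needed
    }

    public static void main(String[] args) {
        System.out.println(buildGreeting("world")); // prints "Hello, world!"
    }
}
```

The price of this convenience is exactly the trade the footnote describes: the runtime spends cycles and memory tracking object reachability, in exchange for eliminating an entire class of developer mistakes such as leaks, double frees, and dangling pointers.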