Developer's Toolkit

Balancing Productivity and Quality

I used to be an unquestioning proponent of formal modeling and other techniques for developing software at high levels of abstraction. It's easy to see how I came by that notion. While the assembly language programming I did as a college student was technically easy, it took an enormous amount of effort to perform even the simplest tasks. I concluded that this wasn't the most productive use of my time.

As a graduate student, I was introduced to the concepts of formal modeling (in my case Petri nets, and later statecharts), and became an immediate convert. The thought of diagramming my application and executing the diagram was appealing, because I didn't have to worry about housekeeping details such as type declarations and matching, memory allocation and deallocation, and pointer arithmetic. The semantics were all that mattered. Surely the productivity gains from working at such a high level of abstraction would outweigh any inefficiencies in execution, especially with the ever-faster performance of processors.

Well, time wounds all heels, and I've begun to have second thoughts about that set of beliefs. In the intervening fifteen or so years, some things have supported my original position. Processors, as well as memory and mass storage, have made significant advances in performance, and we have largely accepted code that isn't as fast as it could be in return for the ability to use frameworks and libraries to speed application development. And execution technology has become so good that managed execution environments have done away with most of the memory housekeeping chores I mentioned above.

Application architectures have also become far more complex than they were around 1990, and code written in older languages stumbles through N-tier, object-oriented, services-based applications and components. It's hard enough to get these applications right without also having to make sure the interactions between the code and the underlying machine are right.

I still believe that better processor performance and managed languages are important and valuable advances in software development, but I have become more concerned about the impact of abstraction on application performance and quality. Legacy languages (C, Pascal, Ada, take your choice) forced you to understand how they worked in order to get your program right. It wasn't always pretty or even necessarily fast, but when you were done, you knew more than just your code.

On the other hand, managed code just works if you get the semantics correct. I called it formal modeling back in nineteen-mumble-mumble, but managed code is very similar in that regard. Think of managed code as a more concrete implementation of an abstract model. That's what I was looking for, right?

Well, not anymore. Formal modeling is still the right way to go, but there is more to application development than correct semantics. A software application is more than a model, or even an implementation of a particular model. It has flaws, some of which arise from its own construction, others of which arise from the environment in which it runs. None of these flaws make it useless for its intended purpose, although users might occasionally experience the downside of software with minor failings. But the application exists within the machine, and will have fewer of those failings if it plays well with that machine.

Take memory management. I can write a managed application that operates correctly without understanding a thing about how it uses memory. Years ago I might have argued that that was a good thing. Today it concerns me, because the more you know about the interaction between your code and the underlying machine (both real and virtual), the better prepared you are to find flaws and write fast and efficient code. You can still influence these characteristics in both Java and .NET, if you understand how they work.
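To make that point concrete, here is a minimal Java sketch (my own illustration, not from the original column) of the kind of choice the runtime quietly hides from you: two loops with identical semantics, one allocating a fresh buffer on every pass, the other reusing a single buffer. Both are correct; only one is kind to the garbage collector. The record counts and sizes are hypothetical.

import java.util.Arrays;

public class BufferDemo {
    static final int RECORDS = 100_000;
    static final int RECORD_SIZE = 64 * 1024; // 64 KB per record (hypothetical workload)

    // Semantically correct, but allocates a fresh buffer every pass
    // and keeps the collector busy reclaiming short-lived garbage.
    static long processNaively() {
        long checksum = 0;
        for (int i = 0; i < RECORDS; i++) {
            byte[] buffer = new byte[RECORD_SIZE]; // new allocation each iteration
            buffer[0] = (byte) i;
            checksum += buffer[0];
        }
        return checksum;
    }

    // Same result for this workload, but one allocation and far less GC pressure.
    static long processWithReuse() {
        long checksum = 0;
        byte[] buffer = new byte[RECORD_SIZE]; // allocated once, reused throughout
        for (int i = 0; i < RECORDS; i++) {
            Arrays.fill(buffer, (byte) 0); // reset state instead of reallocating
            buffer[0] = (byte) i;
            checksum += buffer[0];
        }
        return checksum;
    }

    public static void main(String[] args) {
        System.out.println(processNaively());
        System.out.println(processWithReuse());
    }
}

Run either version with the JVM's -verbose:gc flag (or -Xlog:gc on recent JVMs) and the difference in collector activity is plain to see. The semantics never change, but the interaction with the machine does, and only a developer who understands the runtime will notice.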

Formal modeling languages that can generate code, such as UML, work at such a high level of abstraction that they don't even give you the opportunity to make those adaptations. Because you are farther from the machine, you can't see how your design decisions gobble memory or create a massive working set. You get great productivity but less quality, and that's not a good tradeoff when you let the tools make it for you.

I'm not advocating a return to assembly language or even legacy languages. Productivity is still important. But developers have to make that tradeoff, not have it made for them. Managed languages are a good intermediate step, but only if developers understand the implications of their decisions on the underlying machine. Formal modeling languages also need to give developers visibility into more than just the semantics of the application. Developers need to see how design decisions affect efficiency and performance. Once they can see and react to the interaction of code and machine, I'll be able to say I was right all along.

Posted by Peter Varhol on 03/26/2005

