Guest Opinion

Agile Coding, or Software Factory

Agile coding and the software factory approach both offer strong benefits for developers, but following either to its logical extreme can stunt productivity.

There are two camps out there making a lot of noise these days: the Agile/Extreme Programming camp and the Software Factory camp, which is really a subset of the broader Model Driven Engineering movement.

These camps both clamor for our attention, while most of us are still trying to make the best use of components, objects, and frameworks. So does either camp matter? Is one better than the other, and if so, why? Most importantly, should you care?

The Agile camp argues that coding is good, and that the key to quality and productivity is extensive testing. One cornerstone of the Agile world is Test Driven Development: the idea that you should write your test code, then write the code to be tested. Code you don't write is always suspect, because it's difficult or impossible to test.
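The test-first rhythm is easy to show in miniature. The column's context is .NET, but the idea is language-neutral, so here is a minimal sketch in Python with a made-up `slugify` function; the comments mark which piece a TDD practitioner would write first.

```python
def slugify(title):
    # Written second: just enough implementation to make the test below pass
    # (the "green" step of red-green-refactor).
    return title.strip().lower().replace(" ", "-")

def test_spaces_become_hyphens():
    # Written first: run before slugify existed, this failed ("red"),
    # which proves the test can actually detect a missing/broken feature.
    assert slugify("Agile Coding") == "agile-coding"

test_spaces_become_hyphens()
```

The order matters more than the tooling: the failing test defines what "done" means before any production code exists.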

I recently had a conversation with an Agile advocate and expert. He'd just demonstrated how to write a bunch of code to display data in a tree control in Windows, and I suggested he could have written a tiny fraction of the code had he used data binding. His reply was that he couldn't test data binding easily, but he could test the code he wrote easily. And he did, by the way. As part of his demo, he wrote the tests first, then wrote the code to do the work. And there's no doubt that the process was effective, and the code he produced was high quality.

At the same time, there's no denying that he wrote several times the number of lines of code you'd write if you used data binding, and it took him a lot longer to finish a simple task than it would have taken had he used the pre-built functionality of .NET. So there's certainly a tradeoff here.

The Software Factory camp argues that generating code is good, and that the overall quality of generated code is higher than the quality of handcrafted code. The less code you write, the higher the quality of your application and the higher your productivity. Rather than writing code by hand, you are better off writing code to generate your code; or so goes the rationale from this camp.

The idea of software generating software isn't new. It was tried extensively 15-20 years ago under the name Computer Aided Software Engineering (CASE). CASE tools allowed developers to design and build applications using high-level metaphors, often diagrams. If you look back at many of those tools, they don't look much different from the Domain Specific Language designers favored by Software Factory advocates today. So you have to wonder why Software Factories will succeed where CASE ultimately failed.

A few months ago I had a conversation with one of the leaders of the Software Factory movement, and I asked him exactly that question. His answer was vague, and amounted to "we've learned a lot since then," which didn't exactly inspire me with confidence.

At the same time, there's no denying that the basic concept of code-generation has caught fire. Virtually every major project I've been involved with over the past two to three years has used code-generation to one degree or another. Some larger projects generate the vast majority of their application's code, handcrafting only specific custom algorithms. And the results are typically spectacular in terms of both productivity and quality. If you find a bug, just fix the generator, regenerate the code and it's fixed universally.
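The "fix the generator, fix it everywhere" effect follows directly from how these generators work: every class comes from one template plus shared metadata. This is a hypothetical sketch (the metadata, template, and class names are invented for illustration), again in Python for brevity rather than a .NET templating tool:

```python
# Shared metadata describing the business entities to generate.
METADATA = {
    "Customer": ["name", "email"],
    "Order": ["order_id", "total"],
}

# One template drives every generated class; fix a bug here,
# regenerate, and the fix appears in all of them at once.
TEMPLATE = """class {cls}:
    def __init__(self, {args}):
{assigns}
"""

def generate(metadata):
    modules = {}
    for cls, fields in metadata.items():
        args = ", ".join(fields)
        assigns = "\n".join(f"        self.{f} = {f}" for f in fields)
        modules[cls] = TEMPLATE.format(cls=cls, args=args, assigns=assigns)
    return modules

code = generate(METADATA)
namespace = {}
exec(code["Customer"], namespace)  # compile the generated source
customer = namespace["Customer"]("Ada", "ada@example.com")
```

Real projects emit source files and compile them, of course, but the leverage is the same: the handcrafted surface area shrinks to the template and the custom algorithms.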

Where things get strange is when teams try to mix Agile and Factory concepts. I've seen projects where most business code is generated, and then a bunch of tests for that generated code are generated too. Since the code and the tests come from the same metadata and the same generator, it isn't surprising that the tests always pass. This is typically done to satisfy some arbitrary requirement, such as 90 percent code coverage, but I question the value of tests generated automatically alongside the code they're supposed to verify.
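Why such tests are vacuous is easy to demonstrate. In this hypothetical sketch (invented metadata and function names, Python for brevity), the metadata carries a bug, and because the generated test computes its expected value from the same metadata, the test passes anyway:

```python
# Suppose 0.05 was intended: the metadata itself carries a bug.
METADATA = {"discount_rate": 0.5}

def generate_code(meta):
    # Generate the production function from the metadata.
    return f"def price(p):\n    return p * (1 - {meta['discount_rate']})\n"

def generate_test(meta):
    # The "expected" value is derived from the same buggy metadata,
    # so the test can only confirm the code matches the metadata.
    expected = 100 * (1 - meta["discount_rate"])
    return f"assert price(100) == {expected}\n"

ns = {}
exec(generate_code(METADATA), ns)
exec(generate_test(METADATA), ns)  # passes, even though the rate is wrong
```

The generated test proves the generator is self-consistent, nothing more; only a handwritten expectation (`price(100) == 95.0`) would catch the bad rate.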

So who is right? Should you stop using code generation and handcraft all your code, along with tests for it? Should you abandon frameworks and pre-built components so you can test every line of code you use? Obviously that's silly, because it would preclude the use of .NET itself. At the same time, the idea that every line of code you handcraft should have corresponding test code is sound, and is something everyone should strive for.

So I contend that both camps offer value, but neither has the whole story. Generate code where you can, and handcraft code where necessary. If you do handcraft code, make sure to write corresponding tests to ensure the quality of that code. The result is the best of both worlds: high quality and high productivity for all your code.

The opinions expressed in this column are those of the author, and do not necessarily reflect the opinions of VSM or 1105 Media.

About the Author

Rockford Lhotka is the author of several books, including the Expert VB and C# 2005 Business Objects books and related CSLA .NET framework. He is a Microsoft Regional Director, MVP and INETA speaker. Rockford is the Principal Technology Evangelist for Magenic, a Microsoft Gold Certified Partner.
