Lhotka's Labyrinth

Keeping Sane in the Wake of Technology's Pace

Architecture can be key to staying one step ahead.

The pace of technology change is accelerating. Every year we have to absorb more new technologies and approaches to problems. These changes come from Microsoft, from third parties, and from the open source community.

In 2007 I found myself becoming increasingly unhappy, feeling more and more overwhelmed by the rate of change. I felt like I was slipping farther and farther behind. The rate of change, however, is outside any individual's control. So this year, rather than fight the change or despair over it, I've chosen to embrace and revel in it.

The key to making this possible is to develop a consistent, high-level architectural view of software. For me, this involves the idea that every application has business logic that belongs in a formal business layer. Above this layer sits an interface layer, allowing users or other apps to access the business logic. Below this layer sits a data-access layer that abstracts the interaction with the app’s data sources, whether they reside in a database, a set of services, XML files, or other sources.

The interface layer is never allowed to communicate with the data layer directly. It must interact with the business layer. The business layer can then interact with the data layer as needed. Similarly, the business layer can never communicate directly with the user (or consuming application). It must interact with the interface layer, which can then communicate with the consumer of the app. Each layer represents a clear set of functionality. Firm separation of concerns must be maintained between each layer.
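These layering rules can be made concrete in code. The following is a minimal sketch, not an excerpt from any real framework; the class names (`CustomerDataAccess`, `CustomerBusiness`, `CustomerScreen`) and the business rule are hypothetical, chosen only to show the allowed call paths: interface to business, business to data access, and nothing else.

```csharp
using System;
using System.Collections.Generic;

// Data-access layer: abstracts the data source (database, services,
// XML files, or other sources). Nothing above the business layer
// is allowed to call this class directly.
public class CustomerDataAccess
{
    public List<string> FetchCustomerNames()
    {
        // A real implementation would query a database or service;
        // hard-coded data keeps the sketch self-contained.
        return new List<string> { "Alice", "Bob" };
    }
}

// Business layer: holds the business logic and is the only layer
// permitted to talk to the data-access layer.
public class CustomerBusiness
{
    private readonly CustomerDataAccess _data = new CustomerDataAccess();

    public List<string> GetActiveCustomers()
    {
        List<string> names = _data.FetchCustomerNames();
        // Illustrative business rule: exclude blank names.
        return names.FindAll(n => !string.IsNullOrEmpty(n));
    }
}

// Interface layer: talks only to the business layer, never to
// the data-access layer, and handles all user interaction.
public static class CustomerScreen
{
    public static void Show()
    {
        var business = new CustomerBusiness();
        foreach (string name in business.GetActiveCustomers())
            Console.WriteLine(name);
    }
}
```

Because each layer depends only on the layer directly beneath it, the interface layer can be replaced (Windows Forms today, WPF tomorrow) without touching the business or data-access code.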

At the business layer, the three primary technologies at our disposal are procedural design/programming, object-oriented design/programming, and the use of workflow technologies. The options are plentiful at the interface layer, and growing.

At this time, they include Windows Forms, Web Forms, AJAX, Silverlight, Windows Presentation Foundation (WPF), ASP.NET MVC (model-view-controller), XML services, and workflows. But if you consider the business layer to be the center of every application, then all these technologies simply sit on top. One technique that makes this work in a practical sense is to ensure that your business layer supports data binding, because it simplifies the creation of Windows Forms, Web Forms, Silverlight, and WPF interfaces.
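In .NET, supporting data binding from the business layer typically means implementing `INotifyPropertyChanged` on your business objects, so that Windows Forms, WPF, and Silverlight interfaces can bind to them and react to changes. The sketch below is one common pattern; the `Customer` class and its `Name` property are hypothetical examples, not part of any specific framework.

```csharp
using System.ComponentModel;

// Illustrative business object that supports data binding by raising
// PropertyChanged whenever a bound property's value changes.
public class Customer : INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    private string _name;
    public string Name
    {
        get { return _name; }
        set
        {
            if (_name != value)
            {
                _name = value;
                OnPropertyChanged("Name");
            }
        }
    }

    protected void OnPropertyChanged(string propertyName)
    {
        // Copy to a local to avoid a race if a subscriber detaches
        // between the null check and the invocation.
        PropertyChangedEventHandler handler = PropertyChanged;
        if (handler != null)
            handler(this, new PropertyChangedEventArgs(propertyName));
    }
}
```

With this one interface in place, the same business object can sit behind a Windows Forms `BindingSource`, a WPF binding expression, or a Silverlight binding without any interface-specific code in the business layer.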

On the data-access side, the options are becoming a bit ridiculous. You can choose among ADO.NET, the DataSet/TableAdapter, LINQ to SQL, the ADO.NET Entity Framework, the RESTful Web data services model, various third-party ORM tools, various third-party code generators that create ADO.NET-based code, and perhaps other options. All of these technologies effectively do the same thing: they allow your business layer to interact with the application's data.

ADO.NET is fastest; all of the other technologies sit on top of ADO.NET, so they're slower. Once you understand that the other options incur overhead, you can decide whether that overhead is worth what you get in return. The DataSet offers flexibility in terms of code simplicity, filtering, and sorting, but it's relatively heavyweight. LINQ to SQL and the Entity Framework (when combined with LINQ to Entities) offer even more flexibility than the DataSet, in a lighter-weight model. Some of the third-party libraries and code generators that directly wrap ADO.NET are fast and focused, though often less flexible.

You can evaluate whether raw performance, simplicity of code, flexibility, or other factors are the primary drivers for your application. Then the choice of technology becomes a business issue of balancing cost versus benefit.

Ultimately, the key to remaining sane in the face of the increasing rate of change in technology is to decide on a high-level application architecture. Once you’ve done that, you can fit each new technology into that model, compare it to the other options, and decide which technology best meets the needs of your app or organization within your architecture.

About the Author

Rockford Lhotka is the author of several books, including the Expert VB and C# 2005 Business Objects books and related CSLA .NET framework. He is a Microsoft Regional Director, MVP and INETA speaker. Rockford is the Principal Technology Evangelist for Magenic, a Microsoft Gold Certified Partner.
