Lhotka's Labyrinth

Keeping Sane in the Wake of Technology's Pace

Architecture can be key to staying one step ahead.

The pace of technology change is accelerating. Every year we have to absorb more new technologies and approaches to problems. These come from Microsoft, from third parties, and from the open source community.

In 2007 I found myself becoming increasingly unhappy, feeling more and more overwhelmed by the rate of change. I felt like I was slipping farther and farther behind. The rate of change, however, is outside the control of any of us individually. So this year, rather than fight it or despair over it, I've chosen to embrace the change and revel in it.

The key to making this possible is to develop a consistent, high-level architectural view of software. For me, this involves the idea that every application has business logic that belongs in a formal business layer. Above this layer sits an interface layer, allowing users or other apps to access the business logic. Below this layer sits a data-access layer that abstracts the interaction with the app's data, whether it resides in a database, a set of services, XML files, or other sources.

The interface layer is never allowed to communicate with the data layer directly. It must interact with the business layer, which can then interact with the data layer as needed. Similarly, the business layer can never communicate directly with the user (or consuming application). It must interact with the interface layer, which can then communicate with the consumer of the app. Each layer represents a clear set of functionality, and a firm separation of concerns must be maintained between the layers.
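For example, here is a minimal C# sketch of that layering rule. All of the names (CustomerForm, CustomerEdit, CustomerDal) are hypothetical, and the data access is stubbed out:

// Data access layer: the only code that touches the data source.
public static class CustomerDal
{
    public static string FetchName(int id)
    {
        // A real implementation would use ADO.NET, LINQ to SQL, etc.
        return "Sample Customer " + id;
    }
}

// Business layer: owns the business logic and brokers all data access.
public class CustomerEdit
{
    public int Id { get; private set; }
    public string Name { get; private set; }

    public static CustomerEdit Get(int id)
    {
        // Only the business layer calls the data layer.
        return new CustomerEdit { Id = id, Name = CustomerDal.FetchName(id) };
    }
}

// Interface layer: talks only to the business layer.
public static class CustomerForm
{
    public static void Show(int id)
    {
        var customer = CustomerEdit.Get(id); // never CustomerDal.FetchName(id)
        System.Console.WriteLine(customer.Name);
    }
}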

At the business layer, the three primary technologies at our disposal are procedural design/programming, object-oriented design/programming, and the use of workflow technologies. The options are plentiful at the interface layer, and growing.

At this time, they include Windows Forms, Web Forms, AJAX, Silverlight, Windows Presentation Foundation (WPF), ASP.NET MVC (model-view-controller), XML services, and workflows. But if you consider the business layer to be the center of every application, then all these technologies simply sit on top. One technique that makes this work in a practical sense is to ensure that your business layer supports data binding, because it simplifies the creation of Windows Forms, Web Forms, Silverlight, and WPF interfaces.
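One common way to provide that data binding support in .NET is to have business objects implement INotifyPropertyChanged, the interface that Windows Forms, WPF, and Silverlight binding all understand. Here is a minimal sketch (the class name is hypothetical):

using System.ComponentModel;

public class CustomerEdit : INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    private string _name;
    public string Name
    {
        get { return _name; }
        set
        {
            if (_name != value)
            {
                _name = value;
                // Tell any bound UI control that this property changed.
                var handler = PropertyChanged;
                if (handler != null)
                    handler(this, new PropertyChangedEventArgs("Name"));
            }
        }
    }
}

Any of those UI technologies can then bind directly to Name, with no interface-specific code in the business layer.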

On the data access side of things the options are becoming a bit ridiculous. You can choose between ADO.NET, DataSet/TableAdapter, LINQ to SQL, ADO.NET Entity Framework, the RESTful Web data services model, various third-party ORM tools, various third-party code-generators that create ADO.NET-based code, and perhaps other options. All of these technologies effectively do the same thing by allowing your business layer to interact with the application’s data.

ADO.NET is the fastest option; all of the other technologies sit on top of ADO.NET, so they're slower. Once you accept that the other options incur overhead, you can decide whether that overhead is worth what you get in return. The DataSet offers flexibility in the form of simple code and built-in filtering and sorting, but it's relatively heavyweight. LINQ to SQL and the Entity Framework (when combined with LINQ to Entities) offer even more flexibility than the DataSet, in a lighter-weight model. Some of the third-party libraries and code generators that directly wrap ADO.NET are fast and focused, though often less flexible.
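One way to keep that choice a swappable business decision is to hide it behind a small data-access contract. Here is a minimal sketch; ICustomerDal and AdoCustomerDal are hypothetical names, and the connection string is a placeholder:

// The contract the business layer codes against.
public interface ICustomerDal
{
    string FetchName(int id);
}

// One implementation using raw ADO.NET: fastest, but the most code.
public class AdoCustomerDal : ICustomerDal
{
    public string FetchName(int id)
    {
        using (var cn = new System.Data.SqlClient.SqlConnection("...")) // placeholder
        using (var cm = cn.CreateCommand())
        {
            cm.CommandText = "SELECT Name FROM Customer WHERE Id = @id";
            cm.Parameters.AddWithValue("@id", id);
            cn.Open();
            return (string)cm.ExecuteScalar();
        }
    }
}

// A second implementation could wrap LINQ to SQL, the Entity Framework, or a
// third-party ORM; the business layer doesn't change either way.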

You can evaluate whether raw performance, simplicity of code, flexibility, or other factors are the primary drivers for your application. Then the choice of technology becomes a business issue of balancing cost versus benefit.

Ultimately, the key to remaining sane in the face of the increasing rate of change in technology is to decide on a high-level application architecture. Once you’ve done that, you can fit each new technology into that model, compare it to the other options, and decide which technology best meets the needs of your app or organization within your architecture.

About the Author

Rockford Lhotka is the author of several books, including the Expert VB and C# 2005 Business Objects books, and the creator of the related CSLA .NET framework. He is a Microsoft Regional Director, MVP, and INETA speaker. Rockford is the Principal Technology Evangelist for Magenic, a Microsoft Gold Certified Partner.
