Practical .NET

The Future of Programming

If you’ve been programming long enough, then you know that the "right way to do things" keeps changing. Here’s why, along with a description of where we are now and a guess about where we’re going.

I’m older than dirt and, as a result, just starting my fourth decade as a professional programmer. When I started programming ... well, with a beginning like that, you’re already guessing about how this paragraph will end. Either I’ll claim that:

  • Nothing ever really changes and I’m still doing the same things I did when I started.
  • We’ve been making constant progress in understanding how to build applications and we’ve now finally discovered the right process (which, given some of my earlier columns, probably has something to do with Domain-Driven Design).

Of course, neither claim is even remotely true.

What Is True
Yes, I’m doing some of the same things I did when I started writing COBOL code for money on a VAX-11/780. I am, for example, still typing code and still debugging that code because it never works the first time.

But I’m now an object-oriented programmer with a completely different toolset. The online, interactive applications I build now look nothing like the batch applications I built back in the day. I even did some assembler programming back then, and I assure you, the code I write now looks nothing like that code.

And I think that’s unlikely to change in the near future. If you come back in 20 years, I bet you two things: I’ll still be writing code, and I’ll be building completely different applications in a different way than I do now.

The Reason We Keep Changing
I blame our tools for these changes: We keep upgrading them.

The point of COBOL, for example, was to create an "English-like" language that was simpler than existing programming tools. With COBOL in place, it would be possible for business managers to write their own applications. And, of course, that’s exactly what happened ... only it wasn’t with COBOL, it was with Excel, Crystal Reports and the like.

COBOL did, however, simplify the act of creating applications. In response, the world asked for more complicated applications. Eventually, in fact, the world asked for more complicated applications than COBOL could deliver and, fortunately for us, more powerful programming tools turned up about the same time to let us deliver those applications.

Also, fortunately for us, the hardware we run our applications on became more powerful. Rather than harness all that power for our applications, we asked our OSes and development tools to do more work for us. The same thing happened with labor in the 1950s and ’60s: As wages rose, workers chose to give up some of those higher wages in return for a shorter work week. They could’ve kept the higher wages along with the longer hours but chose not to. We could’ve kept running our applications on DOS (where they would run faster than we could possibly imagine) but chose not to. Instead, we turned some of that additional hardware power over to our OSes and development tools so we could do less work.

Our Dreams: Growing and Shrinking
At the same time, our dreams of what counted as the "ultimate application" first grew and then shrank.

Initially, we were happy if we could just process an entire employee payroll between Friday night and Monday morning. As our tools and dreams grew, we asked for bigger applications that we called "enterprise" applications. This was the era of "systems thinking" when the goal was to build the "Mother of All Applications" (MOAA) that would fully integrate every aspect of our organization. Users would then run our applications.

In the last decade or so we’ve concluded we can’t handle that: We’re not smart enough or capable enough to pull off those kinds of projects. We’ve also decided applications should meet the needs of users, instead of our users being a necessary (but irritating) part of running our applications.

So now we have "design thinking," which starts with the goals and workflows of the users and works outward from those beginnings to create the applications and databases that our users need. We say the "interface belongs to the client" and our job is to write the code implementing the interface the client needs. Once, database design was the starting point for all business applications. Now, our table schemas are an "implementation detail" that falls out of creating the objects necessary to implement the interface.

The crazy thing is that I’m not making fun of the current process -- this is the way we live now.

Of course, this is only possible with tools that let us do that. The "interface belongs to the client" is only a reasonable approach if we have the tools that will let us implement whatever lunatic interface the client comes up with. Database design becomes an "implementation detail" only if we have great object relational mapping tools that support code-first development.
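In the .NET world, a code-first object relational mapper is what makes that reversal possible. Here’s a minimal sketch using Entity Framework Core (the Customer class, its properties, the OrderingContext name and the connection string are my own illustrative assumptions, not taken from any real project): the object is shaped by what the client’s interface needs, and the table schema is generated from it rather than designed up front.

    // A minimal sketch of code-first development with Entity Framework Core.
    // Assumes the Microsoft.EntityFrameworkCore.Sqlite package; all names
    // here are hypothetical examples.
    using Microsoft.EntityFrameworkCore;

    // The object is shaped by what the client's interface needs ...
    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; } = "";
    }

    // ... and the Customers table "falls out" of that object: EF Core
    // generates the schema from the model instead of starting from a
    // database design.
    public class OrderingContext : DbContext
    {
        public DbSet<Customer> Customers => Set<Customer>();

        protected override void OnConfiguring(DbContextOptionsBuilder options)
            => options.UseSqlite("Data Source=ordering.db");
    }

From there, EF Core’s migrations can create or update the actual database, which is exactly what lets the schema remain an implementation detail.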

And the tools will change again. When they do, we’ll be asked to build different kinds of applications because the tools will let us do that. Come back in 20 years and we may have tools that let us build "enterprise" applications that can adaptively "integrate the enterprise."

I say that because, I suspect, in 20 years most of the "code" I write will consist of constructing rules to control how customized applications are assembled at runtime in response to unique scenarios. I’ll still, however, think of what I’m doing as "coding." And I intend to still be creating applications in 20 years just to see if I’m right.

About the Author

Peter Vogel is a system architect and principal in PH&V Information Services. PH&V provides full-stack consulting from UX design through object modeling to database design. Peter tweets about his VSM columns with the hashtag #vogelarticles. His blog posts on user experience design can be found at http://blog.learningtree.com/tag/ui/.
