Developer's Toolkit


Predicting the Future

Why do some technologies and products take off and become ubiquitous, while others die a quiet and ignoble death? In my two decades in computing, I've seen many instances of both, and at least on the surface it seems almost impossible to tell them apart.

I can offer some examples. In the early 1990s, the industry analyst community was confidently predicting the rise of IBM's OS/2 operating system. All of the adoption curves showed that it was destined for dominance, and for each year that those predictions failed to materialize, the analysts simply pushed the same curves farther into the future. Until, of course, they stopped drawing the curves altogether.

The same was true of the OSI (Open Systems Interconnection) network protocols. When I was working on my doctorate in the early 1990s, several of my professors confidently predicted the day when OSI would supplant TCP/IP and its associated protocols as the networking standard. By then I had become cynical enough about pronouncements of future technology adoption that I didn't believe any of it (and paid for that cynicism with lower grades).

We can explain both in retrospect. OS/2, while no doubt the technically superior product at the time, was undercut by Microsoft in favor of its nascent Windows franchise, while poor, clueless IBM had no idea of either the value of its product or the depths of Microsoft's deceit. And OSI, designed by committee, was technically correct but complex and expensive, and the birth of the bohemian Internet locked in the less complex and less expensive TCP/IP for long-haul data transport.

But is there any way of predicting which technologies will win or lose? Is there a pattern? Perhaps. The first thing we have to do is distinguish between the idea and its implementation. Good ideas usually get broad acceptance, and most people can agree on what constitutes a good idea, even if they disagree on its implementation.

Second, we should seek out those ideas whose implementation is being driven by formal standards drafted by standards bodies with wide representation. Specifically, we should seek out these implementations and disregard them. This conclusion harkens back to my experiences with OSI, which had all of the support from a broad range of standards committees and their members, but failed in the market. While it sounds egalitarian to participate in and support standards bodies, the conflicting goals of their members and the glacial pace at which they make progress almost guarantee their strategic irrelevance.

Cost is also a barrier to acceptance of new technologies. Even the most elegant implementation won't be broadly accepted if it costs too much. A good example is the early PC development environments promoted by Microsoft and IBM, which typically cost over two thousand dollars. PC development leaped ahead only with the arrival of Borland's low-cost Turbo development tools, which almost took over the market before Microsoft lowered its own prices and improved its tool set.

With those thoughts in mind, it seems to me that an idea everyone agrees is innovative is worth watching. Those implementations that reach the market quickly, with pricing and distribution that can reach a large number of users, are off to a good start.

One more characteristic of a winning implementation is the willingness of the vendor to rapidly assimilate feedback and make changes in response. Far too many companies fall in love with their own technology, and are unwilling to adapt it to customer needs. Anytime you hear a spokesperson say that the market had to catch up to their solution, run away as fast as you can.

I can't say that these observations are foolproof in identifying a winning technology, but they represent my impressions from watching the industry over a long period of time. There are certainly other factors involved, but I'll offer these as the ones that make a real difference.

Posted by Peter Varhol on 11/13/2005


