
Q&A with Microsoft's Jerry Nixon: Quantum Computing and the Future of Software Development

Quantum computing is just too far out there for many developers. Even Jerry Nixon, an expert on the topic, admits it sounds a lot like something right out of Star Trek. But Nixon, a developer evangelist at Microsoft, says it's real -- and it's already happening.

In fact, Nixon says, the roles of software developers, data scientists and security professionals will never be the same as the age of silicon ends and we enter a new era of practical quantum computing.

To help them prepare, Nixon will present a session titled "Quantum Computing and the Future of Software Development" at the upcoming Visual Studio Live! conference in Chicago, running September 17-20.

We caught up with Nixon in advance of the show to find out more about this game-changing topic.

So, quantum computing really is coming, right?
Quantum computing is already happening. You bet. There are several quantum computers already in production, many on their second or third versions. What has not yet arrived is a practical, universal computer -- one that is not built for a niche problem; that is to say, one broad enough to tackle general problems and large enough to solve real ones. The universal quantum computers we see today are hardly as powerful as a calculator, but they already calculate at unfathomable speeds. The effort going on now is to make them larger so they can solve larger problems, including their own -- namely, handling noise from ambient heat and extending the life of a transient qubit before it decays.

What do you think is the most exciting potential of quantum computing, for both society at large and the enterprise?
It's crazy to envision everything that will be possible once we have them working. The problem with computers today, even if we were to magically unify them all to work as one, is that humanity's computational potential is rather pathetic. If everyone mastered and quadrupled the capability of the most powerful GPUs, for example, and we created zillions of them, it would be less than a drop in the bucket against the ocean of computation we would need to simulate even quite simple proteins or achieve classic human dreams like curing cancer, room-temperature superconductivity or finding ways to reverse harmful environmental trends. These are all theoretical right now, and our science is based on trial and error, not on actually computing solutions. That is only because we cannot compute what we need to compute. Quantum computers are the enablers.
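To put rough numbers on that claim (illustrative figures of ours, not Nixon's), compare an absurdly generous classical budget against the state space of a single modest molecule:

```python
# Back-of-envelope check on the "drop in the bucket" claim, using rough,
# illustrative figures that are not from the interview.
gpu_flops = 1e14                # ~100 TFLOPS: ballpark for a top GPU
gpus = 4 * 1e10                 # "everyone" owning a quadrupled GPU
classical_ops = gpu_flops * gpus             # ~4e24 operations per second

orbitals = 1000                 # rough electron/orbital count, small protein
amplitudes = 2 ** orbitals      # exact quantum state space: ~1e301 amplitudes
seconds = amplitudes / classical_ops         # one pass over the state space
print(f"{seconds:.1e} seconds")              # ~2.7e276; the sun has ~1.6e17 left
```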

What about for developers -- how do you think quantum computing is going to impact them over the next, say, 2 years, 5 years, 10 years?
We will see the first practical, operational quantum computers in five years. Those will be capable of computing every Bitcoin ever mined and every Bitcoin that could ever be mined, all in less than the time it would take you to swallow a drink of your Starbucks.

"We will see the first practical, operational quantum computers in five years. Those will be capable of computing every Bitcoin ever mined and every Bitcoin that could ever be mined, all in less than the time it would take you to swallow a drink of your Starbucks."

Jerry Nixon, Developer Evangelist, Microsoft

This is a rather big deal, sure, but they will be used not so much to wreak havoc on computer security as to solve the problems stopping us from building the version 2 quantum computers. Those we will start to see in the 7- to 10-year timeframe -- quantum computers whose inner workings can barely be understood, because they are designed by machines we barely understand. Not because those machines are better than we are, but because their computation is so vast and fast that their conclusions can only be verified by machines working similarly -- not by a human or any traditional computer we know of today.

Then, in the 10-year (maybe 15) timeframe after the second wave of quantum computers becomes functional, we can start to see massive human achievement: cancer, longevity, the physical sciences, the environment -- it will all be up to our imagination at that point. What's crazy is that all of this is happening right under the noses of mankind; these large research and engineering efforts, which will likely enable achievements we have mostly written off as impossible, are all happening now, carried out by armies of scientists scrambling to build the building blocks of Humanity version 2 (so to speak). Will we need version 3 and 4 quantum machines to achieve even more? Of course. Yes. And it's not all going to come so fast, because we have never trained scientists to think in this way -- the number of mathematicians and scientists capable of this level of work is quite low, and the number of STEM graduates continues to decline in America. So, many of these breakthroughs may come from abroad.

What can/should .NET developers be doing right now to prepare for quantum computing, especially those who are interested in this technology?
There's another timeframe worth considering: How long until it is no longer necessary to have a traditional programmer at all? That is more likely to be 40 to 50 years from now, when the creation of software can be accomplished at such staggering speeds and with such amazing accuracy that employing hand-crafted code is akin to building skyscrapers without steel. In the interim, two things will happen. No. 1: Life will go on -- there will be plenty of work in the work that remains. Companies writing Web sites, data integration, line-of-business and workflow applications will not be in the realm of quantum for decades, because there are far more important fish to fry and because quantum will be expensive at first. No. 2: It's like machine learning and the modern enterprise -- companies would be foolish to discount the advantage of augmenting their systems with appropriate quantum capabilities, even if they are expensive. Eventually, commoditized quantum for typical enterprise workloads will be a standard offering from clouds like Azure. Developers like you and me need to realize this is inevitable, and we need to recognize that these changes are just starting and we can get in on the ground floor and thrive through integration and simply not resisting. Like machine learning, today.

"Developers like you and I need to realize this is inevitable, and we need to recognize that these changes are just starting and we can get in on the ground floor and thrive through integration and simply not resisting."

Here's a more practical use case. I just presented at the recent symposium for Applied Materials, which -- you would not be faulted for not knowing -- builds the equipment that builds computer chips. Intel is one of its largest customers. Quantum computers will solve a universal problem: routing. You might think of it like FedEx: what route to take at which time of day based on previous and anticipated traffic patterns, including the weather, events on the calendar that could slow traffic, the disposition of the individual driver, how slowly the dock workers work on Mondays and so on. Add all that, then include a trillion boxes and a hundred million trucks, and you are much closer to the problem of routing in supercomputers. You need a supercomputer to route for a supercomputer, but even then the problem is one of repetition: If the first route is not the best, you have to run it again, paying the same cost in computation. In fact, you have to run them all before you can rank them; today we solve this with sampling -- that is to say, we do not run them all.

FedEx would never include all of those variables, because they would make the route computation impractical to compute. Similarly, your company would not try to process a mortgage faster based, in part, on the rotation of the earth, because even if it played a role, including that data would be impractical. What happens when it is no longer impractical? What happens when solar flares and the delivery schedules of the milkman really do impact the stock market, and it is no longer impractical to include those in your calculations? Maybe your hiring policies would change, your workflows would change, even which block you build your next building on might change. It turns out that there are macro and micro impacts on efficiencies, but we only have the computational power to include some of the macro effects at this point. In tomorrow's enterprise, that kind of determination will be a commodity delivered through quantum computing, and its use will differentiate early adopters.
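To see the combinatorics Nixon is describing, consider a toy routing sketch in Python (ours, for illustration; the distances and scale are invented). With n stops there are n! candidate routes, so exhaustive ranking explodes almost immediately, and sampling only estimates the best route:

```python
import itertools
import random

def route_cost(route, dist):
    # Sum the leg distances along a route.
    return sum(dist[a][b] for a, b in zip(route, route[1:]))

def exact_best(stops, dist):
    # Rank all n! routes -- the "run them all" cost described above.
    return min(itertools.permutations(stops), key=lambda r: route_cost(r, dist))

def sampled_best(stops, dist, tries=1000):
    # Today's workaround: score a sample of routes and hope one is good.
    candidates = (tuple(random.sample(stops, len(stops))) for _ in range(tries))
    return min(candidates, key=lambda r: route_cost(r, dist))

random.seed(0)
stops = list(range(9))          # 9 stops is already 362,880 candidate routes
dist = [[abs(i - j) + random.random() for j in stops] for i in stops]
print("exact  :", exact_best(stops, dist))
print("sampled:", sampled_best(stops, dist))
```

Every variable you add multiplies that n! search space, which is why sampling, not exhaustive ranking, is the state of the art on classical hardware.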

What else do you want to share with Visual Studio Magazine readers about quantum computing ahead of your session at Visual Studio Live! Chicago?
The real deal here is the wall we have hit. Moore's Law makes it appear that our capabilities are ever-increasing, but the rate of increase, though staggering, is far too slow to reach the lofty goals we see in our futures. Let's take curing cancer as an example. To model a molecule you have to capture its molecular behavior, its physical properties and its quantum (subatomic) properties in order to simulate it properly. That task already exceeds our capabilities today, by the way; even pretending we dedicated the entire human race to the feat, we would not have the calendar time to calculate the molecule's interactions with treatments or medicines before our sun burned out. But let's pretend we did. And let's pretend we found the cure we wanted. That solution, obviously wonderful, would only be the solution for a single type of cancer, and we would need to start over for the next one -- and there are hundreds of thousands, each with hundreds of thousands of variants, each with hundreds of thousands of possible treatments.

If, however, we could calculate just one possible treatment on a massive machine, and that machine could then calculate every possible derivation across unlimited parallel universes at one time, the whole effort would suddenly be something we could strive to accomplish. A qubit executes every possible state across universes in perfect simultaneity, leveraging quantum effects like tunneling, entanglement and teleportation instead of suffering from them the way traditional systems do as our transistors get smaller and smaller. It's not like the industrial age. It's like we've discovered electricity.
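The scaling behind that argument is easy to state: representing an n-qubit quantum state classically takes 2^n complex amplitudes, so each added qubit doubles the memory a simulator needs. A few lines of Python make the wall visible:

```python
# An n-qubit state vector holds 2**n complex amplitudes
# (16 bytes each as complex128 values).
for n in (10, 20, 30, 50, 300):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30
    print(f"{n:3d} qubits -> {amplitudes:.3e} amplitudes (~{gib:.3e} GiB)")
```

By 50 qubits the vector needs about 16 PiB of memory; by 300 it has more amplitudes than there are atoms in the observable universe.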

This sounds a lot like Star Trek, I know. But it's legit. There's a new particle called the Majorana particle, which was only theoretical until Microsoft captured one in March, proving not only that it existed (mathematicians already knew) but that we could start to use it ourselves. It is a special cluster of electrons in a superconductive state, stirred by a magnetic field that seems to align their behavior into what you and I would consider a new element (both in behavior and composition), one that is both matter and antimatter at the same time, in different dimensions. Again, so much Star Trek stuff here, huh? For the typical .NET developer today, the right move is to master .NET and turn your eyes to Q#, the quantum language written for traditional developers to operate quantum machines (it ships as part of the Microsoft Quantum Development Kit). And, honestly, it would make a lot of sense to pay attention in your linear algebra class!
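His linear algebra advice is concrete: a qubit is a two-component complex vector, gates are small unitary matrices, and measurement probabilities are squared amplitude magnitudes. A minimal numpy sketch (illustrative Python, not Q#):

```python
import numpy as np

# One qubit: states are 2-component complex vectors, gates are 2x2
# unitary matrices, and measurement follows the Born rule.
ket0 = np.array([1, 0], dtype=complex)                        # |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

state = H @ ket0               # an equal superposition of |0> and |1>
probs = np.abs(state) ** 2     # probabilities of measuring 0 or 1
print(state)                   # [0.707+0.j 0.707+0.j]
print(probs)                   # [0.5 0.5]
```

The same step in Q# is a single H(qubit) call; the matrices just stop being visible.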

About the Author

Becky Nagel serves as vice president of AI for 1105 Media, specializing in developing media, events and training for companies around AI and generative AI technology. She also regularly writes and reports on AI news and is the founding editor of PureAI.com. She's the author of "ChatGPT Prompt 101 Guide for Business Users" and other popular AI resources with a real-world business perspective. She regularly speaks, writes and develops content around AI, generative AI and other business tech. Find her on X/Twitter @beckynagel.
