Q&A

VS2010 Q&A: Directions on Microsoft's Rob Sanfilippo

With the Visual Studio 2010 (VS2010) release candidate available to testers and the final version expected to launch on April 12, we spoke with Directions on Microsoft Analyst Rob Sanfilippo about his observations on the new IDE and the years-long development effort leading to its release.

What are your impressions of the product? Did Microsoft do well with Visual Studio 2010? Can you call out one or two aspects of the development effort that contributed to that result?
Overall, I am very impressed. The new test tools could go a long way toward helping teams raise software quality while shortening schedules, and even using hardware more efficiently. The modeling tools are very powerful, although they target a narrower segment of development teams. The IDE revamp and even the streamlined SKU lineup are commendable.

I think such great things came out of this release because VS2008 was such a solid release that the VS team had the luxury to go for stretch goals this time around, rather than having to focus on repairing shortcomings in the current release. The team effectively explained the goals of this release early and provided ample time for feedback, while keeping VS2008 fresh by releasing SDKs and extensions to address production customers' needs.

Microsoft seemed to succeed in its late scramble to address performance issues in the VS beta. But did it surprise you that such a critical issue cropped up late?
I don't think the discovery of performance issues was a surprise to Microsoft. I'm sure feedback regarding insufficient performance was coming in since the original CTP. Improvements were made with each subsequent release, but at some point during the Beta 2 cycle, Microsoft could see that criteria weren't being met and additional work needed to be done.

Given the ambitious feature set of VS2010, especially the move to a WPF-based IDE, it was clear that good performance would be one of the toughest hurdles. I was surprised, however, that the last-minute slip was limited to three weeks, which amounts to almost nothing in a major product release cycle. That's a great job by the team, assuming performance levels hold up in the final product.


Figure 1. Rob Sanfilippo, Analyst, Directions on Microsoft.

Any insight into how good a job Microsoft did collaborating with the evaluation community, particularly early on?
The team did a great job providing information through discussion and demo videos on Channel 9, conference sessions that were made available on demand, and blog postings. Considering the long gestation period of VS2010, about a year and a half from first CTP to expected final release, Microsoft likely had mounds of feedback to sift through. But it appeared to me that team members often responded expeditiously to customers using the test releases.

A product like VS has tentacles everywhere, from specific platform tooling like Silverlight and Azure to cutting edge implementations of foundation technologies like WPF 4. Any thoughts on how Microsoft managed this challenge and what it might have done better?
As has always been the case, Visual Studio has a highly strategic, mission-critical agenda for Microsoft technologies. It is the enabler of many key Microsoft platforms, and its delivery of tools for such platforms often determines whether they will be successfully adopted by the industry. This release in particular surely required the most cat-herding yet to get designers, templates and other tools in place to support SharePoint, the latest release of Silverlight and the entirely new world of Azure. But they are all in the box, so the job did get done. Smaller, more frequent releases are probably the best way to handle this better in the future.

Any insight into how well the VS team comported itself through this project? Do you think the effort was soundly managed?
I think many of the managers and engineers on the VS team right now, without naming names, have the appropriate mix of charisma and humility, and show that they are in touch with the real world of software development where not everything is defined by Microsoft. This hasn't always been the case, so it's refreshing to see. It's nice to see high-level managers blogging technical walk-throughs, being candid about team successes and failures, and remaining open-minded to all parts of the developer community.

Can you single out one or two changes in direction during the VS dev cycle that particularly impressed or surprised you? Can you talk a bit about what happened and why?
It was good to see the VS product line-up revamped with the elimination of the Team System brand, simpler MSDN subscription options and availability of a lower level TFS SKU to provide an upgrade path for the discontinued VSS (Visual SourceSafe) product.

I think the previous set of role-based VSTS editions caused customer confusion since there were too many products to consider, and in many development teams the role disciplines overlap. It also made licensing compliance more difficult. In addition, the new Test and Lab Manager SKU will help teams save costs while reducing unnecessary complexity for tools used by many team members.

Microsoft actually patched an IntelliSense-related bug in the VS2010 RC. Could this be a symptom of a rushed cycle?
There are always going to be last-minute bugs. This bug wasn't monumental, in my opinion, so I don't think it's a good indicator of the overall product stability at this point. I also have difficulty thinking of this as a rushed cycle, and it's good to see the team responding quickly with problem fixes. I would not be surprised if a first service pack is released before the end of 2010, with fixes and improvements to issues uncovered with the final release.

Aligning tooling and platforms is always difficult. How good a job overall is Visual Studio 2010 doing getting tools and platforms in sync? Do you fear the team is taking on too many dependencies at once?
The checklist of platform integration in VS2010 is absolutely impressive, especially considering the moving targets that the VS team had to work with. The integrated Azure, Silverlight and SharePoint tools, along with support for all the new technologies of .NET 4, are great achievements. One technology that didn't quite make it in, though, is Oslo, now renamed SQL Server Modeling, although it was never promised for VS2010. Also, I'd like to see better integration of Microsoft's Secure Development Lifecycle tools and processes into VS.

Regarding the risk of taking so many dependencies at once, I think this is a worthwhile risk for Microsoft to take, rather than leaving out technologies for this major VS release. There will be some growing pains (patches and guidance), but VS2010 will enable developers to create some great new, high quality applications.

If you could give Microsoft one piece of advice to improve the development effort for the next version of Visual Studio, what would it be?
Mostly, do more of the same, but make the next release a minor one on a shorter schedule so the community can stabilize a bit.

About the Author

Michael Desmond is an editor and writer for 1105 Media's Enterprise Computing Group.
