After several months of research, review and revision, a white paper I wrote for the SQL Azure team, "NoSQL and the Windows Azure Platform", has been published by Microsoft. If you go to http://www.microsoft.com/windowsazure/whitepapers
and do a find within the page for "NoSQL" you'll see a link for it. If you'd rather download the PDF directly, you can do so by clicking here. The 25-page (not including cover and TOC) paper provides an introduction to NoSQL database technology, and its major subcategories, for those new to the subject; an examination of NoSQL technologies available in the cloud using Windows Azure and SQL Azure; and a critical discussion of the NoSQL and relational database approaches, including the suitability of each to line-of-business application development.
As I conducted my research for the paper, and read material written by members of the NoSQL community, I found a consistent sentiment toward, and desire for, cleaning the database technology slate. NoSQL involves returning to a basic data storage and retrieval approach. Many NoSQL databases, including even Microsoft's Azure Table Storage, are premised on basic key-value storage technology -- in effect, the same data structure used in collections, associative arrays and cache products.
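That shared key-value premise is simple enough to sketch in a few lines. Here's a minimal, purely illustrative Python toy (the class, keys, and values are all hypothetical, not any product's actual API), showing that at bottom such a store behaves like an associative array:

```python
# A toy key-value "store": the same basic structure underlying many NoSQL
# databases, associative arrays, and cache products.
class KeyValueStore:
    def __init__(self):
        self._data = {}  # an in-memory dict stands in for the storage engine

    def put(self, key, value):
        self._data[key] = value  # the value is opaque; no schema is enforced

    def get(self, key, default=None):
        return self._data.get(key, default)

    def delete(self, key):
        self._data.pop(key, None)

# Usage: entities are addressed by key, not queried by their contents.
# (A compound key loosely echoes Azure Table Storage's partition/row keys.)
store = KeyValueStore()
store.put(("orders", "42"), {"customer": "Contoso", "total": 19.95})
print(store.get(("orders", "42"))["customer"])  # Contoso
```

Everything beyond put/get/delete -- filtering, joining, aggregating -- is the application's job, which is precisely the trade-off discussed below.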
I couldn't help thinking that the recent popularity of NoSQL is symptomatic of a generational and cyclical phenomenon in computing. As product categories (relational databases in this case) mature, products within them load up on features and create a barrier to entry for new, younger developers. The latter group may prefer to start over with a fresh approach to the category, rather than learn the wise old ways of products whose market presence predates their careers -- sometimes by a couple of decades.
The new generation may do this even at the risk of regression in functionality. In the case of NoSQL databases, that regression may include loss of "ACID" (transactional) guarantees; declarative query (as opposed to imperative inspection of collections of rows); comprehensive tooling; and wide availability of trained and experienced professionals. Existing technologies have evolved in response to the requirements, stress tests, bug reports, and user suggestions accumulated over time. And sometimes old technologies can even be used in ways equivalent to the new ones. Two cases in point: the old SQL Server Data Services was a NoSQL store, and its underlying implementation used SQL Server. Even the developer fabric version of Azure Table Storage is implemented using SQL Server Express Edition's XML columns.
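The declarative-versus-imperative distinction is easy to see side by side. A small Python sketch (the table and data are invented for illustration) contrasts letting a SQL engine plan the query against hand-inspecting a collection of rows, as one typically must with a bare key-value store:

```python
import sqlite3

rows = [("Widget", 3), ("Gadget", 12), ("Gizmo", 7)]

# Declarative: describe the result; the database plans and executes it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (name TEXT, qty INTEGER)")
conn.executemany("INSERT INTO products VALUES (?, ?)", rows)
declarative = [name for (name,) in conn.execute(
    "SELECT name FROM products WHERE qty > 5 ORDER BY name")]

# Imperative: walk the collection ourselves, filtering and sorting by hand.
imperative = sorted(name for (name, qty) in rows if qty > 5)

print(declarative == imperative)  # True
```

Both produce the same answer; the difference is who does the planning, optimizing, and indexing -- the engine, or the application developer.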
So if older technologies are proven technologies, and if they can be repurposed to function like some of the newer ones, what causes such discomfort with them? Is it mere folly of younger developers? Are older developers building up barriers of vocabulary, APIs and accumulated, sometimes seldom used, features in their products, to keep their club an exclusive one?
In other engineering disciplines, evolution in technology is embraced, built upon, made beneficial to consumers, and contributory to overall progress. But the computing disciplines maintain a certain folk heroism in rejecting prior progress as misguided. For some reason, we see new implementations of established solutions as elegant and laudable. And virtually every player in the industry is guilty of this. I haven't figured out why this phenomenon exists, but I think it's bad for the industry. It allows indulgence to masquerade as enlightenment, and it holds the whole field back.
Programming has an artistic element to it; it's not mere rote science. That's why many talented practitioners are attracted to the field, and removing that creative aspect of software work would therefore be counter-productive. But we owe it to our colleagues, and to our customers, to conquer fundamentally new problems, rather than provide so many alternative solutions to the old ones. There's plenty of creativity involved in breaking new ground too, and I dare say it brings more benefit to the industry, even to society.
NoSQL is interesting technology and its challenge to established ways of thinking about data does have merit and benefit. Nevertheless, I hope the next disruptive technology to come along says yes to conquering new territory. At the very least, I hope it doesn't start with "No."
Posted on 04/25/2011 at 1:15 PM | 23 comments
The term "going native" can be a terribly derogatory phrase, connoting the prejudiced outlook of colonists toward the peoples on whom they've imposed themselves. But it can also be playful or empathetic, summoning images of intrepid travelers who get out of their hotels and try to meet people in the countries they visit, and maybe even eat their local delicacies and learn a few words of their language.
Are platforms like people? Are operating systems like countries? Is Silverlight a colonizer? Is HTML, especially HTML 5, an empathetic visitor to foreign lands? Or is it the conqueror, of Silverlight, of Flash, and even of the Windows API?
Talk about overloaded terms! After two days of MIX keynotes, the word "native" has been bandied about a lot. In the day 1 keynote, we heard a lot about the "native" support for HTML 5 in Internet Explorer 9/Windows 7 today and the greater support coming in IE10 and Windows v. Next sometime soon. In the day 2 keynote, we even saw how the "Mango" update to Windows Phone 7 will bring the same kind of integration between that operating system and its own implementation of IE 9. These operating systems will natively support HTML, making Windows the HTML place to be. Go where the natives go.
So native is a good thing, right? Not so fast. Because native can also refer to an application written specifically for the operating system's own application stack. Like a Win32 app, or a .NET Windows Forms or WPF app. And as much as native is a good thing when it comes to HTML support, it seems like native apps, in the non-HTML sense of the word, were on Redmond's naughty list at MIX this week. In fact, in the day 2 keynote, the only truly native apps we saw were ones that Microsoft Corporate Vice President Joe Belfiore and others showed running on Windows Phone 7 devices, or else they were demos of the Kinect SDK.
So maybe native is good, or maybe native is bad. Or maybe non-native things are bad, which means native is good. Because when something is native, there's something it's not: a plug-in. And in the day 1 keynote, Microsoft Windows Division President Steven Sinofsky, and Corporate Vice President for Internet Explorer Dean Hachamovitch, in talking about IE9 and IE10, were clear that the hallmark of these two browsers is that they rid us of our dependency on pesky plug-ins.
Which is noteworthy, because the very centerpiece of every MIX event up until this one was Silverlight, and Silverlight is, of course, a plug-in. The centerpiece has become an adornment, editorially, this year, at least. Draw your own conclusions. But make sure you learn at least some of the native markup language, because not everyone speaks XAML.
Posted on 04/14/2011 at 1:15 PM | 3 comments
In the run up to Microsoft's MIX conference, next week in Las Vegas, a new post on the Silverlight Team Blog
from Microsoft Developer Division VPs Walid Abu-Hadba, Scott Guthrie and Soma Somasegar provides new clarification of Microsoft's position on Silverlight and HTML 5. Read the post and interpret it for yourself. My take is this: given the current landscape of Smartphone and tablet OSes, only HTML 5 can let you reasonably target all of them, so Microsoft's going to bring you greatly improved dev tools for that platform. If your app needs to run only on Windows, Mac OS and/or Windows Phone 7, then Silverlight provides a richer, more optimized experience and greater developer productivity, so Microsoft's going to continue to invest there too.
I recently gave a talk on the Mobile Market at an exec briefing outside of Boston. In conducting my research for that talk, I discovered that an emerging strategy for mobile development is the creation of native apps that are merely thin shells around an embedded HTML 5 browser. The combination of there being five or six major mobile OS platforms (WP7, Android, iOS, BlackBerry, Symbian and webOS) and the fact that WebKit browsers show up on most of them (with IE9 and HTML 5 coming soon on WP7) means that a cross-device approach is the only one that's economically feasible for many developers, and HTML 5 is the cross-device approach that works.
With all that in mind, I think the position Abu-Hadba, Guthrie and Somasegar have outlined is the only one that's reasonable. The pledge Microsoft has made to support developers with strong HTML 5 tooling (and its gracious admission of the deficit that exists there right now) is a big deal, and I think it will be genuinely welcomed and appreciated by the developer audience.
Microsoft offers a strong Web and cloud platform that provides a superior environment for serving virtually any device (via HTML 5). It also has a highly evolved rich client platform in Silverlight that works beautifully on Windows, Mac and Windows Phone. So the emerging rule of thumb is to use Silverlight if the device targets support it; use ASP.NET and HTML 5 if not, or use a combination of Silverlight on supported devices and HTML 5 on all others. I think this is an ideal protocol given the far-from-ideal fragmentation we have in the client market.
There's the adage that when developing an application you can pick any two of the following three attributes: good, fast and cheap. Maybe we have a corollary to that now for cross-device development: pick any two of: rich, broad-reach and cheap. If that's the landscape, and I think it is, then Microsoft is giving us the best possible approach and toolset to work within it.
Posted on 04/05/2011 at 1:15 PM | 18 comments
This has been a busy week for Microsoft, and for me as well. On Monday, Microsoft launched Internet Explorer 9
at South by Southwest
(SXSW) in Austin, TX. That evening I flew from New York to Seattle. On Tuesday morning, Microsoft launched Visual Studio LightSwitch, Beta 2
with a Go-Live license, in Redmond, and I had the privilege of speaking at the keynote presentation where the announcement was made. Readers of this blog know I'm a fan of LightSwitch, so I was happy to tell the app dev tools partners in the audience that I thought the LightSwitch extensions ecosystem represented a big opportunity -- comparable to the opportunity when Visual Basic 1.0 was entering its final beta roughly 20 years ago. On Tuesday evening, I flew back to New York (and wrote most of this post in-flight). Two busy, productive days.
But a caveat tempered these accomplishments, because Monday was also the day reports surfaced from credible news agencies that Microsoft was discontinuing its dedicated Zune hardware efforts. While the Zune brand, technology and service will continue to be a component of Windows Phone and a piece of the Xbox puzzle as well, speculation is that Microsoft will no longer be going toe-to-toe with iPod Touch in the portable music player market.
If we take all three of these developments together (even if one of them is based on speculation), two interesting conclusions can reasonably be drawn, one good and one less so. Microsoft is doubling down on technologies it finds strategic and de-emphasizing those that it does not.
HTML 5 and the Web are strategic, so here comes IE9, and it's a very good browser. Try it and see. Silverlight is strategic too, as is SQL Server, Windows Azure and SQL Azure, so here comes Visual Studio LightSwitch Beta 2 and a license to deploy its apps to production. Downloads of that product have exceeded Microsoft's projections by more than 50 percent, and the company is even citing analyst firms' figures covering the number of power-user developers that might use it. (I happen to think the product will be used by full-fledged developers as well, but that's a separate discussion.) Windows Phone is strategic too... I wasn't 100% positive of that before, but the Nokia agreement has made me confident. Xbox as an entertainment appliance is also strategic.
Standalone music players are not strategic -- and even if they were, selling them has been a losing battle for Microsoft. So if Microsoft has consolidated the Zune content story and the ZunePass subscription into Xbox and Windows Phone, it would make sense, and would be a smart allocation of resources. Essentially, it would be for the greater good.
But it's not all good. In this scenario, Zune player customers would lose out. Unless they wanted to switch to Windows Phone, and then use their phone's battery for their portable media needs, they're going to need a new platform. They're going to feel abandoned. Even if Zune lives, there have been other such culs-de-sac for customers. Remember SPOT watches? Live Spaces? The original Live Mesh? Microsoft discontinued each of these products. The company is to be commended for cutting its losses, as admitting a loss isn't easy. But Redmond won't be well-regarded by the victims of those decisions. Instead, it gets black marks.
What's the answer? I think it's a bit like the 1980's New York City "don't block the box" gridlock rules: don't enter an intersection unless you see a clear path through it. If the light turns red and you're blocking the perpendicular traffic, that's your own lapse in judgment. You get fined and get points on your license and you don't get to shrug it off as beyond your control. Accountability is key. The same goes for Microsoft. If it decides to enter a market, it should see a reasonable path to success in that market.
Switching analogies, Microsoft shouldn't make investments haphazardly, and it certainly shouldn't ask investors to buy into a high-risk fund that is sold as safe and which offers only moderate returns. People won't continue to invest with a fund manager with a track record of over-zealous, imprudent, sub-prime investments. The same is true on the product side for Microsoft, and not just with music players and geeky wrist watches. It's true of Web browsers, and line-of-business app dev tools, and smartphones, and cloud platforms and operating systems too.
When Microsoft is casual about its own risk, it raises risk for its customers, and weakens its reputation, market share and credibility. That doesn't mean all risk is bad, but it does mean no product team's risk should be taken lightly. For mutual fund companies, it's the CEO's job to give his fund managers autonomy, but to make sure they're conforming to a standard of rational risk management. Because all those funds carry the same brand, and many of them serve the same investors.
The same goes for Microsoft, its product portfolio, its executive ranks and its product managers.
Posted on 03/16/2011 at 1:15 PM | 3 comments
Billy Hollis, my Visual Studio Live!
colleague and fellow Microsoft Regional Director said recently, and I am paraphrasing, that the computing world, especially on the consumer side, has shifted from one of building hardware and software that makes things possible to do, to building products and technologies that make things easy to do. Billy crystallized things perfectly, as he often does.
In this new world of "easy to do," Apple has done very well and Microsoft has struggled. In the old world, customers wanted a Swiss Army Knife, with the most gimmicks and gadgets possible. In the new world, people want elegant cutlery. They may want cake cutters and utility knives too, but they don't want one device that works for all three tasks. People don't want tools, they want utensils. People don't want machines. They want appliances.
Microsoft Appliances: They Do Exist
Microsoft has built a few appliance-like devices. I would say Xbox 360 is an appliance. It's versatile, mind you, but it's the kind of thing you plug in, turn on and use, as opposed to set-up, tune and open up to upgrade the internals. Windows Phone 7 is an appliance too. It's a true smartphone, unlike Windows Mobile, which was a handheld computer with a radio stack. Zune is an appliance too, and a nice one. It hasn't attained much traction in the market, but that's probably because the seminal consumer computing appliance -- the iPod -- got there so much more quickly.
In the embedded world, Mediaroom, Microsoft's set-top product for the cable industry (used by AT&T U-Verse and others), is an appliance. So is Microsoft's Sync technology, used in Ford automobiles.
Even on the enterprise side, Microsoft has an appliance: SQL Server Parallel Data Warehouse Edition (PDW) combines Microsoft software with select OEMs' server, networking and storage hardware. You buy the appliance units from the OEMs, plug them in, connect them and go.
I would even say that Bing is an appliance. Not in the hardware sense, mind you. But from the software perspective, it's a single-purpose product that you visit or run, use and then move on. You don't have to install it (except the iOS and Android native apps where it's pretty straightforward), you don't have to customize it, you don't have to program it. Basically, you just use it.
Microsoft Appliances that Should Exist
But Microsoft builds a bunch of things that are not appliances. Media Center is not an appliance, and it most certainly should be. Instead, it's an app that runs on Windows 7. It runs full-screen and you can use this configuration to conceal the fact that Windows is under it, but eventually something will cause you to abandon that masquerade (like Patch Tuesday).
The next version of Windows Home Server won't, in my opinion, be an appliance either. Now that the Drive Extender technology is gone, and users can't just add and remove drives into and from a single storage pool, the product is much more like an IT server and much less like an appliance. Much has been written about this decision by Microsoft. I'll just sum it up in one word: pity.
Microsoft doesn't have anything remotely appliance-like in the tablet category, either. Until it does, it likely won't have much market share in that space. And of course, the bulk of Microsoft's product catalog on the business side is geared to enterprise machines and not personal appliances.
Appliance DNA: They Gotta Have It.
The consumerization of IT is real, because businesspeople are consumers too. They appreciate the fit and finish of appliances at home, and they increasingly feel entitled to have it at work too. Secure and reliable push email in a smartphone is necessary, but it isn't enough. People want great apps and a pleasurable user experience too.
The full Microsoft Office product is needed at work, but a PC with a keyboard and mouse, or maybe a touch screen that uses a stylus (or requires really small fingers), isn't an adequate way to run it either. People want a flawless touch experience available for the times they want to read and take quick notes.
Until Microsoft realizes this fully and internalizes it, it will suffer defeats in the consumer market and even setbacks in the business market. Think about how slow the Office upgrade cycle is. Now imagine if the next version of Office had a first-class alternate touch UI and consider the possible acceleration in adoption rates.
Can Microsoft make the appliance switch? Can the appliance mentality become pervasive at the company? Can Microsoft hasten its release cycles dramatically and shed the "some assembly required" paradigm upon which many of its products are based? Let's face it, the chances that Microsoft won't make this transition are significant.
But there are also encouraging signs, and they should not be ignored. The appliances we have already discussed, especially Xbox, Zune and Windows Phone 7, are the most obvious in this regard. The fact that SQL Server has an appliance SKU now is a more subtle but perhaps also more significant outcome, because that product sits so smack in the middle of Microsoft's enterprise stack. Bing is encouraging too, especially given its integrated travel, maps and augmented reality capabilities. As Bing gains market share, Microsoft has tangible proof that it can transform and win, even when everyone outside the company, and many within it, would bet otherwise.
That Great Big Appliance in the Sky
Perhaps the most promising (and evolving) proof points toward the appliance mentality, though, are Microsoft's cloud offerings -- Azure and BPOS/Office 365. While the cloud does not represent a physical appliance (quite the opposite, in fact), its ability to make acquisition, deployment and use of technology simple for the user is absolutely an embodiment of the appliance mentality and spirit. Azure is primarily a platform as a service offering; it doesn't just provide infrastructure. SQL Azure does likewise for databases. And Office 365 does likewise for SharePoint, Exchange and Lync.
You don't administer, tune and manage servers; instead, you create databases or site collections or mailboxes and start using them. Upgrades come automatically, and it seems like releases will come more frequently. Fault tolerance and content distribution are just there. No muss. No fuss. You use these services; you don't have to set them up and think about them. That's how appliances work.
To me, these signs point out that Microsoft has the full capability of transforming itself. But there's a lot of work ahead. Microsoft may say they're "all in" on the cloud, but the majority of the company is still oriented around its old products and models. There needs to be a wholesale cultural transformation in Redmond.
It can happen, but product management, program management, the field and executive ranks must unify in the effort. So must partners, and even customers. New leaders must rise up and Microsoft must be able to see itself as a winner. If Microsoft does this, it could lock in decades of new success, and be a standard business school case study for doing so. If not, the company will have missed an opportunity, and may see its undoing.
Posted on 02/28/2011 at 12:28 PM | 3 comments
On Wednesday, HP pre-announced the second coming out for its recently acquired Palm webOS mobile operating system. I happen to think webOS is quite good, and when the Palm Pre first came out, I thought it a worthwhile phone. I was worried though that the platform would never attract the developer mindshare it needed to be competitive, and that turned out to be the case.
Then HP acquired Palm and announced it would be revamping the webOS offering, not only on phones, but also on tablets. It later announced that it would also use webOS as an embedded solution on HP printers.
The timing of this came shortly after HP had announced it would be producing a "Slate" product running Windows 7. After the Palm deal, HP became vague about whether the Windows-powered slate would actually come out. They did, in fact, bring the Slate 500 to market, but by some accounts, they only built 5000 units.
Another recent awkward moment between HP and Microsoft: HP withdrew itself from the Windows Home Server ecosystem. That one hurt, as they were the dominant OEM there. But Microsoft's decision to kill Drive Extender had driven away many parties, not just HP.
On Wednesday, HP came out with their TouchPad, and new phone models. Not a nice thing for Windows Phone 7, but other OEMs are taking a wait and see attitude there too, I suppose. There was one more zinger though, and it was bigger: HP announced they'd be porting webOS to PCs.
No Windows Phone 7? OK. No Windows Home Server? Whatcha gonna do? But no Windows 7 either? From HP? What comes after that, no ink and toner?
Some people think Microsoft's been around too long to be relevant. But HP started out making oscilloscopes! The notion that HP is too cool for Windows school is a bit far-fetched. This is the company that bought EDS. This is the company that bought Compaq. And Compaq was the company that bought Digital Equipment Corporation. Somehow, I don't think the VT 220 outclasses Windows PCs.
What could possibly be going on? My sense is that HP wants to put webOS on PCs that also have Windows, and that people will buy because they have Windows. And for every one of those sold, HP gets to count, technically speaking, another webOS unit in the install base. webOS is really nice, as I said. But being good isn't good enough when you are trying to get market share. Number of units shipped matters. The question is whether counting PCs with webOS installed, but dormant, is helpful to HP's cause.
Seems like a funny way to account for market share, and a strange way to treat a big partner in Redmond.
Posted on 02/10/2011 at 1:15 PM | 6 comments
Most of my posts are about Microsoft. This one is about a new firm that helps companies build strategies around Microsoft, and its stack. The company is called Blue Badge Insights
, and I'm proud to announce I've launched it, effective immediately.
There's been a lot that's led up to this. For over two years now, I have written the Redmond Review column for Visual Studio Magazine (VSM) and Redmond Developer News (RDN). I've had this blog longer than that, but I got a lot more serious about it when the column started, and when VSM and RDN picked the blog up as Redmond Diary on their Web sites.
I've been in the Microsoft-oriented writing/speaking world now for more than 15 years. I've been a Microsoft Regional Director for almost 10 years, and I've advised the tech press on Microsoft products, technologies and news for most of that time. Most of that has been a sideline though, because I've been in the technology consulting business for over 20 years, and that's been my primary focus.
But as much as I like the technology and have enjoyed implementing it, or facilitating its implementation by others, I've started to enjoy the strategy around, and analysis of, the technology even more. So instead of just writing about Microsoft, I've decided to take my Redmond-focused analyst and advisory activities and promote them from hobby to profession.
From our office in the Empire State Building, Blue Badge Insights has already started work for two major ISVs, a solution provider that is an established Gold Partner, a BI startup and even Microsoft itself. It's a great start, but there's a long way to go, and I hope readers here will consider telling friends, colleagues and their executives about Blue Badge Insights' services. In time, we'll be publishing a subscription-based written briefing that may be worthy of mention as well.
All that said, I must point out that Redmond Review and this blog are not going anywhere. In fact, they've just become a lot more important. What I talk about here and what I hear back in the form of reader comments is extremely important to my new business. Without the public content, there would be little credibility to our fee-based writings and services, and there would be little motivation behind them, too.
Blue Badge Insights is a new kind of business. Some people think it's a great idea. Other people think it's crazy. No matter who is correct, I have to make some noise. I hope you'll keep reading, I hope you'll keep commenting and I hope you'll grant me this one especially self-promoting post.
Posted on 01/24/2011 at 1:15 PM | 0 comments
Amazon Web Services (AWS) today announced the beta release of its "Elastic Beanstalk" Platform as a Service (PaaS) offering
. The platform initially is available to Java developers only, but it sounds pretty snazzy: you wrap your code up as a Java WAR file (a Web Application Archive), upload it and deploy it. There are tools developers can integrate into Eclipse to do the upload or you can use the AWS Management Console. Wait a few minutes and then visit your app at http://myapp.elasticbeanstalk.com/ or something similar.
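For the curious, the "wrap your code up as a Java WAR file" step is less exotic than it sounds: a WAR is simply a ZIP archive with a prescribed layout (a `WEB-INF/web.xml` deployment descriptor, compiled classes under `WEB-INF/classes`, and so on). A minimal sketch in Python illustrates the shape of the archive; the file names and contents here are placeholders, and real projects would of course use Ant or Maven:

```python
import io
import zipfile

# Build a minimal WAR-shaped archive in memory. The WAR format is just a
# ZIP whose entries follow the servlet spec's WEB-INF layout.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as war:
    war.writestr("index.jsp", "<html><body>Hello, Beanstalk</body></html>")
    war.writestr("WEB-INF/web.xml",
                 '<?xml version="1.0" encoding="UTF-8"?>\n<web-app/>')
    war.writestr("WEB-INF/classes/placeholder.txt",
                 "compiled .class files would go here")

# Re-open the archive and list its entries, as a deploy tool might.
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as war:
    names = sorted(war.namelist())
print(names)
```

The resulting bundle is what gets uploaded, whether through the Eclipse tooling or the AWS Management Console.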
What's most notable about the Beanstalk offering, as far as I can tell, is that Amazon doesn't have a hard distinction between their PaaS and IaaS (Infrastructure as a Service) offerings. Instead of being two separate things, they are simply opposite ends of a spectrum, such that a Beanstalk PaaS instance can be customized slightly or modified to the degree that it becomes much like an IaaS EC2 instance.
So what does this mean for Microsoft's Azure cloud platform? Its claim to fame thus far has been that it's a PaaS platform, with an IaaS option only now being phased in (in the form of the Virtual Machine Role offering, which is now in Beta). That's been the story and that's why, for many customers and applications, Azure has looked like the right choice. Is it all over now? Let's take inventory of some important, relevant facts:
- Microsoft has said all along that they and Amazon would eventually each get into the other's territory. That is, MS knew it would have an IaaS offering and MS knew that Amazon would have a PaaS offering.
- Microsoft is working feverishly to round out its platform with things like RDP access, VPN connections, Extra Small Instances and Azure Drive (making VHD files in Blob Storage available as mountable NTFS drives in Azure Role instances).
- SQL Azure is a Database as a Service (DaaS) offering. Although Amazon offers access to MySQL (via its RDS service), Oracle and SQL Server, these services are essentially hosted services built around on-premises products. That's not the same as a DaaS database that is automatically replicated and tuned. Plus AWS' SQL Server offering includes the Express and Standard Editions only. Moreover, SQL Azure offers automatic publishing of data in OData format and will soon offer cloud-based reporting too.
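That OData publishing point is worth unpacking: OData serves table data as an Atom feed whose entries carry typed property elements, consumable from any HTTP client. A minimal sketch of parsing such a feed in Python follows; the payload is a trimmed, invented example in the OData Atom shape, rather than a response from a live SQL Azure service:

```python
import xml.etree.ElementTree as ET

# A hypothetical, trimmed OData Atom entry of the general sort the
# service emits; real feeds carry many more elements per entry.
FEED = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom"
      xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata"
      xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices">
  <entry>
    <content type="application/xml">
      <m:properties>
        <d:CustomerID>ALFKI</d:CustomerID>
        <d:CompanyName>Alfreds Futterkiste</d:CompanyName>
      </m:properties>
    </content>
  </entry>
</feed>"""

NS = {
    "atom": "http://www.w3.org/2005/Atom",
    "m": "http://schemas.microsoft.com/ado/2007/08/dataservices/metadata",
    "d": "http://schemas.microsoft.com/ado/2007/08/dataservices",
}

root = ET.fromstring(FEED)
rows = []
for props in root.findall(".//m:properties", NS):
    # Strip the "{namespace}" prefix ElementTree places on each tag name.
    rows.append({child.tag.split("}")[1]: child.text for child in props})
print(rows)
```

The appeal is that the consumer needs no driver or client library at all, just HTTP and an XML parser.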
- Perhaps most important: the fact that Azure treats its PaaS instances as distinct from its IaaS ones makes the architecture especially well suited to spinning up and tearing down machine instances. By having apps able to execute on what is essentially a generic VM, Microsoft assures that developers will build apps that deploy seamlessly and with minimal risk of dependency on environment or software version specifics. Arguably, that removes impediments to availability and smooth operational logistics.
In any case, competition in this sphere is very important and helpful. That's a bit of a platitude, but it's still true. AWS offers virtual servers as a commodity; Azure offers an application execution and data processing environment. Each offering has its place, and if Amazon and Microsoft both stretch from their home bases into the other's territory, that rounds out each offering and keeps both companies honest and innovative.
It should also make each product increasingly economical. That's good news for customers, and for their customers and investors.
Posted on 01/21/2011 at 1:15 PM | 1 comment
I’m just back from the 2011 Consumer Electronics Show (CES). Every year I go to CES to get a sense of what Microsoft is doing in the consumer space, and how people are reacting to it. When I first went to CES two years ago, Steve Ballmer announced the beta of Windows 7 at his keynote address, and the crowd went wild. When I went again last year, everyone was hoping for a Windows tablet announcement at the Ballmer keynote. Although they didn’t get one (unless you count the unreleased HP Slate running Windows 7), people continued to show anticipation around Project Natal (which became Xbox 360 Kinect) and around Windows Phone 7. On the show floor last year, there were machines everywhere running Windows 7, including lots of netbooks. Microsoft had a serious influence at the show both years.
But this year, one brand, one product, one operating system evidenced itself over and over again: Android. Whether in the multitude of tablet devices that were shown across the show, or the burgeoning number of smartphones shown (including all four forthcoming 4G-LTE handsets at Verizon Wireless’ booth) or the Google TV set top box from Logitech and the embedded implementation in new Sony TV models, Android was there.
There was excitement in the ubiquity of Android 2.2 (Froyo) and the emergence of Android 2.3 (Gingerbread). There was anticipation around the tablet-optimized Android 3.0 (Honeycomb). There were highly customized skins. There was even an official CES Android app for navigating the exhibit halls and planning events. Android was so ubiquitous, in fact, that it became surprising to find a device running anything else. It was as if Android had become the de facto original equipment manufacturer (OEM) operating system.
Motorola’s booth was nothing less than an Android showcase. And it was large, and it was packed. Clearly Moto’s fortunes have improved dramatically in the last year and change. The fact that the company morphed from being a core Windows Mobile OEM to an Android poster child seems non-coincidental to their improved fortunes. Even erstwhile WinMo OEMs who now produce Windows Phone 7 devices were not pushing them. Perhaps I missed them, but I couldn’t find WP7 handsets at Samsung’s booth, nor at LG’s. And since the only carrier exhibiting at the show was Verizon Wireless, which doesn’t yet have WP7 devices, this left Microsoft’s booth as the only place to see the phones.
Why is Android so popular with consumer electronics manufacturers in Japan, South Korea, China and Taiwan? Yes, it’s free, but there’s more to it than that. Android seems to have succeeded as an OEM OS because it’s directed at OEMs who are permitted to personalize it and extend it, and it provides enough base usability and touch-friendliness that OEMs want it. In the process, it has become a de facto standard (which makes OEMs want it even more), and has done so in a remarkably short time: the OS was launched on a single phone in the US just two and a quarter years ago.
Despite its success and popularity, Apple’s iOS would never be used by OEMs, because it’s not meant to be embedded and customized, but rather to provide a fully finished experience. Ironically, Windows Phone 7 is likewise disqualified from such embedded use. Windows Mobile (6.x and earlier) may have been a candidate had it not atrophied so much in its final five years of life.
What can Microsoft do? It could start by developing a true touch-centric OS for tablets, whether that be within Windows 8 or derived from Windows Phone 7. It would then need to deconstruct that finished product into components, via a new or altered version of Windows Embedded or Windows Embedded Compact. And if Microsoft went that far, it would only make sense to work with its OEMs and mobile carriers to make certain they showcase their products using the OS at CES, and other consumer electronics venues, prominently.
Mostly though, Microsoft would need to decide if it were really committed to putting sustained time, effort and money into a commodity product, especially given the far greater financial return that it now derives from its core Windows and Office franchises. Microsoft would need to see an OEM OS for what it is: a loss leader that helps build brand and platform momentum for up-level products. Is that enough to make the investment worthwhile? One thing is certain: if that question is not acknowledged and answered honestly, then any investment will be squandered.
Posted on 01/10/2011 at 1:15 PM | 8 comments
I have been spending the last 2 weeks immersing myself in a number of Windows Azure and SQL Azure technologies. And in setting up a new business (I'll speak more about that in the future), I have also become a customer of Microsoft's BPOS (Business Productivity Online Services). In short, it has been a fortnight of Microsoft cloud computing.
On the Azure side, I've looked, of course, at Web Roles and Worker Roles. But I've also looked at Azure Storage's REST API (including coding to it directly), I've looked at Azure Drive and the new VM Role; I've looked quite a bit at SQL Azure (including the project "Houston" Silverlight UI) and I've looked at SQL Azure labs' OData service too. I've also looked at DataMarket and its integration with both PowerPivot and native Excel. Then there's AppFabric Caching, SQL Azure Reporting (what I could learn of it) and the Visual Studio tooling for Azure, including the storage of certificate-based credentials. And to round it out with some user stuff, on the BPOS side, I've been working with Exchange Online, SharePoint Online and LiveMeeting.
I have to say I like a lot of what I've been seeing. Azure's not perfect, and BPOS certainly isn't either. But there's good stuff in all these products, and there's a lot of value.
Azure Goes Deep
Most people know that Web and Worker roles put the platform in charge of spinning virtual machines up and down, and keeping them up to date. But you can go way beyond that now. The still-in-beta VM Role gives you the power to craft the machine (much as does Amazon's EC2), though it takes away the platform's self-managing attributes. It still spins instances up and down, making drive storage non-durable, but Azure Drive gives you the ability to store VHD files as blobs and mount them as virtual hard drives that are readable and writeable.
Whether with Azure Storage or SQL Azure, Azure does data. And OData is everywhere. Azure Table Storage supports an OData Interface. So does SQL Azure and so does DataMarket (the former project "Dallas"). That means that Azure data repositories aren't just straightforward to provision and configure... they're also easy to program against, from just about any programming environment, in a RESTful manner. And for more .NET-centric implementations, Azure AppFabric caching takes the technology formerly known as "Velocity" and throws it up into the cloud, speeding data access even more.
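Coding to the storage REST API directly mostly means building the right headers by hand. As a rough sketch of what that involves, here is the SharedKeyLite signing scheme the Table service's REST API accepts, where the request date and canonicalized resource are HMAC-signed with the account key; the account name, key and table below are placeholder values, and real requests would of course need the full set of date and version headers:

```python
import base64
import hashlib
import hmac

def shared_key_lite_header(account, base64_key, date_str, table):
    # Under SharedKeyLite for the Table service, the string-to-sign is just
    # the request date, a newline, and the canonicalized resource path.
    string_to_sign = date_str + "\n/" + account + "/" + table
    key = base64.b64decode(base64_key)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    # The Authorization header carries the account name and the Base64 signature.
    return "SharedKeyLite %s:%s" % (account, base64.b64encode(digest).decode("ascii"))
```

Once you see that the "hard part" is a dozen lines of hashing, querying a table from any environment with an HTTP stack stops feeling exotic.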
Snapping in Place
Once you get the hang of it, this stuff just starts to work in a way that becomes natural to understand. I wasn't expecting that, and I was really happy to discover it. In retrospect, I am not surprised, because I think the various Azure teams are the center of gravity for Redmond's innovation right now. The products bear this out, and so do my observations of the product teams' motivation and high morale. It is really good to see this; Microsoft needs to lead somewhere, and it needs to be seen as the underdog while doing so. With Azure, both requirements are in place.
BPOS: Bad Acronym, Easy Setup
BPOS is about products you already know: Exchange, SharePoint, Live Meeting and Office Communications Server. As such, it's hard not to be underwhelmed by BPOS. Until you realize how easy it makes it to get all that stuff set up. I would say that from sign-up to productive use took me about 45 minutes... and that included the time necessary to wrestle with my DNS provider, set Outlook and my smartphone up to talk to the Exchange account, create my SharePoint site collection, and configure the Outlook Conferencing add-in to talk to the provisioned Live Meeting account.
Never before did I think setting up my own Exchange mail could come anywhere close to the simplicity of setting up an SMTP/POP account, and yet BPOS actually made it faster.
What I want from my Azure Christmas Next Year
Not everything about Microsoft's cloud is good. I close this post with a list of things I'd like to see addressed:
- BPOS offerings are still based on the 2007 Wave of Microsoft server technologies. We need to get to 2010, and fast. Arguably, the 2010 products should have been released to the off-premises channel before the on-premises one. Office 365 can't come fast enough.
- Azure's Internet tooling and domain naming is scattered and confusing. Deployed ASP.NET applications go to cloudapp.net; SQL Azure and Azure storage work off windows.net. The Azure portal and Project Houston are at azure.com. Then there's appfabriclabs.com and sqlazurelabs.com. There is a new Silverlight portal that replaces most, but not all, of the HTML ones. And Project Houston is Silverlight-based too, though separate from the Silverlight portal tooling.
- Microsoft is the king of tooling. They should not make me keep an entire OneNote notebook full of portal links, account names, access keys, assemblies and namespaces and do so much CTRL-C/CTRL-V work. I'd like to see more project templates, have them automatically reference the appropriate assemblies, generate the right using/Imports statements and prime my config files with the right markup. Then I want a UI that lets me log in with my Live ID and pick the appropriate project, database, namespace and key string to get set up fast.
- Beta programs, if they're open, should onboard me quickly. I know the process is difficult and everyone's going as fast as they can. But I don't know why it's so difficult or why it takes so long. Getting developers up to speed on new features quickly helps popularize the platform. Make this a priority.
- Make Azure accessible from the simplicity platforms, i.e. ASP.NET Web Pages (Razor) and LightSwitch. Support .NET 4 now. Make WebMatrix, IIS Express and SQL Compact work with the Azure development fabric. Have HTML helpers make Azure programming easier. Have LightSwitch work with SQL Azure and not require SQL Express. LightSwitch has some promising Azure integration now. But we need more. WebMatrix has none and that's just silly, now that the Extra Small Instance is being introduced.
- The Windows Azure Platform Training Kit is great. But I want Microsoft to make it even better and I want them to evangelize it much more aggressively. There's a lot of good material on Azure development out there, but it's scattered in the same way that the platform is. The Training Kit ties a lot of disparate stuff together nicely. Make it known.
Should Old Acquaintance Be Forgot
All in all, diving deep into Azure was a good way to end the year. Diving deeper into Azure should be a great way to spend next year, not just for me, but for Microsoft too.
Posted on 01/03/2011 at 1:15 PM | 1 comment
The news hit Monday morning that Google has decided to delay the release of its Google TV platform, and has asked its OEMs to delay any products that embed the software. Coming just about two weeks prior to the 2011 Consumer Electronics Show (CES), Google's timing is about the worst imaginable. CES is where the platform should have had its coming out party, especially given all the anticipation that has built up since the initial announcement came seven months ago.
At last year's CES, it seemed every consumer electronics company had fashioned its own software stack for Internet-based video programming and applications/widgets on its TVs, optical disc players and set top boxes. In one case, I even saw two platforms on a single TV set (one provided by Yahoo! and the other one native to the TV set).
The whole point of Google TV was to solve this problem and offer a standard, embeddable platform. But that won't be happening, at least not for a while. Google seems unable to get it together, and more proprietary approaches, like Apple TV, don't seem to be setting the world of TV-Internet convergence on fire, either.
It seems to me that, when it comes to building a "TV operating system," Windows Media Center is still the best of a bad bunch. But it won't stay so for much longer without some changes. Will Redmond pick up the ball that Google has fumbled? I'm skeptical, but hopeful. Regardless, here are some steps that could help Microsoft make the most of Google's faux pas:
- Introduce a new Media Center version that uses Xbox 360, rather than Windows 7 (or 8), as the platform. TV platforms should be appliance-like, not PC-like. Combine that notion with the runaway sales numbers for Xbox 360 Kinect, and the mass appeal it has delivered for Xbox, and the switch from Windows makes even more sense. As I have pointed out before, Microsoft's Xbox implementation of its Mediaroom platform (announced and demoed at last year's CES) gets Redmond 80% of the way toward this goal. Nothing stops Microsoft from going the other 20%, other than its own apathy, which I hope has dissipated.
- Reverse the decision to remove Drive Extender technology from Windows Home Server (WHS), and create deep integration between WHS and Media Center. I have suggested this previously as well, but the recent announcement that Drive Extender would be dropped from WHS 2.0 creates the need for me to a) join the chorus of people urging Microsoft to reconsider and b) reiterate the importance of Media Center-WHS integration in the context of a Google compete scenario.
- Enable Windows Phone 7 (WP7) as a Media Center client. This would tighten the integration loop already established between WP7, Xbox and Zune. But it would also counter Echostar/DISH Network/Sling Media, strike a blow against Google/Android (and even Apple/iOS) and could be the final strike against TiVO.
- Bring the WP7 user interface to Media Center and Kinect-enable it. This would further the integration discussed above and would be appropriate recognition of WP7's Metro UI having been built on the heritage of the original Media Center itself. And being able to run your DVR even if you can't find the remote (or can't see its buttons in the dark) could be a nifty gimmick.
Microsoft can do this but its consumer-oriented organization -- responsible for Xbox, Zune and WP7 -- has to take the reins here, or none of this will likely work. There's a significant chance that won't happen, but I won't let that stop me from hoping that it does and insisting that it must. Honestly, this fight is Microsoft's to lose.
Posted on 12/21/2010 at 1:15 PM | 1 comment
Today, at the Silverlight Firestarter event on the Microsoft Campus in Redmond, Scott Guthrie (Microsoft's Corporate Vice President, .NET Developer Platform) announced Silverlight 5 and introduced us to its features. Among them:
- 1080p video, GPU-acceleration, "trickplay" (variable speed playback with pitch correction)
- WS-Trust support, low-latency networking for real time apps
- Ability to set breakpoints on data binding expressions and debug them
- Merging of WPF data binding features into Silverlight
- <LoadTransition> tag in XAML, letting you do fly-ins without storyboards
- Style setting "binding"
- Multi-column text flow, font tracking/leading, vector-based PostScript printing API
- Pivot control, now to be built into the Silverlight SDK
- Immediate Mode Graphics API w/ GPU-accelerated 3D support
- P/Invoke for low-level API programming
- 64-bit version of SL Runtime
For more details, read this post on Silverlight team member Tim Heuer's blog.
It's a lot of stuff. And as to what this all means to the prognosis for Silverlight's continued existence among the living, I would say the technology will be with us for a while. Of course, part of the reason for that longevity is that the Beta for Silverlight 5 won't make an appearance until the Spring, and RTM/RTW won't be until later next year. But that's fine.
If Silverlight seems off its heretofore aggressive release "cadence," at least part of the reason is that the version for Windows Phone 7 (a version that a lot of people do not count) shipped only recently. And if the wait seems long, realize that normally the Silverlight team wouldn't talk to us until they were almost ready with a CTP. But given the PDC brouhaha, the team knew they had to brief the developer community. I wouldn't wish it any other way.
But let's put aside the SL5 vs. HTML5 debate for a moment; the battle of the 5s seems less urgent now. Let's instead consider what the SL4 demos that were shown, and the SL5 features that were announced, really mean for the product.
Here's the message I got today: Silverlight is a technology for building serious business applications, and building them under the highly productive tooling regime of Visual Studio. Yes, Silverlight apps can look subtle, futuristic and kiosk-like. Yes, Silverlight is a venerable media platform. But also, and perhaps above all, Silverlight is Microsoft's rich client platform. It lets you build line-of-business applications. They can include powerful data visualization capabilities. They can be highly data-connected. They can be tested and debugged efficiently. They can be re-skinned programmatically.
Whether you can or cannot create similar applications in HTML5, given enough time, is beside the point. Because in business you're probably not "given enough time." You have an application to write; it's got to be written quickly. It's got to run mostly bug-free, and the few bugs that get out have to be easily diagnosed and corrected. And while these apps do not have to look like consumer-oriented kiosk installations, they should still have some punch: nice transitions, good use of color and graphics, and anything else that will make them fun enough and engaging enough to get users to adopt them and enjoy using them. And this has to be do-able under time and budget constraints that are tough and getting tougher.
That's what Silverlight's for. One day, you'll be able to do that with HTML. That day's not here. Not yet.
Posted on 12/02/2010 at 1:15 PM | 1 comment
With all the noise (not to mention the funk) around HTML5 and Silverlight at PDC 2010, we could be forgiven for missing the numerous keynote announcements about Windows Azure and SQL Azure. And with all those announcements, we could be forgiven for missing the ostensibly more arcane stuff in the breakout sessions. But one of those sessions covered material that was very important.
That session was Lev Novik's "Building Scale-out Database Applications with SQL Azure," and given the generic title, we could be further forgiven for not knowing how important the session was. But it was important. Because it covered the new Federation (aka "sharding") features coming to SQL Azure in 2011. Oh, and by the way, you'd also be forgiven for not knowing what sharding was. Perhaps a little recent history would help.
When SQL Azure was first announced, its databases were limited in size to 10GB, in the pro version of the service. That's big enough for lots of smaller Web apps, but not for bigger ones, and definitely not for Data Warehouses. Microsoft's answer at the time to criticism of this limitation was that developers were free to "shard" their databases. Translation: you could create a bunch of Azure databases, treat each one as a partition of sorts, and take it upon yourself to take the query you needed to do, and divide it up into several sub queries -- each one executing against the correct "shard" -- and then merge all the results back again.
To be honest, while that solution would work and has architectural merit, telling developers to build all that plumbing themselves was pretty glib, and not terribly actionable. Later, Microsoft upped the database size limitation to 50GB, which mitigated the criticism, but it didn't really fix the problem, so we've been in a bit of a holding pattern. Sharding is sensible, and even attractive. But the notion that a developer would have to build out all the sharding logic herself, in any environment that claims to be anything "as a service," was far-fetched at best.
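To make the do-it-yourself burden concrete, here is a toy sketch of the plumbing developers were being asked to build, with each shard reduced to an in-memory list of rows standing in for a separate SQL Azure database:

```python
# Each "shard" stands in for a separate SQL Azure database holding part of
# the data; here they are just in-memory lists of (customer_id, amount) rows.
shards = [
    [(1, 100), (2, 50)],   # shard holding customers 1-2
    [(3, 75), (4, 25)],    # shard holding customers 3-4
]

def query_all_shards(shards, predicate):
    # Fan the same sub-query out to every shard...
    partial_results = [row for shard in shards
                       for row in shard if predicate(row)]
    # ...then merge the partial rowsets back into a single result,
    # which is exactly the work the application had to do itself.
    return sorted(partial_results)
```

Even this trivial version ignores routing writes to the right shard, cross-shard transactions and rebalancing; multiply it across every query in an app and the "just shard it yourself" advice starts to look as glib as it was.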
That's why Novik's session was so important. In it, he outlined the explicit sharding support, to be known as Federation, coming to SQL Azure in 2011, complete with new T-SQL keywords and commands like CREATE/USE/ALTER FEDERATION and CREATE TABLE...FEDERATE ON. Watch the session and you'll see how elegant and simple this is. Effectively, everything orbits around a Federation key, which corresponds to a specific key in each table and, in turn, determines which part of those tables' data goes in which Federation member ("shard"). Once that's set up, queries and other workloads are automatically divided and routed for you, and the results are returned to you as a single rowset. That's how it should have been all along. Never mind that now.
Sharding does more than make the 50GB physical limit become a mere detail, and the logical size limit effectively go away. It also makes Azure databases more elastic, since shards can become more or less granular as query activity demands. If a shard is getting too big or queried too frequently, it can be split into two new shards. Likewise, previously separate shards can be consolidated, if the demand on them decreases. With SQL Azure Federation, we really will have Database as a Service... instead of Database as a Self-Service.
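The split operation described above can be pictured with a small sketch (my own illustration, not Microsoft's algorithm): a shard holding a key range is divided at a median key, with each row landing in exactly one of the two new shards:

```python
def split_shard(shard):
    # Split one shard (a dict of federation key -> row) into two new
    # shards at the median key, the way a federation split redistributes
    # a key range when a member gets too big or too hot.
    keys = sorted(shard)
    mid = keys[len(keys) // 2]
    low = {k: v for k, v in shard.items() if k < mid}
    high = {k: v for k, v in shard.items() if k >= mid}
    return low, high
```

The merge case is just the inverse: two adjacent key ranges recombined into one member when demand drops.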
But it goes beyond that. Because with Federation, SQL Azure gains a highly popular feature of so-called document-oriented NoSQL databases like MongoDB, which features sharding as a foundational feature. Plus, SQL Azure's soon-to-come support for decomposing a database-wide query into a series of federation member-specific ones, and then merging the results back together, starts to look a bit like the MapReduce processing in various NoSQL products and Apache Hadoop. When you add to the mix SQL Azure's OData features and its support for a robust RESTful interface for database query and manipulation, suddenly staid old relational SQL Azure is offering many of the features people know and love about their NoSQL data stores.
While NoSQL databases proclaim their accommodation of "Internet scale," SQL Azure seems to be covering that as well, while not forgetting the importance of enterprise scale, and enterprise compatibility. Federating (ahem!) the two notions of scale in one product seems representative of the Azure approach overall: cloud computing, with enterprise values.
Posted on 11/23/2010 at 1:15 PM | 1 comment
Readers of my blog know that I am a big fan of PowerPivot. And because of that, I was excited to see, both at June's Microsoft BI Conference (part of Tech Ed) and this week's PASS Summit, that the PowerPivot technology will become more entrenched. What Microsoft showed in June was that the full-fledged SQL Server Analysis Services (SSAS) product would support the in-memory BI (IMBI) engine, known within Microsoft as VertiPaq.
This cool, columnar, in-memory technology will not be limited to personal and departmental BI solutions that live in Excel and SharePoint; instead, it will be available for Enterprise BI implementations too. As it should be.
But it gets cooler. What Amir Netz showed at PASS this week is that we will be able to open up an Excel workbook (that has a PowerPivot model in it) in the next version of the BI tools in Visual Studio, and have it reverse engineer the model, and then allow us to deploy the model to Analysis Services. Now we really have a workflow going, wherein "user-generated" BI solutions can be upsized to Enterprise ones, and in a way where the design can be refactored and optimized, rather than simply moved wholesale. It also means that even Enterprise BI pros can use PowerPivot as a tool for prototyping.
We're still not done. Not by a long shot. Because Netz, the father of Analysis Services, and his team have done something truly groundbreaking. They've voluntarily, indeed enthusiastically, made it possible for relational databases to take advantage of some of this cool IMBI technology. The next version of SQL Server, currently code-named "Denali" and available as a CTP1 release as of this week, will allow for the creation of so-called columnar indexes on relational tables.
In other words, the SQL and BI teams have worked together to implement VertiPaq's technology within the core SQL Server relational engine. This means the in-memory speed and high rates of compression currently implemented by PowerPivot, will also be available in relational databases. And the speed gains, while fastest in PowerPivot and the Denali version of SSAS, queried with MDX and DAX, will now benefit relational databases that are queried with SQL.
So now SQL Server will become a columnar database, but not in a way that makes it a niche product. Instead, it will continue to be the powerhouse it has grown to be, with mainstream appeal, and it will bring columnar technology to potentially all of its users. And while analytical work will still best be done with Analysis Services, the ability to perform aggregation queries directly against a data warehouse, with impressively fast response times, will be in mainstream users' hands too. I never thought BI for the masses would manifest itself this way. But the more I think about it, the more profoundly sensible the tactic seems.
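A toy example shows why storing data by column compresses so well and speeds up aggregation; this is only the simplest encoding (run-length), not a claim about VertiPaq's actual internals:

```python
def rle_encode(column):
    # A column of a low-cardinality attribute tends to contain long runs of
    # repeated values; collapse each run into a (value, run_length) pair.
    runs = []
    for value in column:
        if runs and runs[-1][0] == value:
            runs[-1] = (value, runs[-1][1] + 1)
        else:
            runs.append((value, 1))
    return runs

def count_value(runs, value):
    # An aggregate like COUNT can be answered from the tiny run list
    # without ever touching the original rows.
    return sum(length for v, length in runs if v == value)
```

Six stored values become two pairs, and the aggregate scans the pairs, not the rows; that, in caricature, is the columnar bargain.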
I thought relational technology was completely mature and that all the action was on the BI side. The SQL Server team is showing us that such a dichotomy may be all wrong. The action is on the BI side and that, in turn, means relational database innovation continues.
Posted on 11/12/2010 at 1:15 PM | 0 comments
I didn’t go to the Microsoft Professional Developers Conference (PDC) this year because it was, as far as I could tell, a made-for-streaming video event. As such, I watched the keynote about 24 hours after it took place and used my Media Center PC to watch it on my plasma television. And I have to say, the keynote was worthy of the medium. Not only did the Silverlight Smooth Streaming technology deliver a fine HD image, but the content of the keynote itself, merited a big screen, and necessitated the ability to pause, rewind and play back.
Sure, the first part of the keynote, focusing on Internet Explorer 9 and HTML5, then Windows Phone 7, was useful and important. And the juxtaposition of HTML5 in the browser with Silverlight on Windows Phone was striking, but the real showstopper (the good kind) of the keynote was Bob Muglia’s presentation of all the new services and features being added to Windows Azure, SQL Azure and Windows Azure App Fabric. As a character in David Lynch’s "Twin Peaks" might say: "Wow, Bob, Wow."
There was already a lot of speculation that Microsoft would be adding Infrastructure as a Service (IaaS) features to Azure, both for a better competitive story against Amazon Web Services, and also to treat Microsoft developers like grownups, letting them configure their cloud environments with the same degree of control they have in their on-premises environments. The speculation was correct, but as it turns out, it barely scratched the surface of what we got.
The Azure cloud will soon offer the following dizzying array of capabilities:
- Remote Desktop (RDP) into Azure role instances
- Access to the full suite of Internet Information Services (IIS) capabilities (including multiple sites per Web role)
- Admin rights on Azure role instances for one or more users
- The ability to deploy and spin up numerous instances of Hyper-V virtual machine images (which can be composed of base and difference disk VHD volumes) that you build yourself, and RDP connectivity into those instances as well.
- VPN connectivity between Azure role instances and your on-premises infrastructure (the former project "Sydney")
- Windows Azure AppFabric Access Control services providing authentication via not only Active Directory Federation Services, but also Windows Live, Yahoo, Facebook and…it appeared…Google.
- Brief mention of integration between the Azure AppFabric Service Bus and BizTalk Server (I would love to know more about this)
- Windows AppFabric caching (formerly known as "Velocity") capabilities added to Windows Azure AppFabric
- SQL Server Reporting Services added to SQL Azure
- An online store, called Windows Azure Marketplace, and within it, the "DataMarket" which is the official brand for what had been called project "Dallas."
- The Windows Azure AppFabric Composition Model, which lets you combine Windows Azure roles, SQL Azure databases, hosted Windows Workflows, Azure AppFabric Access Control services and more into a single unit of deployment and configuration. The composition and configuration is done through a visual designer.
- A new low-priced "Extra Small Instance" offering that will allow developers and hobbyists to work with the real Azure environment for a price of $.05/compute hour (which by my calculations comes out to $37.20 for a 31-day compute month).
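The per-month figure in that last bullet checks out with simple arithmetic, done here in whole cents to avoid floating-point noise:

```python
# Extra Small Instance: 5 cents per compute hour, over a 31-day month
cents_per_hour = 5
hours_per_month = 24 * 31                      # 744 compute hours
total_cents = cents_per_hour * hours_per_month # 3720 cents
print(total_cents / 100)                       # 37.2, i.e., $37.20
```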
As if the above list were not enough, Muglia and his team (including Mark Russinovich and Don Box) showed highly relevant, end-to-end demos of how all these services and features integrate with each other and into Visual Studio. Brian Harry even showed a sneak preview of an Azure-based implementation of Team Foundation Server and connected to it from Visual Studio as well.
My column on the soundness of Microsoft's cloud strategy came out on October 1, and in it I made clear that I was impressed. Not quite a month later, I continue to be impressed, to say the least. When all of these features become generally available, Azure will offer a cloud continuum, ranging from the most granular, hands-on experience for developers and IT pros who need and want it, to nicely abstracted Platform as a Service (PaaS) instances that technologists can treat as a black-box deployment target. Scaling out involves a few clicks in a (much improved) Web console; connecting the cloud assets to your on-premises databases and file shares is done in the same console, and a single button click lets you RDP into just about anything.
As the title of this post alludes to, this is the kind of maturity that Microsoft generally does not add until the third version of a product. But unless the demos were completely cooked, it sure looks like the Azure teams aimed that high, and got there, for v2. Good for them; it was time to discard the 3.0 precedent.
What I found most striking of all, though, was the way this next generation of Azure brings together so many parts of the Microsoft stack and repurposes them for significant added value. Hyper-V, RDP, Windows Server, ASP.NET, LINQ, OData, SQL Server (including Reporting Services and Data Tier Applications), AppFabric caching and Visual Studio. They all move fluidly from on-prem to cloud.
If all this info makes your head spin, you might want to read through the Azure roadmap.
After my head stopped spinning, I realized that all this new stuff was actually rooted in something a bit older, and that once again, Microsoft's cloud story is faithful to the company's core strengths, products and strategies. The Windows Azure AppFabric Composition Model made me think back to the vision in Microsoft’s Dynamic Systems Initiative (DSI), introduced almost exactly seven years ago (and articulated to the press by Bob Muglia, no less). The architecture diagrams (originally project "Whitehorse") which were introduced two years later as part of the Architect SKU of the original Visual Studio 2005 Team System product, were an early manifestation of DSI, but they weren’t very successful. And yet the Composition Model designer that was shown at the PDC keynote bore a strong resemblance to the old diagram designer, and may well enjoy the success that the Whitehorse team originally hoped for. Mr. Muglia must have taken some pride in that, and so should the whole company. Wow, Bob, Wow.
Posted on 10/29/2010 at 1:15 PM | 1 comment