News

Microsoft Demos 'Kilimanjaro'

"Kilimanjaro" and "Madison" will enable centralized application- and systems-management and massive parallel processing in SQL Server.

Microsoft showcased new functionality for SQL Server that will let developers and DBAs centrally manage applications and resources. The new pooling feature is planned for a business intelligence-focused release of SQL Server, code-named "Kilimanjaro."

Microsoft introduced the centralized application- and systems-management capability last month at the annual Professional Association for SQL Server (PASS) Conference, held in Seattle. In the keynote address, Ted Kummert, corporate vice president of Microsoft's Data and Storage Platform Division, highlighted the capability as the latest component in the company's effort to bolster SQL Server as a platform for enterprise-scale application development.

The pooling feature will be a key requirement as organizations seek to run database-management systems in virtualized environments, says Forrester Research Inc. analyst Noel Yuhanna. "The new, centralized application-management capability will make DBAs more productive and improve the overall efficiency of SQL Server deployments," he explains.

The new functionality fits with the company's strategy to enable massively parallel processing in SQL Server with a technology under development, code-named "Madison." Slated for release during the Kilimanjaro time frame, Madison is based on technology from Microsoft's acquisition of data warehouse appliance vendor DATAllegro. It's designed to enable organizations to scale data warehouses to hundreds of terabytes. A preview of Madison is expected within the next 12 months.

"Having built a number of parallel database systems in the past, I think we can offer something when it comes to optimization of queries that will allow us to scale even higher," says David DeWitt, a Microsoft technical fellow.

Interim Release
With Kilimanjaro, which is planned for release in the first half of 2010, Microsoft says customers will be able to consolidate data sources and increase the amount of information in the repository without degrading performance.

Microsoft revealed its SQL Server roadmap in October at its Business Intelligence Conference, also held in Seattle, describing Kilimanjaro as the next key update to SQL Server 2008 (see "Microsoft Outlines Next-Gen Databases," Nov. 1, 2008).

Kilimanjaro is not the next major release of SQL Server, however. According to Quentin Clark, general manager of Microsoft's SQL Server Group, the next major version will come within 36 months of the release of SQL Server 2008, which shipped this past summer. "Kilimanjaro is really an add-on kind of release," Clark says. "We'll do bug fixes and other minor stuff, but it's not a major release of the database engine."

That said, Clark acknowledges that the engine in SQL Server will be improved with Kilimanjaro. At the Windows Hardware Engineering Conference (WinHEC) in Los Angeles earlier this month, Clark demonstrated support for systems with more than 64 cores.

Microsoft says it's working with Hewlett-Packard Co., IBM Corp. and Intel Corp. to enable Kilimanjaro to support up to 256 logical processors running on the next release of Windows Server, Windows Server 2008 R2.

Still Early for SQL Services
Also at PASS, Microsoft officials talked up SQL Data Services (SDS), now part of the company's Azure cloud services portfolio, which debuted at the company's Professional Developers Conference in late October.

Developers shouldn't expect a replica of SQL Server, cautioned Telerik Corp. Chief Strategy Officer and Microsoft Regional Director Stephen Forte, speaking at a NYC .NET User Group meeting last month. "SQL Data Services is a little disappointing," Forte told the group, saying that adding tables, rows, columns and joins is complicated with the current SDS test build. "I have to do all these extra hoops; I might as well put SQL Server up in the cloud somewhere, open port 1433 and talk to it."

The problem is that SDS uses generic object attributes that are held in "one big massive table that holds everything," Forte said. "It needs to be more like SQL Server, not this big entity thing in the cloud. It's probably not what they're going to go to market with -- we'll see."
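To make the contrast Forte is drawing concrete, the rough Python sketch below compares a conventional relational layout, where typed tables are related with a join, to a single flexible bag of generic entities that has to be filtered and correlated by hand. The entity shapes and the query helper are hypothetical and do not reflect the actual SDS interface; they only illustrate why the generic-attribute model can feel like extra hoops.

```python
# Hypothetical sketch contrasting a typed relational layout with a single
# flexible "property bag" of entities. These names and the query helper are
# illustrative only and are not the actual SDS API.

# Relational style: fixed schemas per table, relationships resolved by a join.
customers = [{"id": 1, "name": "Contoso"}]
orders = [{"order_id": 10, "customer_id": 1, "total": 99.0}]
joined = [
    {**order, "customer_name": cust["name"]}
    for order in orders
    for cust in customers
    if order["customer_id"] == cust["id"]
]
print(joined)

# Flexible-entity style: every entity sits in one big bag of name/value
# properties, with its "kind" as just another property. There is no schema
# and no join operator; relationships have to be walked in application code.
entities = [
    {"Id": "c1", "Kind": "Customer", "Name": "Contoso"},
    {"Id": "o10", "Kind": "Order", "CustomerId": "c1", "Total": 99.0},
]

def query(kind, **filters):
    """Scan the single entity bag for entities matching the given properties."""
    return [
        e for e in entities
        if e.get("Kind") == kind
        and all(e.get(k) == v for k, v in filters.items())
    ]

# The "join" is done by hand, one lookup at a time.
for order in query("Order"):
    customer = query("Customer", Id=order["CustomerId"])[0]
    print(order["Id"], customer["Name"], order["Total"])
```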

About the Author

Jeffrey Schwartz is editor of Redmond magazine and also covers cloud computing for Virtualization Review's Cloud Report. In addition, he writes the Channeling the Cloud column for Redmond Channel Partner. Follow him on Twitter @JeffreySchwartz.
