News

New SQL, Hadoop Management Products Unveiled

Microsoft CEO Satya Nadella introduces Azure Intelligent Systems Service and Analytics Platform System.

Data continues to get bigger and more pervasive, and shows no signs of slowing down. That makes it more critical than ever to find new ways to corral and control it, and Microsoft just announced several new offerings toward that end.

In addition to the recently released SQL Server 2014, the company unveiled two new data platform products in San Francisco yesterday: the Microsoft Azure Intelligent Systems Service and the Analytics Platform System.

The Intelligent Systems Service (ISS) embraces the Internet of Things (IoT) for data analytics, while the Analytics Platform System (APS) allows queries across traditional SQL data warehouses and Hadoop stores.

CEO Satya Nadella introduced the products while emphasizing the importance of data in today's world of ubiquitous computing and ambient intelligence. Microsoft is evolving to bridge a technology gap between huge amounts of data -- the "data exhaust" generated from ubiquitous computing -- and the ambient intelligence derived from that data that can improve business, he said. Microsoft wanted to communicate how it provides the "magic" to bridge that gap.

Microsoft CEO Satya Nadella at the event

To that end, the ISS helps users embrace the IoT by securely connecting to machine-generated data from sources such as devices and sensors, then capturing and managing that data to derive business value, regardless of the OS being used. Microsoft said the service lets users analyze the data with familiar tools such as HDInsight and Power BI for Office 365, and that it works out of the box, surfacing actionable insights from the data collected by any devices in an organization.

"This makes it possible for you now to be able to collect data from all the sensors and servers, bring that into cloud, so you can take advantage of the rest of the data platform to be able to do the analysis, the machine learning," Nadella said in his address. "And so that is a service that I think will take out all the friction that exists in being able to connect the cloud to the Internet of Things trend that is only going to increase."

Users can request to take part in the limited preview of the service.

The other new product, the APS, is now generally available. Microsoft described the product as an "evolution of the Parallel Data Warehouse" with the new capability to query across data stored in a traditional relational data warehouse and also data stored in a Hadoop region, either in an on-premises turnkey appliance or in a separate Hadoop cluster.

The new Analytics Platform System

Nadella noted that central to this new "Big Data in a box" technology is the PolyBase project developed from research done by David DeWitt and his team.

APS, Nadella said, "for the very first time brings together the Parallel Data Warehouse of SQL with a Hadoop region. So that means now you have the ability in one affordable appliance to do queries across both of these. You can imagine the scenarios. You can imagine having data from your transaction systems, your log information from your servers and Web sites, as well as social streams and the ability now to be able to query across all of this."
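To make that scenario concrete, the hypothetical T-SQL sketch below shows the general shape of such a cross-store query: Hadoop-resident click-stream data is exposed to the appliance as a PolyBase external table and joined with a relational sales table. The table names, columns and HDFS location are invented for illustration, and the exact external-table syntax varies by PolyBase and APS release.

-- Hypothetical sketch: expose click-stream data stored in the Hadoop region
-- as an external table, then query it alongside relational warehouse data.
-- Names, columns and the HDFS path are illustrative only.
CREATE EXTERNAL TABLE dbo.WebClickStream
(
    EventDate  DATE,
    CustomerId INT,
    Url        VARCHAR(200)
)
WITH
(
    LOCATION = 'hdfs://hadoop-head-node:8020/logs/clickstream/',
    FORMAT_OPTIONS (FIELD_TERMINATOR = '|')
);

-- A single T-SQL query can then span the SQL data warehouse and the Hadoop store.
SELECT s.CustomerId,
       SUM(s.SalesAmount) AS TotalSales,
       COUNT(c.Url)       AS PageViews
FROM dbo.FactSales AS s
JOIN dbo.WebClickStream AS c
    ON c.CustomerId = s.CustomerId
GROUP BY s.CustomerId;

Because the external table behaves like any other table to the query engine, the join, aggregation and any predicates can be expressed in ordinary T-SQL rather than as hand-written Hadoop jobs.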

Quentin Clark, corporate vice president of the Data Platform Group, also presented at the event and wrote an accompanying blog post about APS, in which he stated, "SQL Server has seamless integration with VMs in Azure to provide secondaries for high availability and disaster recovery. The data people access in the business intelligence experience comes through Excel from their own data and partner data -- and Power BI provides accessibility to wherever the data resides."

The Microsoft team also expounded on the new capabilities of SQL Server 2014, recently made generally available. The primary addition is in-memory technology, which can deliver immediate and substantial performance gains.

Microsoft's Julie Strauss joined Clark to walk through a scenario involving a fictional e-commerce company called Fabrikam. The regular sales Web site suffered some performance problems, taking about seven seconds to load a user's recommendations derived from behavior. With just a few keystrokes, Strauss applied in-memory optimization to the underlying SQL Server 2014 table and reported a 7x increase in performance, bringing the recommendations up in about one second. She memory-optimized other tables as well, gaining a 23x performance boost in concurrent purchases and a 71x improvement in the time needed to update the best-seller list.

In-memory e-commerce Web site performance optimization

Clark said in-memory transaction processing optimization, which generally delivers 30x performance gains in throughput and latency, is now available for every workload running in SQL Server 2014, the first RDBMS product to offer that capability.
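As a rough idea of what such an optimization involves, the hedged sketch below defines a memory-optimized table in SQL Server 2014. The table and column names are invented, and the database must already contain a MEMORY_OPTIMIZED_DATA filegroup before such a table can be created.

-- Hypothetical sketch of a memory-optimized table in SQL Server 2014.
-- Table and column names are illustrative only; the database needs a
-- MEMORY_OPTIMIZED_DATA filegroup before this statement will succeed.
CREATE TABLE dbo.ProductRecommendations
(
    CustomerId INT   NOT NULL,
    ProductId  INT   NOT NULL,
    Score      FLOAT NOT NULL,
    CONSTRAINT PK_ProductRecommendations
        PRIMARY KEY NONCLUSTERED HASH (CustomerId, ProductId)
        WITH (BUCKET_COUNT = 1048576)
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

Queries against such a table are written in ordinary T-SQL, which is why a workload like the Fabrikam recommendations page in Strauss' demo can often switch to the optimized table with little change to the queries themselves.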

"In-memory technology has been allowing users to manipulate millions of records at the speed of thought, and scaling analytics solutions to billions of records in SQL Server Analysis Services," Clark wrote in his accompanying blog post.

Throughout his address, Nadella emphasized a "data culture" at Microsoft, which allows insights to come from anyone, anywhere, at any time, as long as they have the right tools to work with. Fostering that data culture in other organizations lets employees do great things, he said, and embodying it is of extreme importance at the company; he went so far as to call it "the most paramount thing inside of Microsoft." This data culture, in which every engineer leverages usage data to test and improve products, is "the lifeblood of Microsoft," Nadella said. "In every aspect, you have to build deeply into the fabric of the company a culture that thrives on data," he said.

Nadella also wrote a blog post about his address, in which he concluded, "We are all experiencing the explosion of data driven by ubiquitous computing. We all crave easier and faster ways to turn that data into fuel for insight, and to realize the potential of ambient intelligence for every individual and every organization. Today marks a big step forward, and we're going to keep moving quickly."

About the Author

David Ramel is an editor and writer at Converge 360.
