Data Driver


More Details on SQL Server 2014 In-Memory Capabilities

More details are emerging about in-memory capabilities in the new SQL Server 2014, announced at the recent TechEd 2013 conference.

The first Community Technology Preview is expected to be released soon, possibly this month, and you can register with Microsoft to be notified of its availability.

Highlights of the new release are data warehousing and business intelligence (BI) enhancements made possible through new in-memory capabilities built into the core relational database management system (RDBMS). With memory prices falling dramatically, 64-bit architectures becoming more common and multicore servers proliferating, Microsoft has sought to tailor SQL Server to take advantage of these trends.

The in-memory Online Transaction Processing (OLTP) capability--formerly known by the codename Hekaton--lets developers boost performance and reduce processing time by declaring tables as "memory optimized," according to a whitepaper (PDF download) titled "SQL Server In-Memory OLTP Internals Overview for CTP1."
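For illustration only, here is a minimal sketch of what declaring a memory-optimized table looks like in the CTP-era syntax the whitepaper describes; the database, table and file path names are hypothetical, and the exact syntax could still change before release:

    -- The database first needs a filegroup for memory-optimized data
    -- (names and the directory path are illustrative).
    ALTER DATABASE SalesDB ADD FILEGROUP SalesDB_mod CONTAINS MEMORY_OPTIMIZED_DATA;
    ALTER DATABASE SalesDB ADD FILE (NAME = 'SalesDB_mod', FILENAME = 'C:\Data\SalesDB_mod')
        TO FILEGROUP SalesDB_mod;
    GO

    -- Declare the table itself as memory optimized; SCHEMA_AND_DATA keeps the
    -- rows durable across a restart.
    CREATE TABLE dbo.SalesOrder
    (
        OrderID   INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
        OrderDate DATETIME2 NOT NULL,
        Amount    MONEY NOT NULL
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

The DURABILITY setting is the notable design choice here: SCHEMA_AND_DATA keeps the data recoverable after a restart, while SCHEMA_ONLY trades durability for even lower overhead.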

"Memory-optimized tables are stored completely differently than disk-based tables and these new data structures allow the data to be accessed and processed much more efficiently," Kalen Delaney wrote in the whitepaper. "It is not unreasonable to think that most, if not all, OLTP databases or the entire performance-sensitive working dataset could reside entirely in memory," she said. "Many of the largest financial, online retail and airline reservation systems fall between 500GB to 5TB with working sets that are significantly smaller."

"It’s entirely possible that within a few years you’ll be able to build distributed DRAM-based systems with capacities of 1-10 Petabytes at a cost less than $5/GB," Delaney continued. "It is also only a question of time before non-volatile RAM becomes viable."

Another new in-memory benefit is "new buffer pool extension support to non-volatile memory such as solid state drives (SSDs)," according to a SQL Server Blog post. This will "increase performance by extending SQL Server in-memory buffer pool to SSDs for faster paging."
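As a rough sketch of how that looks (the file path and size below are placeholders, not recommendations), enabling the extension is a single server-level configuration change:

    -- Point the buffer pool extension at a file on a local SSD volume.
    ALTER SERVER CONFIGURATION
    SET BUFFER POOL EXTENSION ON
        (FILENAME = 'E:\SSDCACHE\BufferPoolExt.BPE', SIZE = 64 GB);

    -- Turn it off again if needed.
    ALTER SERVER CONFIGURATION SET BUFFER POOL EXTENSION OFF;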

Independent database expert Brent Ozar expounded on this subject, writing "SQL Server 2014 will automatically cache data [on SSDs] with zero risk of data loss."

"The best use case is for read-heavy OLTP workloads," Ozar continued. "This works with local SSDs in clusters, too--each node can have its own local SSDs (just like you would with TempDB) and preserve the SAN throughput for the data and log files. SSDs are cheap, and they’re only getting cheaper and faster."

Other in-memory features mentioned by the Microsoft SQL Server team include "enhanced in-memory ColumnStore for data warehousing," which supports real-time analytics and "new enhanced query processing" that speeds up database queries "regardless of workload."
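The data warehousing piece builds on the updateable clustered columnstore index; as a minimal, hypothetical example (the fact table name is made up), converting a table to the columnstore format is a single statement:

    -- Store the (hypothetical) fact table in the compressed, in-memory-friendly
    -- columnstore format aimed at data warehousing queries.
    CREATE CLUSTERED COLUMNSTORE INDEX CCI_FactSales
        ON dbo.FactSales;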

Some readers expressed enthusiasm for the new features, but, of course, wanted more. "Ok the in-memory stuff (specifically OLTP and SSD support) is valuable but the rest is so so," read one comment from a reader named John on the Microsoft blog post. "Really I wish that we would see continued improvements in reporting and analysis services and in general less dependence on SharePoint which is a painful platform to manage. QlikView and Tableau are a real threat here."

Besides the in-memory capabilities, Microsoft also emphasized increased support for hybrid solutions where, for example, a company might have part of its system on-premises because of complex hardware configurations that don't lend themselves to hosting in the cloud. These companies can then use the cloud--Windows Azure--for backup, disaster recovery and many more applications. You can read more about that in this whitepaper (also a PDF download).
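For the backup scenario specifically, SQL Server can write database backups straight to Windows Azure Blob storage; a hedged sketch, with placeholder storage account, container and key values:

    -- Credential holding the Windows Azure storage account name and access key
    -- (both values are placeholders).
    CREATE CREDENTIAL AzureBackupCredential
        WITH IDENTITY = 'mystorageaccount',
             SECRET   = '<storage access key>';

    -- Back the database up directly to a blob container in Windows Azure.
    BACKUP DATABASE SalesDB
        TO URL = 'https://mystorageaccount.blob.core.windows.net/backups/SalesDB.bak'
        WITH CREDENTIAL = 'AzureBackupCredential', COMPRESSION;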

What do you think of the new in-memory capabilities of SQL Server 2014? Comment here or drop me a line.

Posted by David Ramel on 06/13/2013

