It looks like there will be another test release of Microsoft's in-memory data caching software, code-named Project Velocity. Microsoft today said it will release a fourth Community Technology Preview (CTP) in mid-September, which means the company will miss its goal of a summer release to manufacturing.
Announced a year ago, Project Velocity is designed to provide scalable performance for data-driven applications by reducing the number of calls the app has to make to the data source. According to Microsoft, it offers .NET applications high-speed access to data via partitioned, replicated or local caches. It does so by fusing memory across multiple servers to present a single, unified cache view to apps.
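Microsoft describes the partitioned-cache design only at a high level, but the core idea is straightforward: each key is deterministically hashed to one of several cache nodes, while callers see a single put/get interface. The sketch below illustrates that idea conceptually in Python; it is not Velocity's actual API, all class and method names are hypothetical, and local dictionaries stand in for remote cache servers.

```python
import hashlib


class PartitionedCache:
    """Conceptual sketch of a partitioned cache: keys are hashed to one
    of several nodes, but callers see one unified cache view."""

    def __init__(self, node_names):
        # Each "node" is a local dict here; in a real deployment these
        # would be separate cache servers whose memory is pooled.
        self.node_names = sorted(node_names)
        self.nodes = {name: {} for name in self.node_names}

    def _node_for(self, key):
        # Hash the key to pick a node deterministically, so every client
        # routes a given key to the same server.
        digest = hashlib.md5(key.encode("utf-8")).hexdigest()
        index = int(digest, 16) % len(self.node_names)
        return self.nodes[self.node_names[index]]

    def put(self, key, value):
        self._node_for(key)[key] = value

    def get(self, key):
        # Returns None on a cache miss.
        return self._node_for(key).get(key)


cache = PartitionedCache(["node1", "node2", "node3"])
cache.put("customer:42", {"name": "Contoso"})
print(cache.get("customer:42"))
```

Replicated caches differ in that each entry is copied to every node (trading memory for read availability), while a local cache keeps a copy in the application's own process to avoid the network hop entirely.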
Microsoft has said it will offer Project Velocity free of charge. The software was expected to RTM this summer, but in a blog posting, the Project Velocity team revealed plans for CTP 4, which will offer improved stability and security.
CTP 4 will also include at least two new features: performance monitor counters and support for setup and configuration changes. The performance counters will be available for both the host and the cache. Microsoft's Sharique Muhammed provided details in a separate posting today.
Meanwhile, CTP 3 has been in the hands of testers for the past two months, and Microsoft today posted code samples on its MSDN site.
What's your take on Project Velocity? Drop me a line at [email protected].
Posted by Jeffrey Schwartz on 06/10/2009 at 1:15 PM