News

New Prefetching Scheme Saves Computing Time, Energy

Most desktop computers today, when not in use, power down their hard drives to save energy. While this may be very green, it also incurs minor, though sometimes frustrating, delays. Waiting up to eight seconds for a hard drive to power back up to fetch some document is not the best use of any office worker's time. So two University of Arizona researchers have sussed out a way to save energy and keep data coming quickly, through a concept they call context-aware prefetching.

"The idea is to reduce or eliminate delays," said Igor Crk, who along with Chris Gniady wrote up their findings in the paper "Context-Aware Mechanisms for Reducing Interactive Delays of Energy Management in Disks" (available here in PDF format). Crk presented the team's work at the USENIX conference, held last month in Boston.

The idea is to run a small program that logs user actions and notes which chains of actions typically lead to interactions with the hard disk. For instance, if a user opens a word processor and then executes the series of keystrokes that opens a file, that whole sequence of actions is a good predictor that the hard drive will need to deliver a file within a few seconds.

"Monitoring user interactions with applications provides an opportunity for predicting upcoming power mode transitions and, as a result, eliminating the delays associated with these transitions," the paper states.

The team wrote a program that, when it senses a sequence of actions that typically leads to hard-drive use, issues a command to the hard drive to spin up. By the time the user actually requests the file, the hard drive is already up and running and can serve it immediately.
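The approach described above can be sketched as a simple frequency model: count how often each short context of user actions is followed by disk I/O, and spin up whenever the current context has historically led to I/O. This is an illustrative sketch only; the class and method names are invented here, and the paper's implementation monitors real GUI events rather than abstract action strings.

```python
from collections import defaultdict, deque

class ContextAwarePrefetcher:
    """Illustrative sketch of context-aware prefetching: learn which
    short sequences of user actions tend to precede disk I/O, then
    recommend spinning the disk up early when such a sequence recurs."""

    def __init__(self, window=3, threshold=0.5):
        self.window = window            # length of the action context
        self.threshold = threshold      # min P(disk I/O | context) to spin up
        self.seen = defaultdict(int)    # times each context was observed
        self.led_to_io = defaultdict(int)  # times it preceded disk I/O
        self.recent = deque(maxlen=window)

    def record_action(self, action):
        """Log a user action (keystroke, menu click, window event)."""
        self.recent.append(action)
        if len(self.recent) == self.window:
            self.seen[tuple(self.recent)] += 1

    def record_disk_access(self):
        """Credit the current context with having led to disk I/O."""
        if len(self.recent) == self.window:
            self.led_to_io[tuple(self.recent)] += 1

    def should_spin_up(self):
        """True if the current context usually precedes disk I/O."""
        ctx = tuple(self.recent)
        if self.seen[ctx] == 0:
            return False
        return self.led_to_io[ctx] / self.seen[ctx] >= self.threshold
```

In practice the spin-up decision would trigger an actual drive command; here `should_spin_up` simply reports whether the learned statistics justify one.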

Prefetching is already widely used, but in most of these approaches the hard drive is activated whenever the user does anything at all on the computer, and the disk powers down only after the user has been idle for a set period of time. Crk explained that this approach is less efficient, because the hard drive stays powered on for long stretches during which nothing is actually read from or written to the disk.
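The waste in a fixed-timeout policy can be seen with a small model: after every access the disk keeps spinning for up to the full timeout, whether or not another access ever arrives. This toy function (not from the paper) totals the powered-on time under that policy.

```python
def powered_on_time(access_times, idle_timeout):
    """Total seconds a disk stays spinning under a fixed idle-timeout
    policy: after each access (timestamps in seconds, ascending), the
    disk stays up until `idle_timeout` seconds pass with no further
    access. Illustrative model only."""
    total = 0.0
    for i, t in enumerate(access_times):
        if i + 1 < len(access_times):
            gap = access_times[i + 1] - t
            # The disk spins either until the next access or until
            # the timeout expires, whichever comes first.
            total += min(gap, idle_timeout)
        else:
            # After the final access, the full timeout is always paid.
            total += idle_timeout
    return total
```

With accesses at 0, 5, and 100 seconds and a 10-second timeout, the disk is powered for 25 seconds total, 10 of them idle after the long gap, which is exactly the waste context-aware prediction aims to avoid.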

The team's approach correctly predicted hard-drive usage 79 percent of the time and incorrectly predicted usage only 2 percent of the time. As a result, this approach "is able to reduce spin-up delays on average by 35 percent (over 3 seconds), while maintaining low energy consumption," according to the paper. By contrast, traditional prefetching predicted hard-drive use 81 percent of the time but incorrectly predicted usage 52 percent of the time, making that approach far more energy-intensive.

Crk, speaking with this reporter after the presentation, noted that context-aware prefetching could be easily implemented in most operating systems. Nearly all commercial hard drives support commands that can be invoked to spin up the disk. Crk explained that the program they created consumed very little in the way of CPU resources, so it would go unnoticed by users (it is available here for those willing to sign a non-disclosure agreement and agree to use the software only for non-commercial purposes).
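As a concrete illustration (not taken from the paper): on Linux, ATA drive power states can be inspected and driven from user space with the standard `hdparm` tool. The device path `/dev/sda` is an assumption, and these commands require root privileges on real hardware.

```shell
# Check the drive's current power mode (active/idle vs. standby).
hdparm -C /dev/sda

# Force the drive into low-power standby (spun down).
hdparm -y /dev/sda

# There is no dedicated "spin up" flag; issuing a direct read that
# bypasses the page cache is the usual way to force the platters
# back up before the user actually needs the data.
dd if=/dev/sda of=/dev/null bs=512 count=1 iflag=direct
```

A prefetching daemon along the lines Crk describes would issue the equivalent of that final read the moment its predictor fired.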

Crk also said that this technique could be applied to wireless cards, which could help extend battery life on laptops and mobile devices. The idea would be the same, though the program would wake up the wireless communications card rather than the hard drive.

USENIX, the Advanced Computing Systems Association, is a non-profit association for technicians, scientists, systems administrators and engineers to share information on developments in the field of computer science.

About the Author

Joab Jackson is the chief technology editor of Government Computing News (GCN.com).
