In-Depth

Data Protection Made Easy

System Center Data Protection Manager 2006 (DPM) lets you go beyond the Volume Shadow Copy Service to the next level of backup capability. Learn how you can use DPM to better protect your data.

We all spend a lot of time working on our computer systems. As users, we produce information and documentation for everything we work on, yet we rarely worry about how that data is protected until we lose something vital. All of a sudden we're concerned about backups and, hopefully, restores of our data. And that's the way it should be.

Microsoft has gone a long way toward making this simpler for users by introducing the Volume Shadow Copy Service (VSS) with Windows Server 2003. VSS is a feature that automatically takes a "snapshot" of the data stored in a shared folder on a regular schedule (see Resources). What's great about this feature is that once a small client (the Previous Versions client) is installed on Windows XP machines, users can recover their own files. And because shadow copies are taken twice a day by default and a server can store up to 64 copies, users have access to more than 30 days' worth of data, directly from the server itself and in self-service fashion. No longer do they have to wait for administrators to load a tape onto the system, restore the file, and then notify them. Given that it takes about 10 minutes to set this up (see Resources), it's surprising that most organizations don't have this solution in place yet.
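
Here's where the 30-day figure comes from, along with a quick way to check what a server actually holds. The numbers are simply the defaults described above; the vssadmin call at the end is a thin wrapper around the command-line tool that ships with Windows Server 2003 and assumes you run the script on the file server itself.

```python
import subprocess

# Defaults described above: two shadow copies per day, at most 64 per volume.
SNAPSHOTS_PER_DAY = 2
MAX_SHADOW_COPIES = 64

print(f"Approximate self-service retention: "
      f"{MAX_SHADOW_COPIES / SNAPSHOTS_PER_DAY:.0f} days")

# Optional: list the shadow copies actually stored for the C: volume.
result = subprocess.run(["vssadmin", "list", "shadows", "/for=C:"],
                        capture_output=True, text=True)
print(result.stdout)
```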

VSS is not bad on its own; in fact, it's better than anything Windows users have had before. But you can now go one better. With the introduction of System Center Data Protection Manager 2006 (DPM for short), Microsoft has taken the Volume Shadow Copy Service to the next level. DPM is a server product that automates backups for remote offices. It is designed to automatically pull changes from any shared folder in any location on your network and store them centrally. Changes are captured as they occur and include only the deltas, that is, only the modified bits of each file. All backups are stored on disk and are readily available for restores.
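
DPM's replication engine is internal to the product, but the idea of sending only the modified bits is easy to picture. The sketch below compares two local copies of a file in fixed-size blocks and reports only the blocks that differ; the block size and file paths are hypothetical, and a real agent would compare checksums so the old copy never has to travel over the wire.

```python
CHUNK_SIZE = 64 * 1024  # hypothetical block size; DPM's internal unit differs


def changed_blocks(replica_path: str, current_path: str):
    """Yield (offset, data) for each fixed-size block that differs between
    the previously replicated copy and the current version of a file."""
    with open(replica_path, "rb") as old, open(current_path, "rb") as new:
        offset = 0
        while True:
            old_block = old.read(CHUNK_SIZE)
            new_block = new.read(CHUNK_SIZE)
            if not old_block and not new_block:
                break
            if old_block != new_block:
                yield offset, new_block
            offset += CHUNK_SIZE


# Only the changed blocks would need to cross the WAN, for example:
# deltas = list(changed_blocks(r"D:\Replica\report.doc", r"\\branch\docs\report.doc"))
```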

DPM also captures the Volume Shadow Copy images from each remote or local shared folder and stores them centrally. When users need to restore a file, they simply right-click the lost file or the folder that held it, go to Properties, and click the Previous Versions tab (see Figure 1). From there they can view, copy, or restore the files they need. The restore process is completely transparent; users don't even realize that they are restoring not from the local server but from the central DPM server. One nice feature of DPM is that users can even restore directly from within Office applications.

Microsoft designed DPM as a disk-only backup solution. DPM pulls all information and can store it on a local disk structure, on network-attached storage, or on a storage area network, whichever best suits the needs of your organization. From there, you need a third-party backup solution, or even just NTBackup, to copy the data from disk to tape. A simple DPM architecture has the server attached to a central storage device, pulling backups over either the wide area network (WAN) or the local area network (LAN) and storing them to disk. The backups are then copied from disk to tape with a third-party tool (see Figure 2).
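
For the disk-to-tape leg, even NTBackup will do. The sketch below simply shells out to ntbackup to copy a hypothetical DPM storage folder to tape; the path, job name, and media pool are made-up placeholders, and the switches should be checked against ntbackup's own help before you rely on them.

```python
import subprocess

# Hypothetical location of the DPM storage pool and a made-up job name.
DPM_STORAGE = r"D:\DPMStorage"
JOB_NAME = "Nightly DPM disk-to-tape copy"

# ntbackup's command line takes the path to back up, a job name (/J),
# the media pool to use (/P) and the backup type (/M).
subprocess.run([
    "ntbackup", "backup", DPM_STORAGE,
    "/J", JOB_NAME,
    "/P", "Backup",
    "/M", "copy",   # a copy backup leaves archive attributes untouched
])
```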

The first implementation of DPM supports only file shares, but Microsoft is working on adding backup support for Exchange Server and SQL Server to provide a more comprehensive solution. For now, this low-cost backup solution is ideal for organizations of all sizes because it addresses some key pain points most organizations face today with traditional backup strategies.

The most common of these is that tape-based backups are a poor fit for remote locations: someone has to be on site to change the tape when it fills up and again when data must be recovered, which makes the process impossible to automate. In addition, backup windows are shrinking. Tape drives can only write so fast, and depending on the amount of data you need to preserve, you might not have enough time during the window to complete the backup.
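
A quick back-of-the-envelope calculation shows how easily a tape window gets blown. The data volume, drive throughput, and window length below are illustrative assumptions, not measurements.

```python
# Illustrative figures only; adjust for your own site.
data_to_protect_gb = 800     # data at the remote site
tape_throughput_mb_s = 30    # roughly the native speed of a mid-2000s LTO drive
window_hours = 8             # overnight backup window

hours_needed = data_to_protect_gb * 1024 / tape_throughput_mb_s / 3600
print(f"A full backup needs about {hours_needed:.1f} hours "
      f"against a {window_hours}-hour window")
# -> A full backup needs about 7.6 hours against an 8-hour window
```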

With DPM, backups are constant and include only deltas at all times. Data transfers over the WAN or LAN are minimal, and restores are always available. The copy to tape can take as long as it needs, because no one accesses the backup data directly during the tape copy. Finally, tape-based backups are expensive; now you can remove any remote tape-based solution and replace it with DPM.

In large implementations, you can chain DPM servers to create backup hubs, each protecting a limited number of sites and passing the data on to a more central repository. DPM uses Active Directory to discover file servers, which makes sense because all file shares are published in the directory by default. Once the servers are discovered, the DPM server installs a special agent on each of them. The first time the agent is deployed, it synchronizes all data with the DPM server; from that point on, it passes along only the deltas.
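
DPM performs the directory lookup itself, but the kind of query involved looks roughly like the sketch below, which uses the third-party ldap3 package to pull computer objects from the directory. The domain controller name, credentials, and search base are hypothetical placeholders.

```python
from ldap3 import Server, Connection, NTLM, ALL

# Hypothetical domain controller, service account, and search base.
server = Server("dc01.example.com", get_info=ALL)
conn = Connection(server, user="EXAMPLE\\svc-dpm", password="change-me",
                  authentication=NTLM, auto_bind=True)

# Pull every computer object; a discovery pass would then filter for
# the machines that actually host the file shares to protect.
conn.search("dc=example,dc=com",
            "(objectCategory=computer)",
            attributes=["dNSHostName", "operatingSystem"])

for entry in conn.entries:
    print(entry.dNSHostName, "-", entry.operatingSystem)
```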

In addition, the DPM Previous Versions client supersedes the default version included in Windows Server 2003. You'll have to deploy this client to your workstations to make sure they have access to the centrally stored snapshots. This way, users will be able to restore their own files. Administrators, on the other hand, can use DPM to restore servers, volumes, and shares through the DPM interface.
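
Deploying the client itself is a straightforward Windows Installer run. The sketch below shells out to msiexec for a silent install; the share and package name are hypothetical, and in practice you would push this through Group Policy or your usual software-distribution tool.

```python
import subprocess

# Hypothetical distribution point and package name for the DPM
# Previous Versions client.
CLIENT_MSI = r"\\dpm01\deploy\PreviousVersionsClient.msi"

# /i installs the package, /qn keeps the install silent.
subprocess.run(["msiexec", "/i", CLIENT_MSI, "/qn"], check=True)
```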

There are limitations, though. This first version of DPM does not support server clusters. In addition, one set of DPM servers can support only a single domain in AD, so if your forest includes multiple production domains, you'll need multiple DPM deployments. On the other hand, several partners have integrated their third-party backup tools with DPM to provide a complete backup experience. In fact, partners such as Yosemite and Veritas have solutions that are fully integrated with DPM, letting DPM manage the remote backups and then taking over to transfer data to tape. This greatly reduces costs and provides a clean and simple enterprise-level backup solution for organizations of any size.

About the Author

Danielle Ruest and Nelson Ruest, both Microsoft MVPs, are IT professionals focused on technology futures. They are the authors of multiple books, including "Microsoft Windows Server 2008: The Complete Reference" (McGraw-Hill Osborne Media, 2008), which focuses on building virtual workloads with Microsoft's new OS.
