
enabling deduplication on unnamed volumes (and other stuff)

it dawned on me the other day that while i had enabled deduplication on my office computers, i never did enable it at home. back when ssds were very expensive, i had managed to get a very small drive (64gb). well, it proved to be too small to be useful.

i ended up replacing the optical drive with a secondary hdd. since it runs out of the optical bay, it spins slower, but it did its job – providing more space for things i don't access often. cool. i ran into a couple of things while toying around.

in case you didn't know, windows 8.1 will support deduplication. you just have to get the binaries onto the os. once you install them and enable the feature, you need to get into powershell to turn stuff on.
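
the rough procedure looks something like this – a sketch, not a walkthrough. the cab files have to come from a windows server 2012 r2 source, the package path below is a placeholder, and the feature name is from memory:

dism /online /add-package /packagepath:c:\dedup\&lt;dedup-package&gt;.cab
dism /online /enable-feature /featurename:Dedup-Core /all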

so, here’s a primer on getting all the deduplication commands:

gcm *dedup*
gcm -module deduplication

(both work)

CommandType     Name                            ModuleName  
-----------     ----                            ----------  
Function        Disable-DedupVolume             Deduplication
Function        Enable-DedupVolume              Deduplication
Function        Expand-DedupFile                Deduplication
Function        Get-DedupJob                    Deduplication
Function        Get-DedupMetadata               Deduplication
Function        Get-DedupSchedule               Deduplication
Function        Get-DedupStatus                 Deduplication
Function        Get-DedupVolume                 Deduplication
Function        Measure-DedupFileMetadata       Deduplication
Function        New-DedupSchedule               Deduplication
Function        Remove-DedupSchedule            Deduplication
Function        Set-DedupSchedule               Deduplication
Function        Set-DedupVolume                 Deduplication
Function        Start-DedupJob                  Deduplication
Function        Stop-DedupJob                   Deduplication
Function        Update-DedupStatus              Deduplication
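
to put a few of these to work once a volume is enabled, you can kick off an optimization job and then check on it. a quick sketch, assuming a data volume at d: (swap in your own volume):

start-dedupjob -Volume d: -Type Optimization
get-dedupjob
get-dedupstatus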

 

the first problem i ran into came when i went to enable the c: drive and received the following error:

enable-dedupvolume -Volume c:
enable-dedupvolume : MSFT_DedupVolume.Volume='c:' - HRESULT 0x8056530b, The specified volume type is not supported. Deduplication is supported on fixed, write-enabled NTFS data volumes and CSV backed by NTFS data volumes.

unfortunately, searching for the error code did not yield any results. however, the error message itself speaks to the volume type. according to technet, this is what is supported (a quick filter for candidate volumes follows the list):

  • Must not be a system or boot volume. Deduplication is not supported on operating system volumes.
  • Can be partitioned as a master boot record (MBR) or a GUID Partition Table (GPT), and must be formatted using the NTFS file system.
  • Can reside on shared storage, such as storage that uses a Fibre Channel or an SAS array, or when an iSCSI SAN and Windows Failover Clustering is fully supported.
  • Do not rely on Cluster Shared Volumes (CSVs). You can access data if a deduplication-enabled volume is converted to a CSV, but you cannot continue to process files for deduplication.
  • Do not rely on the Microsoft Resilient File System (ReFS).
  • Can’t be larger than 64 TB in size.
  • Must be exposed to the operating system as non-removable drives. Remotely-mapped drives are not supported.
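
for a quick-and-dirty filter of candidate volumes, something like this should get close. it's a sketch that only checks the easy requirements (fixed, ntfs, not the system drive) – not the size cap or storage type:

get-volume | ? { $_.DriveType -eq 'Fixed' -and $_.FileSystem -eq 'NTFS' -and "$($_.DriveLetter):" -ne $env:SystemDrive }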

the requirements fell apart on the first bullet for me. oh well, i still have the secondary hdd i can optimize. i ran into a small snag there, though: since i had created a mount point, the secondary hdd isn't an actual volume i can specify by drive letter.

not too big of a deal as long as i know the path where it's mounted, such as:

enable-dedupvolume -Volume c:\data
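
if you forget where a volume is mounted, the storage module's get-partition will show you – access paths include mount points as well as drive letters:

get-partition | select DiskNumber, DriveLetter, AccessPaths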

 

if the mount path is unknown (or you'd rather not hunt for it), you could also use the objectid, which you can get from get-volume. the following command would attempt to enable deduplication on every available volume – obviously not something you want to try on your desktop:

get-volume | % { enable-dedupvolume -volume $_.ObjectId }
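
a safer variation is to filter down to the single volume you actually mean before piping – say, by label. a sketch assuming the volume is labeled 'data':

get-volume | ? { $_.FileSystemLabel -eq 'data' } | % { enable-dedupvolume -Volume $_.ObjectId }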
