It seems to be some kind of universal rule that the moment I describe a post here as “Part One”, intending it to start a series, something happens to drag my attention away or make the issue irrelevant *sigh*. This time, however, I will make sure to follow up with a part two. We’ll start part one with my comments and thoughts on disk fragmentation and the defraggers that just love to tame it.
In part two, I will talk about why it isn’t easy to measure how well defraggers work against one another. Claus Valca wrote a nice article about defraggers over on Grand Stream Dreams, where he talks about ways the process can still be valuable and asks about ways of evaluating the different products. I can think of one way of comparing them, and while it’s honest and fair in terms of what it tests, it probably doesn’t answer all the questions you should be asking on the subject. Still, it’s a start, right?
Just to make things clear, I’m using the definitions of File System Fragmentation and Defragmentation found on Wikipedia. I’m not claiming those definitions are perfect but they’re at least as good as any I could write myself and they already exist.
Pick a ‘neutral’ program to measure the level of fragmentation. For the purposes of this test, the results of analysing the disk with this program will be the only results that are used to compare one product to another.
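Whatever ‘neutral’ analyser you settle on, you’ll want to pull one comparable number out of each of its reports. Here’s a minimal sketch of that idea in Python; the report text and the “Total fragmentation” wording are invented for illustration, so adapt the pattern to whatever your chosen tool actually prints.

```python
import re

def parse_fragmentation_report(report: str) -> float:
    """Extract the fragmentation percentage from an analysis report.

    The report format matched here is hypothetical -- adjust the regular
    expression to suit the output of your own 'neutral' analyser.
    """
    match = re.search(r"Total fragmentation\s*[:=]\s*(\d+(?:\.\d+)?)\s*%", report)
    if match is None:
        raise ValueError("no fragmentation figure found in report")
    return float(match.group(1))

# Invented sample output, purely for illustration:
sample = """\
Volume C:
    Volume size          = 40 GB
    Total fragmentation: 17.5 %
"""
print(parse_fragmentation_report(sample))  # 17.5
```

Scripting the extraction like this means every product’s before-and-after state gets reduced to the same figure by the same code, which keeps the comparison honest.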
Decide whether you want to do a file-system ‘performance’ test. If you do, I suggest copying lots of files of varying sizes onto the disk or partition you are using for your tests. Try to make sure that the source medium you grab the files from is fast (you’re not testing the speed of the source, so you don’t want it to become the bottleneck) and performs consistently (you want a fair test, right?). Yes, this will be quite difficult to achieve.
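The copy-and-time part of that regime is easy enough to script. The sketch below generates a batch of mixed-size files and times copying them between two directories; in a real test you would point `src` at your fast, consistent source medium and `dst` at the partition under test, and the sizes here are scaled way down just so the example runs anywhere.

```python
import os
import shutil
import tempfile
import time

def make_test_files(src_dir: str, count: int = 50) -> None:
    """Create files of varying sizes (roughly 1 KB up to ~1 MB)."""
    for i in range(count):
        size = 1024 * (1 + (i * 20) % 1000)  # a mix of small and larger files
        with open(os.path.join(src_dir, f"file_{i:03}.bin"), "wb") as f:
            f.write(os.urandom(size))

def timed_copy(src_dir: str, dst_dir: str) -> float:
    """Copy everything from src_dir to dst_dir; return elapsed seconds."""
    start = time.perf_counter()
    for name in sorted(os.listdir(src_dir)):
        shutil.copy2(os.path.join(src_dir, name), os.path.join(dst_dir, name))
    return time.perf_counter() - start

# Stand-in directories so the sketch is self-contained; substitute your
# real source medium and test partition here.
src = tempfile.mkdtemp()
dst = tempfile.mkdtemp()
make_test_files(src)
elapsed = timed_copy(src, dst)
print(f"copied {len(os.listdir(dst))} files in {elapsed:.3f}s")
```

Run the same copy before and after each defragger has done its work and you have a crude but repeatable throughput figure to set alongside the analyser’s numbers.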
Some drive imaging programs allow you to take a “raw” backup image of a disk partition or even of an entire hard disk. I’m linking to Acronis True Image’s FAQ here as an example of a product that I’m comfortable with and which offers the facility, though there are other choices out there.
Using such an imaging system, you can create a backup of a test computer’s hard disk after you’ve set it up in such a way as to leave it fragmented enough for your tests.
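A raw image is conceptually just a byte-for-byte copy of the device, which is what makes it ideal here: restoring it puts every product in front of the identical fragmented starting state. This sketch shows the idea with an ordinary file standing in for the partition so it runs anywhere; on a real rig you would use your imaging product (or point the source at the raw device) rather than this toy.

```python
import filecmp
import os
import tempfile

def raw_image(source_path: str, image_path: str, block_size: int = 1 << 20) -> None:
    """Copy source_path byte-for-byte into image_path, like a 'raw' image.

    A plain file stands in for the partition here so the sketch is
    portable; a real raw image would read from the device itself.
    """
    with open(source_path, "rb") as src, open(image_path, "wb") as img:
        while chunk := src.read(block_size):
            img.write(chunk)

# Demonstration: a scratch file plays the part of the test partition.
fake_disk = tempfile.NamedTemporaryFile(delete=False)
fake_disk.write(os.urandom(3 * 1024 * 1024 + 123))  # not a block-size multiple
fake_disk.close()
image_path = fake_disk.name + ".img"
raw_image(fake_disk.name, image_path)
print(filecmp.cmp(fake_disk.name, image_path, shallow=False))  # True
```

The point of the byte-for-byte check at the end is the whole justification for the method: if the image restores identically, every defragger you test starts from exactly the same layout of fragments.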
If you’re performing this test on Windows Vista, disable the built-in defragmentation routines NOW, before you do anything else.
For such a test I’d suggest using a relatively small hard disk and creating fragmentation by installing lots of programs and copying large files (got a couple of Linux ISOs doing nothing in your archives?) around the disk, then un-installing some of the programs and deleting some of the ISOs. Considering that NTFS does try to avoid disk fragmentation in the first place, you may actually find this to be the most difficult part.
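The install/copy/delete churn described above can be automated. This sketch repeatedly writes a batch of mixed-size files and then deletes every other one, leaving free-space holes that later writes must squeeze into; that hole-filling is what nudges the file system toward fragmenting new data. The file counts and sizes are scaled way down for illustration, and on NTFS you should still verify with your analyser that real fragmentation actually resulted, since the allocator works against you.

```python
import os
import random
import tempfile

def churn(workdir: str, rounds: int = 3, files_per_round: int = 40) -> int:
    """Fill, then partially delete, files to leave free-space holes behind.

    Returns the number of files that survive the churn. The fixed seed
    matters: restoring the same starting state for every product is what
    keeps the comparison fair.
    """
    random.seed(42)
    serial = 0
    for _ in range(rounds):
        created = []
        for _ in range(files_per_round):
            path = os.path.join(workdir, f"churn_{serial:05}.bin")
            serial += 1
            with open(path, "wb") as f:
                f.write(os.urandom(random.randint(4 * 1024, 512 * 1024)))
            created.append(path)
        for path in created[::2]:  # delete every other file -> holes
            os.remove(path)
    return len(os.listdir(workdir))

workdir = tempfile.mkdtemp()
remaining = churn(workdir)
print(remaining)  # 60: half of each round's 40 files survive, over 3 rounds
```

Run the churn, confirm the analyser reports a usefully high fragmentation level, and only then take your raw image; that image becomes the fixed starting line for every product in the comparison.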