Sunday, September 24, 2006

SuperDuper, the Intel DiskWarrior?

When I first switched to an Intel-based iMac, I put my home directory on an external drive formatted as APM, since DiskWarrior can’t repair GPT drives (even if you boot from a PowerPC-based Mac). After a while, I started having problems with that drive, and its fan was bothering me anyway, so I moved the home directory to the internal boot drive. This was more convenient in a number of ways, especially when syncing with my PowerBook where the home directory is also on the internal drive. I made regular backups and hoped that I wouldn’t need to use DiskWarrior or that Alsoft would finish the Intel version soon.

The iMac gradually got slower. Since 10.4 or so, my Mac has always seemed to get slower if I leave it running for more than a day without a reboot, but now even rebooting didn't help much. At one point it was taking more than five seconds to download a single message in Mailsmith or to switch between news items in NetNewsWire. Saving a small OmniOutliner Pro document took a few seconds. Compiling is always too slow, but it was even slower than normal. Raw processing was still fast, but everything to do with the disk was slow. It didn't dawn on me how much slower everything had gotten until I spent a day using the PowerBook as my main machine; it shouldn't have seemed faster than the Core Duo, but it did.

My Mac creates and deletes a lot of files. I used to run DiskWarrior every few weeks to optimize the directory. Since moving my home directory to the GPT drive a few months ago, I hadn’t been able to run it once. I suspected that a slow directory was the problem (since the OS is now supposed to manage file fragmentation automatically), and the only solution I could think of was to rebuild it the old-fashioned way. I created two clones using SuperDuper, then erased the internal drive and copied everything back to it. Many hours of copying later, the iMac is now fast again.

3 Comments

I've used hfsdebug for an online version of this. HFS+'s on-demand defragmentation has two key limitations: it doesn't help at all if your disk is almost full (I'm using over 95%), and it doesn't work with files larger than 20 MB. Unfortunately, I've not only had data files over that size limit (video editing is wretched) but also index/sqlite files used by things like Spotlight or Aperture, which have a grossly disproportionate impact on system performance; in such cases I've used a simple cp/rm/mv script to defragment individual files, which isn't as effective as doing the entire system but is much faster and doesn't require a reboot.
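For illustration, here is a minimal Python sketch of that cp/rm/mv idea; it is an assumption about how such a script might work, not the commenter's actual script, and the file must not be open while it is being replaced.

    import os
    import shutil
    import sys

    def defragment(path):
        # Copying into the same directory keeps the copy on the same volume,
        # so HFS+ allocates fresh (ideally contiguous) extents for it.
        # Note: shutil.copy2 preserves permissions and timestamps but not
        # resource forks or other HFS+ metadata, so plain data files such as
        # sqlite indexes are the safer targets.
        tmp = path + ".defrag"
        shutil.copy2(path, tmp)
        if os.path.getsize(tmp) != os.path.getsize(path):
            os.remove(tmp)
            raise RuntimeError("size mismatch; original left in place")
        # rename replaces the original atomically, combining the rm and mv steps
        os.rename(tmp, path)

    if __name__ == "__main__":
        for p in sys.argv[1:]:
            defragment(p)

The rename at the end swaps the freshly written copy into place, so the original is only discarded once the copy has been written and size-checked.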

iDefrag has the ability to optimize directories in conjunction with its defrag/optimize function, if I remember right.

Well, I had to do this again on Sunday. I still think the problem is the catalog, not (mainly) fragmentation of individual files, because even saving new, small files was very slow.
