Maintenance of backup image drives

Discussion in 'backup, imaging & disk mgmt' started by axial, Apr 14, 2011.

Thread Status:
Not open for further replies.
  1. axial

    axial Registered Member

    Joined:
    Jun 27, 2007
    Posts:
    479
    Any comments on best practices for maintaining external drives used to hold backup images?

    For example, current backups are on a 1 TB Samsung Spinpoint eSATA drive. Images range from 20-40 GB (full system-drive backups). Older images get deleted when space runs tight. The drive has never been defragged or given any other maintenance. I occasionally boot images and have never had a problem, so the drive isn't misbehaving; this is just a proactive discussion.

    Note: I don't want to get into a discussion of backup theory, procedures, off-site storage, or backup apps; I just want to focus on the general care and feeding of drives containing images.

    With larger drives available now, any drive maintenance will just get more time-consuming, so it's time to re-think.

    Assumptions:
    • defragging could damage images (is the risk greater with such large files?); the defrag time would also be significant, and my gut says the risk is high
    • drives used for storing images are always freshly formatted and partitioned

    Questions:
    • what risks are there in moving images to another drive, changing the directory structure on the drive, and so on?
    • should backup drives with full images be considered "write once", i.e. no image deletions, no moving, no image manipulation (e.g. mounting an image and deleting files), just fill 'er up and start a new drive?
    • what about backup images stored on a NAS; are there any other issues?

    What maintenance do you do with your backup drives?
     
  2. Osaban

    Osaban Registered Member

    Joined:
    Apr 11, 2005
    Posts:
    5,618
    Location:
    Milan and Seoul
    Hi there,

    you are asking interesting questions; hopefully some knowledgeable member will chime in. In the past I have defragged two USB hard drives (90 GB and 120 GB) containing only images. Your assumption was correct: it was time-consuming, but it did not affect the integrity of the images. I've also transferred several images without any problems. Ever since I bought a large 1 TB HD, I haven't bothered to defrag anymore.

    I would like to add a question of my own: would an image that has been defragged restore faster than one that is heavily fragmented?
     
  3. J_L

    J_L Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    8,738
    1) Very little risk. As long as your imaging program detects the image regardless of its location, you'll be fine.

    2) No, they should not. You would be limiting yourself for no good reason, and the hard drive will eventually fill up if you never delete anything.

    3) Not sure, haven't tried that myself.
     
  4. axial

    axial Registered Member

    Joined:
    Jun 27, 2007
    Posts:
    479
    Another variation on the above questions is for folks who do incremental backups, which I think would create more need for defragging because of the smaller, more frequently written files. Do you do any periodic defrag or other drive maintenance?

    For those who defrag, or move images from one drive to another, do you always use your imaging app to verify the image afterward?
     
  5. napoleon1815

    napoleon1815 Registered Member

    Joined:
    Sep 9, 2010
    Posts:
    734
    Just wanted to add my two cents...

    I have two external hard drives that I alternate between my desktop and laptop for images (so each one has images of both systems, for redundancy). I basically only keep a few images of each system before space becomes an issue, so I am always manually deleting older images and creating new ones, with no issues whatsoever. Once every six months or so I take one of them and reformat it.

    As for defragmenting, I have never defragged the external hard drives at all. I don't see the need, and it's painfully slow and puts a lot of wear on the devices. At one point I was under the impression that you shouldn't defrag them, but I don't believe there is any proof or validity to that.

    I have also moved (or copied) existing images to other devices without trouble. I usually do an integrity check on the new copy, and there has never been a problem.

    I have done all of this using IFL, IFW, Drive Image, and Drive Snapshot. Thanks.
     
  6. layman

    layman Registered Member

    Joined:
    May 20, 2006
    Posts:
    293
    I have for many years had a backup approach that utilizes dedicated, redundant disks to store image files. Unfortunately, the disk allocation scheme of most file systems is not well suited to disks populated mostly with very large files. In fact, the usual approach to allocation tends to promote fragmentation in this situation. Also, the file system's own structural files wind up scattered here, there, and everywhere, further promoting fragmentation. But there is really little harm in allowing these huge and seldom-accessed files to be fragmented. The worst thing is that the fragmentation tends to worsen over time.

    If you are worried about a loss of integrity while copying large files, try using a program like FastCopy, TeraCopy or CopyLargeFiles that can verify the copy. Unfortunately, of these three, only FastCopy produces a tight (unfragmented) copy; the other two produce highly fragmented targets.
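
    If you'd rather not rely on the copy tool's own verification, an independent check is to hash the source file and the copy and compare the results. Here is a minimal sketch in Python (the script name and the two paths are just placeholders for the original image and its copy):

    Code:
        import hashlib
        import sys

        def sha256_of(path, chunk_size=1024 * 1024):
            """Hash the file in 1 MB chunks so a multi-GB image never has to fit in RAM."""
            digest = hashlib.sha256()
            with open(path, "rb") as f:
                while True:
                    chunk = f.read(chunk_size)
                    if not chunk:
                        break
                    digest.update(chunk)
            return digest.hexdigest()

        if __name__ == "__main__":
            # Usage: python verify_copy.py <source image> <copied image>
            src, dst = sys.argv[1], sys.argv[2]
            if sha256_of(src) == sha256_of(dst):
                print("OK: the copy matches the source")
            else:
                print("MISMATCH: the copy differs from the source")

    It is slow on 20-40 GB images, of course, since both drives get read end to end, but it will catch exactly the kind of silent single-bit damage described below.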

    Some years ago, I began having trouble with corrupt copies, especially of larger files. Disk images or zip archives would suffer undetected damage during the course of copying. Eventually, I was able to determine that this corruption was always in the form of a single dropped bit, and by monitoring closely I was able to narrow the field of suspects down to a custom-built Velocity Micro machine. I could never pinpoint the problem any closer than that, however. I ran memory and disk diagnostics for days on end without a single problem ever occurring, but turn around and copy a large file on that machine, and the target would be corrupt. That computer was beautiful in how meticulously it had been assembled, but I finally wound up throwing it on the trash heap. Dunno if the problem might have been the motherboard, or possibly a memory problem that went undetected by the diagnostics because of buffering. Just dunno. Couldn't afford to waste any more time on the problem, and I haven't had a bit of trouble since I tossed the machine out.

    It would be nice if file systems offered some options in the allocation scheme to be employed on a disk partition. The usual allocation approach favors fragmenting a file over fragmenting open space. For disks that store mostly large files, however, and where older files are deleted due to age, the reverse approach (favoring fragmentation of open space over fragmenting a file) would be far better.
     
  7. wat0114

    wat0114 Guest

    That's my assumption too (that defragging could damage the images), so I never defrag them.

    Good question. In my experience, on two separate occasions an image has corrupted when I copied it over to another external drive (corrupted only on the target drive), so I no longer do that. Now when I image my drive, I do it twice, to two separate drives.

    As for maintenance: none, really. I just keep the drives turned off when not in use.
     
  8. Sully

    Sully Registered Member

    Joined:
    Dec 23, 2005
    Posts:
    3,719
    Remember that an image is one big file that you want to keep contiguous. A program like contig/WinContig works great on these files, since you can run a batch script that processes one file, then the next, and so on. Fragmentation occurs when you do incrementals, and obviously when you create a new image.
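
    The batch idea is just "run contig on each image file in turn". Purely as a rough illustration, here is the same per-file idea sketched in Python rather than a batch file, assuming Sysinternals' contig.exe is on the PATH (the folder path and file extension are placeholders):

    Code:
        import subprocess
        from pathlib import Path

        # Placeholder location: point this at the drive/folder holding the images.
        IMAGE_DIR = Path(r"E:\Images")

        # Defragment each image file individually with contig.exe, so every large
        # file ends up contiguous without defragging the whole partition.
        for image in sorted(IMAGE_DIR.glob("*.tbi")):
            print(f"Making {image.name} contiguous ...")
            subprocess.run(["contig.exe", str(image)], check=True)

    The same loop works for any per-file defragmenter that accepts a filename on the command line.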

    My experience has been (depending on whether you believe your defragger or what "they" say) that if I have a few static images that will never be removed, I delete everything else on the drive/partition and then defrag them. New images can then be defragged after they are made, but only the new image. This "should" ensure each large file is contiguous, and as I understand it, if the defragger does its job correctly, the new file ends up starting right after the last used space. Meaning, if you don't mess with these files they won't be fragmented, the next file you create will always land at the beginning of the free space, and a defrag of that one file puts it in order. I don't know if this is 100% true or not, but it is my take on it.

    I have a small partition with my baseline images and some images for PE and stuff like that. On Win7 I have to make sure those are contiguous, or else they won't work correctly when I load the PE. I know it works; I just don't know how long it would stay that way if you used the partition for other purposes, as mine is never touched unless I create a new baseline image, and then I use WinContig on each file.

    There used to be programs that claimed they could place certain files on certain parts of the disc, and in theory, once placed there, they don't move if you don't mess with them, but I've forgotten most of that stuff now. I use WinContig, and that is about it anymore.

    Sul.
     
  9. pirej

    pirej Registered Member

    Joined:
    Sep 30, 2010
    Posts:
    64
    I do backups to an external HDD occasionally, and I have never defragmented it yet.
    But I also do system partition backups to one of my PC partitions that I use for other data storage, and I have defragmented that partition from time to time. When I wanted to restore using one of the "defragmented" images, nothing bad happened; the restore was successful.
    I never thought that backup images should be treated differently from any other data file on the HDD... was I wrong?
     
  10. MerleOne

    MerleOne Registered Member

    Joined:
    Mar 6, 2006
    Posts:
    1,336
    Location:
    France
    I sometimes partially defrag my image files with the only tool I know of that is capable of a partial defrag: Auslogics Disk Defrag (version < 3.2.0.0). I set the segment size to 100 MB, so files with fragments below that limit will see their fragmentation greatly reduced without taking too much time.
     
  11. Keatah

    Keatah Registered Member

    Joined:
    Jan 13, 2011
    Posts:
    1,029

    That would make perfect sense! I've always pictured a Velocity Micro to be one of those m0ddErZ BoIze systems. Built to exacting physical standards and usually having some sort of plexiglass window and perhaps even funny-looking lights; and classed as a high performance gaming computer. GaMeRz always think that because their computer can crunch the intense graphics it can handle anything else. HAHAHAHAAH!!! I got no0z for you buddy. That ain't the case (no pun intended).

    These machines are not built to any "exacting" electrical specification whatsoever. These are media machines, and a dropped bit here and there isn't going to make a difference when you're playing the latest fanboi FPS or jamming tunes.

    I have had reliability issues with systems that take the "latest and greatest" hardware and run it up to its limit, or even overclock it. These computers, while nice and pretty, couldn't handle a basic backup or file transfer without some sort of glitch somewhere.

    I prefer to always work with "last year's technology", or, if we're looking at a new model of CPU or RAM, to take the mid-grade speed level. There are always bugs to be worked out, and gunz-A-bLaZIN performance with a touch of instability isn't what you need when executing backups.

    As a side note, I've noticed that, in general, the fewer DIMMs and SIMMs you have, the more reliable a system is. And buffering, for all that we think it is, just adds more "stuff" between the memory element and the processor.

    The *TWO* main culprits (by far!) of failed backups are:

    #1 An unstable CPU due to overclocking or a noisy power supply.
    #2 Unstable RAM due to the above reasons, plus one more: a motherboard with excessively long circuit pathways between the individual bit cells in the memory chip and the memory controller in the CPU. Those need to be tight and clean.

    You can certainly ooddle and oogle over the specs of a supersonic-electronic buffered memory subsystem in a high-performance gaming machine and CONVINCE yourself you got a solid system, but it ain't so buddy!

    Anybody (including those boutique companies and my sister) can build a rig. But to make it stable?? Well, that's a whole different ballgame brother!

    I prefer store-bought pedestrian-performing systems as workhorses. Everything is so conservative in those designs.
     