Defragging an External Disk

Discussion in 'backup, imaging & disk mgmt' started by twl845, Jun 7, 2012.

Thread Status:
Not open for further replies.
  1. twl845

    twl845 Registered Member

    Joined:
    Apr 12, 2005
    Posts:
    4,186
    Location:
    USA
    Hi all, I have a 2TB Seagate external disk that I bought a year ago and use for my images. Is it advisable to defrag the external drive periodically, like you would your C: drive? I checked its status and it's 41% fragmented. I started a defrag, and after 2 hours it had defragged only one image. There are only 3 images and some files on the whole disk, so it's practically empty. Should it take that long to defrag given that it's 41% fragmented, and is it necessary? Thanks for any advice. :)
     
  2. zfactor

    zfactor Registered Member

    Joined:
    Mar 10, 2005
    Posts:
    6,102
    Location:
    on my zx10-r
    You shouldn't defrag backup images anyway, but defragging a very large file will take a while; it depends on your USB port, the external drive's speed, etc.
     
  3. twl845

    twl845 Registered Member

    Joined:
    Apr 12, 2005
    Posts:
    4,186
    Location:
    USA
    Thanks for the info. Thinking about it, I understand why you say not to defrag an image. :)
     
  4. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    I don't see any problems with defragging image files - other than the fact they tend to be huge. There is no evidence defragging image files results in anything bad.

    That said, it is always best to clean out the clutter and defrag before creating a backup or image.

    But the key thing here is, you said the drive is "practically empty". If there is lots of free disk space on a drive, fragmentation is not really a problem. It becomes a problem when a drive is crowded; then fragmentation can affect the page file (if it is on that disk, which is not likely with an external drive here) and computer performance.

    Since image and backup files are NOT system files used during normal operation, they will not affect performance, even if highly fragmented.

    Do you need all 3 images? If not, just keep the latest.
     
  5. twl845

    twl845 Registered Member

    Joined:
    Apr 12, 2005
    Posts:
    4,186
    Location:
    USA
    Thanks for clearing that up for me. I make an image with Active Disk Image, and another one with Drive Cloner RX for insurance. I have a Windows Backup from December that I guess I could eliminate.
     
  6. zfactor

    zfactor Registered Member

    Joined:
    Mar 10, 2005
    Posts:
    6,102
    Location:
    on my zx10-r
    Most companies, if you ask, will tell you not to defrag the image files. "Do not defrag System Image. This may rearrange files and will compress the files. Not a good thing when you go to use the images. It may have trouble sorting and uncompressing the files on restore" is one response from a company.
     
  7. twl845

    twl845 Registered Member

    Joined:
    Apr 12, 2005
    Posts:
    4,186
    Location:
    USA
    Hi Bill_Bright, I think zfactor recommended deleting the old images before defragging, and then making a new image. If nothing else, the defrag will be a lot quicker.
     
  8. TheRollbackFrog

    TheRollbackFrog Imaging Specialist

    Joined:
    Mar 1, 2011
    Posts:
    4,954
    Location:
    The Pond - USA
    TWL, I don't know about Drive Cloner and ADI, but many imaging programs offer the option of saving the image in "pieces" (my primary imager, IFW, does). If this option is available, you should use it to create smaller pieces in the backup image, which will allow for efficient defragging should you choose to do so. My pieces are 100 MB each...
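    If you want to see the general idea in code, here is a rough Python sketch. It is not from IFW or any imaging product; the file names and the 100 MB piece size are just examples. It splits one big file into numbered pieces and joins them back, which is essentially what the "pieces" option buys you from a fragmentation point of view.

    # Minimal sketch, assuming Python 3. "backup.img" is a hypothetical file name.
    PIECE_SIZE = 100 * 1024 * 1024  # 100 MB, mirroring the piece size mentioned above

    def split_file(path, piece_size=PIECE_SIZE):
        """Write path out as a series of numbered .part files and return their names."""
        parts = []
        with open(path, "rb") as src:
            index = 0
            while True:
                chunk = src.read(piece_size)
                if not chunk:
                    break
                part_name = f"{path}.{index:03d}.part"
                with open(part_name, "wb") as dst:
                    dst.write(chunk)
                parts.append(part_name)
                index += 1
        return parts

    def join_file(parts, out_path):
        """Reassemble the pieces, in order, into a single file."""
        with open(out_path, "wb") as dst:
            for part_name in parts:
                with open(part_name, "rb") as src:
                    dst.write(src.read())

    if __name__ == "__main__":
        pieces = split_file("backup.img")          # hypothetical image file
        join_file(pieces, "backup.restored.img")   # round-trip check

    Smaller pieces mean each one is more likely to sit in contiguous space, and a defragger only has to shuffle a 100 MB chunk at a time instead of one multi-GB file.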
     
  9. twl845

    twl845 Registered Member

    Joined:
    Apr 12, 2005
    Posts:
    4,186
    Location:
    USA
    Thanks, I'll check it out. :)
     
  10. aladdin

    aladdin Registered Member

    Joined:
    Jan 9, 2006
    Posts:
    2,986
    Location:
    Oman
    You really don't need to defrag. But if you want to, then:

    Delete the images by formatting the drive. Then create new large images with your two imaging programs. It is bunches of small files that get fragmented, not a few very large files.

    Old images are useless anyhow.

    This way the time will be well spent and you will have new images.
     
  11. zfactor

    zfactor Registered Member

    Joined:
    Mar 10, 2005
    Posts:
    6,102
    Location:
    on my zx10-r
    Agreed. Delete the images, defrag the drive, then create new ones. I have personally seen cases where a defragged image could not be restored...
     
  12. Noob

    Noob Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    6,491
    I defrag my external drives every now and then... But I don't have any images. :D
     
  13. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    Oh? Where? Got a link? I can find NOTHING from any defrag application maker that says do not defrag image files.

    I note Acronis True Image documentation as seen here does not mention it. In fact, it has a defragment program built in.

    Norton Ghost in fact has a feature (Compact) to defrag Ghost image files.

    The fact is, the reason you don't normally defrag image files is that they don't need it, not because you may corrupt them.

    No! That is incorrect. Defragging does NOT compress files. Also, note the image file is just that: "a" as in 1 file. Therefore, there is NO rearranging of the files within the image. The same goes for .zip and .rar files: defragging does not rearrange or recompress the files inside them.

    Understand there is ALWAYS a risk of file/disk corruption whenever you defrag the drive. It is not a zero-risk process, regardless the file types on the drive.

    Therefore, there is no added risk when defragging an image file over any other type of file. The risk of corruption is the same, and that risk is minimal.
     
  14. zfactor

    zfactor Registered Member

    Joined:
    Mar 10, 2005
    Posts:
    6,102
    Location:
    on my zx10-r
    I think what they meant is it may somehow change the file. The response I posted was quoted from EaseUS, and most of the responses I have seen were on the companies' forums when asked. Either way, I don't touch my images; I want them 100% when restoring them. And yes, I have personally had times when an image could not be restored after it was defragged, due to a damaged file. And no, it was not the drive; the drive was perfectly fine, and the images had been verified when they were made. That was being defragged with PerfectDisk. Nothing else was done: the image was made, and a couple of days later, with the external drive not having been touched at all, I checked the drive for fragments with PerfectDisk and defragged it, and then could not restore the images on it. So maybe my experience with PerfectDisk has something to do with it, not sure, but I for one will never do that again...
     
    Last edited: Jun 9, 2012
  15. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    And what I am saying is, that is the same risk as there is to any file that is large enough to be fragmented, whenever the fragments are moved about.

    As for possibly changing the file, sure it will change it; at a minimum, the file's location will change. But that does not mean it will corrupt the file. More importantly, the backup/imaging program may see this as a modified file and, depending on settings, create more space-hogging incrementals, cluttering up the drive even more.

    I am not suggesting everyone go out and defrag their image files. I am just saying the reasons given not to have not been justified. I am saying, however, that defragging image files is not necessary.

    FTR, over the years I have had countless backups fail for various reasons. Image files are NOT foolproof. You have not shown it was the defragging that corrupted your image restore.
    Where? I'm a "show-me" type of guy. Here is the EaseUS Backup FAQ, and you can download the docs from here. I just went through them and nothing says "do not defrag image files". So I ask again, please show us where the image program developers say, "do not defrag the image files".

    As for PerfectDisk, note the PD 12.5 manual on page 230 talks about using PD's defragger to defrag highly fragmented image and backup files!!!!

    Personally, image files don't belong on your working drives anyway. They should be on a backup drive, and secured, preferably off-site. Regardless, you should defrag your drive BEFORE you create an image, not after. And you should always clean out the clutter (rid the system of potentially 1000s of tiny temporary Internet files and cookies) BEFORE defragging too.

    And while on the subject, I am totally against any defragging program that runs real-time in the background, or defrags automatically on a schedule. This includes Windows 7's own (otherwise excellent) defragger. Why? Because as I noted above, it is counterproductive to defrag with 1000s of tiny temporary files on your system, and real-time or scheduled defragging programs do NOT run Windows Disk Cleanup or CCleaner first.

    Also, you should ALWAYS defrag in "Safe Mode", not a "Normal" Windows boot. There are all kinds of critical system and program start-up files and drivers that are open during a normal boot and cannot be moved. Therefore, you will NEVER - as in NEVER EVER - get an efficient defrag running your defrag program from a Normal boot.

    As for programs like PerfectDisk and other "alternative" 3rd-party defragging programs that promise the most efficient defragging - they are a waste of money! Don't waste your money or time on them - they are rip-offs! Why? Because the second you start to use your computer again, fragmentation starts all over again too. So even though it is true that these programs defrag more efficiently, so what? The playing field is leveled within minutes anyway. So the "basic", no-frills defragger built into Windows is just fine (as long as the schedule is disabled - note it is enabled by default in W7 :( ).
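    If you're curious whether a drive even needs a defrag before firing one off, Windows' own defrag.exe has an analyze switch you can run first. Here is a minimal Python sketch of that check, assuming a Windows 7 or later system, Python 3, and an elevated prompt; the drive letter is just an example, and the printed report is Windows' own text.

    import subprocess

    def analyze_volume(volume="D:"):
        """Run 'defrag <volume> /A' (analysis only) and return the report text."""
        result = subprocess.run(
            ["defrag", volume, "/A"],
            capture_output=True,
            text=True,
        )
        return result.stdout

    if __name__ == "__main__":
        # The drive letter of the external disk is an assumption; adjust as needed.
        # The report typically ends with a recommendation on whether a defrag is needed.
        print(analyze_volume("D:"))

    If the analysis says the volume doesn't need defragmenting, you can skip the whole exercise, which on a nearly empty external drive will usually be the case.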
     
  16. zfactor

    zfactor Registered Member

    Joined:
    Mar 10, 2005
    Posts:
    6,102
    Location:
    on my zx10-r
    Well, we agree to disagree then. The response was in their forums a while back; I still had it in an email because I got an email for the thread update. It doesn't show the whole thread, though. I also don't agree that all defrag programs are a waste. I personally don't like the built-in one; I like having the placement options of PerfectDisk, as well as being able to see which files are located where. And my Win7 defrag is disabled by me, so it is not running.
     
  17. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    Well, certainly if you don't like the "basic" defragger built into Windows, then by all means use another, if your drives need defragging. I am not saying the alternatives are bad. Windows' native defragger is not meant to be anything more than a basic defragger. I am just saying, if your drives need defragging, Windows' basic defragger (in all versions of Windows) is all you need.

    I should also note that if your drives regularly need defragging, that typically indicates you need more disk space.
     
    Last edited: Jun 10, 2012
  18. Brian K

    Brian K Imaging Specialist

    Joined:
    Jan 28, 2005
    Posts:
    12,146
    Location:
    NSW, Australia
    I don't defrag images. I see no point. It is very time consuming and the images wouldn't work any better. I've restored over 10 thousand images using various apps and I've never experienced a failed restore.
     
  19. Seer

    Seer Registered Member

    Joined:
    Feb 12, 2007
    Posts:
    2,068
    Location:
    Serbia
    I always split images into parts of at most 700 MB (sometimes even 100 MB) just because it's easier to "fit" many small files on a drive than a single large one. This way the fragmentation of the image parts and the defragmentation times are reduced.
    I generally tend to avoid working with large files if at all possible.
     
  20. MerleOne

    MerleOne Registered Member

    Joined:
    Mar 6, 2006
    Posts:
    1,336
    Location:
    France
    Hi,
    If you use Auslogics Disk Defrag and specify "don't defrag segments over 100 MB", you will partially defrag big files in a reasonable amount of time.
     
  21. Keatah

    Keatah Registered Member

    Joined:
    Jan 13, 2011
    Posts:
    1,029
    My recommendation for defragging a drive hosting your backup files is: don't. Let me explain it several ways, and at the same time present a solution for those who absolutely must have a defragged image. Perhaps someone will learn something along the way.


    Begin TL;DR version:
    The purpose of defragging spinners (SSDs need not apply) is to reduce the number of head movements required to read a file. This is good for tweaking a system for optimum performance on your main boot drive, and for repetitive file access over a short period of time. The role of a backup drive is different. It is there to save your ahss from shelling out thousands of dollars to a data recovery corporation when your system gives up the ghost. We are asking for reliability, not speed.

    To the point: there isn't a need to defrag a drive holding your backup image files. You can, if it makes you feel all warm and fuzzy inside, but it's a waste of time and adds a little risk to the integrity of the image being defragged. Understand that a drive hosting your backup images is not used for performance; it is used for safety and security. That is its purpose. So let us approach the question from that perspective, and operate in a fashion that supports the drive in its intended role of being a safe haven in times of disaster.

    When crunch time comes and you're hanging on the edge of your seat, all you want to do is get the system operational and be assured your precious data is safe. Especially that big pr0n collection, eh? The 20 seconds you could possibly save (by spending hours defragging backup images) isn't gonna mean Jack during a restore.

    Furthermore, when reading, writing and rearranging data on a drive during a defrag operation, there is a small risk that something could go wrong. A bit might flip here and there, either from a blip in the ECC onboard the drive or in the operating system's read/write routines. A random flip from an intermittent memory cell that fails under certain circumstances can do your whole image in. Unlike .BMP graphics files or MS Word files (to pull some examples out of my ahss), which are tolerant of errors to some degree - you won't notice a mis-shaded pixel, spell check will catch misspelled words, and so on - backup images are not tolerant of any mistakes. Every bit needs to be present, and every bit needs to pass through some thorough error checking.

    Backup imaging programs are notorious for failing if one (just ONE) of those bits is out of place. And rightfully so! If you have problems with restoring images or verifying them, I'd suspect a hardware issue straight away.
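    One simple habit that helps here is to record checksums of your image files right after creating them and re-check them later, say after a defrag or before a restore. This is only a sketch of the idea, independent of (not a replacement for) whatever verify step your imaging program has; the .img extension, folder and manifest names are just examples.

    import hashlib
    import json
    from pathlib import Path

    def sha256_of(path, chunk_size=1024 * 1024):
        """Hash a file in 1 MB chunks so huge images don't need to fit in RAM."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    def record_checksums(image_dir, manifest="checksums.json"):
        """Store a name -> SHA-256 map for every image file in the folder."""
        sums = {p.name: sha256_of(p) for p in Path(image_dir).glob("*.img")}
        Path(manifest).write_text(json.dumps(sums, indent=2))

    def verify_checksums(image_dir, manifest="checksums.json"):
        """Re-hash each file and report whether it still matches the recorded value."""
        sums = json.loads(Path(manifest).read_text())
        for name, expected in sums.items():
            actual = sha256_of(Path(image_dir) / name)
            print(f"{name}: {'OK' if actual == expected else 'MISMATCH'}")

    if __name__ == "__main__":
        record_checksums("E:/images")    # hypothetical folder holding the images
        verify_checksums("E:/images")    # run again later, e.g. after a defrag

    If a hash changes and you haven't touched the file, something on the hardware or filesystem side flipped bits on you, which is exactly the silent failure you don't want to discover at restore time.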

    Reading, writing, moving, copying and reorganizing these image files simply exposes them to more error-inducing conditions, that's all. Not to mention the wear and tear of a 10-hour marathon defrag session if it's over a slow USB connection. During all this, your system must operate error free, 100%.

    If you get anal about it and have to do "something", a good thing to do is delete some old images (keeping one around for safety) and just write new ones. Have the imaging software break them down into smaller chunks of a few hundred MB or so as it creates them. This is a good thing and in no way compromises the safety of the image.

    Furthermore, a huge multi-GB image file is more difficult to defrag; some defraggers choke on them from the get-go.

    One other thing you can do, if you like, if it makes you happy, is to partition the big 2TB backup-hosting drive into some smaller ones. Know the history and expected size of your images and proceed accordingly. This way you can erase old images and the imaging program will see a blank recipient drive and produce images with only a couple of fragmented files, if any at all! Especially if you split the images into, let's say, 256 MB chunks.

    What I like to do that has a real benefit in both speed and performance is to run a cleaner program that has been customized to my system - CCleaner, for example. That is my main "optimizer" program. I do a lot of work that produces tons of temp files, so after a few months CCleaner can remove gigs upon gigs of unneeded files. This results in measurably smaller backups, which do indeed go faster. That's all I recommend running, aside from defragging your mechanical drives.

    I believe (but have to test) that CCleaner will allow an SSD to better prepare itself for setting up contiguous free space, simply through deleting unneeded ~ Snipped as per TOS ~ files. SSDs like that. Any storage device likes that. TRIM and the SSD's garbage-collection routines like that. And wear leveling likes that, because there are more empty spots to work with.

    I like Ultimate Defrag for the file placement and zoning options. I like putting all the OS and application files into one small area, thus reducing head movements a good deal. Once in a while I give that a whirl too. But never on an SSD!


    Continuing on with a story:
    To anyone who has experienced a failed backup image: the first thing to check is your system's memory; run Memtest86 or something like it. I once had a system that would periodically corrupt a .zip file during a copy operation, yet defragging the same file or sending it over the internet was not an issue. Just copy operations would *sometimes* **** it up. I got frustrated with this seemingly innocuous random failure and turned to hardware. The drive checked fine. The fans were blowing nicely, the CPU had good heatsink paste, and the connections to everything internal and external seemed good. And Memtest86 found a failed bit in a module. 1 frakking bit!
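    For illustration only, here is what a memory pattern test does, boiled down to a Python toy: fill a big buffer with known bit patterns, read it back, and count mismatches. A user-space script like this can't touch memory the OS has reserved and won't catch the subtle pattern-sensitive failures described below, so it is no substitute for booting Memtest86; it just shows the idea.

    import array

    def pattern_pass(num_words=16 * 1024 * 1024, pattern=0x55555555):
        """Fill a buffer with one 32-bit pattern, read it back, count mismatches."""
        buf = array.array("L", [pattern]) * num_words   # "L" items are at least 4 bytes
        return sum(1 for w in buf if w != pattern)

    if __name__ == "__main__":
        # Alternating and solid patterns, similar in spirit to what real testers walk through.
        for p in (0x55555555, 0xAAAAAAAA, 0x00000000, 0xFFFFFFFF):
            errors = pattern_pass(pattern=p)
            print(f"pattern {p:#010x}: {errors} mismatched words")

    On healthy hardware every pass reports zero mismatches; a real tester like Memtest86 runs many more patterns, addresses all of RAM from outside the OS, and repeats for hours to shake out the intermittent stuff.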

    Here is a story if you wish to continue reading. If not, then just consider that if you have bad images that won't restore, you may have a RAM problem, or something pissing off the RAM, like noisy capacitors or power supplies.

    Story begins:
    A user working through the wintry evening on mathematical equations describing the path a spacecraft would take as it cruised the solar system was up in arms. After days of trying to determine why the simulation she was running would sometimes allow the craft to land on target and other times miss by hundreds of thousands of miles, she gave up.

    Dejected and demoralized she passed the code off to a band of colleagues that had little difficulty in figuring out the problem. The next day they showed her the simulation and the craft was landing on target run after run, +/- 100 meters.

    More perplexing, they wondered why she couldn't find the cause. The error was glaringly obvious! So plain was the "mistake" that they thought it was a mis-timed April Fools' joke. Why in god's universe would her program be firing a mid-course correction long enough for complete propellant depletion, and at an angle such as to effect a plane change instead of a retrograde delta-V of, say, perhaps 2 m/s? Why? Because the RAM in which the simulation was running had 1 single bit that was intermittently bad. In this simulation the bad bit was in the routine that determined whether there would be a mid-course correction, and if so, for how long. The failed bit would **sometimes** make a sign error and invert the time the RCS was to fire: a few seconds or a few billion seconds.

    ---------------------------------

    Keep in mind, unlike a simulated spacecraft living in a desktop system, a real one would have several processors running checks against each other before doing maneuvers. So a failed calculation would be caught. This is how it works in real life.

    That's-a right! A single-bit error, one single bit, mind you, really sucks. It's like the hardware version of Flipper. Flipper is a virus I wrote that just kinda hangs out and looks like part of an application, but it randomly flips a bit here and there, in a random file of its choice, from time to time. But that's beside the point. Really insidious: sometimes the problem shows up, sometimes it stays hidden for months.

    Anyway, I had the fun task of tending to a sick machine that seemed to be mostly on a mission to corrupt files, sometimes, of a certain size (especially larger datasets), by flipping one bit in them. The failed bit in module #2 was discovered with Memtest86. I'm not saying it was simple. We had to cycle the machine for a few days to uncover the issue, and it was annoying because the problem would mostly show up at the end of the test.

    Imagine a tic-tac-toe board. Let us equate it to a 9-bit memory chip. All X's, good. All O's, good.

    Now, make it all X's except for the center square #5. Make that an O.
    After a few minutes the O might fade and become an X, sometimes.
    The complement is also true.

    Now, make it a random distribution of X's and O's. As long as there is a mixture of both X's and O's surrounding the center #5 element, the chip will be stable, provided the temperature and local background radiation levels are right. It can get worse: the center #5 cell can intermittently couple itself to any surrounding element, randomly. It doesn't have to be all at once, either.
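    Here is a toy model (nothing more) of that marginal center cell, just to show what a pattern-sensitive failure looks like. The 30% flip probability and the grids are invented numbers for illustration.

    import random

    def read_cell(grid, flip_probability=0.3):
        """Return the center bit of a 3x3 grid, occasionally corrupted by its neighbors."""
        center = grid[1][1]
        neighbors = [grid[r][c] for r in range(3) for c in range(3) if (r, c) != (1, 1)]
        # If every neighbor disagrees with the marginal center cell, it sometimes "fades"
        # to match them; with a mixed neighborhood it stays put.
        if all(n != center for n in neighbors) and random.random() < flip_probability:
            return 1 - center
        return center

    if __name__ == "__main__":
        stressed = [[1, 1, 1], [1, 0, 1], [1, 1, 1]]   # all X's around a lone O
        mixed    = [[1, 0, 1], [0, 0, 1], [1, 0, 0]]   # mixed neighborhood, stable
        for name, grid in (("stressed", stressed), ("mixed", mixed)):
            flips = sum(read_cell(grid) != grid[1][1] for _ in range(10_000))
            print(f"{name}: {flips} bad reads out of 10000")

    The point of the toy is that whether the bad cell misbehaves depends on what happens to be stored around it, which is why these failures come and go with workload, temperature and luck.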

    In the big picture, this system would (when not running the simulation, just a Windows desktop) corrupt some larger files when doing file transfers or other disk operations. And since the bit seemed to sit near the edge of (and between) the OS code that controls the reading, writing and in-memory storage of said file, depending on what Windows was doing at the moment, a file could sometimes pass through unscathed. Since this was a Windows XP system, the problem remained relatively contained to file corruption; no other symptoms showed up, like sound or graphics issues.

    If this had been a Windows 7 system, there could have been problems all over the place, because Win7 loads itself into memory differently every time. In an attempt to avoid some low-level malware hacks, Win7 will sometimes load graphics stuff here, sound stuff there, and filesystem routines in yet a different spot, different each and every time you start your computer. So this failing RAM bit might sometimes handle code that does graphics, or sound, or mouse, or printing operations. You'll just never know!

    Back to XP: in contrast, XP tends to keep a memory map that's not as dynamic. On this system, the code that handled disk ops and scratchpad memory (for file transfers) happened to be making use of the failed bit, so the scope of errors was limited to file corruption when copying or creating files.

    On Linux, this could have been almost a non-issue, because, assuming the testing was accurate, the bad locations could have been mapped out and simply excluded from the memory map.

    This is something that a lot of failsafe & mission critical computers do, they will test their own memory and automatically map the failed address off-limits. The systems onboard a spacecraft will do this in conjunction with mission control.

    SSDs, like standard hard disks, have been doing this for years too. All that's left is to do it for RAM. Yehp! Your $4,000 gaming rig can fail due to one single marginal bit. And that failure can be intermittent.

    Single-bit errors can...
    Make you swap graphics card drivers 10 times.
    Make you reload your recalcitrant applications till the cows come home.
    Make you swap a motherboard and power supply, and remove all your add-in cards.
    Make you check heatsink paste and spend hours dicking with BIOS options.
    Make you read up on and replace bad capacitors.
    We're not done yet... make you methodically patch and update your OS and apps.
    Make you post to message boards and try ridiculously non-applicable suggestions.
    Make you check cables and clean contacts.
    Make you think you've tried everything and reload the OS from scratch.
    Keep you up all night long.
    Make you spend hours swapping and exchanging all sorts of hardware.
    Make you start thinking about driving across town to get that tertiary offsite backup you just updated last week, because your local backups from yesterday are bad and will not restore!
    Make you call tech support and bitch about how bad their service & support is, especially on that $5,000 gaming PC you ordered with dad's money.

    Single-bit errors can be the most insidious form of failure, especially when they show up only in a certain temperature range and then become intermittent. If it's too hot or too cold, the memory works fine; but in a certain temperature range, the metal gate in a microchip can become leaky, interact with the surrounding cells and cause a bit to flip. Read the technical section of the Memtest documentation to see how subtle some of these failures can be.

    And backup/imaging programs rely on this memory being 100% functional every moment.

    http://www.memtest86.com/
    http://fgouget.free.fr/misc/badram.shtml
    http://rick.vanrein.org/linux/badram/
    http://en.wikipedia.org/wiki/Dynamic_random-access_memory
    Error detection and correction - Wikipedia, the free encyclopedia
    http://it.slashdot.org/story/10/06/24/2210214/tracking-down-a-single-bit-ram-error
     
    Last edited by a moderator: Jun 20, 2012
  22. twl845

    twl845 Registered Member

    Joined:
    Apr 12, 2005
    Posts:
    4,186
    Location:
    USA
    Keatah, Thank you for your excellent lesson. I can now be confident in my imaging on my external drive. :)
     
  23. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    Ummm, no. Not true. SSDs don't care about contiguous free space. This is exactly why defragging an SSD is not necessary. They don't have motors or R/W heads that need to jump around. It takes no more time or effort to retrieve the adjacent chunk of data than it does to retrieve data from the far side of the SSD.

    Ummm, not agreeing with this either. Got a link to substantiate that?

    If Windows 7 changes how it loads every time, it is because it is tweaking itself, via prefetch and superfetch, based on how you, the user, use your computer so that Windows 7 can speed up getting ready for your next session. It is not for security reasons that it changes.
     
    Last edited by a moderator: Jun 20, 2012
  24. Keatah

    Keatah Registered Member

    Joined:
    Jan 13, 2011
    Posts:
    1,029
    SSDs care big-time about contiguous free space within a block of data. This block may be 512 KB, more or less. When an SSD is nearly full and you are working it hard with a lot of updates, it is important to have completely free blocks available. You don't want the drive to be erasing and prepping blocks and doing garbage-collection ops at the same time you're writing files.

    Where these blocks are located is irrelevant. The important thing is that there are entire blocks free. This is the contiguous space I'm referring to. Over time, as SSDs come out of the beta-test phase and morph into a real consumer product, this will become less and less of a problem. The links below explain what's going on; the third article is a good read.
    https://wiki.archlinux.org/index.php/Solid_State_Drives
    https://wiki.archlinux.org/index.php/SSD_Memory_Cell_Clearing
    http://thessdreview.com/ssd-guides/optimization-guides/ssd-performance-loss-and-its-solution/

    Clearing up free space will make the internal garbage-collection process run smoother. No "defrag" utility can affect how the blocks are managed internally. The best you can hope for is to free up space and thus "encourage" the drive to do its own optimization by itself.

    As a side note: manufacturers won't tell you this, but it is recommended to fill an SSD to only perhaps 75% of capacity.
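    If you want a quick check against that rough 75% guideline (the threshold is the guideline mentioned above, not a manufacturer spec, and the drive path is just an example), something like this works:

    import shutil

    def percent_used(path):
        """Return how full the volume containing 'path' is, as a percentage."""
        usage = shutil.disk_usage(path)
        return 100.0 * usage.used / usage.total

    if __name__ == "__main__":
        drive = "C:\\"          # example path; point it at the SSD's mount point
        used = percent_used(drive)
        print(f"{drive} is {used:.1f}% full")
        if used > 75:
            print("Above the ~75% guideline; consider freeing some space.")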


    Regarding how Windows 7 loads itself - please do some research:
    http://blogs.technet.com/b/askperf/...der-and-address-space-load-randomization.aspx
    http://blogs.msdn.com/b/michael_howard/archive/2006/05/26/608315.aspx
    http://en.wikipedia.org/wiki/Address_space_layout_randomization
    http://social.technet.microsoft.com...ut randomization&x=0&y=0&refinement=1002&ac=8
     