Incremental not incremental

Discussion in 'Acronis True Image Product Line' started by misterz, Dec 31, 2006.

Thread Status:
Not open for further replies.
  1. misterz

    misterz Registered Member

    Joined:
    May 12, 2004
    Posts:
    11
    I just completed an incremental backup. It was brutally long: about 18 hrs. :mad: I realize from reading other threads that the disk defrag I did with System Mechanic might account for an incremental file 10x the size of the most recent one. BUT that doesn't explain why, right after I rebooted the computer following the completion of THAT backup, the program estimates the next scheduled incremental backup will take 8 hours o_O . For the record, I'm using XP Pro, 2.6 GHz, backing up to a Buffalo LinkStation via an Ethernet cable.

    Help appreciated :thumb:
     
  2. Highplain

    Highplain Registered Member

    Joined:
    Dec 31, 2006
    Posts:
    4
    Is this a new installation of TI? What version?
    In any case, there is no way it should take that much time, defrag or not.
    Have you booted from the TI rescue disk to see if that external drive is visible? How long does the backup take in the boot-disk environment, if you can see the drive? You might also consider adding another internal drive, if possible. HDD space is pretty cheap these days.
     
  3. misterz

    misterz Registered Member

    Joined:
    May 12, 2004
    Posts:
    11
    TI v. 8. I've not tried the bootdisk approach. I don't want to use another internal drive; I use the Buffalo to back up my whole network. I'm thinking of starting another full backup with incrementals to follow. I'm not having this trouble with the other computers on the network. For some reason, TI is not registering the files already backed up, so the next incremental doesn't recognize them.
     
  4. Brian K

    Brian K Imaging Specialist

    Joined:
    Jan 28, 2005
    Posts:
    8,634
    Location:
    NSW, Australia
    misterz,

    I'd be interested in hearing times from people who use incremental images with TI. I don't use incrementals on my computer with TI installed. I do use daily incrementals on another computer (using different imaging software) and the incremental images take 25 seconds to write and verify.
     
  5. Colvin

    Colvin Registered Member

    Joined:
    Aug 13, 2004
    Posts:
    75
    Location:
    USA
    I don't know what it is you're wanting to hear from those of us who do use incremental backups. What I can tell you is that my routine is to create a full backup and then 3 weekly incrementals. An incremental backup of a drive with 19 GB of data to an external eSATA HDD takes about 1.5 to 2 minutes.
     
  6. Brian K

    Brian K Imaging Specialist

    Joined:
    Jan 28, 2005
    Posts:
    8,634
    Location:
    NSW, Australia
    Colvin, that's exactly what I wanted to hear. Your times are comparable to mine. And yet we see reports of hours to do an incremental backup. Strange.
     
  7. shieber

    shieber Registered Member

    Joined:
    Oct 27, 2004
    Posts:
    3,710
    18 hrs.? Were you trying to back up about a terabyte? You ought to get roughly 1 GB per minute when backing up, unless you're writing to optical media or over USB 1.x.
     
  8. layman

    layman Registered Member

    Joined:
    May 20, 2006
    Posts:
    217
    You're not stopping to think through how to go about using this tool. As you've already suspected, it makes no sense to make a delta (either incremental or differential) of a disk image once the disk has been defragged. A delta records all the changes in the disk's real estate, so if there is massive change (as there would be following a defrag) the delta will be massive. In the case of incremental imaging, the tool must recreate the state of the disk at each increment by applying the chain of incrementals to the original image. In the case of your humongous incremental, that will take a very long time, which is why you got the long estimate. Note also that a long chain of incrementals is itself time-consuming. Unless a disk is almost static, you can't chain very many incrementals before the process becomes counter-productive. I've found 3 or 4 incrementals to be the practical max for a normally volatile disk.
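The defrag effect described above can be sketched in Python. This is a toy block-level model, not Acronis's actual on-disk format: a sector-level incremental stores only the blocks that differ from the previous state, so a defrag that relocates data inflates the delta enormously even though the file contents are unchanged.

```python
import random

def incremental(prev_blocks, curr_blocks):
    """Map of block index -> new content, for every block that changed."""
    return {i: b for i, (a, b) in enumerate(zip(prev_blocks, curr_blocks))
            if a != b}

random.seed(42)
disk = [f"block-{i}" for i in range(1000)]

# Normal day: a handful of blocks change -> tiny incremental.
after_edits = list(disk)
for i in (3, 17, 42):
    after_edits[i] = after_edits[i] + "*"
print(len(incremental(disk, after_edits)))       # 3 changed blocks

# Defrag: same data, relocated -> almost every block "changed".
defragged = list(after_edits)
random.shuffle(defragged)
print(len(incremental(after_edits, defragged)))  # close to 1000
```

The second delta is hundreds of times larger than the first even though not a single byte of user data was modified, which is the pattern misterz saw after running System Mechanic.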
     
  9. misterz

    misterz Registered Member

    Joined:
    May 12, 2004
    Posts:
    11
    Afraid I'm not following. You seem to be describing the scenario of a differential rather than an incremental, that is, a backup that incorporates all previous changes. I know you know you're talking about an incremental, but your description sounds decidedly differential. Also, 3 or 4 incrementals as a practical max is simply not so. My routine is to do an incremental every day. I have a number of computers on my network; most of the collections of incremental backup files number in the hundreds, and they work just fine on restores.

    Just to be clearer: I've been trying to do incremental backups of an 80 GB drive with approx. 20 GB of data. Following a defrag, the time for the incremental was about 18 hrs. That was understandable. But IMMEDIATELY following that backup, the estimated time was 8 hrs, which doesn't sound at all logical because there were virtually NO file changes after the last incremental backup.

    At this point I've decided to do another full backup (20 GB data / 80 GB HD) and then see what the estimates will be for subsequent incrementals. The estimate for this full backup (BTW) is about 1.5 days. Certainly not even close to what's being reported above. I don't know if this is accounted for by the fact that I'm backing up to a Buffalo LinkStation rather than another HD attached to the motherboard.
     
  10. Brian K

    Brian K Imaging Specialist

    Joined:
    Jan 28, 2005
    Posts:
    8,634
    Location:
    NSW, Australia
    Sounds like Differential to me too.
     
  11. Brian K

    Brian K Imaging Specialist

    Joined:
    Jan 28, 2005
    Posts:
    8,634
    Location:
    NSW, Australia
    misterz,

    Do you have other partitions on your HD to send the image? If not, as a test could you write the image to your C: drive. Yes, it will work. This should take about 30 minutes and will tell you if the Buffalo Link Station is a bottleneck. Delete the image when it's complete or move it to your Buffalo Link Station.
     
  12. misterz

    misterz Registered Member

    Joined:
    May 12, 2004
    Posts:
    11
    Very good idea.
     
  13. layman

    layman Registered Member

    Joined:
    May 20, 2006
    Posts:
    217
    Nope. The point you seem to be missing is that there is processing overhead inherent in incremental imaging. With differentials, each differential captures the delta from the original full image. Incrementals, however, capture the delta from an immediate predecessor. So, in order to do incremental imaging, the software must work forward from the full image, recreating each successive state. The longer the train of incrementals, the greater this overhead. Of course, the significance of the overhead will depend on how volatile the contents of the disk are. If the disk changes little, it will be practical to chain many incrementals to a full image, but not if the disk changes significantly. The advantage of incremental backups, both in terms of disk space and processing overhead, diminishes. Indeed, the "advantage" goes negative at some point. You have to recognize that there's a point where it's "cheaper" to simply do a full image again.
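The replay overhead layman describes can be illustrated with a small sketch (a hypothetical model, not TI's documented internals): before it can diff anything, the software reconstructs the last-known disk state by applying every incremental, in order, on top of the full image, so the work grows with the length of the chain.

```python
def reconstruct(full_image, chain):
    """Replay each incremental's {block_index: content} patches in order,
    newest last, to recover the state at the time of the last increment."""
    state = list(full_image)
    for inc in chain:
        for idx, content in inc.items():
            state[idx] = content
    return state

full = ["a", "b", "c", "d"]
chain = [{1: "B"}, {2: "C"}, {1: "b2"}]  # three successive incrementals
print(reconstruct(full, chain))  # ['a', 'b2', 'C', 'd']
```

Note that block 1 is patched twice; only the latest version survives, and every incremental in the chain had to be visited to know that.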
     
  14. misterz

    misterz Registered Member

    Joined:
    May 12, 2004
    Posts:
    11
    Indeed, your answer would account for the very large expenditure of time and file space of the incremental immediately after a defrag, but not for successive incrementals. Moreover, my own anecdotal experience with collections numbering in the many hundreds of incrementals is that the size and duration of the incrementals fluctuate upwards and downwards constantly. This is consistent with the definition of incrementals: their size and creation time are proportional to the amount of change (delta) that occurred since the last incremental. Were your theory operative, the size and duration would remain permanently enlarged from then on. That has not been my experience.

    For your benefit, I've attached an image of part of a directory of 36 successive incremental backups. During this period a defrag was conducted. As you can see, there is considerable variability in the sizes of the incremental files. This, of course, is as it should be: proportionate to the changes (delta) since the preceding incremental, and not impacted by the defrag (as mine might currently be). Moreover, if, as you say, "in order to do incremental imaging, the software must work forward from the full image, recreating each successive state," changes would be constantly occurring to the entire collection of previously created incrementals. That also is not my experience.

    Thank you for your help, though.
     

    Attached Files:

  15. Doug_B

    Doug_B Registered Member

    Joined:
    Nov 10, 2005
    Posts:
    120
    Location:
    Central New Jersey
    I think what layman is getting at, in general, is that when backup software (ATI or other) is making an incremental and needs to determine whether or not a media unit (sector or whatever) has changed since the last incremental, it needs a baseline version of each media unit (or equivalent info; see end of post for more on this) for comparison purposes. Thus, it needs to keep going back in the list of incrementals already made until it finds the most recent version of that media unit, which it then compares to the media unit in the current source for changes.

    For example, if one full backup and one subsequent incremental have already been made, and this one incremental has few changes relative to the full, then the backup software must use the full backup as the baseline for comparison for most of the media units. If that incremental has many changes since the full backup, then the software can use the incremental as the baseline for comparison for more of the media units.

    For an incremental made after a defrag, the incremental will generally be large, especially if it's been a while since the previous defrag and the media had a lot of activity in between defrags. So one would expect the process of making yet another incremental, shortly after the post-defrag incremental, to actually take less processing time. However, I would think that the processing time would not be very different, since the software must find a baseline for every media unit anyway. I would think the number of incrementals that the backup software must look through will have a larger impact on the processing time.

    For a differential, the full backup is used as the baseline for comparison for all of the media units.

    Note that for both differential and incremental, the writing process of the next incremental/differential is the same, i.e., media units that are changed are written.

    Of course, each software app may have different mechanisms of doing this and define different "media units" (and possibly multiple media units, e.g., sectors and files) to track. Also, an application could include "shorthand" information in an incremental on the contents of media units that don't change (I'm thinking checksums and such), but I actually have no clue if this is or can be done (and if such mechanisms incur some probability of error).

    So am I on or off base here?

    Doug
     
  16. misterz

    misterz Registered Member

    Joined:
    May 12, 2004
    Posts:
    11
    Doug,
    Appreciate your response. Still not following, though. An incremental backs up what's changed since the previous incremental backup. It seems it should only need to go to my HD and check whether a unit's been backed up: if not, back it up; if so, move on. You seem to say that the program is searching through previous backups (not the actual data/units on the drive). You also seem to imply that regardless of the quantity of data/units needing backup, it is the previous incremental backups that determine the size and duration of succeeding incrementals. That, quite definitely, is not my experience.
    I do understand that a defrag (in the case of imaging, as opposed to conventional backup software) would cause an incremental to see a unit as needing backing up even though the data content has not changed (i.e., a Word document that has not been accessed or changed will still be selected for backup). But once that unit has been backed up, and both the drive and the incremental register that (thus, a new baseline), the program should not need to go far to verify that it needs no further backing up. The matter should be settled right then and there in the most recent incremental. Which leads us back to the question: why, after an incremental backup done following a defrag, should an incremental with virtually no changes from the preceding incremental (I just rebooted) be vastly larger and more time-consuming than the incrementals prior to the defrag? Frankly, I think the program failed to mark units/data as backed up after the incremental that followed the defrag, and so went and backed it all up again. Why that's so is the question.
     
  17. Doug_B

    Doug_B Registered Member

    Joined:
    Nov 10, 2005
    Posts:
    120
    Location:
    Central New Jersey
    misterz,

    As to the problem itself, I don't think the "normal" behavior of the program, whatever that is, will help to figure out the behavior you're observing, as the backup times you report seem to be way too long altogether. My lengthy discourse is more relevant to normal behavior.

    The backup software needs to determine what has changed if performing a differential or incremental backup. How it does that can differ based on what the OS, technology, method of backup, etc., offer as methods. For file-based backups, we have archive bits, dates, and other file attributes that could be used. A byte-by-byte comparison can also be used, as could a method that streamlines such a comparison, such as doing a computation on the bytes and comparing to a previously computed value that is stored. I'm not sure what avenues are available for an image-based backup other than byte-by-byte comparison, but whatever is done, you need a representation of the prior state stored somewhere as a basis of doing the comparison.
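The "computation on the bytes" idea mentioned above can be sketched with per-unit digests. SHA-256 here is an assumption for illustration; whatever TI actually stores is unknown. The point is that a stored hash per media unit lets the next run detect changes by re-hashing, without keeping a full copy of the prior contents for byte-by-byte comparison.

```python
import hashlib

def digest(block: bytes) -> str:
    """Condensed representation of a media unit's contents."""
    return hashlib.sha256(block).hexdigest()

units = [b"unit-0", b"unit-1", b"unit-2"]
stored = [digest(u) for u in units]   # saved alongside the last backup

units[1] = b"unit-1-modified"         # one change since that backup

# Next incremental: re-hash each unit and compare to the stored digest.
changed = [i for i, u in enumerate(units) if digest(u) != stored[i]]
print(changed)  # [1]
```

As Doug notes, such a scheme trades storage and comparison time for a (vanishingly small, with a cryptographic hash) probability of missing a change.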

    I am inferring from the beginning of your last post (i.e., "...Seems it should need to go to my HD and check to see if the unit's been backed up") that you believe that for an image-based backup, there is status data stored as part of the media unit / sector or somewhere else on the disk that tells the program whether each media unit was backed up, and if any changes occur to that media unit since, that status is cleared. Sounds a lot like an Archive bit, but at a media unit level instead of a file level. I don't know enough to say if this is true or not. If this is in fact true (which would surprise me), then software can make use of this to streamline image-based differential / incremental backups. Without this, though, software would have to do a byte-by-byte comparison or some comparison of computations (current versus prior stored).

    This example hopefully makes the image backup situation clear:

    I have 20 units of data on my disc. I back all of them up. Two days later, I make an incremental backup, which results in units 1-4 being backed up, as these units were changed. The backup software needed some mechanism to determine that units 1-4 had changed - status data, complete comparison of current vs backup, or some condensed representation of such. Another 2 days pass and it's time for another incremental. If the software is doing byte-by-byte comparison, it need only look to the previous incremental backup file for comparing units 1-4, but it would need to look at the full backup made 4 days ago to determine if units 5-20 changed, since units 5-20 are not represented in the previous incremental. More previous incremental backups in the chain mean more hopping back in the backup chain to find the most recent version of each unit from which to compare to the current state - unless there is a different scheme employed that keeps track of changes at the media unit level, as mentioned above.
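The 20-unit example above can be coded as a small sketch (again a hypothetical model of the lookup, not TI's implementation): to find the comparison baseline for each unit, scan the chain from the newest backup backwards; units 1-4 resolve from the incremental, while units 5-20 fall back to the full backup.

```python
def baseline_source(unit, chain):
    """Return the name of the newest backup in the chain that holds `unit`."""
    for name, backup in reversed(chain):
        if unit in backup:
            return name
    raise KeyError(unit)

full = ("full", {u: f"v0-{u}" for u in range(1, 21)})  # all 20 units
inc1 = ("inc1", {u: f"v1-{u}" for u in range(1, 5)})   # units 1-4 changed
chain = [full, inc1]

print(baseline_source(3, chain))   # inc1
print(baseline_source(10, chain))  # full
```

With a longer chain, the backwards scan for units that rarely change has to hop through more and more incrementals, which is the growing overhead Doug describes.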

    Doug
     
  18. layman

    layman Registered Member

    Joined:
    May 20, 2006
    Posts:
    217
    Doug B gets the general point. The software can't just go back to the HD to determine if a unit has been backed up. Unlike with some kinds of disk operations, there's no "dirty bit" here to tell the software what has or hasn't changed since the previous increment. The only way it can determine what has changed is to (in effect) recreate the previous increment (or, more correctly, the state of the disk at the time of the previous increment). There are various algorithms that could be used to do this, some more efficient, some less, but all entail an overhead that grows as the number of increments stacks up. Your puzzlement over the varying sizes of the increments shows that you don't yet understand the mechanics of incremental imaging.

    As for poor performance times, I've seen huge differences in the time required to image the same amount of data. Aside from differences in hardware performance, not all drivers are created equal.
     
  19. Brian K

    Brian K Imaging Specialist

    Joined:
    Jan 28, 2005
    Posts:
    8,634
    Location:
    NSW, Australia
    And different applications do it differently. With Ghost 9 the incremental sizes really do reflect recent changes. For example, if you create an incremental image about 1 minute after a previous incremental then the second incremental is around 500 KB in size. If you don't use the computer then the daily incrementals are around 10 to 20 MB.

    Running defragmentation apps markedly changes incremental size with all imaging apps. Windows can move data around even without running a defrag, and this can make for large incrementals.
     
  20. misterz

    misterz Registered Member

    Joined:
    May 12, 2004
    Posts:
    11

    I don't suppose you'll be able to offer solutions to the problem presented, will you? Working in the field, I presume :D

    Anyway, I've reached out to Acronis tech support who've offered the following steps as part of the initial assessment phase. There's a lesson there I think, don't you? First assessment. Then diagnosis. :eek:

    When I get the chance to run through this, I'll get back to all of you with the outcome.

    Hello Mark,

    Thank you for your interest in Acronis True Image http://www.acronis.com/homecomputing/products/trueimage/

    Please accept our apologies for the delay with the response.

    1. Could you please specify exact title of Acronis True Image you have used to perform the backup tasks?
    2. Specify the storage device (manufacturer, interface etc)?
    3. Please send us the following diagnostic information:
    3.1. Plug in the storage HDD, download the Acronis Report utility available at http://download.acronis.com/support/AcronisReport.exe, run it, create a report and send it to us. Please compress the Acronis Report output file into an archive (e.g. with WinZip or WinRAR) and attach it to your message by browsing for the archive.
    Point us to the source HDD and the HDD where you have kept the backup archives.
    3.2. Download the http://download.acronis.com/support/schedmgr.exe file;
    - Start the Command Prompt from Start -> Programs -> Accessories menu;
    - Run the following command in the folder you saved the file to:

    schedmgr get report > schedreport.txt

    - Send us the created schedreport.txt file.
    3.4. Please run the schedmgr.exe program

    - Enable logging by using the following command:

    set logflags support

    - Please reproduce the problem and send us the schedul2.log file which is placed to the same folder as the service file (program files\common files\acronis\schedule2)

    It is recommended to turn off logging after troubleshooting by using the following command (from the schedmgr.exe command prompt):

    set logflags 0
    set lf_registry on

    3.5 Could you please send us the log file of Acronis True Image.
     