CrashPlan deep maintenance of local archives

Discussion in 'backup, imaging & disk mgmt' started by peterk62, Sep 15, 2014.

  1. peterk62

    peterk62 Registered Member

    Joined:
    Feb 10, 2009
    Posts:
    51
    I have been trying the CrashPlan backup client for backups to a NAS and a USB drive. Overall I was happy with the way the software works until I triggered a "deep maintenance" cycle. The problem is that deep maintenance, specifically the "deep pruning" phase, takes forever even though CPU and network usage seem very low while it is running. For my testing I used a mapped network drive on a machine with a 2.66 GHz Core 2 Quad. Actual backup speed was very good; it is the maintenance performance that I find unacceptable.

    As a test, I created a fresh backup archive about 20 GB in total size. Immediately after creating this archive, I used the "Compact" option in the GUI to trigger deep maintenance. This took about an hour even though the archive should have had no previous versions that needed pruning. There were long stretches when the progress counter would advance by only 0.1%, and during this time both CPU and network usage were close to zero. When I increased the backup archive size to 200+ GB, deep maintenance took several hours, again with long periods when the process didn't actually seem to be doing anything.

    According to the documentation (http://support.code42.com/Administrator/3.6_And_4.0/Monitoring_And_Managing/Archive_Maintenance), the difference between "deep" and "shallow" maintenance is that deep maintenance verifies block checksums and compacts archives, whereas "shallow" maintenance only checks for file corruption. By default, shallow maintenance runs every 7 days and deep maintenance runs every 28 days, though these intervals can be increased.
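
    To illustrate the difference as I understand it, here is a rough Python sketch. This is not CrashPlan's actual code - the block size, archive layout, and checksum algorithm are all guesses on my part - but it shows why a deep pass has to read and hash every byte while a shallow pass can get away with much less:

        import hashlib
        from pathlib import Path

        BLOCK_SIZE = 4 * 1024 * 1024  # guess; the real block size is undocumented

        def shallow_check(archive_dir: Path) -> bool:
            # Shallow pass: just confirm every archive file can be opened
            # and read end to end (catches gross file-level corruption).
            for f in archive_dir.glob("*.data"):
                try:
                    with f.open("rb") as fh:
                        while fh.read(BLOCK_SIZE):
                            pass
                except OSError:
                    return False
            return True

        def deep_check(archive_dir: Path, index: dict[str, str]) -> list[str]:
            # Deep pass: recompute each block's checksum and compare it
            # with the stored index entry - every byte gets read and hashed.
            bad = []
            for f in archive_dir.glob("*.data"):
                with f.open("rb") as fh:
                    offset = 0
                    while True:
                        block = fh.read(BLOCK_SIZE)
                        if not block:
                            break
                        key = "%s:%d" % (f.name, offset)
                        if index.get(key) != hashlib.sha1(block).hexdigest():
                            bad.append(key)
                        offset += len(block)
            return bad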

    My question is, if I am not backing up to the cloud, what is the risk to my backups if the deep maintenance interval is increased to the point that it is effectively disabled? Would shallow maintenance be enough to catch potential file corruption?
     
  2. peterk62

    peterk62 Registered Member

    Joined:
    Feb 10, 2009
    Posts:
    51
    Some interesting results from some more testing:
    • I reinstalled the 32-bit client and the latest 32-bit Java 8 JRE. The 32-bit CrashPlan service uses a lot less memory than the 64-bit service. It looks like the 32-bit client is a better option as long as you don't need more than 2 GB of Java heap.
    • Performance on a fresh test backup set of ~28,000 files totalling 15.5 GB was better than in my last test - a deep maintenance cycle on the archive now took about 30 minutes.
    • There are still long periods during deep pruning when the CrashPlan service uses less than 1% CPU.
    Supposedly the version 4.0 client will be a native application instead of Java, but that has been coming "real soon now" for a long time...

    I'm still not sure if I want to keep using CrashPlan in free mode (daily automated backup is fine for me as long as I can trigger a manual backup on demand if required) or if I should just bite the bullet and get Genie Timeline Pro (which I have installed for a 30-day trial on another machine). I'm actually surprised that there are no other "Apple Time Machine" alternatives/clones for Windows.
     
  3. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
    You might want to take a look at Areca Backup.
  4. peterk62

    peterk62 Registered Member

    Joined:
    Feb 10, 2009
    Posts:
    51
    I have looked at Areca (and a lot of other backup tools), but I prefer software that performs sync with versioning rather than full plus incremental or differential backups. From what I have read on the website, Areca is in the second category, and although it implements archive-merge functionality to keep the number of archives in check, that functionality appears to be quite resource-intensive, particularly when the archives are stored on a NAS.
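
    If I understand the concept correctly, a merge boils down to something like this toy Python sketch (archives modelled as simple dicts mapping file paths to contents - Areca's real on-disk work is obviously far heavier, which is presumably why it is slow on a NAS):

        def merge(incrementals):
            # Collapse consecutive incremental archives into one: for each
            # file, only the newest stored version survives, so the total
            # storage used shrinks.
            merged = {}
            for archive in incrementals:  # oldest first
                merged.update(archive)    # later versions overwrite earlier ones
            return merged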

    What I have tried:
    • AutoVer: folder filters didn't work for me; issues with versioning Thunderbird mailbox files; the single developer no longer has time to work on the software, so it is in limbo, perhaps waiting to be open-sourced.
    • EVACopy: slow backup; the restore tool builds a flat list of all files in the backup, which is extremely inefficient with thousands of files (the dev admits it is probably not suited to this task); cannot copy open files.
    • PersonalBackup: backup over the network seems slow, but that isn't my main issue; cannot copy open files; the GUI is a bit too complex (exposes too many options), but at least it has a restore function.
    • SyncFolders: using this now, but it lacks a restore function. Although it can keep old versions of changed/deleted files for a specified period, there is no easy way to find out how many old versions of a specific file are in the archive.
    • SaftBackup: looks good on paper but crashes constantly when I try to run it from a standard user account.
    • Oops!Backup: looks good but appears to have been abandoned by the developers as they work on a different product.
    • FreeFileSync, SyncToy, SyncBack etc.: same general issue as SyncFolders - although they can save multiple versions, AFAIK none of them has an actual restore function that will help you find a specific version of a specific file (a minimal version-finder along those lines is sketched after this list).
    • Cobian Backup: looked at this a long time ago; it gets a lot of praise online, but when I looked there didn't seem to be any restore function, so again you are forced to manually hunt through your archives if you actually need to restore a file.
    • CrashPlan (free mode, no online backup): continuing evaluation. Looks good except for the incredibly slow deep maintenance function; not sure about the proprietary backup format, but at least they get income from their online service.
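
    For what it's worth, the "find a specific version" function these tools are missing doesn't have to be complicated. Here is a minimal Python sketch, assuming the tool keeps old versions as timestamped snapshot folders under the archive root (that layout is my assumption - adjust to whatever your tool actually writes):

        from pathlib import Path

        def list_versions(archive_root: Path, rel_path: str) -> list[Path]:
            # Return every stored copy of one file, newest snapshot first
            # (timestamped folder names sort chronologically).
            hits = []
            for snapshot in sorted(archive_root.iterdir(), reverse=True):
                candidate = snapshot / rel_path
                if snapshot.is_dir() and candidate.is_file():
                    hits.append(candidate)
            return hits

        # Example: show all saved versions of a school project document.
        for v in list_versions(Path(r"\\nas\backup"), r"Documents\project.docx"):
            print(v)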
     
  5. taotoo

    taotoo Registered Member

    Joined:
    Mar 13, 2013
    Posts:
    459
    I'm not sure what your workflow is exactly, but you could take a look at AJC Active Backup if you haven't already. Only mentioning it because versioning seems to be a common theme.
     
  6. peterk62

    peterk62 Registered Member

    Joined:
    Feb 10, 2009
    Posts:
    51
    I have several Windows PCs which I want to back up to a NAS - user files and digital photos/videos. Due to the amount of data, I don't want to do a full backup every month or whatever - I would rather do one big sync to start and then have the software update only new/changed files, with old/deleted copies being saved for some period of time before being purged.
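
    In other words, what I want is roughly this (a bare-bones Python sketch of the idea, not any particular product - real tools also handle deletions from the source, open files, and so on):

        import shutil
        import time
        from pathlib import Path

        KEEP_DAYS = 90  # how long displaced copies are kept; pick to taste

        def sync(src: Path, mirror: Path, versions: Path) -> None:
            # Mirror src into mirror; any copy about to be overwritten is
            # moved into a timestamped snapshot folder instead of being lost.
            stamp = time.strftime("%Y-%m-%d_%H%M%S")
            for f in src.rglob("*"):
                if not f.is_file():
                    continue
                rel = f.relative_to(src)
                dest = mirror / rel
                if dest.exists():
                    if dest.stat().st_mtime >= f.stat().st_mtime:
                        continue  # unchanged since last run
                    old = versions / stamp / rel
                    old.parent.mkdir(parents=True, exist_ok=True)
                    shutil.move(str(dest), str(old))
                dest.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(f, dest)  # copy2 preserves mtime for next run

        def purge(versions: Path) -> None:
            # Drop snapshot folders older than the retention period.
            if not versions.exists():
                return
            cutoff = time.time() - KEEP_DAYS * 86400
            for snap in versions.iterdir():
                if snap.is_dir() and snap.stat().st_mtime < cutoff:
                    shutil.rmtree(snap)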

    I want versioning so that if somebody accidentally deletes some files or messes up a school project they can go back to an earlier version if required. This has happened, and this is when I found out that fishing through the SyncFolders archive is a pain.

    With that in mind, I'm not looking at a source code repository or enterprise-class backup system but it has to be relatively easy to use and essentially set-and-forget. It has to be usable by non-technical users. That's why Genie Timeline still looks best overall, but CrashPlan is also OK and I also like the fact that CrashPlan runs as a service so I can set it up once on each PC and it will automatically back up everybody's files.

    I have been looking mostly at free options for now, and I don't mind the $90 cost of the Genie Timeline Pro 3-pack, but if the per-PC cost goes much above $30-$40 I'm probably not interested.
     
  7. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
    One isn't required to perform Areca merges. The merge functionality exists to reduce the aggregate storage used.
     
  8. peterk62

    peterk62 Registered Member

    Joined:
    Feb 10, 2009
    Posts:
    51
    Agreed, but the way I see it, with a backup setup that uses full + incremental/differential archives, you either have to merge those incremental/differential archives or else regularly do full backups. What I don't like about that kind of setup is that the full backup is always the oldest data: if you want to recover a newer version of a file, you have to start from the last full backup and work your way forward to the version you want.
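
    Modelling each archive as a simple dict mapping file paths to contents, the restore path I'm describing looks like this (a toy sketch, not how any specific product stores data):

        def state_at(full, incrementals, k):
            # Reconstruct the file set as of incremental k: start from the
            # full backup (always the oldest data) and replay every
            # incremental up to and including k - you can't skip any link
            # in the chain.
            state = dict(full)
            for inc in incrementals[:k]:
                state.update(inc)
            return state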

    With a mirror setup like Genie or SyncFolders, the latest version is always the "full" mirror and older versions get pushed down. Granted, that setup means you have to periodically prune the old versions to keep the archive from growing out of control, but hopefully that is less intensive than copying everything over again.

    That's just my opinion, of course, and there's plenty of software available for both backup strategies. Kind of reminds me of Gulliver's big-endians and little-endians...
     
  9. peterk62

    peterk62 Registered Member

    Joined:
    Feb 10, 2009
    Posts:
    51
    Well, I have signed up for an account on the CrashPlan user forums; unfortunately it seems that slow pruning is a common complaint...
     
  10. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
    Fortunately that's not how Areca Backup works. A full restore of an Areca incremental archive doesn't just restore what's in the incremental archive, but instead automatically uses any previous archive necessary to reconstruct the restored files.
     
  11. MrBrian

    MrBrian Registered Member

    Joined:
    Feb 24, 2008
    Posts:
    6,032
    Location:
    USA
    Areca Backup also can list all backed up versions of a given file, and the user can recover a specific version.
     