Verified archive fails

Discussion in 'Acronis True Image Product Line' started by srdiamond, Jul 12, 2006.

Thread Status:
Not open for further replies.
  1. aoz

    aoz Registered Member

    Joined:
    Jun 8, 2005
    Posts:
    223
    to grayhair, and others

    re: Bit to Bit

    I myself do occasionally do binary (bit-to-bit) compares of a backup.

    I use a program, Super Flexible File Synchronizer (others exist also). This is a synchronizer program that also allows a binary comparison of each file.

    I do this for testing purposes; you can do a backup of TI, then mount the image, and do a binary comparison against the source.

    A PROBLEM with this is that if you do a comparison of your c: drive, you'll have a lot of files that don't compare correctly, due to new temporary internet files, changes to system files, etc. BUT, you can at least see whether the majority of your files on the c: drive compared.

    On a d: or e: (separate data) drive, the comparison should be exact.

    I personally would like to see bit-by-bit comparison as an option for verification. If I'm doing archiving, I do NOT want to rely just on checksums; if I do a binary comparison on a file-per-file basis, I KNOW that the actual files were re-read, etc.
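As an illustration (not the poster's actual tool), the file-per-file binary comparison described above can be sketched in Python: walk the source tree and compare each file byte-for-byte against its counterpart under the mounted image. The paths and function name are hypothetical.

```python
import filecmp
from pathlib import Path

def compare_trees(source: Path, mounted: Path):
    """Byte-for-byte compare every file under `source` against the
    corresponding file under the mounted image at `mounted`.
    Returns (mismatched, missing) lists of source paths."""
    mismatched, missing = [], []
    for src_file in source.rglob("*"):
        if not src_file.is_file():
            continue
        img_file = mounted / src_file.relative_to(source)
        if not img_file.is_file():
            missing.append(src_file)
        # shallow=False forces an actual byte-by-byte read of both files,
        # rather than trusting size/timestamp metadata
        elif not filecmp.cmp(src_file, img_file, shallow=False):
            mismatched.append(src_file)
    return mismatched, missing
```

On a live c: drive, the `mismatched` list would pick up the churning temp and system files the poster mentions; on a quiet data drive it should come back empty.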

    An example of a program that does this with DVD building is Nero. It has an option to VERIFY after BACKUP. PER their tech support, it actually reads the data back to compare it to the source. BUT, in other tech forums, I've seen some techies question whether their specific method is totally accurate. I can't vouch for either school of thought.

    Nick
     
  2. Howard Kaikow

    Howard Kaikow Registered Member

    Joined:
    Apr 10, 2005
    Posts:
    2,802
    Not only on the drive on which your OS lives.
    The Content program at http://www.standards.com/inde.html?CompreDrives
    reports the reason why a file could not be compared.

    Not necessarily. Depends on how the system is set up and what apps are running.

    The most accurate comparison requires running the backup immediately after a reboot, and then the comparison program immediately after that.

    Note that comparing the content of files is a very long process: you are reading each file that has the same length twice. For this reason, the Content program allows you to choose a larger buffer to speed things up.

    On slower systems, or with drives that have lots of files, it can take over an hour to process the drive, and this is using the fastest possible APIs.
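In outline, the buffered double-read described above looks like the following sketch. The function name and default buffer size are illustrative assumptions, not the Content program's actual code; the point is that a larger buffer means fewer, larger reads per file.

```python
import os

def files_equal(path_a: str, path_b: str, buffer_size: int = 1 << 20) -> bool:
    """Compare two files chunk by chunk. Files of different length
    are rejected immediately without reading any content."""
    if os.path.getsize(path_a) != os.path.getsize(path_b):
        return False
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        while True:
            a = fa.read(buffer_size)
            b = fb.read(buffer_size)
            if a != b:
                return False
            if not a:  # both files exhausted with no mismatch
                return True
```

The length check up front is why only same-length files are read twice: a size mismatch already proves inequality for free from the directory metadata.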

     
    Last edited by a moderator: Jul 14, 2006
  3. Xpilot

    Xpilot Registered Member

    Joined:
    May 14, 2005
    Posts:
    2,318
    Well gentlemen, I find this discussion interesting, but as far as I and many other users are concerned, where is it leading?
    The Acronis validation is the result of a checksum comparison which gives a pass or fail result for an archive. So how does that really help? To my mind, even if it passes, there is no guarantee that the archive can actually be restored in the hour of need. The prime objective of an archive is to enable a failed or corrupted disk to be restored.

    I gave up running validations long ago, as the Acronis program demonstrated after hundreds of backups and restores that it is quite capable of producing valid images every time. Several "what if" scenarios spring to mind of errors that could creep in from a variety of causes before actually committing to the restore option. I have, however, managed to sweep all these doubts aside by restoring each backup as soon as it is made. So if there were a problem, it would be seen and corrections made whilst the system as such was still up and running.
    As I am extremely averse to risk taking, I have always restored to a spare hard drive, thus leaving my working drive intact. I have recently installed an exchangeable hard drive system, so I can follow this safety principle easily for every backup.
    So I regard validations as a waste of time, as I am covered in another way, and in any event a failed validation is just that: it corrects nothing by itself. It does not ensure a subsequent successful restoration even if it passes the first test.
    To this day it still gives me a frisson of delight each time I boot from the freshly updated, swapped-in main drive and computing life carries on as if nothing has changed. Of course, in terms of systems and data nothing has changed, but I am still impressed by the technology behind it all.

    Xpilot
     
  4. Tabvla

    Tabvla Registered Member

    Joined:
    Apr 21, 2006
    Posts:
    649
    Location:
    London, England
    Agreed.

    Like the saying goes... disks are as cheap as chips these days. This is exactly the kind of advice I am giving my customers. Forget about the verification; simply do an actual restore to a spare disk and check that it works. It takes about the same amount of time, and the benefit is that you KNOW with 100% certainty that the archive will restore.

    Data can simply be restored to a spare partition set aside for the purpose and immediately checked. The system partition or disk can be restored to a spare disk and again checked immediately.

    The other item mentioned by Xpilot is the removable drive. The price of removable racks has dropped quickly, and they are now competitively priced in relation to USB enclosures. One of the big advances in recent times is that SATA is increasingly being supported in these devices; until quite recently, only IDE technology was supported at a consumer price level. In addition, sequential read speeds are virtually the same as for a disk connected directly to the motherboard.

    The big advantage of the removable hard drive rack, as opposed to an external USB enclosure, is that it is a very simple task to restore a system disk image to a removable disk and then test it. It is possible to boot from a USB device, but this requires the user to be very tech-savvy, something which my customers are not.

    The Vantec range of removable racks is well priced with few cons. If you are interested, you can read a review here:

    http://www.pcstats.com/articleview.cfm?articleid=1883&page=3
     
  5. furballi

    furballi Registered Member

    Joined:
    Jun 17, 2006
    Posts:
    263
    The IDIOT-PROOF method of testing an image file is to restore that image file and reboot. Do this about 50 times with your favorite imaging software under various conditions (CD/DVDs, USB drive, etc.). If all goes okay, then the imaging software is compatible with your PC, and there is probably no need to verify future image files.

    If you update the imaging software, then you should retest for stability.
     
  6. seekforever

    seekforever Registered Member

    Joined:
    Oct 31, 2005
    Posts:
    4,751
    I tend to agree with your statement, and I have also said that if TI is working on your system, it will very likely continue to work well unless a HW problem develops. That, of course, is the catch if it is a marginal rather than catastrophic failure. I certainly wouldn't make a long series of backups without at least a verify or a restore at a suitable interval.

    I find disturbing the implication in this thread that the validate process is useless. It may not be ideal, but it is far from useless. Really understanding it requires more information than "it is a checksum calculation".
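For contrast with the full-restore approach, here is a minimal sketch of what a checksum-style validation amounts to. The digest choice (MD5) and the trailing-digest layout are assumptions for illustration only and say nothing about Acronis's actual archive format. It shows why a pass proves the archive is still readable and internally consistent, but not that its contents ever matched the source.

```python
import hashlib

DIGEST_LEN = 16  # MD5 produces a 16-byte digest

def write_with_checksum(data: bytes, path: str) -> None:
    """Write the archive payload followed by its digest (assumed layout)."""
    digest = hashlib.md5(data).digest()
    with open(path, "wb") as f:
        f.write(data + digest)

def validate(path: str) -> bool:
    """Re-read the whole archive and compare the recomputed digest
    against the stored one. Detects media corruption, nothing more."""
    with open(path, "rb") as f:
        blob = f.read()
    data, stored = blob[:-DIGEST_LEN], blob[-DIGEST_LEN:]
    return hashlib.md5(data).digest() == stored
```

If a bit rots on the backup media, `validate` fails; but if the wrong data was captured in the first place, the checksum was computed over that wrong data and validation still passes, which is exactly the gap a test restore closes.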
     