Discussion in 'FirstDefense-ISR Forum' started by MerleOne, May 2, 2009.
Is there any way to check an archive's integrity without restoring it to a snapshot? Thanks.
I'm also interested to know. Good question, MerleOne.
Indeed there is: Tools > Options > Tasks, and tick "Verify archive".
Honestly, in all the time I've been using FDISR, I've never once used that option and I've never once had an issue, and I restore FDISR archives all the time.
Thanks, but that's not exactly what I was looking for. I'd like to check, days after it was created, that an archive is still OK. I defragmented some of them and would like to be sure they are still intact. The option you mention applies to all tasks, not only archives, and the check is done just after the copy/update.
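For what it's worth, since FDISR doesn't seem to offer a standalone after-the-fact verify, a generic workaround is to record a checksum of the archive file right after each copy/update and recompute it later. Defragmenting only moves the file's clusters around on disk, so if the archive is intact the hash should be unchanged. A minimal sketch in Python (the archive path and the `.sha256` sidecar-file convention are my own assumptions, not anything FDISR provides):

```python
import hashlib
from pathlib import Path

def file_sha256(path, chunk_size=1024 * 1024):
    """Hash the file in 1 MB chunks so even a 15 GB archive fits in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def save_checksum(archive):
    """Record the hash right after the archive is created or updated."""
    Path(str(archive) + ".sha256").write_text(file_sha256(archive))

def verify_checksum(archive):
    """Re-check days later, e.g. after defragmenting the archive file."""
    return file_sha256(archive) == Path(str(archive) + ".sha256").read_text()
```

Run `save_checksum` right after an update and `verify_checksum` after defragging; a mismatch means the file's bytes actually changed, not just its position on disk.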
I never bother defragging the archives, as I've been told they will always end up fragmented again, since they are compressed files.
I rebuild them about every 60 days anyway, so if you wanted to defrag that drive, I'd just delete them and then rebuild them. If you have multiple versioned archives, I just wouldn't bother.
I have restored one of the defragmented .arx files and it worked fine.
When you mention that you rebuild your archives, could you detail how? I mean, if you create a new archive from an existing snapshot, you will add to the newly created archive the latest modifications you made to the snapshot, just by having booted it, for instance. So you may lose what was OK in the old archive.
Or do you proceed differently?
Since I really use just one archive as a rollback type of thing, I just periodically delete it and create a new one.
If you update an archive frequently, which I do, then you notice that once the archive has finished copying, it does a "finalizing" step, which takes more and more time. When that gets to about 2 minutes is when I do the rebuild.
I do the same thing that Peter does: when I find the finalizing process taking too long, I simply delete the archive and start over again with a new one. A new "fresh" archive finalizes almost instantly.
Sorry to insist, but by doing so you don't really rebuild your archive; you create a new one which contains a different config. Just wondering: did you try defragging an archive that takes a long time to finalize? It would be interesting to know if it makes a difference. I recommend Defraggler from Piriform to defrag just one file.
Total waste of time. What causes the archive to take longer is not file fragmentation but the internal structure of the database. That's true of all database files when data is repeatedly added and then deleted.
Your first statement doesn't make sense to me, as of course the config is different. But since I generally update the archive once a day, its config is constantly changing.
What you may not have grasped is I am using the archive in place of a snapshot.
Have you tried it? I wouldn't dismiss it before doing so; one never knows...
Regarding the way you use archives, I see what you mean; I sometimes do the same thing too. Even so, for me the purpose of an archive is to keep an interesting config for a long time. Doing as you do, I lose this config and sometimes it's impossible to recover it. Maybe using "export snapshot" would be a better way to make real archives, since I think exported snapshots cannot be updated.
Honestly, no, as they are about 15 GB files. Also, I discussed this issue a long time ago with a Raxco support tech who was pretty savvy, and he said it was a waste of time: NTFS deliberately leaves space within the file, as it is a compressed file, and that is done for the decompression, so defragging wouldn't accomplish anything.
I understand your desire to keep certain configurations. I just do it with images.
I tend to agree, 15 GB is a lot to defrag. Regarding the compression, all I can observe is that the file itself is not NTFS-compressed, I mean at the filesystem level. Maybe the compression is in its internal structure.
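That observation can be double-checked programmatically, assuming the archive sits on an NTFS volume under Windows: Python's `os.stat` exposes the file attribute flags there, including the filesystem-level "compressed" bit. A small sketch (the archive path is a placeholder; on non-Windows systems the attribute simply doesn't exist, so the check falls back to "not compressed"):

```python
import os
import stat

def is_ntfs_compressed(path):
    """True if the file carries the NTFS filesystem-level 'compressed' attribute.

    st_file_attributes only exists on Windows; elsewhere we fall back to 0,
    i.e. not compressed at the filesystem level.
    """
    attrs = getattr(os.stat(path), "st_file_attributes", 0)
    return bool(attrs & stat.FILE_ATTRIBUTE_COMPRESSED)
```

If this returns False for an .arx file, whatever compression the archive has is internal to its own format, not applied by NTFS.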