How many computers can Bvckup 2 be used on? I mean, can you use it on two or three? Or do you need some kind of "family pack"?
I wonder why Bvckup 2 re-scans the source prior to copying even the most minuscule change. It happens on Windows 7 x86 but not on Windows 10 x64.
A source re-scan is necessary to determine what changes have been made. You may not notice it happening if all the structures that need to be scanned are already in the system file cache (RAM) at the time of the scan; it then completes way too fast to see.
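For what it's worth, the cache effect is easy to observe with a quick sketch (Python here, with a throwaway directory tree as a stand-in; Bvckup 2's actual scanner is its own implementation). The first walk has to pull directory metadata from disk, while an immediately repeated walk is served from the file cache:

```python
import os
import tempfile
import time

# Build a small throwaway tree to scan (a stand-in for a real backup source).
root = tempfile.mkdtemp()
for d in range(20):
    sub = os.path.join(root, f"dir{d}")
    os.mkdir(sub)
    for f in range(50):
        with open(os.path.join(sub, f"file{f}.txt"), "w") as fh:
            fh.write("x")

def scan(path):
    """Recursively collect (path, size, mtime) for every file - the kind of
    metadata a backup tool compares to find changes."""
    out = []
    for entry in os.scandir(path):
        if entry.is_dir(follow_symlinks=False):
            out += scan(entry.path)
        else:
            st = entry.stat(follow_symlinks=False)
            out.append((entry.path, st.st_size, st.st_mtime))
    return out

t0 = time.perf_counter()
first = scan(root)          # cold-ish scan
t1 = time.perf_counter()
second = scan(root)         # warm scan, metadata now cached
t2 = time.perf_counter()
print(f"files: {len(first)}, cold: {(t1-t0)*1000:.1f} ms, warm: {(t2-t1)*1000:.1f} ms")
```

On most systems the warm pass is noticeably faster, which is why a re-scan on a machine with plenty of free RAM can be effectively invisible.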
I renamed a folder from KIDDO\UNADM to KIDDO\_UNADM (hundreds of sub-folders and files) by just adding an underscore character. I ran the profile to update the destination accordingly, but instead of just renaming the folder, Bvckup 2 created a new _UNADM folder, then started copying and moving those hundreds of folders and files into it, and deleted the original UNADM folder afterwards. Unbelievable! Why not just rename the folder on the destination drive? Utterly stupid, what the program did, to say the least.
Take a look at the following thread in the Bvckup 2 forum... it should enlighten you with regard to what you are calling "unbelievable".
Apologies... somehow the link got lost... most probably due to ongoing senility. This is where to look: Bvckup 2 | Forum | How to Avoid Re-copyng Files Because Filename's Changes
Me too! Thing is, one gets annoyed at the worst times, LOL. It took 45 minutes to complete the "renaming" on a slow USB stick. Apologies, I didn't mean to offend Alex; peace. I've always had both settings enabled, as user Doequer comments there: "Use destination snapshot" and "Use delta copying". I don't know what happened, but I recreated the same scenario as last night, renaming UNADM to _UNADM and back four times, and Bvckup 2 worked as expected. So I don't know what happened last night.
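For anyone curious what the destination snapshot buys you conceptually: with a record of what's already at the destination, files under a renamed folder can be matched against existing destination files by their metadata and simply renamed in place instead of re-copied. Here's a minimal sketch of that matching idea (my own illustration using size + mtime as the signature, not Bvckup 2's actual algorithm):

```python
# Hypothetical snapshots: path -> (size, mtime). In reality these would be
# built by scanning the source and loading the saved destination snapshot.
source = {
    "KIDDO/_UNADM/a.doc": (1024, 111.0),
    "KIDDO/_UNADM/b.doc": (2048, 222.0),
}
destination = {
    "KIDDO/UNADM/a.doc": (1024, 111.0),
    "KIDDO/UNADM/b.doc": (2048, 222.0),
}

def plan(source, destination):
    """Pair unmatched source files with destination files carrying the same
    (size, mtime) signature -> plan a cheap rename instead of a full re-copy."""
    by_sig = {}
    for path, sig in destination.items():
        by_sig.setdefault(sig, []).append(path)
    actions = []
    for path, sig in source.items():
        if path in destination:
            continue  # already in place, nothing to do
        candidates = by_sig.get(sig)
        if candidates:
            actions.append(("rename", candidates.pop(0), path))
        else:
            actions.append(("copy", path))
    return actions

print(plan(source, destination))
# Both files match by (size, mtime), so the plan is two renames, zero copies.
```

Note that a size + mtime signature can collide for distinct files, which is one reason real tools combine it with other checks; the sketch only shows why the snapshot makes rename detection possible at all.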
Well, that is software for you... it could be something environmental and transitory; it has happened to me a number of times with different apps. If it happens again, I would log a ticket.
Pre-backup safeguard checks https://bvckup2.com/support/forum/topic/1530 I'm trying to implement this feature. What do you recommend: a canary file, or a count/percentage of modified files?
It depends on what you're trying to prevent.

If you're interested in stopping a backup during a massive ransomware attack, a "canary" file (jpg, docx, exe, whatever) that would surely be affected is one option. If you're trying to stop a massive change in your backups (due to a wrong source reference, etc.), a percentage or file count would be a better approach. That could happen if you're using a drive-letter reference rather than the drive's GUID in the backup specs. It would also help if your backup target is a NAS and that basic NAS doesn't support the usual timestamp triplet (access time / mod time / creation time)... even though Bvckup does offer different ways to deal with this.

You need to think about what you are trying to prevent and why... what kind of situations are you trying to avoid?
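To make the canary idea concrete, here's a rough sketch of the kind of check such a safeguard performs (the file name and the use of SHA-256 are my own assumptions for illustration; Bvckup 2's built-in safeguard is configured through its own settings): hash the canary file and refuse to proceed if it no longer matches the known-good hash.

```python
import hashlib
import os
import tempfile

def sha256_of(path):
    """Hash a file's contents; ransomware encrypting the canary changes this."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def canary_ok(path, known_good_hash):
    """Return True only if the canary still exists and is unmodified."""
    return os.path.exists(path) and sha256_of(path) == known_good_hash

# Demo with a temporary stand-in canary file.
canary = os.path.join(tempfile.mkdtemp(), "canary.docx")
with open(canary, "wb") as f:
    f.write(b"known good contents")
baseline = sha256_of(canary)

print(canary_ok(canary, baseline))   # True: safe to back up

with open(canary, "wb") as f:        # simulate ransomware mangling it
    f.write(b"ENCRYPTED GARBAGE")
print(canary_ok(canary, baseline))   # False: abort the backup
```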
Thanks for this, @TheRollbackFrog. Still, I need more details on how to implement it. What if ransomware encrypts lots of files, Bvckup starts backing up (in real time), and the canary file is still untouched by the ransomware?
This option is executed when Bvckup runs its PLAN stage (it develops a plan for the backup after it does all its scanning), before it does any file operations. At that point in time, it hasn't done anything yet. If you're doing this operation in "real time", I would change it to a scheduled operation... trickle changes would always get through otherwise.
Yes... think about it. If ransomware is actively whacking your system, it's only going to be able to encode a few files at a time in a few seconds. If you're backing up every 10 seconds, what kind of metric would you use to detect it? If you're backing up every 5 minutes and don't expect a lot of changes in your normal operation period, then a larger change metric (say 20, or 30 files) would easily catch a malware event.
Since you have no idea when that canary will be attacked during a ransomware attack, but do have an idea how many files you may change normally during that 5-min interval... I'd try a FileCount instead. You can fine-tune the number along the way once you see how many changes you really produce in that 5-min period. Eventually you'll arrive at a number that you practically never exceed, but that ransomware will pummel when it's really active. You may be good at producing files, but not as good as ransomware is at changing them. A canary file should work really well if your scheduling interval is long... that would give ransomware a good period of time to do its dirty work.
Remember, it's not the total number of files being monitored, it's the expected normal number of changes in that 5-min period... tweak it 'til it meets your expectations. Using both the canary file AND a file count is a good idea; 20 files seems fine.
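The file-count variant is just as simple to sketch. Assuming you record the time of the last backup, count the files modified since then and hold the backup when the count exceeds your tuned threshold (the threshold of 20 and the helper names below are illustrative, not Bvckup 2's internals):

```python
import os
import tempfile
import time

def modified_since(root, since):
    """Count files whose mtime is newer than the last-backup timestamp."""
    count = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if os.path.getmtime(os.path.join(dirpath, name)) > since:
                count += 1
    return count

def safe_to_backup(root, since, threshold=20):
    """Refuse to run if 'too many' files changed - a possible malware event."""
    return modified_since(root, since) <= threshold

# Demo: pretend the last backup ran 5 minutes ago, then "modify" 25 files.
root = tempfile.mkdtemp()
last_backup = time.time() - 300
for i in range(25):
    with open(os.path.join(root, f"f{i}.txt"), "w") as f:
        f.write("changed")

print(safe_to_backup(root, last_backup))  # False: 25 > 20, hold the backup
```

Raising the threshold (say, to 30 for a busier machine) is exactly the kind of tuning described above: pick a number your normal workload never exceeds.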
Thanks a lot for your kind help, @TheRollbackFrog. My new config is working fine, but I ran into this "issue" today when trying to delete more than 20 files at once: it works as expected, but in case I need to make changes exceeding the 20-file count, I think there should be a right-click option to run the backup manually without the safeguard measures. Or am I missing something?