Discussion in 'backup, imaging & disk mgmt' started by Coolio10, Sep 20, 2008.
Well, time to mention Vopt...The fastest I've used so far (after 1st run).
How about file placement - see post #75? The files in green just never get used once the OS has been installed. Reducing the number of files in play and locating them near each other sounds like a good idea to me. As to the fragmentation aspects of different programs, I see little practical difference.
As it never takes me more than a few seconds to run UD for system or data drives, I can see no reason not to - but the main benefit comes from file placement in my view.
Well, is everyone done arguing about defragmenting? Will you answer my question instead?
Okay on two points. First, I'd agree with Longboard: your disks are pretty full, and that makes it tougher on any of them.
I've been a PD user for a long time, but frankly I found the new version a waste. I've switched to Ultimate Defrag, and like it. What I do like is the ability to specify where files go.
Is it worth it? I found with one specific application, a game (MS Train Simulator), it made a big difference when I kept all the files in the same area of the disk. Much smoother.
One caution with any of them that use the recently-accessed method, which I do like: guess what an AV scan does to that. Poof.
In all likelihood your greatest benefit would come from a fresh look at your partitioning.
That was a great idea about running Puran before UD on a large, badly fragmented drive - mine is 750 GB. It took about 1.5 hours to defragment with Puran alone, leaving gaps. Now, when running the Folder/Name method in UD, with FD-ISR archives and StorageProtect backup files placed at the beginning of the disk, it uses much less CPU, keeping the system cooler, and optimizes the file placement with no gaps about 30 times faster than without the pre-Puran step.
Before the pre-Puran step, UD was very slow and used 100% of the CPU.
Thanks for the good tip,
I wonder if those who have found UD slow have been using it long enough?
I set mine using "past days data was used". With the OS set at 30 days, it is at least 30 weeks from a clean install before things settle down. Then, if data is messed with when using a backup program, the dates can be reset and the clock starts again. There is also the question of optimal partition size, with smaller sometimes being better in my experience. If there is a weakness with UD, it is that it does take time to understand, and any one-time-only analysis will miss the possibilities. Just defragged C: in 34 seconds, which is about as long as it takes following a day's work.
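The "past days data was used" idea is easy to picture in code. Here is a minimal sketch (Python; the 30-day threshold and the `classify_by_last_access` helper name are my own assumptions, not UD's actual implementation) of how a defragger might bucket files into "active" vs. "archive" zones from their last-access stamps:

```python
import os
import time

ARCHIVE_AGE_DAYS = 30  # mirrors a "past days data was used" setting of 30

def classify_by_last_access(paths, now=None):
    """Split files into 'active' and 'archive' buckets by last-access age."""
    now = time.time() if now is None else now
    cutoff = now - ARCHIVE_AGE_DAYS * 86400  # seconds in a day
    active, archive = [], []
    for p in paths:
        # st_atime is the last-access timestamp the OS recorded for the file
        (active if os.stat(p).st_atime >= cutoff else archive).append(p)
    return active, archive
```

This also shows why an AV scan is such a problem for the scheme: one pass that reads every file moves everything back into the "active" bucket.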
I noted that PerfectDisk seems to take about 3 times as long as WDD/Diskeeper 10 - I think this is simply because more files are shuffled into as perfect a position as possible.
Flip side is reduced refragmentation rates.
I think the GUI was originally designed for 1024x768 and actually would not shrink below this. One of the updates fixed this.
Does not bother me personally as I look at defrag GUI about once a month.
Reason why PD and DK use their own services is not just for the time scheduling but also the monitoring of disk activity counters, so that the defragging is not obtrusive.
Memory use: I don't think it's particularly bad.
You have missed the point of smartplacement.
Its purpose is to position files to reduce refragmentation rather than performance.
Not only does it help to reduce refragmentation, but should in theory reduce time of future defragmentation runs (relative to no specific placement using the same defrag tool).
My tests show that tools that put effort into placement (e.g. PerfectDisk and JkDefrag) reduce refragmentation rates by approximately half, compared to those that do not (Windows defrag, Diskeeper, or PerfectDisk in non-SMARTPlacement mode).
Flip side is the defragmentation takes far longer with specific placement.
My tests confirm there is a LOT of hype over performance losses from fragmentation on windows/NTFS, for the majority.
The question to ask is: if your drive is only running at half speed due to fragmentation, will it halve your daily productivity? If you look at your normal usage pattern as an average desktop user, the performance loss will be within the margin of error; only for people doing things like video editing will the productivity loss be significant enough to warrant defragmenting.
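That argument is essentially Amdahl's law applied to disk time: only the disk-bound fraction of your day slows down. A back-of-envelope sketch (Python; the 2% and 40% disk-bound figures are purely illustrative assumptions, not measurements from the tests above):

```python
def overall_slowdown(disk_fraction, disk_speed_factor):
    """Total-time multiplier when only disk-bound time slows down.

    disk_fraction: share of total time spent waiting on disk (0..1).
    disk_speed_factor: disk throughput relative to a defragged drive
                       (0.5 = running at half speed).
    """
    return (1 - disk_fraction) + disk_fraction / disk_speed_factor

# Average desktop use: assume only ~2% of the time is disk-bound.
# Even at half disk speed, total time grows by only ~2%.
print(round(overall_slowdown(0.02, 0.5), 4))  # 1.02

# Video editing: assume ~40% disk-bound -> ~40% longer overall.
print(round(overall_slowdown(0.40, 0.5), 4))  # 1.4
```

So "drive at half speed" never means "day at half speed" unless your work is almost entirely disk-bound.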
From what I understand, Diskeeper was heavily involved in writing the first defrag API for the first version of Windows NT, so I suspect some kind of reciprocal agreement was made. At the time MS had no expertise in defragmentation tools, so it was probably more economical to use Diskeeper's defragmentation tool than to develop their own.
It's not just the file system that is the problem, it's the file allocation strategy used by the NTFS drivers - but I think NTFS is burdened by legacy support.
NTFS was a good file system when it first came out, one of the first mainstream filesystems with journalling (and therefore good reliability).
I just think it's been neglected as newer, more advanced file systems have come along (XFS, ZFS) and others have evolved into better file systems (ext2 to ext3).
When NTFS initially came out and I was using it with NT, MS's claim was that defrag was never needed. The fact that, starting with 2K, a defragger was included by default with the OS meant MS finally acknowledged that NTFS files fragment.
MS did include a defragger with Win 98, but it was quite basic.
That is true.
Surely that could be solved by Microsoft creating a better file system?
Linux doesn't have that problem.
Do you or does anyone else know of any AV scanning programs that will not change the access date?
Is there a switch in Windows XP to turn that function on and off? If so, maybe a batch file run before and after a malware scan would do it.
I don't, Silver. Assuming I keep the AV I am playing with on the system, I will just leave it real-time and not bother scanning.
That is sort of the way I do it, except for realtime protection I use DefenseWall to protect all of my programs that connect to the Internet and Threatfire with Outbound protection.
Any files that I download are scanned from the Explorer right-click menu with Avira, MBAM and SAS before they are executed. These programs are not real-time but on-demand, so they never scan on their own.
Is there a reason why PerfectDisk can take almost 5 minutes to analyze, when all the others can do it in under a minute?
Puran can probably do a whole defrag in that time.
I just did an analyze on my other computer after UltimateDefrag finished, and Puran did make a small difference.
Mostly to do with free space. The free space fragmentation was higher and the largest free space chunk was also higher.
ATM I'm using Puran Defrag and really love it. The "Fill Gaps" step in the Automatic Boot Time Defragmentation can take a while sometimes, although one can easily press ESC to skip it if you're in a hurry.
Does anyone know when the UltimateDefrag boot-time defrag is supposed to be released? The last announced release date was mid-to-late August and it's already late September... Hopefully UltimateDefrag development hasn't frozen.
I'll just use Puran's instead for now.
Something's just not right there. Either your disk is too full (as a few other members here alluded to already) or it's some kind of configuration issue/conflict.
My most recent complete defrag (PD w/SMARTPlacement) took less than 8 minutes; it took well under a minute to analyze. This was a 640GB HDD with over 90% free space that hadn't been defragged in over a week.
The other day, on a 160GB HDD with ~85% free space, the entire defrag with PD only took a couple of minutes. I did have StealthPatrol enabled, however.
Silver, you can turn the last-access update on/off with the command prompt or through the registry.
To turn off the last-access modification, run:
FSUTIL behavior set disablelastaccess 1
To turn it on again:
FSUTIL behavior set disablelastaccess 0
PS: note that the modification takes effect only after you reboot your system.
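For reference, the before/after batch file Silver asked about would look something like this (a hypothetical sketch, not a tested script - and note the catch: because the setting only takes effect after a reboot, simply toggling it around the scan in one batch run will NOT protect the access dates of that scan):

```shell
@echo off
rem Sketch: disable last-access updates, run an on-demand scan, re-enable.
rem CAVEAT: fsutil's change takes effect only after a reboot, so without
rem rebooting between steps this wrapper does not actually shield the scan.
FSUTIL behavior set disablelastaccess 1
rem ... run your on-demand scanner here ...
FSUTIL behavior set disablelastaccess 0
```

In practice you would have to disable, reboot, scan, re-enable, and reboot again - which is why just leaving real-time protection on is less hassle.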
I'm missing the point here. I use UD2008 with "past days data was used" turned on and have last access turned on. Running an on-demand AV or AS causes no problem. Some backup programs do cause a problem because they employ incremental methods. The solution here is to image rather than use traditional backup programs.
The point is this. I use the same options, so I want files not accessed in 30 days treated as archive. But running a scan on them means all those files were accessed again, so they no longer fit the "haven't been accessed in 30 days" model.
Question: say last access is on and I have a file whose last access date is Jan 1, 2008. Obviously if I scan it today, it gets today's date.
So I turn it off and reboot. Then scanning wouldn't change the date. But what date is there when I turn it back on and reboot?
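As I understand it, disabling last-access updates just stops the OS from writing a new timestamp; whatever value is already stored with the file stays put, so after you turn updates back on the file still shows its old date until something accesses it again. A small illustration in Python (cross-platform; `os.utime` here just plays the role of the on-disk stored stamp):

```python
import os
import tempfile

# The last-access time is simply a value stored with the file. If the OS
# stops updating it, whatever was last written remains on disk.
fd, path = tempfile.mkstemp()
os.close(fd)

jan1_2008 = 1199145600  # 2008-01-01 00:00:00 UTC, as a Unix timestamp
os.utime(path, (jan1_2008, jan1_2008))  # set (atime, mtime)

# Stat only reads metadata; the stored stamp is returned unchanged.
print(int(os.stat(path).st_atime))  # 1199145600
```

So in your example the file would come back still showing Jan 1, 2008, which is exactly what you want for the archive model.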