Defragging and Defraggler...

Discussion in 'other software & services' started by Osaban, Mar 19, 2019.

  1. Osaban

    Osaban Registered Member

    Joined:
    Apr 11, 2005
    Posts:
    5,614
    Location:
    Milan and Seoul
    I've used Vista, Win 8/8.1, and now Windows 10. Starting with Vista I stopped using third-party defraggers and have relied on Windows' own defragmenter ever since, because Vista's system speed never improved after defragmentation.

    Recently I decided to download 'Defraggler' from Piriform and run it on my Windows 10 machine, after having used Win 10's own defragger…

    I was surprised at the results: Defraggler took two hours to defrag what Windows claimed to be a system with 0% fragmentation, and my computer is now visibly faster and snappier. I thought some people might like to know about this…
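    For anyone who wants to see what each tool reports before letting it run, the built-in defrag utility can analyze a drive without defragging anything. A minimal sketch from an elevated Command Prompt (the drive letter is just an example):
    Code:
    defrag C: /A /V
    /A only analyzes and prints the fragmentation report; /V makes the output verbose.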
     
  2. majoMo

    majoMo Registered Member

    Joined:
    Aug 31, 2007
    Posts:
    994
    Good info for those who think that defragging is a useless maintenance action...
     
  3. xxJackxx

    xxJackxx Registered Member

    Joined:
    Oct 23, 2008
    Posts:
    8,623
    Location:
    USA
    I also found that from Vista forward defragging had little to no effect on performance. As I am running SSDs on my PCs I find zero use for a defragger now. I am also unlikely to run anything from Piriform after the CCleaner incident, but that is a personal choice you have to make. In any case the time you took to post your findings is appreciated and hopefully they will be of benefit to someone that isn't using an SSD.
     
  4. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,041
    Location:
    Nebraska, USA
    Not sure who that might be. It is, and always will be, important - at least as long as hard drives are around.

    I do not agree, however, that Defraggler is that much better. Yes, it works. And it is good. But there are a couple of things to remember before assuming a sample-size-of-one experience settles the whole question. Anecdotal evidence is not evidence.

    Fragmentation tends to affect performance most as free disk space becomes scarce. This is because there are fewer large chunks of free space to put entire files in. So it has to be broken up (fragmented) and saved in a bunch of smaller free spaces. So downloading and installing yet another program like Defraggler just takes up more precious space.

    Most computers these days have monster disks. And if you run out of room, buying more is very affordable.

    In the old days, disks were small and expensive. And seek times were much slower. So defrag programs jammed the files at the beginning of the disk to save time - as well as consolidate free space. That is not necessary today - in fact, that quickly degrades performance as files are updated. Remember, it is only seeking the first file segment that matters - as long as the rest of the file is in the adjacent storage location.

    After you defrag, the second (and I mean that almost literally) you start to use the computer again, fragmentation starts over again as Windows opens and modifies files and they have to be saved in new locations - leaving holes where they were previously saved. This means any advantage to using Defraggler (or any other 3rd party defragger) is quickly negated.

    It is surely possible you saw some improvement right after running Defraggler. But it is more likely it was mostly in your head - that's human nature, not a criticism.

    I also have no doubt Defraggler reported fragmentation when Windows said 0%. This is because each program uses its own algorithm to calculate fragmentation and decide how to defrag. If you were to run Auslogics Disk Defrag after running Defraggler, no doubt it would show the disk is fragmented too. Then it would beat your hard drive to death, just like Defraggler did, to defrag the drive its own way.

    It is just best to use Windows' own and keep it at the default settings - which defrags your HDs weekly, if needed. Contrary to what many want to believe, Microsoft really does know how to keep Windows running quickly and smoothly.
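    If you want to confirm the schedule is still in place, you can query the built-in scheduled task from an elevated Command Prompt. A hedged example - this is the task name on my Win 10 machines, so adjust if yours differs:
    Code:
    schtasks /Query /TN "\Microsoft\Windows\Defrag\ScheduledDefrag" /V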

    As for SSDs, because of how they store data, fragmentation is not an issue and therefore defragging is not needed. And Windows will not defrag an SSD anyway. Note this is why the Windows Defrag program has been renamed to "Optimize Disk" in newer versions of Windows. Windows will properly decide what maintenance to perform on the disk, depending on the type of disk found.
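    If you are curious what that optimization does for an SSD, you can trigger it by hand; the built-in tool performs a retrim rather than a defrag. A minimal sketch from an elevated Command Prompt (the drive letter is just an example):
    Code:
    defrag C: /L
    /L sends a retrim (TRIM) pass to the SSD instead of moving any data around.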

    And fragmentation won't noticeably hurt performance on any OS where there is plenty of free disk space.
    I respect your decision but frankly, it is not logical. If you dismissed every company because it made a mistake, you would be sitting on a dirt floor twiddling your thumbs because you could trust no company. And BTW, the CCleaner incident was caused by AVAST, their parent company, and not by the Piriform developers. I'm just saying, holding a grudge isn't worth it.
     
  5. xxJackxx

    xxJackxx Registered Member

    Joined:
    Oct 23, 2008
    Posts:
    8,623
    Location:
    USA
    I work for a software company and don't feel the story they gave the general public is even plausible.
     
  6. Osaban

    Osaban Registered Member

    Joined:
    Apr 11, 2005
    Posts:
    5,614
    Location:
    Milan and Seoul
    I actually did not want to promote Defraggler (my post was unintentionally misleading); it was simply the first program that came to mind when I wanted to see whether there was any difference in performance compared to Windows' own utility. With Vista I had a paid program which made no difference to performance after defragmenting. I decided then and there that using Windows' own was enough for all intents and purposes.

    Fast forward to Windows 10: out of curiosity I tested just one free program to see if there was any substantial difference. Now you might be right, Bill - my conclusion is based on subjective visual perception, which can hardly count as evidence - but here at Wilders we take members' anecdotal impressions seriously, with a grain of salt of course, and I can assure you that my computer is slightly faster, boot and shutdown as well. To declare that MS can outperform any third-party competition is not always true; a case in point is the ongoing debate about Windows Defender being heavier than other AV programs. Avira on my computer is definitely faster than Windows Defender.

    Isn't this the very reason we defragment the drive on a regular basis?
    I would like to point out that my post was not meant to criticize MS in any way; as a matter of fact I'm a strong supporter of Windows 10, and my hard drive is almost empty - I use 55 GB out of 1 terabyte.
     
  7. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,041
    Location:
    Nebraska, USA
    I don't mean to be crass, but so? I've been working IT support for nearly 50 years, including 10 years in a major software company. Working for a software company does not qualify any person in those matters. And don't forget, no company in any industry is going to put out a mea culpa that doesn't have a marketing or PR spin on it.
    I am sure it was - right after it was defragged. That is typical, regardless of the defragging program used. But that does not change the fact that immediately thereafter, as fragmentation starts all over again, any performance gains level out. This is especially true today, as almost every program regularly goes through updates that involve replacing files. The new file never fits neatly into the exact same spot on the drive. And every time Windows or any other program opens a file, that file is locked. If that file is modified in any way and then saved, it is saved to a new location, and the save is verified as not corrupt. Then the space where the old file sat is marked as free space - where, likely, a fragment of another file will be saved.

    I am just saying that any performance gains 3rd party defraggers provide are quickly negated and it soon becomes a level playing field anyway - except that Windows' own tool does not take up extra precious disk space the way a 3rd party defrag utility does.
    Who said that? Not me! The fact is, most of the native programs in Windows are pretty basic, with no frills. But that does not mean they are not totally adequate for the job. If you want a fancier or more feature-rich calculator, install one. But does that suggest the native one cannot accurately multiply 2 x 2? The Snipping Tool is great and more than adequate for most. But if you need more features, they are out there.

    Just because there are more efficient defragging tools out there, that does not mean we need them. If users would just leave Windows defaults alone, and let Windows own Optimize Disk program do its thing, fragmentation will never become so bad that a 3rd party app is needed!

    Yes! Which is why Windows' own Optimize Disk program (the current name for the defrag feature) does defrag on a regular basis by default. And I should point out that because it is done regularly, fragmentation does not have a chance to get so bad as to affect performance. Which again implies the need for a more efficient 3rd party defragger is, once again, negated.

    Let's not forget that Windows uses its prefetch features, in conjunction with its Optimize Disk feature, to load the programs we use most faster. Other defrag programs don't.
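    If you have never looked, the prefetch data Windows maintains is easy to peek at from an elevated Command Prompt - purely illustrative, nothing in there should be changed or deleted:
    Code:
    dir %SystemRoot%\Prefetch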

    Now, if you are saving, deleting, and modifying tens or hundreds of big files every week, and you have limited free disk space, then you might very well need a more efficient 3rd party defrag program. But is that normal usage? Nope!

    FTR - I didn't take it as a MS criticism.
     
  8. RangerDanger

    RangerDanger Registered Member

    Joined:
    Apr 30, 2018
    Posts:
    120
    Location:
    Boston
    When I installed 1809 I first used Windows defrag, then used Glary, now Auslogics. I notice no difference. What I notice is Windows Defender slowing things down compared to a lighter AV.
     
  9. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,041
    Location:
    Nebraska, USA
    I use Windows Defender on all my systems. Never seen it drag anything down. But I know some have complained it does. That said, I have heard the same complaint from every security program ever made! Since every Windows computer becomes unique after the first few minutes of first boot, I don't expect every program will behave consistently on every computer.

    Personally, I like, use and recommend WD. But frankly, I don't care which anti-malware solution people use (except for Kaspersky but that's for a different discussion). I just want users to keep Windows updated, use a decent anti-malware solution they are comfortable with and keep it updated, and don't be "click-happy" on unsolicited downloads, links, attachments, and popups.
     
  10. RangerDanger

    RangerDanger Registered Member

    Joined:
    Apr 30, 2018
    Posts:
    120
    Location:
    Boston
    I will use Defender because when something does not work it's one less thing to worry about. In the real world (not YouTube) it works just as well as any other AV.
     
  11. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,041
    Location:
    Nebraska, USA
    I think this is what many just don't seem to understand.

    Microsoft does not code WD to score well in artificial, simulated lab tests. They don't sell Windows Defender (or a pro version for home consumers) so they don't need high laboratory test scores for marketing fodder. So they don't code for those scores, they code to protect consumers in real world scenarios.

    I keep reminding folks that of all the anti-malware product developers out there, only Microsoft does NOT have a financial incentive for the bad guys to do well!

    Think about it. What will happen to AVAST, BitDefender, Norton, McAfee, and all the other commercial players if they actually defeated malware and malware went away? They all would go out of business. So they need malware to thrive so they can too.

    Microsoft, on the other hand, always gets blamed for the security mess we are in even though it is the bad guys who put us here, and Norton, McAfee and others who failed to stop the bad guys.

    The ISPs and big telecommunications carriers are not helping either. The fact is, blocking malware on our own machines is not, and never will be, enough to defeat malware. The only way that will happen is to block malware and the bad guys at the source. But our ISPs and the big backbone carriers would much rather sell us more bandwidth than block spam (a major carrier of malware) and other malicious code BEFORE it gets distributed globally. :(

    It does not help that law enforcement worldwide does not have the resources to enforce even the current laws!

    But none of this has anything to do with the thread topic.
     
  12. ronjor

    ronjor Global Moderator

    Joined:
    Jul 21, 2003
    Posts:
    163,035
    Location:
    Texas
    Let's not go through this again.
     
  13. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,041
    Location:
    Nebraska, USA
    I agree. There's nothing left of that horse.
     
  14. EASTER

    EASTER Registered Member

    Joined:
    Jul 28, 2007
    Posts:
    11,126
    Location:
    U.S.A. (South)
    Since Windows 98 on this end, the routine has been to first scrap or scramble files (commonly known as cleaning) and then defrag; that has been a necessity for platter disk drives and always improved performance - it still does on 8.1. But, as so very well mentioned, the platter stacks right back up over a short period afterwards (hence "negated") due to that very thing: new files, new cluster locations, etc., effectively gumming up the MFT pointers as well. It's a never-ending task. Of course SSDs were designed, in part and in addition to performance, to let users avoid cleaning up and chasing after fragmented files, at least to a point. I dunno - this end still favors and uses mechanical drives. Old school, but still useful IMO.

    Appreciate the mention of those facts, because quite frankly I personally don't think Microsoft ever made much progress on the locked-file-clusters front; the geometry and circuitry patterns involved are too numerous to invest any more money into improving, as opposed to, say, security in the Windows Defender arena.

    We use UD5 on this end, which, with its graphical metaphor and some advances in accurately moving groups of new/old files around (described as file placement), makes it (albeit, as mentioned, temporary) much more effective at least for gauging how long you can go before another defrag is needed to reset the whole metal disk's population again. :)
     
  15. plat

    plat Registered Member

    Joined:
    Dec 19, 2018
    Posts:
    2,233
    Location:
    Brooklyn, NY
    Do you think HDDs should be phased out sometime soon? I only use mine for very occasional storage externally. They were cast-offs from old prebuilt computers.
     
  16. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,041
    Location:
    Nebraska, USA
    Not sure much progress is needed. It works the way it does to ensure no file is lost or corrupt until the system verifies the new version has been saved properly. And the file system has no control over the size of the new version either.

    As far as the file tables, they are pretty robust.

    Not necessarily designed for that. It is just that the design meant defragging was not necessary. Files are fragmented all the time on SSDs. However, because no mechanical R/W head has to go back and forth looking for the next cluster, no time is lost if the fragments are not next to each other.

    I like to look at it like a file cabinet where all the pages of a report are scattered throughout vs a mail sorting box. In a file cabinet, you must rifle through front to back and back to front to find each piece of paper and assemble them in the correct order. That takes a lot more time - not even counting it is a mechanical operation.

    In the mail sorting box, you just reach into the hole and grab. Then into the hole where the next fragment is. It takes the same amount of time to reach into each hole, regardless where it is.

    "Should" be phased out? I would like to see them go away, but I don't see it happening until the price per GB of SSDs at least equals that of HDs. And while the prices for SSDs have dropped dramatically, I think it will be a while before they are as economical as HDs.
     
  17. TairikuOkami

    TairikuOkami Registered Member

    Joined:
    Oct 10, 2005
    Posts:
    3,418
    Location:
    Slovakia
    You can always defrag outside of Windows; not only is it way faster, there are also no locked files or running processes, so you can defragment and consolidate everything.
    Code:
    rem Boot the Windows install USB - Repair your computer - Troubleshoot - Command Prompt, then type/enter:
    c:
    cd \windows\system32
    defrag c: /u
    rem /U prints the progress of the operation on screen while it runs
    I have short-stroked my HDDs, so the speed has increased by 30%, though I lost some space in the process, which I do not really need anyway.
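    For anyone who wants to try the same thing, here is a minimal diskpart sketch for short-stroking an empty secondary drive, run from an elevated Command Prompt. It is only an illustration - it assumes Disk 1 is a blank data drive (clean wipes it!), and the 200000 MB size is just an example that keeps roughly the first 200 GB on the fast outer tracks while leaving the rest unallocated:
    Code:
    diskpart
    list disk
    select disk 1
    clean
    create partition primary size=200000
    format fs=ntfs quick
    assign letter=D
    exit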
     
  18. EASTER

    EASTER Registered Member

    Joined:
    Jul 28, 2007
    Posts:
    11,126
    Location:
    U.S.A. (South)
    The remaining comments conveniently cut out from the quote
    can/should also be considered. You, as well as anyone else who has been in IT for any number of years, are surely aware that there are almost always areas within any O/S - obvious to the professional as well as to the user/customer - that could be improved on. Which area receives the greater attention and finance is a matter of priorities aligned with company policy.

    In this discussion, without pressing deeper into company/vendor technical detail - where neither of us is as expert or as privy to their data as the vendor - a user reported positive results BY DIRECT COMPARISON (his choice of defrag tool) on a matter that is important to the quality and adequacy of a computer system performing at its optimum expected level, one the customer is most comfortable with, without lag or degraded performance.

    My conclusion: progress is always necessary for the further development of the product marketed and its potential for higher returns on that effort and investment.

    In short, Windows Defender (with Win 10 anyway) quickly became a high priority for resources and development, in light of the security shortcomings of previous Windows versions, in order (in plain English) to prevent another crush of targeted malware attacks. Failing to meet security expectations yet again, in what is marketed as a complete, safe O/S, would not only quash the marketing investment but openly exhibit a continued waste of resources should that aspect prove vain - which it hasn't.

    In closing - this is a reach for a peon like myself, but times being what they are, with technological expertise continuing to climb to new heights, maybe one day we will discover there is no need for the age-old Windows practice of rebooting every time a new program or update rolls out. A pipedream for most, and dismissed by many, but not by me. That clearly depends on time and on the policy priorities of an O/S maker such as Microsoft (or another) flush with enough resources and determination to make further PROGRESS.
     
  19. pandlouk

    pandlouk Registered Member

    Joined:
    Jul 15, 2007
    Posts:
    2,976
    @Osaban Defragging is absolutely useless for speeding up your system. If you want an always-snappy HDD system you only need to partition it (i.e. create or shrink the system partition at the beginning of the disk and allocate between 1/10th and 1/5th of the HDD space). Windows' automatic defragging will take care of the rest.
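    A hedged sketch of what that shrink can look like with the built-in diskpart, from an elevated Command Prompt. The shrink desired= value is the amount to remove in MB; 800000 is only an example (roughly shrinking a 1 TB system volume down to about 200 GB), and Windows will refuse to shrink past immovable files:
    Code:
    diskpart
    select volume C
    shrink querymax
    shrink desired=800000
    exit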

    Panagiotis
     
  20. xxJackxx

    xxJackxx Registered Member

    Joined:
    Oct 23, 2008
    Posts:
    8,623
    Location:
    USA
    RAID 0 and/or SSDs are even better if you want faster. I am doing both. No need to worry about defrag.
     
  21. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,041
    Location:
    Nebraska, USA
    Even doing so in Safe Mode results in a more efficient defrag.
    I am not sure I follow your post, but I will, like you, cut out just the parts I want to respond to.

    I don't know whose security expectations you are referring to. If you mean the IT press/media (wannabe journalists, many bloggers, and their lemming followers and parrots), then I agree - they will never be happy with Microsoft, regardless of how well Microsoft does. Which is tragic, because it means (yet again) the public is getting biased opinions (and often pure falsehoods) instead of these outlets doing their job: informing the public of the facts. And the fact is that WD is more than adequate for the normal user. Sadly, many have the faulty "expectation" that everyone must have the absolute best security possible to be safe. That is just not true. Nobody needs to drive around in an Abrams tank to be safe. They need a fairly modern car that is properly maintained, and they need to drive defensively.
    Well, of course this is inaccurate (and biased?). Every time? Not hardly.

    But more importantly, that is an unreasonable expectation! Programs (and not just Windows) need to "open" files (many of them critical just for the program to run) and load those files into RAM. Many of those simply cannot be replaced while they are currently in use. It would be like changing your bicycle tire while you are riding the bicycle. Can't be done. This is exactly why Windows schedules such reboots in the middle of the night. And IMO, that is a good compromise.

    Would you rather go back to the old way and the only way to upgrade your programs and Windows was to send off for new floppy disks? Not me. To me, it is a big advantage being "connected" so Windows and my installed programs can be updated weekly (or even daily, if needed).

    Which leads us to pandlouk's comment below, and how that relates to files changing and free disk space.
    I don't agree with that - not today, anyway. You are referring to "short stroking", and years ago, when drives were much slower and files were updated much less often, that may have been worthwhile. But with today's fast, monster drives with larger buffers, the advantage is negligible.

    Remember, finding the first file fragment is what takes the most time. If each successive file segment is in the adjacent storage location, it is going to load very quickly. So if you keep the drive in one partition, you have a greater chance of maintaining the greatest amount of free disk space which offers the best opportunity to keep all the file segments together (not fragmented).

    If you create a small partition for your OS, you risk running out of free disk space in that partition - especially today, with frequent updates changing file sizes, which affects free disk space and may aggravate fragmentation. I note that running out of free disk space for the OS to operate freely in is a leading cause of computer sluggishness.
     
  22. pandlouk

    pandlouk Registered Member

    Joined:
    Jul 15, 2007
    Posts:
    2,976
    Yes, I was referring to "short stroking", but not for compact/minimum seek times between the various files or segments - simply because the outer cylinders of the disk are much faster than the inner cylinders. And the new monster drives still behave the same: the closer you get to the center of the disk, the slower the spinning speed of the cylinder (= slower reading/writing times for each sector).
    On a 1 TB disk, allocating a 200 GB partition for the OS and the daily-used programs is not enough? o_O

    Panagiotis
     
  23. xxJackxx

    xxJackxx Registered Member

    Joined:
    Oct 23, 2008
    Posts:
    8,623
    Location:
    USA
    I agree, I would expect short stroking to be at least as effective as ever. Spindle speed hasn't changed much in the last 15 years. Drives are faster and larger in capacity because of higher platter density, which I would expect to create a bigger difference between the inner and outer tracks, if it did anything at all. I remember when the hot setup was to create a 3-disk RAID 0 array and short stroke that. That said, Amazon had a 480 GB SSD for just over $58 yesterday, so I will likely never buy another platter drive again.
     
  24. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,041
    Location:
    Nebraska, USA
    Ummmm, kinda, sorta, but not really how it works.

    Your basic claim is right, and in theory you are right - but your reasoning is way off, and in practice, in the real world, the advantages don't pay off.

    First, writes don't count. Yes, they are part of a drive's performance specification, but the OS can quickly hand off any write task to various caches and buffers, then be freed to move on to other tasks. Point being, the user typically does not have to wait for writes to complete.

    So it is all about read times.

    The platters (the actual disks inside the disk drive) are solid pieces of metal. If a disk is spinning at 7200RPM, the entire disk is spinning at 7200RPM. That is, the "spinning speed" is the same in the outer edges as it is in the inner locations. That is, it takes the exact same amount of time for a point (read: 1 "bit" of data) in the outer ring to travel one revolution as it does for a point in the innermost ring.

    But, because the outer ring has a greater circumference, there are more sectors (storage locations) in the outer rings. This means, in theory, the stepper motor does not have to spend time moving from ring to ring as much as it might with a file on the inner rings.

    So the bottom line is this. The R/W head can read in more data in 1 revolution from the outer rings than it can in 1 revolution from the inner rings. And this is the point that totally agrees with your comment.
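    To put rough numbers on that (back-of-the-envelope only):
    Code:
    7200 RPM = 120 revolutions per second
    1 revolution = 1/120 s, or about 8.3 ms - the same at the outer edge as at the hub
    If an outer track holds roughly twice the sectors of an inner track, one revolution
    there reads roughly twice the data: higher throughput, same rotation time.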

    Years and years ago, with older, slower, and most importantly, smaller drives, this did matter. But today, any savings in time is totally insignificant. Why? For many reasons.

    First, today's drives in general are just faster. The average "seek time" for today's drives is <10ms. The speed of "human perception" for a young adult in his or her prime is at best ~50ms. That's 50 thousandths of 1 second or 5/100 of a second. And drives are much faster than that.

    Second, drives have much higher capacities today. That means the data is stored in much greater densities (much more data is jammed into the same amount of space). This means even in the smaller inner rings, a tremendous amount of data (a much greater portion of the file - if not the entire file) can be read in. That means even with less data in one inner ring, there is less need for the R/W to move to another ring.

    Third, as long as the user didn't dink with the defaults (and hopefully they didn't) Windows knows how to use the Page File on the disk much more efficiently. And while the PF may be located somewhere other than the outer rings, it is still 1 contiguous chunk of space (for less R/W ring jumping) - assuming the user did not allow free disk space to decrease to critical levels.

    Fourth, virtually all motherboards and disk drives today support the much faster SATA III (6Gbps) interfaces.

    Fifth, years ago, it was not uncommon for drives to have just 2MB of slow (compared to today) buffer space integrated with the drive's controller card. 8MB was considered a lot. Today, Newegg's least expensive new drive has a whopping and much faster 64MB buffer.

    The above is just about the drives. Other factors come into play too, including the amount of system RAM, processor performance, the OS and more.

    But also, as noted above, because virtually all computers today have internet access, confining the OS to a tiny partition WILL result in greater fragmentation of critical Windows system files - and that will slow file access times down more than where the files are located on the disk.

    So yes, you are right, and faster file load times can be achieved if the file is stored in the outer rings vs the inner rings. But with today's big and fast drives, if the user performs his user responsibilities and allows Windows to regularly defrag the disk (or even regularly defrags with a 3rd party defragger), and if the user maintains an ample amount of free disk space (I recommend 30 GB), then at best, and only under ideal conditions, you are talking about a few scant milliseconds. Not even a full second. In other words, nothing perceivable except, maybe, in benchmark programs.

    If the desire is to have the OS on its own, the much better solution would be to have the OS on a separate drive, not a separate partition. Operating systems are smart enough and know how to efficiently access two drives at the same time. So, for example, if you have Windows on Drive 0 and the C drive, and have MS Office on Drive 1 the D drive, and you start Word, the OS can access critical OS files on C and load Word from D simultaneously. Two R/W heads loading data at the same time.

    When Office and the OS are on the same drive (regardless if in different partitions) all files must be loaded sequentially. Just one R/W jumping all over the place. And that would be noticeable.
     
  25. Brummelchen

    Brummelchen Registered Member

    Joined:
    Jan 3, 2009
    Posts:
    5,868
    For Defraggler and similar "small" defraggers you need to know that they all use the Windows API to move files - no driver - so they are bound to be slow and the results poor. Stick with the Windows defragger service...
    Running it (portable) here, maybe twice a year. I don't care much because I use fast drives.

    At the least you need a certain percentage of free space to consolidate bigger files - ISOs, RAW images, video, and the like.

    CAV - constant angular velocity - is a thing of the past; modern HDDs run completely dynamically.
     