Can Vista run on this hardware?

Discussion in 'other software & services' started by noone_particular, May 26, 2015.

  1. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    Linux turned out to be a no-go. She couldn't handle the changes. Did the factory reset. Updating this thing took forever. Between the OS, updates, an AV, and minimal software, this pig is 40GB after removing bloated apps like Norton and Adobe. Disabled all the eye candy. Went back to a classic interface. Disabled Superfetch. Gave her SeaMonkey and PaleMoon. Installed Avast, which seems reasonably light, all considered. The performance is only tolerable, but it's much better than it was. Thanks all for the input.
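    For anyone else doing the same cleanup who'd rather script it: here's a minimal sketch of the Superfetch part in Python (assuming Python is on the box; "SysMain" is the actual service name behind Superfetch). Run it from an elevated prompt:

    Code:
    # Stop the Superfetch service and keep it from starting again.
    # "SysMain" is the service name behind Superfetch on Vista/7.
    import subprocess

    subprocess.run(["sc", "stop", "SysMain"], check=False)   # may already be stopped
    subprocess.run(["sc", "config", "SysMain", "start=", "disabled"], check=True)

    The same two calls work for any other service you want out of the way; just swap in the service name.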

    This was the first Vista unit I've serviced. I hope it's the last. That is the most annoying, bloated, sluggish, counter-intuitive mess I've ever used. I never imagined that it could take minutes (as opposed to seconds) to delete a file or a shortcut, or that an OS could prompt you that many times for the same thing. The hardware specs of that Vista unit are fairly comparable to the unit my 98 system runs on, except for the processor. The 98 unit has a faster processor, 2.4 GHz Pentium vs 1.8 GHz Celeron. What a difference! I can't make mine run that slow, no matter what I'm doing. I can definitely see why so many hated it. Never imagined that an OS could need so much RAM and disk space just to run at all, or use so much processor time when you're not doing anything. If doing less with more is what they call progress, I'm glad that I'm behind the times.
     
  2. Kerodo

    Kerodo Registered Member

    Joined:
    Oct 5, 2004
    Posts:
    8,013
    Vista was really terrible... I bought a new desktop that came with it back in 2007 and I never really realized how bad Vista was till I installed Win 7 the first day it was released. What a difference! Everything was 2 or 3 times faster and slicker, although there still was quite a lot of bloat compared to XP or Win2k. Anyway, yeah, Vista stinks....
     
  3. Osaban

    Osaban Registered Member

    Joined:
    Apr 11, 2005
    Posts:
    5,618
    Location:
    Milan and Seoul
    Not my experience; I'm still using it on my fairly old notebook (Core 2 CPU T7400 @ 2.16 GHz with 2 GB of RAM) and it's almost as fast as my Core i7 machine with Win 8. The only problem was that Vista was launched when computers were all too low on RAM. After testing Win 7 on the same machine it was slightly faster, but not enough to upgrade IMO.

    It is already a miracle that a 1.8 GHz Celeron with 1 GB RAM can run Vista; as soon as the owner streams a video on Facebook or anywhere else, the machine will probably freeze. My machine, with Chrome and 2-3 tabs active, normally runs at 1.10-1.12 GB...
     
  4. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    IMO, it's insane that an OS should need a GB of RAM just to run itself. The mentality behind the design of operating systems and software in general is, IMO, disgusting. Just because RAM and disk space are almost unlimited, both the OS and the applications seem to try to see how much of each they can consume. In the end, the user ends up right back where they started. What good is 5 times the RAM if the system needs that much more to do the same things it used to do on less? If operating systems and software were still coded with efficiency in mind, like they were years ago, modern hardware could run several dozen virtual systems at once and not even come close to using up the RAM.
     
  5. MisterB

    MisterB Registered Member

    Joined:
    May 31, 2013
    Posts:
    1,267
    Location:
    Southern Rocky Mountains USA
    Vista on a low-spec machine is really bad. The main thing I like about it is the full-on classic interface with the classic start menu. I get good performance on it with a Core Duo or Core 2 Duo system with 2-3 GB of RAM. Windows 7 is extremely close to Vista but optimized for performance. Microsoft had years of complaints about Vista's performance issues, so they called it Windows 7 instead of Service Pack 3 for Vista. Almost every time I've tried a Vista driver with Windows 7, it's worked.
     
  6. Q Section

    Q Section Registered Member

    Joined:
    Feb 5, 2003
    Posts:
    778
    Location:
    Headquarters - London & Field Offices -Worldwide
    Sorry if this was covered already (the whole thread was not read), but older machines used hard drives that spun slower (sometimes at 4200 RPM, for example), so trying a 7200 RPM hard drive could be a step up that makes a goodly performance difference.

    Also, whether or not you upgrade to a faster hard drive, make sure you defrag the HDD using a smart defrag tool like Smart Defrag. It places frequently used files near the outer edge of the platter, where throughput is highest (at the same RPM, the outer tracks pass under the head faster), and seldom-used files in the inner portion, which is better suited to archives.

    If you can get a legitimate copy of Windows 7 and ascertain drivers are available for the hardware then this will get you some additional performance as well.

    Best wishes
     
  7. roger_m

    roger_m Registered Member

    Joined:
    Jan 25, 2009
    Posts:
    8,626
    Despite what many people say, in my opinion Vista is an excellent operating system, and a huge improvement over XP. But with only 1GB RAM the computer is going to run slowly. If it is at all possible to upgrade to 3GB of RAM (it will cost very little), and you make sure both Vista service packs are installed, then the computer will probably run just fine unless it's got a really slow CPU. My experience has been that Vista with SP1 and SP2 installed runs nearly as well as Windows 7 (but can be slow and problematic if the service packs are not installed).
     
  8. Daveski17

    Daveski17 Registered Member

    Joined:
    Nov 11, 2008
    Posts:
    10,239
    Location:
    Lloegyr
    Vista looked beautiful and was a good idea; apart from that it was total ******** [possibly offensive word removed]. It had more bugs than a rain forest and was painfully slow at everything. :mad:
     
    Last edited by a moderator: Jun 8, 2015
  9. Gullible Jones

    Gullible Jones Registered Member

    Joined:
    May 16, 2013
    Posts:
    1,466
    Just FYI, a Pentium Dual-Core is not an updated Pentium 4 (don't confuse it with the older Pentium D, which essentially was two Pentium 4 cores). It's basically the same beast as a Core 2 Duo, i.e. a much more recent processor. Don't mind the clock speed; a 1.6 GHz Pentium Dual-Core will blow away a 2.8 GHz Pentium 4.

    @noone_particular

    Some of the higher RAM consumption goes to useful features, or even vital ones, e.g. internationalization.

    That said, this kind of creeping software bloat bothers me too; to be blunt, it fuels an economy of wasteful consumption, and makes computing power a social privilege (and thus further restricts access to information, public services, etc. etc.)

    I consider it more a problem of mass culture than of coding practice, though... for whatever that's worth.

    Anyway sorry I couldn't be of more help. I've been trying for some time to figure out the causes of Windows Rot, but thus far I've got nothing at all. (Other than ever-increasing anecdotal evidence that yes, Windows does slow down over time for no good reason.)
     
  10. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    Correct me if I'm wrong. In addition to existing as files, isn't the registry for Vista and newer OSes also completely loaded into memory? Since that registry also stores a mountain of usage records, loading it into memory would be a constantly increasing load over time. That would leave less available for everything else, plus it would take more CPU time to process it. It would take some time, but there would be a way to test this. I assume that there's an equivalent to ERUNT for Vista, 7, and 8. Use a tool like ERUNT to back up a fairly clean registry. Time the system's performance. Then work the unit to create a lot of usage tracks and activity records until the rot starts showing. Then switch back to the clean registry and compare.
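    If anyone wants to try a crude version of that test without ERUNT, here's a sketch that just times a full walk of HKEY_CURRENT_USER in Python. The assumption (mine, not a proven rot metric) is that the longer the walk, the more registry the system is hauling around. Run it on the clean registry, again after the rot shows, and compare:

    Code:
    # Time a recursive walk of HKEY_CURRENT_USER as a rough proxy for
    # how much registry the system has to process.
    import time
    import winreg

    def count_keys(root, path=""):
        total = 1
        try:
            with winreg.OpenKey(root, path) as key:
                i = 0
                while True:
                    try:
                        sub = winreg.EnumKey(key, i)
                    except OSError:
                        break  # no more subkeys
                    total += count_keys(root, path + "\\" + sub if path else sub)
                    i += 1
        except OSError:
            pass  # some keys aren't readable from a limited account
        return total

    start = time.time()
    n = count_keys(winreg.HKEY_CURRENT_USER)
    print("walked %d keys in %.2f seconds" % (n, time.time() - start))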
     
  11. Gullible Jones

    Gullible Jones Registered Member

    Joined:
    May 16, 2013
    Posts:
    1,466
    @noone_particular

    Nope, the registry is loaded into memory on an as-needed basis. (It's been that way since Windows 2000, IIRC.) Also IIRC, it should not exceed 120 MB or so in total size.

    I'm guessing that the performance loss might have something to do with NTFS, and the general workings thereof. Not necessarily fragmentation; maybe something like, say, the hashing algorithms used to locate data.

    (In that case, the problem would be unsolvable for users - the fix would require design changes in the next version of NTFS.)

    Edit: I would hope it's something more obvious and more fixable than that. But there are a lot of places stuff can go wrong, in relation to how the OS interacts with the filesystem and HDD.
     
  12. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    If that were XP, that idea could be tested by building two identical systems, one NTFS, one FAT32. Somewhere I read that Vista could be made to work on FAT32. No idea if that would apply to 7. I've never liked NTFS for many reasons. Except for one virtual test system, an XP/98 dual boot, I've stayed with FAT32. I've long suspected NTFS of being a major factor in hard drive wear. That aside, I have no use for a file system that can conceal executables in alternate data streams (ADS) and allow them to function. ADS might be another possibility to check into. Could rot be related to the number of ADS or the amount of data they contain? You mentioned looking into this rot. Can I assume that you've ruled out the number of temp files in their combined locations as a source of the problem? I suspect that Windows rot won't have a single cause. I'd bet on multiple causes, each of which is insignificant on its own but which combine to badly slow a system.
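    To make the ADS concern concrete, here's a tiny Python demonstration anyone can run on an NTFS volume (the filenames are just examples). The second stream never shows up in the file's size, in dir, or in Explorer:

    Code:
    # Demonstrate an alternate data stream on NTFS (Windows only).
    import os

    with open("carrier.txt", "w") as f:
        f.write("harmless visible contents")

    # Anything after the colon is a separate, hidden stream on the same file.
    with open("carrier.txt:hidden", "w") as f:
        f.write("this part is invisible to dir and Explorer")

    print("visible size:", os.path.getsize("carrier.txt"))  # unnamed stream only
    with open("carrier.txt:hidden") as f:
        print("hidden stream says:", f.read())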
     
  13. Gullible Jones

    Gullible Jones Registered Member

    Joined:
    May 16, 2013
    Posts:
    1,466
    Might work. I don't know how FAT32 performance in various areas compares to NTFS though.

    I wouldn't even bother trying that; Vista and later are heavily dependent on NTFS access control lists. I suspect such a system would not be stable, day to day.

    No idea there; I don't have enough cumulative Windows experience to rate its impact on HDDs. From what I've seen BTW, hard disks usually either
    a) last year after year without any signs of failure,
    b) show dubious and unreliable behavior from day one, or
    c) are dead on arrival.

    I have yet to see a laptop or desktop hard disk fail during the course of normal use.

    (But then, I've been using Linux almost exclusively for 10+ years.)

    No idea, though I don't think that's the issue now (see below).

    Yeah, that's ruled out. The laptop I was experimenting on was kept quite clean of temporary files. In any case I don't think that should matter, unless the files were numerous enough to either
    a) eat up a lot of disk space, or
    b) strain the limits of the filesystem's search algorithms.

    Possibly. However, I think it may actually be something simple...

    See, the laptop I was experimenting with (and on which I'm now typing) was booting in about 1 minute and 15 seconds after a fairly fresh install, and programs were pretty unresponsive. Memory usage was high, disk I/O was very frequent, etc. etc.

    At one point I took a look at the filesystem, and noticed that files were basically concentrated in two "stripes" - one in the middle of the C: partition, and one at the beginning.

    Remembering how spinning HDDs work: the middle zone should have lower throughput than the outer (beginning) zone. Also there's going to be significant seek delay, moving the read/write head from beginning to middle and back. A lot of system files were located in the middle "stripe," while others were at the beginning - so there were seek delays all over the place.

    Windows' built-in defragger is not very aggressive. It will not consider such a layout "fragmented." Instead I opted for a third-party tool (MyDefrag) and ran the intensive "monthly" routine on the C: drive. That took about an hour.

    On rebooting, boot time was 40 seconds. Subsequently, memory usage was down by about half(!), and disk I/O only occurred when starting programs or saving files.

    So, I'm going to call this mostly solved. My conclusions are:

    1. There's something wrong with how Windows places data on NTFS filesystems. I can kind of understand the "two stripes" thing, since it's a balance of wanting sequential I/O vs. wanting to limit file fragmentation; but even so, I should not be seeing such a weird distribution of files after a fresh install!

    2. Windows has an additional disadvantage, especially in new versions, because it's huge. You can't just put C: on its own little 20 GB partition (which is SOP for Linux root filesystems - OpenSUSE even does that by default).

    3. The Windows built-in defragger is not nearly aggressive enough for dealing with NTFS. It rates filesystems as not needing defrag even when they perform very poorly, specifically due to the layout of data on the filesystem...

    But there's also a missing piece somewhere. On Linux, you can partition your drive in a totally whacky fashion - home partition at the beginning, and root waaaaay over on the end of the drive, for instance - and you won't get 10% of the performance loss I was seeing on Windows. So I'm thinking there's something majorly off about Windows 7's I/O scheduling, or NTFS, or something.

    Anyway, the practical upshot seems to be
    1. Use a honking big hard disk
    2. Make your C: partition big to avoid fragmentation
    3. Run an intensive defrag, using a third-party defrag tool, every once in a while (see the sketch below for a quick way to check the built-in analyzer's verdict first)
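    If you want to see just how lenient the built-in analyzer is before reaching for MyDefrag, here's a quick sketch (Python, elevated prompt, assuming the Win7-era defrag.exe switches) that dumps its analysis:

    Code:
    # Ask the built-in defragmenter for an analysis of C: and print it.
    # /A = analyze only, /V = verbose. Requires an elevated prompt.
    import subprocess

    result = subprocess.run(["defrag", "C:", "/A", "/V"],
                            capture_output=True, text=True)
    print(result.stdout)

    Compare its "you don't need to defragment" verdict with what MyDefrag's map shows and you'll see what I mean.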

    ...

    Anyway, yeah, it's kind of cool to see Windows finally performing on par with Linux. But I would absolutely hate to be dealing with this OS in a server environment, on top of large RAID arrays or whatever. I mean... darn. o_O

    Edit: mind, the above is from experiments on one old Win7 laptop, so the usual grains of salt etc. apply.
     
  14. MisterB

    MisterB Registered Member

    Joined:
    May 31, 2013
    Posts:
    1,267
    Location:
    Southern Rocky Mountains USA
    There are a lot of variables in the Windows equation. What works on one machine doesn't work on another similar one. Windows rot can come from many causes. I have one computer that I'm using Vista on regularly. The main issues it has are a slow Explorer that freezes and crashes if overburdened with thumbnails, and periodic page file memory burps that slow the whole system down for several minutes. Windows 7 improved but didn't really fix the Explorer issues. Mostly it is faster, but that depends. On my Vista laptop, I tried Windows 7, but it had worse page file memory burps than Vista, including one on every boot-up that lasted about 5 minutes. I think that was due to a low-end GPU, because on a similar laptop with a good ATI GPU, Windows 7 barely burps at all.

    I found a very simple solution to Windows rot years ago, imaging. I just keep the system imaged and if any rot starts to happen, I restore the last rot free image. Then I think about what changes I made to the system and what software I installed and usually I figure out what caused it and don't make the same mistake again.

    There is no way I would use FAT32 as a system partition: no file permissions and no security. File permissions are the most basic and reliable way to control executables I've found, although NTFS is overly complex in that regard. That being said, after playing with it enough, I've found an approach that works, and it is mostly based on eliminating unnecessary users and groups and simplifying the default permissions.
     
  15. rodocop

    rodocop Registered Member

    Joined:
    May 1, 2010
    Posts:
    74
    Well, sure Vista CAN DO IT. But you need to help the PC:

    1. Switch off all the unneeded stuff like services - there are numerous guides on how to do this, but the one thing I recommend checking: make sure the Windows Defender service is OFF. It does almost nothing on Vista to defend you, but it is a really hardcore resource hog. Many other services can be set to OFF or Manual start (see the sketch after this list).

    2. Uninstall any IS or complex antivirus suite you use and use the 'Safe Admin' config by our comrade. (If you cannot live without an AV suite, try the lightweight Bitdefender Free, Forticlient, or Avetix.)

    3. Uninstall unused software (especially vendor preinstalled bloatware)

    4. Switch to one of the lightweight browsers like K-Meleon (the lightest and best option) or TheWorld6 (Chrome without extensions but with adblock). Chrome isn't really compatible with Vista on 1 GB RAM ;-)

    5. Replace other 'heavy' apps with lighter alternatives.

    6. Delete all the possible stuff from startup: look at not only apps but also scheduled tasks and unused system components, if any (AnVir Task Manager will help you manage advanced startup - it can also guard startup for you).

    7. Optional but useful: uninstall all the .NET versions completely and reinstall the final packages of .NET 3.5 SP1 and .NET 4.6 from Microsoft.

    8. Clean the WinSxS folder as described here.

    9. Check your PC for PUPs with the latest AdwCleaner and Ultra Adware Killer.

    10. Clean all the TEMP folders.
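    Here's the promised sketch for step 1, for the script-minded (Python; the service names are my assumptions for a stock Vista box, so check each with "sc query <name>" before flipping it):

    Code:
    # Set a few well-known Vista resource hogs to disabled/manual.
    # Run from an elevated prompt.
    import subprocess

    disable = ["WinDefend"]             # Windows Defender (step 1 above)
    manual  = ["SysMain", "WSearch"]    # Superfetch, Windows Search

    for svc in disable:
        subprocess.run(["sc", "config", svc, "start=", "disabled"], check=False)
    for svc in manual:
        subprocess.run(["sc", "config", svc, "start=", "demand"], check=False)  # demand = Manual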

    I recommend SlimCleaner or Toolwiz Care for complex maintenance of your PC. If you think cleaning the registry is a good idea, I recommend Argente Registry Cleaner Portable (use the backup option while cleaning!), as it does the deepest and most accurate cleaning I know of.

    I think this guide will help you.
     
  16. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    I do like the file permissions of NTFS. That's the only reason I even have it on a test unit. On that test unit, the 98 system has Paragon NTFS for 98 installed, which gives me a very interesting view of the XP system.
    When my physical unit was dual boot, 98 and XP, I used FAT32 on both for compatibility.

    I question just how effective file permissions would be against malware (or an attacker) that has low-level disk access. If I recall right, these permissions are stored as specific bits, either in the files themselves or in the MFT. With low-level access, an adversary can change those permissions. IMO, NTFS is a security tradeoff. If ADS could be eliminated from its design, I'd consider using it. As it is now, I regard the liability of ADS as greater than the asset of file permissions.
    I haven't had a rot issue in a long time either. For all practical purposes, my system partition doesn't change at all. I moved the temp folders, browser cache, etc. to a RAM drive. The desktop is on a data partition. Virtual systems have their own partition. The registry is restored to a clean and optimized state on every reboot. My system performs the same as it did when I completed it a few years ago.
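    For anyone wanting to copy the temp-folder part of that setup, here's a sketch of the per-user half in Python (R: is hypothetical; substitute whatever letter your RAM drive gets). It just rewrites the TEMP/TMP values under HKCU\Environment:

    Code:
    # Point the per-user temp folders at a RAM drive (takes effect at next logon).
    import winreg

    ram_temp = r"R:\Temp"  # hypothetical RAM drive letter
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, "Environment", 0,
                        winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "TEMP", 0, winreg.REG_EXPAND_SZ, ram_temp)
        winreg.SetValueEx(key, "TMP", 0, winreg.REG_EXPAND_SZ, ram_temp)

    There are also machine-wide TEMP values under HKLM, and the browser cache move is per-browser, so this is only part of the job.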
     
  17. MisterB

    MisterB Registered Member

    Joined:
    May 31, 2013
    Posts:
    1,267
    Location:
    Southern Rocky Mountains USA
    Low-level disk access usually requires physical access to the machine. I can bypass just about anything if I can boot the machine from an optical disk or USB stick. The only defense against this is encryption. File permissions work when the system is live. They are pretty effective against web-based threats like exploits and drive-by installations. I find them to be more effective than SRP, but I usually use both in conjunction.

    One thing I've learned with NTFS is to always test the permissions to see if they really work as intended.
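    The simplest test I know is to try the operation you think is blocked, from the account that's supposed to be blocked. A throwaway Python sketch (the path is only an example):

    Code:
    # Try to create a file where a limited user should be denied.
    import os

    probe = r"C:\Windows\acl_probe.tmp"  # example location a LUA shouldn't write to
    try:
        with open(probe, "w") as f:
            f.write("x")
        os.remove(probe)
        print("write ALLOWED - permissions are looser than intended")
    except PermissionError:
        print("write denied, as intended for a limited account")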

    Windows group policy does allow you to disable execution from optical and removable storage, so FAT32 external disks can be used without allowing unknown software to execute from them. There is just no way to do it if it is an internal disk, and it can't be done at the folder level like file permissions. Once again, that only applies to a booted Windows system. If the disk is bootable, it can run all kinds of malware if it is allowed to boot.
     
  18. Gullible Jones

    Gullible Jones Registered Member

    Joined:
    May 16, 2013
    Posts:
    1,466
    If someone has low-level disk access, the game's up anyway. That means all your security layers have been bypassed, and the OS (and everything on it) must be considered untrustworthy.

    Note that this does not necessarily require physical access to the computer BTW. For instance, it's exactly how MBR rootkits work. (They directly overwrite some of the data at the start of the disk.)
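    To see how thin the barrier is, here's a sketch that reads the MBR the same way (Python, needs admin rights; read-only, so it's safe to try). A rootkit with the same access just writes those bytes instead:

    Code:
    # Read raw sector 0 (the MBR) of the first disk. Requires admin rights.
    import struct

    with open(r"\\.\PhysicalDrive0", "rb") as disk:
        mbr = disk.read(512)

    # A valid MBR ends with the 0x55 0xAA boot signature.
    sig = struct.unpack("<H", mbr[510:512])[0]
    print("boot signature present" if sig == 0xAA55 else "no boot signature")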
     
  19. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    I believe that some of Windows' own components have that access. Disk Management would almost have to in order to function.
     
  20. Gullible Jones

    Gullible Jones Registered Member

    Joined:
    May 16, 2013
    Posts:
    1,466
    Yes - and those components operate at a higher privilege level, which other software can be blocked from accessing. (By mandatory access control on Vista and later, or by third-party stuff or limited accounts on XP.) There are different divisions of responsibility within the OS.

    (A lot of divisions, in fact. Windows has had API functionality for various kinds of mandatory access control since at least Windows 2000, possibly since NT 4; but nobody made much use of it until Google Chrome appeared on the scene.)

    Point is, parts of the OS always operate with SYSTEM privileges, but an attacker (human, malware, whatever) doesn't necessarily have control of them. Part of the role of security software is to reduce the likelihood of someone or something getting there. But once they're actually there, they can theoretically control everything including the security software. The computer can therefore no longer be trusted.

    ...

    More in depth explanation, for those willing to listen to my 'splaining:

    Stuff done at a SYSTEM privilege level gets permission to do whatever via the kernel. The kernel code runs in CPU ring 0, i.e. in the hardware's most privileged, protected state. So there's no tampering with that, barring
    a) A flaw in the design of the hardware. (Not common, fortunately.)
    b) A bug in the computer's firmware. (Probably more common than I'd like to think about.)
    c) A bug in the kernel. (Quite common.)
    d) A bug in a program that uses SYSTEM privileges, with the kernel's permission. (Also quite common.)

    And the levels of compromise would be roughly like this:
    Stuck in sandbox -> access to sandboxed program's data and such
    Limited account -> access to your personal data, but difficult to hide from a skilled user
    Admin account -> access to more data, higher possibility of jumping to SYSTEM (though HIPS may block that)
    SYSTEM access -> access to all data and functionality, plus the ability to hide any tracks on the local system
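    For the curious, a trivial way to check which rung a given process is on (Python sketch, using a shell32 call that has been around since the XP days):

    Code:
    # Report whether this process holds an administrator token.
    import ctypes

    if ctypes.windll.shell32.IsUserAnAdmin():
        print("elevated admin token - one bug away from SYSTEM")
    else:
        print("limited/standard token - much less to lose if hijacked")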

    If an attacker has SYSTEM access on one machine, IMO the best bet for noticing is probably an external firewall, preferably with a proxy. If you notice traffic to a C&C server... well, there's your compromise.

    ...

    Note also that on Windows 9x and ME, none of this applies. Those versions of Windows do not make use of x86 memory protection - any program can write to any other program's address space. If you're lucky this will "only" result in a massive system crash.

    (Also note, however, that I am not a network security consultant or suchlike. Some of the above might need correction...)
     
  21. safeguy

    safeguy Registered Member

    Joined:
    Jun 14, 2010
    Posts:
    1,795
    First and foremost, you would want to begin with Vista SP2 (at least SP1) because the updates resolve many issues that have been reported.

    While you can theoretically run Vista with 1GB of RAM, it would be a poor experience.

    Vista was designed to maximize the use of hardware resources that are available. There are 2 issues here.

    1. Maximizing the use of available hardware resources makes sense on systems with decent or powerful specs, but can be taxing on systems that are running on less.
    2. It does not help that Microsoft understated the system requirements.

    Superfetch & ReadyBoost:
    http://blogs.technet.com/b/askperf/archive/2007/03/29/windows-vista-superfetch-readyboost.aspx
    https://technet.microsoft.com/en-us/magazine/2007.03.vistakernel.aspx

    If you find constant hard drive activity on a unit with Vista installed, it is mostly due to Superfetch & the Search Indexer. You might want to tweak them as described here:
    http://www.tweakguides.com/VA_5.html

    Instead of disabling Superfetch, you can try setting it to cache boot files only:
    http://www.howtogeek.com/howto/wind...tch-to-only-cache-system-boot-files-in-vista/
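    If memory serves, that guide boils down to one registry value; here is the same tweak as a Python sketch (elevated prompt; the values as I understand them: 0 = off, 1 = boot files only, 2 = applications only, 3 = everything, the default):

    Code:
    # Set Superfetch to cache boot files only, then restart the service.
    import subprocess
    import winreg

    path = (r"SYSTEM\CurrentControlSet\Control\Session Manager"
            r"\Memory Management\PrefetchParameters")
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path, 0,
                        winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "EnableSuperfetch", 0, winreg.REG_DWORD, 1)

    subprocess.run(["sc", "stop", "SysMain"], check=False)   # Superfetch service
    subprocess.run(["sc", "start", "SysMain"], check=False)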

    As for fragmentation, you might want to look at these:
    https://technet.microsoft.com/en-us/magazine/2007.11.desktopfiles.aspx
    http://blogs.technet.com/b/filecab/...s-cover-why-windows-vista-defrag-is-cool.aspx
    http://blogs.technet.com/b/askperf/archive/2008/03/14/disk-fragmentation-and-system-performance.aspx
     
  22. MisterB

    MisterB Registered Member

    Joined:
    May 31, 2013
    Posts:
    1,267
    Location:
    Southern Rocky Mountains USA
    Access control came with Windows NT 3.1. Microsoft wanted to enter the market that UNIX dominated at that time, and it needed an OS that had multi-user security features. Maybe not mandatory, but available when necessary.

    NT4 was the first Windows OS I set up with a LUA and access controls. I still have fond memories of it, because it was not only more secure than the consumer Windows 95/98/ME versions based on MS-DOS, it performed extremely well compared to them too. It was faster and didn't suffer from crashes. It was possible to work for hours, even days, without suffering a crash and reboot.

    SYSTEM is the holy grail of privilege escalation, but administrator is more than adequate for malware to do lots of damage. And once malware is at administrator, SYSTEM is only a reboot away. Once you have a LUA set up, you want to stay at that privilege level unless you need to do something that requires administrator privileges.

    The Mark Russinovich article is the better read. He always understands how the system is really working. The first is just a glowing description of Superfetch that doesn't bring up the real-world performance issues due to constant disk access. Superfetch makes presumptions about computer usage that don't always correspond to real-world use. If you start an application that isn't prefetched, launch time is slower, not faster. In any case, it is in no way necessary for the OS to function and can be safely disabled. Tweaking it to cache boot files is interesting, but I don't see much advantage over simply disabling it, especially considering the system Noone is dealing with.
     
  23. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    In many ways, Microsoft has got it completely backwards. For all practical purposes, the 9X versions were single-user systems. There was no real separation between user accounts, or profiles as they called them. Strangely enough, on a 98 unit, each user profile could specify its own screen resolution; on Win2000, you needed third-party software for that. For home users, the 9X systems were family units: multiple users on what was designed as a single-user OS. Contrast that with today. The NT systems are multi-user by design but are running on personal devices. Now it's devices per user instead of users per device, with both scenarios running operating systems designed for the opposite one.

    Excluding added bloatware and infections, the only time I've found NT systems to be faster was when they were running on superior hardware. As for the stability issues, 98 was pretty bad out of the box. The number one cause of that instability was Internet Explorer. IE6 was a huge drain on memory and resources on 9X systems, resources you couldn't get back without a reboot. On my first 98 unit, I could get several hours of use with IE before the system became unstable. When I switched to the Mozilla Suite, the usable time went from several hours to several days. After I used IEradicator to completely remove IE, the usable time was over a week. With the Revolutions Pack upgrade from MSFN, the resource management issues were largely solved. Microsoft could have fixed those issues but chose not to, so they could market the NT systems as more stable and reliable, a pattern they've repeated with every OS since.
     
  24. MisterB

    MisterB Registered Member

    Joined:
    May 31, 2013
    Posts:
    1,267
    Location:
    Southern Rocky Mountains USA
    You can use an NT-based system as a single-user computer with no problem, and most users do, in full administrator accounts, which is one of the main reasons Windows security is such a big mess. I generally have just an administrator account and a limited user account that I work in almost all of the time. Part of keeping things secure is keeping them simple enough that you can keep track of them.

    The MS-DOS-based versions of Windows were definitely single-user systems. NT4 was so fast and secure partly because it didn't have Plug and Play, which came in with Windows 2000 and introduced a whole gamut of stability and security issues into NT-based systems. In NT4, you had to have supported hardware, and installing drivers was a pain, but once you had it set up, it performed. It did great on 64 MB of RAM, was stable, and was fully multitasking. I left it around 2003 due to increasing hardware incompatibility. It didn't support USB, among other things. I find the same thing happening with XP. I'm slowly moving to multi-core 64-bit systems, and I like having 4 GB or more of RAM and backup drives bigger than 2 TB. The best Windows option for newer hardware at present is Windows 7 64-bit. Ubuntu looks like a good alternative. I haven't tested it extensively yet but, so far, I like it.
     
  25. noone_particular

    noone_particular Registered Member

    Joined:
    Aug 8, 2008
    Posts:
    3,798
    Universal Plug and Play was actually available for Win 98. The user could install it if they needed it. IMO, UPnP is a disaster waiting to happen. I expect to see it exploited again in the near future with disastrous results.

    Not having USB could be a major inconvenience. For 98, there's an unofficial upgrade that gives it excellent USB 2.0 compatibility. Given some of the recent revelations regarding malicious USB devices and their ability to infect machines at a device level, I'm beginning to regard USB as much a liability as an asset.

    When I look at the new hardware, I can't help but wonder what it would be capable of if the operating systems weren't consuming a large percentage of its resources just to run. As the hardware became more powerful, the operating systems became more demanding on them with Vista being an extreme example. When compared to a system that ran on 64MB of RAM, what does it give the user to justify that consumption? It's like building a 500HP car, then adding an extra 4000 pounds of weight to it.

    Like NT4, 98 was made for hardware with 64 MB of RAM and processors of 500 MHz or so. I've had the pleasure of running it on XP hardware, a 2.4 GHz Pentium with 1 GB of RAM, far more than 98 itself will ever require, leaving it all available to the user's applications. The resulting speed has completely spoiled me. While it has probably biased my judgment of that Vista unit, it also shows what could have been, and still could be, if modern hardware were running fast, lightweight systems like those we used to run. Microsoft has never developed an OS to its full potential. When an OS gets mature and stable, they replace it with something more bloated and start the cycle all over again. For myself, there's no incentive to modernize (I can't call it upgrading). I don't use it for work or business. It's strictly for personal use and a hobby when I have the time. As long as it serves my needs, I see no reason to change.
     