Core i5 and Core i7: is there a difference in a single PC for home use?

Discussion in 'hardware' started by blacknight, Sep 16, 2015.

Thread Status:
Not open for further replies.
  1. Joxx

    Joxx Registered Member

    Joined:
    Sep 5, 2012
    Posts:
    1,718
    Did you bother to read before posting?
    The OP was clear enough...
    ... an AMD FX4300 would serve him very well for less than €100, but you're promoting chips in excess of €300 (or €400 in the case of the i7) for what...? ~ Removed Off Topic Remarks ~
     
    Last edited by a moderator: Sep 17, 2015
  2. CrusherW9

    CrusherW9 Registered Member

    Joined:
    Dec 27, 2012
    Posts:
    517
    Location:
    United States
    While I agree that he should wait for new Skylake SKUs to come out, the same i3/i5 recommendation applies here. For SSDs, it's far more important to just have an SSD than to have a fast (assuming we're only talking SATA) SSD. No need for the Pro unless you have huge workloads or need the sequential bandwidth (which the average consumer doesn't). Similarly, for your mass storage suggestion, while the Blacks are fast, you could easily get by with a Green or Blue for significantly cheaper. And lastly, graphics. The iGPU of modern Intel CPUs is pretty capable for general use. I think the OP should just see how that works for him, and then if he runs into problems (which would surprise me), he could go for something like a GT 720. A GTX 970 just for Netflix is vastly overkill.
     
  3. roger_m

    roger_m Registered Member

    Joined:
    Jan 25, 2009
    Posts:
    8,627
    @ComputerSaysNo Why install so much RAM? I've got 4GB of RAM on my laptop, and sometimes it isn't enough because at times I have 40+ tabs open in a Chrome-based browser, leading to excessive RAM usage. However, if it were not for the excessive number of tabs open, I would never need more than 4GB.
     
  4. J_L

    J_L Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    8,738
    Indeed, I only use 16GB because half of it is permanently allocated as a RAM disk, and another quarter may be used for running VMs.
     
  5. Rolo42

    Rolo42 Registered Member

    Joined:
    Jan 22, 2012
    Posts:
    571
    Location:
    USA
    • For the use you described, the i7 will give you nothing over an i5
    • A discrete graphics card would be a complete waste as you don't do anything that would actually use it, and Intel's on-board GPU is faster at transcoding/video streaming with less power consumption/heat dissipation
    • SSDs aren't any less reliable than HDDs--especially given the [lower] quality of HDDs these days. Both fail. This is what automatic backups are for
      • SSDs are still readable after they wear out (required to meet industry standards), platter-drives are not (expensive data recovery services notwithstanding)
      • I've had the same SSD for 5 years. Wife has also. We find using an HDD-based PC insufferable--you might as well buy an 80486 CPU because that's what an HDD will make an i7 (or even multiple Xeons) feel like
      • If you're still paranoid about storage failure (again, backups--handy for file system failures too), then mirror two SSDs (RAID1)
      • Really, you don't have a choice: you're wondering whether an i7 would give you a perceptible gain over an i5 (it won't) and you threw out the word "budget", so put the money into an SSD instead--the CPU upgrade isn't perceptible, but an SSD makes it a completely different computer performance-wise
    • Don't skimp on the power supply; a PC PSU is not just a power supply and they are definitely not all made the same. Get a quality, reputable PSU at the proper wattage (450W-550W in your case)
    • Check your memory usage now, especially if you have multiple images and/or browser tabs open simultaneously, to see if 16GB would be advantageous over 8GB (8GB is baseline for a few years now)
      • I don't like stock RAM speeds when I can get faster specs for about the same price (paid $93 for Mushkin 8GB DDR3-2133 4 years ago, same as quality DDR3-1333 at the time)
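    Rolo's advice to check current memory usage before picking 8GB or 16GB can be turned into rough arithmetic. The following is only a sketch with illustrative, assumed per-item figures (the OS baseline, per-tab, and per-photo numbers are hypothetical); real usage varies widely, so measure with Task Manager or Resource Monitor rather than trusting these constants:

```python
# Rough memory-headroom estimate. All per-item figures are illustrative
# assumptions, not measurements: ~2 GB OS baseline, ~150 MB per browser
# tab, ~300 MB per open photo in an editor.
def estimated_usage_gb(tabs: int, photos: int,
                       os_baseline_gb: float = 2.0,
                       gb_per_tab: float = 0.15,
                       gb_per_photo: float = 0.3) -> float:
    return os_baseline_gb + tabs * gb_per_tab + photos * gb_per_photo

def recommended_ram_gb(tabs: int, photos: int) -> int:
    """Pick 8 or 16 GB, leaving ~25% headroom over the estimate."""
    need = estimated_usage_gb(tabs, photos) * 1.25
    return 8 if need <= 8 else 16

# A light user (a handful of tabs, a couple of photos) fits in 8 GB;
# a 40-tab browser session like roger_m's pushes past it.
```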
     
  6. blacknight

    blacknight Registered Member

    Joined:
    Sep 25, 2007
    Posts:
    3,344
    Location:
    Europe, UE citizen
      Very concise, thank you.
     
  7. Mrkvonic

    Mrkvonic Linux Systems Expert

    Joined:
    May 9, 2005
    Posts:
    10,213
    There is no concrete black & white answer. But the simple one is, no, there's no difference for most people.
    Mrk
     
  8. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,041
    Location:
    Nebraska, USA
    I think 16GB of RAM is overkill and 8GB is the sweet spot. That said, I have 16GB on this system. Why? Because the budget allowed it and, most importantly, because I wanted it, so there!!! ;)

    But still, I doubt I would notice the difference except in artificial scenarios (benchmark programs) if I suddenly went back to 8GB - in part because I use SSDs. So any performance hit I might get with "only" 8GB will be negligible because my Page File is on my SSD. And SSDs are ideally suited to support Page Files.

    I also agree with Crusher when he said "it's far more important to just have an SSD than to have a fast (assuming we're only talking sata) SSD". That is, even a slow SSD will offer MUCH GREATER performance over the fastest hard drives (even hybrid drives).
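    The "even a slow SSD beats the fastest HDD" point comes down to access latency rather than sequential MB/s. A back-of-envelope sketch, using typical ballpark latencies (assumed figures, not benchmarks of any particular drive):

```python
# Time to service 10,000 random 4 KB reads (e.g. booting, loading many
# small files). Latency constants are typical ballpark assumptions.
HDD_ACCESS_S = 0.010   # ~10 ms seek + rotational latency
SSD_ACCESS_S = 0.0001  # ~0.1 ms for a SATA SSD

def random_read_time_s(n_reads: int, access_s: float) -> float:
    # Access time dominates small random reads; transfer time is ignored.
    return n_reads * access_s

hdd = random_read_time_s(10_000, HDD_ACCESS_S)   # ~100 seconds
ssd = random_read_time_s(10_000, SSD_ACCESS_S)   # ~1 second
# This ~100x gap in random access, not sequential bandwidth, is why even
# a "slow" SSD feels transformative next to the fastest hard drive.
```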
     
  9. Rolo42

    Rolo42 Registered Member

    Joined:
    Jan 22, 2012
    Posts:
    571
    Location:
    USA
    16GB RAM isn't overkill if you're using a pagefile.
     
  10. Rasheed187

    Rasheed187 Registered Member

    Joined:
    Jul 10, 2004
    Posts:
    17,546
    Location:
    The Netherlands
    I have used both and to be honest I don't see any difference between the two in my daily usage, not even with some older video games. I came to the conclusion that for true performance gain, the GPU and SSD are more important.
     
  11. Rolo42

    Rolo42 Registered Member

    Joined:
    Jan 22, 2012
    Posts:
    571
    Location:
    USA
    Older games would only use one or two cores anyway and wouldn't be optimised for hyperthreading.

    Even with contemporary games that do use multiple cores, my i5-2500k CPU is never the bottleneck, even when the GPU isn't the bottleneck. I have seen lots of assumptions* about games being CPU-bound when they aren't--it's just the way the games are coded, especially networked games. Early release dates trump optimization and even quality these days.

    *Assumptions because those declaratives are never accompanied with hardware performance statistics (CPU, GPU, RAM, VRAM, PCIe bus, GPU/VRAM bus, etc.)
     
  12. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,041
    Location:
    Nebraska, USA
    Sorry, but that makes no sense - even if you meant "NOT" using a PF. The fact of the matter is, you should always use a PF regardless of how much RAM you have, though certainly if you have a small amount of RAM, a PF is more beneficial.

    As I said, I put 16GB in this system, but the fact of the matter is, 8GB would have been plenty. Very few people need more, and those folks tend to be professionals doing serious CAD/CAE, CGI and editing.

    I generally agree with this, though there are still many games that are CPU intensive too. But that does not mean you need a top-of-the-line CPU to get good game play. I think it is important to note that game developers know very well that most gamers don't have monster budgets for monster gaming rigs. So they develop their games to provide excellent "game play" on lesser systems.

    I think it all depends on what your goals are. Do you want bragging rights with the most FPS in Futuremark benchmarking programs, or do you just want a good gaming rig (but not an Xbox or PS)?
     
  13. J_L

    J_L Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    8,738
    I think he forgot the word "actively", meaning that you need the pagefile for extra "memory" more occasionally than not.
     
  14. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,041
    Location:
    Nebraska, USA
    Maybe, but that does not really make sense either. The smaller your total RAM is, the more "active" your PF is likely to be. But even if you have 32GB or even 64GB, Windows likes to have and will use the PF for lower priority data. And there is no harm in that.
     
  15. CrusherW9

    CrusherW9 Registered Member

    Joined:
    Dec 27, 2012
    Posts:
    517
    Location:
    United States
    These days, most new games totally abuse even very high end systems on max settings. The games are made incredibly intensive with the idea that they will age well.
     
  16. RJK3

    RJK3 Registered Member

    Joined:
    Apr 4, 2011
    Posts:
    862
    I believe Rolo's post makes sense in context. You suggested you'd need a pagefile at 8GB, therefore Rolo suggested that 16GB of RAM isn't overkill in your case.

    Personally I don't like wasting too much space on a page file. Since I'm known to use older programs and games which might be hardcoded to use the pagefile, I leave a small pagefile but with room to grow.
     
  17. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,041
    Location:
    Nebraska, USA
    "On max settings", yes. My point was they are still coded so they can be toggled back in FPS and resolution, background detail, etc., so they still play on lesser systems.

    That said, I'm not sure I still agree with your comment because sadly, most games today originate on game consoles and then are ported to the PC. And game consoles come with just 8GB and often integrated graphics - meaning "most" are designed to run with less than 8GB of RAM even on max settings.
    Read what he said. He said, "16GB RAM isn't overkill if you're using a pagefile." That implies 16GB IS overkill if NOT using a PF. And that makes no sense. And FTR, I said you need a PF regardless of the amount of RAM you have.
    Wasting space? Hardly. The fact of the matter is, unless you are Mark Russinovich and you constantly monitor your virtual memory requirements, it is highly unlikely you are smarter than Windows at determining the best setting for a fixed PF. Therefore, at least with the modern versions of Windows (7/8/10), it is best to just let Windows manage your Page File. If you need that disk space because you are running that low on free disk space, then you need to uninstall unneeded programs, move your user files to a different drive, clean out the clutter, or buy more disk space - or all of the above.
     
  18. Rolo42

    Rolo42 Registered Member

    Joined:
    Jan 22, 2012
    Posts:
    571
    Location:
    USA
    This is precisely what I meant.

    I don't recommend this on a platter drive: it will be fragmented all over the disk and be a performance-killer. Make a static pagefile 1.5x the amount of installed RAM (which is what Windows will recommend to you on the pagefile properties page). Even better: place that pagefile near the beginning of the disk (takes some coaxing; I use MyDefrag for that) or at least in/near your used space.
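    The 1.5x rule above is easy to encode. This only reproduces the figure from the post; Windows' own recommendation can differ by version, so treat it as Rolo's rule of thumb rather than an official formula:

```python
# Static pagefile sizing per the rule of thumb in the post:
# 1.5x installed RAM, with "Initial size" and "Maximum size" set to the
# same value so the file never grows (and so never fragments).
def static_pagefile_mb(installed_ram_gb: int) -> int:
    return int(installed_ram_gb * 1.5 * 1024)

# e.g. 8 GB RAM -> 12288 MB; 16 GB RAM -> 24576 MB
```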

    If you have multiple platter drives, put it on a non-OS disk. (Edit for clarity: by "multiple platter drives" I mean multiple HDDs/multiple hard drives, not referencing the platters contained therein)

    With a small (really, insufficient) amount of RAM, a pagefile is necessary (not merely "beneficial") to avoid running out of memory and causing software to halt--this is why we have pagefiles. Even if you have sufficient RAM, the OS will page out dormant content to be responsive to new loads; if you never use more than your installed RAM, this paging is unnecessary and makes your system less responsive with no benefit whatsoever.
    Note that I am not referring to or excluding the minimum page file size necessary for Windows to perform minidumps (it will advise you of this when you try to turn off paging or reduce its size too much); I am referring to actually utilising the pagefile to exceed chip RAM limits.
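    The "OS pages out dormant content" behaviour described above can be sketched as a toy least-recently-used eviction. The real Windows memory manager (working sets, standby lists, priorities) is far more elaborate than this; the sketch only illustrates why the most dormant pages are the first to go:

```python
from collections import OrderedDict

# Toy LRU paging: RAM holds a fixed number of pages; touching a page
# marks it active; when RAM is full, the most dormant page is evicted
# to the pagefile. Illustration only, not how Windows actually works.
class ToyRam:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.pages = OrderedDict()   # order = recency of use
        self.pagefile = []

    def touch(self, page: str) -> None:
        if page in self.pages:
            self.pages.move_to_end(page)         # recently used again
        else:
            if len(self.pages) >= self.capacity:
                dormant, _ = self.pages.popitem(last=False)
                self.pagefile.append(dormant)    # evict most dormant
            self.pages[page] = True

ram = ToyRam(capacity=3)
for p in ["browser", "editor", "game", "browser", "mail"]:
    ram.touch(p)
# "editor" was the least recently touched, so it went to the pagefile.
```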

    I disagree for a few reasons:
    1. This is a specific user asking for specifics for his specific solution, so general statements--although true--may not apply
    2. We don't know how many photos are open at once and we don't know how many browser tabs are open at once; my wife can burn through 8GB over morning coffee doing only these
    3. We don't have to guess (or we could reduce the pagefile to 512MB and see what happens if we wanted to guess) but we have resource monitor to tell us precisely what is going on. Real-time and logged performance statistics are how I know with certainty that a game's performance isn't hindered by hardware (or if it is, which hardware and adjust accordingly)

    It depends. A poor design example would be CitiesXL/XXL: its Lua interpreter uses only one core, and that causes severe hitching. A faster CPU (in raw clock speed and in pipelining) would mitigate this but wouldn't eliminate it. A faster-clocked i3 would outperform a slower-clocked i7 in this particular case (and in any single/dual-threaded application).

    There are minimum requirements and there are recommended requirements. You can run Windows on 2GB RAM but it won't be "excellent".

    "Good" is a relative, ambiguous term and completely unusable in a technical discussion. My requirements (specific, actionable): Stable 60FPS, zero hitching, minimal aliasing:
    • 24 FPS works for movies, not games
    • Hitching breaks immersion and can get you killed, especially in PvP
    • Aliasing ("jaggies") also breaks immersion and completely distracts me

    Less of an issue these days, but software design may cause hardware to directly influence gameplay. Quake is a good example: higher FPS = higher jumps. This is why I had Vsync OFF to get the absolute maximum possible 266 FPS (which saturated the AGP bus).
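    Frame-rate-dependent physics like the Quake jump quirk is easy to demonstrate in a toy model. To be clear, this is NOT Quake's actual movement code (its behaviour came from rounding in its own math, and there higher FPS meant higher jumps); a naive position-then-velocity Euler step simply shows that simulated jump height varies with the timestep, i.e. with FPS:

```python
# Toy illustration of frame-rate-dependent physics. With a naive Euler
# step that moves the player before applying gravity, the peak height of
# a jump depends on dt = 1/FPS (here, larger dt overshoots more).
def jump_height(fps: int, v0: float = 5.0, g: float = 9.8) -> float:
    dt, pos, vel, peak = 1.0 / fps, 0.0, v0, 0.0
    while vel > 0 or pos > 0:
        pos += vel * dt          # position uses the pre-gravity velocity
        vel -= g * dt
        peak = max(peak, pos)
    return peak

# Analytic peak is v0**2 / (2*g) ~= 1.276 m; the simulated peak drifts
# from that by roughly v0*dt/2, so different FPS caps give different jumps.
h_30 = jump_height(30)
h_266 = jump_height(266)
```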

    So, "good" is a little in the eye of the beholder.

    After my requirements are taken care of, any leftover performance is relegated to visual quality. It's nice to play Tropico at max settings, 8xAA. It's really nice to play it at 16QxAA with 8x transparency AA and no negative LOD bias. This is one reason why defining one's budget is important (in addition to being fiscally responsible): with visual quality you get what you pay for but with plateaus in the efficient frontier (certain price-points are a better value than others).

    Synthetic benchmarks diverge so much from actual FPS that I don't use them at all. Why use theoretical numbers when we can measure the actual target?

    Whether it's one user's PC or an enterprise network, the approach is the same: clearly define requirements and budget first, then design a solution around those.
     
    Last edited: Sep 25, 2015
  19. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,041
    Location:
    Nebraska, USA
    Where's the evidence of this? Show us a link to a current study on today's modern operating systems to back up this claim. Show us a study that shows where Microsoft has it all wrong with W7/8/10.

    W7, W8, and W10 are not XP. Microsoft has not been sitting on their thumbs for the last 15 years since XP came out. Windows today is much smarter than XP. Also, the Windows defragger works automatically in the background, in part just to address this concern. So your fragmentation comment is obsolete, outdated, and untrue - unless you disabled the Windows defragger, which is unnecessary and a mistake for most users too.

    15 years ago, I would have agreed with you but not today. What we did to improve performance with XP is just not needed, and potentially detrimental to performance with W7/8/10.

    It is time to stop treating today's modern operating systems like XP. I say again, unless you are a bonafide expert like Mark Russinovich, AND you keep regular tabs of your virtual memory use, there is no way you know what is better for your system than Windows does. For sure, the normal user doesn't and IMO, it is bad advice to tell users to set a fixed PF simply because they are not virtual memory experts.

    The problem with setting a fixed size PF is just that, it is fixed. But users are constantly changing what they do on their computers. A Windows managed PF will dynamically change as the user changes - and can do this several times a day, if necessary. Is the normal user willing to keep track of the virtual memory like that? No! And why should they when Windows is already very adept at doing it for us!
    Huh? That makes no sense either! The vast majority of hard drives today have at least two platters; many have three or more. Note this 2014 WD review that says this drive is "the first drive on the market to breach the 1TB per platter" for that technology. How were 2TB drives available before then without more than one platter? This drive needs 7 platters using state-of-the-art densities!

    Also, more and more systems are using SSDs, at least for the boot drive and SSDs are ideally suited for page files. And fragmentation is not an issue for page files either.

    If you are worried about running out of free disk space, then you have failed to provide a sufficient amount of disk space!

    Totally false! Again, where is the evidence? Show us the study on modern operating systems that shows paging in this circumstance (or any circumstance) makes a system less responsive. Please! Show us!

    If you really are smarter than all the PhDs, computer scientists, master programmers at Microsoft with their 20+ years of empirical data and super computers to analyze it, then by all means, set your fixed page files. But if you are not a real computer expert, leave it alone!

    Comparing running Windows to the "game play" of a game running on a lesser system is comparing apples to oranges. Game makers code games to scale back features, frame rates, and detail to maintain the same "game play" on lesser systems. That is not done with operating systems.

    I fail to see your point here. How is that different from me saying, "it depends on what your goals are"? We are saying the same thing.

    And I agree with you about benchmarking programs. I don't use them either because they do not represent real-world scenarios. Bragging rights with a benchmarking program gets you what? Nothing but bragging rights.
     
  20. Rolo42

    Rolo42 Registered Member

    Joined:
    Jan 22, 2012
    Posts:
    571
    Location:
    USA
    Please don't add to what I say; I never said "Microsoft has it all wrong". Win7/8/10 are fundamentally the same with paging as NT/xp/2000; there's been no modern revolution here. If there has, enlighten us by holding yourself to the same standard you are demanding of me and post a study or whitepaper.
    I aim to please:
    The Windows Server Performance Team explaining what I've stated, pretty much verbatim:
    http://blogs.technet.com/b/askperf/archive/2008/03/14/disk-fragmentation-and-system-performance.aspx
    http://blogs.technet.com/b/askperf/archive/2007/12/14/what-is-the-page-file-for-anyway.aspx

    It works in a limited sense to address heavy fragmentation in a broad audience. Performance tuning is nowhere near this broad audience. "Why is my [hi-res texture] game lagging [hitching]? [because your 14GB asset file is fragmented and the mechanical limitations of platter drives can't get textures to your discrete graphics memory fast enough]" is not part of this broad audience. Buying an eMachine at BestBuy "to surf the web" is.

    How do you know I'm not? Irrelevant since I don't need to be an MD to know if I have a headache and what to do about it in almost all cases. I also don't have to be a Pharmacist to take a buffered analgesic. Besides, the CTO of cloud storage may not be the best subject-matter expert on getting a performance workstation maximally tuned. A hardcore gamer, however, knows how to extract every electron of performance out of a box. Quantitative over qualitative.

    Anyone who wants to know the fundamentals of storage and memory and how to best configure them for their own application can do so--especially in this here information age with forums like these and millions of experts willing to share. I would say stop treating basic IT skills as limited only to the elite. We have the 1337 now. Also see my links above for the Microsoft experts who said the same thing I have.

    The wise seek counsel from experts, not defer all decisions to them.

    To clarify: multiple HDDs (platter vs. SSD). If you have two hard drives and one has Windows on it and the other has whatever else on it, put the page file on the second disk, preferably at the very beginning. Mechanical limitations of HDDs have and will always be a performance-killer.
     
  21. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,041
    Location:
    Nebraska, USA
    :( Those are from 2007 and 2008! I asked for current information on current operating systems. And it is wrong to suggest there have been no changes in the last 15 years. Start with the fact that XP did not dynamically manage PF sizes; it set a fixed size at installation and left it. And those articles are written by the "Windows Server" performance team for "Windows Server" administrators. They are not written for "normal" Windows users.

    I know that the vast majority of users are not! So I say again, unless you (that is, anyone reading) are a bonafide Windows and virtual memory expert, leave it alone! Let Windows manage your PF. It knows how to do it very well.

    Thanks for that clarification and with that, I agree to move the main PF to the secondary drive. But I still contend if you are not an expert AND you don't keep near constant monitoring of your virtual memory, let Windows manage it.

    If you, Rolo, are a true expert in Windows memory management AND you regularly monitor your computers and how they utilize virtual memory, then fine. Set a fixed PF. I have no problems with that at all. But I do have a problem suggesting to others with less or unknown skillsets and commitment they use a fixed PF. That is not giving sound advice.
     
  22. Mrkvonic

    Mrkvonic Linux Systems Expert

    Joined:
    May 9, 2005
    Posts:
    10,213
    The fixed PF is more about fragmentation and predictable allocation of space on the disk.
    100% agree that VM management should be left to the kernel.
    Mrk
     
  23. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,041
    Location:
    Nebraska, USA
    I agree 100%. And this was certainly important 15 years ago when 10GB was considered a monster drive and very expensive. With today's hard drives costing $30 for 1000GB of disk space, you really have to be on an extremely tight budget and hard times to be so low on disk space that fragmentation and PF size matter.
     
  24. xxJackxx

    xxJackxx Registered Member

    Joined:
    Oct 23, 2008
    Posts:
    8,624
    Location:
    USA
  25. wshrugged

    wshrugged Registered Member

    Joined:
    Jun 12, 2009
    Posts:
    266
    [I'm enjoying the discussion, thank you.] To no one in particular -- just a point of information to add regarding what MSFT has done, post XP, with Windows Defrag. The MSDN blog's content was produced before W7 was released, but it's still relevant.
    http://blogs.msdn.com/b/e7/archive/...d-engineering-the-windows-7-improvements.aspx
     