Graphic card recommendation to go with Ivy Bridge 3570K!

Discussion in 'hardware' started by ratchet, Mar 22, 2013.

Thread Status:
Not open for further replies.
  1. ratchet

    ratchet Registered Member

    Joined:
    Feb 20, 2006
    Posts:
    1,988
    I do need to qualify this question, as it is in the vein of, "If I were to purchase one." I'm perfectly satisfied with the performance of Intel's integrated HD 4000 graphics. Indeed, the reason I selected the 3570K CPU was because of the reviews of the HD 4000 graphics. I also never game, so I'm just curious: what is the least expensive card you'd recommend to take advantage of LucidLogix? Thank you!
     
  2. nosirrah

    nosirrah Malware Fighter

    Joined:
    Aug 25, 2006
    Posts:
    560
    Location:
    Cummington MA USA
    You buy a gfx card to replace onboard/CPU graphics when you are doing something that is lacking due to low GPU performance. If nothing you are doing is suffering due to a GPU bottleneck, I don't think adding a dedicated card is worth it at all. You do, though, have a K CPU and can, if you want, play with the maximum turbo multiplier. The stock turbo is 3.8 GHz, but these CPUs usually can handle 4.2 to 4.4 GHz (higher with solid liquid cooling) without issue, and that would get you a little more performance for free.

    After overclocking, I would think about getting a small SATA III SSD and enabling SRT (Intel Smart Response Technology). This lets you avoid reinstalling your OS while still getting a decent speed bump by using an SSD as an HDD cache. SV300S37A/60G <- this is a great SSD for SRT.

    As for your question, it seems that any DX11 card should do the trick, and those go all the way down to about $30. I just don't see any reason in your case to do this, as the integrated graphics in your CPU is great for everyday tasks and the gap between it and ultra-low-end dedicated gfx cards is just not all that much. You may get a small bump in performance by gaining some RAM back (your current setup uses system RAM as graphics RAM), but that is only if you have less than an optimal amount of RAM installed, and that is easier to fix with simply more RAM anyway.
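    If you're curious how much memory the onboard graphics is actually claiming, here's a rough Python sketch (assuming Windows; the AdapterRAM value is only as accurate as what the driver reports, so treat it as an estimate):

        import subprocess

        # List every video adapter Windows knows about along with the RAM the
        # driver reports for it; for integrated graphics that memory is carved
        # out of system RAM.
        out = subprocess.run(
            ["wmic", "path", "win32_VideoController", "get", "Name,AdapterRAM"],
            capture_output=True, text=True, check=True,
        ).stdout
        print(out)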

    http://www.newegg.com/Product/Product.aspx?Item=N82E16814102994

    This is as low as I would go for a dedicated gfx card, and factoring in the sale price, it is also one hell of a deal.
     
  3. mechBgon

    mechBgon Registered Member

    Joined:
    Mar 2, 2013
    Posts:
    68
    Location:
    USA
    You mentioned LucidLogix, as in VIRTU. Its function is the opposite of what you're asking. VIRTU lets your system have a decent gaming card installed, and still access the onboard video's capabilities, most notably Intel Quick Sync.

    Quick Sync is a hardware-accelerated video transcoding capability in the Sandy Bridge and Ivy Bridge CPUs (those that have integrated GPUs, that is). If you have a big video-editing project and your video-editing software supports Quick Sync, then in *some* scenarios you'd reap much faster video encoding using Quick Sync than in pure software mode.

    I happen to have some QS-aware software and it is also able to use hardware acceleration on my NVidia video cards as an alternative. Whether I can use QS on a given project depends on what encoding scheme I'm using and stuff, so it's not exactly a home run every single time.
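    To make that concrete, here's a rough sketch of the kind of comparison I mean, driving ffmpeg from Python (assuming an ffmpeg build with Quick Sync/QSV support and a sample clip named input.mp4 - both are placeholders for illustration, not what I actually use):

        import subprocess, time

        def encode(video_codec, outfile):
            """Encode input.mp4 with the given video codec and return elapsed seconds."""
            start = time.time()
            subprocess.run(
                ["ffmpeg", "-y", "-i", "input.mp4", "-c:v", video_codec, outfile],
                check=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
            )
            return time.time() - start

        qsv = encode("h264_qsv", "out_qsv.mp4")   # Quick Sync hardware encoder
        sw  = encode("libx264", "out_sw.mp4")     # pure software x264 encoder
        print(f"Quick Sync: {qsv:.1f}s  software: {sw:.1f}s")

    Whether the hardware path actually wins depends on the source material and the settings, which is what I mean by it not being a home run every time.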

    So as a non-gamer who's getting good results with the integrated Intel GPU, you're in a good position even without an add-in card. Now, if you want to set up for three or more monitors, you have a case for add-in cards.
     
  4. Noob

    Noob Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    6,491
    There's no need for a dedicated GPU if you don't do any GPU-intensive tasks. :D
     
  5. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    I agree. Because more and more systems are being integrated into home theater setups, and because more and more notebooks and all-in-one computers are being "marketed" as desktop replacements and even gaming machines, today's integrated graphics solutions, when paired with a decent CPU and lots of system RAM, have become very capable.

    I would ask, what are you trying to do that your current graphics solution does not support?
     
  6. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    8,251
    Location:
    The land of no identity :D
    I wouldn't mind AMD's integrated graphics, but I still think twice before considering Intel's graphics solutions. It's a bad habit picked up from my days of working with OpenGL models, I guess. But then, the Intel 810-915 series wasn't exactly very good to begin with :D

    To answer your question, any cheap DX11 video card will do the trick. If you want something with a little more punch, go for something mid-range like the x5xx series (AMD) or the x40/x50 series (NVIDIA), where x is the family number; the higher, the better in general.
     
  7. Noob

    Noob Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    6,491
    Yeah, and the HD 4000 is quite capable for simple tasks such as watching videos, web browsing, etc.
     
  8. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    Yeah, I think you need to let go of that habit. Intel actually led the way to advanced integrated graphics solutions. It is not fair to dismiss an entire brand based on past experience with legacy products. Times change too fast to hold grudges.

    Virtually any integrated graphics solution today will easily handle simple tasks such as watching videos, DVDs, and Blu-rays, working with Word/Excel/PowerPoint documents, and web browsing.
     
  9. ratchet

    ratchet Registered Member

    Joined:
    Feb 20, 2006
    Posts:
    1,988
    Thank you all for the interesting responses! I was mostly just curious and although it killed the cat, satisfaction brought it back. Probably won't bother with one, but then again, one never knows.
     
  10. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    I don't game, but I do require, demand, cannot live without two large monitors connected to my computer - both via a digital connection. So I opted for a decent card with lots of on-board RAM, even though my Gigabyte board came with, and my i7 supports, decent integrated graphics. I have no regrets.

    However, remember that the GPU, and consequently the graphics card, is often the most power-hungry device in our computers - considerably more than many power-hungry CPUs. So before you consider adding any graphics card, you MUST ensure your current power supply is capable of supporting the added demands - not just in power, but in the necessary connections too. It is common to need a more capable PSU when adding a graphics card to a system previously running with integrated graphics - especially with factory-assembled computers, where expansion capabilities are typically limited to reduce costs and/or increase/preserve profits.
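    A quick back-of-the-envelope sketch of what I mean, in Python - every wattage figure below is an illustrative assumption, not a measurement, so check your own parts' spec sheets:

        # Rough PSU headroom estimate -- every figure here is an assumption.
        psu_watts   = 650   # example: a 650 W power supply
        cpu_watts   = 77    # i5-3570K rated TDP
        gpu_watts   = 150   # a hypothetical mid-range card under load
        other_watts = 75    # rough allowance for drives, fans, board, RAM

        load = cpu_watts + gpu_watts + other_watts
        headroom = psu_watts - load
        print(f"Estimated load: {load} W, headroom: {headroom} W "
              f"({headroom / psu_watts:.0%} of PSU capacity)")

    If the headroom comes out slim, or the card needs PCIe power connectors the PSU doesn't have, budget for a new PSU along with the card.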
     
  11. ratchet

    ratchet Registered Member

    Joined:
    Feb 20, 2006
    Posts:
    1,988
    I did think of something else prior to your post. Since I don't use two monitors, could/would the setup (and present monitor) still be able to access the card just via the PCI port?
    Also, I do think my specs can support a card:
    Asus P8Z77-M PRO
    PSU: Seasonic X650 Gold SS-650KM
    MEM: Mushkin 996988 4GB x 2
    SSD: Kingston HyperX SH100S3/120G 2.5" 120GB SATA III
    One interesting thing is the SSD had the blue stripes on it and retailed for more than the regular HyperX models. There was a $50 rebate though. Anyway, I paraphrase: the blue-striped ones were supposedly better for the long term, but for a while now they apparently haven't even marketed those models anymore. So I bought (literally) into the hype (no pun intended) for a discontinued item!
     
  12. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    PCI? Did you really mean PCI?

    Understand that PCI is NOT the same thing as PCIe (PCI Express). Despite their similar names, they are totally different and totally incompatible slots.

    There are PCI graphics cards, but by far, most cards use the PCIe interface.
     
  13. ratchet

    ratchet Registered Member

    Joined:
    Feb 20, 2006
    Posts:
    1,988
    Did mean PCIe!
     
  14. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    Typically, when you insert a graphics card into a PCIe slot on a motherboard that has integrated graphics, the integrated graphics is disabled, any shared memory is released, and the BIOS knows to use the card from then on - unless you manually enable the integrated graphics again.

    And to ensure we are on the same page, it is not a PCI "port" (or PCIe port). The cards use slots on the motherboard and appear through slots in the back of the case. The cards inserted in those slots then provide VGA, DVI, and/or HDMI "ports" (or connectors) at the rear of the computer to connect your monitor (or monitors).
     
  15. ratchet

    ratchet Registered Member

    Joined:
    Feb 20, 2006
    Posts:
    1,988
    Yes slot! In this situation though (Again, I will probably never do it but enjoy learning at my ripe old age), if I were to install LucidLogix, it wouldn't disable integrated, correct? Would I still be able to leave the monitor connected to the HDMI port on the motherboard or would one now have to connect to the card?
     
  16. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    If you install a card, it is typically best to use just the card. And when you install a graphics card and its Windows drivers, Windows will use that card from then on. Installed programs will not even know the integrated graphics exists.
     
  17. Noob

    Noob Registered Member

    Joined:
    Nov 6, 2009
    Posts:
    6,491
    Why run something slower when you have a faster one installed? :D
     
  18. ratchet

    ratchet Registered Member

    Joined:
    Feb 20, 2006
    Posts:
    1,988
    You need to read my original post again. Had no intentions of doing such a thing unless Card Model xxxx for $50 and LucidLogix provided some great boost, which I now fully realize isn't going to happen. Never did think it could/would for a modest investment but I didn't know for sure, so I asked the question!
     
  19. Boyfriend

    Boyfriend Registered Member

    Joined:
    Jun 7, 2010
    Posts:
    1,070
    Location:
    Pakistan
    Add a graphics card only if your routine software can take advantage of GPGPU computing such as CUDA. For simple computer use, the integrated HD 4000 is more than sufficient. If you want to play games at HD resolution or on a 4K display, then get a decent graphics card (or cards).
     
  20. mechBgon

    mechBgon Registered Member

    Joined:
    Mar 2, 2013
    Posts:
    68
    Location:
    USA
    There are exceptions, though. One case is when using Lucid Virtu, which will let the OS selectively use the best of both (for example, using onboard Quick Sync to accelerate video-encoding projects). Other cases include AMD's recent chipsets, which can team with an add-in AMD card for higher performance than either can achieve separately, or the onboard video can be used alongside an add-in card to support more monitors. I have a couple 785G-based systems at work that could probably drive five monitors using the three onboard outputs in conjunction with a dual-head add-in card.
     
  21. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    Exceptions don't make the rule, however, and that is why I said "typically" instead of "always".

    For systems running with more than two monitors, I would still use cards instead of integrated graphics to ensure all system RAM is dedicated to the CPU and OS. Even with two monitors, I would (and do) use a card with lots of RAM - unless I were only displaying static information.
     
  22. hawki

    hawki Registered Member

    Joined:
    Dec 17, 2008
    Posts:
    6,078
    Location:
    DC Metro Area
    New graphics drivers for Ivy Bridge will add Ultra HD support.

    If you're using the integrated graphics embedded into your 3rd generation Core processor (Ivy Bridge) from Intel, get ready for a driver update that's supposed to improve power and performance, as well as add new features. Intel HD Graphics driver 15.31 will be the seventh update since the introduction of 2nd generation Core processors (Sandy Bridge), and though it was built for 4th gen chips (Haswell), it will work on Ivy Bridge too.

    http://www.maximumpc.com/article/news/intel_readies_hd_graphics_driver_update_ivy_bridge2013
     
  23. mack_guy911

    mack_guy911 Registered Member

    Joined:
    Mar 21, 2007
    Posts:
    2,677
    A cheap card will give you graphics performance below the HD 4000, which would also be a waste of your hardware, since the rest of your system is expensive.

    Most people would recommend you stick with your current system and save the money, then buy a good card or a mid-range card a few months or a year later if you still feel the need for a graphics card :)
     
  24. ratchet

    ratchet Registered Member

    Joined:
    Feb 20, 2006
    Posts:
    1,988
    hawki, thanks for the heads-up, installed!
     