Doom 3 - GeForce FX - Discrepancies? Can PS 2.0a increase speed in games?

Discussion in 'other software & services' started by Firecat, Feb 3, 2005.

Thread Status:
Not open for further replies.
  1. Firecat

    Firecat Registered Member

    Joined:
    Jan 2, 2005
    Posts:
    7,927
    Location:
    The land of no identity :D
    Hello everyone,

    I have a PC of the following specs:

    AMD Athlon64 3000+ Socket 754
    Asus K8V-X - VIA K8T800
    1GB DDR400 RAM (Kingmax 'Hard-core' DDR500 actually, but mobo is low on overclock features)
    Asus V9950 GeForce FX 5900 128MB
    Creative Sound Blaster Live! 24-bit

    On this configuration, when I play Doom 3, if I go up to a wall or something and observe the textures carefully, I can see slight visual banding. It is especially apparent at the very beginning of the game, where you first gain control of your player. Just go to the door near where the security guy is standing and look very, very closely at the textures on the door.

    My feeling is that Doom 3's engine uses OpenGL FP16 64-bit colour formats on GeForce FX graphics cards, thus producing the banding. HardOCP noted this too; they called it a simple texture bug. As it does not happen on GeForce 6 series or ATI Radeon 9xxx/X-series cards, those cards are probably running the shaders at full precision.

    So my guess is that John Carmack moved the half-precision FP16 part into the ARB2 path by coding the shaders with _pp hints.

    Could anyone with a GeForce FX card verify this? (preferably FX 5600 or better, but 5200 will do too)

    Also, I wanted to know whether using Pixel Shader 2.0a under DX9, through Microsoft HLSL or Cg, would improve performance in the following cases:

    1)Full precision (FP32) DX9 performance
    2)Partial precision (FP16) DX9 performance

    Also, do you have figures for how big the performance increase is, and for Vertex Shader 2.0a too?

    Awaiting a response; I'll be checking back over the next couple of hours...

    Best Regards,
    Firecat
     
  2. waters

    waters Registered Member

    Joined:
    Nov 8, 2004
    Posts:
    934
    Hi,
    I upgraded my graphics card just to play Doom 3.
    There are comparative tests on the web of graphics cards running Doom 3.
    One of the top two was the NVIDIA GeForce FX 5900XT.
    I got that one and am very pleased I did.
    Detail and speed are perfect.
     
  3. Firecat

    Firecat Registered Member

    Don't worry waters, you will not spot anything while playing the game, only if you stare directly at a texture and examine it very, very closely. It is not at all visible most of the time...

    It looks perfect for me too. I run in high detail at 1024x768 resolution with 2xAA. It's just that I want to figure out why the slight banding occurs; HardOCP saw it too, and I feel it's the FP16 codepath. If you buy a GeForce 6800 you will see it looks slightly better (nothing to worry about, it's the higher colour precision). Have you completed the game? The graphics are excellent; I've never quite seen anything like it. The other card at the top of the comparative was the GeForce FX 5800 Ultra, which is one hell of a card. I am pleased with my 5900 (not XT, not Ultra, simply 5900) too.

    If you have not installed patch 1.1, use this tweak to slightly improve performance:

    press CTRL+ALT+~ to open the console
    type r_useDepthBoundsTest 1 (this enables UltraShadow on the GeForce FX 5700 and 5900 series)
    do this every time you play - patch 1.1 enables it automatically
    Alternatively, go to the Doom 3 folder, then into the 'base' folder. In Notepad, type r_useDepthBoundsTest 1 and save the file as autoexec.cfg; that way you won't need to type it every time you play...
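    For reference, the autoexec.cfg file is just a plain text file in the 'base' folder containing that one console command (the comment line is optional; Doom 3 config files use // comments):

```
// autoexec.cfg - placed in the Doom 3 'base' folder, executed at startup
r_useDepthBoundsTest 1
```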

    Yes, the GeForce FX 5900XT is a damn fine card, but my 5900 is better (not by much though), and it came at only a slightly higher price than the Asus V9950SE (5900XT). Keep gaming!! Have you tried Half-Life 2? It runs very well for me.

    The hell levels are very good indeed in Doom 3. I completed it....

    Best Regards,
    Firecat
     
  4. Firecat

    Firecat Registered Member

    Is no one here interested in what I specialise in?

    Might as well give a 'jargon buster':

    OpenGL - An open, cross-platform graphics API (Application Programming Interface) originally developed by Silicon Graphics (SGI). Graphics card vendors now supply their own extensions to OpenGL to properly exploit their hardware.

    Standard extensions, which should work well on all graphics cards, are ratified by the ARB (Architecture Review Board), which also oversees new versions of OpenGL.

    DirectX - The graphics API that is proprietary to Microsoft and a part of Windows. This API sticks to standards set by Microsoft rather than taking input from an architecture review board.

    FP16 - This relates to a feature called Pixel Shader 2.0 in DirectX 9.0(a, b or c). Shaders are small programs, run through DirectX or OpenGL, which allow per-pixel control of 3D objects on screen. Using pixel shaders, developers can create effects such as an irregularity on a surface (say, a melting candle), the play of light on leather, or lighting systems with excellent accuracy and detail.
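    To make that concrete: a pixel shader is essentially a small function the GPU runs once for every pixel. Here is a rough sketch of what a simple diffuse-lighting shader computes, written in Python purely for illustration (real shaders are written in HLSL, GLSL, Cg or shader assembly; all names here are made up):

```python
import math

def pixel_shader(normal, light_dir, base_color):
    """Toy per-pixel diffuse (Lambert) lighting: the GPU runs code like
    this for every pixel, which is what makes per-pixel effects possible."""
    def normalize(v):
        length = math.sqrt(sum(c * c for c in v))
        return tuple(c / length for c in v)
    n = normalize(normal)
    l = normalize(light_dir)
    # Diffuse term: cosine of the angle between surface normal and light.
    diffuse = max(0.0, sum(a * b for a, b in zip(n, l)))
    # Modulate the surface colour by the diffuse term.
    return tuple(c * diffuse for c in base_color)

# A pixel facing the light keeps its full colour...
print(pixel_shader((0, 0, 1), (0, 0, 1), (1.0, 0.5, 0.25)))   # (1.0, 0.5, 0.25)
# ...one facing away from it goes black.
print(pixel_shader((0, 0, 1), (0, 0, -1), (1.0, 0.5, 0.25)))  # (0.0, 0.0, 0.0)
```

    The GPU does exactly this kind of arithmetic in parallel across millions of pixels per frame, which is why the precision of each intermediate value (FP16 vs FP32) matters so much.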

    In DirectX 9.0, Pixel Shader 2.0 specified that more colour precision was needed to create realistic graphics, taking input from John Carmack (Technical Director of id Software, makers of Doom 3) - though they won't admit it - and others in the field.

    As a result, DirectX 9 demanded that graphics cards support at least the FP16 colour range (16 bits each for the red, green, blue and alpha components of every pixel), resulting in 64-bit colour per pixel. This is also known as the half-precision format in DX9.
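    The banding follows directly from that reduced precision: FP16 can only represent a coarser set of values than FP32, so nearby colour intensities collapse onto the same number. Python's struct module supports the same IEEE 754 half-precision format, so the rounding can be shown directly (a sketch for illustration; Doom 3 does this on the GPU, not in Python):

```python
import struct

def to_fp16(x):
    """Round a float to the nearest IEEE 754 half-precision (FP16) value."""
    return struct.unpack('e', struct.pack('e', x))[0]

# Near 1.0 the FP16 step size is about 0.0005, so close colour
# intensities become indistinguishable - the cause of visible bands.
print(to_fp16(0.5000) == to_fp16(0.5001))  # True: the two values collapsed
print(to_fp16(0.5) == 0.5)                 # True: 0.5 is exactly representable
```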

    This format was also brought to OpenGL, through ARB extensions (the path Doom 3 calls ARB2), in a more graphics-card-friendly way.

    In simple words, FP16 is a floating-point colour format of the kind used by animation studios such as Pixar, now brought to the PC.

    FP32 - With DirectX 9, Microsoft specified FP16 as the minimum and FP24 (24 bits per component = 96 bits per pixel) as the recommended precision. NVIDIA, however, went a step further with FP32 = 128-bit colour (not wanting to adopt the FP24 their competitors had taken, they didn't implement it at all).

    Unfortunately, this wasn't a good move for NVIDIA, as its first line of DX9 graphics cards, the GeForce FX, was not powerful enough to run FP32 mode without a serious loss in performance. This was due to higher transistor counts, an ill-suited architecture and sub-optimal memory bandwidth.
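    The arithmetic behind the bandwidth cost is simple: four components per pixel times the bits per component gives the pixel size, and multiplying by the resolution gives the per-frame cost. A back-of-the-envelope sketch (real renderers touch each pixel many times per frame, so the true cost is a multiple of this):

```python
# Rough per-frame memory cost of each DX9 colour precision at 1024x768
# (4 components per pixel: red, green, blue, alpha).
WIDTH, HEIGHT = 1024, 768
PIXELS = WIDTH * HEIGHT

for name, bits_per_component in [("FP16", 16), ("FP24", 24), ("FP32", 32)]:
    bits_per_pixel = 4 * bits_per_component          # RGBA
    megabytes = PIXELS * bits_per_pixel / 8 / 2**20  # bits -> mebibytes
    print(f"{name}: {bits_per_pixel}-bit pixels, {megabytes:.1f} MB per frame")
```

    FP32 pixels are twice the size of FP16 ones, which is why full precision strained the GeForce FX's memory bandwidth so badly.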

    This complicated things for developers, who had to keep every DirectX 9 compliant card in mind and code their games to run well on all of them, juggling FP16, FP24 and FP32, which was very confusing.

    As a result, developers had to use FP16 most of the time or not use DirectX 9 at all on GeForce FX graphics cards. FP16 caused a noticeable quality loss much of the time.

    NVIDIA fixed this with the GeForce 6, though, which runs FP32 just fine.

    FP32 is also exposed in OpenGL, from 1.5 onwards, through the same ARB extensions.

    Pixel/Vertex Shader 2.0a - To remedy the GeForce FX problem, NVIDIA provided an optimised shader profile, Pixel/Vertex Shader 2.0a, which structures code to suit the FX architecture and noticeably improves performance in many cases.

    Hope that was simple.

    Regards,
    Firecat
     
  5. Infinity

    Infinity Registered Member

    Joined:
    May 31, 2004
    Posts:
    2,651
    A few weeks ago I got myself a PCI-E 6600GT from PixelView.

    I love it.

    Heat is 42C with a Zalman cooler, and the graphics are stunning.

    The previous card was an FX5200 :D a little different...

    I hope you fix it; the latest drivers did it for me...
     
  6. Firecat

    Firecat Registered Member

    Infinity,

    let me tell you that the banding is really not seen at all; it's just that it's there. Doom 3 runs extremely well on my PC, no problems at all.
     
  7. PTGamer

    PTGamer Guest

    I have the GeForce FX 5200 vid card, and boy what a piece of junk that card is, I wouldn't recommend it for anyone. I would love to have the FX 5900XT, someday maybe if I catch it on sale.
     
  8. Firecat

    Firecat Registered Member

    PTGamer,

    what's your budget? Maybe I can help you.
     