KVM switches with EDID support?

Discussion in 'hardware' started by Gullible Jones, Aug 4, 2012.

Thread Status:
Not open for further replies.
  1. I just discovered that most of the video woes I've been getting on my workstation are due to my budget KVM switch. The switch blocks the monitor's EDID information, which leads to low resolution, poor refresh rate, and display flickering (because nouveau's output detection gets thrown off). So I'm looking for a better KVM switch.

    I'll admit I know next to nothing about these devices... What I need is one that
    - Does not block EDID
    - Supports USB input/output
    - Can connect to three or four PCs

    Supporting my monitor's full 1280x1024 resolution would be nice too, but not strictly necessary.

    What's a good brand/model for these features? Inexpensive would be nice, but if the "get what you pay for" rule applies here I'd be willing to pay more for better quality equipment.
     
  2. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    Yeah it does. There is so much that goes into KVM devices. They are not very high-tech, but poor quality will surely impact performance.

    It may not be the KVM but the cheap cables that came with it, or were bought for it.

    Sadly, you told us nothing of your computers or monitor. Do they support HDMI, DVI or D-Sub only?

    I recommend a Google search for [URL="https://www.google.com/search?q=HD+KVM+switch+reviews"]HD KVM switch reviews[/URL].

    Note too that distance/cable length can play a major role in display quality.
     
  3. Thanks... The monitor is a BenQ LCD model, VGA only, up to 1280x1024. I expect to be mostly attaching computers with VGA outputs to it, but my workstation has a GeForce FX video card with only DVI out - I have to use an adapter for that.
     
  4. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    Well, VGA via D-Sub is analog only and as such, is often limited by cable distance even more than digital. If you can get by with 6ft, or better yet, 3ft cables, I would.

    It is my experience, however, that rebooting the machine selected by the switch usually gets the monitor and graphics solution in sync - though that gets annoying real quick. That said, if all these computers are networked, you can use KVM software and get rid of the switch and all those cables. Check out Input Director in particular.
     
  5. I was using 3 foot cables already. :(

    Reboots did not help. The issue was graphics cards messing up, or getting low resolution, due to lack of EDID info from the monitor.

    Interesting. Does that work kind of like VNC? I assume it has to be installed on all KVM'd computers?
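    For anyone else chasing this, a quick way to check whether the monitor's EDID is actually reaching the driver is to ask X for the output properties. This is a minimal sketch, assuming an X session with xrandr installed; if the KVM strips EDID, the EDID property simply won't appear for that output:

```shell
#!/bin/sh
# check_edid prints the monitor's EDID property when X and xrandr are
# available, otherwise a diagnostic message, and always returns 0.
check_edid() {
    if command -v xrandr >/dev/null 2>&1 && [ -n "${DISPLAY:-}" ]; then
        # --props dumps per-output properties; a live EDID link shows
        # an "EDID:" block of hex bytes, a KVM that blocks it shows nothing.
        xrandr --props | grep -A8 'EDID'
    else
        echo "no X display or xrandr found; cannot query EDID"
    fi
    return 0
}

check_edid
```

    If the EDID block shows up when the monitor is plugged in directly but vanishes behind the switch, the switch is the culprit.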
     
  6. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    Why are you using nouveau open source drivers instead of NVIDIA drivers?
    How do you know that?

    The EDID information (in a PC-to-monitor setup), unless I misunderstand it, simply tells your card (and Windows) which resolutions the monitor supports. But your card's resolution should already be set in Windows, so once the driver is loaded, the card will try to communicate with the monitor at that resolution. Note that even when the EDID is not read, Windows can still drive the display, because Windows doesn't talk to the monitor directly - it talks to the card - and all monitors (for the last 15 years or so) are PnP, or plug and play.

    Note that monitor drivers are NOT drivers in the real sense. They just provide the brand and model number, and then they gray out the resolutions NOT supported. It is up to the graphics card to then sync up.

    Ideally, you need all the monitors to use the same native resolution and the card should be set to that resolution.

    I found this about EDID Emulators - but that seems too costly a solution to me.

    Yes and yes - but after that, only a network cable needs to be connected, though you may have to change a couple settings in the BIOS Setup Menu to prevent the system from halting during boot due to a keyboard [missing] error.
     
  7. treehouse786

    treehouse786 Registered Member

    Joined:
    Jun 6, 2010
    Posts:
    1,411
    Location:
    Lancashire
    i had always suspected this, thanks for confirming
     
  8. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    When you think about it, it makes sense. The OS communicates with the graphics solution (card or integrated), not the monitor. All the OS really cares about is how many columns and rows, what color for each pixel, and for how long to fire that pixel. It is the graphics solution that has the responsibility to communicate that information to the monitor.

    It should also be noted that all PC graphics solutions, all PC operating systems, all PC motherboard chipsets (including the BIOS), and all PC monitors know how to communicate at basic VGA resolutions right out of the box. This is to ensure we (consumers) can see something besides a blank screen when we first connect and power on our systems. This is also how graphics work in Safe Mode, or when booting into the BIOS Setup Menu.
     
  9. axial

    axial Registered Member

    Joined:
    Jun 27, 2007
    Posts:
    479
    I've used a variety of KVMs over the years, most with various "performance issues" when dealing with high-res monitors (dual Samsung 244T at 1920x1200).

    The very best I have found are Gefen brand. Definitely on the pricey side, but they are top quality products.
     
  10. Okay, I have found a solution that doesn't make my eyes bleed (and doesn't require a new KVM switch). Looks like this:

    1. Run gtf to get a modeline. In this case I did 'gtf 1280 1024 75' to get a modeline for 1280x1024 pixels at 75 Hz.

    2. Stick it in the "Monitor" section of xorg.conf like so:

    Code:
    Section "Monitor"
            Identifier "Configured Monitor"
            Option "DPMS"
    ...
            # 1280x1024 @ 75.00 Hz (GTF) hsync: 80.17 kHz; pclk: 138.54 MHz
            Modeline "1280x1024_75.00"  138.54  1280 1368 1504 1728  1024 1025 1028 1069  -HSync +Vsync
    ...
    EndSection
    3. Create an entry for it in the "Screen" section, like so:

    Code:
    Section "Screen"
            Identifier "Configured Screen"
            Monitor "Configured Monitor"
            Device "Configured Device"
            DefaultDepth 24
            Subsection "Display"
                    Depth 24
                    Modes "1280x1024_75.00" ...
            EndSubSection
    EndSection
    After X restarts, the monitor will use your modeline - assuming the monitor can actually handle that mode. If it can't, you could in theory damage the monitor, so be careful!
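    If you'd rather test a mode before committing it to xorg.conf, the same modeline can be tried at runtime with xrandr. A sketch under assumptions: the output name "VGA-1" is a placeholder (run plain xrandr first to see your real output names), and the mode numbers are the gtf values from above:

```shell
#!/bin/sh
# Try a GTF modeline at runtime instead of editing xorg.conf.
# "VGA-1" is an example output name - check yours with plain `xrandr`.
MODELINE='138.54  1280 1368 1504 1728  1024 1025 1028 1069 -hsync +vsync'
OUTPUT='VGA-1'

if command -v xrandr >/dev/null 2>&1 && [ -n "${DISPLAY:-}" ]; then
    xrandr --newmode "1280x1024_75.00" $MODELINE        # register the mode with X
    xrandr --addmode "$OUTPUT" "1280x1024_75.00"        # attach it to the output
    xrandr --output "$OUTPUT" --mode "1280x1024_75.00"  # switch to it
else
    echo "no X display or xrandr found; nothing to do"
fi
```

    Changes made this way last only until X restarts, which makes it a safe way to experiment before writing the modeline into xorg.conf permanently.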
     
  11. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    Well, that's all over my head. I am glad it works for you and hopefully it will for others.
     
  12. jwcca

    jwcca Registered Member

    Joined:
    Dec 6, 2003
    Posts:
    772
    Location:
    Toronto
    Me too.

    It might be a Linux thing.;)
     
  13. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    I have just made it a supreme goal over the last few decades to avoid anything that looks like programming. That's why I'm a hardware guy.
     