Subscriber Discussion

Are There Any Advantages To Using DVI-I Over VGA Or DVI-D Over HDMI?

Carl Lindgren
Mar 14, 2013

DVI appears to be disappearing as a video transport technology. It is getting harder and harder to find computers, video cards, and monitors with DVI capability. In addition, my tests have shown that DVI-I offers no advantage in picture quality over VGA, and DVI-D offers no advantage over HDMI.

One indication of this is how cheap basic DVI-I-to-VGA adapters are (as low as $4); the same can be said for DVI-D-to-HDMI adapters.

Matt Ion
Mar 15, 2013

Carl, first off, I think when you refer to DVI-I, you're thinking more of DVI-A, which is the spec for analog signals over a DVI connector; DVI-I carries both analog and digital on one connector. Most computers with DVI-I outputs support both signal types; the card detects what type of monitor is attached and enables the appropriate signal.
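
(As an aside, you can watch that detection happen yourself on a Linux box. Here's a minimal sketch, assuming the standard /sys/class/drm sysfs layout, that lists each video connector and whether a monitor has reported itself; the exact paths and connector names will vary by card and driver.)

    # Minimal sketch (Linux only): list DRM connectors and whether a display
    # is attached, which is roughly how the driver "knows" what's plugged in.
    # Assumes the standard /sys/class/drm layout; names vary by card/driver.
    import glob
    import os

    for conn in sorted(glob.glob("/sys/class/drm/card*-*")):
        status_path = os.path.join(conn, "status")
        if not os.path.exists(status_path):
            continue
        with open(status_path) as f:
            status = f.read().strip()  # "connected" or "disconnected"
        edid_path = os.path.join(conn, "edid")
        edid_size = os.path.getsize(edid_path) if os.path.exists(edid_path) else 0
        # A non-empty edid blob means the monitor reported its capabilities over DDC
        print(f"{os.path.basename(conn)}: {status}, EDID bytes: {edid_size}")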

Now that said... you SHOULDN'T see a difference between DVI-A and VGA, because DVI-A carries the same analog signal as VGA. That's why adapters are cheap - they're just pin-to-pin adapters.

Similarly, you shouldn't see a difference between DVI-D and HDMI, since HDMI's video signal is basically the same as DVI's; HDMI merely adds digital audio, extra video modes, control signals, and other data (the latest spec even rolls in an Ethernet channel). And as such, DVI-D/HDMI adapters are pretty much pin-to-pin as well.

If you want to see the difference, put an actual DVI-D or HDMI monitor beside a VGA monitor, running off the same machine. Even with a good quality cable and set at the proper native resolution, the VGA will often show a "softness" to the picture. It's often not even noticeable until you have the two side-by-side.

All that aside, if DVI is "dying off", it's probably mostly because HDMI offers the same signal and compatibility, but in a more consumer-friendly form. The additional data HDMI carries isn't applicable in a lot of computer applications, but it is in multimedia, home theater, etc., so it makes more sense to build displays, receivers, and other non-computer gear to accept HDMI, since they can still accept the DVI image. HDMI has the added "benefit" of a smaller connector, which allows for much higher density - a four-head HDMI card can fit in a PC backplane in the same space as only two DVI connectors. In short, DVI is simply being obsoleted by HDMI.

As for VGA... I certainly hope it's not going away anytime soon. It's far more flexible and versatile for computer/surveillance purposes. VGA is easy to extend several hundred feet over (relatively) low-cost baluns and KVM extenders; DVI and HDMI typically require expensive converters to do the same thing. If you need to output a VMS to multiple remote monitors, VGA can be substantially less expensive.

Ethan Ace
Mar 15, 2013

Going back to the days when I actually did A/V, HDMI presented a problem other formats did not: HDCP (High-bandwidth Digital Content Protection). It caused so many problems in commercial settings, PCs included, because either the source or the display could require HDCP to be present (and often did), and if it wasn't there, the image simply wouldn't display. This meant that using HDMI-to-DVI adapters often resulted in one component or another freaking out because HDCP wasn't there. It also meant that trying to use distribution amps to feed multiple displays didn't always work, because HDCP was licensed per connection, and some sources could only feed one destination. All this so that people "couldn't" pirate the source material. That obviously worked...

Anyway, the whole thing just left a bad taste in a lot of folks' mouths, which is part of how DisplayPort ended up gaining ground: it could pass HDCP but didn't require it, so the commercial A/V world was drawn to it. In my experience, most new equipment and multi-head video cards I've used have been DisplayPort instead of HDMI because of this (the Matrox M-series cards are a prime example), and one simply gets the proper cable to adapt it to the monitor. I think HDMI is more common on consumer cards, though, so people don't have to deal with adapters.

As far as one looking better than another, there shouldn't be a difference between DVI-A and VGA or between DVI-D and HDMI at identical resolutions. Digital signals, whether DVI-D, HDMI, or DisplayPort, should look better than VGA, though. I've never seen an instance where they didn't (though I'm sure someone will give me an example).
