DVI-D vs HDMI

OldGunney

Registered User
Clan
Nov 12, 2018
1,860
1,285
113
75
Kentucky
Which is better, DVI-D or HDMI? I had to run a new HDMI cable since my DVI-I cable would not fit the DVI-D port.

Just got a video card from a friend that's supposed to be a lot faster than what I had. New: ASUS Strix 1070; old: GTX 950.

It seems to run just a tad faster on Medium settings in BF4 Goland. I usually keep the FPS capped at 69 since my monitor is only a 60 Hz monitor. When I installed the card, the Nvidia monitor program no longer showed up on my G15 keyboard. While playing the game I got FPS in the 90s.

Now, the slower FPS may be due to my ASUS motherboard not being able to handle the PCIe 3.0 interface; mine is set for PCIe 2.0. It's an older mobo, and the video card is backward compatible, just slower.
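For scale, here's a rough back-of-the-envelope comparison of the two slot generations (a Python sketch; the per-lane transfer rates and encoding overheads are the published spec figures, and the helper function is just for illustration). In practice a single GTX 1070 rarely saturates even a Gen 2 x16 slot in games, so the slot alone usually costs only a few percent.

```python
# Rough one-direction bandwidth of a x16 slot (published spec numbers):
# PCIe 2.0 runs 5 GT/s per lane with 8b/10b encoding (80% efficient),
# PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding (~98.5% efficient).

def pcie_x16_gb_per_s(gt_per_s, encoding_efficiency, lanes=16):
    """Usable bandwidth in GB/s for one direction of a PCIe link."""
    return gt_per_s * 1e9 * encoding_efficiency * lanes / 8 / 1e9

gen2 = pcie_x16_gb_per_s(5.0, 8 / 10)      # ~8.0 GB/s
gen3 = pcie_x16_gb_per_s(8.0, 128 / 130)   # ~15.8 GB/s
print(f"PCIe 2.0 x16: {gen2:.1f} GB/s | PCIe 3.0 x16: {gen3:.1f} GB/s")
```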
 
DVI is in the process of being phased out. It's an old standard with limited bandwidth for video output compared to DisplayPort or HDMI. However, from what it sounds like, there could be a lot of different bottlenecks. If your board is still PCIe 2.0, the CPU paired with it is the most likely culprit, not the slot itself.
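To put numbers on the bandwidth point, here's a rough sketch (the effective link rates are the commonly quoted spec figures after encoding overhead; the ~10% blanking allowance and the helper function are approximations for illustration, not a real timing calculation):

```python
# Effective video data rates (after encoding overhead), commonly quoted:
LINK_GBPS = {
    "DVI dual-link": 7.92,
    "HDMI 1.4": 8.16,
    "HDMI 2.0": 14.4,
    "DisplayPort 1.2": 17.28,
}

def needed_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking=1.1):
    """Rough data rate a video mode needs; ~10% blanking is an approximation."""
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

for label, hz in [("1080p @ 60 Hz", 60), ("1080p @ 144 Hz", 144)]:
    need = needed_gbps(1920, 1080, hz)
    fits = [name for name, cap in LINK_GBPS.items() if cap >= need]
    print(f"{label}: ~{need:.1f} Gbps -> fits: {', '.join(fits)}")
```

By that rough math, 144 Hz at 1080p only just squeezes into dual-link DVI (which is why 144 Hz DVI monitors rely on reduced blanking), while DisplayPort has plenty of headroom. Whether a given HDMI 1.4 monitor input actually accepts more than 60 Hz is another matter; many of that era only exposed high refresh over DVI or DisplayPort.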
 
That would involve buying another mobo just for a little more speed. I would like to bump up to a faster monitor that handles higher frame rates.

On another question: isn't there a plug-in for the Logitech G15 keyboard's LCD display so I can see the temp and memory usage like before? The Nvidia plugin was nice, but this is an ASUS video card.
 

ASUS GeForce products are still Nvidia cards. You might need to update whatever software was showing the temp data so it can read the new hardware (assuming it can be updated and wasn't abandoned).
 
My ASUS 1070 video card does have two HDMI ports. Is that what you are calling 'DisplayPort'?

Nope, they are different.

[Image: HDMICABLES.jpg]

Left is HDMI and right is DisplayPort.
 
My card has two HDMI ports and two DisplayPorts. I doubt my old Acer monitor can use DisplayPort. Is DisplayPort better than the HDMI ports?
 

DisplayPort (DP) is mainly needed if the monitor or TV you want to game on is 4K (3840x2160) or higher and the game you're playing supports it. I think BF4 only goes up to 1920x1080. (I might be wrong.)

At 4K, older HDMI (1.4) will only muster 30 Hz; for 4K at 60 Hz you need HDMI 2.0 or DisplayPort, and DP has the most bandwidth of the bunch.

In short, if your video card and monitor support DP, then use it. One other thing: some older DP gear didn't carry audio, but I believe most now does.
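Rough numbers behind that 30 Hz limit (the link rates are the usual quoted effective figures; the ~10% blanking factor is a guess, so treat this as a sketch):

```python
# Quick check on the "4K at only 30 Hz" point: HDMI 1.4 carries about
# 8.16 Gbps of video data, HDMI 2.0 about 14.4 Gbps, DisplayPort 1.2
# about 17.28 Gbps (commonly quoted effective rates).
# 4K, 8-bit RGB, with ~10% added for blanking (rough guess):
gbps_4k30 = 3840 * 2160 * 30 * 24 * 1.1 / 1e9   # ~6.6 Gbps  -> fits HDMI 1.4
gbps_4k60 = 3840 * 2160 * 60 * 24 * 1.1 / 1e9   # ~13.1 Gbps -> needs HDMI 2.0 or DP
print(f"4K30 ~{gbps_4k30:.1f} Gbps, 4K60 ~{gbps_4k60:.1f} Gbps")
```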
 

Regarding the G15: it gets its info from the OS. Just download the software from Logitech and set it up.
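The temp and memory numbers ultimately come from the Nvidia driver, not from ASUS, so anything that reads them through the OS will still work with the new card. As a quick sanity check that the data is there (a minimal sketch assuming the pynvml package, the Python bindings for Nvidia's NVML library, is installed; it is not the Logitech LCD applet itself, just the same data source such an applet would read):

```python
# Read GPU temperature and memory usage from the Nvidia driver via NVML.
# This just proves the OS and driver expose the data for the new card.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)          # first GPU in the system
temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)           # .used / .total in bytes
print(f"GPU temp: {temp} C, VRAM: {mem.used / 2**20:.0f} / {mem.total / 2**20:.0f} MiB")
pynvml.nvmlShutdown()
```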
 
DisplayPort is the way to go. My 2-year-old ASUS monitor has it, as does my EVGA GTX 1060 video card.

^ This

I've run DisplayPort on my last 2 G-Sync monitors (Acer 27" and now Alienware 36") and last 3 video cards (980, 1070, and now 1080 Ti), and it rocks performance-wise.

That said, when it comes to refresh rates (roughly, at 1080p):
Dual-link DVI supports 144 Hz
HDMI 1.4 monitor inputs are typically limited to 60 Hz (HDMI 2.0 and later go higher)
DisplayPort supports 144 Hz and up

So while DVI is older and soon to be legacy hardware, it will still handle a higher refresh rate than older HDMI with a monitor that supports it.

Note: That 1070 should have a DisplayPort output.
 
It depends on the GPU and display. HDMI and DisplayPort are generally neck-and-neck, each new version matching or beating the other, but you probably won't see the latest versions on hardware until next-gen. Much like how the newest graphics memory shows up on GPUs before DDR5 is available for system memory.

 
As someone who has enjoyed G-Sync at 144 Hz for several years now, there is no going back to a non-G-Sync monitor with a lower 60 Hz refresh rate.

I'm an Nvidia gamer, always have been, but that said, if I had a Radeon I would only buy a FreeSync monitor. I've gotten spoiled on fast refresh, no ghosting, no stuttering, and no screen tearing.

From Tom's Hardware:
HDMI Vs. DisplayPort: The Bottom Line for Gamers

So, which of the two makes the most sense for PC gaming? Well, it depends on what you already own, and what your intent is.
In some cases, your choice is pre-ordained. If you pick up a GeForce graphics card and a G-Sync monitor, you might notice that you don’t have much (or any) choice in your display technology. The only connector that currently works with G-Sync is DisplayPort. (Newer G-Sync-capable monitors also have HDMI ports, but those ports won't support the G-Sync feature.) So, if you do have a G-Sync display, you'll want to stick to DisplayPort--at least for gaming purposes.

If you do have a decision to make--that is, if both your graphics card and your PC display have HDMI and DisplayPort on board and available--what’s a gamer to do?

Well, the current state of HDMI supports higher theoretical maximum resolutions. But you’d need a monster of a system--that is, one that probably doesn’t exist yet (at least in any reasonably attainable price range)--to play games at anywhere near the top bandwidth and resolutions that HDMI supports. And games you play will have to expressly support those extreme frame rates, as well. So a higher theoretical resolution ceiling doesn’t make HDMI the inherent right choice for PC gaming, for the vast majority of folks.

DisplayPort, meanwhile, makes sense if you want to game on multiple monitors but have just one DisplayPort connection available (say, if you're using a gaming laptop with just one DisplayPort out). The port is "splittable" via DisplayPort hubs, or displays can be daisy-chained. Note, though, that this works only if the monitor has DisplayPort out and supports a feature called Multi-Stream Transport (MST). Most monitors have the latter, but very few have the former. It's more typical to run multiple cables from a single video card rather than daisy-chaining.

All else being equal, though, for gaming on a single display at workaday resolutions and refresh rates, you'll get roughly the same results from either interface, as long as you're not running up against the limitations of G-Sync/FreeSync support. For what it's worth, the Video Electronics Standards Association (VESA)--the governing body that fashioned DisplayPort as a replacement for DVI and VGA--intended it for PC-centric uses, whereas HDMI was conceived by a group of consumer-electronics companies with TV implementations in mind.

Because of its TV-based aims, one of the primary initial features/goals of HDMI was content protection. This arrived in the form of High-bandwidth Digital Content Protection (HDCP), which was developed by Intel to prevent copying of digital video and audio. You'll need HDCP support to use most major streaming services from your PC, as well as to watch DVD and Blu-ray discs. But fear not, as HDCP support is baked in to both DisplayPort and HDMI. As long as your graphics card--or the integrated graphics inside your CPU--were made in the last several years, you should be able to watch HDCP content over either HDMI or DisplayPort. The two connectors (and the tech inside them) were meant to be complementary, not competing, technologies.

Don’t despair if your laptop or desktop has only an HDMI port. As long as your monitor isn't G-Sync or FreeSync enabled, the truth of the matter is that you won’t notice a glaring difference in anything that matters (frame rate, refresh rate, latency, or anything that gamers love to brag or argue about) either way.
 
I second BikerDog. G-Sync was life-changing tech for me; I would take a lesser video card with G-Sync. The only way I can describe it is "fluid motion." It didn't give me an FPS increase in BF4 with my Nvidia 1080, but it did make everything fluid, especially when trying to hit moving targets or dodge terrain (heli or jet).

I had to learn way more than I wanted to about DVI, DisplayPort, and HDMI early this year when I decided to buy a dock for my old 2013 MacBook Pro, which can only output HDMI. To get 3K-4K, I quickly learned that not all HDMI is the same either... I ended up having to buy an adapter to get my MacBook to output the resolution I wanted. You would think, hey, just plug in the cable, but nope... I also game on classic consoles, so I have messed with all sorts of adapters to get those systems onto newer TVs; everything turns into a project... Remember the good old days when you just put the cart in the Atari and turned it on? lol...
 

Dog, that's some good dope! Yes, the 1070 card does have a DisplayPort output.

My monitor was being fed through its own VGA port, so I assume it was at the 60 Hz rate. It is an Acer H236HL, and the manual states it only supports 60 Hz at 1920x1080. While I'm playing, the screen doesn't pan smoothly when I turn and it makes my eyes twitch, but I get used to it.

Since the setup I have now only gives me frame rates between 89 and 115, would I have to select a monitor that runs at those frame rates, or would 144 Hz be OK?
 
That Acer model is a 60 Hz screen, and while your 1070 can push 89-115 FPS, you may get screen tearing because the card is delivering frames faster than the monitor can refresh. A 144 Hz monitor doesn't need your frame rate to match it exactly; it just has to refresh at least as fast as the card sends frames, so 144 Hz (ideally with G-Sync) would handle 89-115 FPS fine.
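To put numbers on that (plain arithmetic using the figures already in the thread):

```python
# Frame delivery vs. refresh: at 89-115 FPS the card finishes a frame every
# ~8.7-11.2 ms, but a 60 Hz panel only refreshes every ~16.7 ms, so frames
# get swapped mid-refresh (tearing). A 144 Hz panel refreshes every ~6.9 ms,
# so it can keep up with anything under 144 FPS; G-Sync/FreeSync goes further
# and times each refresh to the frame.
for fps in (60, 89, 115, 144):
    print(f"{fps:>3} FPS/Hz -> {1000 / fps:.1f} ms per frame/refresh")
```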

But what do I know?
 
