
NVIDIA GeForce RTX 3070 and RTX 3070 Ti Rumored Specifications Appear

Another unattributed wall of text with no reference.

It seems astroturf pays big time. Why "trolling" has a negative connotation is beyond me.
 
How about some real video-output upgrades, & not just more & faster memory & bandwidth?

We've had HDMI 2.0(b) & DisplayPort 1.4, as well as HDR on capable hardware, since at least the GTX 10 series in 2016 (even the lowly 2GB MSI GT 1030 GDDR5 can do this as a x4 card), & some of the 900 series can deliver the same with a DisplayPort firmware update released 2-3 years ago. Since then, 4K TVs have come a long way; several models now boast HDMI 2.1 & run 4K at 120 Hz. These cards cannot keep up with HDMI 2.1 speeds, unless future spec sheets show otherwise.

There are many of us who aren't hardcore gamers (if at all) but rather video enthusiasts, & in this respect the RTX 3000 series (as well as the 2000 series) doesn't deliver: it cannot run 4K at 120/144 Hz! There'll be more & more of these 4K TVs with HDMI 2.1, & the feature will trickle down to value brands such as VIZIO. 4K at 120 Hz is an official spec of HDMI 2.1, as is 8K at 60 Hz (there are quite a few of those on the market as well). Try gaming on these & you'll be disappointed when the card cannot keep up with the native refresh rate of the TV. Pricing is getting competitive; soon we'll be seeing these TVs for less than the cost of a mid-range (RTX 3070) GPU. Furthermore, HDMI 2.1 will make 2K gaming obsolete: those monitors are pricey & few TVs shipped at that resolution (the only one I've seen was at Walmart & I didn't look closely). 4K is dropping in price, & a few 8K models are too.

BTW, eARC is also part of the HDMI 2.1 spec, & that alone is a big deal! HDMI 2.0(b) ARC runs at only about 1 Mbps; eARC raises that to roughly 37 Mbps, far more than an optical connection can carry.
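To put rough numbers on the video side of this, here's a back-of-envelope sketch in Python (assuming ~20% blanking overhead & approximate effective link rates of ~14.4 Gbps for HDMI 2.0 & ~42.7 Gbps for HDMI 2.1 after encoding overhead; exact figures depend on the video timing used):

```python
# Back-of-envelope link budget: why 4K @ 120 Hz needs HDMI 2.1.
# Assumptions: ~20% blanking overhead; effective data rates of ~14.4 Gbps
# for HDMI 2.0 (18 Gbps TMDS, 8b/10b) & ~42.7 Gbps for HDMI 2.1 (48 Gbps
# FRL, 16b/18b). Real timings & overheads vary.

def video_gbps(h, v, hz, bpp, blanking=1.20):
    """Approximate uncompressed video data rate in Gbit/s."""
    return h * v * hz * bpp * blanking / 1e9

HDMI_2_0 = 14.4
HDMI_2_1 = 42.7

modes = {
    "4K @ 60 Hz, 8-bit RGB":  (3840, 2160, 60, 24),
    "4K @ 120 Hz, 8-bit RGB": (3840, 2160, 120, 24),
    "8K @ 60 Hz, 8-bit RGB":  (7680, 4320, 60, 24),
}

for name, (h, v, hz, bpp) in modes.items():
    rate = video_gbps(h, v, hz, bpp)
    print(f"{name}: ~{rate:.1f} Gbps | "
          f"HDMI 2.0: {'fits' if rate <= HDMI_2_0 else 'no'} | "
          f"HDMI 2.1: {'fits' if rate <= HDMI_2_1 else 'needs DSC'}")
```

The takeaway: 4K at 60 Hz 8-bit just squeezes through HDMI 2.0, while 4K at 120 Hz & 8K at 60 Hz need HDMI 2.1 (the latter with DSC or chroma subsampling even then).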


Every LG 4K TV released this year has HDMI 2.1 & all of its official specs in full glory. :clap:


Therefore, let's slow down with simply beefing up the memory, bandwidth & bus, & get in tune with 2020's HDMI 2.1/DP 2.0 video standards. After all, these are video cards (not 100% gaming cards), so why are we 5 years w/out an HDMI/DP upgrade? We must demand more from not only NVIDIA but also AMD in this regard. While I've been a fan of EVGA for 7-8 years, this could change if another brand provided a GPU in tune with the times. I'm not a fanboy of any brand, other than whichever one meets my needs (which is why I've not upgraded my Z97 build with the i7-4790K). I have both Intel & AMD computers (most self-built), & any of my best ones could go another 5 years with an up-to-date GPU. :D

Cat
 
A) It is entirely expected that upcoming GPUs will feature HDMI 2.1.
B) You're misrepresenting things by saying we've gone five years without an I/O upgrade; current-gen cards mostly launched 1-2 years ago.
C) AFAIK there still isn't a certification process for HDMI 2.1 source devices, just for signal sinks (TVs etc.). How are they then to make something compliant? They certainly couldn't when they last launched a series of GPUs.
D) eARC isn't relevant for GPUs (unless you for some reason want your TV/monitor to send its audio back to your PC?), only for TVs/monitors and receivers/amplifiers/soundbars. As long as your TV and amp support eARC, it doesn't matter if your PC doesn't.
E) You're saying GPU makers should take it easy with performance increases (which are always wanted and "necessary" inasmuch as anything with a GPU can be said to be) to prioritize ... a trivial upgrade that likely represents <1% of the effort of said performance increases. I don't know about you, but I fully expect them to be able to deliver both.
 
People should realize that DVI is royalty-free while HDMI is not, on top of HDMI being crippled by useless "OMG, people might see it for free" encryption (HDCP, hacked wide open as usual) that complicates it unnecessarily.
 
Did you mean DP? DVI tops out at 2560x1600 @ 60 Hz even with dual-link cables/connectors. Hardly a modern interface.
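For reference, here's a quick sketch of where that dual-link DVI ceiling comes from, assuming a 2x165 MHz TMDS pixel-clock limit & rough reduced-blanking margins (the blanking figures are estimates; actual monitor timings vary):

```python
# Why dual-link DVI tops out around 2560x1600 @ 60 Hz: a single TMDS link
# is limited to a 165 MHz pixel clock, so dual-link gives ~330 MHz.
# Blanking figures below are rough reduced-blanking estimates.

DUAL_LINK_MHZ = 2 * 165  # ~330 MHz pixel-clock ceiling

def pixel_clock_mhz(h, v, hz, h_blank=160, v_blank=46):
    """Approximate pixel clock for a mode with assumed blanking margins."""
    return (h + h_blank) * (v + v_blank) * hz / 1e6

for name, mode in {
    "2560x1600 @ 60 Hz": (2560, 1600, 60),
    "3840x2160 @ 60 Hz": (3840, 2160, 60),
}.items():
    clk = pixel_clock_mhz(*mode)
    verdict = "fits" if clk <= DUAL_LINK_MHZ else "exceeds"
    print(f"{name}: ~{clk:.0f} MHz ({verdict} dual-link DVI)")
```

2560x1600 @ 60 Hz lands around 270 MHz & fits; 4K @ 60 Hz needs roughly 530 MHz & doesn't come close.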
 