Monday, August 20th 2018

NVIDIA GeForce RTX 2000 Series Specifications Pieced Together

Later today (20th August), NVIDIA will formally unveil its GeForce RTX 2000 series consumer graphics cards. This marks a major change in the brand name, triggered by the introduction of the new RT Cores, specialized components that accelerate real-time ray tracing, a task too taxing for conventional CUDA cores. RT cores accelerate the ray-intersection and traversal work of ray tracing, while DNN acceleration relies on SIMD components that crunch 4x4x4 matrix multiply-accumulate operations, which is what the tensor cores specialize in. The chips still have CUDA cores for everything else. This generation also debuts the new GDDR6 memory standard, although unlike GeForce "Pascal," the new GeForce "Turing" won't see a doubling of memory sizes.
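For a feel for what that matrix work looks like, below is a minimal NumPy sketch of the D = A x B + C multiply-accumulate a tensor core performs on small tiles; the 4x4 tile size and the FP16-input/FP32-accumulate scheme are assumptions carried over from NVIDIA's published "Volta" tensor-core material, not confirmed "Turing" details.

```python
# Minimal sketch of the tile-level multiply-accumulate (D = A x B + C) that
# tensor cores perform in hardware. The 4x4 tile size and the FP16-inputs /
# FP32-accumulate scheme are assumptions based on NVIDIA's "Volta" material.
import numpy as np

def tensor_core_mma(a: np.ndarray, b: np.ndarray, c: np.ndarray) -> np.ndarray:
    """Multiply-accumulate one 4x4 tile: FP16 inputs, FP32 accumulation."""
    assert a.shape == b.shape == c.shape == (4, 4)
    a16 = a.astype(np.float16).astype(np.float32)  # inputs stored at half precision
    b16 = b.astype(np.float16).astype(np.float32)
    return a16 @ b16 + c.astype(np.float32)        # accumulate in full precision

rng = np.random.default_rng(0)
a, b, c = rng.standard_normal((3, 4, 4))
print(tensor_core_mma(a, b, c))
```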

NVIDIA is expected to debut the generation with the new GeForce RTX 2080 later today, with market availability by the end of the month. Going by older rumors, the company could launch the lower-positioned RTX 2070 and higher-positioned RTX 2080+ by late September, and the mid-range RTX 2060 series in October. Apparently the high-end RTX 2080 Ti could come out sooner than expected, given that VideoCardz already has some of its specifications in hand. Not a lot is known about how "Turing" compares with "Volta" in performance, but given that the TITAN V comes with tensor cores that can [in theory] be re-purposed as RT cores, it could continue on as NVIDIA's halo SKU for the client segment.
The RTX 2080 and RTX 2070 series will be based on the new GT104 "Turing" silicon, which physically has 3,072 CUDA cores and a 256-bit wide GDDR6-capable memory interface. The RTX 2080 Ti is based on the larger GT102 chip. Although the maximum number of CUDA cores on this chip is unknown, the RTX 2080 Ti is reportedly endowed with 4,352 of them, and has a slightly narrower 352-bit memory interface than the 384-bit the chip is capable of. As we mentioned earlier, NVIDIA doesn't seem to be doubling memory amounts, so we could expect 8 GB for the RTX 2070/2080 series and 11 GB for the RTX 2080 Ti.
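Those bus widths and capacities line up neatly if you assume one 8 Gb (1 GB) GDDR6 device per 32-bit memory channel, which is an assumption on our part rather than a confirmed spec; the quick sketch below shows the arithmetic.

```python
# Rough sketch of how bus width and memory capacity follow from the number of
# enabled 32-bit GDDR6 channels. The 8 Gb (1 GB) per-device density is an
# assumption, not a confirmed specification.

def memory_config(enabled_channels: int, gbit_per_device: int = 8):
    bus_width_bits = enabled_channels * 32                 # each channel is 32 bits wide
    capacity_gb = enabled_channels * gbit_per_device / 8   # one device per channel
    return bus_width_bits, capacity_gb

for name, channels in [("GT104 (RTX 2070/2080)", 8),
                       ("GT102 as cut down for the RTX 2080 Ti", 11),
                       ("GT102 fully enabled", 12)]:
    bus, cap = memory_config(channels)
    print(f"{name}: {bus}-bit bus, {cap:.0f} GB")
# Prints 256-bit / 8 GB, 352-bit / 11 GB, and 384-bit / 12 GB respectively.
```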
Source: VideoCardz

25 Comments on NVIDIA GeForce RTX 2000 Series Specifications Pieced Together

#1
cyneater
Wow same amount of GRam....

So is it another same old same old...
#2
fore1gn
cyneater: Wow same amount of GRam.... So is it another same old same old...
Do you really need 16 GB of RAM on your graphics card? We can't even max out the 8 GB the 1080 has.
#3
Gungar
fore1gn: Do you really need 16 GB of RAM on your graphics card? We can't even max out the 8 GB the 1080 has.
Ofc we can max them out xD
#4
Bjorn_Of_Iceland
fore1gn: Do you really need 16 GB of RAM on your graphics card? We can't even max out the 8 GB the 1080 has.
Gungar: Ofc we can max them out xD
I can confirm that I can max out my 1080ti in FF15 XD.
#5
swirl09
There's only a handful of games where I would see 9GB used (at 4K+). Most stay under 6GB.

Not fussed about more VRAM just yet; the increase in bandwidth is welcome and I am sure I'll be wanting more the next round, but for now that 11GB will do for 4K and I would think 8GB for lower won't be a problem for another gen or 2.
#6
SetsunaFZero
RT-Core and Tensor-Core counts are missing in this table
#7
StrayKAT
So the real-time ray tracing feature is for consumers...? How is it going to work for games? Is it going to require specific Nvidia features to be coded into them (like Hairworks, etc.)?
#8
ShurikN
One of those two extremely similar 2080s seems pointless.
#9
bpgt64
Really would have liked a model with 22 GB of RAM, or even 16 GB would have been fine.
#10
Xzibit
StrayKAT: So the real-time ray tracing feature is for consumers...? How is it going to work for games? Is it going to require specific Nvidia features to be coded into them (like Hairworks, etc.)?
Yes

DX12 + DXR + GameWorks / RTX

Like Physics acceleration a la PhysX
Nvidia: The upcoming GameWorks SDK — which will support Volta and future generation GPU architectures — enables ray-traced area shadows, ray-traced glossy reflections and ray-traced ambient occlusion.
#11
JalleR
Maybe the cards will get the same "able to share texture cache" feature as the Quadros, then you just buy one more card and an NVLink bridge and BOOOM you have 22GB....

I maxed out my 8GB 1080 when playing Raw Data in early development with tweaked rendering distance, but haven't seen that with my 1080 Ti.
#12
DeathtoGnomes
Why would they only use 12GB for the Titan V? Intentional hampering, maybe?
#13
Liviu Cojocaru
5 hours to go and we will be blown away by the new GPUs from nVidia... or maybe not? Not much left to go
#14
ppn
1 channel is broken. Ray tracing uses less memory as opposed to regular textures, which can get overbloated.

RTX 4000 series next year with 2048/4096/6144 cores, or it stays the same at 1536/3072/4608, just shrunk to 1/2 the size.
#15
olymind1
Based on those specs, I don't understand why they call that Ti card a 2080; that beast of a card is more like a 2090...
#16
rtwjunkie
PC Gaming Enthusiast
Gungar: Ofc we can max them out xD
Bjorn_Of_Iceland: I can confirm that I can max out my 1080ti in FF15 XD.
Cards with 11 GB of VRAM are not being maxed out, and there is no reason to have more for the foreseeable future.

You see “maxed out” only because the GPU is storing graphics there because it can, because it is there, NOT because it needs to.
#17
kings
My bet is the RTX 2080 on the level of GTX 1080 Ti performance, and the RTX 2080 Ti some 30~35% ahead.
#18
Capitan Harlock
I wanna see how the 2080 is going to perform vs the 2080+ with more CUDA cores.
Price is important too. If it is right I will think about it.
#19
Prima.Vera
Liviu Cojocaru: 5 hours to go and we will be blown away by the new GPUs from nVidia... or maybe not? Not much left to go
Why? We already know the performance numbers for the Titan V, which are not impressive in games at all tbh. Do you think that the 2080 Ti will be faster or, worse, the vanilla 2080??
I don't think so. So overall, I think this will turn out to be one of the most boring generations of cards in recent years...
#20
Liviu Cojocaru
Prima.Vera: Why? We already know the performance numbers for the Titan V, which are not impressive in games at all tbh. Do you think that the 2080 Ti will be faster or, worse, the vanilla 2080??
I don't think so. So overall, I think this will turn out to be one of the most boring generations of cards in recent years...
The Titan V is not really a gaming card and Volta is a bit different from Turing, so it remains to be seen.
#21
jsjones008
cyneater: Wow same amount of GRam.... So is it another same old same old...
Except it's GDDR6, which can almost double the bandwidth of GDDR5X. So, yeah, same amount of memory (which, as has been stated, can't even be maxed out in most gaming scenarios), but much faster at what it does.
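For a rough sense of the numbers, peak bandwidth is just the per-pin data rate times the bus width; the per-pin rates in the sketch below (10 Gbps for GDDR5X, 14 and 16 Gbps for GDDR6) are assumed typical figures, not confirmed RTX 2000 series specs.

```python
# Peak memory bandwidth = per-pin data rate x bus width. The per-pin rates
# below are assumed typical figures for each memory type, not confirmed
# RTX 2000 series specifications.

def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8  # GB/s

for label, rate in [("GDDR5X @ 10 Gbps", 10.0),
                    ("GDDR6  @ 14 Gbps", 14.0),
                    ("GDDR6  @ 16 Gbps", 16.0)]:
    print(f"256-bit bus, {label}: {peak_bandwidth_gbs(rate, 256):.0f} GB/s")
# Prints 320, 448, and 512 GB/s respectively.
```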
#22
xorbe
DeathtoGnomes: Why would they only use 12GB for the Titan V? Intentional hampering, maybe?
It's HBM2.
#23
peche
Thermaltake fanboy
rip "GTX"
#24
windwhirl
olymind1: Based on those specs, I don't understand why they call that Ti card a 2080; that beast of a card is more like a 2090...
This is just my guess, but the 9 was probably used to indicate two GPUs on the same card, like the GTX 295, the GTX 590 or the GTX 690. Since the 700 series, that spot was taken once by the Titan Z, and after that no other dual-GPU card was ever released for consumers, so Nvidia never used the 9 again in model numbers within a series.

I bet Nvidia would only consider that if AMD suddenly launched some overwhelmingly powerful and efficient GPU... which will probably not be the case for a long time.

EDIT: Honestly, Nvidia should just make the 2080+ card the "default" 2080; they are so similar I don't see the point of launching both of them as different products...
#25
Bjorn_Of_Iceland
rtwjunkie: Cards with 11 GB of VRAM are not being maxed out, and there is no reason to have more for the foreseeable future.

You see “maxed out” only because the GPU is storing graphics there because it can, because it is there, NOT because it needs to.
I see it the other way around. Not because it needs to, yes, but because it can. And by doing so, we are treated to consistent high-res textures and less jitter from texture streaming.