Thursday, August 23rd 2018

NVIDIA "TU102" RT Core and Tensor Core Counts Revealed

The GeForce RTX 2080 Ti is indeed based on an ASIC codenamed "TU102." NVIDIA was referring to this 775 mm² chip when it cited the 18.5 billion-transistor count in its keynote. The company also provided a breakdown of its various "cores" and a block diagram. The GPU is still laid out like its predecessors, but each of its 72 streaming multiprocessors (SMs) packs Tensor cores and an RT core in addition to CUDA cores.

The TU102 features six GPCs (graphics processing clusters), each of which packs 12 SMs. Each SM holds 64 CUDA cores, 8 Tensor cores, and 1 RT core, and each GPC contains six geometry units. For the full chip, that works out to 4,608 CUDA cores, 576 Tensor cores, and 72 RT cores, alongside 288 TMUs and 96 ROPs. The TU102 uses a 384-bit GDDR6 memory interface rated for 14 Gbps memory. There are also two NVLink channels, which NVIDIA plans to launch later as its next-generation multi-GPU technology.
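For reference, the headline totals and the peak memory bandwidth fall straight out of that breakdown. The snippet below is only an illustration of the implied arithmetic (the figures come from the breakdown above, not from any additional NVIDIA material):

```python
# Illustrative arithmetic based on the per-GPC/per-SM breakdown quoted above.
gpcs = 6
sms_per_gpc = 12
cuda_per_sm, tensor_per_sm, rt_per_sm = 64, 8, 1

sms = gpcs * sms_per_gpc              # 72 streaming multiprocessors
cuda_cores = sms * cuda_per_sm        # 4,608 CUDA cores
tensor_cores = sms * tensor_per_sm    # 576 Tensor cores
rt_cores = sms * rt_per_sm            # 72 RT cores

# Peak memory bandwidth: bus width (bits) x data rate (Gbps) / 8 bits per byte
bus_width_bits = 384
data_rate_gbps = 14
bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8  # 672 GB/s

print(sms, cuda_cores, tensor_cores, rt_cores, bandwidth_gb_s)
```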
Source: VideoCardz

65 Comments on NVIDIA "TU102" RT Core and Tensor Core Counts Revealed

#51
Aquinus
Resident Wat-man
gamerman: nvidia this release is brand new turing and has as always again brand new gpu with new graphics option and and new connectors and it is 40-50% faster than old pascal.
First of all, we don't really know how much faster these GPUs are because no reviews have been released yet; all we have are nVidia's own slides, which undoubtedly have bias, which is a thing.
gamerman: nvidia is factory and its build great gpus abut not doing it fun,its want property lke all business.
nvidia moust common csh coming indestry,more than 3/4 coming there so respect.
You know that nVidia uses the same fab as AMD, right? You know, that whole thing with TSMC. The only reason nVidia has been moving forward like this is because they focus on what gamers want and had a different lineup for the professional market (FP64 performance, anyone?). I suspect nVidia is changing that given the amount of compute they put in the RTX 2xxx chips. That however does not necessarily translate into a benefit for the consumer. It's also similar to what AMD does, to be honest, except AMD takes a simpler (and arguably more scalable) approach. Take the GTX 1080, for example: it's faster than the 390 when it comes to games but, when you talk about raw compute and texturing power, it's actually not.
gamerman: so new gpus cost little more than amds junk,but when u buy nvidia gpu,you get high qualitymfast,excellent efficiency latest tech gpu.
AMD's GPUs aren't junk. They're not the fastest but, they work and they get the job done. They don't have anything to compete with nVidia's high end cards but, that doesn't mean that they're junk. The 580 really isn't that bad of a GPU, neither are the Vega series cards but, people tend to go cross-eye retarded when they obsess about performance and don't stop for a minute to think about how they accomplished it.
gamerman: p.s. im not nvidia worker or so, i just see things what they are, i use bfore amd product cpu and gpu..not evermore again. i think amd cheat clear customer... i remembe again that terrible hypeting and commercial from vega gpu...
You don't need to be an nVidia employee for your views to be skewed, and your rhetoric kind of proves that.
gamerman: and what you get, old lausy tech slow junk,overpriced gpu with 500W power eat. phyi!
Wow, really? Even my 390 doesn't pull that kind of power, and it's one of the most power-hungry GPUs they've released.

With that said, I have a few suggestions:
  1. Tone down the rhetoric and choose your words with more precision.
  2. When making claims like the 40-50% faster bit and the 500 W thing, I expect sources to back up such seemingly erroneous claims.
  3. Reread what you're posting before posting it. The number of spelling and grammatical errors is simply astonishing and that doesn't help your argument. It also doesn't help when we try to read it and understand what you're trying to say. For example, I've re-read my post at least 2 or 3 times before posting it.
Posted on Reply
#52
efikkan
Aquinus: AMD's GPUs aren't junk. They're not the fastest but, they work and they get the job done. They don't have anything to compete with nVidia's high end cards but, that doesn't mean that they're junk. The 580 really isn't that bad of a GPU, neither are the Vega series cards but, people tend to go cross-eye retarded when they obsess about performance and don't stop for a minute to think about how they accomplished it.
The only segment where AMD currently competes is the low-end. People are not giving AMD a hard time for failing to produce a high-end contender, but because their mid-range cards are vastly inferior. The RX 580, RX Vega 56 and RX Vega 64 are all inferior to their counterparts from Nvidia, and it's not like they offer a major advantage to offset the large disadvantages; no, Nvidia's offerings are superior unless you have an extremely special use case for the card. At the moment Pascal offers 80-90% more performance per watt than Polaris and Vega, and Turing will push that further.

This didn't use to be the case. Back in the 2000s, ATI was competitive most of the time. They didn't always reach up to the high-end, but they were nearly always competitive (and beating Nvidia) throughout the mid-range. And even when their counterpart was slightly better, they were usually close enough that we were talking single-digit differences, or other benefits compensating for any deficiency. ATI used to be innovative and kept pushing new technology. After the acquisition by AMD, on the other hand, development was eventually outsourced (to China?) and development budgets cut down to almost nothing. Their current strategy is to tweak the same thing for >5 years and make money by offering customized versions of it. As they fall further and further behind, catching up will only get harder.
Posted on Reply
#53
londiste
efikkan: The only segment where AMD currently competes is the low-end. People are not giving AMD a hard time for failing to produce a high-end contender, but because their mid-range cards are vastly inferior. The RX 580, RX Vega 56 and RX Vega 64 are all inferior to their counterparts from Nvidia, and it's not like they offer a major advantage to offset the large disadvantages; no, Nvidia's offerings are superior unless you have an extremely special use case for the card. At the moment Pascal offers 80-90% more performance per watt than Polaris and Vega, and Turing will push that further.
The RX 580 competes well: slightly better performance for slightly less money, at the cost of worse power consumption. The RX 570 is competitive as well, but overall more evenly matched by the GTX 1060 3 GB. But these two are pretty much it for now. Nvidia has the low end covered with the 1050/1050 Ti.
Posted on Reply
#54
medi01
londiste: The RX 580 competes well: slightly better performance for slightly less money, at the cost of worse power consumption. The RX 570 is competitive as well, but overall more evenly matched by the GTX 1060 3 GB. But these two are pretty much it for now.
Vega 56 at MSRP is quite a good card against the 1070/1070 Ti.
efikkan: Their current strategy is to tweak the same thing for >5 years
Posting this crap on this site, freaking seriously?
Posted on Reply
#55
londiste
medi01: Vega 56 at MSRP is quite a good card against the 1070/1070 Ti.
At least in Europe, the Vega 56 tends to be about 50€ more expensive than the GTX 1070 Ti (and about the same as the GTX 1080) while performing largely the same, if not worse. Couple that with the much lower power consumption (and the better/quieter cooling that results) of the GTX cards, and Vega is not that enticing a proposition.
Posted on Reply
#56
StrayKAT
efikkan: Nvidia's offerings are superior unless you have an extremely special use case for the card.
I guess that's the category I'm in. I probably wouldn't recommend one either unless you had FreeSync. It was definitely a no-brainer for me, since both my monitor and my TV use it. edit: In effect, this is the trend for consoles at the moment. I'm just following that trend and getting the benefit of a PC to boot.

edit: Nvidia does have some "BFGD" displays in the works (apparently due after summer), but I bet the prices will be ridiculous. This is crazy high end... and once again, AMD is not getting beaten in the low-to-mid range market when you start counting display/GPU combos.
Posted on Reply
#57
hat
Enthusiast
StrayKAT: edit: Nvidia does have some "BFGD" displays in the works (apparently due after summer), but I bet the prices will be ridiculous.
No bet about it. You can probably buy your computer twice and still have money left over, compared to the price of one of those.
Posted on Reply
#58
StrayKAT
hat: No bet about it. You can probably buy your computer twice and still have money left over, compared to the price of one of those.
I look forward to Tom's Reviews' recommendation!
Posted on Reply
#59
hat
Enthusiast
StrayKAT: I look forward to Tom's Reviews' recommendation!
JUST BUY IT
Posted on Reply
#60
sepheronx
hat: JUST BUY IT
I read that Turing article. I was quite surprised that something like that would even be published at all. No real hard data or anything. It was just a long page of someone dancing around a feature that hasn't been properly tested by anyone and has only been showcased by Nvidia, then telling readers to "just buy it because life is short".

I can write way better than that. How do I get a job writing such articles and get paid for it?
Posted on Reply
#61
hat
Enthusiast
I guess you'd have to become a hardware site/reviewer for a while, build a good reputation, then accept a big check from nVidia...
Posted on Reply
#62
Aquinus
Resident Wat-man
efikkan: The only segment where AMD currently competes is the low-end. People are not giving AMD a hard time for failing to produce a high-end contender, but because their mid-range cards are vastly inferior. The RX 580, RX Vega 56 and RX Vega 64 are all inferior to their counterparts from Nvidia, and it's not like they offer a major advantage to offset the large disadvantages; no, Nvidia's offerings are superior unless you have an extremely special use case for the card. At the moment Pascal offers 80-90% more performance per watt than Polaris and Vega, and Turing will push that further.
First off, it's more like 50% versus Vega and 80-90% for Polaris. You can look at TPU's own reviews to confirm that one... and it's not like you're putting this thing in a mobile device, it's a dedicated GPU for a tower. I'm not going to lose sleep over the equivalent of a penny out of my paycheck for the cost of electricity. The reality is that Vega doesn't perform that bad when you consider performance in games but, it's a very different story when you talk about compute and hey, look at that, nVidia is beefing up compute. Simply put, the hardware works and it's not that bad. I'm sorry it hasn't reached a status where it's able to kiss your boot but, not being the best doesn't equate to being junk.
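(Note for anyone following the back-and-forth on the numbers: a performance-per-watt gap is simply the ratio of each card's relative performance to its board power. The sketch below uses made-up placeholder values, not figures from TPU's reviews or anywhere else, purely to show how such a percentage is derived:)

```python
# Hypothetical illustration of how a performance-per-watt gap is computed.
# The relative_perf and board_power values are placeholders, not review data.

def perf_per_watt(relative_perf: float, board_power_w: float) -> float:
    """Simple efficiency metric: relative performance divided by board power."""
    return relative_perf / board_power_w

# Made-up example: card A at 100% relative performance drawing 180 W,
# card B at 95% relative performance drawing 290 W.
eff_a = perf_per_watt(100.0, 180.0)
eff_b = perf_per_watt(95.0, 290.0)

gap = (eff_a / eff_b - 1) * 100
print(f"Card A delivers {gap:.0f}% more performance per watt than card B")
```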
Posted on Reply
#63
efikkan
Aquinus: First off, it's more like 50% versus Vega and 80-90% for Polaris. You can look at TPU's own reviews to confirm that one... and it's not like you're putting this thing in a mobile device, it's a dedicated GPU for a tower. I'm not going to lose sleep over the equivalent of a penny out of my paycheck for the cost of electricity. The reality is that Vega doesn't perform that bad when you consider performance in games but, it's a very different story when you talk about compute and hey, look at that, nVidia is beefing up compute. Simply put, the hardware works and it's not that bad. I'm sorry it hasn't reached a status where it's able to kiss your boot but, not being the best doesn't equate to being junk.
No, it's pretty much in the 80-90% range, unless you underclock the card.

It's strange how efficiency only matters when it favors someone's side…
Posted on Reply
#64
Aquinus
Resident Wat-man
efikkan: No, it's pretty much in the 80-90% range, unless you underclock the card.

It's strange how efficiency only matters when it favors someone's side…
I wouldn't buy a Vega 64 to run at 1080p any more than I would buy a 1080 to run at 1080p.
Posted on Reply
#65
Vayra86
Aquinus: I wouldn't buy a Vega 64 to run at 1080p any more than I would buy a 1080 to run at 1080p.
Using a 1080 at 1080p, I can tell you right away it's not even close to overkill. I'm using the full 100% over here more often than not. GR: Wildlands can dive under 60 FPS with this card at max IQ, and so can many other games. Let alone high refresh rates.
Posted on Reply