
CHOO CHOOOOO!!!!1! Navi Hype Train be rollin'

Well, we have been hearing about Navi for a while, just not from AMD. Huge difference. They mention it here and there as an afterthought; everything else has mostly come from rumors and "leaks".
And another thing, WCCF is as reliable as a 1980 Zastava Yugo. Don't give them too much credit.
I never give much credit to anybody. I'm the dude that believes it when he sees it. I know it's still assumptions, but their premise is intriguing and they must have something to support it. I want to buy an RVII, but maybe waiting a bit longer would be a good thing. What's the release date for Navi now? I think AMD will have a shot with Navi. I'll probably get pounded here for the old GCN crap, but everything can be improved. AMD must buckle up and finally deliver the performance to stand up to NV. Now's a good time for it. Hopefully they can pull it off.
 
I never give much credit to anybody. I'm the dude that believes it when he sees it. I know it's still assumptions, but their premise is intriguing and they must have something to support it. I want to buy an RVII, but maybe waiting a bit longer would be a good thing. What's the release date for Navi now? I think AMD will have a shot with Navi. I'll probably get pounded here for the old GCN crap, but everything can be improved. AMD must buckle up and finally deliver the performance to stand up to NV. Now's a good time for it. Hopefully they can pull it off.
Apparently Q3 this year. That's Navi 10, intended to replace Polaris and Vega.
 

First, you said it yourself: they are committed to enterprise. And enterprise only. Second, I would say look at recent Intel history and make an educated guess. Enthusiasts and "gamers" are basically an afterthought to Intel. Actually, a piggy bank for when times get tough, so they can poop out some half-assed product (more recently) and add $150-$500 to it. I can't say I blame them when it is that easy.

Add to that the fact that all the people they hired have been churning out GPUs that aren't really great at anything, and voilà! You have exactly nothing to be excited about. My educated guess says we don't have much to look forward to from them. Maybe a better Instinct, but what does that get us?
 
[Attached image: 9aj6wtl1b0x11.png, purported PS5 spec sheet]


Take with two teaspoonfuls of rock salt & distilled water, for (less) aftertaste :p
 
So, you didn't know that Navi 10 is the Polaris successor coming in 2019, and the Vega successor is Navi 20, which would launch in 2020? Those rumors are over a year old, not to be confused with the Radeon VII launch, which was a product to buy AMD time until Navi 20 is ready.
I don't pay attention to rumors, so it doesn't matter. What matters is that they're taking a long time and market prices are inflated due to lack of competition. The sooner the better.
 
The picture mentions DLSS by name (end of the first bullet).

I can't see Navi having tensor cores at all so the chances of deep learning anything are none. Arcturus might have tensor cores, not Navi.
 
The picture mentions DLSS by name (end of the first bullet).

I can't see Navi having tensor cores at all so the chances of deep learning anything are none. Arcturus might have tensor cores, not Navi.

TensorFlow support with GCN on Linux is quite poor. I experienced it first-hand when I still had the Fury X. Very few developers actually spend time developing for GCN-based cards. So yeah, I agree: without a dedicated tensor ASIC and good software support, DLSS-level denoising would be very hard on an AMD GPU.

Also R0h1t, DLSS has nothing to do with Ray Tracing. Not talking about Ray Tracing here at all.
 
It's from 2018; no idea whether it's fake or not, but it's possible given that RTX cards were released less than a year ago. Also, we don't know for sure what DLSS stands for in that slide.
 
...Volta has tensor cores, and it debuted in 2017. It took almost a year for that to evolve into Turing and for DLSS, which uses them, to be created. AMD is quite far behind in this area. And why would Sony want anything to do with it anyway? DLSS's only reason to exist is to cover up the fact that games are rendered at a lower resolution to hide the ray tracing performance drop.

Mentioning DLSS (an NVIDIA tech) in relation to AMD shows the author of said picture doesn't have a basic understanding of what it is, which calls into question the accuracy of all of it. Here are more glaring examples:
1) 6c/12t, when chiplets are 8c/16t. The leak suggesting 8c/8t makes a lot of sense for backward compatibility with the PS4 and PS4 Pro. 6c is going to create threading issues because two cores will carry a higher load than the rest (and could choke); see the quick sketch after this list.
2) Sony doesn't use anything off the shelf so why would they use Radeon Rays off the shelf? They likely have a custom raytracing implementation.
3) "post-processing of the buffers" let me put my pear "wut" expression on
4) "8K?" :roll:
5) these TFLOP/shader counts look like they're copied from Vega.
6) "14 GB available to developers" which is 2 GB when PS4 kept 3 GB to itself. They'll likely expand it, not contract it.
7) 2 TB, 2.5" HDDs retail for $85. I'm thinking either 3.5" HDD or, more likely, an SSD of unknown capacity. Probably use older, slower chips and buy them in bulk.
8) 802.11ax :laugh: ac at best
9) that last bullet is 100% BS.
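
A quick sketch of the threading concern in point 1 (purely illustrative; the 6-core figure is from the rumor, and the round-robin placement is my assumption): map a PS4 title's 8 hardware threads onto 6 physical cores and two of them end up double-loaded.

```python
# Hypothetical sketch: 8 legacy PS4 threads placed round-robin on 6 cores.
from collections import Counter

legacy_threads = 8   # PS4/PS4 Pro titles are tuned for 8 hardware threads
physical_cores = 6   # the rumored 6c layout from the leaked slide

placement = Counter(t % physical_cores for t in range(legacy_threads))
print(placement)
# Counter({0: 2, 1: 2, 2: 1, 3: 1, 4: 1, 5: 1})
# -> cores 0 and 1 carry twice the load of the rest, hence the choke risk.
```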
 
It's from 2018; no idea whether it's fake or not, but it's possible given that RTX cards were released less than a year ago. Also, we don't know for sure what DLSS stands for in that slide.


D~L~S~S

Deep Learning Super Sampling


https://www.nvidia.com/en-us/geforc...-new-technologies-in-rtx-graphics-cards/#dlss

Whoever made that pic is doing his/her/its best to fake it without even paying attention to all the acronyms.



Well, unless Sony is actually using a "DISCRETE LOGIC SOLVING SYSTEM" to control a nuclear power plant with the help of Lockheed Martin. TBH, with those BS specs running on GCN, it might actually need a nuclear power plant to power it lol.
https://trademarks.justia.com/870/97/dlss-87097534.html
 
So yeah, I agree: without a dedicated tensor ASIC and good software support, DLSS-level denoising would be very hard on an AMD GPU.

Just out of curiosity, what does this mean? Are you equating a pre-computed edge reconstruction filter applied after scanout to denoising of low pass RTRT?

Also R0h1t, DLSS has nothing to do with Ray Tracing. Not talking about Ray Tracing here at all.

Hence my confusion. What do you think DLSS is?
 
D~L~S~S

Deep Learning Super Sampling


https://www.nvidia.com/en-us/geforc...-new-technologies-in-rtx-graphics-cards/#dlss

Whoever made that pic is doing his/her/its best to fake it without even paying attention to all the acronyms.



Well, unless Sony is actually using a "DISCRETE LOGIC SOLVING SYSTEM" to control a nuclear power plant with the help of Lockheed Martin. TBH, with those BS specs running on GCN, it might actually need a nuclear power plant to power it lol.
https://trademarks.justia.com/870/97/dlss-87097534.html
Yup, this is ~300 W of GPU in a console
 
D~L~S~S

Deep Learning Super Sampling


https://www.nvidia.com/en-us/geforc...-new-technologies-in-rtx-graphics-cards/#dlss

Whoever made that pic is doing his/her/its best to fake it without even paying attention to all the acronyms.



Well, unless Sony is actually using a "DISCRETE LOGIC SOLVING SYSTEM" to control a nuclear power plant with the help of Lockheed Martin. TBH, with those BS specs running on GCN, it might actually need a nuclear power plant to power it lol.
https://trademarks.justia.com/870/97/dlss-87097534.html
You never know with AMD; however, if there is no DLSS equivalent here, one would hope the leakers could avoid such an obvious mistake. Who knows, frankly I'm just shoveling coal here.
Next stop ~ E3 :rockout:
 
Just out of curiosity, what does this mean? Are you equating a pre-computed edge reconstruction filter applied after scanout to denoising of low pass RTRT?

Hence my confusion. What do you think DLSS is?
He is right, it has nothing to do with ray tracing itself. It comes alongside RT to gain FPS by reducing the rendering resolution or image quality.
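
For a rough sense of that trade-off (back-of-the-envelope numbers, all assumed; real gains depend on the game and the upscale cost): frame rate scales roughly with the number of pixels shaded, so rendering at 1440p instead of 4K and upscaling buys a lot of FPS.

```python
# Back-of-the-envelope sketch: FPS gain from rendering fewer pixels and
# upscaling. Resolutions and the 40 FPS baseline are assumed, not measured.
def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)
internal  = pixels(2560, 1440)   # hypothetical internal render resolution

base_fps = 40.0                  # assumed native-4K frame rate
approx_fps = base_fps * native_4k / internal

print(f"internal render = {internal / native_4k:.0%} of native pixels")
print(f"~{approx_fps:.0f} FPS vs {base_fps:.0f} FPS (ignoring upscale overhead)")
```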

I can't see Navi having tensor cores at all so the chances of deep learning anything are none. Arcturus might have tensor cores, not Navi.
Why would Navi have tensor cores? It's not an NV product. There are other ways of supporting ray tracing. It doesn't need tensor cores, which are Nvidia-specific.

Yup, this is ~300 W of GPU in a console
I missed something here. Where does it say 300 W??
 
Yup, this is ~300 W of GPU in a console
Like the Xbox One X: they take a big chip and run it at low clocks, which translates to low wattage. The whole system will likely use less than 200 W, so about in line with the Xbox One X.
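
A minimal sketch of the wide-and-slow logic (all numbers invented for illustration, not actual console figures): dynamic power scales roughly with units × clock × voltage², and voltage has to rise with clock speed, so a bigger chip at lower clocks does the same work for less power.

```python
# Toy model: relative dynamic power ~ units * clock * voltage^2.
# Both design points below have the same units*clock throughput; the
# voltages are made-up values for illustration.
def rel_power(units, clock_ghz, volts):
    return units * clock_ghz * volts ** 2

narrow_fast = rel_power(units=36, clock_ghz=1.80, volts=1.10)
wide_slow   = rel_power(units=56, clock_ghz=1.157, volts=0.90)

print(f"narrow+fast: {narrow_fast:.1f}  wide+slow: {wide_slow:.1f}")
# wide+slow lands noticeably lower, which is why consoles downclock big chips.
```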

Why would Navi have tensor cores? It's not an NV product. There are other ways of supporting ray tracing. It doesn't need tensor cores, which are Nvidia-specific.
Tensor cores aren't for ray tracing, they're for AI, which AMD is way behind in. Arcturus is presumably the next architecture focused on Radeon Instinct, like Vega was.
 
[Attached image: 9aj6wtl1b0x11.png, purported PS5 spec sheet]


Take with two teaspoonfuls of rock salt & distilled water, for (less) aftertaste :p

Source? This to me looks fake as hell and it certainly isn't © Sony

Nah, this reads like some raging fan's wet dream, not reality.
 
Like the Xbox One X: they take a big chip and run it at low clocks, which translates to low wattage. The whole system will likely use less than 200 W, so about in line with the Xbox One X.


Tensor cores aren't for ray tracing, they're for AI, which AMD is way behind in. Arcturus is presumably the next architecture focused on Radeon Instinct, like Vega was.
To be more precise, tensor cores aren't exactly AI; they enable it by mixing precisions to finish work faster at the cost of accuracy, and they come along with RT to fill in the blanks RT can't complete on its own.
Besides, I still think tensor cores are NV-specific. As far as I remember, AMD is going to have something similar with Navi. They talked about mixing precisions, and they will not call it "tensor" for sure.
 
Yup, this is ~300 W of GPU in a console
I missed something here. Where does it say 300 W??
The rumored specs are effectively a 7nm Vega 56. 11 TFLOPS puts the clock at around 1500 MHz. It will not be 300 W; from the looks of it, it should stay at around 180-200 W. The performance of such a GPU would be equal to a Vega 64 at about 30% lower power consumption.
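
A quick check of that clock estimate (using Vega 56's 3584 shaders; peak FP32 rate is 2 ops per shader per clock via FMA):

```python
# Peak FP32 TFLOPS = 2 (FMA) * shaders * clock.
shaders = 3584          # Vega 56 shader count
target_tflops = 11.0    # the rumored figure

clock_mhz = target_tflops * 1e12 / (2 * shaders) / 1e6
print(f"required clock: {clock_mhz:.0f} MHz")   # ~1534 MHz, i.e. around 1500
```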
 
To be more precise, tensor cores aren't exactly AI; they enable it by mixing precisions to finish work faster at the cost of accuracy, and they come along with RT to fill in the blanks RT can't complete on its own.
Besides, I still think tensor cores are NV-specific. As far as I remember, AMD is going to have something similar with Navi. They talked about mixing precisions, and they will not call it "tensor" for sure.
Tensor cores are FP16*FP16+(FP16|FP32) matrix solvers. Deep Learning for dummies.
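
A toy emulation of that per-step operation in numpy (a sketch of the math only, obviously not how the hardware is implemented): inputs stored as FP16, products accumulated into FP32.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)).astype(np.float16)  # FP16 input tile
B = rng.standard_normal((4, 4)).astype(np.float16)  # FP16 input tile
C = np.zeros((4, 4), dtype=np.float32)              # FP32 accumulator

# D = A*B + C: multiply the FP16 inputs, accumulate in FP32.
D = A.astype(np.float32) @ B.astype(np.float32) + C
print(D.dtype, D.shape)   # float32 (4, 4)
```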
 
Rapid Packed Math is really simple: the FP32 FPUs can alternatively handle 2xFP16 in the same space/cycle.
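
A minimal illustration of the packing idea (numpy standing in for the hardware): two FP16 values share one 32-bit register, so one FP32-wide pass produces two FP16 results.

```python
import numpy as np

a = np.array([1.5, -2.25], dtype=np.float16)   # two FP16 lanes
b = np.array([0.5,  4.00], dtype=np.float16)

packed = a.view(np.uint32)[0]   # both halves fit in one 32-bit word
print(f"packed a = 0x{packed:08x}")

# One element-wise op handles both lanes at once, standing in for the
# single instruction the FP32 FPU would issue.
print(a * b + np.float16(1.0))  # two FP16 results per "cycle"
```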
 