
AMD Fast-tracks 7nm "Navi" GPU to Late-2018 Alongside "Zen 2" CPU

Aren't we mixing 2 things here? Or maybe 3?

1) Ray tracing (RT) is a basic rendering technique. It's been around for decades and is fundamental - not useless or unusable, as one of the AMD fanboys claims. :) (There's a toy sketch of the core idea right after this list.)
2) Real-time ray tracing (RTRT) is... RT in real time ;-), i.e. fast enough (whatever that means).
It's been around for a while, but used for previews - not final renders. Previews are greatly simplified - they ignore some materials and some effects. Also the resulting live render is usually low-res and under 30fps.
3) RTRT in games means it has to be efficient enough to process all effects, at high resolution (1080p+) and at frame rates acceptable for gaming, i.e. maybe 1080p@60fps, maybe 4K@30fps...
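To make (1) concrete: at its core, a ray tracer just shoots a ray through each pixel, finds the nearest thing it hits, and shades that. A toy Python sketch (one hard-coded sphere, hit/miss only - purely illustrative, nothing like how RTX actually implements it):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance t to the nearest hit, or None if the ray misses the sphere."""
    # Solve |origin + t*direction - center|^2 = radius^2 (quadratic in t,
    # with a = 1 because the direction is normalized).
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None                      # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0     # nearer of the two roots
    return t if t > 0 else None

# One primary ray per pixel through a simple pinhole camera; '#' marks a hit.
WIDTH, HEIGHT = 40, 20
for py in range(HEIGHT):
    row = ""
    for px in range(WIDTH):
        dx, dy = px / WIDTH - 0.5, py / HEIGHT - 0.5
        n = math.sqrt(dx * dx + dy * dy + 1.0)
        t = ray_sphere_hit((0, 0, 0), (dx / n, dy / n, 1.0 / n), (0, 0, 3), 1.0)
        row += "#" if t is not None else "."
    print(row)
```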

You're talking about a general-purpose implementation, i.e. what standard GPU cores do.
Nvidia used an ASIC and it's just way faster - just like tensor cores are way faster for neural networks.

Everything else you've said is more or less correct.
If one wants to combine RTRT with 4K@60fps, then doing that on GPGPU is 10 years away from now. But on an ASIC it should be possible within 1-2 generations, i.e. 4 years tops.
But thanks to RTX cards, you don't have to wait 10 years. For a mere $1200 :) you can already make your games look as if it's 2028 (just at 1440p tops).
And when you buy your next RTX card in 2021 for another $1200, it should be OK for 4K@60fps. :)
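A back-of-the-envelope check of why (Python; the 10 GRays/s number is Nvidia's own marketing figure for the 2080 Ti, so treat it as an assumption, not a measurement):

```python
# How many rays per pixel does a ~10 GRays/s budget leave at various targets?
# (10 GRays/s is Nvidia's advertised figure for the RTX 2080 Ti.)
GIGARAYS_PER_SEC = 10e9

for label, w, h, fps in [("1080p@60", 1920, 1080, 60),
                         ("1440p@60", 2560, 1440, 60),
                         ("4K@60",    3840, 2160, 60)]:
    rays_per_pixel = GIGARAYS_PER_SEC / (w * h * fps)
    print(f"{label}: ~{rays_per_pixel:.0f} rays/pixel")

# ~80 at 1080p@60, ~45 at 1440p@60, ~20 at 4K@60. Offline renders typically
# spend hundreds to thousands of rays per pixel, which is why games use
# hybrid rendering plus denoising instead of full path tracing.
```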

There's just no way around it. AMD will have to respond with a similar tech, ignore RTRT ("Who needs realism? We're so romantic!") or magically make Navi 4x faster than Vega. :)

No need to argue semantics with me. You know exactly what I'm getting at ;) All RT that is not done on the GPU in real time is not the ray tracing we're talking about when it comes to RTX / DXR; we already have pre-cooked lighting, and that is what any kind of non-realtime RT boils down to. It's the same as saying 'AI' when in fact it's nothing more than lots of lines of code and data to cover every possibility.
 
No need to argue semantics with me. You know exactly what I'm getting at ;) All RT that is not done on the GPU in real time is not the ray tracing we're talking about when it comes to RTX / DXR; we already have pre-cooked lighting, and that is what any kind of non-realtime RT boils down to
Well... I'm very fond of strict definitions. :-)
Just trying to point out that RTRT can be used for both professional 3D work and gaming.
For professional stuff it's already going on and RTX will just speed things up.
For gaming it wasn't really possible until now. And won't be possible for a while without an ASIC.

For me RTX is important. You see... I'm getting old and I already struggle to find time for casual gaming a few times a week. So the idea that I could get RTRT in games 2-3 years from now instead of 10 is big news.
And, as you can see, I'm a proud owner of a 1050. I paid $150 last summer and I just couldn't find a reason to buy anything more expensive. I'm fine with 1080p and this cheap GPU covers all the games I want to play. Witcher 3 is the most demanding, and I can easily run it at 40-50 fps with decent image quality.

But for me RTRT changes everything. Would I pay $1000 for these new RTX cards? No fu... way.
But would I pay $500 for a card that does RTRT and VR in... let's say... 2022? You bet I would. And now it suddenly became possible. :-)

Also, I'd expect RTX to speed up normal (non-live) renders as well. Wonder if that's going to happen.
saying 'AI' when in fact it's nothing more than lots of lines of code and data to cover every possibility.
Oh... this is not true and especially painful for me... but also way off topic. :-)
 
Well... I'm very fond of strict definitions. :)
Just trying to point out that RTRT can be used for both professional 3D work and gaming.
For professional stuff it's already going on and RTX will just speed things up.
For gaming it wasn't really possible until now. And won't be possible for a while without an ASIC.

For me RTX is important. You see... I'm getting old and I already struggle to find time for casual gaming a few times a week. So the idea that I could get RTRT in games 2-3 years from now instead of 10 is big news.
And, as you can see, I'm a proud owner of a 1050. I paid $150 last summer and I just couldn't find a reason to buy anything more expensive. I'm fine with 1080p and this cheap GPU covers all the games I want to play. Witcher 3 is the most demanding, and I can easily run it at 40-50 fps with decent image quality.

But for me RTRT changes everything. Would I pay $1000 for these new RTX cards? No fu... way.
But would I pay $500 for a card that does RTRT and VR in... let's say... 2022? You bet I would. And now it suddenly became possible. :)

Also, I'd expect RTX to speed up normal (non-live) renders as well. Wonder if that's going to happen.

Oh... this is not true and especially painful for me... but also way off topic. :)
Wow, a 1050, and you're in every GPU thread salivating over Nvidia while knocking AMD.

I am not concerned with what you like though, or with your perspective, any more than you are with other people's. 1080p died two years ago for me and no one's dragging me back there for one graphics feature. So you may be fine with ray tracing (ray-based shadows and reflections, not true RT) at up to 1440p (really, like that's not the 2080 Ti and out of most people's reach anyway), but some are not. To make me happy with RTX I'd need two or three 2080 Tis.

But I'm not expecting RTX to work with SLI or its replacement either, so I'm out of RTX for at least three years, IMHO.

Even before reviews.

And as for game AI, I think he meant the current implementation of AI in games, i.e. just big lists of if-thens in effect, not neural nets, which clearly haven't graced an AAA game yet.
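In other words, something like this (toy Python, hypothetical NPC logic) rather than any learned model:

```python
# Classic game "AI": an ordered list of if-thens evaluated every tick.
# Nothing is learned; behaviour is exactly what the designer wrote down.
def distance(a, b):
    return ((a["x"] - b["x"]) ** 2 + (a["y"] - b["y"]) ** 2) ** 0.5

def npc_decide(npc, player):
    if npc["hp"] < 20:
        return "flee"                     # self-preservation rule checked first
    if distance(npc, player) < 2:
        return "melee_attack"
    if distance(npc, player) < 15 and npc["ammo"] > 0:
        return "shoot"
    return "patrol"                       # default when no rule fires

npc = {"hp": 50, "ammo": 3, "x": 0, "y": 0}
player = {"x": 5, "y": 5}
print(npc_decide(npc, player))            # -> "shoot" (distance ~7.1)
```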
 
Nice derailing of the thread, lads :)

Imho, AMD might release Vega 20 as an iteration for mixed use, as it did with the Frontier Edition Vega 10. That would help them sell it at a premium to cover the added cost of the 32GB HBM2. My hope is that it manages to close the gap to the fastest Nvidia card by then to <10%, at sensible power consumption. It could easily be sold for $1000, especially since it would probably be a limited batch. Navi will come in 2019 for sure, and the first iteration will succeed the Polaris-class GPUs.
 
Nice derailing of the thread, lads :)

Imho, AMD might release Vega 20 as an iteration for mixed use, as it did with the Frontier Edition Vega 10. That would help them sell it at a premium to cover the added cost of the 32GB HBM2. My hope is that it manages to close the gap to the fastest Nvidia card by then to <10%, at sensible power consumption. It could easily be sold for $1000, especially since it would probably be a limited batch. Navi will come in 2019 for sure, and the first iteration will succeed the Polaris-class GPUs.
You, sir, largely share my opinion. However, I'm expecting the geldings of 7nm Vega to get mixed up with half-working stacks of HBM: 3560-ish cores, 16GB HBM2, massive price too. But as you say, a Frontier version first, possibly in 2018 but very late on.
 
Nowhere in the article does it say Navi is coming late 2018. As a matter of fact, Navi wasn't mentioned at all.
The GPU that's going to launch later this year is a 7nm Vega 20 (or whatever it's called), which is not a consumer product.
Read the last paragraph...
 
What do you mean, "to you"? You see what you want to see rather than what is written? It is never once said that Navi is coming to mainstream consumers (or any consumers) in late 2018, or anything of the sort.

Wow, did you read the entire paragraph? I was talking about the title of the article being misleading, not the article itself, in response to someone else's comment. Right after that sentence, it said we know it's coming out in 2019! Read the whole thing next time, lol. I think you just read one sentence and didn't pay attention to my entire post, lol.
 
That's just all in your head, not in anyone else's.
Maybe it's just in your head, then - the not-so-accurate idea that you know what is in other people's heads.

Vega will do RT, and potentially at good perf (relatively speaking, vs. Turing). I don't think future cards will have any issue.

As useless as RT already is, it'll be unusable on the 2070, so it's not like anything midrange needs it.
If Vega can do RT, I bet we will see a GameWorks library that fixes that. But I don't believe that RT will be useless on 2070 cards. If you can choose various levels of RT quality, the 2070 will be down to low or medium. Then we will get some more articles about how useful G-Sync can be with a 2070 when enabling RT for smooth graphics. Or, who knows, we could see 720p coming back to life and start seeing articles about how much better 720p with RT looks compared to typical, non-RT 1080p.
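If those quality levels work like other scalable effects, a preset would mostly just dial rays per pixel up or down. A rough Python sketch with made-up numbers:

```python
# Hypothetical RT quality presets: fewer rays per pixel = lower quality, less cost.
PRESETS = {"low": 1, "medium": 2, "high": 4, "ultra": 8}   # rays/pixel (made up)

def rays_per_frame(width, height, rays_per_pixel):
    # Cost scales linearly with resolution and with rays per pixel.
    return width * height * rays_per_pixel

for name, rpp in PRESETS.items():
    per_frame = rays_per_frame(1920, 1080, rpp)
    print(f"{name:>6}: {per_frame / 1e6:.0f}M rays/frame, "
          f"{per_frame * 60 / 1e9:.2f} GRays/s at 60 fps")
```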

I totally will if performance improves - not that it's bad, but well...
I can understand that changing platforms is difficult when you are used to one.
 
I can understand that changing platforms is difficult when you are used to one.

Nah, it's not, not for me at least. I don't care which company; I just care about performance. If Ryzen poops on what Intel is making, I'll gladly buy that; if it doesn't, I'll buy Intel. Same goes for video cards. It's not like I'll buy anything to do some company a favor.
 
Wow, a 1050, and you're in every GPU thread salivating over Nvidia while knocking AMD.
Why exactly is my GPU important?
ray-based shadows and reflections, not true RT
LOL. What is "true RT"?
And as for game AI, I think he meant the current implementation of AI in games
I don't know if he meant game AI at all. There is a world outside gaming. Google it! :-)

And man... I'm looking forward to your comments when AMD releases a GPU with a similar RT solution. :-D
I'll be there to remind you of all this rubbish and how 7nm Navi was just around the corner in late 2018.
 
Why exactly is my GPU important?

LOL. What is "true RT"?

I don't know if he meant game AI at all. There is a world outside gaming. Google it! :)

And man... I'm looking forward to your comments when AMD releases a GPU with a similar RT solution. :-D
I'll be there to remind you of all this rubbish and how 7nm Navi was just around the corner in late 2018.
I'm not the one blowing smoke up a vendor's and a tech's ass; I merely stand by the "await reviews" stance.

Your GPU doesn't directly matter, but it does make your stance on AMD, Polaris and Vega strange, since you don't buy into them, yet are very, very vocal about something you only *read* about.

I know of the other AI; I have Google's AI in my hand right now, obviously. And you know he and I are right: modern/present games have strictly list-based AI, not neural-net-based. We're talking consumer gamer tech, so who's bringing up pro use-case AI here? Not me, so I meant purely game AI.

Finally, in this thread I mentioned nothing like Navi in 2018; I said possibly a prosumer 7nm Vega. I'm an optimist, but I wouldn't bet on that either.
 
I bet Nvidia pays TPU to make these false claims so AMD would get an even worse reputation in the end. Very bad. I'm considering using some other tech sites instead of this one.
 
Read the last paragraph...
Yeah, and? Where does it mention that that particular GPU, releasing later this year, will be Navi?
 
I bet Nvidia pays TPU to make these false claims so AMD would get an even worse reputation in the end. Very bad. I'm considering using some other tech sites instead of this one.

Another conspiracy nut AMD sycophant. Welcome! You'll fit right in.
 
That's not how capitalism works. You have to create a product to fill a need, then consumers buy it.

If AMD doesn't make competitive GPUs, nobody will buy them, and they won't make money. At some point the division would get new leadership or be bought by another company, and the products reinvigorated.

Vega 56/64 didn't sell due to supply shortages. If AMD had simply made a 3072/4096-core Polaris card and released it back in 2016, it would have sold well and made AMD money. AMD didn't get sales because they chose to go with cards that were difficult to make, trying to pull a 3DFX.
 
This is what ATI did: a new process node with a new small chip, and a refresh. They managed to match Nvidia's performance with a much smaller chip, and beat them on performance per dollar.

I would rather have a cheap GPU that delivers 60 fps at 1080p with the eye candy maxed out, or 4K with enough eye candy to make it worthwhile.
 
Why exactly is my GPU important?

LOL. What is "true RT"?

I don't know if he meant game AI at all. There is a world outside gaming. Google it! :)

And man... I'm looking forward to your comments when AMD releases a GPU with a similar RT solution. :-D
I'll be there to remind you of all this rubbish and how 7nm Navi was just around the corner in late 2018.

They have had RT for a while.
 
This is what ATI did: a new process node with a new small chip, and a refresh. They managed to match Nvidia's performance with a much smaller chip, and beat them on performance per dollar.
The famous 3870.
Apropos the smaller chip, Vega 20 is about 2x smaller than 2080Ti.
But I don't think the yields will be great, and that's probably why AMD decided against a gaming Vega 20. Then again, who knows... maybe they surprise us later in the year. Maybe they push Navi to early '19. Time will tell.
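For scale (using the commonly reported die sizes, roughly 331 mm² for Vega 20 vs. 754 mm² for the 2080 Ti's TU102):

```python
# Commonly reported die sizes, in mm^2 (approximate).
vega20_mm2, tu102_mm2 = 331, 754
print(f"TU102 is ~{tu102_mm2 / vega20_mm2:.1f}x the die area of Vega 20")  # ~2.3x
```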
 
Well... I'm very fond of strict definitions. :)
Just trying to point out that RTRT can be used for both professional 3D work and gaming.
For professional stuff it's already going on and RTX will just speed things up.
For gaming it wasn't really possible until now. And won't be possible for a while without an ASIC.

For me RTX is important. You see... I'm getting old and I already struggle to find time for casual gaming a few times a week. So the idea that I could get RTRT in games 2-3 years from now instead of 10 is big news.
And, as you can see, I'm a proud owner of a 1050. I paid $150 last summer and I just couldn't find a reason to buy anything more expensive. I'm fine with 1080p and this cheap GPU covers all the games I want to play. Witcher 3 is the most demanding, and I can easily run it at 40-50 fps with decent image quality.

But for me RTRT changes everything. Would I pay $1000 for these new RTX cards? No fu... way.
But would I pay $500 for a card that does RTRT and VR in... let's say... 2022? You bet I would. And now it suddenly became possible. :)

Also, I'd expect RTX to speed up normal (non-live) renders as well. Wonder if that's going to happen.

Oh... this is not true and especially painful for me... but also way off topic. :)
I swear, every time I come to read this thread it tells me you are paid either by Intel or Nvidia, or both, to be their personal fanboi. Your arguments are flawed because of that.
 
They have had rt for awhile.
AMD supports RT via ProRender - I've mentioned it earlier. It's a nicely written platform, it works on everything (it's OpenCL-based), and it gives AMD's raw processing power a decent advantage (Vega matches the 1080Ti - unlike in games).
But you just can't match an ASIC in this regard. And if you're not very into 3D rendering, then just think about crypto mining. :-)
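The mining analogy in numbers (ballpark, order-of-magnitude figures from memory, not benchmarks):

```python
# Ballpark, order-of-magnitude figures only (both numbers are rough guesses).
gpu_sha256  = 1e9    # ~1 GH/s: high-end GPU brute-forcing SHA-256
asic_sha256 = 14e12  # ~14 TH/s: a 2018-era Bitcoin ASIC miner
print(f"ASIC advantage: ~{asic_sha256 / gpu_sha256:,.0f}x")   # ~14,000x
```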

The famous 3870.
Apropos the smaller chip, Vega 20 is about 2x smaller than 2080Ti.
I can draw you an even smaller theoretical chip if you want. :-)
2080Ti is here, it works and it's already traveling towards your favourite store.
My favourite store says they can deliver 22 MSI RTX 2080Ti Gaming by 03/10. It was "at least 30" last time I checked. So at least 8 people pre-ordered already (for 5800 PLN ~= 1570 USD).
By comparison, the same store sold 3 (three!!!) Asus Vega 64 Strix since launch.
But I don't think the yields will be great, and that's probably why AMD decided against a gaming Vega 20. Then again, who knows... maybe they surprise us later in the year. Maybe they push Navi to early '19. Time will tell.
Navi could be an interesting architecture, but they've built the idea around 7nm. 7nm might just not happen fast enough. And the yields... oh my... the yields!
It's the HBM2 thing happening all over again. AMD wants to build an advantage using more recent, unproven and poorly available tech and they end up caught in all kinds of weird traps.

Now 7nm is the mystical saviour - praised by AMD believers in every comment (just like HBM2 a year ago). We'll see how it turns out this time. :-)
 
Also, I'd expect RTX to speed up normal (non-live) renders as well. Wonder if that's going to happen.
It does. For production stuff I would assume the API they use is OptiX, but the point and the underlying technology are the same.


If Vega can do RT, I bet we will see a GameWorks library that fixes that.
No, we won't. AMD will have their own API that DXR will use, and that'll work. When game devs have implemented straight-up GameWorks stuff, then all bets are off, of course. But even then AMD will help them out in that regard :)
 
AMD supports RT via ProRender - I've mentioned it earlier. It's a nicely written platform, it works on everything (it's OpenCL-based), and it gives AMD's raw processing power a decent advantage (Vega matches the 1080Ti - unlike in games).
But you just can't match an ASIC in this regard. And if you're not very into 3D rendering, then just think about crypto mining. :)


I can draw you an even smaller theoretical chip if you want. :)
2080Ti is here, it works and it's already traveling towards your favourite store.
My favourite store says they can deliver 22 MSI RTX 2080Ti Gaming by 03/10. It was "at least 30" last time I checked. So at least 8 people pre-ordered already (for 5800 PLN ~= 1570 USD).
By comparison, the same store sold 3 (three!!!) Asus Vega 64 Strix since launch.

Navi could be an interesting architecture, but they've built the idea around 7nm. 7nm might just not happen fast enough. And the yields... oh my... the yields!
It's the HBM2 thing happening all over again. AMD wants to build an advantage using more recent, unproven and poorly available tech and they end up caught in all kinds of weird traps.

Now 7nm is the mystical saviour - praised by AMD believers in every comment (just like HBM2 a year ago). We'll see how it turns out this time. :)
You really are a hype bandit. Is your name Susan? Because you seem to like Emmerdale proportions of drama.

Mystical saviour, tut. I'll be here to point out that YOU alone said that.
 
It does. For production stuff I would assume the API they use is OptiX, but the point and the underlying technology are the same.


No, we won't. AMD will have their own API that DXR will use, and that'll work. When game devs have implemented straight-up GameWorks stuff, then all bets are off, of course. But even then AMD will help them out in that regard :)

That's not my quote :P
 
Thats not how capitalism works. You have to create a product to fill a need, then consumers buy it.
Capitalism is creating a need for a product consumers don't need.
 
Mystical saviour, tut. I'll be here to point out that YOU alone said that.
No problem with that. I'm not changing my opinions. I've criticized Vega for its HBM2 dependency and I feel comfortable criticizing this whole 7nm nonsense.
You know how this ends, right? Nvidia will give us consumer-grade 7nm GPUs before AMD does (I mean available products, not fairy tales).

Actually, there's still a decent probability that Intel delivers their 10nm whatever-lake lineup before 7nm Zen 2, and that would be really funny. ;-)
 