
AMD Updates Roadmaps to Lock RDNA2 and Zen 3 onto 7nm+, with 2020 Launch Window

Joined
Jul 10, 2015
Messages
754 (0.22/day)
Location
Sokovia
System Name Alienation from family
Processor i7 7700k
Motherboard Hero VIII
Cooling Macho revB
Memory 16gb Hyperx
Video Card(s) Asus 1080ti Strix OC
Storage 960evo 500gb
Display(s) AOC 4k
Case Define R2 XL
Power Supply Be f*ing Quiet 600W M Gold
Mouse NoName
Keyboard NoNameless HP
Software You have nothing on me
Benchmark Scores Personal record 100m sprint: 60m
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Actual expectations for Polaris, Vega and Navi were all over the top.
Not for Navi. It beat expectations by quite a bit.

Of course your reasoning is that because AMD is in both the next XBox and PS5 they will dictate RT.
Of course.

Except that they were in there last round and it hasn't helped them at all.
Yeah, besides semi-custom revenue growing to about half of all revenue, and Sony & Microsoft spending tens of millions on AMD R&D that goes into non-console products as well.
Oh wait, you said it had NOT helped. Interesting.

Microsoft will dictate RT, and they already have with DirectX Ray Tracing (DXR).
Lol.

RTX is already compatible with it (and Vulkan's version as well), and whatever form of RT hardware AMD uses will have to be compatible with it as well. It will all come down to the design of the hardware and drivers, and at this point there is no reason to believe that AMD will have any advantage.
Oh, dear, oh dear. RTX being compatible with API that nV asked Microsoft to add, how surprising, lol.

AMD will largely influence what goes into consoles, and whether that sparks RT development or not depends largely on AMD.

It will be AMD setting the RT baseline, and it will again be AMD influencing what performance, and what flavor of RT, to push for. If they decide "nah, f*ck the Leather Man", so it will be. (See how it worked with FreeSync vs G-Sync.)
 
Joined
Apr 8, 2019
Messages
121 (0.06/day)
Yeah, besides semi-custom revenue growing to about half of all revenue, and Sony & Microsoft spending tens of millions on AMD R&D that goes into non-console products as well.
Oh wait, you said it had NOT helped. Interesting.

That's great and all, but the point I was making was that being in the consoles didn't help AMD set the direction for graphics features or performance at all.

Lol.


Oh, dear, oh dear. RTX being compatible with API that nV asked Microsoft to add, how surprising, lol.

AMD will largely influence what goes into consoles, and whether that sparks RT development or not depends largely on AMD.

It will be AMD setting the RT baseline, and it will again be AMD influencing what performance, and what flavor of RT, to push for. If they decide "nah, f*ck the Leather Man", so it will be. (See how it worked with FreeSync vs G-Sync.)

FreeSync's success had nothing to do with AMD being in the consoles. It was all about the fact that G-Sync modules added a couple of hundred dollars to monitors.

Do you really think that Microsoft is going to create a new set of ray tracing APIs for the consoles? No. Whatever solution AMD uses will have to work with DXR.
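For what it's worth, DXR support is exposed through an ordinary D3D12 feature query, so whatever RT hardware a vendor ships ends up reported behind the same API. A minimal C++ sketch, assuming an already-created ID3D12Device and a recent Windows SDK, error handling trimmed:

#include <d3d12.h>

// Returns true if the device exposes a DXR path. Works the same whether the
// hardware underneath is RTX or a future AMD design; DXR doesn't care.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    // The tier reads NOT_SUPPORTED on hardware/drivers without ray tracing.
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}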

Being in the consoles provided AMD with revenue, but it in no way has helped them gain a performance advantage over Nvidia. In fact, one could argue that it has stunted their ability to develop high-end GPUs, because they have had to spend so much of their engineering resources on the design and development of said console GPUs. We have all seen and heard the rumors that Navi was initially designed for the consoles first.

You are more than welcome to keep believing that AMD will set the direction of RT, but there is no historical evidence of that being the case.

Remember that Mantle was supposed to take over the world and usher in a whole new era of performance. Not so much.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.81/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Remember that Mantle was supposed to take over the world and usher in a whole new era of performance. Not so much.
You're forgetting the part where Mantle essentially became Vulkan which has had adoption by several games and projects. Vulkan is why I can use a project like DXVK to play games like Diablo 3 in Linux without a terrible performance hit. Even Doom's performance improvement using a Vulkan renderer over OpenGL was pretty significant. Same deal with DOTA, so it's not like there is no gain to be had.
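To illustrate the point: the Mantle-descended API is vendor-neutral and identical on Windows and Linux, which is exactly what makes a translation layer like DXVK viable. A minimal C++ sketch that just creates a Vulkan instance and counts capable GPUs (no extensions, error handling trimmed):

#include <vulkan/vulkan.h>
#include <cstdio>

int main()
{
    // Bare-minimum instance; the same code runs against AMD, Nvidia and Intel.
    VkApplicationInfo app = { VK_STRUCTURE_TYPE_APPLICATION_INFO };
    app.apiVersion = VK_API_VERSION_1_0;
    VkInstanceCreateInfo ci = { VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO };
    ci.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS)
        return 1;

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    printf("Vulkan-capable GPUs: %u\n", gpuCount);

    vkDestroyInstance(instance, nullptr);
    return 0;
}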
FreeSync's success had nothing to do with AMD being in the consoles.
FWIW, My Xbox One X plays very nicely with my 4k display and supports FreeSync. So, I wouldn't say it had nothing to do with it.
 
Joined
Oct 10, 2018
Messages
943 (0.42/day)
How did you lot turn this into an AMD/Nvidia debate?!? I have not bought an AMD product for half a decade, but they have finally gotten their act together. They are not ahead, but they have finally caught up with their competition. Yes, in order to do so they spent their die-shrink card whilst others still have it to use at will, but that is an issue for the future. Look at now! Things are great, we've got competition. i5 9600K vs Ryzen 5 3600X: now it's hard to choose; in the past it was a no-brainer. Should you buy a 300 USD GTX 1660 Ti, or for 40 USD extra get an RX 5700 with almost 50% better performance? These are good days, competition is great for us. Enjoy it.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,239 (4.67/day)
Location
Kepler-186f
Processor 7800X3D -25 all core ($196)
Motherboard B650 Steel Legend ($179)
Cooling Frost Commander 140 ($42)
Memory 32gb ddr5 (2x16) cl 30 6000 ($80)
Video Card(s) Merc 310 7900 XT @3100 core $(705)
Display(s) Agon 27" QD-OLED Glossy 240hz 1440p ($399)
Case NZXT H710 (Red/Black) ($60)
You're forgetting the part where Mantle essentially became Vulkan which has had adoption by several games and projects. Vulkan is why I can use a project like DXVK to play games like Diablo 3 in Linux without a terrible performance hit. Even Doom's performance improvement using a Vulkan renderer over OpenGL was pretty significant. Same deal with DOTA, so it's not like there is no gain to be had.

FWIW, My Xbox One X plays very nicely with my 4k display and supports FreeSync. So, I wouldn't say it had nothing to do with it.

And yet Navi still has issues with FreeSync working above 75Hz, according to a few Navi owners here :/ FreeSync and G-Sync are most important to me, as they really enhance the gaming experience. So until Navi can do it 100% at high refresh, I have to pass.
 
Joined
Apr 8, 2019
Messages
121 (0.06/day)
You're forgetting the part where Mantle essentially became Vulkan which has had adoption by several games and projects. Vulkan is why I can use a project like DXVK to play games like Diablo 3 in Linux without a terrible performance hit. Even Doom's performance improvement using a Vulkan renderer over OpenGL was pretty significant. Same deal with DOTA, so it's not like there is no gain to be had.

FWIW, My Xbox One X plays very nicely with my 4k display and supports FreeSync. So, I wouldn't say it had nothing to do with it.

Yes, Mantle became Vulkan, but it didn't set the world on fire and it isn't the leading graphics API. My point is that having their GPUs in both consoles this past round hasn't allowed AMD to control the direction of graphics technology. There is no reason to think that because their GPUs are in the next round that they will be able to control the direction of Ray Tracing. We are already seeing major developers adopt DXR in their engines and their games. And it will be the API for the next XBox. AMD is not going to be able to force Nvidia to scrap all their work on RTX.

As far as FreeSync and the Xbox One X, yes, it works great with a compatible monitor, but again, it isn't the reason for their success. The majority of console gamers have their consoles hooked up to TVs, which until very recently haven't had the option for VRR of any kind. It was the monitor and PC market that set that direction.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.81/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Yes, Mantle became Vulkan, but it didn't set the world on fire and it isn't the leading graphics API.
There are more titles that support Vulkan than DX12 you know. ;)

As for RT, I think it's a fad and I'm hoping that the hype-train will run out of steam one of these days.
 
Joined
May 3, 2018
Messages
2,881 (1.20/day)
I haven't seen any solid information whether the next iteration will be a refined Navi or their next generation. But if it turns out to be Navi 2x, as you mentioned, the scaling from 5700 to 5700 XT shows how little room there is, so a ~50% larger chip would have to be clocked lower than 5700 to retain decent efficiency. Whatever AMD launches in the Navi family, it will retain the same scaling characteristics. So for this reason, it matters if AMD launches another iteration of Navi or something which (hopefully) is a more efficient architecture. If AMD's lineup for 2020 consists of refreshed and possibly larger Navis, then it's not going to be good enough, which is why I have several times before stated that a second Navi in 2020 is just too late to the party.

Lastly, just to put it out there, a second iteration of Navi may very well be some sort of professional version too, but we have no details on that.

The slides Lisa Su presented earlier in July showed Navi+ launching in 2020 on the 7nm+ process node. Navi+ is supposedly a next-gen architecture, not a refinement of RDNA, which is a refinement of GCN. It would be disappointing if it's not Navi+ but just Navi on 7nm+ with a few tweaks and higher clocks. Navi+ hopefully gets ray tracing support.

I'm torn on updating to any AMD this year. The X570 boards are pretty crap, and I guess we won't see all-new designs until Zen 3 anyway. Navi is nice enough, but I want something more at 2080 levels. I'm torn between updating my old Ivy Bridge 3570K and GTX 1070 system to an R5 3600 and Navi 5700 XT, or waiting. R5 4600 + 6700 XT + B650 should be sweet (I'm just assuming names here).
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,239 (4.67/day)
Location
Kepler-186f
Processor 7800X3D -25 all core ($196)
Motherboard B650 Steel Legend ($179)
Cooling Frost Commander 140 ($42)
Memory 32gb ddr5 (2x16) cl 30 6000 ($80)
Video Card(s) Merc 310 7900 XT @3100 core $(705)
Display(s) Agon 27" QD-OLED Glossy 240hz 1440p ($399)
Case NZXT H710 (Red/Black) ($60)
There are more titles that support Vulkan than DX12 you know. ;)

As for RT, I think it's a fad and I'm hoping that the hype-train will run out of steam one of these days.

Nvidia is forcing RT down our throats for the foreseeable future whether we like it or not, but I agree it's a fad. I honestly think a lot of games that support it look better with it off. Also, I prefer high frame rates. This is another PhysX scam.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.81/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Nvidia is forcing RT down our throats for the foreseeable future whether we like it or not
nVidia shoves nothing down my throat. I vote with my wallet. :laugh:

Honestly, having the best performance is secondary to upholding some of my values.
 
Joined
Dec 28, 2012
Messages
3,884 (0.89/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
There are more titles that support Vulkan than DX12 you know. ;)

As for RT, I think it's a fad and I'm hoping that the hype-train will run out of steam one of these days.
It's still a drop in the bucket for the gaming industry. Literally 99.99% of games on the market are DX, not Vulkan.

I want AMD to succeed, but I'm not going to pretend they are going to dominate a market thanks to consoles. Remember, many PC games favor Nvidia over AMD despite being console ports, and Unreal Engine 4, one of the most widely used game engines on the market, favors Nvidia despite being optimized for consoles and used on a wide variety of AMD-powered platforms.

Anyone who is writing off Nvidia's ray tracing because AMD's ray tracing will be in the next-gen consoles is a fool. Nvidia still dominates the high end of PC gaming and has much closer ties to PC developers. Just as having 8 cores didn't magically make all games capable of taking advantage of said cores (as shown in the performance delta between the 6- and 8-core Ryzen chips), AMD RT being in consoles doesn't mean it will dominate PC gaming. That will only happen if AMD can also provide the high-end hardware and support needed to optimize game engines for that tech. AMD has always struggled with that last part.
 
Joined
Apr 8, 2019
Messages
121 (0.06/day)
There are more titles that support Vulkan than DX12 you know. ;)

As for RT, I think it's a fad and I'm hoping that the hype-train will run out of steam one of these days.

I'll make you a gentleman's bet. Ray tracing is only going to get more prevalent, especially as GPUs get more powerful.

What we have now isn't full-on ray tracing. It is a hybrid approach combining rasterization and a little bit of ray tracing. There is no denying that full-on ray tracing looks a lot better and more realistic than the best rasterization has to offer. If you need any proof of that, just look at the movies: all computer effects these days are done using ray tracing. I know we aren't close to that point yet, but if processing is offloaded to the cloud (Stadia and whatever MS is cooking up), then it does become a possibility in the future.
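To make "hybrid" concrete, a frame today looks roughly like the outline below. The function names are hypothetical stand-ins for illustration, not any real engine's API:

// Illustrative hybrid frame: rasterization still does the bulk of the work,
// with a thin ray-traced layer on top. All functions here are hypothetical.
void RasterizeGBuffer()    {}  // classic raster pass renders the scene
void TraceRays()           {}  // a few rays per pixel for reflections/shadows
void DenoiseRTOutput()     {}  // low sample counts need aggressive denoising
void CompositeAndPresent() {}  // merge RT results into the rastered image

void RenderFrame()
{
    RasterizeGBuffer();
    TraceRays();
    DenoiseRTOutput();
    CompositeAndPresent();
}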
 
Joined
Mar 23, 2016
Messages
4,841 (1.53/day)
Processor Core i7-13700
Motherboard MSI Z790 Gaming Plus WiFi
Cooling Cooler Master RGB something
Memory Corsair DDR5-6000 small OC to 6200
Video Card(s) XFX Speedster SWFT309 AMD Radeon RX 6700 XT CORE Gaming
Storage 970 EVO NVMe M.2 500GB,,WD850N 2TB
Display(s) Samsung 28” 4K monitor
Case Phantek Eclipse P400S
Audio Device(s) EVGA NU Audio
Power Supply EVGA 850 BQ
Mouse Logitech G502 Hero
Keyboard Logitech G G413 Silver
Software Windows 11 Professional v23H2
Besides possibly Apple, AMD will have the most 7nm design experience as the process matures. In many ways probably more than Apple, as they use 7nm for both CPU & GPU. Exciting times ahead!
Pretty sure Apple has plenty of experience with CPUs/GPUs on TSMC's 7nm process with the A12 SoC, and now the recently announced A13. There's a high chance the A13 is fabbed on 7nm+.

 
Joined
Jul 26, 2019
Messages
418 (0.21/day)
Processor R5 5600X
Motherboard Asus TUF Gaming X570-Plus
Memory 32 GB 3600 MT/s CL16
Video Card(s) Sapphire Vega 64
Storage 2x 500 GB SSD, 2x 3 TB HDD
Case Phanteks P300A
Software Manjaro Linux, W10 if I have to
Wow, AMD only seems to be gaining momentum. I think I'll upgrade CPU & GPU next year, can't wait for reviews!
 
Joined
Sep 28, 2012
Messages
980 (0.22/day)
System Name Poor Man's PC
Processor waiting for 9800X3D...
Motherboard MSI B650M Mortar WiFi
Cooling Thermalright Phantom Spirit 120 with Arctic P12 Max fan
Memory 32GB GSkill Flare X5 DDR5 6000Mhz
Video Card(s) XFX Merc 310 Radeon RX 7900 XT
Storage XPG Gammix S70 Blade 2TB + 8 TB WD Ultrastar DC HC320
Display(s) Xiaomi G Pro 27i MiniLED + AOC 22BH2M2
Case Asus A21 Case
Audio Device(s) MPow Air Wireless + Mi Soundbar
Power Supply Enermax Revolution DF 650W Gold
Mouse Logitech MX Anywhere 3
Keyboard Logitech Pro X + Kailh box heavy pale blue switch + Durock stabilizers
VR HMD Meta Quest 2
Benchmark Scores Who need bench when everything already fast?
Where is your logical reasoning here? So just because Nvidia isn't 100% perfect, we can ignore all problems from AMD? This sort of argumentation is childish.
And if you actually read my whole post, you would also see: even Nvidia's drivers are not perfect, so there is plenty of room for improvement.
Turing launched with at least 3-4 bugs which were quickly fixed, which is a lot from Nvidia, and is probably the "worst driver launch" they've done in many years, and I've criticized it plenty for that reason. But at least their drivers are stable once the initial bugs are ironed out.

Do I have to highlight that again? :laugh:

Yes, AMD's driver quality is nowhere close to Nvidia's unfortunately.

A few cards being DoA, and a failure rate of 2-5% is completely normal, and Turing is below that. Most of the reported problems were later dismissed as driver bugs (which were also fixed), and only a few cards were actually defective.

Claiming that one particular driver is so bad, while the other company's driver made your card dead :roll:

Luck of the draw. Bad cards do happen.
AMD/ATI cards I've had that died or quit working properly:
Radeon 9590 (dead)
Radeon X800 (dead)
Radeon X1900 XT (memory failed, visual corruption)
Radeon 5950 (UVD unit failed, system crashes)
Radeon Vega 64 (unstable, system crashes)
AMD/ATI cards I've had that work fine:
Radeon AIW 9700
Radeon 2900 Pro
Radeon 3870
Radeon 4850
Radeon 5950 (replacement)
Radeon 6970
Radeon 270
Radeon 390
Radeon VII
I run my cards at stock.

Read my previous post, mate; someone made a bold claim that a bad driver caused a card to be a dud :)

From what I understand, it was only 1-3% of all users who got bad cards, and we don't really have any official data on how many RTX users got screwed over. Also, I never had any experience with the RTX cards, so again, out of my data set... sorry you don't understand basic logic. :/

I have nothing to say; other members are already stating my point :)
If you claim to have years of experience, why don't you use your knowledge to sort things out?
 

Emu

Joined
Jan 5, 2018
Messages
28 (0.01/day)
My Nvidia and Intel system has given me 0 issues, and I own over 1000 games... /shrug. AMD seems to be crashing on lots of games, according to many Navi owners and the GamersNexus review.

Nvidia likes to break their drivers every so often. There have been quite a few times in the 12 months or so that I have owned my 2080 Ti where I have had to skip a driver version because it crashed in certain games. Let's also not forget the drivers that actually broke Windows completely earlier this year.
 
Joined
Jun 10, 2014
Messages
2,987 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
The slides Lisa Su presented earlier in July showed Navi+ launching in 2020 on the 7nm+ process node. Navi+ is supposedly a next-gen architecture, not a refinement of RDNA, which is a refinement of GCN. It would be disappointing if it's not Navi+ but just Navi on 7nm+ with a few tweaks and higher clocks.
Anything Navi will not be a new architecture; if it's a new architecture, it will get a different name.
I think Lisa Su, as usual, is being intentionally vague.

Navi+ hopefully gets ray tracing support.
Navi was not designed for ray tracing. Remember that Navi was supposed to launch early 2018, even before Turing. If a Navi family chip shows up with some kind of ray tracing support, it will be something primitive thrown in relatively late in the design process, and nothing integrated like Turing.

I'm torn on updating to any AMD this year. The X570 boards are pretty crap, and I guess we won't see all-new designs until Zen 3 anyway. Navi is nice enough, but I want something more at 2080 levels. I'm torn between updating my old Ivy Bridge 3570K and GTX 1070 system to an R5 3600 and Navi 5700 XT, or waiting. R5 4600 + 6700 XT + B650 should be sweet (I'm just assuming names here).
If your use case is primarily gaming, I suggest waiting one more cycle, and when you upgrade, buying something one tier up so it lasts longer. I generally recommend upgrading when you "need" more performance and there is something significantly better available.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.81/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Navi was not designed for ray tracing. Remember that Navi was supposed to launch early 2018, even before Turing. If a Navi family chip shows up with some kind of ray tracing support, it will be something primitive thrown in relatively late in the design process, and nothing integrated like Turing.
I'm not convinced that dedicated circuitry for RT is necessary to do RT well, even if that's the route nVidia went. The big issue I see with dedicated circuitry is that with most loads, that hardware is going to be inactive, which is a terrible waste of die space IMHO, even more so when you consider how big the 2080 Ti's die is: 754 mm² is a huge chip.
 
Joined
Jun 10, 2014
Messages
2,987 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
I'm not convinced that dedicated circuitry for RT is necessary to do RT well, even if that's the route nVidia went. The big issue I see with dedicated circuitry is that with most loads, that hardware is going to be inactive, which is a terrible waste of die space IMHO, even more so when you consider how big the 2080 Ti's die is: 754 mm² is a huge chip.
It depends on what degree of ray tracing you want to use. If you only want soft lighting and shadows (diffuse lighting), and can make do with few ray samples (like ~2 per 8x8 pixels or so), then yes, even current GPUs can do a decent job with OpenCL, CUDA etc. But if you want sharp reflections (specular lighting) to look good, then you need to spend ~50x the performance to get a decent result, which is something even RTX can't truly do yet.
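The ray-budget arithmetic behind that gap is easy to sanity-check. A toy C++ calculation at 1080p, using the ~2 rays per 8x8-pixel tile figure above (my own illustrative numbers, not anyone's official ones):

#include <cstdio>

int main()
{
    // Rays per frame at 1920x1080 under the two budgets discussed above.
    const long pixels  = 1920L * 1080L;      // ~2.07M pixels per frame
    const long diffuse = (pixels / 64) * 2;  // ~2 rays per 8x8 tile
    const long mirror  = pixels;             // at least 1 ray per pixel
    printf("diffuse-only: %ld rays/frame\n", diffuse);
    printf("1 ray/pixel : %ld rays/frame (~%.0fx more)\n",
           mirror, (double)mirror / (double)diffuse);
    return 0;
}

That is already a ~32x jump at a single ray per pixel; clean specular reflections need several samples per pixel plus bounces, which is where a figure like ~50x comes from.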

And BTW, unlike what a certain YouTuber claims, the rest of the GPU will not idle when tracing rays; it's not like one part is used for legacy games and one part for ray tracing. The RT cores only accelerate the tracing of the rays, they don't give you a finished picture by themselves. The SMs will be active all the time, plus the TMUs etc. too.
 

bug

Joined
May 22, 2015
Messages
13,773 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
I'm not convinced that dedicated circuitry for RT is necessary to do RT well, even if that's the route nVidia went. The big issue I see with dedicated circuitry is that with most loads, that hardware is going to be inactive which is a terrible waste of die space IMHO, even more when you consider how big of a die the 2080 Ti is as 754 mm² is a huge chip.
You're probably not convinced we need a GPU for video either and shudder at the thought of the resources that go unused while browsing the Internet :D

Edit: Just to be clear, as years go by, Turing will surely go down in history as a crude RTRT implementation. That much is certain. But since no hardware we have now is particularly suited to what these cores do (BVH handling), it is very possible their tasks cannot simply be folded onto traditional (or even enhanced) shaders, so the need for dedicated hardware will remain.
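For anyone wondering what "BVH handling" actually involves, it is essentially the loop below: slab-testing a ray against nested bounding boxes until you reach leaves whose triangles are worth intersecting. A minimal C++ sketch with an illustrative node layout, nothing like how a GPU really stores its BVH:

#include <algorithm>
#include <utility>
#include <vector>

struct Ray  { float o[3], d[3]; };                    // origin, direction
struct AABB { float lo[3], hi[3]; };                  // axis-aligned box
struct Node { AABB box; int left, right, triCount; }; // leaf if triCount > 0

// Classic "slab" test: does the ray hit the box within [0, tMax]?
bool hitAABB(const Ray& r, const AABB& b, float tMax)
{
    float t0 = 0.0f, t1 = tMax;
    for (int a = 0; a < 3; ++a) {
        float inv = 1.0f / r.d[a];
        float tn = (b.lo[a] - r.o[a]) * inv;
        float tf = (b.hi[a] - r.o[a]) * inv;
        if (inv < 0.0f) std::swap(tn, tf);
        t0 = std::max(t0, tn);
        t1 = std::min(t1, tf);
        if (t0 > t1) return false;                    // slabs don't overlap
    }
    return true;
}

// Depth-first traversal; RT cores accelerate exactly this kind of loop.
void traverse(const std::vector<Node>& bvh, int i, const Ray& r, float tMax,
              std::vector<int>& leavesToTest)
{
    const Node& n = bvh[i];
    if (!hitAABB(r, n.box, tMax)) return;             // prune whole subtree
    if (n.triCount > 0) { leavesToTest.push_back(i); return; }
    traverse(bvh, n.left,  r, tMax, leavesToTest);
    traverse(bvh, n.right, r, tMax, leavesToTest);
}

The shading that happens after a hit still runs on the ordinary shader cores, which is also why the rest of the GPU stays busy while rays are traced, as noted above.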
 
Joined
Aug 6, 2009
Messages
1,162 (0.21/day)
Location
Chicago, Illinois
How did you lot turn this into an AMD/Nvidia debate?!? I have not bought an AMD product for half a decade, but they have finally gotten their act together. They are not ahead, but they have finally caught up with their competition. Yes, in order to do so they spent their die-shrink card whilst others still have it to use at will, but that is an issue for the future. Look at now! Things are great, we've got competition. i5 9600K vs Ryzen 5 3600X: now it's hard to choose; in the past it was a no-brainer. Should you buy a 300 USD GTX 1660 Ti, or for 40 USD extra get an RX 5700 with almost 50% better performance? These are good days, competition is great for us. Enjoy it.

They have closed the gap but they definitely haven't caught up. Nvidia still has the fastest cards by a good clip.
 
Joined
Jun 2, 2017
Messages
9,157 (3.35/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
If everything goes as expected, the next Navi will be faster than the current iteration (I am sure that no one thought Navi would beat the Vega 64), and that can be nothing if not good. In terms of Zen 3, those chips will probably have a 5 GHz boost and a base clock somewhere in the 4.4 to 4.5 GHz range. This is all based on what we have already seen from AMD in terms of improvement from one step to the next.
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.55/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GBSSD 2x2TBGB|256GB sata|512GB nvme
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
It's not as different as you think.
Back then AMD had Athlon first (that brought the IPC fight to Intel). Then they had AthlonXP (dreamy overclocking). Then Athlon64 came (64bit and integrated memory controller). And then we had Athlon64 X2 (real multicore).
So they did have an engineering roadmap back then, too. They just left marketing and sales to chance.
Funny you should mention the real multicore Athlon64 X2, because this time around AMD brought the fight to Intel with a multi-die CPU :D.
Intel has tough times ahead: just being able to "print" 10nm/7nm CPUs is half of the problem; the other half is either getting them to clock 5GHz or introducing a much better architecture to compensate for lower clocks while the technology comes up to speed.
IMO Ryzen 2 is a letdown in my eyes clock-wise; I didn't believe the 5GHz rumors, but I was expecting at least 4.6GHz.
But I guess if people are happy with what Ryzen 2 offers, it leaves a lot of room for improvement for Ryzen 3.
 
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
...being in the consoles didn't help AMD set the direction for graphics features or performance at all.
Consoles set the baseline for game developers. I don't know if you are from the same planet as I am.

It was all about the fact that G-Sync modules added a couple of hundred dollars to monitors.
Sure, Joh, that chip's price was so high because it was really, really expensive to manufacture, and not because someone is really, really greedy.

AMD got its own version of Variable Refresh Rate into the DisplayPort and HDMI standards. End of story.
 