
NVIDIA GeForce RTX 3090 Founders Edition Potentially Pictured: 3-slot Behemoth!

Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150MHz, CO -7,-7,-20(x6)
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
My pessimism has been on the right track more often than not though, when it comes to these predictions.

So far AMD has never shown us a major perf/W jump on anything GCN-based, but now they call it RDNA# and suddenly they can? Please. Tonga was a failure and that was all she wrote. Then came Polaris - more of the same. Now we have RDNA2, and they've already been clocking the 5700 XT out of its comfort zone to get the needed performance. And to top it off they felt the need to release vague 14Gbps BIOS updates that nobody really understood, during and after launch. You don't do that if you've got a nicely rounded, future-proof product.

I'm not seeing the upside here, and I don't think we can credit AMD with trustworthy communication around their GPU department. It's 90% left to the masses, and the remaining 10% is utterly vague until it hits shelves. 'Up to 50%'... that sounds like Intel's 'up to' Gigahurtz boost, and to me it reads 'you're full of shit'.

Do you see Nvidia marketing 'up to'? Nope. Not a single time. They give you a base clock and say a boost is not guaranteed... and then every gen we get a slew of GPUs that ALL hit beyond their rated boost speeds. That instills faith. It's just that simple. So far, AMD has not released a single GPU that was free of trickery - either with timed scarcity (and shitty excuses to cover it up; I didn't forget their Vega marketing for a second, it was straight-up dishonest in an attempt to feed hype), cherry-picked benches (and a horde of fans echoing benchmarks for games nobody plays), supposed OC potential (Fury X) that never materialized, supposed huge benefits from HBM (Fury X again - it fell off faster than the GDDR5-driven 980 Ti, which is still relevant with 6GB); the list is virtually endless.

Even in the shitrange they managed to make an oopsie with the 560D. 'Oops'. Wasn't that their core target market? Way to treat your customer base. Of course we both know they don't care at all. Their revenue is in the consoles now. We get whatever falls off the dev train going on there.

Nah, sorry. AMD's GPU division lost my last sliver of faith a few generations back. I don't see how or why they would suddenly provide us with a paradigm shift. So far, they're still late with RDNA as they always have been - be it version 1, 2 or 3. They still haven't shown us a speck of RT capability, only tech slides. The GPUs they have out are lacking in feature set even beyond RT. Etc etc ad infinitum. They've relegated themselves to followers, not leaders. There is absolutely no reason to expect them to leap ahead. Even DX12 Ultimate apparently caught them by surprise... hello? Weren't you best friends with MS for doing their Xboxes? Dafuq happened?

On top of that, they still haven't managed to create a decent stock cooler to save their lives, and they still haven't got the AIBs in line like they should. What could possibly go wrong, eh?

//end of AMD roast ;) Sorry for the ninja edits.
I don't disagree with the majority of what you're saying here, though I think you're ignoring the changing realities behind the past situations you describe vs. AMD in 2020. AMD PR has absolutely done a lot of shady stuff, has overpromised time and time again, and is generally not to be trusted until we have a significant body of proof to build trust on. Their product positioning and naming in China (like the 560D, the various permutations of "580", and so on) is also deeply problematic. But so far I don't think I've seen Dr. Su overpromise or skew results in a significant way - though I might obviously have missed or forgotten something - and the "up to 50% improved perf/W" claim comes from her. (Sure, you could debate the value of Cinebench as a measure of overall performance - I think it shows AMD in too good a light if seen as broadly representative - but at least it's accurate and representative of some workloads.) And despite the fundamental shadiness of promising maximums ("up to") rather than averages or baselines, there's at least the security that it must in some sense be true for AMD not to be sued by their shareholders. And given how early that was said relative to the launch of the GPUs, I would say any baseline or average promise would have been impossible to make.

Beyond that, most of what you describe is during the tenure of Koduri, and while it is obviously wrong to place the blame for this (solely) at his feet, he famously fought tooth and nail for near total autonomy for RTG, with him taking the helm for the products produced and deciding the direction taken with them. He obviously wasn't at fault for the 64 CU architectural limit of GCN, which crippled AMD's GPU progress from the Fury X and onwards, but he was responsible for how the products made both then and since were specified and marketed. And he's no longer around, after all. All signs point towards there having been some serious talking-tos handed out around AMD HQ in the past few years.

But beyond the near-constant game of musical chairs that is tech executive positions, the main change is AMD's fortunes. In 2015 they were near bankrupt, and definitely couldn't afford to splurge on GPU R&D. In 2020, they are riding higher than ever, with Ryzen carrying them to record revenues and profits. In the meantime they've shown with RDNA that even on a relatively slim budget (RDNA development must have started around 2016 or so, picking up speed around 2018 at the latest) they could improve things significantly, and now they're suddenly flush with cash, including infusions from both major high-performance console manufacturers. The last time they had that last part was pre-2013, when they were already struggling financially, and both console makers went (very) conservative in cost and power draw for their designs. That is by no means the case this time around. They can suddenly afford to build as powerful a GPU as they want to within the constraints of their architecture, node and fab capacity.

And as I mentioned, RDNA has shown that AMD has found a way out of the GCN quagmire - while 7nm has obviously been enormously beneficial in allowing them to get close to (5700 XT), match (5700, 5500 XT) or even beat (5600 XT) Nvidia's perf/W, it is by no means the main reason for this, as is easily seen by comparing the efficiency of the Radeon VII vs. even the 5700 XT. And with RDNA being a new architecture with a lot of new designs, it stands to reason that there are more major improvements to be made to it in its second generation than there were to GCN 1.whatever.
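To put rough numbers on that Radeon VII vs. 5700 XT point, here's a purely illustrative sketch. The board power and relative performance figures are ballpark assumptions (roughly 300W and ~3% higher performance for the Radeon VII vs. ~225W for the 5700 XT), not exact measurements:

```python
# Ballpark illustration of the Radeon VII vs. RX 5700 XT efficiency gap.
# Both chips are on the same 7nm node, so the gap is mostly architecture.
vii_perf, vii_watts = 1.03, 300  # Radeon VII: GCN on 7nm (assumed values)
xt_perf, xt_watts = 1.00, 225    # RX 5700 XT: RDNA on 7nm (assumed values)

efficiency_gain = (xt_perf / xt_watts) / (vii_perf / vii_watts)
print(f"RDNA perf/W advantage on the same node: {(efficiency_gain - 1) * 100:.0f}%")
# -> RDNA perf/W advantage on the same node: 29%
```

Even with generous rounding, a same-node gain of that size can't be attributed to the process.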

As for the Fury X falling behind the 980 Ti: not by much. Sure, the 980 Ti is faster, and by a noticeable percentage (and a higher percentage than at launch), but they're still firmly in the same performance class. The 980 Ti has "aged" better, but by a few percent at best.

So while I'm all for advocating pessimism - you'll find plenty of my posts here doing that, including on this very subject - in this case I think you're being too pessimistic. I'm not saying I think AMD will beat or even necessarily match Ampere either in absolute performance or perf/W, but there are reasons to believe AMD has something good coming, just based on firm facts: We know the XSX can deliver a 52CU, 12TF GPU and an 8c16t 3.6GHz Zen2 CPU in a console package, and while we don't know its power draw, I'll be shocked if that SoC consumes more than 300W - console power delivery and cooling, even in the nifty form factor of the XSX, won't be up for that. We also know the XSX runs at a relatively low clock speed with its 52 CUs thanks to the PS5 demonstrating that RDNA 2 can sustain 2.1 GHz even in a more traditional (if enormous) console form factor. We also know that even RDNA 1 can clearly beat Nvidia in perf/W if clocked reasonably (hello, 5600 XT!). What can we ascertain from this? That RDNA 2 at ~1.8GHz is quite efficient; that RDNA 2 is capable of notably higher clock speeds than RDNA 1, and that AMD is entirely capable of building a wider die than the RX 5700 - even for a cost-sensitive console.
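As a quick sanity check on those console figures - assuming RDNA 2 keeps the familiar 64 shaders per CU at 2 FLOPs (one FMA) per shader per clock, and using the XSX's announced ~1.825 GHz GPU clock - the 52 CU / 12 TF numbers line up:

```python
# Back-of-envelope peak-FP32 check for the Xbox Series X GPU.
# Assumes 64 shaders per CU and 2 FLOPs (one FMA) per shader per clock.
def gpu_tflops(cus, clock_ghz, shaders_per_cu=64, flops_per_clock=2):
    """Peak FP32 throughput in TFLOPS."""
    return cus * shaders_per_cu * flops_per_clock * clock_ghz / 1000

print(f"XSX: {gpu_tflops(52, 1.825):.2f} TFLOPS")  # -> XSX: 12.15 TFLOPS
```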
 
Joined
Mar 9, 2008
Messages
1,177 (0.19/day)
System Name KOV
Processor AMD 5900x
Motherboard Asus Crosshair Hero VIII
Cooling H100i Cappellix White
Memory Corsair 3600MHz 32GB
Video Card(s) RX 7900XT
Storage Samsung 970 evo 500gb, Corsair 500gb ssd, 500GB 840 pro & 1TB samsung 980pro. M2.SSD 960 evo 250GB
Display(s) ASUS TUF Gaming VG32VQ1B 165Hz, Dell S2721DGFA 27 Inch QHD 165Hz
Case Corsair 680x
Audio Device(s) ON Board and Hyper X cloud flight wireless headset and Bose speakers
Power Supply AX1200i Corsair
Mouse Logitech G502x plus lightspeed
Keyboard Logitech G915 TKL lightspeed and G13 gamepad
Software windows 11
Can't see me paying those prices. I upgrade every year, but now the prices are going to get stupid.
 
Joined
Mar 18, 2015
Messages
2,963 (0.83/day)
Location
Long Island
With Zen 3 around the corner, idk why anyone would buy an Intel CPU for a new build.

Well, 1, around the corner is not here... and 2... it's always best to choose your tools based upon what you need them to do and what jobs you do. Unfortunately the apps where AMD excels are not on a large % of people's PCs. For gamers, the 10400 / 10400F is faster in gaming than anything in AMD's entire lineup... hard to get your hands on one even with vendors selling well over MSRP. When I first looked at the $265 10600K versus the $250 3600X, they were in the same general price tier... now the 10600K is sometimes over $300, while the 3600X has dropped.

While I would agree it's not wise to make a choice without all cards being on the table yet, we can only make decisions based upon what is real. The only applications that matter are the ones each user is running, and how often. If you are rendering or doing game development most of your day... most definitely you'd be better off with AMD. But for what people actually do on an everyday basis... most folks will do better with Intel... even with many workstation-type or advanced apps like photo editing, video editing and software development. And even though many things favor Intel in testing (i.e. office suites), I'm ignoring them, as the test scripts ignore the real bottleneck: user input. I'm just going to use gaming + video editing; if we want to include an area where more cores are supposed to help, the most common app of this type would be video editing.

$400 Price Point - 10700K ($406) vs 3900X ($429) ..... All data based upon TPU's testing

Gaming: 10700K = 100% / 3900X = 93% (OC'd the 10700K picks up 1.3% / OC'd the 3900X was slower) .... 10700K is 7.5% faster ignoring the OC advantage

Video Editing: 10700K = 236.5 / 3900X = 252.2 (OC'd the 10700K = 228.6 / OC'd the 3900X was slower) .... 10700K is 6.6% faster ignoring the OC advantage

Stock Temps @ Max Load: 10700K = 61C / 3900X = 79C

$300 Price Point - 10600K ($278) vs 3700X ($290) ..... All data based upon TPU's testing

Gaming: 10600K = 100% / 3700X = 93.6% (OC'd the 10600K picks up 1.9% / OC'd the 3700X was 0.1% faster) .... 10600K is 6.4% faster ignoring the OC advantage

Video Editing: 10600K = 241.00 / 3700X = 253.40 (OC'd the 3700X was slower) .... 10600K is 5.1% faster ignoring the OC advantage

Stock Temps @ Max Load: 10600K = 56C / 3700X = 66C

$200 Price Point - 10400F ($182) vs 3600 ($175) ..... All data based upon TPU's testing

Gaming: 10400F = 100% / 3600 = 93.8% .... 10400F is 6.2% faster

Video Editing: 10400F = 267.40 / 3600 = 263.00 .... 3600 is 1.7% faster
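For what it's worth, here's a purely illustrative sketch of how the ratio-based "X% faster" figures above fall out of TPU's numbers. Gaming uses relative performance (higher is better), while video editing is render time in seconds (lower is better):

```python
# Relative-performance arithmetic behind the comparisons above.
def faster_by_score(a, b):
    """% by which score a beats score b (higher is better)."""
    return (a / b - 1) * 100

def faster_by_time(a, b):
    """% by which time a beats time b (lower is better)."""
    return (b / a - 1) * 100

print(f"{faster_by_score(100, 93):.1f}%")      # 10700K vs 3900X gaming -> 7.5%
print(f"{faster_by_time(236.5, 252.2):.1f}%")  # 10700K vs 3900X video -> 6.6%
```

Note that a couple of the figures above (6.4%, 6.2%) look like straight subtraction of the percentages rather than ratios; the ratio method is the more meaningful of the two.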

We were asked to do a build like this two weeks ago. The user didn't know which way he wanted to go, so we asked about frequency of usage and relative importance. It came down to, ***in his case***: what's more important, 6.2% faster gaming every night of the week, or faster rendering once or twice a week?

So there are many reasons why someone would do an Intel build... it will depend on what applications are being used. For a build dedicated to gaming (and other incidental stuff where speed is of no significance), there's no real debate... it is real hard to make a case for an AMD build, as Intel's $180 CPU is faster in gaming than anything in AMD's entire lineup, extending up to the $475 flagship. And that's "in general"... if you're exclusively into SIM games, you might go the other way.

No one buys a CPU 'cause it's faster in PowerPoint, and the fact that Intel is faster in Word and Excel isn't going to sway any buying decisions either. As an engineer, my primary workday apps are AutoCAD and spreadsheets, both of which favor Intel.

From AutoCAD Workstation Vendor Support
"In the case of AutoCAD, the majority of the software is only single threaded so it is only able to utilize a single core of the CPU. For this reason, our general recommendation when choosing a processor is to get the highest frequency. For current generation CPUs, that is Intel's Core i9 10900K or i7 10700K, both of which can boost to over 5.0GHz with a single core active. "

So the relevant question becomes...

a) What applications will your box have installed that benefit from one CPU versus another? Looking at TPU testing, I see about 7 or 8 that favor Intel... and others that are too close to matter.
b) What applications will your box have installed that benefit from one CPU versus another, that you use on an everyday basis? Looking at TPU, I see about 4 or 5 that favor Intel.
c) What applications will your box have installed that benefit from one CPU versus another, that you use on an everyday basis, and in which the user is not the primary bottleneck? Looking at TPU, I see just gaming, and that favors Intel.

I do have an engineering colleague who wants me to build him a dual-system box. His idea is to run a Threadripper for rendering jobs overnight while he plays games on the Intel side. But every time he's ready to pull the trigger, he hears about another "next big thing" and puts it off.

Saying there's no reason to buy [insert anything here] doesn't fit everybody. The best answer would be a hammer if you are a roofer... but an electrician would likely say a screwdriver, and an auto mechanic a wrench. Picking a tool is best done by picking the tool that best fits the application... and a CPU is just that. For gamers (outside of SIMs and a few others), that's going to be Intel. For renderers and game developers, that's going to be AMD... for CAD users, video editors, photo editors, etc., that's going to be Intel.
 
Joined
Oct 10, 2018
Messages
943 (0.42/day)
I do not mind or care what I buy. But when I buy it, I want it to work, and to have the peace of mind that if there is a driver issue, I know it will get fixed immediately. This is where Nvidia excels and won me as a consumer.
 
Joined
Oct 14, 2017
Messages
210 (0.08/day)
System Name Lightning
Processor 4790K
Motherboard asrock z87 extreme 3
Cooling hwlabs black ice 20 fpi radiator, cpu mosfet blocks, MCW60 cpu block, full cover on 780Ti's
Memory corsair dominator platinum 2400C10, 32 giga, DDR3
Video Card(s) 2x780Ti
Storage Intel S3700 400GB, Samsung 850 Pro 120GB, a cheap Intel MLC 120GB, and another even cheaper 120GB
Display(s) eizo foris fg2421
Case 700D
Audio Device(s) ESI Juli@
Power Supply seasonic platinum 1000
Mouse mx518
Software Lightning v2.0a
This is sad, but the bad news is that until AMD decides to drop the "anti-software" policy and stop looking for open-source suckers to write their drivers for free, this is not going to change :x
Yeah, sure, NV's policies suck, but they deliver, and consistently so, and in the end the user cares about what works, not broken half-made drivers, even if the hardware is better :x
You can see a similar example in Linux vs Windows: Linux fails every time in games. First it was WINE and now it's Proton, and each time it's the same story: nothing works, only problems.
And Windows spending millions on QA is, in the end, worth more to the user than the principles of Linux.
 
Joined
Jan 27, 2015
Messages
454 (0.13/day)
System Name Marmo / Kanon
Processor Intel Core i7 9700K / AMD Ryzen 7 5800X
Motherboard Gigabyte Z390 Aorus Pro WiFi / X570S Aorus Pro AX
Cooling Noctua NH-U12S x 2
Memory Corsair Vengeance 32GB 2666-C16 / 32GB 3200-C16
Video Card(s) KFA2 RTX3070 Ti / Asus TUF RX 6800XT OC
Storage Samsung 970 EVO+ 1TB, 860 EVO 1TB / Samsung 970 Pro 1TB, 970 EVO+ 1TB
Display(s) Dell AW2521HFA / U2715H
Case Fractal Design Focus G / Pop Air RGB
Audio Device(s) Onboard / Creative SB ZxR
Power Supply SeaSonic Focus GX 650W / PX 750W
Mouse Logitech MX310 / G1
Keyboard Logitech G413 / G513
Software Win 11 Ent
Holy cr*p! The card is huge. In a small-ish case, this thing is going to split the entire internal space in half and completely block the airflow in between. What's the TDP?
 
Joined
Mar 10, 2010
Messages
11,878 (2.20/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360 EK extreme rad + 360 EK slim, all push; CPU EK Supremacy, GPU full cover, all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung U28E850R 28" 4K FreeSync main, Dell secondary
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
This is sad, but the bad news is that until AMD decides to drop the "anti-software" policy and stop looking for open-source suckers to write their drivers for free, this is not going to change :x
Yeah, sure, NV's policies suck, but they deliver, and consistently so, and in the end the user cares about what works, not broken half-made drivers, even if the hardware is better :x
You can see a similar example in Linux vs Windows: Linux fails every time in games. First it was WINE and now it's Proton, and each time it's the same story: nothing works, only problems.
And Windows spending millions on QA is, in the end, worth more to the user than the principles of Linux.
And a comment on topic? At best you got six words in.

People need to read the title more around here; that might calm some flame wars, wtaf.
 
Joined
Aug 9, 2006
Messages
1,065 (0.16/day)
System Name [Primary Workstation]
Processor Intel Core i7-920 Bloomfield @ 3.8GHz/4.55GHz [24-7/Bench]
Motherboard EVGA X58 E758-A1 [Tweaked right!]
Cooling Cooler Master V8 [stock fan + two 133CFM ULTRA KAZE fans]
Memory 12GB [Kingston HyperX]
Video Card(s) constantly upgrading/downgrading [prefer nVidia]
Storage constantly upgrading/downgrading [prefer Hitachi/Samsung]
Display(s) Triple LCD [40 inch primary + 32 & 28 inch auxiliary displays]
Case Cooler Master Cosmos 1000 [Mesh Mod, CFM Overload]
Audio Device(s) ASUS Xonar D1 + onboard Realtek ALC889A [Logitech Z-5300 Spk., Niko 650-HP 5.1 Hp., X-Bass Hp.]
Power Supply Corsair TX950W [aka Reactor]
Software This and that... [All software 100% legit and paid for, 0% pirated]
Benchmark Scores Ridiculously good scores!!!
Preordered x2!

:rockout:
 