
AMD Radeon RX 7900 XTX

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,056 (2.48/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | LG 24" IPS 1440p
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
Faster than a 4080 for less, and the ray tracing is plenty good enough in the games where it doesn't murder the framerate. It's about where we expected it, given that AMD's reveal last month was always going to be cherry-picked games. Performance may increase as drivers get updated for the new compiler, but I don't think that matters: if the $999 price is real, the cheapest 4080 is almost $1,400 and the XTX is already faster today.

For the games where the 7900 XTX struggles with ray tracing, so do the 4080 and 3090 Ti. I'm with @W1zzard here: ray tracing is the future, but for now we're all stuck in the present.
I think the 4080's suggested pricing is soft from Nvidia, as they wait to see where the chips fall once AMD releases theirs. So I'm not putting a whole lot of weight on the idea that the 7900 XTX is cheaper; maybe for its first week, assuming any of the rumors that Nvidia is planning price drops later this month are real.
 
Joined
Nov 16, 2022
Messages
9 (0.01/day)
System Name "Budget" cave machine
Processor Intel i5-13600kf
Motherboard ASUS Prime Z790-A WIFI
Cooling Arctic Liquid Freezer II 240 | 4x Noctua NF-P12 redux-1700 PWM
Memory Corsair Vengeance 2x16gb 5600MHz
Video Card(s) EVGA FTW3 ULTRA GAMING GeForce RTX 3070 Ti 8 GB
Storage WD_BLACK SN850 500GB (for boot/system drive)
Case Corsair Airflow 4000D
Power Supply Thermaltake ToughPower GF3 1000W
See ya in 5-7 years, GPU companies. Maybe by then you'll come up with something that's actually worth getting for mid/higher-mid tier users.

So much hype and shitting on NVIDIA, and in the end there's no mic drop. Just a "standard" mid/higher-mid tier card with bloated prices/margins.
 
Joined
Dec 28, 2012
Messages
4,144 (0.94/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
Nvidia, just two years ago, released the GT 1030 with DDR4 RAM under the exact same name as the GDDR5 1030! They did the same thing just three weeks ago with the RTX 3060 8GB: it's a cut-down card from the actual 3060, but you won't find that info anywhere! They are literally selling a 3050 under the same name as a 3060!

Should I mention the dirty tactics they play on reviewers and how they try to bully them into positive reviews? Maybe we should talk about all of their naming schemes, and even undercutting their AIB partners for more profit!

How about right now, and their issues with the melting 16-pin connector?

How about when their driver bricked people's GPUs? Or the driver that left so many people unable to boot into Windows?

Again, ALL of the Nvidia crap and BS, the incompetence, greed, scams and whatnot, never gets brought up! Nvidia is literally ten times worse than AMD in terms of bad drivers, bad products, misleading products, scams, false narratives, false advertising, anti-consumer fraud, monopolistic tendencies, etc...
And there it is, right on cue! Pure whataboutism to justify AMD releasing an upper-mid-range part at halo-tier pricing.

Pro tip: copium does not a competitive GPU brand make. AMD needs to be competitive to make a profit; you cannot charge premium prices for second-tier products.
And your utter BS about a $200 7900 XTX is actually insane. Even if you count JUST the production cost of a 7900 XTX or RTX 4090, that alone is over $300! Have you forgotten about R&D, testing, marketing, transportation, driver and feature development, packaging, paying royalties to various groups and consortiums, profit-margin leeway for AIB partners, etc...?

You can NOT claim that AMD or Nvidia can sell their cards at $250. The 4 nm and 5 nm process nodes are extremely expensive; the nodes are new and therefore more error-prone, which means fewer good GPUs come from a single wafer than on a more mature process; the dies are generally bigger; etc...
Bruh, BRUH :wtf:



Did you just take that $250 joke seriously? :roll: :roll: :roll: LMFAO
Literally everything you said in your post is complete and utter FUD!
Legend says if you say FUD five times, the crypto bros will visit you in your sleep and give you a free JPEG.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,428 (1.30/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RGB Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12's
Benchmark Scores 4k120 OLED Gsync bliss
'Unacceptable RT performance'? That's quite something. First of all, I guess you consider all 3000-series cards 'unacceptable' in RT too, since they offer similar performance there.
Well, that bar was set 2 years ago; we expected more 2 years later. Strange.
Nvidia didn't improve on RT performance
Well, when you dig a little deeper, it's plainly evident they did, as in @birdie's subsequent post.
Wait, aren't you the same "birdie" that is always trashing AMD and kissing Nvidia's behind at Phoronix?
Wait, aren't you the same "Neo Morpheus" that is always trashing Nvidia and kissing AMD's behind at TechSpot?

Welcome to TechPowerUp!
 
Joined
Jan 14, 2019
Messages
13,946 (6.31/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
Well, that bar was set 2 years ago; we expected more 2 years later. Strange.

Well, when you dig a little deeper, it's plainly evident they did, as in @birdie's subsequent post.
If the previous gen was, let's say, 40% slower with RT enabled vs disabled, and the current gen is also 40% slower, then there was no improvement, was there? From this point of view, I see no improvement on the RT front from either AMD or Nvidia. Increasing the number of RT cores together with the number of more traditional shader cores is not an improvement.
 
Joined
Jun 5, 2018
Messages
244 (0.10/day)
I wish AMD would put more effort into the RT and creator side of things. This is where Nvidia commands its premium, and without competition there, prices for both companies will slot on top of each other rather than compete directly.

It would have been awesome if one of the chiplets was dedicated silicon just for RT and multimedia; call it something cool like "the reality engine" and make it so good that it actually beats Nvidia's offering.

Right now the GPU industry just feels meh:
- monster cards that won't fit in your PC and will break your mobo
- no level playing field in the RT and creator arena
- 600 W power cables with an incinerate function
- min $1k after tax for anything new
- resources focused on upscaling and fake frames rather than RT and raw power

In the end, I feel like Nvidia and AMD are having a private party, and I was not invited. That kinda summarizes my vibe with these launches.
 
Joined
Apr 30, 2020
Messages
1,048 (0.60/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X3D
Motherboard MSI MEG X570 ACE
Cooling Corsair H150i Capellix
Memory Corsair Vengeance Pro RGB 3200 MHz 32 GB
Video Card(s) 2x Dell RTX 2080 Ti in SLI
Storage Western Digital SATA 6 Gb/s SSD 500 GB + Fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+ 1600 W
Mouse Corsair Scimitar
Keyboard Corsair K55 Pro RGB
If the previous gen was, let's say, 40% slower with RT enabled vs disabled, and the current gen is also 40% slower, then there was no improvement, was there? From this point of view, I see no improvement on the RT front from either AMD or Nvidia. Increasing the number of RT cores together with the number of more traditional shader cores is not an improvement.
It varies: in some games it's 3-5% faster, in others it's 1-3% slower.

Doubling the instruction rate to two instructions per clock, double what RDNA 2 had, is pointless if they doubled the shaders too. They left a single, lone RT core that needs a lot more instructions for the new BVH traversal stuff, but gave it barely any more. The shaders sucked up all that extra instruction throughput per clock; it seems like wasted instructions.
 
Joined
Jan 14, 2019
Messages
13,946 (6.31/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
I wish AMD would put more effort into the RT and creator side of things. This is where Nvidia commands its premium, and without competition there, prices for both companies will slot on top of each other rather than compete directly.

It would have been awesome if one of the chiplets was dedicated silicon just for RT and multimedia; call it something cool like "the reality engine" and make it so good that it actually beats Nvidia's offering.

Right now the GPU industry just feels meh:
- monster cards that won't fit in your PC and will break your mobo
- no level playing field in the RT and creator arena
- 600 W power cables with an incinerate function
- min $1k after tax for anything new
- resources focused on upscaling and fake frames rather than RT and raw power

In the end, I feel like Nvidia and AMD are having a private party, and I was not invited. That kinda summarizes my vibe with these launches.
I completely agree. The only thing you left out is Nvidia's complete disregard of the entry level, and AMD's "okay, fine, here's something, just leave me alone" edition 6400.

It varies: in some games it's 3-5% faster, in others it's 1-3% slower.

Doubling the instruction rate to two instructions per clock, double what RDNA 2 had, is pointless if they doubled the shaders too. They left a single, lone RT core that needs a lot more instructions for the new BVH traversal stuff, but gave it barely any more. The shaders sucked up all that extra instruction throughput per clock; it seems like wasted instructions.
Exactly. Meanwhile, Nvidia has the same-looking shader cores with the same-looking RT cores as before. One step forward and one step back from AMD, no change from Nvidia.

It seems I made a good choice buying the 6750 XT a couple weeks ago. This generation is pretty lacklustre from both companies. Okay, we have a bit more performance, but so what?
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,428 (1.30/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RGB Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12's
Benchmark Scores 4k120 OLED Gsync bliss
If the previous gen was, let's say, 40% slower with RT enabled vs disabled, and the current gen is also 40% slower, then there was no improvement, was there?
If that were what was happening, sure, there would be no improvement. And in some games that's what we see, but in others, especially newer ones and certainly Portal RTX, the difference in RT perf outstrips the increase in raster perf; Ada absolutely has stronger RT than Ampere when and if the extra capability is leveraged. W1zzard's testing suite doesn't show much of this, but examples are out there.
 
Joined
Jan 14, 2019
Messages
13,946 (6.31/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
If that were what was happening, sure, there would be no improvement. And in some games that's what we see, but in others, especially newer ones and certainly Portal RTX, the difference in RT perf outstrips the increase in raster perf; Ada absolutely has stronger RT than Ampere when and if the extra capability is leveraged. W1zzard's testing suite doesn't show much of this, but examples are out there.
Except that Portal RTX isn't a "game" in the traditional sense. It's a tech demo made by Nvidia for Nvidia cards.

The other examples in the post are purely RT benchmarks measured against Ampere. That's not how you measure the improvement in RT.

1. You need a game.
2. You measure performance with RT on, then with RT off.
3. You take the difference as a percentage.
4. You compare that percentage with the last generation.
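To make that method concrete, here's a minimal Python sketch; the frame rates are made up purely to illustrate the arithmetic and aren't taken from any review:

```python
def rt_hit(fps_rt_off: float, fps_rt_on: float) -> float:
    """Fraction of RT-off performance retained when RT is enabled."""
    return fps_rt_on / fps_rt_off

# Hypothetical frame rates for one game, previous gen vs current gen
last_gen = rt_hit(fps_rt_off=100, fps_rt_on=60)   # keeps 60%, i.e. 40% slower
this_gen = rt_hit(fps_rt_off=160, fps_rt_on=96)   # keeps 60%, i.e. 40% slower

# Higher absolute frame rates, but the same relative hit: by this metric
# there is no RT improvement, just more raw hardware.
print(f"last gen keeps {last_gen:.0%}, this gen keeps {this_gen:.0%}")
print(f"relative RT improvement: {this_gen / last_gen - 1:+.0%}")
```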
 
Joined
Sep 17, 2019
Messages
506 (0.26/day)
@W1zzard, first off I want to say thank you so very much for the reviews of both video cards. It confirms that THIS is the website for tech, not YouTube and their questionable vloggers (you all know where I stand on almost all of the so-called techies, so I won't go there right now). As stated before, Silicon Valley is in my back yard. I see the people who do and create these things.

The information that I gathered before your review is close to what you have reported. You really work hard to give us reliable information. Heh, I should know: I was a technology reporter doing similar stuff 20+ years ago, so I kind of know how important, and at times stressful, it is to make sure your data is correct and helpful to us... the masses.

In this day and age of "talking heads", questionable reporting on just about everything, and gimmicks to get people to buy garbage, it is so damned good that I can come to a site to get RELIABLE information.

I thank you for your hard work, and I hope others take the time to say thanks as well.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,428 (1.30/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RGB Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Except that Portal RTX isn't a "game" in the traditional sense. It's a tech demo made by Nvidia for Nvidia cards.

The other examples in the post are purely RT benchmarks measured against Ampere. That's not how you measure the improvement in RT.
From where I'm sitting, it's precisely a way to measure improvement in RT performance. And I disagree with dismissing those results, which frankly are valid in context.

For argument's sake, let's try it your way: there are games where it's a wash, broadly equal to Ampere, and there are ones that show a reasonable difference, even without alterations to game code to leverage Ada's improvements like Portal RTX has (at least they haven't said so, IIRC).


Marvel's Guardians of the Galaxy
3090 keeps 48.4% of its RT-off performance with RT on
4090 keeps 52.4% of its RT-off performance with RT on, an ~8% better result than Ampere

Cyberpunk 2077
3090 keeps 45.5% of its RT-off performance with RT on
4090 keeps 54.2% of its RT-off performance with RT on, an ~19% better result than Ampere
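For anyone wondering where the ~8% and ~19% come from, they're the ratio of those retained-performance figures; a quick check in Python, using only the numbers quoted above:

```python
# Fraction of RT-off performance retained with RT on, per the figures above
games = {
    "Guardians of the Galaxy": {"3090": 0.484, "4090": 0.524},
    "Cyberpunk 2077":          {"3090": 0.455, "4090": 0.542},
}

for game, kept in games.items():
    print(f"{game}: {kept['4090'] / kept['3090'] - 1:+.1%} vs Ampere")
# Guardians of the Galaxy: +8.3%
# Cyberpunk 2077: +19.1%
```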
 
Joined
Jun 14, 2017
Messages
21 (0.01/day)
I wish AMD would put more effort into the RT and creator side of things. This is where Nvidia commands it's premium, and without competition there, prices for both companies will slot on top of each other rather than compete directly.

It would have been awesome if one of the chiplets was a dedicated silicon just for RT and multimedia, call it something cool like "the reality engine" and make it so good that it actually beat Nvidia's offering.

Right now the GPU industry feels just Meh..
- monster cards that won't fit you PC and will break your mobo
- no level playing field in RT and creator arena
- 600w power cables with incinerate function
- min $1k after tax for anything new
- focusing resources for upscaling and fake frames rather than RT and raw power

In the end, I feel like Nvidia and AMD are having a private party but I was not invited. That kinda summarizes my vibe with these launches.
I think your reality has been distorted by the news-hype cycle.
It's true that the only high-end cards with reasonable power consumption and size are the 3080, 7900 XT, 7900 XTX, and maybe the 6900 XT.
But if you cap frame rates at 100 fps, a 4080 only uses 170 watts, which is extremely good, and only the 6800 XT can compete on "Quiet".
It may be a gigantic, oversized 3.5-slot lump in your computer, but in joules per rendered frame the 4080 is second to NONE.
And RT really doesn't matter. It's just a FUD-hype technique by Nvidia.
Ever heard of ANYONE making a kill shot thanks to a puddle reflection? No? Of course not!
NVENC and OpenCL matter, and AMD is finally matching Nvidia on OpenCL with the 7900 XTX.
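The joules-per-frame figure is just watts divided by frames per second; a back-of-the-envelope check, assuming the 170 W at a 100 fps cap cited above:

```python
def joules_per_frame(watts: float, fps: float) -> float:
    # 1 W = 1 J/s, so (J/s) / (frames/s) = J per frame
    return watts / fps

# 4080 capped at 100 fps, drawing 170 W (figures from above)
print(joules_per_frame(170.0, 100.0))  # 1.7 J per rendered frame
```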

I have bought two high-performance video cards this year (3080, 3070 Ti), and both times it was hell and exasperation.
One was from the Newegg Shuffle during card rationing, where I won a much-hated 3070 Ti for ONLY 50% over MSRP.
The other was 30 days before Xmas, when I won a NEW 3080 on Amazon for ONLY 15% over MSRP.
These companies have to stop lying with lowball/fake MSRPs; I think the FTC should step in.

Listen, both Nvidia and AMD are now using the same semiconductor foundries. Since Nvidia has the engineers from SGI and 3dfx who INVENTED the 3D gaming industry, did you expect AMD to beat them on price AND performance? Really? Now that EVERYONE is using the same chip factories, the designs are probably not that different anymore! Going forward, you get what you pay for. A 30 TFLOPS 3080 for $800 is a screaming deal compared to a 5 TFLOPS RX 580 costing $300 just 3-4 years ago. I know, because I have bought both of them. The 30 TFLOPS card is 6x faster for 2.67x the money. It may not follow Moore's law, but hopefully some competition from Intel will help. (Although I feel like Raja Koduri is an idiot and AMD is better off without him; it was a brilliant move to dump him on Intel's graphics division...)
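To make that value comparison explicit, here's the same arithmetic as a few lines of Python, using the rounded TFLOPS and price figures above:

```python
# Throughput and price, as quoted above
cards = {
    "RX 580 (3-4 years ago)": {"tflops": 5.0,  "price": 300.0},
    "RTX 3080":               {"tflops": 30.0, "price": 800.0},
}

for name, c in cards.items():
    gflops_per_dollar = c["tflops"] * 1000 / c["price"]
    print(f"{name}: {gflops_per_dollar:.1f} GFLOPS per dollar")
# RX 580: 16.7, RTX 3080: 37.5, i.e. 6x the throughput for 2.67x the
# money works out to roughly 2.25x the compute per dollar.
```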
 
Joined
Jun 16, 2019
Messages
393 (0.19/day)
System Name Cyberdyne Systems Core
Processor AMD Sceptre 9 3950x Quantum neural processor (384 nodes)
Motherboard Cyberdyne X1470
Memory 128TB QRAM
Video Card(s) CDS Render Accelerator 4TB
Storage SK 16EB NVMe PCI-E 9.0 x8
Display(s) LG C19 3D Environment Projection System
Power Supply Compact Fusion Cell
Software Skysoft Skynet
Nice to see improvements, but they've been working on this for a few years and it comes out the door with 100 W idle on dual monitors... The hell? Not as big an improvement over the 6900 XT as I was hoping, and RT is still a fair bit behind current NV, but at least they're closing the gap. At the moment, though, all prices are just daft. I thought my £660 3080 was expensive when I got it, but compared to today's price for performance 2 years later, maybe it wasn't that bad. Not that I plan on upgrading any time soon anyway; the way prices are going, I'll probably still be using this 3080 in 5 years.
 
Joined
Jan 14, 2019
Messages
13,946 (6.31/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
From where I'm sitting, it's precisely a way to measure improvement in RT performance. And I disagree with dismissing those results, which frankly are valid in context.
Those results measure performance in general, not the improvement in RT core performance specifically. Of course they're valid, just not in this context. You can't cram 60% more RT cores onto a GPU die and brag about your RT cores being improved, because they're not; there's just more of them.

For argument's sake, let's try it your way: there are games where it's a wash, broadly equal to Ampere, and there are ones that show a reasonable difference, even without alterations to game code to leverage Ada's improvements like Portal RTX has (at least they haven't said so, IIRC).


Marvel's Guardians of the Galaxy
3090 keeps 48.4% of its RT-off performance with RT on
4090 keeps 52.4% of its RT-off performance with RT on, an ~8% better result than Ampere

Cyberpunk 2077
3090 keeps 45.5% of its RT-off performance with RT on
4090 keeps 54.2% of its RT-off performance with RT on, an ~19% better result than Ampere
A good point. I guess there is some improvement in some cases. Just not enough (improvement and cases), imo.
 
Joined
Dec 22, 2011
Messages
3,890 (0.81/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,428 (1.30/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RGB Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Those results measure performance in general, not the improvement in RT core performance specifically. Of course they're valid, just not in this context.
I still disagree; it's measuring like-for-like products in like-for-like testing, when we know the 4090 is on average about 60-70% faster than a 3090 in rasterization, but roughly 2x faster in RT testing. I'm happy to leave it there though; if you think those results lack merit in this context, that is your prerogative.
A good point. I guess there is some improvement in some cases. Just not enough (improvement and cases), imo.
I agree it's not as large as I would have hoped, but I get the impression lessening that hit is quite the uphill battle for both camps. Gotta inch forward every time.
 
Joined
Jul 13, 2016
Messages
3,450 (1.10/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB, 4x 15.36TB Micron 9300 Pro, 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Looks like another generation where I'll be keeping my 1080 Ti. Neither AMD's nor Nvidia's cards are competing on price; at this point that's painfully clear. Instead, AMD and Nvidia price around each other, ensuring they both extract maximum profit while customers are left wondering how much they'll have to spend.


I didn't watch the video, but that title is terrible clickbait.
 
Joined
Jan 11, 2013
Messages
1,237 (0.28/day)
Location
California, unfortunately.
System Name Sierra
Processor Core i5-11600K
Motherboard Asus Prime B560M-A AC
Cooling CM 212 Black RGB Edition
Memory 64GB (2x 32GB) DDR4-3600
Video Card(s) MSI GeForce RTX 3080 10GB
Storage 4TB Samsung 990 Pro with Heatsink NVMe SSD
Display(s) 2x Dell S2721QS 4K 60Hz
Case Asus Prime AP201
Power Supply Thermaltake GF1 850W
Software Windows 11 Pro
Alright, it's been decided. I put in an offer on a used Dell OEM 6900 XT on eBay. I want that specific card because it's a true dual-slot card, so I can retain my PCIe x1 front-panel USB-C adapter. If that offer is accepted by tomorrow morning, when the 7900 series launches, great, I got a 6900 XT. If not, I will retract my offer and buy the 7900 XTX; yay for credit cards. Too bad for my front-panel USB-C port at that point, LOL. Like I commented on the XT review, I honestly really wanted a 7900 XT, but the price/performance is a joke compared to the XTX. If it were $799 or $849 I would have gone for it, but you lose over 15% performance for a 10% discount, which is NOT cool with me.
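The XT-vs-XTX value math really is that lopsided. A quick sketch of it, assuming the $999 and $899 launch prices implied by that 10% discount, and the ~15% performance gap mentioned above:

```python
# Launch prices and relative performance, per the comparison above
xtx = {"price": 999.0, "perf": 1.00}
xt  = {"price": 899.0, "perf": 0.85}   # "over 15%" slower, rounded to 15%

# Perf-per-dollar of the XT relative to the XTX
value = (xt["perf"] / xt["price"]) / (xtx["perf"] / xtx["price"])
print(f"7900 XT offers {value:.2f}x the perf-per-dollar of the XTX")  # ~0.94x
# A 10% discount that buys ~6% worse value; at $799-849 the XT would
# roughly match or beat the XTX on value instead.
```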
 
Joined
Dec 30, 2021
Messages
400 (0.35/day)
There's something weird going on with the 4090 results in this review. It is getting lower frame rates at 4K in a lot of games than it got with a 5800X in the original 4090 review. What's the explanation for this?
 
Joined
Nov 25, 2011
Messages
175 (0.04/day)
Location
Australia
I'm going to wait for better drivers and a 3D V-Cache version of this GPU, hopefully out by Northern Hemisphere summer 2023.
 
Joined
Dec 28, 2012
Messages
4,144 (0.94/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
I completely agree. The only thing you left out is Nvidia's complete disregard of the entry level, and AMD's "okay, fine, here's something, just leave me alone" edition 6400.


Exactly. Meanwhile, Nvidia has the same-looking shader cores with the same-looking RT cores as before. One step forward and one step back from AMD, no change from Nvidia.

It seems I made a good choice buying the 6750 XT a couple weeks ago. This generation is pretty lacklustre from both companies. Okay, we have a bit more performance, but so what?
Oh God, don't get me started on the total lack of low-end stuff. I went with an RX 560 LP years ago instead of a 1050 Ti, and felt the 1650 wasn't enough of an upgrade.

What do I get? No cards from Nvidia that fit in the 75 W TDP required for low-profile cards, Intel's Arc A380 being a mess and requiring BIOS settings my old OptiPlex media machine doesn't have, and AMD throwing out the laughingstock RX 6400 when they have mobile GPUs like the 6650M or 6700S that would have done the job FAR better as 75 W desktop parts.

My only hope now is that some Chinese brand takes the aforementioned mobile GPUs and makes a low-profile model out of them.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,428 (1.30/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RGB Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12's
Benchmark Scores 4k120 OLED Gsync bliss
What do I get? No cards from Nvidia that fit in the 75 W TDP required for low-profile cards
Well, there was one option for those who must have LP, but you'd have needed relatively deep pockets: the RTX A2000. They absolutely could have made a 3050 LE or something LP with a 'sane' price tag; they just chose not to.

Interestingly, this generation I feel like laptop GPUs are going to be really potent from both camps, given the levels of efficiency on display. I realllly hope that means at least one new LP card from both camps that raises the bar significantly over the GTX 1650/RX 6400.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,503 (4.66/day)
Location
Kepler-186f
Processor 7800X3D -25 all core
Motherboard B650 Steel Legend
Cooling Frost Commander 140
Video Card(s) Merc 310 7900 XT @3100 core -.75v
Display(s) Agon 27" QD-OLED Glossy 240hz 1440p
Case NZXT H710 (Red/Black)
Audio Device(s) Asgard 2, Modi 3, HD58X
Power Supply Corsair RM850x Gold
I'm going to wait for better drivers and a 3D V-Cache version of this GPU, hopefully out by Northern Hemisphere summer 2023.

I never thought of that before. Is that actually going to be a thing? 3D V-Cache like on the 5800X3D, but on GPUs?
 