
ASRock Radeon RX 6900 XT OC Formula

Joined
Mar 29, 2014
Messages
486 (0.12/day)
That's a ridiculous level of performance. But it's not that great, given it's 50% more power hungry than a 2080 Ti. Progress is meant to bring efficiency, not brute force. I know lots of people don't give a crap about energy efficiency, but it is the future of technology. So, in a way, this is a step backwards, just like the 3090. I'm happy to sit this generation out.
OC Formula. Says it right on the box. This is the card DESIGNED to remove as many power limits as possible. This is not the card for you.
 
Joined
May 24, 2007
Messages
5,429 (0.85/day)
Location
Tennessee
System Name AM5
Processor AMD Ryzen R9 7950X
Motherboard Asrock X670E Taichi
Cooling EK AIO Basic 360
Memory Corsair Vengeance DDR5 5600 64 Gb - XMP1 Profile
Video Card(s) AMD Reference 7900 XTX 24 Gb
Storage Crucial Gen 5 1 TB, Samsung Gen 4 980 1 TB / Samsung 8TB SSD
Display(s) Samsung 34" 240hz 4K
Case Fractal Define R7
Power Supply Seasonic PRIME PX-1300, 1300W 80+ Platinum, Full Modular
W1z - Is this being tested against a reference 3090, or an OEM card?
 
Joined
Apr 30, 2011
Messages
2,703 (0.54/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
Navi21 is a great chip. When tuned for efficiency it's the best; when tuned for performance it's the best at 4K, since at lower resolutions the stock reference model already wins.
 
Joined
Mar 18, 2008
Messages
5,717 (0.94/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
For $2000, no thanks. That RT performance is dog poo poo. If I wanted to play pure rasterization games, I'd be fine with a lower-tier card. Flagship GPUs are supposed to be good at everything, feature set included.

@W1zzard Would you be using the Metro Exodus Enhanced Edition going forward?

Navi21 is a great chip. When tuned for efficiency it's the best; when tuned for performance it's the best at 4K, since at lower resolutions the stock reference model already wins.
when tuned for performance it's the best at 4K, in rasterization only

Again, not paying $2000 for a pure rasterization GPU in this day and age.
 
Joined
Apr 30, 2011
Messages
2,703 (0.54/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
For $2000, no thanks. That RT performance is dog poo poo. If I wanted to play pure rasterization games, I'd be fine with a lower-tier card. Flagship GPUs are supposed to be good at everything, feature set included.

@W1zzard Would you be using the Metro Exodus Enhanced Edition going forward?


when tuned for performance it's the best at 4K, in rasterization only

Again, not paying $2000 for a pure rasterization GPU in this day and age.
I wouldn't pay over $500 for any GPU, while you'd pay four times that to have RTX on in the few games that support it. OK.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,850 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
W1z - Is this being tested against a reference 3090, or an OEM card?
3090 FE, all my comparison cards are reference design

Would you be using the Metro Exodus Enhanced Edition going forward?
Still undecided, mostly because there is no way to turn off RT. So I might stick with normal Metro, so I can report on the RT performance penalty, which is important.

That RT performance is dog poo poo
Between RTX 3070 and RTX 3080 is that bad?
 
Joined
Mar 18, 2008
Messages
5,717 (0.94/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
I wouldn't pay over $500 for any GPU, while you'd pay four times that to have RTX on in the few games that support it. OK.

They priced it as a flagship without the quality of a premium flagship. Only die-hard fans would get this over a 3090.

Not just me: the 6900 XT has quite a lot of stock at my local Micro Center. At over $2000 apiece, nobody in their right mind would buy these. Hence the plentiful 6900 XT stock; they are bad value.

Oh, and they are also not good at mining, cannot do AI/ML, and cannot be used for scientific computing (ROCm does not support Navi2X).

So basically the world's fastest rasterization GPU and just that.



3090 FE, all my comparison cards are reference design


Still undecided, mostly because there is no way to turn off RT. So I might stick with normal Metro, so I can report on the RT performance penalty, which is important.


Between RTX 3070 and RTX 3080 is that bad?


For a $2000 GPU, yup, that is not good.

Also, PLEASE IMPROVE your RT graphs. They're too busy to get any useful information out of.

Something like this

(attached: stacked-bar-example-1.png)
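For what it's worth, a minimal matplotlib sketch of the stacked layout I mean, with made-up FPS numbers (the bottom segment is RT-on FPS, the stacked segment is the RT penalty, so the full bar height equals RT-off FPS):

import matplotlib.pyplot as plt

# Hypothetical numbers, purely to illustrate the layout
gpus = ["RTX 3070", "RX 6900 XT", "RTX 3080", "RTX 3090"]
fps_rt_on = [45, 52, 60, 68]      # FPS with ray tracing enabled
fps_rt_off = [90, 115, 110, 120]  # FPS with ray tracing disabled
penalty = [off - on for off, on in zip(fps_rt_off, fps_rt_on)]

fig, ax = plt.subplots()
ax.bar(gpus, fps_rt_on, label="RT on", color="tab:orange")
ax.bar(gpus, penalty, bottom=fps_rt_on, label="RT penalty", color="tab:blue")
ax.set_ylabel("FPS")
ax.set_title("Full bar height = RT off")
ax.legend()
plt.show()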
 
Joined
Apr 30, 2011
Messages
2,703 (0.54/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
They priced it as a flagship without the quality of a premium flagship. Only die-hard fans would get this over a 3090.

Not just me: the 6900 XT has quite a lot of stock at my local Micro Center. At over $2000 apiece, nobody in their right mind would buy these. Hence the plentiful 6900 XT stock; they are bad value.

Oh, and they are also not good at mining, cannot do AI/ML, and cannot be used for scientific computing (ROCm does not support Navi2X).

So basically the world's fastest rasterization GPU and just that.






For a $2000 GPU, yup, that is not good.

Also, PLEASE IMPROVE your RT graphs. They're too busy to get any useful information out of.

Something like this

View attachment 199565
Nice! At least AMD GPUs, with their lower mining and greater gaming potential, will drop back sooner and closer to their MSRPs.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,850 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
For a $2000 GPU, yup, that is not good.
Fair point

Also, PLEASE IMPROVE your RT graphs. They're too busy to get any useful information out of.

Something like this
My charting engine doesn't allow that. Also not sure it would be useful, given that we put the text over the bar rather than outside it.
 
Joined
Mar 18, 2008
Messages
5,717 (0.94/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
Nice! At least AMD GPUs, with their lower mining and greater gaming potential, will drop back sooner and closer to their MSRPs.

greater rasterization gaming

But sure, if it can get back to MSRP faster, then some folks would be happy. The reality is that AMD's board partners are happily pushing prices to insane levels as well.

Fair point


My charting engine doesn't allow that. Also not sure it would be useful, given that we put the text over the bar rather than outside it.

I know it will probably never be fixed. But man, that graph is just so busy it's almost impossible to understand. Well, unless it is intended to confuse folks.

Can you do contrasting colors at least between RT and non-RT? That way the results would quickly pop out for people to grasp.

Or group the results by GPU instead of sorting by FPS from low to high. Showing them in groups would be more informative.
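To show what I mean by grouping, a quick sketch (again with invented FPS numbers, not from the review) that puts the RT-off and RT-on bars side by side for each GPU, in genuinely opposing colors:

import numpy as np
import matplotlib.pyplot as plt

# Invented numbers, just to demonstrate the grouped layout
gpus = ["RTX 3070", "RX 6900 XT", "RTX 3080", "RTX 3090"]
fps_rt_off = np.array([90, 115, 110, 120])
fps_rt_on = np.array([45, 52, 60, 68])

x = np.arange(len(gpus))  # one group per GPU
width = 0.38              # bar width within each group

fig, ax = plt.subplots()
ax.bar(x - width / 2, fps_rt_off, width, label="RT off", color="tab:blue")
ax.bar(x + width / 2, fps_rt_on, width, label="RT on", color="tab:orange")
ax.set_xticks(x)
ax.set_xticklabels(gpus)
ax.set_ylabel("FPS")
ax.legend()
plt.show()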
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,850 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
I know it will probably never be fixed
Never say never. I wrote every bit of that charting engine, so at least I don't have to bother with externals :)

Can you do contrasting colors at least between RT and non-RT?

already have contrasting colors?
 
Joined
Mar 18, 2008
Messages
5,717 (0.94/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
Never say never. I wrote every bit of that charting engine, so at least I don't have to bother with externals :)



already have contrasting colors?


No. Blue and green are NOT contrasting colors. Contrasting colors sit on opposite ends of the color wheel: the opposite of blue is orange, and the opposite of green is red. You have blue and green, which are not a contrasting pair.

You also have grey in there, making it a three-color scheme instead of two contrasting colors. The information you want to convey is RT on vs. off, and with your current color scheme that is not working.
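To sketch the rule: a complement is just a 180° hue rotation, so you can even compute one. (One caveat: on the RGB/HSV wheel that screens use, blue's complement comes out yellow; the blue-orange and green-red pairings come from the painter's RYB wheel. Either wheel gives you a pair that actually opposes, which blue/green does not.)

import colorsys

def complement(r, g, b):
    # Rotate hue by 180 degrees on the HSV color wheel,
    # keeping saturation and value unchanged.
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return colorsys.hsv_to_rgb((h + 0.5) % 1.0, s, v)

print(complement(0.0, 0.0, 1.0))  # pure blue -> (1.0, 1.0, 0.0), i.e. yellow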

I highly recommend you give this a read. You have great RT data, but the presentation and execution have a lot of room for improvement.

 
Joined
Apr 16, 2020
Messages
58 (0.03/day)
So basically the world's fastest rasterization GPU and just that.
Consumerism is at its peak ~ selling one product for one top-tier feature, supported by so-called PCMR heads with non-ethical 'transaction' histories, aka boomers/flexers or even rich kids.
But if the price stays stable at the top, the trend should be prices stabilizing, hundred by hundred, down to the cheapest (??? $) model available/in the production line.

Betting that next time GPUs go cheap, it won't be like the last crypto bubble. They'll be stacking more and more...
 

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
19,088 (3.00/day)
Location
UK\USA
That's a ridiculous level of performance. But it's not that great, given it's 50% more power hungry than a 2080 Ti. Progress is meant to bring efficiency, not brute force. I know lots of people don't give a crap about energy efficiency, but it is the future of technology. So, in a way, this is a step backwards, just like the 3090. I'm happy to sit this generation out.

Why? Nvidia clearly didn't. Although it looks like AMD pushed it a little extra.

However, I would have liked them to not even bother, and see if it kept Nvidia guessing.
 

phill

Moderator
Staff member
Joined
Jun 8, 2011
Messages
16,913 (3.43/day)
Location
Somerset, UK
System Name Not so complete or overkill - There are others!! Just no room to put! :D
Processor Ryzen Threadripper 3970X
Motherboard Asus Zenith 2 Extreme Alpha
Cooling Lots!! Dual GTX 560 rads with D5 pumps for each rad. One rad for each component
Memory Viper Steel 4 x 16GB DDR4 3600MHz not sure on the timings... Probably still at 2667!! :(
Video Card(s) Asus Strix 3090 with front and rear active full cover water blocks
Storage I'm bound to forget something here - 250GB OS, 2 x 1TB NVME, 2 x 1TB SSD, 4TB SSD, 2 x 8TB HD etc...
Display(s) 3 x Dell 27" S2721DGFA @ 7680 x 1440P @ 144Hz or 165Hz - working on it!!
Case The big Thermaltake that looks like a Case Mods
Audio Device(s) Onboard
Power Supply EVGA 1600W T2
Mouse Corsair thingy
Keyboard Razer something or other....
VR HMD No headset yet
Software Windows 11 OS... Not a fan!!
Benchmark Scores I've actually never benched it!! Too busy with WCG and FAH and not gaming! :( :( Not OC'd it!! :(
Brilliant review as always @W1zzard :)
 
Joined
May 24, 2007
Messages
5,429 (0.85/day)
Location
Tennessee
System Name AM5
Processor AMD Ryzen R9 7950X
Motherboard Asrock X670E Taichi
Cooling EK AIO Basic 360
Memory Corsair Vengeance DDR5 5600 64 Gb - XMP1 Profile
Video Card(s) AMD Reference 7900 XTX 24 Gb
Storage Crucial Gen 5 1 TB, Samsung Gen 4 980 1 TB / Samsung 8TB SSD
Display(s) Samsung 34" 240hz 4K
Case Fractal Define R7
Power Supply Seasonic PRIME PX-1300, 1300W 80+ Platinum, Full Modular
3090 FE, all my comparison cards are reference design

Based on the review, for users who own an OEM 3090, i.e., the one listed in my system specs, I don't see a significant advantage to owning this 6900 XTXH over the OEM 3090 aside from slight FPS gains. Those gains could shrink or grow as developers adopt AMD technology across console and PC platforms. I may still give the overall advantage to the OEM 3090 due to its extra 8 GB of GDDR6X memory, which could help with future-proofing.

I have also been loyal to AMD graphics technology over the last decade... I wanted to purchase a 6900 XT but could not find one in stock.

W1zz, don't you think 16 GB versus the 3090's 24 GB should count as a negative?
 
Last edited:
Joined
Jan 27, 2008
Messages
25 (0.00/day)
System Name Home
Processor Ryzen 5 5600X@4.8Ghz
Motherboard Gigabyte Aorus Pro AXi 550B
Cooling Custom water cooling CPU+GPU (dual 240mm radiators)
Memory 32GB DDR4 3600Mhz CL16
Video Card(s) Gigabyte RTX2080 Super Waterforce
Storage Adata M.2 SX8200Pro 1TB + 2TB Crucial MX500 2.5" SSD + 6TB WD hdd
Display(s) Acer Nitro XF252Q FullHD 240hz + 1440p 144hz AOC
Case CoolerMaster NR200 white
Audio Device(s) SteelSeries Arctis 9 Wireless headset
Power Supply Corsair SF600 Platinum
Mouse Logitech G Pro Wireless Superlight
Keyboard Logitech G915 TKL
Software Windows 10
That card is readily available at Germany's Mindfactory.de; 5 pcs left.
Price: 2090 euros. A little cheaper than your average AIB RTX 3090 right now. But the RTX 3090 has the DLSS advantage. If the upcoming FSR performs well, then maybe soon the 2090€ price tag is kinda OK.
And I use this "OK" very, very lightly; no gaming card should cost over 1000€! But with 2090€ spent, you get basically the fastest card at all resolutions.
Only 4K is hit and miss, but it pounds the RTX 3090 at 1080p/1440p high-refresh-rate gaming.

Who would have thought, in summer 2020 after the RTX 3090 launched, that AMD was capable of beating Nvidia at the highest level!?
 
Joined
Feb 20, 2019
Messages
8,287 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Only 4K is hit and miss
4K has been hit or miss since forever.
The GTX 1080 was billed as the first "4K gaming champion" and yet in games of its day, framerates were in the 30s and 40s.
4K30 is only suitable for certain types of games, and most people would choose 1080p60 over 4K30 because it feels better and makes the experience more enjoyable.
 
Joined
May 24, 2007
Messages
5,429 (0.85/day)
Location
Tennessee
System Name AM5
Processor AMD Ryzen R9 7950X
Motherboard Asrock X670E Taichi
Cooling EK AIO Basic 360
Memory Corsair Vengeance DDR5 5600 64 Gb - XMP1 Profile
Video Card(s) AMD Reference 7900 XTX 24 Gb
Storage Crucial Gen 5 1 TB, Samsung Gen 4 980 1 TB / Samsung 8TB SSD
Display(s) Samsung 34" 240hz 4K
Case Fractal Define R7
Power Supply Seasonic PRIME PX-1300, 1300W 80+ Platinum, Full Modular
4K has been hit or miss since forever.
The GTX 1080 was billed as the first "4K gaming champion" and yet in games of its day, framerates were in the 30s and 40s.
4K30 is only suitable for certain types of games, and most people would choose 1080p60 over 4K30 because it feels better and makes the experience more enjoyable.

4K isn't hit or miss with either card; you're getting at least 60 FPS in the majority of games tested, with the exception of one or two.

The 6900 XTXH is faster than the 3090 FE in relative performance across 1080p, 1440p, and 4K. The only downfall is that you're left with 8 GB less memory on the 6900 XTXH versus the 3090. DLSS is great, but the developer must support it through training, and the list of supporters isn't that large. For example, in Cyberpunk you have to enable DLSS to achieve 60 FPS with all settings maxed. AMD is releasing a more open competitor, possibly without training, possibly next month.
 
Last edited:
Joined
Feb 20, 2019
Messages
8,287 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
4K isn't hit or miss with either card; you're getting at least 60 FPS in the majority of games tested, with the exception of one or two.

The 6900 XTXH is faster than the 3090 FE in relative performance across 1080p, 1440p, and 4K. The only downfall is that you're left with 8 GB less memory on the 6900 XTXH versus the 3090. DLSS is great, but the developer must support it through training, and the list of supporters isn't that large. For example, in Cyberpunk you have to enable DLSS to achieve 60 FPS with all settings maxed. AMD is releasing a more open competitor, possibly without training, possibly next month.
I really hope AMD's DLSS alternative is something developers can enable easily through the driver, rather than something that requires dedicated per-title collaboration the way DLSS does with Nvidia.

4K makes sense if you can leverage VRS and/or DLSS. Turning on ray tracing and running at 4K60 isn't possible on AMD at the moment, and with Nvidia it's only possible to hit 4K60 in certain AAA titles with DLSS enabled, at which point you need to ask yourself whether that's really 4K. Sure, most older/lighter games run at 4K60 just fine, but if you're spending $3000 on a GPU that will be superseded in 12 months, it has to do a fantastic job on the latest and greatest games at the best graphical settings. If that isn't your goal, then you don't really need a $3000 GPU in the first place. Let's face it, a $650 (current eBay price) 2080 or 2070S will run medium/high-ish settings at 4K just fine. That extra $2400 for the 6900 XT or 3090 needs to be justified somehow!
 
Joined
Nov 11, 2016
Messages
3,417 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
I really hope AMD's DLSS alternative is something developers can enable easily through the driver, rather than something that requires dedicated per-title collaboration the way DLSS does with Nvidia.

4K makes sense if you can leverage VRS and/or DLSS. Turning on ray tracing and running at 4K60 isn't possible on AMD at the moment, and with Nvidia it's only possible to hit 4K60 in certain AAA titles with DLSS enabled, at which point you need to ask yourself whether that's really 4K. Sure, most older/lighter games run at 4K60 just fine, but if you're spending $3000 on a GPU that will be superseded in 12 months, it has to do a fantastic job on the latest and greatest games at the best graphical settings. If that isn't your goal, then you don't really need a $3000 GPU in the first place. Let's face it, a $650 (current eBay price) 2080 or 2070S will run medium/high-ish settings at 4K just fine. That extra $2400 for the 6900 XT or 3090 needs to be justified somehow!

I still have no idea why people like to quote the current price of the 3090 when they were selling for MSRP +10% for months. Most people willing to fork out $3000 for a 3090 are likely using it for mining; heck, a 3090 is making $30/day mining at the moment.

The only bad thing about owning a 3090 is that people wish they'd bought more of them when they were readily available for $1600-1800 back in 2020 :roll:. I'm regretting that I only bought one myself.
 
Joined
Feb 20, 2019
Messages
8,287 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
I still have no idea why people like to quote the current price of the 3090
We're quoting the current 3090 price because the review card in question was only launched a few days ago.
When the 3090 was 'only' $1800, the XTXH chips didn't even exist.
 
Joined
May 24, 2007
Messages
5,429 (0.85/day)
Location
Tennessee
System Name AM5
Processor AMD Ryzen R9 7950X
Motherboard Asrock X670E Taichi
Cooling EK AIO Basic 360
Memory Corsair Vengeance DDR5 5600 64 Gb - XMP1 Profile
Video Card(s) AMD Reference 7900 XTX 24 Gb
Storage Crucial Gen 5 1 TB, Samsung Gen 4 980 1 TB / Samsung 8TB SSD
Display(s) Samsung 34" 240hz 4K
Case Fractal Define R7
Power Supply Seasonic PRIME PX-1300, 1300W 80+ Platinum, Full Modular
I really hope AMD's DLSS alternative is something developers can enable easily through the driver, rather than something that requires dedicated per-title collaboration the way DLSS does with Nvidia.

4K makes sense if you can leverage VRS and/or DLSS. Turning on ray tracing and running at 4K60 isn't possible on AMD at the moment, and with Nvidia it's only possible to hit 4K60 in certain AAA titles with DLSS enabled, at which point you need to ask yourself whether that's really 4K. Sure, most older/lighter games run at 4K60 just fine, but if you're spending $3000 on a GPU that will be superseded in 12 months, it has to do a fantastic job on the latest and greatest games at the best graphical settings. If that isn't your goal, then you don't really need a $3000 GPU in the first place. Let's face it, a $650 (current eBay price) 2080 or 2070S will run medium/high-ish settings at 4K just fine. That extra $2400 for the 6900 XT or 3090 needs to be justified somehow!

I spent $1700 on my 3090; I bought it at MSRP. They are still available at MSRP, though only through the Newegg Shuffle; I purchased mine on Newegg prior to the Shuffle.

Again, I don't agree that these cards aren't capable of good 4K performance without DLSS or FidelityFX. They are, and I use mine every day on my 4K PC. Further, the proof is in the review: look at the average FPS across the suite of tested 4K titles. The exception is one or two games, one of them being Cyberpunk. People are trying to act like Cyberpunk is the bar for AAA 4K; Cyberpunk is an unoptimized, bug-ridden game that hit the market way too early. I have the game.

Anyhow, I decided to sell my PNY RTX 3090 for a PowerColor 6900 XTXH Ultimate. There currently aren't games on the market that take advantage of more than 10-12 GB of VRAM, and I don't see that changing over the next 3 years either. Outside of future memory limitations, performance is better on my PowerColor 6900 XTXH Ultimate than on my PNY RTX 3090. My 6900 XTXH auto-overclocks to 2650 MHz.


 
Joined
May 24, 2007
Messages
5,429 (0.85/day)
Location
Tennessee
System Name AM5
Processor AMD Ryzen R9 7950X
Motherboard Asrock X670E Taichi
Cooling EK AIO Basic 360
Memory Corsair Vengeance DDR5 5600 64 Gb - XMP1 Profile
Video Card(s) AMD Reference 7900 XTX 24 Gb
Storage Crucial Gen 5 1 TB, Samsung Gen 4 980 1 TB / Samsung 8TB SSD
Display(s) Samsung 34" 240hz 4K
Case Fractal Define R7
Power Supply Seasonic PRIME PX-1300, 1300W 80+ Platinum, Full Modular
W1z - Could you provide your Wattman settings for the overclock section?
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,850 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
W1z - Could you provide your Wattman settings for the overclock section?
Power limit at max, memory at whatever is stable (2160 MHz), GPU min clock at default, GPU max clock at the highest stable (2850 MHz).
 