
AMD Releases Even More RX 6900 XT and RX 6800 XT Benchmarks Tested on Ryzen 9 5900X

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,269 (4.67/day)
Location
Kepler-186f
Processor 7800X3D -25 all core ($196)
Motherboard B650 Steel Legend ($179)
Cooling Frost Commander 140 ($42)
Memory 32gb ddr5 (2x16) cl 30 6000 ($80)
Video Card(s) Merc 310 7900 XT @3100 core ($705)
Display(s) Agon 27" QD-OLED Glossy 240hz 1440p ($399)
Case NZXT H710 (Red/Black) ($60)
What the hell are you on about?

I don't think he understands it's easy to get high refresh rates these days. I am playing Dragon's Dogma maxed out at 144 FPS 1440p with a GTX 1070... lol
 
Joined
Jan 8, 2017
Messages
9,438 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
It means that to get >60 FPS in a single player game with your GPU, you are mostly playing with Low settings at 4K.
Or you actually never play any games and just decided that everyone needs high FPS.

I am going to have to gather a team of researchers to try and figure out what it is that you are saying.

Why say that 60 FPS is enough for single player games? There is absolutely no logic to that. You think everyone who has a higher refresh display will think to themselves that they need the highest possible frame rate in a multiplayer title, but when switching to a single player one they suddenly no longer need more than precisely 60? How the hell does that work? Truly mind boggling.
 
Joined
Nov 11, 2016
Messages
3,417 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
I am going to have to gather a team of researchers to try and figure out what it is that you are saying.

Why say that 60 FPS is enough for single player games? There is absolutely no logic to that. You think everyone who has a higher refresh display will think to themselves that they need the highest possible frame rate in a multiplayer title, but when switching to a single player one they suddenly no longer need more than precisely 60? How the hell does that work? Truly mind boggling.

60 FPS is enough to enjoy a single player game at the best visual fidelity you can get.
I bought the very first 144 Hz 1440p display, but I played The Witcher 3 at 60 FPS with the best visuals; no point in downgrading visuals just to get >60 FPS.
Same with Metro Exodus and Control: why would I sacrifice visuals when I have ~60 FPS already?

Now, switching to competitive games like PUBG, Modern Warfare, Overwatch, I lower every setting just to get the highest FPS, why wouldn't I?
 
Joined
Jan 8, 2017
Messages
9,438 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
no point in downgrading visuals just to get >60 FPS.

Still can't figure out that this is a purely subjective conclusion?
 
Joined
Nov 11, 2016
Messages
3,417 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
Still can't figure out that this is a purely subjective conclusion?

Tell me which editorial would recommend downgrading visuals to get >60 FPS? You?
Because I can give you many editorials that target 60 FPS gaming.
 
Joined
Aug 25, 2015
Messages
192 (0.06/day)
Location
Denmark
System Name Red Bandit
Processor AMD Ryzen 7 9800X3D 5.425 1.31v
Motherboard ASUS PRIME X670E-PRO WIFI
Cooling Mo-Ra3 420 W/4x Noctua NF-A20S - 2xD5's/1xDDC 4.2
Memory G.SKILL Trident Z5 NEO EXPO 6000CL28/3000/2000
Video Card(s) Power Color RX7900XTX Liquid Devil
Storage Adata SX8200 PRO 2TB x 2
Display(s) Samsung Odyssey G7 32" 240HZ
Case Lian Li o11D Evo RGB
Audio Device(s) Apple AirPods Max
Power Supply Corsair RM1000i
Mouse Logitech G502X
Keyboard Asus Flachion Brown Wireless
Software W11 Pro
What is it with people and ray tracing suddenly? Since it was announced, and until AMD showed the new cards, NOBODY was talking about ray tracing, and now?

Just be glad for the fricking competition, it's a win for all of us customers. Nvidia and AMD really don't care about us, they just wanna make as much money as possible.
 

mystera

New Member
Joined
Aug 24, 2020
Messages
2 (0.00/day)
How about AMD releases COD Modern Warfare, BF5, SOTR with DXR benchmark numbers then? It would make it easier to gauge RX 6000 RT capability.

That's like asking: how do AMD cards perform in games that make heavy use of HairWorks?

The RTX you refer to is Nvidia's proprietary implementation of Microsoft's DirectX Raytracing. How (and why) do you expect AMD hardware to perform better than Nvidia hardware in RTX games? It makes no sense for AMD to even attempt that. Especially since, going forward, most games will implement AMD's (proprietary or open) version of ray tracing, and only "some" of them will ship with RTX support alongside the console version of ray tracing.

And based on how well AMD's ray tracing looks and performs, Nvidia's RTX may (in time) become a niche feature for select sponsored games... To keep RTX relevant, Nvidia will have to invest more than it's worth in hardware and software (game dev partner) development, and given Nvidia's new focus on enterprise, it may be a hard sell (to investors).
 
Joined
Jan 8, 2017
Messages
9,438 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Tell me which editorial would recommend downgrading visuals to get >60 FPS? You?

I really don't know what world you live in, but almost everyone recommends turning down visuals to hit your monitor's native refresh rate. Not that it matters, because that's still a subjective recommendation.

However, buying a high refresh monitor and then trying to convince yourself or others that you should actually play at 60 because "there is no point" sounds like a really intelligent conclusion, I gotta say. Because that's why most people buy a high refresh monitor, to then play at 60 Hz, right?
 
Joined
Jun 25, 2013
Messages
38 (0.01/day)
System Name Rayzen
Processor 2700x
Motherboard asus prime x370-pro
Cooling NH-U12SSE-AM4
Memory G.SKILL TRIDENTZ F4-3200c14D-32G @3000
Video Card(s) RTX 2080 TI
Storage Force MP510
Display(s) SAMSUNG 40" TV
Case CORSAIR CARBIDE series 100R
Power Supply CORSAIR RM 650x
Now, switching to competitive games like PUBG, Modern Warfare, Overwatch, I lower every setting just to get the highest FPS, why wouldn't I?

Tell me which editorial would recommend downgrading visuals to get >60 FPS? You?

I am honestly very confused as to what it is you want or are trying to imply.


Different players have different preferences when gaming.
 
Joined
Sep 1, 2020
Messages
2,353 (1.52/day)
Location
Bulgaria
There is real RT (MS DXR) and there is Nvidia RT. Real RT will be in 100% of games; Nvidia RT in no more than 10% of that 100%. I don't think there will be any games that exclusively support Nvidia RT.
 
Joined
Jun 13, 2012
Messages
1,390 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super (2800 MHz @ 1.0 volt, ~60 MHz overclock, -0.1 volts)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Audio Device(s) Logitech Z906 5.1
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
That's like asking: how do AMD cards perform in games that make heavy use of HairWorks?

The RTX you refer to is Nvidia's proprietary implementation of Microsoft's DirectX Raytracing. How (and why) do you expect AMD hardware to perform better than Nvidia hardware in RTX games? It makes no sense for AMD to even attempt that. Especially since, going forward, most games will implement AMD's (proprietary or open) version of ray tracing, and only "some" of them will ship with RTX support alongside the console version of ray tracing.

And based on how well AMD's ray tracing looks and performs, Nvidia's RTX may (in time) become a niche feature for select sponsored games... To keep RTX relevant, Nvidia will have to invest more than it's worth in hardware and software (game dev partner) development, and given Nvidia's new focus on enterprise, it may be a hard sell (to investors).
This sounds like a typical AMD excuse for why their cards suck at what was/is a standard.
 
Joined
Feb 11, 2009
Messages
5,556 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Good day everyone, I'm the person genuinely excited about ray tracing. At least a third of the people I'm playing games with are excited too. Why? Because when implemented like in Control or Quake 2 it's legitimately the only eye candy tech that looks fresh to my eyes and makes me want to spend $700 on a goddamn video card. There are shite examples (Watch Dogs, WoW: Shadowlands, Dirt 5, BFV, etc.) where there are just some reflections or just some shadows and it doesn't make a difference, but when it's all-out or close to it, man, it looks amazing!
Hey, I'm also excited about the 6800 XT. I wanna know if it can do what I want it to do. Absolutely tired of people shouting that nobody cares. Where's your market research, mate?

I don't agree, I think the reflections in Watch Dogs look pretty dang impressive. Sad it's all so heavy, so a true ray traced future is still several gens out for sure, but look at Digital Foundry's latest vid on it.
 
Joined
Nov 11, 2016
Messages
3,417 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
I really don't know what world you live in, but almost everyone recommends turning down visuals to hit your monitor's native refresh rate. Not that it matters, because that's still a subjective recommendation.

However, buying a high refresh monitor and then trying to convince yourself or others that you should actually play at 60 because "there is no point" sounds like a really intelligent conclusion, I gotta say. Because that's why most people buy a high refresh monitor, to then play at 60 Hz, right?

Sure, just tell me which games you play exactly? CSGO? YouTube videos?
Almost everyone recommends turning down visuals in AAA games to hit 144 Hz 1440p? Yeah, I really need some confirmation on that. No one would want to play AAA games with Low settings just to hit 144 Hz, that I'm sure of.

I didn't say anyone should play at 60 FPS; if you have already maxed out all the graphical settings and can still get >60 FPS, then play at >60 FPS, although capping the framerate really helps with input latency with Nvidia Low Latency and AMD Anti-Lag in certain games.
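For anyone wondering what a frame cap actually does, below is a minimal sketch of a naive CPU-side limiter. It is not how Nvidia Low Latency or AMD Anti-Lag work internally (those live in the driver); the loop body and the 141 FPS target are just placeholder assumptions for illustration.

```cpp
// Naive CPU-side frame limiter sketch (illustration only).
// The idea: pace frame submission so the CPU never queues several frames
// ahead of the GPU, which is where much of the added input latency comes from.
#include <chrono>
#include <thread>

int main()
{
    using clock = std::chrono::steady_clock;
    const double target_fps = 141.0;  // cap slightly below a 144 Hz refresh rate
    const auto frame_budget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / target_fps));

    auto next_deadline = clock::now() + frame_budget;
    for (int frame = 0; frame < 1000; ++frame)   // stand-in for the real game loop
    {
        // poll_input(); simulate(); render(); present();   // hypothetical game work

        // Sleep away whatever is left of this frame's budget so the CPU
        // doesn't race ahead and pile up frames the GPU hasn't drawn yet.
        std::this_thread::sleep_until(next_deadline);

        // If we missed the deadline, reset it instead of trying to "catch up".
        const auto now = clock::now();
        next_deadline = (now > next_deadline) ? now + frame_budget
                                              : next_deadline + frame_budget;
    }
}
```

The vendor features are smarter about it (they pace the CPU against actual GPU timing), but the basic benefit is the same: fewer pre-rendered frames queued between your input and what ends up on screen.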
 
Joined
Feb 11, 2009
Messages
5,556 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
This sounds like a typical AMD excuse for why their cards suck at what was/is a standard.

I think you do not know what "standard" means or how it's applied.
 
Joined
Dec 31, 2009
Messages
19,371 (3.56/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Dragon's Dogma
This doesn't explain it to you? The title? Monkeys with crayons can draw the scenes fast enough, lol.

From 2016: "Given its old-gen nature, Dragon’s Dogma: Dark Arisen is not really a demanding title."


Just saying. ;)


That's like asking: how do AMD cards perform in games that make heavy use of HairWorks?
It's nothing like it, really. AMD, like NV, uses DXR. They're both using the same API for RT.
[Image: AMD RDNA 2 ray tracing slide]


And based on how well AMD's ray tracing looks and performs
Was anything official released on AMD RT performance?

RTX is hardware on the card. NV cards use DXR API for RT just as AMD will.
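For context on the "same API" point: a game doesn't ask the driver for "RTX" or "AMD ray tracing", it asks D3D12 whether DXR is supported, and that query is identical on either vendor's hardware. Below is a minimal sketch (Windows/D3D12 only, error handling mostly omitted) of what that check looks like.

```cpp
// Minimal sketch: querying ray tracing support through plain D3D12 (DXR).
// The same call works on GeForce and Radeon hardware, because DXR is part of
// the DirectX 12 API rather than a vendor extension.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "d3d12.lib")

int main()
{
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr,                  // nullptr = default adapter
                                 D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::puts("No D3D12 device available.");
        return 1;
    }

    // Feature tier D3D12_RAYTRACING_TIER_1_0 or higher means DXR is usable.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                &options5, sizeof(options5));

    if (options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
        std::puts("DXR (DirectX Raytracing) is supported on this GPU/driver.");
    else
        std::puts("DXR is not supported on this GPU/driver.");
    return 0;
}
```

Whether RT cores, AMD's ray accelerators, or (as with Nvidia's GTX fallback driver) plain shader cores end up servicing the rays is an implementation detail behind that same API.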
 
Joined
Aug 5, 2019
Messages
808 (0.42/day)
System Name Apex Raptor: Silverback
Processor Intel i9 13900KS Allcore @ 5.8
Motherboard z790 Apex
Cooling LT720 360mm + Phanteks T30
Memory 32GB @8000MT/s CL36
Video Card(s) RTX 4090
Storage 990 PRO 4TB
Display(s) Neo G8 / C1 65"
Case Antec Performance 1
Audio Device(s) DT 1990 Pro / Motu M2
Power Supply Prime Ultra Titanium 1000w
Mouse Scimitar Pro
Keyboard K95 Platinum
Good day everyone, I'm the person genuinely excited about ray tracing. At least a third of the people I'm playing games with are excited too. Why? Because when implemented like in Control or Quake 2 it's legitimately the only eye candy tech that looks fresh to my eyes and makes me want to spend $700 on a goddamn video card. There are shite examples (Watch Dogs, WoW: Shadowlands, Dirt 5, BFV, etc.) where there are just some reflections or just some shadows and it doesn't make a difference, but when it's all-out or close to it, man, it looks amazing!
Hey, I'm also excited about the 6800 XT. I wanna know if it can do what I want it to do. Absolutely tired of people shouting that nobody cares. Where's your market research, mate?

I am similarly excited about RT.
It's just fanboys shouting at fanboys at this point. These same people would have been the ones mocking the GeForce 256 back in the day about HW T&L; pay them no heed.

We're just in that awkward phase now where DXR is still an unknown for most people, and we still don't know for sure whether this year's cycle or maybe the next is the one that will bring mainstream acceptance/performance to RT. I personally am not aware of any non-DXR games, though I do believe those Nvidia-developed ones like Quake RTX are probably going to be Nvidia HW only. I doubt games like Control aren't going to work on AMD; I suspect it's just AMD's software side of things not being ready enough yet. I would expect a lot of growing pains for the first half of 2021 and AMD DXR. Hopefully I'm wrong, but they are going into this with a two year handicap.

New games will have cross-brand HW to work with soon, and as someone with a 2070 Super all I can say is that the DXR game library is veeery small still, and it's only going to really grow now with the new consoles, since more or less all cross-platform AAA titles will be coming with some form of RT once this first cross-platform year of releases is over (and already some of those cross-plats are coming with RT anyway). So it bodes well overall for us mid to long term, regardless of hardware brand choices or... god forbid, loyalties.
 
Joined
Dec 31, 2009
Messages
19,371 (3.56/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
You mean the DXR standard as opposed to Nvidia's proprietary RT
Whose proprietary RT? Nvidia uses DXR as well...

When DXR is enabled by a Game Ready Driver, targeted for April (2019), the supported GeForce GTX graphics cards will work without game updates because ray-traced games are built on DirectX 12’s DirectX Raytracing API, DXR. This industry standard API uses compute-like ray tracing workloads that are compatible with both dedicated hardware units, such as the RT Cores, and the GPU’s general purpose shader cores.
 
Joined
Apr 10, 2020
Messages
504 (0.30/day)
I must admit the 3080XT and 3090 look impressive performance- and price-wise. But the 6800 disappoints big time. Not because of its performance, but because of the price. 10-15% more rasterization performance for 16% more money offers a slightly worse price/performance ratio than the 3070, which is a BIG SHAME. The 3070 would look utterly silly if AMD had chosen to price the 6800 at $499. At $579, the 3070 remains a viable alternative. The 3070's minuses are less VRAM and poorer standard rasterization; its pluses are better driver support, CUDA cores (for Adobe apps), AI upscaling (DLSS), and probably better looking ray tracing. I'd say it's a draw. BUT given that Nvidia has much better brand recognition when it comes to GPUs, AMD will have a hard time selling the 6800 at the given MSRP IF Nvidia can actually produce enough Ampere GPUs to satisfy demand, which might not be the case in the near future.
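To put rough numbers on that "slightly worse price/performance" point, here is the back-of-the-envelope math, assuming the 10-15% performance figure above and the $499/$579 MSRPs (street prices will obviously differ):

```cpp
// Back-of-the-envelope perf-per-dollar comparison, using the figures assumed
// above: 3070 at $499 as the 1.00x baseline, 6800 at $579 with roughly
// 1.10x to 1.15x the rasterization performance. MSRPs, not street prices.
#include <cstdio>
#include <initializer_list>

int main()
{
    const double price_3070 = 499.0, price_6800 = 579.0;
    const double perf_3070  = 1.00;

    for (double perf_6800 : {1.10, 1.15})
    {
        const double ppd_3070 = perf_3070 / price_3070;    // performance per dollar
        const double ppd_6800 = perf_6800 / price_6800;
        std::printf("6800 at %.2fx perf: %.1f%% of the 3070's perf/$ "
                    "(price is %.0f%% higher)\n",
                    perf_6800,
                    100.0 * ppd_6800 / ppd_3070,
                    100.0 * (price_6800 / price_3070 - 1.0));
    }
    return 0;
}
```

At those MSRPs the 6800 lands roughly 1-5% behind the 3070 in performance per dollar, which is where the "slightly worse" wording comes from.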
 
Joined
Jun 2, 2017
Messages
9,177 (3.35/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
60 FPS is enough to enjoy a single player game at the best visual fidelity you can get.
I bought the very first 144 Hz 1440p display, but I played The Witcher 3 at 60 FPS with the best visuals; no point in downgrading visuals just to get >60 FPS.
Same with Metro Exodus and Control: why would I sacrifice visuals when I have ~60 FPS already?

Now, switching to competitive games like PUBG, Modern Warfare, Overwatch, I lower every setting just to get the highest FPS, why wouldn't I?

I can say with confidence that The Division 2 feels much smoother at 120+ FPS and looks beautiful. One of the differences between 60 FPS and 120+ FPS in that game (and I suspect a few more) is that it allows you to aim more easily. To keep it simple, in The Division 2 with an automatic rifle (900+ rounds) you can get headshot kills versus having the shots go all over the place. I love how, from reading most of what you post, you are undeniably in favor of Nvidia GPUs. It is kind of foolish, though, that after you yourself bragged about the 3090 being unassailable for AMD, you now make the ridiculous argument that 60 FPS is enough in games, period, with the benchmarks that AMD has released. I suppose you will now remind me of the joy of DLSS (which the 6000 series does not need for high... FPS) and RTX ray tracing, which may go the way of Beta. You see, Beta was better than VHS, but VHS is what the consumer market adopted. After about 5 years you could not find a Beta version of anything in popular culture. Which brings me to my last point... you don't have to downgrade anything to enjoy those same games you mentioned at high FPS using 6000 series GPUs. I am objective enough to say that Nvidia's 3000 series are nice cards, but the way Nvidia is so relentless in trying to control mind share is desultory.
 
Joined
Nov 11, 2016
Messages
3,417 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
I can say with confidence that The Division 2 feels much smoother at 120+ FPS and looks beautiful. One of the differences between 60 FPS and 120+ FPS in that game (and I suspect a few more) is that it allows you to aim more easily. To keep it simple, in The Division 2 with an automatic rifle (900+ rounds) you can get headshot kills versus having the shots go all over the place. I love how, from reading most of what you post, you are undeniably in favor of Nvidia GPUs. It is kind of foolish, though, that after you yourself bragged about the 3090 being unassailable for AMD, you now make the ridiculous argument that 60 FPS is enough in games, period, with the benchmarks that AMD has released. I suppose you will now remind me of the joy of DLSS (which the 6000 series does not need for high... FPS) and RTX ray tracing, which may go the way of Beta. You see, Beta was better than VHS, but VHS is what the consumer market adopted. After about 5 years you could not find a Beta version of anything in popular culture. Which brings me to my last point... you don't have to downgrade anything to enjoy those same games you mentioned at high FPS using 6000 series GPUs. I am objective enough to say that Nvidia's 3000 series are nice cards, but the way Nvidia is so relentless in trying to control mind share is desultory.

Do you play the single player or the multiplayer version of The Division 2?
Like I said, for competitive games, like the multiplayer version of Div 2, I would use Low settings to get the highest FPS I can.

Now tell me which you prefer with your current GPU:
RDR2: High settings at ~60 FPS, or 144 FPS with Low settings
AC O: High settings at ~60 FPS, or 144 FPS with Low settings
Horizon Zero Dawn: High settings at ~60 FPS, or 144 FPS with Low settings

Well, to be clear, when I said 60 FPS, I meant the minimum FPS.

Yeah, sure, if you count auto-overclocking and a proprietary feature (SAM) that makes the 6900 XT equal to the 3090; see the hypocrisy there? Also, I can find higher benchmark numbers for the 3080/3090 online, so take AMD's numbers with a grain of salt.
 
Joined
Mar 9, 2020
Messages
80 (0.05/day)
I must admit the 3080XT and 3090 look impressive performance- and price-wise. But the 6800 disappoints big time. Not because of its performance, but because of the price. 10-15% more rasterization performance for 16% more money offers a slightly worse price/performance ratio than the 3070, which is a BIG SHAME. The 3070 would look utterly silly if AMD had chosen to price the 6800 at $499. At $579, the 3070 remains a viable alternative. The 3070's minuses are less VRAM and poorer standard rasterization; its pluses are better driver support, CUDA cores (for Adobe apps), AI upscaling (DLSS), and probably better looking ray tracing. I'd say it's a draw. BUT given that Nvidia has much better brand recognition when it comes to GPUs, AMD will have a hard time selling the 6800 at the given MSRP IF Nvidia can actually produce enough Ampere GPUs to satisfy demand, which might not be the case in the near future.
The 3070 is, at the moment, unicorn breath, like the rest of the Ampere lineup. What you call "impressive" regarding the 3090 becomes idiotic when a $1500 card only beats an $800 card by 10%.
Oh, and CUDA is no good for gaming, whilst ray tracing kills performance without resorting to DLSS.
Ray tracing is today's equivalent of HairWorks or PhysX.
The leather jacket openly lied to Nvidia's consumer base, claiming the 3090 was "Titan-like" when it clearly isn't, and promising plenty of stock for buyers. The reality is that abysmal yields are the reason the Ampere series is almost impossible to come by.
 
Joined
Dec 26, 2006
Messages
3,837 (0.59/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
The vanilla 6800 is actually looking really strong in the first few of those benchmarks.

It's great that yesterday's $1200 performance is now half price, but what the overwhelming majority have needed for two years is yesterday's $600 performance for $300.

I wonder if they will release a cheaper 8GB version of the 6800?
 
Joined
Jun 18, 2015
Messages
575 (0.17/day)
I must admit the 3080XT and 3090 look impressive performance- and price-wise. But the 6800 disappoints big time. Not because of its performance, but because of the price. 10-15% more rasterization performance for 16% more money offers a slightly worse price/performance ratio than the 3070, which is a BIG SHAME. The 3070 would look utterly silly if AMD had chosen to price the 6800 at $499. At $579, the 3070 remains a viable alternative. The 3070's minuses are less VRAM and poorer standard rasterization; its pluses are better driver support, CUDA cores (for Adobe apps), AI upscaling (DLSS), and probably better looking ray tracing. I'd say it's a draw. BUT given that Nvidia has much better brand recognition when it comes to GPUs, AMD will have a hard time selling the 6800 at the given MSRP IF Nvidia can actually produce enough Ampere GPUs to satisfy demand, which might not be the case in the near future.
Except the 3070's real price is not $499. Since the RTX 2xxx series, Nvidia has been selling their cards at much higher prices than the announced prices.
This is fraud, and reviewer sites and channels should warn people and condemn Nvidia for this, but very few do.
 