
AMD Radeon RX 7800 XT

Joined
Apr 24, 2014
Messages
6 (0.00/day)
On the The Witcher 3 page it says, "There's too many issues with the Enhanced Edition right now, so we'll be testing the original a little longer."

Is that regarding the update that introduces RT and FSR/DLSS?
 
Joined
Dec 25, 2020
Messages
7,456 (4.97/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
On the The Witcher 3 page it says, "There's too many issues with the Enhanced Edition right now, so we'll be testing the original a little longer."

Is that regarding the update that introduces RT and FSR/DLSS?

Yes. Said issues should be resolved by now, though.
 

Mitchel78

New Member
Joined
Feb 6, 2024
Messages
1 (0.00/day)
My XFX QICK 319 7800 XT becomes instantly unstable if I so much as look at the switches in Wattman. Not even the stock settings are stable under all conditions. I've tried everything: disabling ReBAR, rolling back drivers... I have no idea how to get this card stable.
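As a hypothetical first step (assuming Windows, and assuming the crashes are driver timeouts rather than hard hangs), checking the System event log for TDR entries can at least confirm what is resetting; Event ID 4101 is the usual "display driver stopped responding and has recovered" record. A minimal sketch:

```python
# Hypothetical helper, not a fix: list recent GPU timeout (TDR) events from the
# Windows System log. Assumes Windows with wevtutil.exe available on PATH and
# that driver resets are logged as Event ID 4101.
import subprocess

def recent_tdr_events(count: int = 5) -> str:
    query = "*[System[(EventID=4101)]]"   # XPath filter for TDR records
    cmd = [
        "wevtutil", "qe", "System",
        f"/q:{query}",     # apply the filter
        "/f:text",         # human-readable output
        f"/c:{count}",     # limit to the newest N matches
        "/rd:true",        # reverse direction, newest first
    ]
    return subprocess.run(cmd, capture_output=True, text=True).stdout

if __name__ == "__main__":
    log = recent_tdr_events()
    print(log if log.strip() else "No recent TDR events logged.")
```

If resets show up even at stock settings, that at least rules out the Wattman tweaks themselves as the trigger.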
 
Joined
Jan 14, 2019
Messages
14,053 (6.35/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race

OscarPill

New Member
Joined
Mar 6, 2024
Messages
1 (0.00/day)
Is it normal that there are 7700 XT models listed alongside the 7800 XT on the Temperature and Fan Noise page? If not, how noisy is the Sapphire Pulse 7800 XT?
 
Joined
Jan 14, 2019
Messages
14,053 (6.35/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
Joined
Apr 24, 2008
Messages
2,049 (0.33/day)
Processor RyZen R9 3950X
Motherboard ASRock X570 Taichi
Cooling Coolermaster Master Liquid ML240L RGB
Memory 64GB DDR4 3200 (4x16GB)
Video Card(s) RTX 3050
Storage Samsung 2TB SSD
Display(s) Asus VE276Q, VE278Q and VK278Q triple 27” 1920x1080
Case Zulman MS800
Audio Device(s) On Board
Power Supply Seasonic 650W
VR HMD Oculus Rift, Oculus Quest V1, Oculus Quest 2
Software Windows 11 64bit
Now it seems as if it's a bit of a toss-up between the 7800 XT and the 7900 GRE (~$50 apart, give or take). However, the price drop on the 7700 XT makes it a bit more palatable than it was before if you can't step up to the 7800 XT.
 

JustBuyLianLi

New Member
Joined
Oct 27, 2022
Messages
13 (0.02/day)
Picked one up for around $415 USD, which is around what a 4060 Ti and a 7700 XT are currently going for here in Canada. They named it wrong, as it's more like what the 7700 XT should have been, and the 7700 XT is more like what the 7600 XT should have been, as far as generational uplift goes. While it is a discount versus what the 6800 XT was originally priced at, a $450 price point would have made more sense given the 6800 XT was $650 and that's a four-year-old card.

That being said, the uplift over my 3060 has been insanely good, especially in Helldivers 2, where frames fell to 10-20 FPS in intense battles with Automatons, drop ships and projectiles flying all around the screen, and that was only at 1080p. This is with a 5700X3D too. I was told it's due to the game being too CPU-heavy, and the engine is pretty old, so I didn't expect a GPU upgrade to help as much as I'd like it to. Now I'm able to play at 1440p, and while frames aren't as high as I think they would be in other games, it's still 80-100+ most of the time, with drops staying at or around 60 FPS, which is a much smoother experience.
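For what it's worth, here is a toy way to think about that (the numbers are made up, not from any benchmark): the FPS you actually see is roughly capped by whichever of the CPU limit or the GPU limit is lower, which is why a GPU upgrade can still help a lot in a "CPU-heavy" game if the old card was the real bottleneck.

```python
# Toy bottleneck model with illustrative numbers only: delivered FPS is roughly
# min(CPU-limited FPS, GPU-limited FPS), and the GPU limit is assumed to scale
# inversely with pixel count.

def delivered_fps(cpu_limit: float, gpu_limit_1080p: float, pixel_scale: float) -> float:
    gpu_limit = gpu_limit_1080p / pixel_scale
    return min(cpu_limit, gpu_limit)

CPU_LIMIT = 70        # hypothetical CPU-bound ceiling in a heavy battle
SCALE_1440P = 1.78    # 2560x1440 has ~1.78x the pixels of 1920x1080

for name, gpu_1080p in [("older mid-range GPU", 60), ("faster GPU", 150)]:
    print(name,
          "| 1080p:", round(delivered_fps(CPU_LIMIT, gpu_1080p, 1.0)),
          "| 1440p:", round(delivered_fps(CPU_LIMIT, gpu_1080p, SCALE_1440P)))
```

In this sketch the slower card is GPU-limited even at 1080p, so the upgrade pays off despite the game's CPU-heavy reputation; the faster card then runs into the CPU ceiling instead.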
 
Joined
Feb 20, 2019
Messages
8,669 (3.99/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Picked one up for around $415 USD, which is around what a 4060 Ti and a 7700 XT are currently going for here in Canada. They named it wrong, as it's more like what the 7700 XT should have been, and the 7700 XT is more like what the 7600 XT should have been, as far as generational uplift goes. While it is a discount versus what the 6800 XT was originally priced at, a $450 price point would have made more sense given the 6800 XT was $650 and that's a four-year-old card.

Having had my 7800XT for almost a year now, I think I can say that the reduced power draw over the 6800XT is welcome and newer games seem to prefer RDNA3 over RDNA2.

Whilst the 7800XT was basically a like-for-like sidegrade with the games tested in 2023, the 2024 benchmark suite puts the 7800XT 11% ahead in RT-enabled titles and 5-6% ahead in non-RT games:

[attachment: 1721925729587.png]


They named it wrong as it’s more like what the 7700xt should have been
I guess the real naming error was the 6800XT and that's what's skewing your perception.

The 6800XT should have been called a vanilla 6900, as it's the same silicon as the 6900XT with the exact same power budget, bandwidth, VRAM, cache and ROP count, and 90% of the shader count. The reduced shader count actually just means that the 6800XT boosts higher at the same power budget, so unless you manually tune the two cards, the difference between them is almost insignificant.

Compared to the "vanilla RX 6900" as I'm calling it, the 7800XT is a full tier lower than the 6800XT in terms of the product stack; AMD just fudged the naming last gen which is why this generation can't be compared against it like for like. This isn't new to AMD either - they did the same thing moving from Radeon HD 5000 to Radeon HD 6000 before the GCN architecture days, and there was definitely some naming/tier shenanigans going on in the early GCN days between 4 generations of the same old Pitcairn silicon from the HD 7000-series all the way to the R7 300-series!
 
Joined
Feb 8, 2017
Messages
271 (0.09/day)
Just picked one up for 450 euros; it's pretty much tied with the 4070 Ti in terms of performance at 1440p and has more VRAM, while being 300 euros cheaper.

I went from the GTX 760 to the GTX 1060 6GB to the RX 5700 XT to the RX 7800 XT. Each decision has been amazing and I've been getting gigantic value out of each of these GPUs. Literally the best value cards at the time.

Almost 150% faster going from the 760 to the 1060 6GB, then around 80% faster from the 1060 to the 5700 XT, and now over 100% faster going from the 5700 XT to the 7800 XT.

I feel sorry for people who went from something like the 2070 to something like the vanilla 4070; it's just so awful, barely any performance gain at very low value.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.55/day)
Location
Ex-usa | slava the trolls
I don't like it, and will never buy it because of the high power consumption.

Must be 2 W:
[attachment: 1721934519056.png]


Must be 9 W:
[attachment: 1721934536011.png]


Must be 10 W:
[attachment: 1721934572908.png]


Must be 175 W:
[attachment: 1721934603311.png]


Must be 175 W (no power spikes at all):
[attachment: 1721934641985.png]


Is Counter-Strike 2 already fixed?

[attachment: 1721934756043.png]
[attachment: 1721934872316.png]
 
Joined
Feb 20, 2019
Messages
8,669 (3.99/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
You're unhappy that it's not as efficient as Nvidia's monolithic products built on a more expensive, smaller TSMC node?

I mean, duh! Of course a chiplet design uses more power, and it's TSMC N5 and N6, not TSMC N4. Power consumption wouldn't match Nvidia's even if AMD had used TSMC N4!

If you want power efficiency, pay the $800 for the 4070Ti, which matches it for raster performance. You save $300 buying a 7800XT, and obviously you don't get like for like: the chiplets will always have higher idle power, and the RT performance isn't at the same level as Nvidia's.
 
Joined
Jan 8, 2017
Messages
9,658 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
(no power spikes at all):
Even your beloved Nvidia cards have power spikes well over their rated TDP, you live in fantasy land.
 
Joined
Jun 7, 2023
Messages
173 (0.28/day)
Location
Holy Roman Empire
System Name Shadowchaser /// Shadowchaser Jr.
Processor AMD Ryzen 7 7700 /// AMD Ryzen 9 5900X
Motherboard Asus ROG Strix X670E-E Gaming /// Asus ROG Strix B550-E Gaming
Cooling Noctua NH-D15S chromax.black /// Thermalright Venomous X
Memory Kingston Fury Beast RGB 6000C30 2x16GB /// G.Skill Trident Z RGB 3200C14 4x8GB
Video Card(s) Zotac GeForce RTX 4080 Trinity 16GB /// Gigabyte G1 GTX 980Ti
Storage Kingston KC3000 2TB + Samsung 970EVO 2TB + 2x WD Red 4TB /// Samsung 970EVO 1TB + WD Red 4TB
Display(s) Gigabyte M27Q /// Dell U2715H
Case Fractal Define 7 Black /// Fractal Define R5 Black
Power Supply Seasonic SS-750KM3 /// Seasonic SS-660KM
Mouse Logitech MX Master 2S /// Logitech MX Master
Software Linux Debian 12 x64 + MS Windows 10 Pro x64
Even your beloved Nvidia cards have power spikes well over their rated TDP, you live in fantasy land.
Right on. Ampere cards built on the el-cheapo, el-crapola Samsung 8nm wafers were the supreme kings of power spikes. I had a Palit GameRock 3090 and it was the ultimate piece of crap card. Sold it a couple of hours after purchase for the same amount.
 
Low quality post by ARF

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.55/day)
Location
Ex-usa | slava the trolls
Even your beloved Nvidia cards have power spikes well over their rated TDP, you live in fantasy land.

Why the trolling? This is your beloved brand, not mine.
Power spikes are like viruses; someone should find a medicine and cure them :D
 
Low quality post by Vya Domus
Joined
Jan 8, 2017
Messages
9,658 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Yeah bro, I am the one trolling, totally not you posting a bunch of nonsense about what "must be" and must not be, as if anyone should even care.
 

JustBuyLianLi

New Member
Joined
Oct 27, 2022
Messages
13 (0.02/day)
Having had my 7800XT for almost a year now, I think I can say that the reduced power draw over the 6800XT is welcome and newer games seem to prefer RDNA3 over RDNA2.

Whilst the 7800XT was basically a like-for-like sidegrade with the games tested in 2023, the 2024 benchmark suite puts the 7800XT 11% ahead in RT-enabled titles and 5-6% ahead in non-RT games:

View attachment 356385


I guess the real naming error was the 6800XT and that's what's skewing your perception.

The 6800XT should have been called a vanilla 6900, as it's the same silicon as the 6900XT with the exact same power budget, bandwidth, VRAM, cache and ROP count, and 90% of the shader count. The reduced shader count actually just means that the 6800XT boosts higher at the same power budget, so unless you manually tune the two cards, the difference between them is almost insignificant.

Compared to the "vanilla RX 6900" as I'm calling it, the 7800XT is a full tier lower than the 6800XT in terms of the product stack; AMD just fudged the naming last gen which is why this generation can't be compared against it like for like. This isn't new to AMD either - they did the same thing moving from Radeon HD 5000 to Radeon HD 6000 before the GCN architecture days, and there was definitely some naming/tier shenanigans going on in the early GCN days between 4 generations of the same old Pitcairn silicon from the HD 7000-series all the way to the R7 300-series!
The 4060 Ti was bashed for good reason, as performance wasn't really better than a 3060 Ti, so aside from being more power efficient and having newer features like AV1 and frame generation, many said it was either a 50-class card or should have been the actual vanilla 4060, while the 4060 should have been a 4050 with the AD107 die. Nvidia naming all the cards a tier or two above where they would sit based on actual generational performance uplift gave them the excuse to price them the same as the previous generation. They almost tried to call the 4070 Ti a 4080, and even after they "fixed" the name it was still overpriced, as it was still more expensive than the previous gen's 80 class.
 
Joined
Aug 13, 2009
Messages
3,382 (0.60/day)
Location
Czech republic
Processor Ryzen 5800X
Motherboard Asus TUF-Gaming B550-Plus
Cooling Noctua NH-U14S
Memory 32GB G.Skill Trident Z Neo F4-3600C16D-32GTZNC
Video Card(s) Sapphire AMD Radeon RX 7900 XTX Nitro+
Storage HP EX950 512GB + Samsung 970 PRO 1TB
Display(s) Cooler Master GP27Q
Case Fractal Design Define R6 Black
Audio Device(s) Creative Sound Blaster AE-5
Power Supply Seasonic PRIME Ultra 650W Gold
Mouse Roccat Kone AIMO Remastered
Software Windows 10 x64
I am thinking about getting a 7800 XT, but is it really incapable of doing above 100 FPS without ray tracing at 1440p? That's pretty disappointing.
I typically like to play games on max settings with anti aliasing disabled (weird habit), and really hoped one of these could easily get me 120+ FPS even in demanding games. 165Hz monitor suddenly feels like a waste of money...
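For context, here is the raw frame-time arithmetic (nothing card- or game-specific, just the budgets those targets imply):

```python
# Frame-time budgets and the relative GPU performance each target implies,
# taking a 100 FPS card as the baseline.
for target_fps in (100, 120, 165):
    budget_ms = 1000 / target_fps
    uplift_vs_100 = target_fps / 100 - 1
    print(f"{target_fps:>3} FPS -> {budget_ms:5.2f} ms per frame, "
          f"{uplift_vs_100:+.0%} performance vs a 100 FPS baseline")
```

A locked 165 FPS needs about 65% more performance than 100 FPS, but a 165 Hz panel with variable refresh (if yours supports it) is still useful well below its ceiling, so it is not necessarily wasted.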
 
Joined
Jan 29, 2021
Messages
1,917 (1.31/day)
Location
Alaska USA
I am thinking about getting a 7800 XT, but is it really incapable of doing above 100 FPS without ray tracing at 1440p? That's pretty disappointing.
I typically like to play games on max settings with anti aliasing disabled (weird habit), and really hoped one of these could easily get me 120+ FPS even in demanding games. 165Hz monitor suddenly feels like a waste of money...
I would wait for the reviews / prices of the new AMD cards.
 
Joined
Feb 24, 2023
Messages
3,528 (4.97/day)
Location
Russian Wild West
System Name D.L.S.S. (Die Lekker Spoed Situasie)
Processor i5-12400F
Motherboard Gigabyte B760M DS3H
Cooling Laminar RM1
Memory 32 GB DDR4-3200
Video Card(s) RX 6700 XT (vandalised)
Storage Yes.
Display(s) MSi G2712
Case Matrexx 55 (slightly vandalised)
Audio Device(s) Yes.
Power Supply Thermaltake 1000 W
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Benchmark Scores My PC can run Crysis. Do I really need more than that?
with anti aliasing disabled
TAA is baked into most recent titles and can sometimes only be disabled by tweaking the configuration files, and games look broken afterwards. 100+ FPS + max settings + 1440p = get something 800+ USD à la 7900 XTX, or one of the upcoming GPUs if they're worth it (which is very, very unlikely in your case). The difference between high and max settings is so subtle that you can squeeze out more FPS by going down a bit. XeSS and FSR, despite being far from optimal, are sometimes a legit way to get another 30 to 50 percent performance.

This translates into me, for example, playing CP2077 at 90-ish FPS at 1440p (SSR at Medium; XeSS at Quality) despite it clearly running at 50 FPS at pure max settings without XeSS involved (6700 XT). The visual loss is significant, yet not deal breaking. This game is one of the worst cases for AMD GPU gaming anyway; DLSS is the only tech implemented somewhat correctly there. Native sucks, XeSS sucks, FSR sucks even more.

100+ FPS 1440p gaming has never been cheap. Getting away with a last-gen mid-ranger demands very significant game settings sacrifices.
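Rough numbers behind that 30-to-50-percent figure (assumptions only: the Quality presets render at about 0.67x per axis, and the real gain depends on how pixel-bound the scene is):

```python
# Estimate, not a measurement: FSR/XeSS Quality renders ~0.67x per axis
# (~45% of the pixels). "efficiency" is how much of the theoretical pixel-count
# speedup actually materialises, since not all frame time scales with resolution.

def estimated_upscaled_fps(native_fps, scale_per_axis=0.67, efficiency=0.4):
    raw_speedup = 1 / scale_per_axis ** 2      # ~2.2x if purely pixel-bound
    return native_fps * (1 + (raw_speedup - 1) * efficiency)

for eff in (0.3, 0.4, 0.75):
    print(f"efficiency {eff:.2f}: 50 FPS native -> "
          f"{estimated_upscaled_fps(50, efficiency=eff):.0f} FPS upscaled")
```

At efficiencies around 0.3-0.4 that is the quoted 30-50%; a heavily pixel-bound scene gains more, which, together with the SSR drop, is roughly the 50-to-90 FPS CP2077 case above.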
 
Joined
Jun 2, 2017
Messages
9,694 (3.46/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
TAA is baked into most recent titles and can sometimes only be disabled by tweaking the configuration files, and games look broken afterwards. 100+ FPS + max settings + 1440p = get something 800+ USD à la 7900 XTX, or one of the upcoming GPUs if they're worth it (which is very, very unlikely in your case). The difference between high and max settings is so subtle that you can squeeze out more FPS by going down a bit. XeSS and FSR, despite being far from optimal, are sometimes a legit way to get another 30 to 50 percent performance.

This translates into me, for example, playing CP2077 at 90-ish FPS at 1440p (SSR at Medium; XeSS at Quality) despite it clearly running at 50 FPS at pure max settings without XeSS involved (6700 XT). The visual loss is significant, yet not deal breaking. This game is one of the worst cases for AMD GPU gaming anyway; DLSS is the only tech implemented somewhat correctly there. Native sucks, XeSS sucks, FSR sucks even more.

100+ FPS 1440p gaming has never been cheap. Getting away with a last-gen mid-ranger demands very significant game settings sacrifices.
You have a very jaded point of view. You talk about gaming like the only things that matter are Nvidia features. Native is not broken, and 1440p is not as hard as you claim. You're also implying that all games are somehow in the same bucket. Is TWWH3 the same as Rogue Trader? Are older titles that came out before DLSS bad to play now because they don't have it? This is the notion that allowed people to browbeat 6800XT owners while they had 3080s. Need I say more? I was watching a live stream today of someone using a 3080, and it was not good seeing 29 FPS in the new Assetto Corsa. I expect the flames to start by the weekend over them having the nerve to release a game without DLSS support, just like 99% of all games created today. Want to bring AAA into the equation? The truth is that when Hogwarts Legacy and Starfield launched, users with 7000-series cards wondered what all the noise was about, and before that there was Avatar. The best one, though, is Cities: Skylines 2, which only supports DLSS, but clicking Hyper RX in AMD's software (or app, if you want to call it that) produced a very enjoyable 1,100,000 population in that game, smooth as butter. What I don't understand is that there are about 8 to 10 of you who love to go on about AMD like their GPUs are garbage.
 
Joined
Feb 24, 2023
Messages
3,528 (4.97/day)
Location
Russian Wild West
System Name D.L.S.S. (Die Lekker Spoed Situasie)
Processor i5-12400F
Motherboard Gigabyte B760M DS3H
Cooling Laminar RM1
Memory 32 GB DDR4-3200
Video Card(s) RX 6700 XT (vandalised)
Storage Yes.
Display(s) MSi G2712
Case Matrexx 55 (slightly vandalised)
Audio Device(s) Yes.
Power Supply Thermaltake 1000 W
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Benchmark Scores My PC can run Crysis. Do I really need more than that?
@kapone32, hard to quote something specific from this, ahem, wall of thought, let's put it that way. However, what I stated/implied is:

1. Cyberpunk 2077 specifically is an NVIDIA oriented title. Just don't bother buying an AMD GPU if this is your main gaming focus.
2. This exact game, once again, doesn't have any good native rendering implementation, it relies on DLSS heavily.
3. I never said these are necessary in gaming in general. In some other games AMD/Intel GPUs massively suck, too, but not in all of them.
4. I played 1440p myself and I know what it takes to max it out at 100+ FPS. Definitely at least two times my 6700 XT's calculating power in the games of 2023+.
5. Hyper RX is just an "I am too lame or the game devs are too lame to enable appropriate FSR mode in game menu myself/themselves" button. Image quality suffers a crap ton because it's enabling FSR1.
6. AMD's behaviour is what is utter rubbish, not their GPUs. GPUs themselves are reasonable. Prices are horrible (not cheap enough to convince. Not nearly enough. If you only see Canadian prices then it's on you, it's not the whole world. Russian prices, for example, are BETTER for NVIDIA video cards), software features are always late and bad quality and marketing is just pure kindergarten.

Please try reading carefully, thoughtfully and remember not a single corporation is your friend. AMD, nVidia, Intel, whatever, they all are poison and venom at the same time. AMD are enablers of this monopoly, that's why I hate them more than I hate nVidia.
 
Joined
Jan 14, 2019
Messages
14,053 (6.35/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
I am thinking about getting a 7800 XT, but is it really incapable of doing above 100 FPS without ray tracing at 1440p? That's pretty disappointing.
I typically like to play games on max settings with anti aliasing disabled (weird habit), and really hoped one of these could easily get me 120+ FPS even in demanding games. 165Hz monitor suddenly feels like a waste of money...
That depends on what you want to play. But like it's been said, I'd wait for the new cards to release and see the reviews.

@kapone32, hard to quote something specific from this, ahem, wall of thought, let's put it that way. However, what I stated/implied is:

1. Cyberpunk 2077 specifically is an NVIDIA oriented title. Just don't bother buying an AMD GPU if this is your main gaming focus.
2. This exact game, once again, doesn't have any good native rendering implementation, it relies on DLSS heavily.
3. I never said these are necessary in gaming in general. In some other games AMD/Intel GPUs massively suck, too, but not in all of them.
4. I played 1440p myself and I know what it takes to max it out at 100+ FPS. Definitely at least two times my 6700 XT's calculating power in the games of 2023+.
5. Hyper RX is just an "I am too lame or the game devs are too lame to enable appropriate FSR mode in game menu myself/themselves" button. Image quality suffers a crap ton because it's enabling FSR1.
6. AMD's behaviour is what is utter rubbish, not their GPUs. GPUs themselves are reasonable. Prices are horrible (not cheap enough to convince. Not nearly enough. If you only see Canadian prices then it's on you, it's not the whole world. Russian prices, for example, are BETTER for NVIDIA video cards), software features are always late and bad quality and marketing is just pure kindergarten.

Please try reading carefully, thoughtfully and remember not a single corporation is your friend. AMD, nVidia, Intel, whatever, they all are poison and venom at the same time. AMD are enablers of this monopoly, that's why I hate them more than I hate nVidia.
That all is deeply into personal opinion territory. You can't sell any of it as fact.
 
Joined
Jun 2, 2017
Messages
9,694 (3.46/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
@kapone32, hard to quote something specific from this, ahem, wall of thought, let's put it that way. However, what I stated/implied is:

1. Cyberpunk 2077 specifically is an NVIDIA oriented title. Just don't bother buying an AMD GPU if this is your main gaming focus.
2. This exact game, once again, doesn't have any good native rendering implementation, it relies on DLSS heavily.
3. I never said these are necessary in gaming in general. In some other games AMD/Intel GPUs massively suck, too, but not in all of them.
4. I played 1440p myself and I know what it takes to max it out at 100+ FPS. Definitely at least two times my 6700 XT's calculating power in the games of 2023+.
5. Hyper RX is just an "I am too lame or the game devs are too lame to enable appropriate FSR mode in game menu myself/themselves" button. Image quality suffers a crap ton because it's enabling FSR1.
6. AMD's behaviour is what is utter rubbish, not their GPUs. GPUs themselves are reasonable. Prices are horrible (not cheap enough to convince. Not nearly enough. If you only see Canadian prices then it's on you, it's not the whole world. Russian prices, for example, are BETTER for NVIDIA video cards), software features are always late and bad quality and marketing is just pure kindergarten.

Please try reading carefully, thoughtfully and remember not a single corporation is your friend. AMD, nVidia, Intel, whatever, they all are poison and venom at the same time. AMD are enablers of this monopoly, that's why I hate them more than I hate nVidia.


1. While Nvidia does have a relationship with CP2077, using an AMD card is not a problem.
2. While I am getting 165 FPS at 4k native high
3. Show me a game where AMD GPUs absolutely suck; my 7900XT has no problem with my entire library.
4. I had 1440p back in 2011 (QNIX).

6. Where I live a 7900XTX is about $1300 while a 4090 is minimum $2900. Only someone who needs CUDA would pay double for a 15% increase in performance. I have been in this game for a long time, so 4K 144Hz is plenty fine, and yes, games have deeper colours on my 7900XT vs my 3060 laptop. Even my measly 8600G-based HTPC produces a richer picture than my gaming laptop.

I have also been around long enough to have seen Nvidia do desultory things, even to people who would help their bottom line, and that is the reason I went to AMD. The truth is that since the original 6800 1GB, AMD have had better price/performance, but the narrative has been all Nvidia. Whether it was the Super Bowl or the World Series, or even the Triple Crown and the Masters, there were Nvidia "The way it's meant to be played" commercials to create the sheep culture. I was like you when I built my first PC: it had a GTS 450 and I used to drool over the 8800, but I found the 6800, then the 7950, then Vega, and the 7900XT is perfect for me and my library of Humble, GOG, Fanatical, Steam, even Epic games. I have been a gamer since I stopped at the arcade at the bowling alley every day in Grade 6 on the way home from school. You would not understand the hook of OutRun on the Atari, because there is no DLSS. Even TOCA Racing on the PS was progress, but AMS2 looks great and plays at 300+ FPS at 4K.
 
Joined
Feb 24, 2023
Messages
3,528 (4.97/day)
Location
Russian Wild West
System Name D.L.S.S. (Die Lekker Spoed Situasie)
Processor i5-12400F
Motherboard Gigabyte B760M DS3H
Cooling Laminar RM1
Memory 32 GB DDR4-3200
Video Card(s) RX 6700 XT (vandalised)
Storage Yes.
Display(s) MSi G2712
Case Matrexx 55 (slightly vandalised)
Audio Device(s) Yes.
Power Supply Thermaltake 1000 W
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Benchmark Scores My PC can run Crysis. Do I really need more than that?
That all is deeply into personal opinion territory. You can't sell any of it as fact.
Another one not even trying to read. I put zero opinion in any of that.
2. While I am getting 165 FPS at 4k native high
In Cyberpunk? On a 7900 XT? That's impossible on ANY GPU; even a 5090 won't manage it, it's going to be hard stuck at double digits. If you have Hyper RX enabled then it's not native, you have massive FSR included. Hello, fellow 720p gamer!
3. Show me a game where AMD GPUs absolutely suck; my 7900XT has no problem with my entire library.
Literally any title where ray tracing is the only available option clearly shows AMD GPUs trailing behind their similarly priced Team Green counterparts, especially at the highest end, where the VRAM amount is redundant on whichever model. Avatar, for example, and SH2 IIRC. It's not strictly "absolutely sucks", but it's still bad.
7900XTX is about $1300 while a 4090 is minimum $2900
Do you realise you're comparing a GPU that has a competitor to one that doesn't? Compare it to a 4070 Ti, or maybe the Super version if we get a 7900 XTX that overclocks well enough; that's where the 7900 XTX lies if we take RT seriously. Or a 4080 if we don't. The 4090 is in a league of its own; it's allowed any price lower than whatever NV has in stock for even more advanced customers, like the RTX 6000 GPUs.
Games have deeper colours on my 7900XT vs my 3060 laptop
Placebo effect. Colours are identical if we're talking identical displays and identical, properly working cables. If you mean the "Vivid gaming" option in the Adrenalin app, it's arguably useful, but I've never found those oversaturated colours appealing. Just calibrate your monitor properly and run games at default colours, unless you're colour blind.
 
Top