System Name | "Icy Resurrection" |
---|---|
Processor | 13th Gen Intel Core i9-13900KS |
Motherboard | ASUS ROG Maximus Z790 Apex Encore |
Cooling | Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM |
Memory | 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V |
Video Card(s) | NVIDIA RTX A2000 |
Storage | 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD |
Display(s) | 55-inch LG G3 OLED |
Case | Pichau Mancer CV500 White Edition |
Power Supply | EVGA 1300 G2 1.3kW 80+ Gold |
Mouse | Microsoft Classic IntelliMouse (2017) |
Keyboard | IBM Model M type 1391405 |
Software | Windows 10 Pro 22H2 |
Benchmark Scores | I pulled a Qiqi~ |
On the The Witcher 3 page it says, "There's too many issues with the Enhanced Edition right now, so we'll be testing the original a little longer."
Is that regarding the update that introduced RT and FSR/DLSS?
Processor | Various Intel and AMD CPUs |
---|---|
Motherboard | Micro-ATX and mini-ITX |
Cooling | Yes |
Memory | Overclocking is overrated |
Video Card(s) | Various Nvidia and AMD GPUs |
Storage | A lot |
Display(s) | Monitors and TVs |
Case | The smaller the better |
Audio Device(s) | Speakers and headphones |
Power Supply | 300 to 750 W, bronze to gold |
Mouse | Wireless |
Keyboard | Mechanical |
VR HMD | Not yet |
Software | Linux gaming master race |
> not even the stock settings are stable under all conditions

Then something must be wrong with the card. RMA, maybe?
Processor | Various Intel and AMD CPUs |
---|---|
Motherboard | Micro-ATX and mini-ITX |
Cooling | Yes |
Memory | Overclocking is overrated |
Video Card(s) | Various Nvidia and AMD GPUs |
Storage | A lot |
Display(s) | Monitors and TVs |
Case | The smaller the better |
Audio Device(s) | Speakers and headphones |
Power Supply | 300 to 750 W, bronze to gold |
Mouse | Wireless |
Keyboard | Mechanical |
VR HMD | Not yet |
Software | Linux gaming master race |
> how noisy is the 7800xt Sapphire Pulse?

As long as your case has good airflow, it's whisper quiet.
Processor | Ryzen 9 3950X |
---|---|
Motherboard | ASRock X570 Taichi |
Cooling | Coolermaster Master Liquid ML240L RGB |
Memory | 64GB DDR4 3200 (4x16GB) |
Video Card(s) | RTX 3050 |
Storage | Samsung 2TB SSD |
Display(s) | Asus VE276Q, VE278Q and VK278Q triple 27” 1920x1080 |
Case | Zalman MS800 |
Audio Device(s) | On Board |
Power Supply | Seasonic 650W |
VR HMD | Oculus Rift, Oculus Quest V1, Oculus Quest 2 |
Software | Windows 11 64bit |
System Name | Bragging Rights |
---|---|
Processor | Atom Z3735F 1.33GHz |
Motherboard | It has no markings but it's green |
Cooling | No, it's a 2.2W processor |
Memory | 2GB DDR3L-1333 |
Video Card(s) | Gen7 Intel HD (4EU @ 311MHz) |
Storage | 32GB eMMC and 128GB Sandisk Extreme U3 |
Display(s) | 10" IPS 1280x800 60Hz |
Case | Veddha T2 |
Audio Device(s) | Apparently, yes |
Power Supply | Samsung 18W 5V fast-charger |
Mouse | MX Anywhere 2 |
Keyboard | Logitech MX Keys (not Cherry MX at all) |
VR HMD | Samsung Odyssey, not that I'd plug it into this though... |
Software | W10 21H1, barely |
Benchmark Scores | I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000. |
Picked one up for around $415 USD, which is around what a 4060 Ti and a 7700 XT are currently going for here in Canada. They named it wrong, as it's more like what the 7700 XT should have been (and the 7700 XT what the 7600 XT should have been) as far as generational uplift goes. While it is a discount versus what the 6800 XT originally cost, a $450 price point would have made more sense given the 6800 XT was $650 and that's a four-year-old card.
> They named it wrong, as it's more like what the 7700 XT should have been

I guess the real naming error was the 6800XT and that's what's skewing your perception.
System Name | Bragging Rights |
---|---|
Processor | Atom Z3735F 1.33GHz |
Motherboard | It has no markings but it's green |
Cooling | No, it's a 2.2W processor |
Memory | 2GB DDR3L-1333 |
Video Card(s) | Gen7 Intel HD (4EU @ 311MHz) |
Storage | 32GB eMMC and 128GB Sandisk Extreme U3 |
Display(s) | 10" IPS 1280x800 60Hz |
Case | Veddha T2 |
Audio Device(s) | Apparently, yes |
Power Supply | Samsung 18W 5V fast-charger |
Mouse | MX Anywhere 2 |
Keyboard | Logitech MX Keys (not Cherry MX at all) |
VR HMD | Samsung Odyssey, not that I'd plug it into this though... |
Software | W10 21H1, barely |
Benchmark Scores | I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000. |
System Name | Good enough |
---|---|
Processor | AMD Ryzen 9 7900 - Alphacool Eisblock XPX Aurora Edge |
Motherboard | ASRock B650 Pro RS |
Cooling | 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30 |
Memory | 32GB - FURY Beast RGB 5600 MHz |
Video Card(s) | Sapphire RX 7900 XT - Alphacool Eisblock Aurora |
Storage | 1x Kingston KC3000 1TB, 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB, 1x Samsung 860 EVO 500GB |
Display(s) | LG UltraGear 32GN650-B + 4K Samsung TV |
Case | Phanteks NV7 |
Power Supply | GPS-750C |
> (no power spikes at all):

Even your beloved Nvidia cards have power spikes well over their rated TDP, you live in fantasy land.
System Name | Shadowchaser /// Shadowchaser Jr. |
---|---|
Processor | AMD Ryzen 7 7700 /// AMD Ryzen 9 5900X |
Motherboard | Asus ROG Strix X670E-E Gaming /// Asus ROG Strix B550-E Gaming |
Cooling | Noctua NH-D15S chromax.black /// Thermalright Venomous X |
Memory | Kingston Fury Beast RGB 6000C30 2x16GB /// G.Skill Trident Z RGB 3200C14 4x8GB |
Video Card(s) | Zotac GeForce RTX 4080 Trinity 16GB /// Gigabyte G1 GTX 980Ti |
Storage | Kingston KC3000 2TB + Samsung 970EVO 2TB + 2x WD Red 4TB /// Samsung 970EVO 1TB + WD Red 4TB |
Display(s) | Gigabyte M27Q /// Dell U2715H |
Case | Fractal Define 7 Black /// Fractal Define R5 Black |
Power Supply | Seasonic SS-750KM3 /// Seasonic SS-660KM |
Mouse | Logitech MX Master 2S /// Logitech MX Master |
Software | Linux Debian 12 x64 + MS Windows 10 Pro x64 |
> Even your beloved Nvidia cards have power spikes well over their rated TDP, you live in fantasy land.

Right on. Ampere cards, built on Samsung's el-cheapo, el-crapola 8 nm wafer, were the supreme kings of power spikes. I had a Palit GameRock 3090 and it was the ultimate piece of crap card. Sold it a couple of hours after purchase for the same amount.
System Name | Good enough |
---|---|
Processor | AMD Ryzen 9 7900 - Alphacool Eisblock XPX Aurora Edge |
Motherboard | ASRock B650 Pro RS |
Cooling | 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30 |
Memory | 32GB - FURY Beast RGB 5600 MHz |
Video Card(s) | Sapphire RX 7900 XT - Alphacool Eisblock Aurora |
Storage | 1x Kingston KC3000 1TB, 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB, 1x Samsung 860 EVO 500GB |
Display(s) | LG UltraGear 32GN650-B + 4K Samsung TV |
Case | Phanteks NV7 |
Power Supply | GPS-750C |
> Having had my 7800XT for almost a year now, I think I can say that the reduced power draw over the 6800XT is welcome and newer games seem to prefer RDNA3 over RDNA2.
> Whilst the 7800XT was basically a like-for-like sidegrade with the games tested in 2023, the 2024 benchmark suite puts the 7800XT 11% ahead in RT-enabled titles and 5-6% ahead in non-RT games:
> View attachment 356385
> I guess the real naming error was the 6800XT and that's what's skewing your perception.
> The 6800XT should have been called a vanilla 6900, as it's the same silicon as the 6900XT with the exact same power budget, bandwidth, VRAM, cache and ROP count, and 90% of the shader count. The reduced shader count actually just means that the 6800XT boosts higher at the same power budget, so unless you manually tune the two cards, the difference between them is almost insignificant.
> Compared to the "vanilla RX 6900", as I'm calling it, the 7800XT is a full tier lower than the 6800XT in the product stack; AMD just fudged the naming last gen, which is why this generation can't be compared against it like for like. This isn't new to AMD either - they did the same thing moving from Radeon HD 5000 to Radeon HD 6000 before the GCN architecture days, and there was definitely some naming/tier shenanigans going on in the early GCN days with four generations of the same old Pitcairn silicon from the HD 7000 series all the way to the R7 300 series!

The 4060 Ti was bashed for good reason: performance wasn't really better than a 3060 Ti's, so aside from being more power efficient and having newer features like AV1 and frame generation, many said it was either 50-class or should have been the actual vanilla 4060, and that the 4060 should have been a 4050 given its AD107 die. Naming all the cards a tier or two above where they would sit if there were an actual generational performance uplift gave Nvidia the excuse to price them the same as the previous generation. They even tried to call the 4070 Ti a 4080, and after they "fixed" the name it was still overpriced, still more expensive than the previous gen's 80-class.
Processor | Ryzen 5800X |
---|---|
Motherboard | Asus TUF-Gaming B550-Plus |
Cooling | Noctua NH-U14S |
Memory | 32GB G.Skill Trident Z Neo F4-3600C16D-32GTZNC |
Video Card(s) | Sapphire AMD Radeon RX 7900 XTX Nitro+ |
Storage | HP EX950 512GB + Samsung 970 PRO 1TB |
Display(s) | Cooler Master GP27Q |
Case | Fractal Design Define R6 Black |
Audio Device(s) | Creative Sound Blaster AE-5 |
Power Supply | Seasonic PRIME Ultra 650W Gold |
Mouse | Roccat Kone AIMO Remastered |
Software | Windows 10 x64 |
> I am thinking about getting a 7800 XT, but is it really incapable of doing above 100 FPS without ray tracing at 1440p? That's pretty disappointing.
> I typically like to play games on max settings with anti aliasing disabled (weird habit), and really hoped one of these could easily get me 120+ FPS even in demanding games. 165Hz monitor suddenly feels like a waste of money...

I would wait for the reviews / prices of the new AMD cards.
System Name | D.L.S.S. (Die Lekker Spoed Situasie) |
---|---|
Processor | i5-12400F |
Motherboard | Gigabyte B760M DS3H |
Cooling | Laminar RM1 |
Memory | 32 GB DDR4-3200 |
Video Card(s) | RX 6700 XT (vandalised) |
Storage | Yes. |
Display(s) | MSi G2712 |
Case | Matrexx 55 (slightly vandalised) |
Audio Device(s) | Yes. |
Power Supply | Thermaltake 1000 W |
Mouse | Don't disturb, cheese eating in progress... |
Keyboard | Makes some noise. Probably onto something. |
VR HMD | I live in real reality and don't need a virtual one. |
Software | Windows 11 / 10 / 8 |
Benchmark Scores | My PC can run Crysis. Do I really need more than that? |
> with anti aliasing disabled

TAA is baked into most recent titles and can sometimes only be disabled by tweaking the configuration files. Games look broken afterwards. 100+ FPS + max settings + 1440p = get something 800+ USD à la 7900 XTX, or upcoming GPUs if they're worth it (which is very, very unlikely in your case). The difference between high and max settings is so subtle you can squeeze more FPS by going down a bit. XeSS and FSR, despite being far from optimal, are sometimes a legit way to get another 30 to 50 percent performance.
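To put a rough number on that 30-to-50-percent claim: temporal upscalers get their speedup by rendering internally below the output resolution and reconstructing up to it. A quick sketch, assuming the per-axis scale factors FSR 2 and XeSS commonly document for their presets (1.5x Quality, 1.7x Balanced, 2.0x Performance); the actual FPS gain varies per game and depends on how GPU-bound it is:

```python
# Rough sketch: internal render resolution and pixel-load reduction for a
# 2560x1440 output, assuming the commonly documented per-axis scale factors
# for FSR 2 / XeSS presets. Real-world FPS gains are smaller than the pixel
# savings because not all GPU work scales with resolution.
OUTPUT_W, OUTPUT_H = 2560, 1440

PRESETS = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

for name, factor in PRESETS.items():
    render_w = round(OUTPUT_W / factor)
    render_h = round(OUTPUT_H / factor)
    pixel_share = (render_w * render_h) / (OUTPUT_W * OUTPUT_H)
    print(f"{name:>11}: renders at {render_w}x{render_h} "
          f"(~{pixel_share:.0%} of the native pixel load)")
```

Quality mode working on roughly 44% of the native pixels is why a 30-to-50-percent uplift is realistic rather than a straight doubling.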
System Name | Best AMD Computer |
---|---|
Processor | AMD 7900X3D |
Motherboard | Asus X670E E Strix |
Cooling | In Win SR36 |
Memory | GSKILL DDR5 32GB 5200 30 |
Video Card(s) | Sapphire Pulse 7900XT (Watercooled) |
Storage | Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500 |
Display(s) | GIGABYTE FV43U |
Case | Corsair 7000D Airflow |
Audio Device(s) | Corsair Void Pro, Logitech Z523 5.1 |
Power Supply | Deepcool 1000M |
Mouse | Logitech g7 gaming mouse |
Keyboard | Logitech G510 |
Software | Windows 11 Pro 64-bit; Steam, GOG, Uplay, Origin |
Benchmark Scores | Firestrike: 46183 Time Spy: 25121 |
> TAA is baked into most recent titles and can sometimes only be disabled by tweaking the configuration files. Games look broken afterwards. 100+ FPS + max settings + 1440p = get something 800+ USD à la 7900 XTX, or upcoming GPUs if they're worth it (which is very, very unlikely in your case). The difference between high and max settings is so subtle you can squeeze more FPS by going down a bit. XeSS and FSR, despite being far from optimal, are sometimes a legit way to get another 30 to 50 percent performance.
> This translates into me, for example, playing CP2077 at 90ish FPS at 1440p (SSR at Medium; XeSS at Quality) despite it clearly running at 50 FPS at pure max settings without XeSS involved (6700 XT). The visual loss is significant, yet not deal-breaking. This game is one of the worst cases for AMD GPU gaming anyway; DLSS is the only tech implemented somewhat correctly there. Native sucks, XeSS sucks, FSR sucks even more.
> 100+ FPS 1440p gaming has never been cheap. Getting away with a last-gen mid-ranger demands very significant game-settings sacrifices.

You have a very jaded point of view. You talk about gaming like the only things that matter are Nvidia features. Native is not broken, and 1440p is not as hard as you claim. You're also implying that all games are somehow in the same bucket. Is TWWH3 the same as Rogue Trader? Are older titles that came before DLSS bad to play now because they don't have it? This is the notion that let people with 3080s browbeat 6800XT owners. Need I say more? I was watching a live stream today of someone using a 3080 and it was not good seeing 29 FPS in the new Assetto Corsa. I expect the flames to start by the weekend that they had the nerve to release a game without DLSS support, just like 99% of all games created today. Want to bring AAA into the equation? When Hogwarts Legacy and Starfield launched, users with 7000-series cards wondered what all the noise was about. Before that there was Avatar. The best, though, is Cities: Skylines 2, which only supports DLSS, yet clicking Hyper RX in AMD's software (or app, if you want to call it that) produced a very enjoyable 1,100,000 population in that game, smooth as butter. What I don't understand is why there are about 8 to 10 of you who love to wax on about AMD like their GPUs are garbage.
System Name | D.L.S.S. (Die Lekker Spoed Situasie) |
---|---|
Processor | i5-12400F |
Motherboard | Gigabyte B760M DS3H |
Cooling | Laminar RM1 |
Memory | 32 GB DDR4-3200 |
Video Card(s) | RX 6700 XT (vandalised) |
Storage | Yes. |
Display(s) | MSi G2712 |
Case | Matrexx 55 (slightly vandalised) |
Audio Device(s) | Yes. |
Power Supply | Thermaltake 1000 W |
Mouse | Don't disturb, cheese eating in progress... |
Keyboard | Makes some noise. Probably onto something. |
VR HMD | I live in real reality and don't need a virtual one. |
Software | Windows 11 / 10 / 8 |
Benchmark Scores | My PC can run Crysis. Do I really need more than that? |
Processor | Various Intel and AMD CPUs |
---|---|
Motherboard | Micro-ATX and mini-ITX |
Cooling | Yes |
Memory | Overclocking is overrated |
Video Card(s) | Various Nvidia and AMD GPUs |
Storage | A lot |
Display(s) | Monitors and TVs |
Case | The smaller the better |
Audio Device(s) | Speakers and headphones |
Power Supply | 300 to 750 W, bronze to gold |
Mouse | Wireless |
Keyboard | Mechanical |
VR HMD | Not yet |
Software | Linux gaming master race |
> I am thinking about getting a 7800 XT, but is it really incapable of doing above 100 FPS without ray tracing at 1440p? That's pretty disappointing.
> I typically like to play games on max settings with anti aliasing disabled (weird habit), and really hoped one of these could easily get me 120+ FPS even in demanding games. 165Hz monitor suddenly feels like a waste of money...

That depends on what you want to play. But like it's been said, I'd wait for the new cards to release and see the reviews.
> @kapone32, hard to quote something specific from this, ahem, wall of thought, let's put it that way. However, what I stated/implied is:
> 1. Cyberpunk 2077 specifically is an NVIDIA oriented title. Just don't bother buying an AMD GPU if this is your main gaming focus.
> 2. This exact game, once again, doesn't have any good native rendering implementation, it relies on DLSS heavily.
> 3. I never said these are necessary in gaming in general. In some other games AMD/Intel GPUs massively suck, too, but not in all of them.
> 4. I played 1440p myself and I know what it takes to max it out at 100+ FPS. Definitely at least two times my 6700 XT's calculating power in the games of 2023+.
> 5. Hyper RX is just an "I am too lame or the game devs are too lame to enable appropriate FSR mode in game menu myself/themselves" button. Image quality suffers a crap ton because it's enabling FSR1.
> 6. AMD's behaviour is what is utter rubbish, not their GPUs. GPUs themselves are reasonable. Prices are horrible (not cheap enough to convince. Not nearly enough. If you only see Canadian prices then it's on you, it's not the whole world. Russian prices, for example, are BETTER for NVIDIA video cards), software features are always late and bad quality and marketing is just pure kindergarten.
> Please try reading carefully, thoughtfully and remember not a single corporation is your friend. AMD, nVidia, Intel, whatever, they all are poison and venom at the same time. AMD are enablers of this monopoly, that's why I hate them more than I hate nVidia.

That all is deeply into personal opinion territory. You can't sell any of it as fact.
System Name | Best AMD Computer |
---|---|
Processor | AMD 7900X3D |
Motherboard | Asus X670E E Strix |
Cooling | In Win SR36 |
Memory | GSKILL DDR5 32GB 5200 30 |
Video Card(s) | Sapphire Pulse 7900XT (Watercooled) |
Storage | Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500 |
Display(s) | GIGABYTE FV43U |
Case | Corsair 7000D Airflow |
Audio Device(s) | Corsair Void Pro, Logitech Z523 5.1 |
Power Supply | Deepcool 1000M |
Mouse | Logitech g7 gaming mouse |
Keyboard | Logitech G510 |
Software | Windows 11 Pro 64-bit; Steam, GOG, Uplay, Origin |
Benchmark Scores | Firestrike: 46183 Time Spy: 25121 |
> @kapone32, hard to quote something specific from this, ahem, wall of thought, let's put it that way. However, what I stated/implied is:
> 1. Cyberpunk 2077 specifically is an NVIDIA oriented title. Just don't bother buying an AMD GPU if this is your main gaming focus.
> 2. This exact game, once again, doesn't have any good native rendering implementation, it relies on DLSS heavily.
> 3. I never said these are necessary in gaming in general. In some other games AMD/Intel GPUs massively suck, too, but not in all of them.
> 4. I played 1440p myself and I know what it takes to max it out at 100+ FPS. Definitely at least two times my 6700 XT's calculating power in the games of 2023+.
> 5. Hyper RX is just an "I am too lame or the game devs are too lame to enable appropriate FSR mode in game menu myself/themselves" button. Image quality suffers a crap ton because it's enabling FSR1.
> 6. AMD's behaviour is what is utter rubbish, not their GPUs. GPUs themselves are reasonable. Prices are horrible (not cheap enough to convince. Not nearly enough. If you only see Canadian prices then it's on you, it's not the whole world. Russian prices, for example, are BETTER for NVIDIA video cards), software features are always late and bad quality and marketing is just pure kindergarten.
> Please try reading carefully, thoughtfully and remember not a single corporation is your friend. AMD, nVidia, Intel, whatever, they all are poison and venom at the same time. AMD are enablers of this monopoly, that's why I hate them more than I hate nVidia.
System Name | D.L.S.S. (Die Lekker Spoed Situasie) |
---|---|
Processor | i5-12400F |
Motherboard | Gigabyte B760M DS3H |
Cooling | Laminar RM1 |
Memory | 32 GB DDR4-3200 |
Video Card(s) | RX 6700 XT (vandalised) |
Storage | Yes. |
Display(s) | MSi G2712 |
Case | Matrexx 55 (slightly vandalised) |
Audio Device(s) | Yes. |
Power Supply | Thermaltake 1000 W |
Mouse | Don't disturb, cheese eating in progress... |
Keyboard | Makes some noise. Probably onto something. |
VR HMD | I live in real reality and don't need a virtual one. |
Software | Windows 11 / 10 / 8 |
Benchmark Scores | My PC can run Crysis. Do I really need more than that? |
> That all is deeply into personal opinion territory. You can't sell any of it as fact.

Another one not even trying to read. I put zero opinion in any of that.
> 2. While I am getting 165 FPS at 4k native high

In Cyberpunk? On a 7900 XT? It's impossible on ANY GPU. Even a 5090 can't do that; it's gonna be hard stuck at double digits. If you have Hyper RX enabled then it's not native, you have massive FSR included. Hello, fellow 720p gamer!
> 3. Show me a Game where AMD GPUs absolutely suck my 7900XT ha no problem with my entire library

Literally any title where ray tracing is the only available option clearly shows that AMD GPUs trail behind their Team Green pricesakes. Especially at the highest end, where the VRAM amount is redundant on whatever model. Avatar, for example, and SH2, IIRC. It's not strictly "absolutely sucks", but still bad.
> 7900XTX is about $1300 while a 4090 is minimum $2900

Do you realise you're comparing a GPU that has a competitor to a GPU that doesn't? Compare it to a 4070 Ti, or maybe a Super if we've got a 7900 XTX that overclocks well enough; that's where the 7900 XTX lies if we take RT seriously. Or a 4080 if we don't. The 4090 is in a league of its own; it's allowed ANY price that's lower than whatever NV has in stock for even more advanced customers, like those RTX 6000 GPUs.
> Games have deeper colours on my 7900XT vs my 3060 laptop

Placebo effect. Colours are identical if we're talking identical displays and identical, properly working cables. If you mean the "Vivid Gaming" thing from the Adrenalin app, it's arguably useful; I never found those oversaturated colours appealing, though. Just calibrate your monitor properly and run games at default colours, unless you're colour-blind.