
AMD Radeon RX 7600 Slides Down to $249

Joined
Dec 25, 2020
Messages
7,065 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Actually, you can play some games with ray tracing on those too at really low resolutions, which are more tolerable on a handheld, I suppose, but the point is that performance is so poor it isn't much of a selling point.

In my 3050M's case, it's not even that the hardware's performance is inadequate; targeting 1080p with DLSS and medium settings, you're going to have a decent time... or you would, if the 4 GB of VRAM didn't get in the way. Nvidia is devious like that: even their low-end hardware is designed to be a gateway drug to get people to buy their higher-end stuff.
 
Joined
Jun 14, 2020
Messages
3,550 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
FSR doesn't suck. In my opinion, it's pretty much equal to DLSS in its current state. DLSS 1 also sucked, by the way, so there's that. The only other Nvidia-exclusive feature is DLSS 3 FG, which you won't enjoy on a mid-range card due to the latency, and which is pretty much pointless on a high-end one due to the already high framerates. It only exists for Nvidia to win on review charts.

If you think it's worth the extra money, by all means, buy into the "ecosystem" (whatever that word means here), but I really think it isn't.
Why won't you enjoy it on a midrange card? Say I get 70 FPS and FG gets me up to 100 or 120 or whatever, how is that a problem?
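For a rough sense of the numbers being argued here, a back-of-the-envelope sketch (the FG cost factor is an assumption for illustration, not a measured figure):

```python
# Back-of-the-envelope: what frame generation (FG) does to smoothness vs. latency.
# Assumptions: FG roughly doubles presented frames, but the FG pass itself costs
# some GPU time, so the underlying render rate drops, and input is still sampled
# at that underlying rate.

def ms(fps: float) -> float:
    return 1000.0 / fps

base_fps = 70.0          # native render rate without FG
fg_cost_factor = 0.85    # assumed: FG pass eats ~15% of render throughput

render_fps = base_fps * fg_cost_factor   # ~60 fps actually rendered
presented_fps = render_fps * 2           # ~120 fps shown on screen

print(f"Without FG: {base_fps:.0f} fps shown, ~{ms(base_fps):.1f} ms per input sample")
print(f"With FG:    {presented_fps:.0f} fps shown, ~{ms(render_fps):.1f} ms per input sample")
```

Under these assumptions, motion looks roughly 120 fps smooth while input responsiveness stays at roughly the underlying 60 fps rate; whether that trade is worth it is the actual disagreement here.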

In my 3050M's case, it's not even that the hardware's performance is inadequate; targeting 1080p with DLSS and medium settings, you're going to have a decent time... or you would, if the 4 GB of VRAM didn't get in the way. Nvidia is devious like that: even their low-end hardware is designed to be a gateway drug to get people to buy their higher-end stuff.
The good thing with laptops is that you can drop the resolution or use upscaling easily; the screen is so small that the PPI will still be insanely high.
 
Joined
Dec 25, 2020
Messages
7,065 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
The good thing with laptops is that you can drop the resolution or use upscaling easily; the screen is so small that the PPI will still be insanely high.

If you have a high-end panel, that is. My laptop has a basic 1080p 120 Hz panel; at 15.6 inches, it looks pretty much like any ol' entry-level monitor.
 
Joined
Feb 20, 2019
Messages
8,343 (3.90/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
The price it always should have been, given what it's up against in the $180-250 range.

Finally, at $249 it's competitive with the $229 6650XT cards you can still find new on store shelves.
 
Joined
Jul 13, 2016
Messages
3,354 (1.09/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
By that I don't mean ray tracing, but all of the Nvidia-exclusive features that they've developed over the years. Successfully, that is.

Since most of AMD's open source equivalents either flopped (weren't adopted) or suck (FSR)

Vulkan is based on AMD's Mantle.
FreeSync is the industry standard while G-Sync is niche.

Surely you jest. Do you know how terrible PC gaming would be if things like PhysX had become the standard instead of being made open source? CUDA is a good example of what happens when an Nvidia standard wins: zero vendor choice.

And please, enough with the exaggeration on FSR. FSR is not that far off from DLSS. If FSR sucks as you say, then so does DLSS. That's if your opinion is consistent. Why must internet comments always needlessly exaggerate?
 
Joined
Jun 14, 2020
Messages
3,550 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
If you have a high-end panel, that is. My laptop has a basic 1080p 120 Hz panel; at 15.6 inches, it looks pretty much like any ol' entry-level monitor.
Ah, mine is 14" 1440p. Looks extra sharp even at 1080p so I have no issue playing with the igpu with FSR on. The panel is so small that even FSR looks good on it :roll:
 
Joined
Dec 25, 2020
Messages
7,065 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Vulkan is based on AMD's Mantle.
FreeSync is the industry standard while G-Sync is niche.

Surely you jest. Do you know how terrible PC gaming would be if things like PhysX had become the standard instead of being made open source? CUDA is a good example of what happens when an Nvidia standard wins: zero vendor choice.

And please, enough with the exaggeration on FSR. FSR is not that far off from DLSS. If FSR sucks as you say, then so does DLSS. That's if your opinion is consistent. Why must internet comments always needlessly exaggerate?

1. Vulkan's only positive is that it's an open standard that works on operating systems other than Windows. Under Windows, DirectX 12 produces consistently better results.
2. Hardware G-Sync monitors still tend to outperform FreeSync/VESA AdaptiveSync monitors in general, though this is irrelevant in the current context, as compatibility is universal.
3. CUDA gained its foothold because AMD failed to offer a competing solution when it mattered, and it's too late to do so now.
4. There is no reason for anyone with the ability to use DLSS or XeSS to use FSR. Its greatest strength is that it is hardware agnostic, but NVIDIA cards can do all three and Radeons can do XeSS.


Speaking of FSR 2.2 image quality, the FSR 2.2 implementation comes with noticeable compromises in image quality in favor of performance in most sequences of the game. We spotted excessive shimmering and flickering in motion on vegetation, tree leaves and thin steel objects, which might be quite distracting for some people. While the amount of shimmering is less pronounced in comparison to the average FSR 2.1 implementation, shimmering is clearly more visible than in either the in-game native TAA or DLSS image output. There are also quite noticeable shimmering issues on weapon scopes, which glow brightly and blink in motion, especially at lower resolutions. The second-most-noticeable difference in the FSR 2.2 implementation compared to the in-game TAA or DLSS solution is a softer and less detailed overall image, which is especially visible with grass and vegetation in general.


...the second-most-noticeable issue in both DLSS and FSR 2.2 is the ghosting around your character's head, and it is especially visible at lower resolutions such as 1080p Quality mode. Also, the FSR 2.2 implementation has shimmering in motion on vegetation and tree leaves; however, the amount of shimmering is less pronounced in comparison to the usual FSR 2.1 implementations, as in the Resident Evil 4 Remake for example, and these shimmering issues on vegetation and tree leaves are visible only in motion.

Speaking of performance, compared to DLSS, FSR 2.2 delivers slightly smaller performance gains across all resolutions, while also producing more image quality issues than the other available temporal upscaling techniques.


The in-game TAA solution has very poor rendering of small object detail: thin steel objects and power lines, tree leaves, and vegetation in general. The in-game TAA solution also has shimmering issues across the whole image, even when standing still, and it is especially visible at lower resolutions such as 1080p. All of these issues with the in-game TAA solution are resolved when DLAA, DLSS or XeSS are enabled, due to the better quality of their built-in anti-aliasing. Also, the sharpening filters in the DLAA, DLSS and XeSS render path can help to improve overall image quality. With DLSS and XeSS you can expect an improved level of detail rendered in vegetation and tree leaves in comparison to the in-game TAA solution. Small details in the distance, such as wires or thin steel objects, are rendered more correctly and completely in all Quality modes. With DLAA enabled, the overall image quality improvement goes even further, rendering additional details, such as higher-fidelity hair, compared to the in-game TAA solution, DLSS and XeSS. Also, both DLSS 3.1 and XeSS 1.1 handle ghosting quite well, even at extreme angles.

The FSR 2.1 implementation comes with noticeable compromises in image quality in favor of performance in most sequences of the game. We spotted excessive shimmering and flickering on vegetation, tree leaves and thin steel objects; they shimmer even when standing still, and it is visible even in 4K FSR 2.1 Quality mode, which might be quite distracting for some people. Once you switch from FSR 2.1 Quality mode to Balanced or Performance, the whole image starts to shimmer even more. The anti-aliasing quality is also inferior, as the overall image has more jagged lines in motion, especially visible behind cars while driving through the world and in vegetation. Also, in the current FSR 2.1 implementation, ghosting issues are worse than with both DLSS and XeSS in daytime, and they are even more pronounced when there is a lack of lighting in the scene, as the FSR 2.1 image may have some black smearing behind moving objects at extreme angles.

I rest my case. It's clearly the worst upscaler of the bunch, and upscaling isn't even something I'm particularly enthusiastic about (though many feel it's the most important current-generation tech around); I prefer a native image whenever possible.
 
Joined
Feb 10, 2023
Messages
285 (0.41/day)
Location
Lake Superior
AMD is its own enemy here with the 6600 (XT) and 6650 XT pricing.
 
Joined
Apr 14, 2018
Messages
701 (0.29/day)
1. Vulkan's only positive is that it's an open standard that works on operating systems other than Windows. Under Windows, DirectX 12 produces consistently better results.
2. Hardware G-Sync monitors still tend to outperform FreeSync/VESA AdaptiveSync monitors in general, though this is irrelevant in the current context, as compatibility is universal.
3. CUDA gained its foothold because AMD failed to offer a competing solution when it mattered, and it's too late to do so now.
4. There is no reason for anyone with the ability to use DLSS or XeSS to use FSR. Its greatest strength is that it is hardware agnostic, but NVIDIA cards can do all three and Radeons can do XeSS.

I rest my case. It's clearly the worst upscaler of the bunch, and upscaling isn't even something I'm particularly enthusiastic about (though many feel it's the most important current-generation tech around); I prefer a native image whenever possible.

AFAIK, and the last time I checked, XeSS generally offers less of a performance improvement than FSR and DLSS, which is completely aside from the fact that each of these technologies is prone to introducing visual artifacts that native rendering doesn't produce. It's interesting tech, but relatively useless if you care about image quality and stability, most notably ghosting, smearing, and flickering. There's way too much emphasis put on these software rendering techniques. Not to mention I find it odd that any tech review site can conclude you're improving image quality when you're actively introducing additional artifacts.

If there were any specific tech to be lauded as a true improvement to gaming, it would absolutely be VRR. And while we have Nvidia to thank for that, being hardware-locked into one vendor is no longer an issue. Module-based G-Sync monitors are pretty rare, and I don't think there are many reviews doing comparisons, or that sister units of the same design exist with and without the module, so there's no way to make a definitive statement that either is superior.
 
Joined
Dec 25, 2020
Messages
7,065 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
AFAIK, and the last time I checked, XeSS generally offers less of a performance improvement than FSR and DLSS, which is completely aside from the fact that each of these technologies is prone to introducing visual artifacts that native rendering doesn't produce. It's interesting tech, but relatively useless if you care about image quality and stability, most notably ghosting, smearing, and flickering. There's way too much emphasis put on these software rendering techniques. Not to mention I find it odd that any tech review site can conclude you're improving image quality when you're actively introducing additional artifacts.

If there were any specific tech to be lauded as a true improvement to gaming, it would absolutely be VRR. And while we have Nvidia to thank for that, being hardware-locked into one vendor is no longer an issue. Module-based G-Sync monitors are pretty rare, and I don't think there are many reviews doing comparisons, or that sister units of the same design exist with and without the module, so there's no way to make a definitive statement that either is superior.

Agreed on the monitors (and I even pointed out that it's an irrelevant thing in the present age), but it's not that XeSS offers less of a performance improvement; it's that by design it can only be the fastest when XMX cores are available (thus, on Arc GPUs). The DP4a path is relatively fast for an ML-based upscaler, but it falls back to INT24 on hardware that doesn't support DP4a, such as Navi 10/the 5700 XT, and that's where it gets particularly slow. It retains the image quality, though.
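To illustrate the path selection described above, a hypothetical sketch (the function and flag names are invented for illustration; this is not the actual XeSS SDK API):

```python
# Hypothetical dispatch logic: pick the fastest upscaler path the GPU supports.
def pick_xess_path(has_xmx: bool, has_dp4a: bool) -> str:
    if has_xmx:
        return "XMX"             # Intel Arc: dedicated matrix engines, fastest
    if has_dp4a:
        return "DP4a"            # packed INT8 dot products, still reasonably fast
    return "INT24 fallback"      # e.g. Navi 10/5700 XT: slowest path, same output

print(pick_xess_path(has_xmx=False, has_dp4a=False))  # -> INT24 fallback
```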

Remember, for all the fanfare AMD made about FSR 3, it's still a no-show, and if I were a betting man, I'd say they're probably looking at a way to mitigate the performance "issues" that XeSS exhibits, as earlier iterations of FSR are all about speed and compatibility.
 
Joined
Apr 14, 2018
Messages
701 (0.29/day)
Agreed on the monitors (and I even pointed out that it's an irrelevant thing in the present age), but it's not that XeSS offers less of a performance improvement; it's that by design it can only be the fastest when XMX cores are available (thus, on Arc GPUs). The DP4a path is relatively fast for an ML-based upscaler, but it falls back to INT24 on hardware that doesn't support DP4a, such as Navi 10/the 5700 XT, and that's where it gets particularly slow. It retains the image quality, though.

Remember, for all the fanfare AMD made about FSR 3, it's still a no-show, and if I were a betting man, I'd say they're probably looking at a way to mitigate the performance "issues" that XeSS exhibits, as earlier iterations of FSR are all about speed and compatibility.

My point being: based on which vendor your GPU is from, you're always going to want to run the upscaler from your own vendor. The end result will always be a mixed bag of what the upscaler "fixed" versus what new problem it introduced.

FSR 3 seems like more of a blindside than anything else: a rushed PR announcement with likely little to no development completed when it was actually made. It will be just as useless as DLSS 3: pointless on high-end hardware, a trade-off at midrange, and bad on the low end where it would theoretically be most useful.
 
Joined
Apr 30, 2011
Messages
2,718 (0.54/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
To become a bargain it needs to get closer to $200. Still, it has a good price now.
 
Joined
Dec 28, 2012
Messages
3,969 (0.91/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
To become a bargain it needs to get closer to $200. Still, it has a good price now.
IMO, $150. $200 was the price of the 8GB RX 580... 6 years ago. Today an 8GB GPU is the same as cards like the RX 550 were back then: bare-minimum 1080p cards that can't even truly match the current-gen consoles.

AMD is its own enemy here with the 6600 (XT) and 6650 XT pricing.
Those cards really do show what RDNA3 brings to the table: that is to say, nothing. Core for core, clock for clock, there is almost no difference. So, unsurprisingly, people are not jumping on it.
 
Joined
Jul 13, 2016
Messages
3,354 (1.09/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
1. Vulkan's only positive is that it's an open standard that works on operating systems other than Windows. Under Windows, DirectX 12 produces consistently better results.

Vulkan tends to provide higher performance on average than DX12 does, and DX12's closer-to-metal features were inspired by Mantle anyway.

2. Hardware G-Sync monitors still tend to outperform FreeSync/VESA AdaptiveSync monitors in general, though this is irrelevant in the current context, as compatibility is universal.

Nonsense: https://www.tomshardware.com/features/gsync-vs-freesync-nvidia-amd-monitor

"So which is better: G-Sync or FreeSync? With the features being so similar there is no inherent reason to select a particular monitor. Both technologies produces similar results, so the contest is mostly a wash at this point. There are a few disclaimers, however.

If you purchase a G-Sync monitor, you will only have support for its adaptive-sync features with a GeForce graphics card. You're effectively locked into buying Nvidia GPUs as long as you want to get the most out of your monitor. With a FreeSync monitor, particularly the newer, higher quality variants that meet the FreeSync Premium Pro certification, you're often free to use AMD or Nvidia graphics cards."

3. CUDA gained its foothold because AMD failed to offer a competing solution when it mattered, and it's too late to do so now.

Actually, AMD fully intends to compete using ROCm.

In fact, they have a wrapper you can use to run CUDA-native code on AMD cards.

That AMD originally didn't have a CUDA competitor was down to the fact that they were nearly broke for 8 years.

4. There is no reason for anyone with the ability to use DLSS or XeSS to use FSR. Its greatest strength is that it is hardware agnostic, but NVIDIA cards can do all three and Radeons can do XeSS.


I rest my case. It's clearly the worst upscaler of the bunch, and upscaling isn't even something I'm particularly enthusiastic about (though many feel it's the most important current-generation tech around); I prefer a native image whenever possible.

XeSS loses to FSR more than it wins. Your cherry-picking a single example doesn't prove otherwise. Your original comment was that FSR sucks. Nowhere in any of those linked articles, even in the worst-case scenarios, does the author imply it sucks, because it doesn't. That was you, who apparently doesn't use upscaling and has no personal experience, being overly hyperbolic.
 
Joined
Feb 24, 2023
Messages
3,149 (4.69/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
FSR 2.1 is borderline ruining the whole gaming experience in Cyberpunk 2077 as of the latest iterations of the game. That's for sure. Intel's XeSS provides a WAY more consistent and stable image, albeit at a huge performance cost (6700 XT here, so no complaints; XeSS has a right to prefer Intel GPUs). The mega ghosting effect in motion is yuck!

The positive thing about the RX 7600 which most of you decided to ignore is its energy efficiency in low-demanding tasks compared to previous-generation cards. Look at its power consumption in Cyberpunk at 1080p capped to 60 FPS (about 75 or 80 W), which is more than 1.5x lower than that of the RX 6700 XT (about 130 W). In lands of expensive electricity bills, this matters. And thus the card deserves its place on the shelves (mostly because its predecessors are insanely expensive lmao).
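As a rough sketch of what that ~50 W gap means on a power bill (the hours per day and the kWh price are assumptions; plug in your own):

```python
# Yearly electricity cost difference from the ~50 W gap mentioned above.
watts_rx7600, watts_rx6700xt = 80, 130   # capped-60-FPS figures from the post
hours_per_day, days_per_year = 3.0, 365  # assumed gaming time
price_per_kwh = 0.30                     # assumed "expensive electricity" rate

delta_kwh = (watts_rx6700xt - watts_rx7600) / 1000 * hours_per_day * days_per_year
print(f"~{delta_kwh:.0f} kWh/year saved, ~{delta_kwh * price_per_kwh:.0f} per year at 0.30/kWh")
# 50 W x 3 h x 365 days is about 55 kWh, i.e. roughly 16 (in your currency) per year.
```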

There is no doubt the initial $300 was a rip-off. But is there a card that isn't priced way higher than what most people agree to pay for it?
 
Joined
Jan 17, 2018
Messages
440 (0.17/day)
Processor Ryzen 7 5800X3D
Motherboard MSI B550 Tomahawk
Cooling Noctua U12S
Memory 32GB @ 3600 CL18
Video Card(s) AMD 6800XT
Storage WD Black SN850(1TB), WD Black NVMe 2018(500GB), WD Blue SATA(2TB)
Display(s) Samsung Odyssey G9
Case Be Quiet! Silent Base 802
Power Supply Seasonic PRIME-GX-1000
IMO, $150. $200 was the price of the 8GB RX 580... 6 years ago. Today an 8GB GPU is the same as cards like the RX 550 were back then: bare-minimum 1080p cards that can't even truly match the current-gen consoles.


Those cards really do show what RDNA3 brings to the table: that is to say, nothing. Core for core, clock for clock, there is almost no difference. So, unsurprisingly, people are not jumping on it.
The 8GB 580 was never $200 MSRP. It may have gotten down to $200, but at release it was $230-300. $230 in 2017 is $248 today. Like I said earlier, another $25 off or so and this becomes a reasonable upgrade for people on a budget with those 580s or older/lower-performing cards.
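For what it's worth, the inflation adjustment works like this (a sketch; the factor below just reproduces the $230 -> $248 figure from the post, it is not official CPI data):

```python
# Inflation adjustment: today's price = launch price * (CPI_now / CPI_then).
launch_price_2017 = 230.0
cpi_factor = 248.0 / 230.0  # ~1.08, implied by the post; check CPI tables yourself

print(f"${launch_price_2017:.0f} in 2017 ~= ${launch_price_2017 * cpi_factor:.0f} today")
```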

I do suggest you stop saying RDNA3 brings nothing to the table; it just makes you sound like a fanboy. The RDNA3 cards that are on 5nm (the 7900s) are just as efficient as Nvidia's 4000 series, and they're on an inferior node (5nm vs 4nm). They also typically offer better value per dollar than Nvidia's cards in pure rasterization.

For example, let's say I was in the market for a card for 1440p+ and my budget for the GPU was below $900. My highest-end options are a 7900 XT and a 4070 Ti. I would almost definitely get the 7900 XT over the 4070 Ti. With RDNA3 I get more than enough RAM to not worry about lowering textures in my games, I get 8-10% better rasterization performance in most games, and I save ~$15.

Seems like RDNA3 brought something to the table in this case, does it not?
 
Joined
Apr 30, 2011
Messages
2,718 (0.54/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
IMO, $150. $200 was the price of the 8GB RX 580... 6 years ago. Today an 8GB GPU is the same as cards like the RX 550 were back then: bare-minimum 1080p cards that can't even truly match the current-gen consoles.


Those cards really do show what RDNA3 brings to the table: that is to say, nothing. Core for core, clock for clock, there is almost no difference. So, unsurprisingly, people are not jumping on it.
We have to include inflation, and, to be precise, the 8GB RX 480 and later the 580 cost $250 when they launched back then. They dropped in price much later and got closer to $200. So, the 7600 at $250 is fairly priced, and anything lower than that is even better. No one has to agree; it's just my opinion considering all of the above.
 
Joined
Dec 25, 2020
Messages
7,065 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Vulkan tends to provide higher performance on average than DX12 does, and DX12's closer-to-metal features were inspired by Mantle anyway.

Nonsense: https://www.tomshardware.com/features/gsync-vs-freesync-nvidia-amd-monitor

"So which is better: G-Sync or FreeSync? With the features being so similar there is no inherent reason to select a particular monitor. Both technologies produces similar results, so the contest is mostly a wash at this point. There are a few disclaimers, however.

If you purchase a G-Sync monitor, you will only have support for its adaptive-sync features with a GeForce graphics card. You're effectively locked into buying Nvidia GPUs as long as you want to get the most out of your monitor. With a FreeSync monitor, particularly the newer, higher quality variants that meet the FreeSync Premium Pro certification, you're often free to use AMD or Nvidia graphics cards."

Actually, AMD fully intends to compete using ROCm.

In fact, they have a wrapper you can use to run CUDA-native code on AMD cards.

That AMD originally didn't have a CUDA competitor was down to the fact that they were nearly broke for 8 years.

XeSS loses to FSR more than it wins. Your cherry-picking a single example doesn't prove otherwise. Your original comment was that FSR sucks. Nowhere in any of those linked articles, even in the worst-case scenarios, does the author imply it sucks, because it doesn't. That was you, who apparently doesn't use upscaling and has no personal experience, being overly hyperbolic.

1. Vulkan games never perform better than DX12 on Windows if the game offers both code paths, at least on Nvidia GPUs. Vulkan additionally has several limitations on Windows that DX12 doesn't, regarding multiplane overlays and other things that are relevant to developers. I understand AMD's DirectX implementations have always been behind, but with the overhauled driver base they introduced for Navi 21 and newer in May 2022, it should be much better.

2. Truly weird to double down on this, but hardware G-Sync modules have a much wider refresh range in general. Again, it doesn't matter: both brands of VRR technology will work with both brands of GPUs, only Radeon doesn't really take advantage of the hardware module, making it wasteful in a sense.

3. ROCm isn't a competitor for CUDA and was never intended to be; ROCm is closer in nature to the Tesla compute-mode driver. It's also not supported on Windows, and hardware support is exceptionally limited: in fact, RDNA 3 doesn't support it yet. Calling ROCm a competitor to CUDA is a bad move, to say the least. Supposedly it's coming to Windows Soon™, but currently the official support is limited to Pro and Instinct cards, exclusively on Linux. AMD fans need to understand that you can't bank on a potential future development to justify a product. Really, you have no guarantee a potential future thing will pan out as you expect.

4. I'm not cherry-picking, what kind of denial is that? I've linked at least 4 of TPU's own reviews. Read them and see the comparison images. It's just worse. He won't say it sucks, but you can deduce it from what's being said. Shimmering, ghosting, occlusion issues, a softened image, loss of detail... how is any of that desirable? Then you wonder why AMD has 10% of the GPU market share.

I don't have this need to appease. As an AMD fan, you shouldn't defend them but DEMAND that they improve, by exposing their dirt. As a corporation, they only do the bare minimum to get approval from their customers. AMD isn't Mr. Nice Guy; demand improvements, vote with your wallet, and they'll come.
 
Joined
Feb 24, 2023
Messages
3,149 (4.69/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
This is worth 100 USD
Whilst I agree its price is considerably high, this seems to be an exaggeration. If it cost $100, it would be the first $100 GPU to run ALL games at 1080p at high settings at 60 FPS. Even the legendary GTX 1060 and RX 480 series were twice as expensive whilst providing less smoothness in games.

$200 to $220 is completely OK. Anything more is just a one-way ticket to profits for nVidia. Anything less just doesn't make sense for AMD. Don't forget there is a no-slower RTX 4060 incoming which also sports DLSS 3 and Fake Frames™ techniques, want them or not, making for a difference worth paying a third more. Not to mention the RTX 4060 is the first GPU of this price segment actually capable of running casual RT at reasonable speeds, whilst the RX 7600 fails to even achieve that. And the RTX 4060 is less power demanding.

"Reasonable" (probably reasonable indeed) price of anything below 270 USD in RTX 4060 would just be a complete funeral for all AMD products. Which, to be frank, haven't look good from the very start except for RX 7900 XTX which kicks 4070 Ti's butt for sure.
 
Joined
Jul 13, 2016
Messages
3,354 (1.09/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
1. Vulkan games never perform better than DX12 on Windows if the game offers both code paths, at least on Nvidia GPUs. Vulkan additionally has several limitations on Windows that DX12 doesn't, regarding multiplane overlays and other things that are relevant to developers. I understand AMD's DirectX implementations have always been behind, but with the overhauled driver base they introduced for Navi 21 and newer in May 2022, it should be much better.

Key words there are "Nvidia GPUs". You are confusing AMD's DX11 and DX9 implementations with their DX12 implementation. AMD's DX12 implementation has not had the same multi-threading problems they had with DX11 and DX9. That might be down to several factors, aside from the game itself being more in control of threads on the newer API.

2. Truly weird to double down on this, but hardware G-Sync modules have a much wider refresh range in general. Again, it doesn't matter: both brands of VRR technology will work with both brands of GPUs, only Radeon doesn't really take advantage of the hardware module, making it wasteful in a sense.

There is a variety of variable-sync-capable display scalers, and for the most part they have gotten as good as, if not better than, the G-Sync module. Some variable refresh rate scalers are capable of a larger refresh range than the G-Sync module is. As the article I linked pointed out, they are feature equivalent.

3. ROCm isn't a competitor for CUDA and was never intended to be; ROCm is closer in nature to the Tesla compute-mode driver. It's also not supported on Windows, and hardware support is exceptionally limited: in fact, RDNA 3 doesn't support it yet. Calling ROCm a competitor to CUDA is a bad move, to say the least. Supposedly it's coming to Windows Soon™, but currently the official support is limited to Pro and Instinct cards, exclusively on Linux. AMD fans need to understand that you can't bank on a potential future development to justify a product. Really, you have no guarantee a potential future thing will pan out as you expect.

I never said ROCm was going to save anyone. I merely pointed it out given that you implied AMD was never going to have a CUDA competitor.

4. I'm not cherry-picking, what kind of denial is that? I've linked at least 4 of TPU's own reviews. Read them and see the comparison images. It's just worse. He won't say it sucks, but you can deduce it from what's being said. Shimmering, ghosting, occlusion issues, a softened image, loss of detail... how is any of that desirable? Then you wonder why AMD has 10% of the GPU market share.

You linked 4 articles, of which 1 has XeSS, and based on that sample of 1 you declared XeSS better than FSR. None of the articles you linked cover one of the better implementations of FSR, either. Anyone can cherry-pick examples to make any one of the three upscaling technologies look bad. You aren't even trying to take a balanced approach.

I don't have this need to appease. As an AMD fan, you shouldn't defend them but DEMAND that they improve, by exposing their dirt. As a corporation, they only do the bare minimum to get approval from their customers. AMD isn't Mr. Nice Guy; demand improvements, vote with your wallet, and they'll come.

You do realize that both you and I own Nvidia cards, right? You probably should have checked system specs before making such nonsense claims. The difference is that I don't get invested in a brand and blindly defend a company. My post history here clearly demonstrates this. You don't have to be an AMD fanboy to say that FSR does not suck; that was clearly hyperbolic language on your part, which you seem determined to keep doubling down on. I'd challenge you to tell the difference in a double-blind test.

We have to include inflation, and, to be precise, the 8GB RX 480 and later the 580 cost $250 when they launched back then. They dropped in price much later and got closer to $200. So, the 7600 at $250 is fairly priced, and anything lower than that is even better. No one has to agree; it's just my opinion considering all of the above.

The die size is a decent bit smaller than the 580's. At the end of the day, the GPU is priced like most other GPUs this generation: underwhelming and lacking excitement.

The only upside to the current GPU market situation is that it's giving Intel a good chunk of time to catch up with their drivers.
 
Joined
Dec 25, 2020
Messages
7,065 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Honestly man, I'm not looking for a fight, and the reason I'm so critical of AMD sometimes is that I know they can do better. I have an Nvidia GPU by chance; speaking for myself, I've always had a thing for Radeon cards, but so far AMD hasn't given me reasons to celebrate. It seems like they just keep self-owning like that.

The statement towards AMD fans wasn't explicitly directed at you, sorry if it came across that way, but it's a general trend I see. AMD can do better. I know it, I've seen it first-hand, trust me on this.

The easiest way to spot DLSS vs. FSR is the rendition of foliage and fire; FSR will always be worse off. XeSS is still new; Cyberpunk is about the only implementation of XeSS 1.1 that I know of, but more are coming, and the criticisms leveled at FSR are consistent across a large variety of games. Good enough as it may be for some people, when other options are available, it's the last thing I'm looking at.
 
Joined
Jan 14, 2019
Messages
12,627 (5.81/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Honestly man, I'm not looking for a fight, and the reason I'm so critical of AMD sometimes is that I know they can do better. I have an Nvidia GPU by chance; speaking for myself, I've always had a thing for Radeon cards, but so far AMD hasn't given me reasons to celebrate. It seems like they just keep self-owning like that.

The statement towards AMD fans wasn't explicitly directed at you, sorry if it came across that way, but it's a general trend I see. AMD can do better. I know it, I've seen it first-hand, trust me on this.
I'm not entirely sure what you expect AMD to "do better" at. The arguments I see above seem more like nit-picking than actual arguments to me. At least as an owner of both AMD and Nvidia cards, I don't see any of those issues manifest anywhere. If you're a developer, and you absolutely need CUDA, I understand. Other than that, your arguments sound a bit made-up to me (no offense).

The easiest way to spot DLSS vs. FSR is the rendition of foliage and fire; FSR will always be worse off. XeSS is still new; Cyberpunk is about the only implementation of XeSS 1.1 that I know of, but more are coming, and the criticisms leveled at FSR are consistent across a large variety of games. Good enough as it may be for some people, when other options are available, it's the last thing I'm looking at.
I see you've got a 3090. In my opinion, comparing any kind of upscaling with that level of performance at hand is a bit silly. I would just run everything at native res. :)
 
Joined
Feb 24, 2023
Messages
3,149 (4.69/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
I would just run everything at native res.
Good luck running latter-day crap at native 4K on Ultra settings using this card. It will cry and beg you to drop some settings below Ultra.

It's also insufficient for native 1440p on Ultra at the 144 Hz or higher mark. Upscalers are the necessary poison here as well. And I'm not criticising the 3090, don't get me wrong; this one is more than solid. Game developers' urge to make games as badly as possible is what's being criticised.
 