Sunday, June 25th 2023
AMD Radeon RX 7600 Slides Down to $249
The AMD Radeon RX 7600 mainstream graphics card slides a little closer to its ideal price, with an online retailer price-cut sending it down to $249, about $20 less than its MSRP of $269. The cheapest RX 7600 graphics card in the market right now is the MSI RX 7600 MECH 2X Classic, going for $249 on Amazon; followed by the XFX RX 7600 SWFT 210 at $258, and the ASRock RX 7600 Challenger at $259.99.
The sliding prices of the RX 7600 should improve its prospects against the upcoming NVIDIA GeForce RTX 4060, which leaked 3DMark benchmarks show to be around 17% faster than the previous-generation RTX 3060 (12 GB) and 30% faster than its 8 GB variant. Our real-world testing puts the RX 7600 about 15% faster than the RTX 3060 (12 GB) at 1080p, which means there could be an interesting square-off between the RTX 4060 and RX 7600. NVIDIA has announced $299 as the baseline price for the RTX 4060, which should put pressure on AMD partners to trim prices of the RX 7600 to below the $250-mark.
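Taking those figures at face value, here is a quick back-of-the-envelope comparison in Python; the assumption that the leaked 3DMark uplift carries over to real-world gaming is mine, not something the benchmarks guarantee:

```python
# Rough sketch only: both cards are expressed relative to the RTX 3060 (12 GB) baseline.
rtx_3060_12gb = 1.00
rtx_4060 = rtx_3060_12gb * 1.17   # leaked 3DMark figure (assumed to hold in games)
rx_7600 = rtx_3060_12gb * 1.15    # our 1080p testing

cards = {"RTX 4060": (299, rtx_4060), "RX 7600": (249, rx_7600)}
for name, (price, perf) in cards.items():
    print(f"{name}: relative perf {perf:.2f}, ${price / perf:.0f} per unit of RTX 3060 performance")

# Implied gap between the two if the leaked numbers pan out
print(f"RTX 4060 lead over RX 7600: {rtx_4060 / rx_7600 - 1:+.1%}")
```

On those assumptions the two cards land within a couple of percent of each other, which is why the $249-versus-$299 price gap matters more than the raw performance delta.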
Source: VideoCardz
61 Comments on AMD Radeon RX 7600 Slides Down to $249
Finally, at $249 it's competitive with the $229 6650XT cards you can still find new on store shelves.
FreeSync is the industry standard while G-Sync is niche.
Surely you jest. Do you know how terrible PC gaming would be if things like PhysX had become the standard without being made open source? CUDA is a good example of what happens when an Nvidia standard wins: zero vendor choice.
And please, enough with the exaggeration on FSR. FSR is not that far off from DLSS. If FSR sucks as you say, then so does DLSS. That's if your opinion is consistent. Why must internet comments always needlessly exaggerate?
2. Hardware G-Sync monitors still tend to outperform FreeSync/VESA Adaptive-Sync monitors in general, though this is irrelevant in the current context, as compatibility is now universal
3. CUDA gained its foothold because AMD failed to offer a competing solution when it mattered; and it's too late to do so now
4. There is no reason for anyone with the ability to use DLSS or XeSS to use FSR. Its greatest strength is that it is hardware agnostic, but NVIDIA can do all three and Radeons can do XeSS
www.techpowerup.com/review/atomic-heart-fsr-2-2-vs-dlss-2-vs-dlss-3-comparison/
www.techpowerup.com/review/the-last-of-us-part-i-fsr-2-2-vs-dlss-comparison/
www.techpowerup.com/review/cyberpunk-2077-xess-1-1-vs-fsr-2-1-vs-dlss-3-comparison/
I rest my case. It's clearly the worst upscaler of the bunch - and it's not even something I'm particularly enthusiastic about (though many feel this is the most important current-generation tech around) - I prefer a native image whenever possible.
If there was any specific tech to be lauded as a true improvement to gaming, it would absolutely be VRR. And while we have Nvidia to thank for that, being hardware-locked into one vendor is no longer an issue. Module-based G-Sync monitors are pretty rare, and I don't think there are many reviews doing comparisons, or that sister units of the same design exist with and without the module, so there's no way to make a definitive statement saying either is superior.
Remember, for all the fanfare AMD made about FSR 3, it's still a no-show, and were I a betting man, I'd say they're looking for a way to mitigate the kind of performance "issues" that XeSS exhibits, as earlier iterations of FSR are all about speed and compatibility.
FSR 3 seems like more of a blindside than anything else: a rushed PR announcement with likely little to no development completed when it was actually made. It will be just as useless as DLSS 3; pointless on high-end hardware, trade-offs at midrange, and bad on the lower end where it would theoretically be most useful.
"So which is better: G-Sync or FreeSync? With the features being so similar there is no inherent reason to select a particular monitor. Both technologies produces similar results, so the contest is mostly a wash at this point. There are a few disclaimers, however.
If you purchase a G-Sync monitor, you will only have support for its adaptive-sync features with a GeForce graphics card. You're effectively locked into buying Nvidia GPUs as long as you want to get the most out of your monitor. With a FreeSync monitor, particularly the newer, higher quality variants that meet the FreeSync Premium Pro certification, you're often free to use AMD or Nvidia graphics cards."
Actually, AMD very much intends to compete using ROCm.
In fact, they have a wrapper you can use to port CUDA code to run on AMD cards.
That AMD originally didn't have a CUDA competitor was down to the fact that they were nearly broke for 8 years. XeSS loses to FSR more often than it wins. Your cherry-picking a single example doesn't prove otherwise. Your original comment was that FSR sucks. Nowhere in any of those linked articles, even in the worst-case scenarios, does the author imply it sucks, because it doesn't. That was you, who apparently doesn't use upscaling and has no personal experience, being overly hyperbolic.
The positive thing about the RX 7600 which most of you have decided to ignore is its energy efficiency in low-demanding tasks compared to previous-generation cards. Look at its power consumption in Cyberpunk at 1080p capped to 60 FPS (about 75 to 80 W), which is more than 1.5x lower than that of the RX 6700 XT (about 130 W). In places with expensive electricity bills, this matters. And thus the card deserves its place on the shelves (mostly because its predecessors are insanely expensive lmao).
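A rough sketch of what that gap could mean on a bill; the hours per day and the price per kWh below are my own assumptions, not figures from any review:

```python
# Hypothetical yearly cost of the ~50 W gap in that capped-60 FPS scenario.
delta_watts = 130 - 80      # RX 6700 XT vs RX 7600, Cyberpunk 1080p @ 60 FPS cap (from the comment above)
hours_per_day = 3           # assumed daily gaming time
price_per_kwh = 0.40        # assumed tariff in EUR/kWh (a high European rate)

kwh_per_year = delta_watts / 1000 * hours_per_day * 365
print(f"~{kwh_per_year:.0f} kWh/year saved, roughly {kwh_per_year * price_per_kwh:.0f} EUR/year")
```

Under those assumptions it works out to roughly 55 kWh, or around 20 EUR, per year; small, but not nothing where electricity is expensive.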
There is no doubt the initial $300 was a rip-off. But is there any card out there that isn't priced way higher than most people are willing to pay for it?
I do suggest you stop saying RDNA3 brings nothing to the table; it just makes you sound like a fanboy. The RDNA3 cards that are on 5 nm (the 7900s) are just as efficient as Nvidia's 4000 series, and they're on an inferior node (5 nm vs 4 nm). They also typically offer better value per dollar than Nvidia's cards in pure rasterization.
For example, let's say I'm in the market for a card for 1440p+ and my budget for the GPU is below $900. My highest-end options are a 7900 XT and a 4070 Ti. I would almost definitely get the 7900 XT over the 4070 Ti. With RDNA3 I get more than enough VRAM to not worry about lowering textures in my games, I get 8-10% better rasterization performance in most games, and I would save ~$15.
Seems like RDNA3 brought something to the table in this case, does it not?
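To put that value argument in rough numbers (a sketch only: the $785 vs $800 street prices are assumed to match the ~$15 saving mentioned, and the 9% raster lead is the midpoint of the 8-10% quoted):

```python
# Hypothetical value-per-dollar comparison based on the figures in the comment above.
cards = {
    "RX 7900 XT":  {"price": 785, "raster": 1.09, "vram_gb": 20},  # assumed price; 9% raster lead
    "RTX 4070 Ti": {"price": 800, "raster": 1.00, "vram_gb": 12},  # assumed price; baseline
}

for name, c in cards.items():
    value = c["raster"] / c["price"] * 1000
    print(f"{name}: {value:.2f} relative-perf per $1000 spent, {c['vram_gb']} GB VRAM")
```

On those assumptions the 7900 XT delivers about 10% more rasterization performance per dollar, plus the extra VRAM, which is the whole point being made.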
2. Truly weird to double down on this, but hardware G-Sync modules support a much wider variable-refresh range in general. Again, it doesn't matter: both brands of VRR technology will work with both brands of GPUs, only that Radeon doesn't really take advantage of the hardware module, making it wasteful in a sense
3. ROCm isn't a competitor to CUDA and was never intended to be; ROCm is closer in nature to the Tesla compute-mode driver. It's also not supported on Windows, and hardware support is exceptionally limited: in fact, RDNA 3 doesn't support it yet. Calling ROCm a competitor to CUDA is a bad move, to say the least. Supposedly it's coming to Windows with that trademarked Soon™, but currently the official support is limited to Pro and Instinct cards only, exclusively on Linux. AMD fans need to understand that you can't bank on a potential future development to justify a product. Really, you have no guarantee a potential future thing will pan out as you expect.
4. I'm not cherry-picking, what kind of denial is that? I've linked at least 4 of TPU's own reviews. Read them and see the comparison images. It's just worse. He won't say it sucks, but you can deduce it from what's being said. Shimmering, ghosting, occlusion issues, a softened image, loss of detail... How is any of that desirable? Then you wonder why AMD has 10% of the GPU market share.
I don't have this need to appease. As an AMD fan you shouldn't defend them but DEMAND that they improve by exposing their dirt. As a corporation they only do the bare minimum to get approval from their customers. AMD isn't Mr. Nice Guy; demand improvements, vote with your wallet, and they'll come.
$200 to $220 is completely OK. Anything more is just a one-way ticket to profits for Nvidia. Anything less just doesn't make sense for AMD. Don't forget there is an RTX 4060 incoming that is no slower and also sports DLSS 3 and Fake Frames™ techniques, whether you want them or not, making for a difference worth paying a third more for (rough numbers below). Not to mention the RTX 4060 is the first GPU in this price segment actually capable of running casual RT at reasonable speeds, whilst the RX 7600 fails to even achieve that. And the RTX 4060 is less power demanding.
"Reasonable" (probably reasonable indeed) price of anything below 270 USD in RTX 4060 would just be a complete funeral for all AMD products. Which, to be frank, haven't look good from the very start except for RX 7900 XTX which kicks 4070 Ti's butt for sure.
The only upside to the current GPU market situation is that it's giving Intel a good chunk of time to catch up with their drivers.
The statement towards AMD fans wasn't explicitly directed at you, sorry if it came out that way: but it's a general trend I see. AMD can do better. I know it, I've seen it first hand, trust me on this.
The easiest way to spot DLSS vs. FSR is foliage and fire rendering. FSR will always be worse off. XeSS is still new; Cyberpunk is about the only implementation of XeSS 1.1 that I know of, but more are coming, and the criticisms leveled at FSR are consistent across a large variety of games. Good enough as it may be for some people, when other options are available it's the last thing I'm looking at.
It's also insufficient for native 1440p on Ultra at 144 Hz or higher. Upscalers are the necessary poison here as well. And I'm not criticising the 3090, don't get me wrong; that card is more than solid. What I'm criticising is game developers' urge to make games as badly optimised as possible.
Other than that, I agree. Using bad optimisation to sell upscaler tech is disgusting to say the least.