Sunday, June 25th 2023
AMD Radeon RX 7600 Slides Down to $249
The AMD Radeon RX 7600 mainstream graphics card slides a little closer to its ideal price, with an online retailer price cut sending it down to $249, about $20 below its $269 MSRP. The cheapest RX 7600 graphics card on the market right now is the MSI RX 7600 MECH 2X Classic, going for $249 on Amazon, followed by the XFX RX 7600 SWFT 210 at $258, and the ASRock RX 7600 Challenger at $259.99.
The sliding prices of the RX 7600 should improve its prospects against the upcoming NVIDIA GeForce RTX 4060, which leaked 3DMark benchmarks show to be around 17% faster than the previous-generation RTX 3060 (12 GB) and 30% faster than its 8 GB variant. Our real-world testing puts the RX 7600 about 15% faster than the RTX 3060 (12 GB) at 1080p, which means there could be an interesting square-off between the RTX 4060 and RX 7600. NVIDIA has announced $299 as the baseline price for the RTX 4060, which should put pressure on AMD partners to trim prices of the RX 7600 to below the $250-mark.
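Since both percentages above are quoted against the same RTX 3060 (12 GB) baseline, they can be chained to estimate the implied gap between the two cards. This is only a rough sketch: the 17% figure comes from leaked synthetic 3DMark runs while the 15% figure is from real-world 1080p testing, so the numbers are not directly comparable.

```python
# Chaining the relative-performance figures quoted above.
# Both are vs. the RTX 3060 (12 GB), but from different test types,
# so treat the result as a ballpark estimate, not a measurement.
rtx4060_vs_3060 = 1.17  # leaked 3DMark: RTX 4060 ~17% faster
rx7600_vs_3060 = 1.15   # TechPowerUp 1080p testing: RX 7600 ~15% faster

# Implied RTX 4060 vs. RX 7600 ratio, assuming a common baseline
implied = rtx4060_vs_3060 / rx7600_vs_3060
print(f"{(implied - 1) * 100:.1f}%")  # prints 1.7% in the RTX 4060's favor
```

On these numbers the two cards would land within a couple of percent of each other, which is why the $50 price gap matters so much.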
Source:
VideoCardz
61 Comments on AMD Radeon RX 7600 Slides Down to $249
What I'm getting at is that "good enough" isn't good enough; they need to strive to be the best.
In regards to XeSS, here is an article on its Death Stranding implementation: www.techpowerup.com/review/death-stranding-director-s-cut-xess-vs-dlss-vs-fsr-2-0-comparison/
I should note that TechPowerUp did not use an Intel card and thus did not get XMX acceleration. I looked around to see if I could find a test that did, but only found TechSpot's article, which also tested the non-XMX path.
From the above article:
"Speaking of XeSS, compared to DLSS and FSR 2.0, the XeSS render quality in terms of overall image detail is comparable to what DLSS and FSR 2.0 can output, but with some differences in temporal stability. One of the most noticeable differences in image quality between XeSS, DLSS and FSR 2.0 is how XeSS deals with ghosting. XeSS has noticeable ghosting issues and black trails on the flying chyral crystal particles and flying cryptobiotes similar to what DLSS 2.1 had in the past. On the DLSS side, this issue was fixed with the updates to the DLSS render pipeline, and no doubt, it can be fixed in Xess too. What's also important to note is that we are testing XeSS with an RTX GPU using the "standard" kernel instead of the Intel Arc GPU kernel, which uses the XMX engines and an advanced XeSS upscaling model, which may affect our image quality results.
Interestingly, when using XeSS, there are some major differences in performance gains, compared to DLSS or FSR 2.0, which essentially had equal or very similar performance gains in most games. As we are testing XeSS with an RTX 3060 GPU, which does not have the XMX instruction set, designed to accelerate XeSS workloads on Intel's Arc GPUs, the performance gains are less than what we can expect on Arc GPUs, so keep that in mind. That said, the actual performance increase difference between XeSS and DLSS or FSR 2.0 is about 13% at 4K Quality mode, in favor of DLSS or FSR 2.0. However, compared to native 4K resolution, XeSS manages to deliver up to 25% more performance while using the DP4a instruction set that's compatible with all GPU architectures, which is still a quite decent performance uplift. "
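For context on the DP4a instruction the quote mentions: DP4a is a four-way dot product of packed 8-bit integers accumulated into a 32-bit integer, supported on most recent GPU architectures, which is why XeSS's "standard" kernel can run on non-Intel cards at all (Arc's XMX engines do larger matrix operations instead). A minimal sketch of its semantics, purely for illustration:

```python
# Sketch of DP4a semantics: dot(a, b) over four signed 8-bit lanes,
# added to a 32-bit accumulator. This mirrors what the GPU instruction
# computes; it is not how XeSS itself is implemented.
def dp4a(acc: int, a: list[int], b: list[int]) -> int:
    assert len(a) == 4 and len(b) == 4
    assert all(-128 <= x <= 127 for x in a + b)  # int8 lanes
    return acc + sum(x * y for x, y in zip(a, b))

# 1*5 + 2*6 + 3*7 + 4*8 = 70, plus an accumulator of 10
print(dp4a(10, [1, 2, 3, 4], [5, 6, 7, 8]))  # prints 80
```

Because one instruction does four multiply-accumulates, DP4a speeds up the int8 inference inside XeSS's upscaling network on any GPU that supports it, just not as much as dedicated matrix hardware like XMX.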
There's also a shadow of the tomb raider article on it as well, again non-XMX: www.techpowerup.com/review/shadow-of-the-tomb-raider-xess-vs-dlss/
As you pointed out, these are not the latest versions of XeSS and in general there just isn't enough of a sample size to tell whether these are representative of the whole.
Another comparison: the Arc A380 with 6 GB costs €140, vs. the RX 6400 (4 PCIe lanes, 4 GB) at €136, vs. the GTX 1650 (4 GB) at €164.
What the Arc A380 has over its competitors: 2 GB more memory, AV1 support, Quick Sync, and ray tracing units (yeah, useless in this price class, but it has them).
You seem to think that unless you have a high-end system, you won't enjoy the experience? That notion assumes that no one who isn't on 7000-series/4080/4090 cards can enjoy gaming as much as those who are, whether they own 6000-series/3000-series or even Arc cards. You seem to be forgetting that everything is relative.
As for AMD, you know very well what I'm speaking of. There are issues that have been known for years on end. Just off the top of my head: persistent corruption issues with video playback; ReLive crapping itself and recording nonsense (I had it not recording any audio and/or recording at super low quality even at high bit rates as far back as my Vega FE, and this never changed, at least not where APUs are concerned; it's random too, sometimes hard to replicate); WattMan resetting all the time; high idle power on multi-monitor setups; garbage VR performance; etc. It's an extremely unstable experience, one step forward and two backwards. It seems they can't fix anything without breaking two or three other things. Recently I ran into an issue where the Radeon audio driver on my 5600H was causing endless BSODs, always amdacpafd.sys repeatedly, issues which occur even on public driver builds... I literally gave up. It's a laughing stock, and not of the funny kind. And if that wasn't enough, you have to take the back seat on all of the cool new tech, because it's all developed for or by Nvidia.
Can't blame me if I'm not entirely enthusiastic about it. With AMD, sometimes you have to change graphics drivers like you change a t-shirt: release A works well for these games, but release B doesn't; then release C fixes it, but release D performs better with that other game; however, D doesn't work with the games that release A handles well, so now you try release E, and the games that worked on A, B, and C work well but the games that worked on D no longer do. It's always been this way, and I honestly think people have gotten used to it and don't seem to mind anymore.
The very first CPU I bought for myself was the 965 BE. I enjoyed that so much that I had to get the two extra cores the 1090T gave, and enjoyed the hell out of that chip too. I got an 8320 and enjoyed the 4.7 GHz OC on that chip. Then I got a 1700X and was blown away. Then I jumped up to a 1920X and totally loved Threadripper. Then I got a 2920X and, combined with my CrossFire Vega setup, pushed 200+ FPS on my 60 Hz monitor like it was nothing. Then I got a 3600, but that was replaced by the 5900X. Then that became a 5950X, but now I just like gaming, so I got a 5800X3D. I missed the snappiness of many cores, so I now am the proud owner of a 7900X3D. Through all of those years I have heard and read about how bad AMD is from people who don't even use it and like to compare AMD in 2023 to AMD circa 2015. Your last sentence is evidence of what I am talking about. I have never enjoyed my PC as much as I do now, and both my CPU and GPU have a lot to do with it.