
Sapphire Radeon RX 7600 XT Pulse

It's the 4060 Ti 16 GB all over again, except this time it's slower than the 4060 and also acceptable because AMD did it
Who said it was acceptable? All reviewers basically said it's pointless.
 
Who said it was acceptable? All reviewers basically said it's pointless.

Read all the reviews (this being the last one) and not once have I read a comment that "AMD is evil, greedy, releasing pointless cash grab products to extort money from suckers"
 
Read all the reviews (this being the last one) and not once have I read a comment that "AMD is evil, greedy, releasing pointless cash grab products to extort money from suckers"

Maybe at 2/3 the price with 2/3 the markup it's more acceptable?
 
It's the 4060 Ti 16 GB all over again, except this time it's slower than the 4060 and also acceptable because AMD did it

The review doesn't show it as slower than a 4060. Surely you mean a 4060ti?

1080p:
[Chart: relative performance, 1920x1080]


Also, I don't see anyone saying this is an acceptable launch. Everyone is saying it needs a price cut and is pointless.


Maybe at 2/3 the price with 2/3 the markup it's more acceptable?

I don't see anyone defending this. Also, the RTX 4060 Ti 16 GB was close to a 4070 12 GB in price. Although it was technically a $100 difference, the 4070's street price had dropped, so the real gap was less than that.
 
I'm not surprised, but then again, who would be. AMD just piles on the disappointment this generation. The 7600 XT is at least 10% overpriced for what it offers. If I was shopping in this price range, I'd get the 7600 vanilla for $60 less, no questions asked.
 
I don't see anyone defending this. Also, the RTX 4060 Ti 16 GB was close to a 4070 12 GB in price. Although it was technically a $100 difference, the 4070's street price had dropped, so the real gap was less than that.

With RT factored in, it's 15% slower. I know the cool thing is to deny it exists, but I feel like, 5 years since the first-gen RT GPUs, poor raytracing performance no longer gets a pass.
 
With RT factored in, it's 15% slower. I know the cool thing is to deny it exists, but I feel like, 5 years since the first-gen RT GPUs, poor raytracing performance no longer gets a pass.

25-40 fps at 1080p in RT games makes RT mostly irrelevant in this performance category. 7% faster in raster and 15% slower in RT is a wash at best, and overall better when you're targeting 60 fps, because you'll use little if any RT. DLSS is a better feature to emphasize at this performance level.
 
With RT factored in, it's 15% slower. I know the cool thing is to deny it exists, but I feel like, 5 years since the first-gen RT GPUs, poor raytracing performance no longer gets a pass.

I have a 3060 Ti and the RT performance isn't great in newer games such as Alan Wake 2. I've had my card three years and the 4060 has worse performance.

[Chart: Alan Wake 2 RT, 1920x1080]


People talk about RT performance, but everyone I know who cares about it buys higher-end hardware.

This whole generation is a con anyway. The 7600/4060 are really a 7500 XT/4050 Ti. The 4060 Ti 16 GB should be at least a 4060. The 7700 XT should be a 7600 XT 12 GB, etc. Instead we get both of these companies just pricing around each other. They both want gamers to spend more to get a noticeable upgrade.
 
With RT factored in, it's 15% slower. I know the cool thing is to deny it exists, but I feel like, 5 years since the first-gen RT GPUs, poor raytracing performance no longer gets a pass.
Sure, RT performance is better on the 4060, no one will deny that, but we can agree that at that performance tier, most GPUs can't do raytracing properly, right? Sure, we have gems like Metro Exodus Enhanced, which runs a great RT implementation at very good framerates, but most RT-enabled games (especially nowadays) often have very, very heavy implementations, even when they aren't that good, so you need roughly 4070 Super-class performance for RT to be usable (disclaimer: I don't factor in upscaling). That pretty much limits this GPU to raster only, where it loses.

I do agree that the gpu is pointless and shouldn't be a thing, or should be cheaper.
 
So an extra $60 for 8GB of ram and some OC

Since ram helps for 4k, maybe they should have at least slapped a 256-bit mem bus on it.....................
 
Sure, RT performance is better on the 4060, no one will deny that, but we can agree that at that performance tier, most GPUs can't do raytracing properly, right? Sure, we have gems like Metro Exodus Enhanced, which runs a great RT implementation at very good framerates, but most RT-enabled games (especially nowadays) often have very, very heavy implementations, even when they aren't that good, so you need roughly 4070 Super-class performance for RT to be usable (disclaimer: I don't factor in upscaling). That pretty much limits this GPU to raster only, where it loses.

I do agree that the gpu is pointless and shouldn't be a thing, or should be cheaper.

Good that you bring up Metro EE. With DLSS Quality (which the 4060 supports), even my laptop's 4 GB 3050 can actually handle that game at an "acceptable" frame rate: 30-40 fps at medium at the Taiga level. Despite the card being 8 GB, it still comes out on top regardless...

So an extra $60 for 8GB of ram and some OC

Since ram helps for 4k, maybe they should have at least slapped a 256-bit mem bus on it.....................

It's Navi 33; it's impossible to have a 256-bit bus unless it moves up to Navi 32, and the 7700 XT already comes at 192-bit/12 GB.
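For a rough sense of what bus width alone buys, peak GDDR6 bandwidth is just bus width times data rate; a quick back-of-the-envelope sketch in Python, assuming the 7600 XT's rated 18 Gbps modules and ignoring Infinity Cache:

# Peak GDDR6 bandwidth in GB/s = bus width (bits) * data rate (Gbps) / 8.
# Assumes 18 Gbps modules (the 7600 XT's rated speed); ignores Infinity Cache hit rates.
def gddr6_bandwidth_gbs(bus_width_bits, data_rate_gbps=18.0):
    return bus_width_bits * data_rate_gbps / 8

print(gddr6_bandwidth_gbs(128))  # Navi 33 / 7600 XT: 288 GB/s
print(gddr6_bandwidth_gbs(192))  # Navi 32 / 7700 XT: 432 GB/s
print(gddr6_bandwidth_gbs(256))  # hypothetical 256-bit config: 576 GB/s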

25-40 fps at 1080p in RT games makes RT mostly irrelevant in this performance category. 7% faster in raster and 15% slower in RT is a wash at best, and overall better when you're targeting 60 fps, because you'll use little if any RT. DLSS is a better feature to emphasize at this performance level.

Pretty much the same dynamic across the entire stack, so I guess it's "fair" to call them equivalent at best.
 
Is AMD still making the 6700XT/6750XT?
I was wondering the same thing. Would have rather they just kept making/selling the 6700 XT for $299, to be honest, even if they just called it the 7600 XT.

A rebadge would have been faster and the same price?
 
As a 1080p-only card at $259, I'd recommend it. The 7600 has so little going for it, this should have been the one and only N33 product. What a super piss-poor effort from AMD; this is really N34-level stuff. And yes, Nvidia are just as bad with the 4050-class 4060 BS.
 
Disappointment is all over me...
I wonder what the real improvement is that RDNA 3 got over RDNA 2... I guess if it weren't for the 6 nm process node, the RX 7600 / 7600 XT would be no more energy efficient than the RX 6600 XT. And yes, sure, floating-point performance is doubled with the dual-issue design, but why doesn't that help in games?
 
It is trailing the 6700 XT. I'd say the 6700 XT is "more bang for the buck" because things like MPT are still functional. This means you can play with power way more than the 20% AMD exposes through its drivers.
 
Good that you bring up Metro EE. With DLSS Quality (which the 4060 supports), even my laptop's 4 GB 3050 can actually handle that game at an "acceptable" frame rate: 30-40 fps at medium at the Taiga level. Despite the card being 8 GB, it still comes out on top regardless...
Yeah, I know, right, that game is actually well made. I play at native 1440p, ultra preset, tessellation, RT reflections, RT on medium or high, no VRS, and I still get 80-110 fps; it's a very well developed game with an impressive RT implementation that somehow does not destroy your performance. It even runs at 60 fps on consoles!
But sadly most games (especially today's games) don't have those RT implementations; they usually use whatever comes with UE, which is likely to be very heavy, and needless to say, neither the 4060 nor the 7600 XT can do RT there, hell, not even some cards above them can, not even with sacrifices.
 
poor raytracing performance no longer gets a pass.
Considering that even the fastest cards slow to a crawl in a lot of RT games without the use of upscaling and frame gen, I think it's OK for any card to get a pass if it's 15% slower in RT.
 
16GB seems to help in a number of games.

[Chart: Spider-Man Remastered RT, 2560x1440]


Makes Spiderman + RT possible at 1440p with over 60 fps. A shame there does not seem to be any min fps for the games tested with RT.

[Chart: Far Cry 6 RT, 2560x1440]


A much smoother experience in FC6 at 1440p with RT on and I suspect if min FPS was available the XT would be > 60 and the standard would be sub 60.

[Chart: Ratchet & Clank average FPS, 1920x1080]

[Chart: Ratchet & Clank minimum FPS, 1920x1080]


Ratchet and Clank shows that more VRAM takes decompression load off of the GPU. I would not be surprised if more PS5-exclusive ports have this kind of behaviour going forward. Look at those minimums as well: if you want a locked 60 fps 1080p experience in R&C, the XT can deliver, and the 7600 fails miserably.

[Chart: The Last of Us Part I average FPS, 1920x1080]

[Chart: The Last of Us Part I minimum FPS, 1920x1080]


Helps in TLOU as well.

[Chart: The Last of Us Part I average FPS, 2560x1440]

[Chart: The Last of Us Part I minimum FPS, 2560x1440]


This is sub 60 fps but it shows it beating far faster cards with less VRAM like the 3070 and 4060Ti. 41 fps min with 45 average is a very stable frame rate. With tuning I bet you could get a 60 fps 1440p experience in this title but there is no way the 7600 is getting that and even more powerful cards like the 3060Ti look like they might struggle with it.

I do wish we had at least one outlet doing a [H]ardOCP-style maximum playable settings comparison so the IQ difference between the 7600 and 7600 XT was actually visible.

Another issue, which HUB showed, is that not all games suffer performance penalties with low VRAM; some just use crap textures instead, like Halo Infinite. That makes the FPS chart useless, because the IQ in those cases is not actually fixed, and as such you are not comparing like for like. Assuming this suite is not automated, there need to be massive asterisks next to cards that display typical FPS but have much worse IQ due to texture swapping, even when the settings are equal.

One thing I notice about these titles is that they are PS5 ports. The PS5 has 10 GB or more of RAM available to the GPU and a fast texture decompression engine to offload that task when streaming assets. I would not be at all surprised if more and more PS5-exclusive titles that get ported to PC have far worse performance on weaker 8 GB GPUs, because they have to spend some of their compute decompressing assets, whereas cards with 12 GB or more can keep those assets sitting in VRAM already decompressed, so all the compute can be used to render the image. This bucks the conventional wisdom that weak GPUs don't need much VRAM, and would actually make VRAM more important here than on slightly higher-tier GPUs like the 6700 XT or the 7700 XT, which have an excess of compute and so can actually do both.

As I said in another thread, for those who keep hardware for 4-5 years this is a pretty okay GPU; the 6700 XT is better if you are in a region where it is cheap, so go for that, but otherwise this is probably the next best. In my region the price difference between the 4060 and the Pulse is £20, which is about 7% more, and you can't get cheap 6700 XTs or 6750 XTs. I would not be surprised if this has legs like the RX 480 8 GB did over the 4 GB version. At the time the advice was to go for the 4 GB because nothing used more yet; then we came out of the PS4 cross-gen phase and into the 8th-gen-only phase, and 4 GB was left behind. We are starting to see the signs of that happening again with 8 GB GPUs and the 9th-gen-only phase. If you intend to upgrade again within the next 2 years then you would probably be fine with the 8 GB parts, but if not, you are going to need to make more IQ compromises to retain playable frame rates in upcoming AAA titles.
 
The VRAM discussion is misleading, because games nowadays will simply unload assets/textures if there isn't enough VRAM, so you may not get catastrophic performance issues; instead, the game will just look worse to varying degrees of noticeability. That being said, I wouldn't touch a GPU that has 8 GB, be it AMD or Nvidia, unless it's really low end and cheap.
 
AMD should slap a sticker on the box: "Our weakest offer, yet still faster than Intel's best shot." :laugh:
 
Considering that even the fastest cards slow to a crawl in a lot of RT games without the use of upscaling and frame gen, I think it's OK for any card to get a pass if it's 15% slower in RT.
I'm willing to ignore raytracing on 4070 and slower in today's games.

The benchmark for which GPUs have enough grunt to enable raytracing without seriously harming image quality overall is a moving target based on what the popular games are and how heavy their RT implementations are.

The 3090 was the only era-appropriate card I used that could actually raytrace without having to make massive compromises elsewhere. My 3070 couldn't hack the same raytraced games without killing the framerate or resorting to image-quality-destroying DLSS Balanced/Performance. Better to just disable RT and crank up all the other settings instead!

With RT factored in, it's 15% slower. I know the cool thing is to deny it exists, but I feel like, 5 years since the first-gen RT GPUs, poor raytracing performance no longer gets a pass.
Nothing at this price point can RT. It's a moot point because the equivalent $330 Nvidia card is so hopeless at RT you can't even use it.
 
image-quality-destroying DLSS Balanced/Performance
At 4K (27" display, 40" eye-to-display distance), I notice no difference between Quality and Balanced (FSR), with Performance being ever so slightly worse (still better than native 1080p I came from). I play with Balanced if FPS count is right and with Performance if it stutters too much (with RX 6700 XT, it's rather expected). I assume DLSS at Performance will be more than acceptable for me as well.
I'm willing to ignore raytracing on 4070 and slower in today's games.
Cyberpunk 2077 with some tweaks and aggressive FSR proved playable at 4K with 6700 XT with RT Reflections being enabled (other RT is disabled). Of course it's 20 to 50 FPS but it's not a particularly fast paced game to begin with. Currently play at 3200x1800 with the same settings, averaging at 48 FPS (33 to 64 FPS range). 4070 would've given me about 2.5 times the performance so idk...
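On the FSR presets mentioned above: each quality mode renders internally at a fixed fraction of the output resolution and upscales from there, which is why 4K Performance mode can still look better than native 1080p. A minimal Python sketch, assuming FSR 2's documented per-axis scale factors (Quality 1.5x, Balanced 1.7x, Performance 2.0x):

# Internal render resolution = output resolution / per-axis scale factor.
presets = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}
out_w, out_h = 3840, 2160  # 4K output
for name, factor in presets.items():
    print(f"{name}: {round(out_w / factor)} x {round(out_h / factor)}")
# Quality: 2560 x 1440, Balanced: 2259 x 1271, Performance: 1920 x 1080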
 
Cyberpunk 2077 with some tweaks and aggressive FSR proved playable at 4K with 6700 XT with RT Reflections being enabled (other RT is disabled). Of course it's 20 to 50 FPS but it's not a particularly fast paced game to begin with. Currently play at 3200x1800 with the same settings, averaging at 48 FPS (33 to 64 FPS range). 4070 would've given me about 2.5 times the performance so idk...

I think you are running at a net lower image quality just to turn RT reflections on, when you could turn them off and run closer to, or maybe even at, native resolution.
 
I think you are running at a net lower image quality just to turn RT reflections on, when you could turn them off and run closer to, or maybe even at, native resolution.
Yeah, the game runs faster with FSR at Quality and RT disabled but I'm too tired of cursed baked reflections.
 