
AMD Radeon RX 7600 XT Launches on May 25

btarunr

Editor & Senior Moderator
AMD Radeon RX 7600 XT reportedly launches on May 25, 2023. Moore's Law is Dead scored the key dates associated with the launch. The upcoming performance-segment graphics card is rumored to be based on the 5 nm "Navi 33" silicon and the RDNA3 graphics architecture. Apparently, the tech press should have its samples to test by May 15, and AMD is taking a similar approach to NVIDIA's recent GeForce RTX 4070 launch, where cards priced at MSRP are eligible for a review embargo that lifts a day earlier than that of non-MSRP cards. Reviews of MSRP cards go live on May 24, with those of non-MSRP cards following the next day on May 25, alongside market availability. It's no wonder we heard reports of RX 7600 series cards being shown off at Computex; all those cards will be available to purchase by then.



View at TechPowerUp Main Site | Source
 
Did you guys see the rumour that AMD is releasing a 7800XTX, 7800XT and 7700XT with 16 and 12 GB of VRAM respectively?
 
Did you guys see the rumour that AMD is releasing a 7800XTX, 7800XT and 7700XT with 16 and 12 GB of VRAM respectively?
That's the rumor, since the 7900 XT and 7900 XTX have more than a 256-bit bus.

7800 XT = 256 bit, replacing RX 6800 XT 16 GB.
7700 XT = 192 bit, replacing RX 6700 XT 12 GB.
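The bus-width-to-capacity pairing above follows from how GDDR6 is wired: each memory chip sits on a 32-bit channel, and 2 GB (16 Gb) chips are the common density. A quick sketch of that arithmetic (the function name is just illustrative):

```python
# Illustrative: VRAM capacity implied by memory bus width, assuming one
# common 2 GB (16 Gb) GDDR6 chip per 32-bit channel.

def vram_gb(bus_width_bits: int, chip_gb: int = 2) -> int:
    """Total VRAM for one chip per 32-bit channel."""
    return (bus_width_bits // 32) * chip_gb

print(vram_gb(256))  # 8 chips x 2 GB = 16 GB, as on the RX 6800 XT
print(vram_gb(192))  # 6 chips x 2 GB = 12 GB, as on the RX 6700 XT
```

Doubling capacity at the same bus width requires denser chips or clamshell mode, which is why bus width is such a strong hint about VRAM in these rumors.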

Major improvements are raytracing hardware and AV1 encoding support.
 
I'll never understand ray tracing, it just doesn't impress me at all personally.
 
I'll never understand ray tracing, it just doesn't impress me at all personally.

That's the same reaction we got to tessellation, contact hardened shadows, subsurface scattering, unified shaders, HDR lighting, bloom, hardware T&L, etc.

Games have used it sparingly and conservatively due to hardware being largely inadequate for it (AMD cards need not apply and you need the most expensive Nvidia GPUs to use it well), but even then there have been a few cases of exceptional raytracing usage. Metro Enhanced and Cyberpunk's new Overdrive mode are great examples.

It'll take some time, but it is the next major evolution in computer graphics, and in due time, it will become a requirement.
 
That's the same reaction we got to tessellation, contact hardened shadows, subsurface scattering, unified shaders, HDR lighting, bloom, hardware T&L, etc.

Games have used it sparingly and conservatively due to hardware being largely inadequate for it (AMD cards need not apply and you need the most expensive Nvidia GPUs to use it well), but even then there have been a few cases of exceptional raytracing usage. Metro Enhanced and Cyberpunk's new Overdrive mode are great examples.

It'll take some time, but it is the next major evolution in computer graphics, and in due time, it will become a requirement.

I would remind you they said the same thing about PhysX (which was also pretty cool when done right; the papers flying around in the Batman games looked pretty neat)
 
I'll never understand ray tracing, it just doesn't impress me at all personally.

IMO, they are placing too much focus on the wrong thing, and have done so since ray tracing was introduced.

Supposedly, the more "audience" a game COULD have, the more potential customers the game's publisher could have, so I find it VERY ODD for game publishers to PURPOSELY cripple their games by having such RIDICULOUS minimum requirements, which are AGGRAVATED EVEN FURTHER with ray tracing.

Were "the penalty" for enabling ray tracing like ... say ... 15% or 20% tops, THEN it would be justified, but "the penalty" is often in the 40% range and sometimes crosses the 50% mark (AMD cards mostly, but not always), which makes the tech premature, IMO: it needs to be developed further.
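To put those penalty figures in perspective, here's a quick back-of-the-envelope calculation (the percentages are illustrative, not measured benchmarks):

```python
# Illustrative only: effective frame rate after an RT performance penalty.
# Penalty percentages are hypothetical examples, not benchmark results.

def fps_with_rt(base_fps: float, penalty_pct: float) -> float:
    """Frame rate remaining after losing penalty_pct percent of performance."""
    return base_fps * (1 - penalty_pct / 100)

for penalty in (15, 20, 40, 50):
    print(f"{penalty}% penalty: 100 FPS becomes {fps_with_rt(100, penalty):.0f} FPS")
```

A 15-20% penalty keeps a 100 FPS game comfortably above 80 FPS, while a 40-50% penalty drops it toward 60 FPS or below, which is why the size of the hit, not just its existence, drives the argument here.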


Were the game's minimum requirements "more down to earth" while ALSO having "higher quality modes", their games could have A LOT MORE potential buyers: the game's publishers are shooting their own feet ... with a cannon ...
 
That's the same reaction we got to tessellation, contact hardened shadows, subsurface scattering, unified shaders, HDR lighting, bloom, hardware T&L, etc.

The difference is that most of those technologies provided an obvious visual benefit at a much lower performance cost at the time they were popularized. Even with dedicated hardware accelerators, 1st-generation RT hardware can take up to an 80% FPS hit with RT enabled, and the visual benefit is often subjective. On top of that, Nvidia has been cranking up the price of its graphics cards using RT as an excuse. Did the price of graphics cards double when tessellation or SSAO were introduced? Nope. Nvidia is pushing RT because it sells graphics cards and is great at quickly making last-gen cards obsolete, and tons of people are ready to parrot that sales line, ignoring the obvious fact that the benefit-to-cost ratio is not nearly there yet.

Games have used it sparingly and conservatively due to hardware being largely inadequate for it (AMD cards need not apply and you need the most expensive Nvidia GPUs to use it well), but even then there have been a few cases of exceptional raytracing usage. Metro Enhanced and Cyberpunk's new Overdrive mode are great examples.

They use it sparingly because very few people have PCs capable of it in general. You have to spend at least $500 to get something like a 3070 / RX 6800, and even then, the 3070 is already obsolete in new AAA games. $500 doesn't even get you a card capable of playing AAA games without issues anymore; surely that is a joke.

If you need to spend $1,200+ on a graphics card to get terrible performance, how in the world do you expect that to be a feature devs are going to put a lot of effort into? It's the same ordeal as SLI: not worth it for the few people who can actually afford to spend that much on a GPU.

IMO, they are placing too much focus on the wrong thing, and have done so since ray tracing was introduced.

Supposedly, the more "audience" a game COULD have, the more potential customers the game's publisher could have, so I find it VERY ODD for game publishers to PURPOSELY cripple their games by having such RIDICULOUS minimum requirements, which are AGGRAVATED EVEN FURTHER with ray tracing.

Were "the penalty" for enabling ray tracing like ... say ... 15% or 20% tops, THEN it would be justified, but "the penalty" is often in the 40% range and sometimes crosses the 50% mark (AMD cards mostly, but not always), which makes the tech premature, IMO: it needs to be developed further.


Were the game's minimum requirements "more down to earth" while ALSO having "higher quality modes", their games could have A LOT MORE potential buyers: the game's publishers are shooting their own feet ... with a cannon ...

You cannot blame game devs for the greed of Nvidia and AMD. People would not be complaining if GPU prices over the last few gens had been reasonable. This system requirement increase was long overdue, games have been stuck on 8GB of VRAM for 7 years.
 
unified shaders, HDR lighting, bloom, hardware T&L

Woah what. Unified shaders was a holy grail of 3D raster tech advancement, and hardware T&L was irrefutably the single greatest leap in graphics horsepower at the time with massive research and development put into making geometry transform logic chips from the most important players in the industry such as TI, IBM, SGI, 3DLabs, ArtX and NVIDIA. There was no lukewarm reception to either one, the entire industry pivoted on those without question. They're not even remotely on the same level as the other feature improvements and secondary shader techniques you listed, they fundamentally changed everything about the way hardware accelerated graphics worked. Ray tracing is still on the horizon as that holy grail, but RTX ain't it yet. It's currently only supplemental and most importantly secondary.
 
games have been stuck on 8GB of VRAM for 7 years

I was referring to minimum requirements.

Games CAN STILL HAVE "all the bells and whistles", which could well require 16+ GB if need be, but by having such high minimum requirements, they are effectively cutting off a SIGNIFICANT portion of potential customers: this is the point I was trying to make.
 
I'll never understand ray tracing, it just doesn't impress me at all personally.
Unless you bake everything, which is unreasonable for most devs and doesn't work in games with dynamic systems, you need RT for proper lighting.
 
I'll never understand ray tracing, it just doesn't impress me at all personally.
Not yet, anyway. Most RT implementations currently are added on as an afterthought. I think RT has great potential to look a lot better by the end of the decade.
 
Unless you bake everything, which is unreasonable for most devs and doesn't work in games with dynamic systems, you need RT for proper lighting.
Then I must have been gaming blindfolded for the last 20 odd years! Interesting please tell us more
 
I would remind you they said the same thing about PhysX (which was also pretty cool when done right; the papers flying around in the Batman games looked pretty neat)
Even GTA-4 had a good implementation of PhysX for how bad the PC port was. I played that game using both AMD and Nvidia cards back in the day; there were things like cars reacting to bumps in the road, small floating objects interacting with the player while swimming, etc., all of which were absent while playing on AMD cards.
 
Woah what. Unified shaders was a holy grail of 3D raster tech advancement, and hardware T&L was irrefutably the single greatest leap in graphics horsepower at the time with massive research and development put into making geometry transform logic chips from the most important players in the industry such as TI, IBM, SGI, 3DLabs, ArtX and NVIDIA. There was no lukewarm reception to either one, the entire industry pivoted on those without question. They're not even remotely on the same level as the other feature improvements and secondary shader techniques you listed, they fundamentally changed everything about the way hardware accelerated graphics worked. Ray tracing is still on the horizon as that holy grail, but RTX ain't it yet. It's currently only supplemental and most importantly secondary.
Not RTX eye-candy by itself, but what enables you to get to path tracing in real time: RT hardware cores topped with AI cores (Tensor cores, in NV's case) for image upscaling/generation.
Those are the new standards that any player will need to incorporate somehow in order to stay afloat. There's no future without them (that is, sticking only to raster).
 
Did you guys see the rumour that AMD is releasing a 7800XTX, 7800XT and 7700XT with 16 and 12 GB of VRAM respectively?
Did you see the bit where it was said 7800XTX was most likely cancelled as it was going to be a cut down N31 die like W7800.

7700XT with only 12GB would make AMD seem like trolls given how much they were taunting Nvidia over lack of memory. 7700XT brings nothing over the 4070 and is said to be slower.
 
Raytracing: I have never used it, and having seen it, I don't think I'm enjoying gaming any less. People are quick to quote the narrative when the truth is that 99% of PC games have no ray tracing; TWWH3, Company of Heroes 3 and Project Cars come to mind. Nobody with a 7900 XT(X) is complaining (except one user) in the 7000 club forum, and I don't see many threads about ray tracing there. Then there is the caveat that the only Nvidia card that beats the 7900 XTX is the 4090, so if you want to pay an extra $800 for ray tracing, be my guest while I enjoy the hell out of Humble Choice. Aliens Fire Team, here we come. Even though CP2077 is an Nvidia-sponsored title, it still runs absolutely butter-smooth on my rig.

I expect these cards to move up a tier in performance, so the 7600 XT will have the performance of the 6700 XT at 1080p, the 7700 XT will have the performance of the 6800 XT at 1440p, and the 7800 XT will be a little faster than a 6950 XT with better power draw. What I really want is one of those Z1 chips or a 7940 desktop APU for $200-300.

Did you see the bit where it was said 7800XTX was most likely cancelled as it was going to be a cut down N31 die like W7800.

7700XT with only 12GB would make AMD seem like trolls given how much they were taunting Nvidia over lack of memory. 7700XT brings nothing over the 4070 and is said to be slower.
Yeah, I saw that, but I won't say it is cancelled. We don't know, and the 6700 XT is currently the best-value card; if it's 30% faster than that, it will be fine.
 
Then I must have been gaming blindfolded for the last 20 odd years! Interesting please tell us more
Rather, I'd say you have been living blindfolded if you think games have proper lighting! You can argue that you don't need GI, reflections, or accurate shadows to enjoy gaming, but that's another argument entirely.
 
I'll never understand ray tracing, it just doesn't impress me at all personally.
The worst part: as ray tracing is implemented in more and more scenarios, current-gen GPUs will drop even further in performance. You can't even buy an RT-enabled card now for future-proofing.

Just imagine how slowly a $2,000 4090 will render a fully path-traced game a couple of years from now. We are talking slideshow frame rates.
 
@W1zzard will you be getting a sample to review?
 
Rather, I'd say you have been living blindfolded if you think games have proper lighting! You can argue that you don't need GI, reflections, or accurate shadows to enjoy gaming, but that's another argument entirely.
Not at all; gaming is an entertainment medium. I play games with RT and I literally don't care much about the difference in visuals, yet I do see a massive performance hit for arguably the same image.

Thing is, if you need to tell and convince people it's better, that on its own is proof it won't truly go places anytime soon. It will only get anywhere once the price/bar of entry is sufficiently low.
 
I'll never understand ray tracing, it just doesn't impress me at all personally.
Cinematic-level realistic RT is wonderful, but these GPUs, even the 4090, don't have a tenth of the power needed to run games at that level of RT. So yes, it's a joke that Nvidia created and will insist on dragging everyone through to the end.
 
Woah what. Unified shaders was a holy grail of 3D raster tech advancement, and hardware T&L was irrefutably the single greatest leap in graphics horsepower at the time with massive research and development put into making geometry transform logic chips from the most important players in the industry such as TI, IBM, SGI, 3DLabs, ArtX and NVIDIA. There was no lukewarm reception to either one, the entire industry pivoted on those without question. They're not even remotely on the same level as the other feature improvements and secondary shader techniques you listed, they fundamentally changed everything about the way hardware accelerated graphics worked. Ray tracing is still on the horizon as that holy grail, but RTX ain't it yet. It's currently only supplemental and most importantly secondary.

I agree, that's why I mentioned them. They were each a huge leap in graphics fidelity. But take unified shaders, for example: despite all of their benefits, the average gamer clung to Windows XP and DirectX 9.0c/Shader Model 3.0 for dear life, mostly as a consequence of rejecting Windows Vista. We had AAA games releasing that way well into 2014.

Needless to say, my point was that great new technologies often face hurdles in adoption only because people have prejudice towards something unrelated, because the technology raises the cost of the hardware, or because they don't like how it raises system requirements significantly. Ray tracing is no different.
 