Wednesday, April 26th 2023
AMD Radeon RX 7600 Early Sample Offers RX 6750 XT Performance at 175W: Rumor
AMD is expected to debut its performance-segment Radeon RX 7600 RDNA3 graphics card in May-June 2023, with board partners expected to show off their custom-design cards at Computex 2023 (June). Moore's Law is Dead reports that they've spoken to a source with access to an early graphics card sample running the 6 nm "Navi 33" silicon that powers the RX 7600. This card, running development drivers (which are sure to be riddled with performance limiters), offers an 11% performance uplift over the Radeon RX 6650 XT at a gaming power draw of 175 W (the RX 6650 XT pulls around 185-190 W).
This is still an early sample running development drivers, but an 11% performance boost puts it in the league of the Radeon RX 6700 XT. Should a production RX 7600 with launch-day drivers add another 5-7% on top of this, the RX 7600 could end up roughly matching the RX 6750 XT (which holds a slim lead over the RTX 3070 in 1080p gaming). Should its power draw also hold, one can expect custom-design graphics cards to ship with a single 8-pin PCIe power connector. A couple of nifty specs of the RX 7600 also leaked out in the MLID report: firstly, that 8 GB will remain the standard memory size for the RX 7600, as it is for the current RX 6650 XT; secondly, that the RX 7600 engine clock is reported to boost "above" 2.60 GHz.
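To put the claimed numbers in context, here is a minimal arithmetic sketch in Python that stacks the rumored 11% early-sample uplift with a further 5-7% from hypothetical launch-day drivers on top of an RX 6650 XT baseline normalized to 100; the normalization to 100 is an assumption for illustration, not a figure from the report.

# Relative-performance arithmetic for the rumored RX 7600 (illustrative only).
rx_6650_xt = 100.0                      # assumed baseline, normalized to 100
early_sample = rx_6650_xt * 1.11        # MLID's claimed early-sample uplift
launch_low = early_sample * 1.05        # speculative launch-driver gain, low end
launch_high = early_sample * 1.07       # speculative launch-driver gain, high end

print(f"Early sample:        {early_sample:.1f}")                    # ~111, roughly RX 6700 XT territory
print(f"With launch drivers: {launch_low:.1f} to {launch_high:.1f}") # ~117-119, roughly RX 6750 XT territory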
Source:
Moore's Law is Dead (YouTube)
91 Comments on AMD Radeon RX 7600 Early Sample Offers RX 6750 XT Performance at 175W: Rumor
That kind of massive inertia in their most popular segment isn't something a company that believes in its product would show, especially if they could retake badly needed market share by doing otherwise.
This is the proof. It's a full tier of performance below its competition in the same weight class; the RTX 4070 will leave this thing in the dust in every regard. Shame that Nvidia charges for that privilege.
I would've been pretty happy with 47 W idle on two screens, since that's basically normal behaviour using RDNA2 as a reference, and proof that VRAM isn't forced to 100% all the time. It's not Ada's 10 W idle, but it's good enough considering that by design Navi31 dumps a whole bunch of wasted power on memory-related rails. Next time you get the chance, spend some time watching HWiNFO: the core and (to an extent) the VRAM itself idle effectively; it's everything else associated with the new interconnect that doesn't.
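For anyone who wants to do that watching more systematically, below is a minimal Python sketch that averages per-rail power from a HWiNFO sensor log exported to CSV; the file name and column headers are hypothetical placeholders, since the actual sensor names differ per card and HWiNFO version.

import csv

# Hypothetical HWiNFO CSV sensor log; actual column names vary per card/version.
LOG_FILE = "hwinfo_log.csv"
RAILS = ["GPU Core Power [W]", "GPU Memory Power [W]"]  # assumed headers

totals = {rail: 0.0 for rail in RAILS}
samples = 0

with open(LOG_FILE, newline="", encoding="utf-8", errors="ignore") as f:
    for row in csv.DictReader(f):
        try:
            values = [float(row[rail]) for rail in RAILS]
        except (KeyError, ValueError):
            continue  # skip malformed or summary rows
        for rail, value in zip(RAILS, values):
            totals[rail] += value
        samples += 1

for rail in RAILS:
    if samples:
        print(f"{rail}: {totals[rail] / samples:.1f} W average over {samples} samples")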
If you've ever played a lighter game, you'd know not to wish for *more* downclocking in-game. The lack of a stable VRAM clock under lighter loads is exactly what torpedoes the Radeon experience on an otherwise strong card, and why no GeForce GPU suffers the same fate - pick a memory P-state and stick to it, even if it's not the max VRAM clock, and even if core clocks stay low.
There might be some future optimizations for excessive draw on the memory rails, but none of the other problems are unique to RDNA3. They've been around for years.
Regardless, if Navi33 is monolithic as it looks to be, it won't suffer any of the same problems as Navi31 (except Radeon's immortal hatred for multi-monitor power draw, of course).
Of course a GPU that has peak performance on a higher level is going to be able to clock lower and run more efficiently when capped to a certain framerate.
This is why I use the numbers from a large bench suite like TPU's instead of cherry-picking. The cherry-pick you made, and especially the conclusions you draw from it, couldn't be further from the truth. It might be the number today. Consider also that the 4090 and 4080 are CPU-limited more often than not, so they'll run more efficiently than a fully loaded 7900 card to begin with. Part of the reason for that is higher CPU overhead in Nvidia's drivers under newer APIs. I think right now the 4070~4070 Ti is the best indicator of the real efficiency Ada has. Those GPUs aren't limited in the test aggregate I linked.
So no, I don't believe GPUs run at max utilization all the time, but that IS the real way to measure differences in power efficiency - you need to level the playing field, which means all GPUs should run as close to 100% utilization as possible. That is the peak stock performance you get, and you will land there sooner or later as games get heavier and the GPU has to work relatively harder.
I mean, what point are you trying to make here? That Ada is more efficient? It's as efficient as the TPU test shows; no matter what you link there, it doesn't show us anything different, you just failed to interpret the numbers here. I hope this sheds some light.
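For what it's worth, here is a minimal sketch of the perf-per-watt normalization being argued about, computed from a fully loaded benchmark run as average FPS divided by average board power; the card names and figures are made-up placeholders, not measured results.

# Illustrative perf-per-watt comparison; FPS and power figures are made up.
# Measuring at (near-)100% GPU utilization keeps the comparison apples-to-apples.
cards = {
    "Card A": {"fps": 120.0, "board_power_w": 200.0},
    "Card B": {"fps": 140.0, "board_power_w": 280.0},
}

for name, data in cards.items():
    # FPS per watt is equivalent to frames rendered per joule of energy.
    efficiency = data["fps"] / data["board_power_w"]
    print(f"{name}: {efficiency:.3f} FPS/W")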
That also means the 7700 will come with 12 GB, where we should be getting 16 GB.
All this marketing about RX 6800 and 16GB and then ... 8GB. Meh...
IIRC, 2 GB GDDR6 chips consume ~40% more power than 1 GB GDDR6 chips.
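As a rough back-of-the-envelope illustration of what that per-chip figure would mean at the board level, the sketch below compares 8 GB built from 1 GB chips with 16 GB built from 2 GB chips on the same bus width; the chip count and the ~1.5 W per 1 GB chip are assumptions used purely for illustration.

# Back-of-the-envelope memory power estimate (all figures are assumptions).
CHIPS = 8                                    # assumed number of GDDR6 packages
POWER_1GB_CHIP_W = 1.5                       # assumed power per 1 GB chip
POWER_2GB_CHIP_W = POWER_1GB_CHIP_W * 1.40   # the quoted ~40% premium

config_8gb = CHIPS * POWER_1GB_CHIP_W        # 8 GB from 1 GB chips
config_16gb = CHIPS * POWER_2GB_CHIP_W       # 16 GB from 2 GB chips, same bus

print(f"8 GB  (8x 1 GB): {config_8gb:.1f} W")
print(f"16 GB (8x 2 GB): {config_16gb:.1f} W (+{config_16gb - config_8gb:.1f} W)")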
Price aside, the 4070 should murder this product in practically every regard. The performance per watt and feature set gap... Ignore that, it's not a gap, it's an abyss between them. For this GPU to be reasonable it can't cost more than $250.
And 8GB of RAM is indeed low, but it all depends on the price. I bought a 3050 laptop a few months ago, it still made lots of sense for the money, although obviously I will never be able to use high res textures with it.
If the price is right and higher VRAM alternatives are not priced obscenely this could still make sense.
Or if unable to reach an aggressive price/performance, just discount the 6xxx series, keep selling the top RDNA3 and go as fast as possible to RDNA4 for a full lineup, instead of wasting money launching GPUs with poor price/performance.
I can't help but point out the sheer karmic rebalancing at play here. AMD marketing hit on NVIDIA for being stingy with VRAM? They proceed to release an 8 GB GPU anyway. It's not even the first time; they did the same with the 4 GB 6500 XT, and even temporarily removed a hit piece from their website that claimed 4 GB GPUs were dead.
www.theverge.com/2022/1/19/22891092/amd-4g-vram-claim-gpu-launch-rx-6500-xt
AMD marketing hit on NVIDIA for 12VHPWR overheating failures? Their own MBA 7900 XTXs were cooking due to manufacturing defects in the vapor chamber.
www.tomshardware.com/news/amd-faulty-thermal-solution-7900-xtx-throttling
As if that wasn't enough, all the memes about RTX 4090 power connectors catching fire and the i9-13900KS's high power consumption became a reason for grief once socket AM5 Ryzens began catching fire inside the socket and going up in smoke, taking the motherboard along with them, all over a failure in the AGESA/SMU power management code. For the substrate to bubble up and explode like these are doing? You need an insane amount of current to damage a processor that way.
Arguing this on Discord, I was called multiple names for pointing that out, until... buildzoid came out with his video and brought up that one of the root causes of the problem could be an AGESA bug, and then no one talked me down on it again.
www.tomshardware.com/news/amd-ryzen-7000-burning-out-root-cause-identified-expo-and-soc-voltages-to-blame
Guess what AMD and motherboard vendors have done? Withdrew all current versions of BIOS and began issuing high-priority AGESA updates to everyone. I'm gonna lay down the cards here, AMD fans really need to stop being so proud and rabid in defense of this company. They're not your friends, they're not your old pals, they're not an indie, they're not an underdog. Nothing they do is out of kindness of their hearts. Begin demanding better products of them. They can do better.
If this series of blunders continues to occur routinely and every product they put out is massively overhyped, underdelivering and completely underwhelming, NVIDIA will charge you a thousand bucks for the RTX 5070 instead of just 600 for the 4070 as they do now, and Intel is gonna bring back $1,800 Extreme Edition CPUs the second they regain an absolute performance lead.
And 'RX 7600 Early Sample Offers RX 6750 XT Performance' is a joke beyond belief, as it wouldn't even compete with ngreedia's previous-gen 3070-class GPU. Well, maybe if it's priced at 250 bucks it would be an acceptable buy, but we all know that's not gonna happen. It's likely gonna get a $349 MSRP and rot on shelves like most new Radeon products lately :banghead:
Anyone making this kind of projection about yet-unreleased tech products is sure to be at least as wrong as the people planning to make those products. The semiconductor industry keeps dreaming up new ideas and throwing them away. In every generation of GPUs, the cut-down configurations of full dies are basically arbitrary, settled on whatever the engineers found to work well enough. With every new technology, the possibility of it failing or being abandoned is present.
Demanding that MLID or any leaker be 100% accurate is absolutely idiotic. People getting on their high horse and demanding 100% accuracy or bust have not understood the first thing about innovation or high tech. And if you're gonna say that the only worthwhile news is news that's been thoroughly verified:
- Zen 4 X3D is a lie since we just got a voltage issue on some
- 4070s are both a lie since their VRAM has yet to be proven insufficient broadly
- Intel was unworthy of being newsworthy for over 10 years, because 10 years later we found Heartbleed in it
- TPU is not newsworthy because it's analysing cards that'll get driver updates and will be proven completely wrong (see RDNA3 power draw day 1 and 4 months later)
And so on. The entire point of the bleeding edge is that we're always living in novelty and uncertainty. If you don't want to listen to leakers, that's your right, but getting all uppity and demanding 100% accuracy is basically denying the very concept of high tech. You're asking a field that's all about innovation to always be sure of what's going to happen, as if the leakers were either wizards with magic balls that saw the future or pants-on-fire liars. I swear that half the stupidity of this demand comes from the leakers themselves never admitting to getting anything wrong. There's a freakish aura about MLID and the like: if TPU reviews a product and it's proven inaccurate later down the line, that's ok, but if it's MLID, he should have foreseen everything perfectly. It's so dumb.
I listen to MLID; the guy is an AMD shill with a severe tendency to toot his own horn and listen to himself talk, but the actual quality of conversation (his podcasts, except the ones with his bro, where it's MLID agreeing with MLID), general industry analysis and overall intelligence are all high. He's one of the most interesting people to listen to if you can overcome the shilling and self-aggrandizing. And nobody should ask of him, or of any news website, to get everything right. It's all about evaluating how right and how wrong they tend to be and weighing the validity of their statements, not demanding 100% truth on educated guesses and climbing a 40 m ladder up to your saddle when they get a guess wrong.
2. The 4070 isn't a lie. It's got 12 GB VRAM. It's a fact. Whether you like it or not is up to you to decide.
3. We found Heartbleed years after release of the first affected CPU generation. It's a backdoor that somebody found, not a design feature of a product.
4. TPU analyses cards with release drivers. Release drivers aren't a lie. They exist. No one can test the future, obviously.
I am not demanding 100% accuracy from MLID. What I want is for TPU not to use YouTube channels that are famous for making random shit up as a source for newsworthy articles. Or if that's fine, then I think (based on nothing) that the 7700 XT will have 3840 shader cores and 16 GB VRAM over a 256-bit bus and will fall between the 6800 XT and 6900 XT. That's it, I've said it. Can somebody write an article now? :rolleyes:
MLID is not a leak channel. It's a fake channel.
AMD is at this stage our only real chance of getting any kind of competition. Intel was the fattest monopoly in tech for far too long. Their prices and laziness were the stuff of legends.
Nvidia is currently an untouchable king in parallel processing (aka GPUs). There is nothing getting close. AMD is barely capable of offering somewhat capable products for gaming.
We don't like AMD because we like red or for Lisa Su's pretty face. We like them because AMD managed to go from near bankruptcy to actually taking Intel down from their status as Lazy Gods.
Nvidia is being extremely greedy right now, and unsurprisingly so. We support AMD because we think that they are our only shot at actually opening up this terrible market into something more palatable. And as for this stupid line:
- AMD IS an underdog by any definition of the term. They were, and still are, woefully understaffed and underpowered versus Nvidia and Intel. Their sales in CPUs and server chips are nothing short of extraordinary when you consider that AMD, all sectors combined (CPU and GPU), had a quarter of Intel's personnel not two years ago. I've heard more than once that RTG's entire staff is about the size of Nvidia's driver team alone.
- "AMD is not indie" is a meaningless phrase; no industrial-scale manufacturer or designer can be indie. Nobody's expecting Raspberry Pi power or prices, we just expect competition, and that can only come from an industrial provider. You don't need to be a mom-and-pop corner shop to get my sympathy, you need to make me see that you're bringing something good for people. AMD is bringing something good across the board in CPUs, and I want to see that in GPUs too.
- "Not your friends/pals" yes I'm aware of how capitalism works, thanks. You will see me supporting Intel and Nvidia if AMD ever succeeds at bringing them down and starts becoming the monopoly. It's all about playing giants one against the other so that we humble buyers can reap the best situation.
- "Begin demanding better products" is another entirely silly idea, since the only ways to "demand" that is to either wait for them to produce them, buy the competition, or do not buy at all. High grade consumer chips are a duopoly in both GPUs and CPUs. That translates to "buy Nvidia, buy Intel, or buy nothing and pout". NONE of these options help the consumers at all. What you're asking for is basically the attitude of a teenager that wants his mum to buy him a new videogame. If you want to actually help "better products", you have two choices: go work for AMD and help them actually make these better products, or build your own company that'll compete better than AMD or Nvidia. Or find a way to force Jensen to lower prices and his margins.
"They can do better" is the most void and pointless phrase from a consumer that is encouraging no competition, encouraging no growth, and does nothing at all but sit at home and wait for the big players to either make him cough up the money, or not cough it up. Nothing you're suggesting is helping at all. "Random shit up" isn't a thing MLID does.And yes, it's fine for you to invent that and for it to be a news source on TPU. As long as you get it right. If you get it wrong, then its not fine.
And as I've explained at length, the one and only metric that matters is how often you get it right vs how often you don't.
MLID is solid enough that he can claim to mostly get things right. That's why he's in articles.
I agree with the above (not quoted) part of your post. AMD will never become better than Nvidia if everyone takes the "wait and see" approach. Of course, we're not charity organisations to buy bad products just to support an underdog company, but AMD's products at this point are not so much worse that one can't have a decent gaming experience on an AMD system. Not to mention the price disparity. I'd much rather have an "inferior" AMD card for 50-100 bucks less than supply the green greed and make sure the next generation of Nvidia cards ends up being even more expensive than this one.
I'm just aiming for 2 things:
- proper idling at 30 W without taking the piss (not five entire minutes from watching a video to properly powering back down)
- video playback not being at a ridiculous 70W+ but rather around 40 to 50W
In games, I have no complaints. I'm quite happy. Aye, that's basically the gist of my demand. I want it optimised. Agree, we shall see soon. Perhaps I'll make a BollocksTech channel then.
If I imitate a convincing British accent, I'm sure to get all the Americans to subscribe to my every word.
If anything, I bought RDNA3 because my choices were between extremely overpriced and flawed. I bought it thinking that I'm giving AMD a good chance here: they can fix/optimise the card until it is worth its price, or not. If not, I will make sure to go pay the Jensen Tax next time.
Also, the entire open-source aspect did tip my decision AMD's way. There is a strong streak of open-sourcing valuable technologies that Nvidia certainly can't claim to have on their side. Inferior or not, AMD tech does feel like it's trying to be passed along to the public and like it's generally trying to help people. DLSS may be amazeballs, but if it's both vendor-locked and generation-locked, it can go in the gutter. FSR actually broadly helps, even if in an insufficient way for now.
(and tbh, Adrenalin is the kind of software quality that makes me want to never go back to Nvidia's GeCrap Experience) It's never about purity, it's about results.
AMD's results have been to take down Intel back to Earth. AMD's results may yet be to take down Nvidia to Earth too.
I'd rather support that than anything else.
Although if Intel actually bothered to go on with their ARC lineup, I might be interested as well. The hardware wasn't there on gen 1 (large chips for poor performance), but the software had some great highlights (XeSS and the video encoders). I've reconsidered.
I'll make a "HatedTV" channel.
Instead of making wild claims in a thick Scottish accent about how amazing AMD's next thing will be, I'll make wild claims in a thick French accent about how Intel's stuff is going to ruin us all with how good it is.
And I'll shill some shitcoins for extra Hatred.
The guy tested the 4090 before it existed: :laugh: