Wednesday, April 26th 2023

AMD Radeon RX 7600 Early Sample Offers RX 6750 XT Performance at 175W: Rumor

AMD is expected to debut its performance-segment Radeon RX 7600 RDNA3 graphics card in May-June 2023, with board partners expected to show off their custom-design cards at Computex 2023 (June). Moore's Law is Dead reports that it has spoken to a source with access to an early graphics card sample running the 6 nm "Navi 33" silicon that powers the RX 7600. This card, running development drivers (which are sure to be riddled with performance limiters), offers an 11% performance uplift over the Radeon RX 6650 XT at a gaming power draw of 175 W (the RX 6650 XT pulls around 185-190 W).

This is still an early sample running development drivers, but an 11% performance boost already puts it in the league of the Radeon RX 6700 XT. Should a production RX 7600 with launch-day drivers gain another 5-7% over this, the RX 7600 could end up roughly matching the RX 6750 XT (a slim performance lead over the RTX 3070 in 1080p gaming). Should its power draw also hold, one can expect custom-design graphics cards to ship with a single 8-pin PCIe power connector. A couple of nifty specs of the RX 7600 also leaked out in the MLID report: firstly, that 8 GB will remain the standard memory size for the RX 7600, as it is for the current RX 6650 XT; secondly, that the RX 7600 engine clock is reported to boost "above" 2.60 GHz.
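To put the quoted figures in perspective, here is a quick back-of-the-envelope calculation; it is a minimal sketch using only the numbers in the MLID report (the 11% uplift, the 175 W draw and the 185-190 W RX 6650 XT baseline), with the extra 5-7% from launch-day drivers treated as the hypothetical scenario described above:

```python
# Back-of-the-envelope arithmetic using only the figures quoted in this report.
# The RX 6650 XT is the baseline (1.00x, ~185-190 W gaming power draw).

baseline_perf = 1.00
baseline_power_w = (185 + 190) / 2      # midpoint of the reported RX 6650 XT draw

sample_perf = baseline_perf * 1.11      # early RX 7600 sample: +11% over RX 6650 XT
sample_power_w = 175                    # reported gaming power draw of the sample

# Hypothetical launch-day drivers adding another 5-7% on top of the sample
launch_perf_low = sample_perf * 1.05
launch_perf_high = sample_perf * 1.07

# Relative performance-per-watt of the sample versus the RX 6650 XT
ppw_gain = (sample_perf / sample_power_w) / (baseline_perf / baseline_power_w) - 1

print(f"Early sample: {sample_perf:.2f}x the RX 6650 XT at {sample_power_w} W")
print(f"With +5-7% from launch drivers: {launch_perf_low:.2f}x to {launch_perf_high:.2f}x")
print(f"Implied perf-per-watt gain of the sample: {ppw_gain:.0%}")
```

That works out to roughly 1.17-1.19x the RX 6650 XT with launch-day drivers, and an implied efficiency gain of around 19% for the early sample.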
Source: Moore's Law is Dead (YouTube)

91 Comments on AMD Radeon RX 7600 Early Sample Offers RX 6750 XT Performance at 175W: Rumor

#26
watzupken
I feel the performance is within expectations. Navi 33, as I recall, is almost like an RDNA 2+; there is little in common between Navi 31 and Navi 33, since the latter is built on 6 nm as a monolithic design. I do wonder if it's got dedicated RT cores like the actual RDNA3 chips.
Posted on Reply
#27
Dr. Dro
This is precisely why I brought up my suspicion that they couldn't make a product competitive with the RTX 4070 at the same wattage.

That massive inertia in their most popular segment isn't something a company that believes in its product would allow, especially when moving faster could retake badly needed market share.

This is the proof. It's a full tier of performance below its competition in the same weight class; the RTX 4070 will leave this thing in the dust in every regard. Shame that Nvidia charges for that privilege.
Posted on Reply
#28
tabascosauz
MahboiRocking an RX 7900 XT right now, I can say that the biggest sign of immaturity of the card is its power draw. It constantly stays above what's necessary, or jitters in power draw, and has a very hard time properly downpowering.

For example I can run Overwatch 2 on Ultra (also Epic, but frankly I saw no visual difference so why bother) with Radeon Chill on, and comfortably get 140fps: card eats 150W.
I then start a basic Youtube video/Twitch and just idle away: card eats 70W.
Downpowering back to the best possible idle power draw (29W) is a test of patience because the card will literally dance around 40-55W for several minutes before entirely downpowering to its real minimum, even when all you do is idle away or just type on these forums.

All of that to say that the card clearly has the chops for good efficiency, but is woefully immature in its power usage.
I'm eagerly waiting for the next driver update to see what actually gets done, and I'm expecting to gain a few watts with each new driver over the next year or so.

To be fair, between the huge upcoming APUs and the value of cards that have less than 12 GB of VRAM, I'm really expecting the low-end enthusiast segment to be squeezed out in the coming years. No need to put in big efforts when you can just shove 10-40 RDNA3 CUs onto your CPU...
The regular fluctuations in idle and power are normal, it's just the result of more precisely and dynamically controlled core and mem clocks. Same way Ryzen idle is touchy. Ever experienced the 90W multi monitor idle on Navi31? I guess not......

I would've been pretty happy with 47W idle on two screens, since that's basically normal behaviour using RDNA2 as a reference, and proof that VRAM isn't forced to 100% all the time. It's not Ada's 10W idle, but it's good enough considering that by design Navi31 dumps a whole bunch of wasted power on mem-related rails. Next time you get the chance, spend some time watching HWInfo. Core and (to an extent) VRAM itself idles effectively; it's everything else associated with the new interconnect that doesn't.

If you've ever played a lighter game, you know not to wish for *more* downclocking in-game. The lack of a stable VRAM clock under lighter loads is exactly what torpedoes the Radeon experience on an otherwise strong card, and why no GeForce GPU suffers the same fate - pick a memory Pstate and stick to it, even if it's not max VRAM clock, and even if core clocks stay low.

There might be some future optimizations for excessive draw on the memory rails, but none of the other problems are unique to RDNA3. They've been around for years.

Regardless, if Navi33 is monolithic as it looks to be, it won't suffer any of the same problems as Navi31 (except Radeon's immortal hatred for multi-monitor power draw, of course).
Posted on Reply
#29
Vayra86
napataEfficiency is a complicated topic and can't be reduced to a single game. If you use Doom Eternal, for example, then it's suddenly a 20-30% difference. The 4080 only uses 280-290W in it while being faster than a 7900XTX IIRC, so it's very much an example in favor of Ada, but it's a good example of how much a game can swing the results in favor of either Nvidia or AMD.

Although you're that guy who mistakenly believes GPUs always hit max power consumption, right? That reminds me that I should deliver you some screenshots of synthetic GPU workloads, like TSE or Port Royal, where my 4090 runs below 400W.


Also once you cap games with a framelimit RDNA3 loses a ton of efficiency. That might not matter to you if you never run a cap but a lot of people run games with frame caps.
Euh... what? Can you please kick in some more open doors while you're at it?

Of course a GPU with a higher peak performance is going to be able to clock lower and run more efficiently when capped to a certain framerate.
This is why I use the numbers from a large bench suite like TPU's instead of cherry-picking. The cherry pick you made, and especially the conclusions you draw from it, couldn't be further from the truth. That might be the number today, but it won't stay that way. Consider also that the 4090 and 4080 are CPU-limited more often than not, so they'll run more efficiently than a fully loaded 7900 card to begin with; part of the reason for that is higher CPU overhead in Nvidia's drivers in newer APIs. I think right now the 4070~4070 Ti is the best indicator of the real efficiency Ada has, since those GPUs aren't limited in the test aggregate I linked.

So no, I don't believe GPUs run at max utilization all the time, but that IS the real way to measure differences in power efficiency: you need to level the playing field, which means all GPUs should run as close to 100% utilization as possible. That is the peak stock performance you get, and you will land there sooner or later as games get heavier and the GPU has to work relatively harder.

I mean, what point are you trying to make here? That Ada is more efficient? It's as efficient as the TPU test shows; no matter what you link there, it doesn't show us anything different, you just failed to interpret the numbers here. I hope this sheds some light.
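As a quick illustration of the point about leveling the playing field, here is a minimal sketch with entirely hypothetical FPS and wattage figures (not measurements of any real card):

```python
# Hypothetical numbers only: they illustrate why frame-capped perf/W comparisons
# mostly reflect how far the faster card can downclock, not full-load efficiency.

def perf_per_watt(fps: float, watts: float) -> float:
    return fps / watts

# Near-100% utilization, uncapped (hypothetical figures)
uncapped = {"GPU A": (144, 285), "GPU B": (120, 300)}

# Same game with a 120 fps cap: both deliver identical frame rates, so the
# faster card spends part of each frame idle and its advantage gets exaggerated.
capped = {"GPU A": (120, 150), "GPU B": (120, 230)}

for name in uncapped:
    fps_u, w_u = uncapped[name]
    fps_c, w_c = capped[name]
    print(f"{name}: uncapped {perf_per_watt(fps_u, w_u):.2f} fps/W, "
          f"capped {perf_per_watt(fps_c, w_c):.2f} fps/W")
```

In this made-up example the efficiency gap widens under the cap even though nothing about either card's full-load behaviour changed, which is why the comparison only becomes apples-to-apples when both run as close to 100% utilization as possible.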
Posted on Reply
#30
john_
Performance looks good. VRAM capacity, nope.... nope.... nope.... nope. It should have been at least 12GB, with the RX 6600 kept in the market as a cheaper, 8GB VRAM option.
That also means the 7700 will come with 12GB, when we should be getting 16GB.
All this marketing about the RX 6800 and 16GB and then... 8GB. Meh...
Posted on Reply
#31
londiste
Vayra86However I don't quite get anyone who's pissing over general RDNA3 efficiency, its damn close to Ada really. And it does that while having much more VRAM, which does cost power.
RDNA3 cards so far are using GDDR6 vs GDDR6X on Ada. At least in the RTX 3000 series, GDDR6X proved to be quite a power hog.
IIRC, 2GB GDDR6 chips consumed ~40% more power vs 1GB GDDR6 chips.
Posted on Reply
#32
Dr. Dro
londisteRDNA3 cards so far are using GDDR6 vs GDDR6X on Ada. At least in the RTX 3000 series, GDDR6X proved to be quite a power hog.
IIRC, 2GB GDDR6 chips consumed ~40% more power vs 1GB GDDR6 chips.
2nd-generation G6X (introduced with the 3090 Ti) greatly alleviated power consumption, and the 4070, which is the GPU in the same power-consumption tier, uses standard G6, doesn't it? Because if not... the performance-per-watt disparity is even more extreme.

Price aside, the 4070 should murder this product in practically every regard. The performance per watt and feature set gap... Ignore that, it's not a gap, it's an abyss between them. For this GPU to be reasonable it can't cost more than $250.
Posted on Reply
#33
BoboOOZ
john_Performance looks good. VRAM capacity, nope.... nope.... nope.... nope. It should have been at least 12GB, with the RX 6600 kept in the market as a cheaper, 8GB VRAM option.
That also means the 7700 will come with 12GB, when we should be getting 16GB.
All this marketing about the RX 6800 and 16GB and then... 8GB. Meh...
It's not out yet, but if you listen to the whole video, it sounds like AMD really isn't sure what to launch in this market, which is kinda funny.

And 8GB of VRAM is indeed low, but it all depends on the price. I bought a 3050 laptop a few months ago, and it still made lots of sense for the money, although obviously I will never be able to use high-res textures with it.

If the price is right and higher VRAM alternatives are not priced obscenely this could still make sense.
Posted on Reply
#34
TheinsanegamerN
BoboOOZIt's not out yet, but if you listen to the whole video, it sounds like AMD really isn't sure what to launch in this market, which is kinda funny.
I said something similar a few days ago and was mocked. I don't understand how AMD doesn't know what to do; they've been doing this for 20 years now. Either that or they have a LOT of unused RDNA2 lying around, which is even funnier.
BoboOOZAnd 8GB of VRAM is indeed low, but it all depends on the price. I bought a 3050 laptop a few months ago, and it still made lots of sense for the money, although obviously I will never be able to use high-res textures with it.

If the price is right and higher VRAM alternatives are not priced obscenely this could still make sense.
Laptops are a different beast; on a small screen you're probably not going to see the difference between medium and ultra in some games, or need fancy AA options. My venerable Alienware 15 R2 finally ran into issues this year with its 3GB framebuffer on the 970M. I'd say I got my use out of it.
Posted on Reply
#35
BoboOOZ
TheinsanegamerNI said something similar a few days ago and was mocked. I don't understand how AMD doesn't know what to do; they've been doing this for 20 years now. Either that or they have a LOT of unused RDNA2 lying around, which is even funnier.
Well, it's unclear whether they are able to achieve an aggressive price/performance ratio this generation. It seems that they thought the initial pricing on the 7900s was aggressive. That's really funny, actually.
TheinsanegamerNLaptops are a different beast; on a small screen you're probably not going to see the difference between medium and ultra in some games, or need fancy AA options. My venerable Alienware 15 R2 finally ran into issues this year with its 3GB framebuffer on the 970M. I'd say I got my use out of it.
Well, I still play with it more often than not docked to a 1440p ultrawide. But that's irrelevant; what I mean is that it's not like 6GB, 8GB or 10GB GPUs should cease to exist right this very moment. It's just that they shouldn't be marketed as no-compromise options. If the price is right, the compromise can be perfectly acceptable.
Posted on Reply
#36
john_
BoboOOZIt's not out yet, but if you listen to the whole video, it sounds like AMD really isn't sure what to launch in this market, which is kinda funny.
It's not funny. It's a problem. When the competition has both the performance and the support of consumers, it's a problem. Even if you come out with a product at a much better price point, a simple price drop on the competitor's equivalent product and you are finished. Considering Nvidia will, no matter what, build the VRAM obsolescence switch into its products, AMD can only play the "more VRAM at ANY price point" marketing card to have some kind of advantage. Offering cards with the same VRAM at the same price points only helps Nvidia to sell more cards. We see it with the RTX 3050 and RX 6600. The RX 6600 is faster and cheaper, yet people buy the RTX 3050. But "more VRAM", that could sell cards on its own.
BoboOOZAnd 8GB of VRAM is indeed low, but it all depends on the price. I bought a 3050 laptop a few months ago, and it still made lots of sense for the money, although obviously I will never be able to use high-res textures with it.

If the price is right and higher VRAM alternatives are not priced obscenely this could still make sense.
8GB of VRAM makes the cards cheaper to produce and market, but we've seen that a lower price is not enough to persuade people to prefer AMD cards over Nvidia when the VRAM is the same. Considering the prices of modern cards and the fear that future models will be even more expensive, a GPU with more VRAM could be seen as a more future-proof product and a way to avoid having to pay again in 1-2 years for a new GPU. Unfortunately AMD decided to cheap out here, while they did the complete opposite in CPUs, where they built a platform as tough as a tank to make it future-proof.
Posted on Reply
#37
BoboOOZ
john_8GB of VRAM makes the cards cheaper to produce and market, but we've seen that a lower price is not enough to persuade people to prefer AMD cards over Nvidia when the VRAM is the same. Considering the prices of modern cards and the fear that future models will be even more expensive, a GPU with more VRAM could be seen as a more future-proof product and a way to avoid having to pay again in 1-2 years for a new GPU.
Well, maybe they will achieve just that with their 7700 XT? Let's not forget that, due to their stupid naming scheme, the 7600 might actually be a 6500 equivalent this generation. It's unclear from the rumor.
Or if unable to reach an aggressive price/performance, just discount the 6xxx series, keep selling the top RDNA3 and go as fast as possible to RDNA4 for a full lineup, instead of wasting money launching GPUs with poor price/performance.
Posted on Reply
#38
Dr. Dro
john_8GB of VRAM makes the cards cheaper to produce and market, but we've seen that a lower price is not enough to persuade people to prefer AMD cards over Nvidia when the VRAM is the same. Considering the prices of modern cards and the fear that future models will be even more expensive, a GPU with more VRAM could be seen as a more future-proof product and a way to avoid having to pay again in 1-2 years for a new GPU. Unfortunately AMD decided to cheap out here, while they did the complete opposite in CPUs, where they built a platform as tough as a tank to make it future-proof.
(rant warning)

I can't help but point out the sheer karmic rebalance at play here. AMD marketing hit on NVIDIA for being stingy with VRAM? They proceed to release an 8 GB GPU anyway. It's not even the first time; they did the same with the 4 GB 6500 XT, and even temporarily removed a hit piece on their website that claimed 4 GB GPUs were dead.

www.theverge.com/2022/1/19/22891092/amd-4g-vram-claim-gpu-launch-rx-6500-xt

AMD marketing hit on NVIDIA for 12VHPWR overheating failures? Their own MBA 7900 XTXs were cooking due to manufacturing defects in the vapor chamber.

www.tomshardware.com/news/amd-faulty-thermal-solution-7900-xtx-throttling

As if that wasn't enough, all the memes about RTX 4090 power connectors catching fire and the i9-13900KS's high power consumption became a reason for grief once socket AM5 Ryzens began catching fire inside the socket and going up in smoke, taking the motherboard with them, all over a failure in the AGESA/SMU power management code. For the substrate to bubble up and explode like these are doing? You need an insane amount of current to damage a processor that way.

Arguing this on Discord, I was called multiple names for pointing that out, until buildzoid came out with his video and brought up that one of the root causes of the problem could be an AGESA bug, and then no one talked me down on it again.


www.tomshardware.com/news/amd-ryzen-7000-burning-out-root-cause-identified-expo-and-soc-voltages-to-blame

Guess what AMD and motherboard vendors have done? They withdrew all current BIOS versions and began issuing high-priority AGESA updates to everyone. I'm gonna lay my cards down here: AMD fans really need to stop being so proud and rabid in defense of this company. They're not your friends, they're not your old pals, they're not an indie, they're not an underdog. Nothing they do is out of the kindness of their hearts. Begin demanding better products of them. They can do better.

If this series of blunders continues to occur routinely and every product they put out is massively overhyped, underdelivering and completely underwhelming, NVIDIA will charge you a thousand bucks for the RTX 5070 instead of just 600 for the 4070 as they do now, and Intel is gonna bring back $1,800 Extreme Edition CPUs the second they regain an absolute performance lead.
Posted on Reply
#39
RedelZaVedno
It's a total shitshow coming from both teams. The xx60 class used to give us the previous gen's xx80 performance for the same price (or a bit higher, adjusted for inflation). What we're getting now, from both teams, is an xx60 class that's really the previous xx70 class for 20% more bucks. The 4070 Ti should have been the 4070 and cost 549 bucks, the 4080 should have been the 4070 Ti for $749 ($800 max), and the released 4070 should have been the 4060 Ti, for 400 to 440 bucks. Same goes for AMD: the 7900 XTX should have been a "7800 XT" for 800 bucks, and the 7900 XT should have been a "7800" for 600 to 700 bucks.

And 'RX 7600 Early Sample Offers RX 6750 XT Performance' is a joke beyond belief, as it wouldn't even compete with ngreedia's previous-gen 3070-class GPU. Well, maybe if it's priced at 250 bucks it would be an acceptable buy, but we all know that's not gonna happen. It's likely gonna get a 349 MSRP and rot on shelves like most new Radeon products lately :banghead:
Posted on Reply
#40
Mahboi
AusWolfYou can't afford to be wrong when you consider yourself newsworthy. Anything coming from him/them is hot air, imo.
Absolutely ridiculous.
Anyone making this kind of projection about yet-unreleased tech products is sure to be at least as wrong as the people planning to make the products. The semiconductor industry keeps dreaming up new ideas and throwing them away. In every generation of GPUs, the cuts to full dies are basically arbitrary, made wherever the engineers found things to work well enough. With every new technology, the possibility of it failing or being abandoned is present.

Demanding that MLID or any leaker be 100% accurate is absolutely idiotic. People getting on their high horse and demanding 100% accuracy or bust have not understood the first thing about innovation or high tech. And if you're gonna say that the only worthwhile news is news that's been thoroughly verified:
  1. Zen 4 x3D is a lie since we just got voltage issue on some
  2. 4070s are both a lie since their VRAM has yet to be proven insufficient broadly
  3. Intel was unworthy of being newsworthy for over 10 years because 10 years after, we found Heartbleed in it
  4. TPU is not newsworthy because it's analysing cards that'll get driver updates and will be proven completely wrong (see RDNA3 power draw day 1 and 4 months later)
And so on. The entire point of the bleeding edge is that we're always living in novelty and uncertainty. If you don't want to listen to leakers, that's your right, but getting all uppity and demanding 100% accuracy is basically denying the very concept of high tech. You're asking a field that's all about innovation to always be sure of what's going to happen, as if leakers were either wizards with crystal balls who saw the future or pants-on-fire liars.

I swear that half the stupidity of this demand comes from the leakers themselves never admitting to getting anything wrong. There's a freakish aura about MLID and the like: if TPU reviews a product and it gets proven inaccurate later down the line, it's ok, but if it's MLID, it should always foresee everything perfectly. It's so dumb.
I listen to MLID; the guy is an AMD shill with a severe tendency to toot his own horn and listen to himself talk, but the actual quality of conversation (his podcasts, except the ones with his bro where it's MLID agreeing with MLID), general industry analysis and overall intelligence are all high. He's one of the most interesting people to listen to if you can get past the shilling and self-aggrandizing. And nobody should ask him, or any news website, to get everything right. It's all about evaluating how right and how wrong they tend to be and weighing the validity of their statements, not demanding 100% truth on educated guesses and climbing a 40 m ladder up to your saddle when they get a guess wrong.
Posted on Reply
#41
AusWolf
MahboiDemanding that MLID or any leaker be 100% accurate is absolutely idiotic. People getting on their high horse and demanding 100% accuracy or bust have not understood the first thing about innovation or high tech. And if you're gonna say that the only worthwhile news is news that's been thoroughly verified:
  1. Zen 4 x3D is a lie since we just got voltage issue on some
  2. 4070s are both a lie since their VRAM has yet to be proven insufficient broadly
  3. Intel was unworthy of being newsworthy for over 10 years because 10 years after, we found Heartbleed in it
  4. TPU is not newsworthy because it's analysing cards that'll get driver updates and will be proven completely wrong (see RDNA3 power draw day 1 and 4 months later)
1. We've got voltage issues on one. There is literally one confirmed case of CPU damage and we don't know how or why it happened. That doesn't make the whole series a lie.
2. The 4070 isn't a lie. It's got 12 GB VRAM. It's a fact. Whether you like it or not is up to you to decide.
3. We found Heartbleed years after release of the first affected CPU generation. It's a backdoor that somebody found, not a design feature of a product.
4. TPU analyses cards with release drivers. Release drivers aren't a lie. They exist. No one can test the future, obviously.

I am not demanding 100% accuracy from MLID. What I want is for TPU not to use YouTube channels that are famous for making random shit up as a source for newsworthy articles. Or if that's fine, then I think (based on nothing) that the 7700 XT will have 3840 shader cores and 16 GB VRAM over a 256-bit bus and will fall between the 6800 XT and 6900 XT. That's it, I've said it. Can somebody write an article now? :rolleyes:

MLID is not a leak channel. It's a fake channel.
Posted on Reply
#42
Mahboi
Dr. DroI'm gonna lay down the cards here, AMD fans really need to stop being so proud and rabid in defense of this company.
You're totally missing the point.

AMD is at this stage our only real chance of getting any kind of competition. Intel was the fattest monopoly in tech for far too long. Their prices and laziness were the stuff of legends.
Nvidia is currently an untouchable king in parallel processing (aka GPUs). Nothing else comes close. AMD barely manages to offer somewhat capable products for gaming.

We don't like AMD because we like red or for Lisa Su's pretty face. We like them because AMD managed to go from near-bankruptcy to actually taking Intel down from their status as Lazy Gods.
Nvidia is being extremely greedy right now, and unsurprisingly so. We support AMD because we think that they are our only shot at actually opening up this terrible market into something more palatable.
Dr. DroThey're not your friends, they're not your old pals, they're not an indie, they're not an underdog. Nothing they do is out of the kindness of their hearts. Begin demanding better products of them. They can do better.
And as for this stupid line:
  1. AMD IS an underdog by any definition of the term. They were, and still are, woefully understaffed and underpowered versus Nvidia and Intel. Their sales in CPUs and server chips are nothing short of extraordinary when you consider that AMD, all divisions combined (CPU and GPU), had a quarter of Intel's personnel not two years ago. I've heard more than once that RTG's entire staff is about the size of Nvidia's driver team alone.
  2. "AMD is not indie" is a meaningless phrase; no industrial-scale manufacturer or designer can be indie. Nobody's expecting Raspberry Pi power or prices; we just expect competition, and that can only come from an industrial provider. You don't need to be a mom-and-pop corner shop to get my sympathy, you need to make me see that you're bringing something good to people. AMD is bringing something good across the board in CPUs, and I want to see that in GPUs too.
  3. "Not your friends/pals": yes, I'm aware of how capitalism works, thanks. You will see me supporting Intel and Nvidia if AMD ever succeeds at bringing them down and starts becoming the monopoly. It's all about playing the giants against one another so that we humble buyers can reap the best situation.
  4. "Begin demanding better products" is another entirely silly idea, since the only ways to "demand" that are to either wait for them to produce them, buy the competition, or not buy at all. High-grade consumer chips are a duopoly in both GPUs and CPUs. That translates to "buy Nvidia, buy Intel, or buy nothing and pout". NONE of these options help consumers at all. What you're asking for is basically the attitude of a teenager who wants his mum to buy him a new video game. If you want to actually help "better products" happen, you have two choices: go work for AMD and help them actually make these better products, or build your own company that'll compete better than AMD or Nvidia. Or find a way to force Jensen to lower his prices and margins.
"They can do better" is the most empty and pointless phrase from a consumer: it encourages no competition, encourages no growth, and does nothing at all but sit at home and wait for the big players to either make him cough up the money or not. Nothing you're suggesting is helping at all.
AusWolfI am not demanding 100% accuracy from MLID. What I want is for TPU not to use YouTube channels that are famous for making random shit up as a source for newsworthy articles. Or if that's fine, then I think (based on nothing) that the 7700 XT will have 3840 shader cores and 16 GB VRAM over a 256-bit bus and will fall between the 6800 XT and 6900 XT. That's it, I've said it. Can somebody write an article now? :rolleyes:
"Random shit up" isn't a thing MLID does.
And yes, it's fine for you to invent that and for it to be a news source on TPU. As long as you get it right. If you get it wrong, then it's not fine.

And as I've explained at length, the one and only metric that matters is how often you get it right vs how often you don't.
MLID is solid enough that he can claim to mostly get things right. That's why he's in articles.
Posted on Reply
#43
AusWolf
Mahboi"Random shit up" isn't a thing MLID does.
And yes, it's fine for you to invent that and for it to be a news source on TPU. As long as you get it right. If you get it wrong, then it's not fine.

And as I've explained at length, the one and only metric that matters is how often you get it right vs how often you don't.
MLID is solid enough that he can claim to mostly get things right. That's why he's in articles.
I disagree on that, but have it your way. I consider anything coming from MLID non-news, as they've been proven completely wrong on multiple occasions. Anyone can make a YouTube channel to spread bollocks.

I agree with the above (not quoted) part of your post. AMD will never become better than Nvidia if everyone takes the "wait and see" approach. Of course, we're not charity organisations that buy bad products just to support an underdog company, but AMD's products at this point are not so much worse that one can't have a decent gaming experience on an AMD system. Not to mention the price disparity. I'd much rather have an "inferior" AMD card for 50-100 bucks less than feed the green greed and make sure the next generation of Nvidia cards ends up even more expensive than this one.
Posted on Reply
#44
Mahboi
tabascosauzThe regular fluctuations in idle and power are normal, it's just the result of more precisely and dynamically controlled core and mem clocks. Same way Ryzen idle is touchy. Ever experienced the 90W multi monitor idle on Navi31? I guess not......
Yes, since I had it and fixed it in about 2 minutes.
tabascosauzI would've been pretty happy with 47W idle on two screens, since that's basically normal behaviour using RDNA2 as a reference, and proof that VRAM isn't forced to 100% all the time. It's not Ada's 10W idle, but it's good enough considering that by design Navi31 dumps a whole bunch of wasted power on mem-related rails. Next time you get the chance, spend some time watching HWInfo. Core and (to an extent) VRAM itself idles effectively; it's everything else associated with the new interconnect that doesn't.
Yes, I read up on that. I wonder how much they'll be able to calm it down over the next 18 months, until RDNA4 rolls around.
tabascosauzIf you've ever played a lighter game, you know not to wish for *more* downclocking in-game. The lack of a stable VRAM clock under lighter loads is exactly what torpedoes the Radeon experience on an otherwise strong card, and why no GeForce GPU suffers the same fate - pick a memory Pstate and stick to it, even if it's not max VRAM clock, and even if core clocks stay low.
No no, I don't care for more downclocking in game. 150W on 4K Ultra OW2 is way better than I expected. Deep Rock Galactic pushed it somehow to the max 315W (before undervolting), and Radeon Chill brought it back down to ~220W.

I'm just aiming for 2 things:
  • proper idling at 30W without taking the piss (not 5 entire minutes from watching a video to properly downpower)
  • video playback not being at a ridiculous 70W+ but rather around 40 to 50W
In games, I have no complaints. I'm quite happy.
tabascosauzThere might be some future optimizations for excessive draw on the memory rails, but none of the other problems are unique to RDNA3. They've been around for years.
Aye, that's basically the gist of my demand. I want it optimised.
tabascosauzRegardless, if Navi33 is monolithic as it looks to be, it won't suffer any of the same problems as Navi31 (except Radeon's immortal hatred for multi-monitor power draw, of course).
Agree, we shall see soon.
Posted on Reply
#45
Dr. Dro
MahboiNothing you're suggesting is helping at all.
Know what helps even less? Excusing them at every turn. But suit yourself, I'm not looking to start a flame war here. My objective of pointing out that they do the same things that their supposedly impure competition does has been achieved.
Posted on Reply
#46
Mahboi
AusWolfI disagree on that, but have it your way. I consider anything coming from MLID non-news, as they've been proven completely wrong on multiple occasions. Anyone can make a YouTube channel to spread bollocks.
Let's agree to disagree.
Perhaps I'll make a BollocksTech channel then.
If I imitate a convincing British accent, I'm sure to get all the Americans to subscribe to my every word.
Posted on Reply
#47
AusWolf
MahboiLet's agree to disagree.
Perhaps I'll make a BollocksTech channel then.
If I imitate a convincing British accent, I'm sure to get all the Americans to subscribe to my every word.
Exactly! :laugh:
Posted on Reply
#48
Mahboi
Dr. DroKnow what helps even less? Excusing them at every turn.
I don't recall excusing RDNA 3. Perhaps at first only when it seemed like the insufficient performance would be quickly fixed. If anything, I've come to the same conclusion as you, that currently AMD deserves no more love than Nvidia. I hesitated over a 4080 for weeks. Ended up going with the XT because I literally couldn't bear the Nvidia cult mentality and the price was extremely unacceptable for the hardware.

If anything, I bought RDNA3 because my choices were between extremely overpriced and flawed. I bought it thinking that I'm giving AMD a good chance here: they can fix/optimise the card until it is worth its price, or not. If not, I will make sure to go pay the Jensen Tax next time.

Also, the entire open-source aspect did tip my decision AMD's way. There is a strong streak of open-sourcing valuable technologies that Nvidia certainly can't claim to have on their side. Inferior or not, AMD tech does feel like it's meant to be passed along to the public and like it's generally trying to help people. DLSS may be amazeballs, but if it's both vendor-locked and generation-locked, it can go in the gutter. FSR actually helps broadly, if in an insufficient way for now.
(and tbh, Adrenalin is the kind of software quality that makes me want to never go back to Nvidia's GeCrap Experience)
Dr. DroBut suit yourself, I'm not looking to start a flame war here. My objective of pointing out that they do the same things that their supposedly impure competition does has been achieved.
It's never about purity; it's about results.
AMD's results have been to bring Intel back down to Earth. AMD's results may yet bring Nvidia back down to Earth too.
I'd rather support that than anything else.

Although, if Intel actually bothered to carry on with their Arc lineup, I might be interested as well. The hardware wasn't there in gen 1 (large chips for poor performance), but the software had some great highlights (XeSS and the video encoders).
AusWolfExactly! :laugh:
I've reconsidered.
I'll make a "HatedTV" channel.
Instead of making wild claims in a thick Scottish accent about how amazing AMD's next thing will be, I'll make wild claims in a thick French accent about how Intel's stuff is going to ruin us all with how good it is.
And I'll shill some shitcoins for extra Hatred.
Posted on Reply
#49
AusWolf
MahboiI've reconsidered.
I'll make a "HatedTV" channel.
Instead of making wild claims in a thick Scottish accent about how amazing AMD's next thing will be, I'll make wild claims in a thick French accent about how Intel's stuff is going to ruin us all with how good it is.
And I'll shill some shitcoins for extra Hatred.
Some food for thought: :toast:

The guy tested the 4090 before it existed: :laugh:
Posted on Reply
#50
AnotherReader
napataAlso once you cap games with a framelimit RDNA3 loses a ton of efficiency. That might not matter to you if you never run a cap but a lot of people run games with frame caps.
That was improved by a subsequent driver update. Computerbase, the source of your graphs, tested that release, and found that (translated text below):
The Adrenalin 22.12.2 improves the energy efficiency of the Radeon RX 7900 XTX by 57 percent and that of the Radeon RX 7900 XT by 34 percent in the WQHD test scenario. The GeForce RTX 4080 is now only 14 percent instead of 78 percent more efficient than the Radeon RX 7900 XTX, and the new Radeons now show better rather than worse efficiency than the previous generation.
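For anyone checking the arithmetic, the two quoted figures are internally consistent; here is a minimal sanity check using only the percentages from the quote (and assuming the RTX 4080's own numbers stayed fixed between the two tests):

```python
# If the RTX 4080's efficiency lead over the RX 7900 XTX shrank from +78% to +14%
# while the 4080 itself was unchanged, the XTX's efficiency must have improved by
# roughly 1.78 / 1.14 - 1, which lines up with the quoted ~57% gain.
old_lead, new_lead = 1.78, 1.14
implied_xtx_gain = old_lead / new_lead - 1
print(f"Implied RX 7900 XTX efficiency gain: {implied_xtx_gain:.0%}")  # ~56%
```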
Posted on Reply