
Here Be AMD RX Vega Models' Codenames: Vega XTX, Vega XT, Vega XL

You're still going with the "AMD fanboy" narrative... Shit, I'll cancel all the comedy channels on IPTV and just visit TPU instead... Best shit ever XD I mean, calling someone who hasn't had an AMD CPU since the Athlon XP era an "AMD lover" and "AMD fanboy" is a thick statement. Fanboy... bwahahahaha XD Sorry, but this is so hilarious. It just is. XD
Ok, not a fanboy then, relax, no need to overreact. How about a max-OC'd 1080 Ti vs. stock Vega comparison? Apples and oranges. What was your motivation to post that nonsense?
 
I had an X800 XL back in the day. It was the price/performance winner. Considering AMD doesn't have anything in the $300-400 range, that's a perfect home for Vega XL. The question is: does it have 8 GiB or 4 GiB of VRAM?

AMD also has to have a GCN5 GPU with a GDDR5X or GDDR6 memory controller in the works. I wonder when that will debut.
 
Ok, not a fanboy then, relax, no need to overreact. How about a max-OC'd 1080 Ti vs. stock Vega comparison? Apples and oranges. What was your motivation to post that nonsense?

Someone said Vega's consumption is insane. I posted proof that Pascal's can be, too. That was all there was to it. Whether the factory pushes it to the max or leaves some headroom for AIBs, that's up to the card maker.

The reason they usually aren't pushed to the limit is consumption and longevity. I saw it while overclocking my HD 7950. A bump from 900 MHz to 1 GHz required nearly no voltage increase. 1.1 GHz required 0.05 V more. 1.2 GHz required 0.15 V+ more. And that's what they are balancing: the sweet spot. Once you're requiring high voltages for small gains, they simply call it a day. But then there are people who like to push things to the max no matter what...
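
To put rough numbers on that sweet-spot argument: dynamic power scales roughly with frequency times voltage squared (P ∝ f·V²). Here's a minimal sketch using the HD 7950 voltage deltas above; the 1.10 V stock voltage at 900 MHz is my assumption for illustration, not a figure from the post.

```python
# Rough dynamic-power scaling: P ~ f * V^2 (capacitance folded into a constant).
# The voltage deltas are the ones quoted above; the 1.10 V baseline at 900 MHz
# is an assumed stock point, used only for illustration.
BASE_MHZ, BASE_V = 900, 1.10

steps = [
    (1000, 0.00),  # ~1 GHz: nearly no extra voltage
    (1100, 0.05),  # 1.1 GHz: +0.05 V
    (1200, 0.15),  # 1.2 GHz: +0.15 V (or more)
]

for mhz, extra_v in steps:
    v = BASE_V + extra_v
    clock_gain = mhz / BASE_MHZ - 1
    power_gain = (mhz * v**2) / (BASE_MHZ * BASE_V**2) - 1
    print(f"{mhz} MHz @ {v:.2f} V: +{clock_gain:.0%} clock, ~+{power_gain:.0%} power")
```

The last step buys +33% clock for roughly +72% power over stock, which is exactly the point where a vendor calls it a day.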

EDIT:
Why do people keep calling it GCN5 even though AMD themselves call the units NCUs?
 
Graphics Core Next (GCN) is the architecture of the entire GPU including Asynchronous Compute Engine (ACE), Unified Video Decoder (UVD), Video Coding Engine (VCE), and so on. Next Compute Unit (NCU) is basically the design of each core (contains all the SIMDs).
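
To make that split concrete, a hedged sketch of the hierarchy in Python; the block names come from the post above, while the counts (four SIMD-16 units per compute unit, 64 units total) are typical GCN figures used purely for illustration, not a confirmed Vega configuration.

```python
from dataclasses import dataclass, field

@dataclass
class NCU:
    """Next Compute Unit: the per-core design, holding the SIMDs."""
    simds: int = 4            # GCN compute units carry four SIMD-16 units
    lanes_per_simd: int = 16

    @property
    def stream_processors(self) -> int:
        return self.simds * self.lanes_per_simd  # 64 per unit

@dataclass
class GCNGpu:
    """Graphics Core Next: the whole GPU, fixed-function blocks included."""
    aces: int                 # Asynchronous Compute Engines
    has_uvd: bool             # Unified Video Decoder
    has_vce: bool             # Video Coding Engine
    ncus: list[NCU] = field(default_factory=list)

# Illustrative counts only -- not a confirmed Vega configuration.
gpu = GCNGpu(aces=4, has_uvd=True, has_vce=True,
             ncus=[NCU() for _ in range(64)])
print(sum(cu.stream_processors for cu in gpu.ncus))  # 4096 stream processors
```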
 
Someone said Vega's consumption is insane. I posted proof that Pascal's can be, too. That was all there was to it. Whether the factory pushes it to the max or leaves some headroom for AIBs, that's up to the card maker.
It is kind of insane, at stock, no?

Wait, let me dilute that a bit........ :)

It is kind of high, at stock, don't you think? Though one may have existed, I don't recall a single GPU STARTING at 300W/375W board power. Scaling be damned, that is one hell of a difference: against a 1080 Ti's 250W board power, that's 50W (20%) to 125W (50%) more, depending on which figure you take.
 
With a +50% power limit and a slight overclock to 1150 MHz core, I have seen my Fury X edging close to 400 W playing FO4. And this was all done purely through software-level modification on the stock AIO. So yeah, Vega is only gonna be worse than that, IMO.
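
A quick sanity check on that figure, as a sketch: the 275 W typical board power for Fury X is the published spec (my addition, not stated in the post); the +50% offset is from the post above.

```python
# Power-limit slider math: ceiling = stock board power * (1 + offset).
# 275 W is Fury X's published typical board power (assumed here, not stated
# in the post); the +50% offset is from the post above.
stock_board_power_w = 275
power_limit_offset = 0.50

ceiling_w = stock_board_power_w * (1 + power_limit_offset)
print(f"Ceiling with +50% power limit: {ceiling_w:.0f} W")  # ~412 W
```

A ~412 W ceiling makes the observed ~400 W in FO4 entirely plausible.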

Also somehow I feel like it is ReJzor versus everyone who doesn't agree with this hype ATM.

Vega is still GCN. The very fact that AMD was able to run it on a slightly modified Fiji driver and demo it last December says it all.

It is late and it seems to be delayed to this Fall as well. Move on.
 
Someone said Vega's consumption is insane. I posted proof that Pascal's can be, too. That was all there was to it. Whether the factory pushes it to the max or leaves some headroom for AIBs, that's up to the card maker.
They will release the first single-GPU card ever with a 375 W TDP, and the first thing you do is link to a tomshw page where an extreme-edition 1080 Ti consumes around 250 W in gaming (stock 1080 Ti TDP is 250 W, remember) and much more when heavily overclocked. That is proof of your...
 
Dude, just stop it already.
 
375W ?? Jeeeezus!!!

Take into account that this is the worst case the card can consume in total.

But that does not mean you will pass 375 W of power when playing games, for example. There are various tricks and software features that cap power usage.

The 9570 had a TDP of 220 W as well, but you could shave off around 60 watts at the wall simply by undervolting it, since AMD doesn't fine-tune their cards or CPUs to the limit.

They'd rather go for a safe approach with slightly higher power consumption. I think the average will be 250 up to 275 W. Not 375.
 
Yes, everything is OK, probably just a typo; they meant 275 W, like the 290X. Peace.
 
Don't compare a maxed-out card with a stock one... it wouldn't have started, rezj. ;)



@Jism - As far as the worst case goes: sure, it is, but it CAN get there, just as it can on the NVIDIA side. That is why it's listed there and not lower. Undervolting is also an option... as it is for NVIDIA cards too. We need to take this at face value until we see more testing.

My guess is the average on a 375 W card is well over 300 W... they don't overestimate by 50%...

I just know comparing a maxed-out 1080 Ti to stock board power wasn't remotely a good idea. :p
 
AMD GCN / NCU will always have a disadvantage versus Nvidia due to its brute-force approach. This is why AMD cards are favoured by miners in general. This brute force always comes with slightly higher power consumption. But that doesn't mean AMD cards are bad. In fact, I tossed the Nvidia brand away back in the FX 5xxx series and never switched back. It was always AMD, and their cards performed very well (X800 XT era).
 
AMD loses to nVidia in power consumption, is that even a surprise?

Still a good buy if the price/perf is as expected from an AMD though (if the miners don't jack up the price).
 
Why is power consumption an issue for something that is only geared towards enthusiasts???
That's like saying I won't buy a Ferrari because it only gets 6 mpg while the Lamborghini gets 7 mpg...
In the end the people who will buy it don't give a fuck.
 
Why is power consumption an issue for something that is only geared towards enthusiasts???
That's like saying I won't buy a Ferrari because it only gets 6 mpg while the Lamborghini gets 7 mpg...
In the end the people who will buy it don't give a fuck.
You are right, at least in my eyes. I really don't care much about power consumption; it's more of a bonus to the purchase. Perf/$ is what I'm mostly after, of course in the high-end segment, which I assume Vega will end up in.
 
Here be?? Is TPU run by pirates?? :p

Overclocked to the max, while these are 300/375 W stock... how was that point lost?

Well... if you think critically about Fury X and its real-world OC potential, and how stubborn HBM was about anything you did with the clocks... And if you also look at how Ryzen behaves nowadays, optimized for a rather tight max clock range...

I find it reasonable to believe that AMD is taking a different route with its newer hardware releases: they push it to the edge themselves. In the case of Fury X > Vega it would make complete sense to do so. Not only do they need the performance, but the competitor is essentially doing something similar and calling it GPU Boost 3.0, which only gives you fewer guarantees on the box (300 MHz gaps between stock and actual clocks) but has the same net result. And yes, it uses less power = less heat, and therefore still offers OC headroom; but then again, look at the OC left on the 1080 Ti Lightning: 3% really ain't much, is it...

Now that I think of it, RX 580 is another great example of AMD eating up the OC headroom and marketing it themselves. It's a real trend. Look at Intel's Kaby Lake and X299 releases: increased TDP for clock bumps out of the box. It's a simple method to hide stagnation.
 
Of course you have to care about power consumption ... it's not like the only consequence is the power bill ... I mean, the dissipated heat has to go somewhere
 
Of course you have to care about power consumption ... it's not like the only consequence is the power bill ... I mean, the dissipated heat has to go somewhere
Of course it has to go somewhere. But considering the 780 Ti I currently use, and the 670 and 460 before it, I don't think that's much of a problem. I've also been thinking about a water-cooling solution. It all depends, but believe me, I don't think that's a problem with my current configuration and how it spreads the heat. If I find out there's a problem, a slight modification or improvement will solve it.
 
I think he means you are putting the heat into your room no matter what, unless you put the PC in another room or run your cooling to outside the room/home/office. :)

i.e. in the summer, lots of heat might mean the room becomes too hot to sit in.
 
I think he means you are putting the heat into your room no matter what, unless you put the PC in another room or run your cooling to outside the room/home/office. :)

i.e. in the summer, lots of heat might mean the room becomes too hot to sit in.
Hmm, what about aircon? Does that count? Sitting in a boiling-hot room is not my favorite.
 
Well... if you think critically about Fury X and its real-world OC potential, and how stubborn HBM was about anything you did with the clocks... And if you also look at how Ryzen behaves nowadays, optimized for a rather tight max clock range...

I find it reasonable to believe that AMD is taking a different route with its newer hardware releases: they push it to the edge themselves. In the case of Fury X > Vega it would make complete sense to do so. Not only do they need the performance, but the competitor is essentially doing something similar and calling it GPU Boost 3.0, which only gives you fewer guarantees on the box (300 MHz gaps between stock and actual clocks) but has the same net result. And yes, it uses less power = less heat, and therefore still offers OC headroom; but then again, look at the OC left on the 1080 Ti Lightning: 3% really ain't much, is it...

Now that I think of it, RX 580 is another great example of AMD eating up the OC headroom and marketing it themselves. It's a real trend. Look at Intel's Kaby Lake and X299 releases: increased TDP for clock bumps out of the box. It's a simple method to hide stagnation.
Yep... already considered that, and it was accounted for if you look back... I have 10 more posts after that one, V. ;)
(Posts 32 and 44)

Again... if it overclocks even slightly, 325/400+ W is in the cards. If it has more headroom (like I suspect), it's going to be even higher.
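
Back-of-envelope on those numbers, as a minimal sketch: the overclock percentages are my assumptions, and real power scales worse than linearly with clock once voltage has to rise, so treat these as floors.

```python
# If power scaled only linearly with clock (optimistic), a modest overclock
# on 300 W / 375 W stock board power lands here. The OC percentages are
# assumed figures, not from the post.
for stock_w in (300, 375):
    for oc in (0.05, 0.08):
        print(f"{stock_w} W stock, +{oc:.0%} clock: ~{stock_w * (1 + oc):.0f} W")
```

An ~8% bump already lands near 325 W and 405 W, right where the 325/400+ estimate sits.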
 
Of course it has to go somewhere. But considering the 780 Ti I currently use, and the 670 and 460 before it, I don't think that's much of a problem. I've also been thinking about a water-cooling solution. It all depends, but believe me, I don't think that's a problem with my current configuration and how it spreads the heat. If I find out there's a problem, a slight modification or improvement will solve it.

I think he means you are putting the heat into your room no matter what, unless you put the PC in another room or run your cooling to outside the room/home/office. :)

i.e. in the summer, lots of heat might mean the room becomes too hot to sit in.

Yeah, two PCs in the same room in the summer ... the complete causal chain ends with buying stronger air conditioning
 
Yeah, two PCs in the same room in the summer ... the complete causal chain ends with buying stronger air conditioning
I think mine does just great in that department. I've got only one computer, but also other stuff that generates heat. So far I've never had problems, and I don't think a Vega card would have that much of an impact on my aircon environment.
If it does have an impact, I'll just crank the aircon up and be done. Besides, I think I'll go with a water-cooling solution anyway.
 