
AMD Vega Discussion Thread

Thread starter: Deleted member 50521
I don't understand why some people are desperate for VEGA to have 16GB of HBM2... it won't have the grunt to use all that. Even the 1080 Ti struggles when you absolutely MAX out some games such as Ghost Recon Wildlands, and it never even gets near 8GB of usage by the time performance drops off. This is at 1440p ultrawide or 4K, by the way. It may be that some games can use that extra VRAM, but it ultimately comes back to the grunt of the card, and that's where it's going to fall short, just as any card would.
 
I don't understand why some people are desperate for VEGA to have 16GB of HBM2... it won't have the grunt to use all that. ...
The number of HBM2 stacks affects not only the capacity but also the speed of the memory: each stack adds another 1024-bit slice to the bus. That's why.
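To put rough numbers on that (a back-of-the-envelope sketch; the 1.89 Gbps pin rate is just an illustrative figure in the right ballpark for early HBM2):

```python
# Rough HBM2 bandwidth estimate: each stack contributes a 1024-bit bus,
# so halving the stack count halves peak bandwidth at the same pin rate.
def hbm2_bandwidth_gbps(stacks, gbps_per_pin=1.89):
    bus_width_bits = stacks * 1024
    return bus_width_bits * gbps_per_pin / 8  # GB/s

print(hbm2_bandwidth_gbps(2))  # ~484 GB/s with two stacks
print(hbm2_bandwidth_gbps(4))  # ~968 GB/s with four stacks
```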
 
[Image: AMD Radeon Vega Frontier Edition, liquid-cooled]


OMFG this is SEXY as hell

How it looks is irrelevant. You seem to have your priorities mixed up, dude: how it looks shouldn't be high on the priorities list.

It could be adorned with gold and diamonds, but if its performance is crap, then Vega is crap: expensive crap ... but still crap ...
 
How it looks is irrelevant. You seem to have your priorities mixed up, dude ...
Yeah, but if it has the same performance and price as the competitor, I'd rather have this gorgeous beast than some triple-fan plastic-shroud crap.
 
I don't understand why some people are desperate for VEGA to have 16GB of HBM2... it won't have the grunt to use all that. ...

To make matters even more complicated, people shouldn't assume VRAM used is the same thing as VRAM needed. Some games fill up more VRAM than they actually need just because it's there. This was illustrated here, in the Maxwell Titan X review. Granted, it's from two years ago, and games have only become more VRAM-demanding since.

[Chart: memory usage, from the Titan X review]


From the review:

"Modern games use various memory allocation strategies, and these usually involve loading as much into VRAM as fits even if the texture might never or only rarely be used. Call of Duty: AW seems even worse as it keeps stuffing textures into the memory while you play and not as a level loads, without ever removing anything in the hope that it might need whatever it puts there at some point in the future, which it does not as the FPS on 3 GB cards would otherwise be seriously compromised."

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/33.html
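Purely to illustrate the allocation strategy the review describes (a toy sketch, not any engine's actual code), here's how an opportunistic cache makes "VRAM used" track card capacity rather than the working set:

```python
from collections import OrderedDict

# Toy opportunistic texture cache: evict (LRU order) only when VRAM is
# full, so reported "VRAM used" grows to whatever the card happens to have.
class TextureCache:
    def __init__(self, vram_mb):
        self.capacity_mb = vram_mb
        self.used_mb = 0
        self.textures = OrderedDict()  # texture id -> size in MB

    def touch(self, tex_id, size_mb):
        if tex_id in self.textures:
            self.textures.move_to_end(tex_id)  # mark as recently used
            return
        while self.used_mb + size_mb > self.capacity_mb:  # evict only under pressure
            _, evicted_mb = self.textures.popitem(last=False)
            self.used_mb -= evicted_mb
        self.textures[tex_id] = size_mb
        self.used_mb += size_mb

# Stream the exact same textures through a 3 GB card and a 12 GB card:
for vram_mb in (3072, 12288):
    cache = TextureCache(vram_mb)
    for tex_id in range(500):  # 500 textures x 40 MB streamed over time
        cache.touch(tex_id, 40)
    print(f"{vram_mb} MB card -> {cache.used_mb} MB allocated")
# 3072 MB card -> 3040 MB allocated; 12288 MB card -> 12280 MB allocated
```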
 
Even if GDDR6 were out, it would only solve the bandwidth issues, not the latency ones. HBM, on the other hand, solves both.

It's similar to normal DDR for system RAM. We get higher frequencies with each release, but timings just keep on climbing. Does anyone remember RAM with 2-3-3-6 timings? These days, 10-10-10-20-ish is considered extreme, top-of-the-line RAM that costs both kidneys if you want it at high enough speeds. VRAM is not really much different. HBM's physical construction, on the other hand, allows for tighter timings.
Yup, mine runs 10-10-12-20-20, and they are tighter than the Trident set I based my timings off of.
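For what it's worth, a quick back-of-the-envelope (illustrative speeds, not anyone's exact kit) shows why the climbing cycle counts look worse than they are: the clock climbs too, so absolute CAS latency in nanoseconds has barely moved, which is also why higher-clocked memory alone doesn't fix latency:

```python
# CAS latency in nanoseconds = CAS cycles / real memory clock.
# DDR transfers twice per clock, so the real clock is half the rated MT/s.
def cas_latency_ns(cas_cycles, rated_mts):
    clock_mhz = rated_mts / 2
    return cas_cycles / clock_mhz * 1000

print(cas_latency_ns(2, 400))    # old DDR-400 at 2-3-3-6:    10.0 ns
print(cas_latency_ns(10, 2000))  # DDR3-2000 at 10-10-10-20:  10.0 ns
```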
 
Some games fill up more VRAM than is actually needed just because it's there.
Yeah, it's the modern approach of virtual texturing combined with texture streaming that produces that behaviour:
[Diagram: virtual texturing]

You essentially have an average of 50 to 80 GB of textures on disk per game today, and all of them can be used in a single scene (open-world games) ... but "only" 8 GB is streamed in and cached at the appropriate level of detail (mip level), depending on screen resolution, distance to the camera, what part of the world the player is in, and which direction the player is looking.
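As a rough sketch of the mip-selection side of that (hypothetical code, not from any particular engine): the streamer only needs the mip level where one texel maps to roughly one screen pixel, which is how a 50+ GB texture set squeezes into an 8 GB budget:

```python
import math

# Pick the mip level to stream in, given how many texels of the full-res
# texture land on a single screen pixel; each mip level halves resolution.
def mip_level_to_stream(texels_per_pixel):
    return max(0, math.floor(math.log2(max(texels_per_pixel, 1.0))))

print(mip_level_to_stream(1.0))  # surface fills the screen -> mip 0 (full res)
print(mip_level_to_stream(8.0))  # distant surface          -> mip 3 (1/8 res)
```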
 
Let's wait till it's officially out before evaluating it, instead of speculating.
 
OMFG this is SEXY as hell
Meh, they all just remind me of those horrible-looking, square-shaped vehicles driving down the road.

That said, I don't really care about the looks; it's the performance that I feel will be lacking compared to the 1080 Ti, and even IF the performance were on par with it, their software would still be worse than Nvidia's.

Before you AMD fanboys jump in again, like you do EVERY time I point it out, I'm not just talking about the display driver, but the rest of the software that goes with it.
 
That said, I don't really care about the looks, it's the performance that I feel will be lacking compared to the 1080 Ti ... Before you AMD fanboys jump in again, like you do EVERY time I point it out ...
So no one can differ from your opinion without being a fanboy? Tut. Personally, I think Nvidia isn't all that either, and Intel is even worse on GPU software.
But you can have your opinion, I'm fine with that; yet due to the way you're expressing it, I'm now the fanboi? Tut, oh well.

You are in a forum, a place of contrasting opinions made for debate, yet you seek to negate it. Hmm, odd, I think someone's getting confused. Maybe you want geforce now . com or something :);)
 
That said, I don't really care about the looks, it's the performance that I feel will be lacking compared to the 1080 Ti ... Before you AMD fanboys jump in again, like you do EVERY time I point it out ...

That is exactly what a fanboy would say. I'm joking, of course :) .

But really, I can hardly see any of these fanboys you are talking about. Believing AMD/Nvidia does better for X reason does NOT mean you are a fanboy, and in this case there are legit reasons to believe Vega will impress. Seriously, if you truly believe your logic, then everybody else is a fanboy whenever we say something different.
 
That said, I don't really care about the looks, it's the performance that I feel will be lacking compared to the 1080 Ti ... Before you AMD fanboys jump in again, like you do EVERY time I point it out ...

Nah, I will not be picking up another AMD card, at least not for a very long time.
 
That said, I don't really care about the looks, it's the performance that I feel will be lacking compared to the 1080 Ti ... Before you AMD fanboys jump in again, like you do EVERY time I point it out ...

The Crimson control panel is a million years ahead of the archaic NV CP, which looks like software that escaped from 2005... And it's buggy/glitchy to match...
 
Nah, I will not be picking up another AMD card, at least not for a very long time.
Yet you've always got time to chat about them. Have you had therapy or something? I've not seen you say Vega in a while :);)
 
The Crimson control panel is a million years ahead of the archaic NV CP ...

Not to mention that for some reason it still freezes for like 5 seconds every single god damn time you make a change.
 
Frag, are you ever happy, dude?
 
Not to mention that for some reason it still freezes for like 5 seconds every single god damn time you make a change.

That, and the whole list ALWAYS jumps back to the top when you make a change. So you have to change things from bottom to top, otherwise you have to keep scrolling down. It's so friggin' infuriating, and it has been like this for months if not years (though I wouldn't know, since I've only had my GTX 980 for about a year). I can't understand how not a single person in NVIDIA's QA department said, "Guys, the menu is really glitchy, should we fix this?" Apparently not.
 
Why are you using the control panel to make changes?

I install the card, then the drivers, then game the shit out of it. I don't get the Control Panel involved, I don't use GF Experience - I just overclock it and play it. You know - the way it's meant to be played. Zero probs.
 
Why are you using the control panel to make changes? ...

V-Sync, and if you play older games (which I do), adjusting FSAA and AF, etc. And with Wattman, things have basically reversed. In the past, NVIDIA was the one with an overclocking tool built in. Now AMD has it, and with Wattman you don't even have to think about MSI Afterburner because the built-in tool is so powerful. And I've used the Crimson panel on my cousin's laptop. It's so pretty and really nice to use. AMD has really turned a new page on a lot of things, especially the software side.
 
God bless duopolies.
 
The Crimson control panel is a million years ahead of the archaic NV CP ...

^This, a 1000 times. I was excited to try GeForce Experience, but it got to the point where nothing at all worked and I had to revert to downloading drivers like before. Such disappoint.
 
Yeah, but if it has the same performance and price as the competitor, I'd rather have this gorgeous beast than some triple-fan plastic-shroud crap.

If it's the same performance and price as the competitor, then it's definitely not crap. In this case, how it looks adds to the overall value.

People give more weight to aesthetics than they should.

IMO, this is how one should measure GPUs:

1 - price
2 - performance
3 - power consumption
4 - noise
5 - manufacturer
6 - aesthetics
7 - brand

Notice how I put aesthetics near the end. Many people switch some of these around, and you have cases such as this:

1 - manufacturer
2 - aesthetics
3 - brand
4 - performance
5 - price
6 - noise
7 - power consumption
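Just to illustrate turning a priority order like that into something you can actually compare cards with (the weights and ratings below are completely made up):

```python
# Hypothetical weights mirroring the first priority list (higher = more important).
WEIGHTS = {"price": 7, "performance": 6, "power": 5, "noise": 4,
           "manufacturer": 3, "aesthetics": 2, "brand": 1}

def gpu_score(ratings):
    # ratings: criterion -> subjective 0..10 rating for one card
    return sum(WEIGHTS[crit] * rating for crit, rating in ratings.items())

# Two made-up cards: cheap-but-plain vs. pretty-but-pricey.
plain  = {"price": 9, "performance": 8, "power": 7, "noise": 6,
          "manufacturer": 7, "aesthetics": 4, "brand": 5}
pretty = {"price": 4, "performance": 8, "power": 7, "noise": 6,
          "manufacturer": 7, "aesthetics": 10, "brand": 8}
print(gpu_score(plain), gpu_score(pretty))  # 204 184: price-first weights favour plain
```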
 
Now AMD has it, and with Wattman you don't even have to think about MSI Afterburner because the built-in tool is so powerful.
Wattman is good indeed, but I can't get it to hold my overclock settings on my 480. Afterburner holds whatever daily overclock I want, startup to startup, so that's what I use.
 
If it's the same performance and price as the competitor, then it's definitely not crap. ... Notice how I put aesthetics near the end. Many people switch some of these around ...

Well, it's kind of a rule that "brand" dictates several of the qualities down the line. For example, you expect certain very specific qualities when your brand preference is, say, ASUS's "Strix" or Sapphire's "Nitro". It's why people have a tendency to stick to one brand: the brand basically assures you'll get the performance, noise, and aesthetics you like, and you're prepared to pay a certain price for it. You may have certain brand preferences, and if one happens to be just too expensive for some reason, or they made a specific design switch (like, for example, how most switched to wider graphics card coolers), you may switch between brands if there is some other option.

I don't have to specifically state what qualities I need; I can just say that if I were picking a graphics card, it would be either a Strix, Aorus, Gaming X or Nitro, because they basically have the overall qualities I always look for in graphics cards.

^This, a 1000 times. I was excited to try GeForce Experience ...

I mean, if the NV CP had NVIDIA Experience's design, I'd actually be fine with it. Minus the registration thing. Though NVIDIA Experience, other than its video recording part, doesn't really offer anything useful to me. Those auto game optimizations are silly. Is it really that hard to tweak game settings yourself? How can some software know what your visual-preference compromise in games is? For example, NVIDIA Experience may trim down shadows and textures in order to gain performance, where I'd sacrifice everything else other than those exact two things. So I just never rely on this stuff, because in the end I'd glance over the settings anyway to see what the thing fiddled with and whether I like it.
 
I'm going to continue buying ATI-AMD, even if they aren't the best, as long as I have a choice.
I don't care how much better Nvidia is... I really don't.
My first card was an ATI and my last will be the same.
Yes, I have bought Nvidia... And will occasionally... But not for my main PC...
I don't consider myself a fanboy... I just have a very good point of reference in a long line of improvements to the same product line... Why change now... Too old.
 