
5070 cards available below £550 in the UK

Fair enough if it's still getting by (it depends on how each game handles VRAM too, and performance is only one metric), but I'm still not going to excuse NVIDIA gimping their £500+ GPUs. And it's the same for AMD: they're planning 8GB for the 9060, so they're no better. They only try when they have to; it's just better for us when they actually have to.


Nope. If AMD drops the 9070 GRE with 12GB they're in the same boat as NVIDIA. At these prices it's blatant corpo-scum behaviour to gimp these GPUs.
Do you think 9060 users who spend $200-300 on GPUs are really playing at higher than 1080p?
 
It doesn't matter if you like the guy, or if you like YouTubers, or whatever drama you didn't like. He presented the evidence. The messenger is irrelevant.

Steve can think what he likes. x50 cards (pre-Ampere) were 100-ish watts and under USD200. There's no way on God's green earth that a card with the resources of a 5070 is coming in at those numbers.

The 5070 averages about 60fps in TPU testing at 4K, with heavyweight titles around 30fps, whilst drawing 200-250W. The Turing card that performed roughly similarly in its contemporary suite is the 2080.
 
Steve can think what he likes. x50 cards (pre-Ampere) were 100-ish watts and under USD200. There's no way on God's green earth that a card with the resources of a 5070 is coming in at those numbers.

The 5070 averages about 60fps in TPU testing at 4K, with heavyweight titles around 30fps, whilst drawing 200-250W. The Turing card that performed roughly similarly in its contemporary suite is the 2080.

facts are not "think what he likes"
 
facts are not "think what he likes"
I'm not sure you know what the word "facts" means.

EDIT: Sorry, that was an unhelpful, knee-jerk reaction. Anyway, the mistake I feel Steve makes is the same one a lot of folks are making: using the x90 cards as a baseline. They're a terrible baseline. The 3090 blew the lid off the power and cost ceiling of the PC graphics market, recalibrating expectations in the process, exactly as Nvidia intended. I've run the numbers: 200-250W of PC graphics has cost $500-700 for the past two decades. I don't know why anyone would expect that calculus to change, regardless of what kind of performance the top-level consumer card provides, because NV can change what the top-level card can do more or less at will.

Are Steve's numbers wrong? No. But I disagree with his interpretation of those numbers.

Would you feel better if the 5070 were called a 5050? It would cost the same regardless of the number on the box, after all.
 
Do you think 9060 users who spend $200-300 on GPUs are really playing at higher than 1080p?
They might. It's about giving consumers headroom to work with, and it's well within reason to ask that AMD and NVIDIA provide more than 8GB on their budget cards.

For example, the 4060 suffered from stuttering, bad textures, pop-in, and performance loss (game dependent) in situations it was more than capable of handling had it just had more VRAM. AMD pulled the same stunt with the RX 7600, and both companies released 16GB models as an overpriced apology.

A card like the 5070 should never be in a scenario where it is unable to meet the same demand as a 9070, but there are cases where you can push a game to a level which a 9070 can handle but a 5070 cannot (like in Indiana Jones). These cases aren't common (yet), and game developers share a lot of the blame, but it doesn't help when the GPUs themselves are intentionally limited. :shadedshu:
 
This is what amazes me. Someone with Steve's experience is making such basic mistakes as using flagship performance as a reference point. Flagships have always been ridiculous. Do you want to go back to those dual-GPU days? This time the RTX 5090 is a particularly over-fattened GPU, while the rest of the lineup stayed the same while receiving price drops. A strategy akin to the good old days, when lackluster performance gains were compensated by lower prices in the new generation. If anything, this generation should be celebrated, because the RTX 5060 finally seems to be an RTX 5060 and not an RTX 5050 with a bad name. Nvidia is feeling the pressure from the Intel B580/B570 graphics cards, and it keeps them in line.

He makes such amateurish mistakes despite being proclaimed a tech Jesus. However, he has the ego of a messiah too.
 
And before someone comments on how that card makes no sense for 1080p, I personally know one individual who wants a 70-class card for 1080p.
And it's completely valid. Some people game at relaxed settings that allow for sky-high framerates (shaving a major part off the VRAM requirements at the same time), so going, say, 1080p Medium-High + 5070 makes for hundreds of FPS without ever coming close to the VRAM ceiling. Even despite the card "only" having 12 GB on board.

People keep forgetting that "X GB isn't enough" should end with "...for this use case." The fact you're not comfy with X GB doesn't make this amount inherently bad.

Still, the 4060 having less VRAM than the 3060 stinks. And the main issue of modern GPUs isn't how much VRAM they have, they have enough; it's how much VRAM they have per dollar. And with price tags like these, it barely even matters because the overall performance is also lacking.

My 6700 XT has 12 GB of VRAM, which is a lot by the standards of five years ago. Do I run out of 12 GB? Maybe if I crank the settings up so high that my GPU would've only made it to 10-something FPS even if it had enough VRAM. Which is probably not the smartest thing to do if you have a 1080p 170 Hz monitor.
 
Do you think 9060 users who spend $200-300 on GPUs are really playing at higher than 1080p?
Yes. 1440p monitors cost less than $200 now, that's cheaper than the actual card.
And the main issue of modern GPUs isn't how much VRAM they have, they have enough
Not really. They're releasing alongside games they can't even play NOW, that's the exact opposite of "enough".
 
Not really. They're releasing alongside games they can't even play NOW, that's the exact opposite of "enough".
It's only a problem because:
1. GPUs are ridiculously expensive.
2. Game developers do it wrong.
 
It's only a problem because:
1. GPUs are ridiculously expensive.
2. Game developers do it wrong.
Alright. The simplest solution is to just increase the VRAM though. It's cheap and easy and it's not like they would be eating into other segments if they did.
 
Alright. The simplest solution is to just increase the VRAM though. It's cheap and easy and it's not like they would be eating into other segments if they did.
And the right solution is the buyers yelling at both NVIDIA and AMD that their solutions outright stink whilst NOT BUYING THEM and making it crystal clear that a 5070 for 550 USD is just a robbery. Either we get it discounted to, say, 450 USD which makes it an actually OK device for its money, or we get a 5070 Super with 16 GB and some more calculating power for essentially the same 550 USD and it's also fine.

Just slapping more VRAM, don't get me wrong, is nice but it's not a solution. The whole paradigm of zero-stock, 999999999-price GPUs with absolutely pathetic excuses as to why it's happening is what needs to be put on a one-way flight out of here.
 
Isn't the RTX 5070 faster and cheaper than the card it is replacing? Not to mention that VRAM speed and bandwidth are higher and Nvidia introduced better memory handling algorithms?

The point is, Nvidia did advance in this regard. Its competition offers more VRAM, but of an inferior type: GDDR6 vs GDDR7. Even previously, the amount of VRAM AMD GPUs were drawing was noticeably higher than that of Nvidia's cards. Nvidia VRAM is worth more than AMD's VRAM. This comparison is a lot more nuanced and not as black and white as people would like it to be.

Does it suck, and should Blackwell have come with more VRAM? Absolutely. Is it enough today? Indeed it is. I think the VRAM fanatics are a small and loud group of people, and Nvidia's GPUs will continue selling despite the whining.
 
And the right solution is the buyers yelling at both NVIDIA and AMD that their solutions outright stink whilst NOT BUYING THEM and making it crystal clear that a 5070 for 550 USD is just a robbery. Either we get it discounted to, say, 450 USD which makes it an actually OK device for its money, or we get a 5070 Super with 16 GB and some more calculating power for essentially the same 550 USD and it's also fine.
Don't look at me, I'm doing just that. A 5070 Super would probably have 18GB with 3GB modules... or 15GB if they left one off because they don't want you getting more VRAM than a 5080.
Just slapping more VRAM, don't get me wrong, is nice but it's not a solution.
Adding more VRAM is absolutely a solution to problems where there's not enough VRAM, they just choose not to do it for some reason. With current cards it's not possible without clamshelling or denser modules, but there's really no good reason to engineer around having so little during the design stage. As for keeping GPUs in stock, I don't think VRAM is the bottleneck here.
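For the curious, the capacity options fall straight out of the bus width: one GDDR module per 32-bit channel, doubled if you clamshell. A minimal sketch of that arithmetic, assuming a 192-bit bus like the 5070's and today's 2 GB / 3 GB GDDR7 densities (illustrative only, not a statement about any actual SKU):

```python
# Rough sketch: VRAM capacity options for a given memory bus width.
# Assumes one GDDR module per 32-bit channel, doubled in clamshell mode.

def vram_options(bus_width_bits: int, densities_gb=(2, 3)) -> dict:
    channels = bus_width_bits // 32          # e.g. a 192-bit bus -> 6 channels
    options = {}
    for density in densities_gb:
        options[f"{density}GB modules"] = channels * density
        options[f"{density}GB clamshell"] = channels * density * 2
    return options

# Hypothetical 5070-style 192-bit bus:
print(vram_options(192))
# {'2GB modules': 12, '2GB clamshell': 24, '3GB modules': 18, '3GB clamshell': 36}
```

Which is why the realistic jumps from 12 GB are 18 GB (3 GB modules) or 24 GB (clamshell), and why 15 GB would mean dropping a channel, exactly the kind of cut-down speculated about above.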
 
The point is, Nvidia did advance in this regard. Its competition offers more VRAM, but of an inferior type: GDDR6 vs GDDR7.
As far as I've seen, 12GB of GDDR7 is still the first to hit a wall vs 16GB of GDDR6. What's the nuance that should convince a buyer towards 4GB less but GDDR7? Genuinely asking.

NVIDIA could sell the 5070 with 8GB on their name alone, so I don't see a good point in saying they'll sell regardless of what we think. This conversation should be about them limiting a card's capabilities even if the ceiling is not often reached. For example, I never used all 16GB on my 6800, but I was also able to adjust all my settings without worrying about VRAM and then download additional HD texture packs because I knew for sure I had the room. This card was £350 when I bought it. The 5070 is 30% more powerful, and I would hope it isn't possible for there to be a game where the 5070 would choke before a 6800. :rolleyes:
 
Isn't the RTX 5070 faster and cheaper than the card it is replacing? Not to mention that VRAM speed and bandwidth are higher and Nvidia introduced better memory handling algorithms?

The point is, Nvidia did advance in this regard. Its competition offers more VRAM, but of an inferior type: GDDR6 vs GDDR7. Even previously, the amount of VRAM AMD GPUs were drawing was noticeably higher than that of Nvidia's cards. Nvidia VRAM is worth more than AMD's VRAM. This comparison is a lot more nuanced and not as black and white as people would like it to be.

Does it suck, and should Blackwell have come with more VRAM? Absolutely. Is it enough today? Indeed it is. I think the VRAM fanatics are a small and loud group of people, and Nvidia's GPUs will continue selling despite the whining.
NVIDIA VRAM is indeed worth more than AMD VRAM, but additional VRAM is worth even more than fast VRAM. A 12GB 3060 with GDDR6 is worth much more than an 8GB 5060 with GDDR7 in every application that actually uses it. Swapping is absolute murder; 16GB of GDDR5 is way more valuable than 8GB of HBM4, full stop.
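To put numbers on "swapping is absolute murder": once the working set spills out of VRAM, the overflow has to come across the PCIe bus instead, and the bandwidth gap is brutal. A rough sketch with approximate, purely illustrative figures:

```python
# Back-of-the-envelope: local VRAM bandwidth vs. spilling over PCIe.
# All figures approximate, for illustration only.

def bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s = bus width * data rate per pin / 8."""
    return bus_width_bits * gbps_per_pin / 8

gddr7_192bit = bandwidth_gbs(192, 28)   # ~672 GB/s (5070-class card)
gddr6_128bit = bandwidth_gbs(128, 17)   # ~272 GB/s (8 GB budget card)
pcie5_x16    = 64                       # ~64 GB/s, rough peak, one direction

print(f"GDDR7 192-bit: {gddr7_192bit:.0f} GB/s")
print(f"GDDR6 128-bit: {gddr6_128bit:.0f} GB/s")
print(f"PCIe 5.0 x16 : {pcie5_x16} GB/s "
      f"(~{gddr6_128bit / pcie5_x16:.1f}x slower than even the slow local VRAM)")
```

The moment a frame needs data that isn't resident, it waits on that much slower link, which is why the failure mode is stutter and collapsed lows rather than a gentle drop in average FPS.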
 
Zero issues with every game on 1440p ultra + RT with 3080 Ti 12 GB.

12 GB 4070 Ti does as well as 24 GB 7900 XTX in one of the most demanding RT games at Ultra.

[attachment 393280: benchmark chart]

Muh VRAM argument is boring and tired, let's quit it.

Play Wukong.
Before you say that's one game, play the latest AC, maybe even w/o up-scaling or RT.
Before you say that's two games, play the latest Spider-Man with RT.
Before you say that's three games, play Star Wars Outlaws with any kind of decent features.
Before you say that's four games... realize that's what people are looking at, and also that SM2 and AC are two of the most recently tested games on this site. I ran out of options because W1zzard's suite needs an overhaul.
Which is happening, thank goodness. Monster Hunter, etc.
Also, remember that you're still looking at averages when you struggle with that stutter. I mean really, I'm not trying to be cruel to people that don't get it, but seriously think about it for a second.
If averages are higher, and minimums are similar or lower, what does that mean? It means the lows are LOWER and happen more often, and the frame rate is less consistent; that's what that means. That's just provable math. That is undesirable.
Especially if it drops outside the typical VRR window of 48 fps.
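To make that concrete, here's a toy sketch with made-up frame times (purely illustrative, not from any review), showing how a higher average can hide much worse lows:

```python
# Toy example (made-up numbers): a higher average FPS can hide much worse lows.

def stats(frame_times_ms):
    fps = [1000 / t for t in frame_times_ms]
    avg = sum(fps) / len(fps)
    worst = sorted(fps)[: max(1, len(fps) // 100)]   # bottom 1% of frames
    one_pct_low = sum(worst) / len(worst)
    return avg, one_pct_low

# Card A: enough VRAM, steady ~16.7 ms frames (~60 fps).
card_a = [16.7] * 100

# Card B: faster on average, but spills VRAM and hitches hard five times.
card_b = [14.0] * 95 + [50.0] * 5                    # five 20 fps hitches

for name, run in (("A", card_a), ("B", card_b)):
    avg, low = stats(run)
    print(f"Card {name}: avg {avg:.0f} fps, 1% low {low:.0f} fps")
# Card A: avg 60 fps, 1% low 60 fps
# Card B: avg 69 fps, 1% low 20 fps  <- "wins" the average, falls out of the VRR window
```

Card B looks better on an average-FPS bar chart and feels far worse to play.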
Reviewers that are not on this site have shown this very thing, over and over. Some people need to look at more than one source; hopefully not one reliant on Big Green sponsorships for their income.

Some people really do have a difficult time understanding that, and this site's reviews truly do not explain it well. I will hammer it home until I'm blue in the face, because people take their cues from W1z.

Anyone recommending a *new* 12GB card right now for 1440p (especially RT) is either precious about their older purchase or a fool. It's not a tired argument, because some people give others incredibly bad advice.
This sometimes includes W1zzard, as with the 5070. Then look at LITERALLY EVERY OTHER REVIEW, and at people that understand RAM limitations (stutter/swapping/minimums), and the ways this card falters now.
And it will falter more in the future, as these features become more commonplace or even standardized into engines, which would likely be within the lifetime of this purchase. That is our point.
The 5070 is VERY bad; it's not just the RAM but the overall raster performance, especially for its price. It is a 1080p card... sometimes barely, right now. Yes, that's sometimes at high settings. It's also a new $550+ GPU.
Not that far off in price from a 9070 XT that will run ALL of the things I've mentioned just fine, even if that means VRR and/or an OC, but usually ~60 fps minimums; nothing is saving the 12GB cards.
Not now, and it will be even worse in the future (where I believe more games will be like Wukong, with even an overclocked 9070 XT barely staying in the 1440p VRR window at high settings, but it still likely will).

Comparing the 7900 XTX in RT is such a ridiculous straw man. If you want to do that, let's see a 12GB nVIDIA card run 4K raster in any kind of recent/demanding title. Exactly; things have different uses.
Many people don't understand how things function at all, and are so quick to blame games for nVIDIA's inadequacies. That is why we have these arguments; people are ignorant and I will show/tell them that.
I don't blame games for AMD not having similar RT performance in RDNA3; it's just what ended up getting implemented (likely due to nVIDIA's sponsorships), but that's still the reality people have to deal with.
Just like nVIDIA not giving enough VRAM (and sometimes even compute) per tier of card. Everything else is cope; the game standards have increased (and likely will a little more) regardless of whether you like it.
AMD adapted with RT/up-scaling, but always had strong enough compute/buffer per resolution before those got better.
nVIDIA did not, with shenanigans that have now gone on for generations, and I have consistently proven this as their cards, with one issue or another, quickly fall out of the purpose they are sold for.

I'll quit it when people stop saying things that are frankly just wrong.

FWIW, I don't like the 9070 or anything below it either, so don't give me that nonsense. I have said we are talking about a new generation that justifies a certain level of compute/RT/buffer/up-scaling quality.
Many cards don't have that. If that's not in your budget right now, I understand (even though the 9070 XT made it more accessible). But then you should be buying an appropriately cheaper/better raster card IMHO.
Probably still with 16GB+ (like a 6800 XT/7800 XT/79xx)! Maybe not *because* of that; it's just a side effect of it being the most reasonable tool for the job without faltering at anything it's being sold to do (which isn't RT).
 
As far as I've seen, 12GB of GDDR7 is still the first to hit a wall vs 16GB of GDDR6. What's the nuance that should convince a buyer towards 4GB less but GDDR7? Genuinely asking.

NVIDIA could sell the 5070 with 8GB on their name alone, so I don't see a good point in saying they'll sell regardless of what we think. This conversation should be about them limiting a card's capabilities even if the ceiling is not often reached. For example, I never used all 16GB on my 6800, but I was also able to adjust all my settings without worrying about VRAM and then download additional HD texture packs because I knew for sure I had the room. This card was £350 when I bought it. The 5070 is 30% more powerful, and I would hope it isn't possible for there to be a game where the 5070 would choke before a 6800. :rolleyes:

Simple: Nvidia VRAM is worth more. So getting 4 GB of extra GDDR6 is not that massive a difference. It is more like a 2 GB difference, and people are losing their minds over Nvidia while giving AMD a pass. Logically, if Nvidia is unacceptable on VRAM then the RX 9070 XT is also bad in this regard, as it has just a little more of it. It is like all that VRAM panic when it started in 2023; it was proven largely baseless. If a 2 GB VRAM difference doesn't matter that much, then that is also roughly the effective difference between the RTX 5070 and the RX 9070 XT, due to AMD's lower-quality VRAM. Or lower quality, or slower speed, call it what you want.

As for how much less valuable AMD VRAM is, here is the proof: AMD allocates between 1 and 3 GB more VRAM depending on the game, in scenarios which are not even VRAM-limited for the RTX 5070. So if you want to talk about how bad Nvidia is with its VRAM, then you have to accept that AMD with its RX 9070 is just as crap. Well, crap which stinks just a little less. That is what they were trying to do all this time anyway.
 
If averages are higher, and minimums are similar or lower, what does that mean? It means the lows are LOWER and happen more often, and the frame rate is less consistent; that's what that means. That's just provable math. That is undesirable.
[...]
Anyone recommending a *new* 12GB card right now for 1440p (especially RT) is either precious about their older purchase or a fool. It's not a tired argument, because some people give others incredibly bad advice.
[...]
I'll quit it when people stop saying things that are frankly just wrong.
Blah blah blah 1% lows. "Frankly wrong", lmao.

Meanwhile in reality.

Better min FPS than 16 GB GRE or 24 GB 3090.

[attachment 1000013282.png: benchmark chart]


Meanwhile 10 GB 3080 beats "future proof" 16 GB 7800 XT.

Meanwhile 8 GB 4060 beats 16 GB 7600XT.

Meanwhile 8 GB 3060 Ti beats 12 GB 6700 XT.

Here's that 12 GB card beating a 24 GB 3090 in 4K, since you asked.

[attachment 1000013283.png: benchmark chart]
 
This is roughly how it works:

0. Desperate, and trigger-happy people buy more GPUs than they actually need.
1. Both nVidia and AMD recognize the sitch and charge more because it sells better than expected.
2. You get less VRAM per dollar than you should've gotten if people were rational.
3. In the meantime, game developers struggle to ship optimized games (at least 90% of the fault is on management), slapping ridiculously heavy textures on top of post-effect after post-effect: lacking the time budget to make these post-FX run right, they compensate for the blur and smearing by enlarging the textures. Almost no one ever thinks of disabling the post-FX they don't have the time or skill to configure correctly. (A rough texture-size sketch follows after this list.)
4. VRAM requirements go north, video card manufacturers' generosity goes south.
5. ????????
6. Happy new 2008! You can afford 1080p gaming if you have 600 USD for a GPU!
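As a rough illustration of point 3: texture resolution is where the VRAM goes. A quick sketch, assuming 4 bytes per pixel, roughly 4:1 block compression (BC7-style), and about a third extra for mip levels; the numbers are approximate and purely illustrative:

```python
# Rough sketch: VRAM cost of one texture at different resolutions.
# Assumes 4 bytes/pixel uncompressed, ~4:1 block compression (BC7-style),
# and ~33% extra for the mipmap chain. Purely illustrative numbers.

def texture_mb(resolution: int, bytes_per_pixel: int = 4,
               compression_ratio: float = 4.0, mipmaps: bool = True) -> float:
    size = resolution * resolution * bytes_per_pixel / compression_ratio
    if mipmaps:
        size *= 4 / 3
    return size / (1024 ** 2)

for res in (1024, 2048, 4096):
    print(f"{res}x{res}: ~{texture_mb(res):.0f} MB per texture")
# 1024x1024: ~1 MB   2048x2048: ~5 MB   4096x4096: ~21 MB
```

Every step from 2K to 4K textures quadruples the footprint, so a few hundred unique 4K textures resident at once is multiple gigabytes on their own.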

This is a systemic disease. What Joe Public should do with this is:

1. Stop buying video cards if you don't absolutely need them right here, right now. Having low FPS in games doesn't qualify for that.
2. Stop pre-ordering video games. Especially if they're made by big companies.

What in fact will be done is:

1. Whining on forums.
2. Many YouTube videos "revealing" the "corporate greed."
3. People panicking and overbuying GPUs "because it'll be even worse tomorrow."
4. ????????
5. More of the same.

The problem isn't GPUs, isn't VRAM, isn't tariffs or whatnot. The problem is homo sapiens sapiens that fails to live up to its name.
 
Ah, facts. That old chestnut :)

The problem is that "facts", just like statistics, scientific papers, and any other allegedly hard internet currency, can be massaged and presented in a way that supports your narrative. The proof of that is just about anywhere you look on the internet, where people on two sides of some argument will bombard each other with "facts" ad nauseam.

If you think that somebody like the aforementioned "Tech Jesus", who makes bank from his videos and merch, doesn't have an agenda or follow a narrative (even subconsciously), then I have a piece of old rope to sell you.

That's not to say he or others are always wrong, it's just that taking all their output as gospel is really naive.
 
Stop the insults.
Stick to the topic.
And, report problems... the moderation team will deal with them. Don't become the problem by making retaliatory comments!
 
1. Stop buying video cards if you don't absolutely need them right here, right now. Having low FPS in games doesn't qualify for that.
Sorry, but try as I might, I really struggle to understand this argument. What does it even mean, "if you don't really need them"? Isn't it obvious that GPUs (past some basic one to run your OS if your CPU can't) are a want, not a need? We're in a hobby space, so any lamentations about "needs" are a waste of time. And sure, "low fps" (or some other related factor) absolutely qualifies as a reason for an upgrade. If it doesn't, you might as well start telling people to take up fishing or stamp collecting (not that most other hobbies are really cheap either at higher levels).

I'm also up for calling out companies/devs/etc for whatever snafus they perpetrate but going into full conspiracy theory mode all the time (like your rant about textures) can get a bit tiresome.

Now, I seem to recall (?) you have already called me "an enabler" :) before (apologies if it was somebody else though), so perhaps this is pointless as a direct reply, but there are enough of these sentiments out there in general.
 
1. Stop buying video cards if you don't absolutely need them right here, right now. Having low FPS in games doesn't qualify for that.
2. Stop pre-ordering video games. Especially if they're made by big companies.
[...]
The problem isn't GPUs, isn't VRAM, isn't tariffs or whatnot. The problem is homo sapiens sapiens that fails to live up to its name.
Nah, the blame definitely doesn't lie on the shoulders of gamers. It wouldn't matter if every single one of us stopped buying NVIDIA GPUs tomorrow; they'd just move the remaining supply to datacenter. You have to remember, 90% of their sales go to datacenter; we're 10% of their market. NVIDIA has the lion's share of the power here.
 
On a side note, in Europe you can now get a 5070 Ti for about 50 to 100 more than a 9070 XT, and a 5070 for around the same price as a 9070. AMD managed to cook itself even in this climate, even with Nvidia offering them everything; it's actually impressive.
 