Tuesday, February 25th 2025

NVIDIA GeForce RTX 5070 Reviews Reportedly Due for Publication on March 4

NVIDIA's upcoming mid-range GeForce RTX 5070 12 GB model is almost ready for launch, according to recent reports. Industry moles reckon that GB205 GPU-based specimens are already in the clutches of press and influencer outlets; review embargoes are due to be lifted on March 4 for $549 MSRP-conformant SKUs (as disclosed by a VideoCardz source). Last week, we heard whispers about Team Green's (allegedly) troubled production cycle for incoming GeForce RTX 5070 and RTX 5060 models.

Insiders insist that these issues have caused a delay; many believed that NVIDIA had (prior) plans for a February GeForce RTX 5070 launch. A revised schedule was leaked to VideoCardz; the publication posits that GeForce RTX 5070 cards will launch at retail on March 5, with non-MSRP ($549+) reviews projected to go live on the same day. Based on various leaks, NVIDIA and AMD will likely clash with their respective new offerings. Right now, reviewers could be dealing with sizable piles of competing Team Green and Team Red hardware. Graphics card enthusiasts will be looking forward to incoming comparisons—GeForce RTX 5070 and its Ti sibling versus Radeon RX 9070 XT and RX 9070 (non-XT).
VideoCardz has updated its speculative timetable for forthcoming graphics card embargoes:
  • February 28: Radeon RX 9070 Series Announcement
  • March 4: GeForce RTX 5070 reviews (NEW)
  • March 5: GeForce RTX 5070 sales & non-MSRP Reviews
  • March 5: Radeon RX 9070 Series reviews
  • March 6: Radeon RX 9070 Series sales
Source: VideoCardz

36 Comments on NVIDIA GeForce RTX 5070 Reviews Reportedly Due for Publication on March 4

#1
bug
Does that include all the ROPs? Or do they still come in installments? :slap:
Posted on Reply
#2
Space Lynx
Astronaut
5070 is meant for 1080p gamers i assume, which is still a lot of people. 12gb vram isn't all that bad for that resolution from what i remember. i often find High preset looks better than Ultra preset in a lot of games these days too, terrible optimization from devs.

so yeah, if i was building a budget rig as a 1080p gamer, a 5070 would be a great choice assuming you can get it at msrp 549. future games will take advantage of the dlss4 frame gen well, so yeah its not a terrible option. 5070 ti is still the sweet spot though.
bug: Does that include all the ROPs? Or do they still come in installments? :slap:
I'm assuming this will be replaced under warranty though and quickly since its easy to prove and a known issue. Nvidia says its 0.5% of all gpu's sold, if true that's not terrible imo.
Posted on Reply
#3
BigMack70
Space Lynx: 5070 is meant for 1080p gamers i assume, which is still a lot of people. 12gb vram isn't all that bad for that resolution from what i remember. i often find High preset looks better than Ultra preset in a lot of games these days too, terrible optimization from devs.

so yeah, if i was building a budget rig as a 1080p gamer, a 5070 would be a great choice assuming you can get it at msrp 549. future games will take advantage of the dlss4 frame gen well, so yeah its not a terrible option. 5070 ti is still the sweet spot though.

I'm assuming this will be replaced under warranty though and quickly since its easy to prove and a known issue. Nvidia says its 0.5% of all gpu's sold, if true that's not terrible imo.
"budget rig"... "$550 GPU is a great choice [for 1080p]"

:kookoo:


Y'all been smoking way too much of Jensen's marketing.
Posted on Reply
#4
Space Lynx
Astronaut
BigMack70"budget rig"... "$550 GPU is a great choice [for 1080p]"

:kookoo:


Y'all been smoking way too much of Jensen's marketing.
i mean my teenager nephew is pulling in $280 a week at his part time. it would literally only take one month's pay to build him a decent rig since he has no bills. with a 5070 and a 1080p 180hz 23.8" monitor, he would have a blast, and that would last him many years thanks to 3-4x frame gen in future games/lowering settings

every market has inflation, it is what it is. adjust or whine, i dunno what to tell ya
Posted on Reply
#5
medi01
28th is just "announcement"?

Why not swing with the 9070 XT?
Posted on Reply
#6
Lionheart
The memory bandwidth is disappointing, hopefully the 20 Gbps overclocks well.
Posted on Reply
#7
Onasi
Space Lynx: i mean my teenager nephew is pulling in $280 a week at his part time. it would literally only take one month's pay to build him a decent rig since he has no bills. with a 5070 and a 1080p 180hz 23.8" monitor, he would have a blast, and that would last him many years thanks to 3-4x frame gen in future games/lowering settings

every market has inflation, it is what it is. adjust or whine, i dunno what to tell ya
Ah yes, because the entire world has a similar earning profile. Newsflash - 1080p PC gaming is really popular nowadays because of e-sports titles and is experiencing growth mostly through emerging markets. No shot those populations will stomach $550 (more in reality) for a 1080p card. What we see in the Steam hardware survey kinda bears it out - the cheaper 4060 is the king.

Now, one might argue that e-sports games don't really need a 5070-class GPU, sure, but then, what, are we arguing that building a 1080p rig for single-player games with a 600-dollar-class GPU is a sign of a sane market in big 2025? That level of performance cost HALF as much a bit less than a decade ago. You can "muh inflation" all you like, I thought technical progress was supposed to gradually lower the barrier of entry for certain performance tiers, not… whatever the fuck current GPU clownery is.
Posted on Reply
#8
dir_d
Lionheart: The memory bandwidth is disappointing, hopefully the 20 Gbps overclocks well.
So far all SKUs, from the 5090 all the way down to the 5070 Ti, have been limited on memory overclocking at the driver level. Don't get your hopes up.
Posted on Reply
#9
dyonoctis
Space Lynx: 5070 is meant for 1080p gamers i assume, which is still a lot of people. 12gb vram isn't all that bad for that resolution from what i remember. i often find High preset looks better than Ultra preset in a lot of games these days too, terrible optimization from devs.

so yeah, if i was building a budget rig as a 1080p gamer, a 5070 would be a great choice assuming you can get it at msrp 549. future games will take advantage of the dlss4 frame gen well, so yeah its not a terrible option. 5070 ti is still the sweet spot though.

I'm assuming this will be replaced under warranty though and quickly since its easy to prove and a known issue. Nvidia says its 0.5% of all gpu's sold, if true that's not terrible imo.
Something like a 7700XT will be more than enough for 1080p gaming. You could even go as low as a 7600. If the 5070 is going to be as fast as 4070 super, it's going to be a tad overkill/overpriced imo
Posted on Reply
#10
BigMack70
Consoles do 1080p 60/120 for less than $500.

A $550 card is not a budget PC, and it's embarrassing if it's only satisfactory for 1080p.

$550 was good for 1080p gaming 13 years ago with the gtx 680 and HD 7970
Posted on Reply
#11
dyonoctis
Onasi: Ah yes, because the entire world has a similar earning profile. Newsflash - 1080p PC gaming is really popular nowadays because of e-sports titles and is experiencing growth mostly through emerging markets. No shot those populations will stomach $550 (more in reality) for a 1080p card. What we see in the Steam hardware survey kinda bears it out - the cheaper 4060 is the king.

Now, one might argue that e-sports games don't really need a 5070-class GPU, sure, but then, what, are we arguing that building a 1080p rig for single-player games with a 600-dollar-class GPU is a sign of a sane market in big 2025? That level of performance cost HALF as much a bit less than a decade ago. You can "muh inflation" all you like, I thought technical progress was supposed to gradually lower the barrier of entry for certain performance tiers, not… whatever the fuck current GPU clownery is.
tbf, GPU dies are not getting smaller, wafers are getting much more expensive, and games are getting more demanding. RT could be blamed, but some devs have said they could find ways to bring a GPU to its knees by improving rasterisation alone. 10x the shader performance still wouldn't be overkill with all the things that still need to be improved. The biggest difference now is that 4K gaming and high refresh rate gaming are trying to become a thing, but that requires roughly 4x the power needed to max out 1080p, while game devs won't chill with visual improvements.

60 fps 1080p max setting is still fairly cheap. Unless you shop with "future proofing" in mind thinking that 12 GB is clutch, and 16GB the ideal amount of vram for 1080p :D

Unlike other areas we are not going to reach a "good enough" point, unless we reach some kind of limit software wise.
Posted on Reply
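(For reference, the roughly 4x figure dyonoctis cites above falls straight out of the raw pixel counts. A minimal sketch comparing resolutions only; it says nothing about shaders, memory bandwidth, or any particular GPU.)

```python
# Pixel-count comparison behind the "4K needs ~4x the power of 1080p" remark.
# Resolution math only; real scaling also depends on the engine and settings.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")

# Output:
# 1080p: 2,073,600 pixels (1.00x 1080p)
# 1440p: 3,686,400 pixels (1.78x 1080p)
# 4K: 8,294,400 pixels (4.00x 1080p)
```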
#12
Space Lynx
Astronaut
i mean if he wants to play cyberpunk 2077 and kingdom come deliverance 2 at 180 fps on a 180hz 1080p monitor on high/ultra settings, he prob will need a 5070.

not defending nvidia at all, i only own amd products. im just saying for 549 imo its not a horrible card for 1080p gamers who like to run their games at 140+ frames. little bit of future proofing built in with the 2-4x frame gen, etc.

im not comparing anyone to the rest of the world, everyones situation is diff. im just saying for him i'd prob try to talk him in to spending an extra 150 to upgrade to the 5070
Posted on Reply
#13
_roman_
Put reviewers under NDA contracts so other youtubers and "leakers" can slowly reveal when new hardware will be published.

Kinda fishy. For some reason the speculative release table is quite often right. I wonder why.
Posted on Reply
#14
bug
Space Lynx: I'm assuming this will be replaced under warranty though and quickly since its easy to prove and a known issue. Nvidia says its 0.5% of all gpu's sold, if true that's not terrible imo.
I believe that, but it's still embarrassing this can go undetected through QA. You know, there's only two ways this can happen. Either they don't check for ROPs during QA or they do, but can't count very well.
Posted on Reply
#15
Space Lynx
Astronaut
bug: I believe that, but it's still embarrassing this can go undetected through QA. You know, there's only two ways this can happen. Either they don't check for ROPs during QA or they do, but can't count very well.
I imagine QA works on sampling, as checking every item's software-level readings would take an immense amount of time: for every 10 GPUs that come down the conveyor belt, you pick one at random, test it in-depth, and so on.

Maybe this is not how it works, I just assume it is, so I dunno (i do agree with you overall though, it should have been something more foolproof, as I don't recall AMD ever having this issue, at least not in recent years)
Posted on Reply
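(Purely as an illustration of the lot-sampling idea Space Lynx describes above, and not a claim about how NVIDIA's QA actually works: a minimal sketch that pulls one unit at random from each batch of ten and checks its ROP count against a hypothetical spec value. It also shows why sampling can miss a defective unit entirely.)

```python
import random

EXPECTED_ROPS = 48  # hypothetical spec value, purely for illustration


def sample_batches(units, batch_size=10):
    """Pick one unit at random from every batch of `batch_size` units."""
    for start in range(0, len(units), batch_size):
        batch = units[start:start + batch_size]
        yield random.choice(batch)


def in_depth_test(unit):
    """Stand-in for a full functional test; here we only verify the ROP count."""
    return unit["rops"] == EXPECTED_ROPS


# Toy production run of 100 units; unit 13 "ships" with 8 ROPs disabled.
units = [{"id": i, "rops": EXPECTED_ROPS} for i in range(100)]
units[13]["rops"] = EXPECTED_ROPS - 8

for unit in sample_batches(units):
    if not in_depth_test(unit):
        print(f"unit {unit['id']} failed: {unit['rops']} ROPs")

# The defective unit is only caught if it happens to be the one drawn from its
# batch, which is exactly how a small defect rate can slip past sampled QA.
```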
#16
Bomby569
i will wait for the 16 GB version, thanks
Posted on Reply
#17
Space Lynx
Astronaut
Bomby569: i will wait for the 16 GB version, thanks
at 1440p you really should wait for a 20+ GB version, as reviews show 16 GB is already hampering performance at 1440p in a very select few games
Posted on Reply
#18
Bomby569
Space Lynx: at 1440p you really should wait for a 20+ GB version, as reviews show 16 GB is already hampering performance at 1440p in a very select few games
that seems exaggerated, i don't really use ultra or even all high settings much, and RT should be the same, a gimmick
but i wouldn't say no
Posted on Reply
#19
Ruru
S.T.A.R.S.
Lionheart: The memory bandwidth is disappointing, hopefully the 20 Gbps overclocks well.
Blame the 192-bit bus. Almost 700GB/s doesn't sound too bad to me though.
Bomby569: i will wait for the 16 GB version, thanks
GB205 has a 192-bit bus so a 16GB version is just not gonna happen.
Posted on Reply
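(Both of Ruru's numbers follow from the 192-bit bus. A minimal back-of-the-envelope sketch, assuming 28 Gbps GDDR7 and 2 GB (16 Gbit) modules, which is the configuration generally reported for the RTX 5070 rather than anything confirmed here.)

```python
# Back-of-the-envelope GDDR capacity and bandwidth math for a 192-bit bus.
BUS_WIDTH_BITS = 192
CHANNEL_BITS = 32        # each GDDR chip occupies a 32-bit channel
CHIP_CAPACITY_GB = 2     # 2 GB (16 Gbit) modules assumed
DATA_RATE_GBPS = 28      # 28 Gbps GDDR7 assumed

chips = BUS_WIDTH_BITS // CHANNEL_BITS                 # 6 chips
capacity_gb = chips * CHIP_CAPACITY_GB                 # 12 GB
bandwidth_gb_s = BUS_WIDTH_BITS * DATA_RATE_GBPS / 8   # 672 GB/s

print(f"{chips} chips -> {capacity_gb} GB, {bandwidth_gb_s:.0f} GB/s")
# 6 chips -> 12 GB, 672 GB/s

# 16 GB doesn't divide evenly across six 32-bit channels with uniform modules;
# the realistic options on this bus are 12 GB (2 GB chips) or 18 GB (3 GB chips),
# which is why a straight 16 GB variant of GB205 is unlikely.
```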
#20
Bomby569
Ruru: GB205 has a 192-bit bus so a 16GB version is just not gonna happen.
would it be the 1st time they would use another version for the same class? i'm just asking
Posted on Reply
#21
Ruru
S.T.A.R.S.
Bomby569: would it be the 1st time they would use another version for the same class? i'm just asking
Usually that happens later during a card's lifetime when they use a different chip for a lower tier card. Take RTX 2060 TU104 or 4070 TiS AD102 for example.
Posted on Reply
#22
Bomby569
Ruru: Usually that happens later during a card's lifetime when they use a different chip for a lower tier card. Take RTX 2060 TU104 or 4070 TiS AD102 for example.
see! there will be a cost, one that they can afford, because I'm sure they will put a nice premium on those 6GB just like they did with the 4060 Ti 16GB
I'm convinced it will happen
Posted on Reply
#23
Arkz
$549 12GB card in 2025... lol.
Posted on Reply
#24
neatfeatguy
Arkz: $549 12GB card in 2025... lol.
You can easily go for the $300 3060 from ASUS. 12GB! Only $300!

After the 33 RTX 5070 cards sell out and inventory is depleted, a 3060 will be about your only option for a while.
Posted on Reply
#25
Crackong
Soon there will be an RTX 5070 "-8 ROPs" edition

Posted on Reply