
MSI GeForce RTX 5070 Ti Ventus 3X OC

Joined
Jul 14, 2018
Messages
486 (0.20/day)
Location
Jakarta, Indonesia
System Name PC-GX1
Processor i9 10900 non K (stock) TDP 65w
Motherboard asrock b560 steel legend | Realtek ALC897
Cooling cooler master hyper 2x12 LED turbo argb | 5x12cm fan rgb intake | 3x12cm fan rgb exhaust
Memory corsair vengeance LPX 2x32gb ddr4 3600mhz
Video Card(s) MSI RTX 3080 10GB Gaming Z Trio LHR TDP 370w| 572.16 WHQL | MSI AB v4.65 | RTSS v7.36
Storage NVME 2+2TB gen3| SSD 4TB sata3 | 1+2TB 7200rpm sata3| 4+4+5TB USB3 (optional)
Display(s) AOC U34P2C (IPS panel, 3440x1440 75hz) + speaker 5W*2 | APC BX1100CI MS (660w)
Case lianli lancool 2 mesh RGB windows - white edition | 1x dvd-RW usb 3.0 (optional)
Audio Device(s) Nakamichi soundstation8w 2.1 100W RMS | Simbadda CST 9000N+ 2.1 88W RMS
Power Supply seasonic focus gx-850w 80+ gold - white edition 2021 | APC BX2200MI MS (1200w)
Mouse steelseries sensei ten | logitech g440
Keyboard steelseries apex 5 | steelseries QCK prism cloth XL | steelseries arctis 5
VR HMD -
Software dvd win 10 home 64bit oem + full update 22H2
Benchmark Scores -
IMO, I'd much rather go for the MSI Gaming Trio OC or Suprim X version, especially the Gaming Trio OC White, because it has better durability, cooling, and aesthetics than the Ventus 3X OC.
 
Joined
May 11, 2018
Messages
1,456 (0.59/day)
Because of the shortages during Covid and the mining boom, GPU prices went up. For this gen, Nvidia deliberately launched with very little stock to keep them up.

It's unlikely to be sustainable longer term; once stock is plentiful, prices could well collapse, but we'll see.

I actually predict we won't even see "plentiful stock" this generation, or the next ones, if demand for server AI components remains this high.

Nvidia can play the De Beers of the gaming GPU world - they create the hype, they develop the desirable product, but then they create the scarcity that drives prices up.

I think they'd rather not even bother making gaming GPUs this generation - every single one made is one fewer potential $70,000 AI accelerator.

But I think they know this hype won't last forever; even without an AI bubble collapse, there will be more competition in this lucrative business. So they remain in the gaming ring, and if the AI bubble bursts or demand specifically for Nvidia accelerators lessens, they can suddenly offer an amazing price/performance increase, just as they did with the RTX 30x0 series compared to the previous generation - that $699 RTX 3080 was a direct result of low revenue after the crypto crash in early 2020 and the abysmal sales of RTX 20x0 cards. It was only after everything was finalized that the new crypto-mining surge shafted us gamers.

So for the foreseeable future I don't expect any good value for gamers; we aren't needed. Cards will remain scarce, AIBs will bitch and moan, and several will go out of business, just like EVGA...
 

broodro0ster

New Member
Joined
Feb 20, 2025
Messages
1 (0.50/day)
Thanks for the review. These cards also seem to OC well.
I noticed the remark about memory overclocking and the 375 MHz cap. Is this also the case for the 5080 and 5090?
 
Joined
May 13, 2008
Messages
918 (0.15/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
I'm confused. VRAM consumption was never part of my reviews

Did you switch from allocation to usage in game reviews? I know it was something like that. Because *both* are important. You can't just go by usage, because there are too many variables from swapping (or not).

If you go only by usage, you need to use games that will also overflow the buffer. Do you understand why this is important?

_______________________________
RE: margins.

A 4nm wafer is ~$15k. RAM costs about as many dollars as its speed (per 2GB module; 3GB probably about 25% more if the ratio is similar to GDDR6)... Hmm. (At least) 100 good dies per wafer... so 150 + (8)*28 = yep, $750!

Oh wait. HALF THAT. YES. HALF THAT. Oh wait. That's the 5080. LESS THAN HALF THAT here, because the 5070 Ti is salvage. OH WAIT. A 24GB 5080 would only have a 200% margin, but they really needed that extra 50%.

They're hurtin'.

OH WAIT. That's us. nVIDIA making a GD killing.
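For anyone who wants to check that napkin math, a rough sketch with the same assumed inputs (none of these are confirmed costs, and it ignores the board, cooler, power delivery, assembly, validation and R&D):

```python
# Back-of-envelope BOM sketch using the figures assumed in the post above.
# All inputs are thread guesses, not confirmed NVIDIA/TSMC/Micron pricing.

wafer_cost = 15_000          # USD per 4 nm wafer (assumed)
good_dies_per_wafer = 100    # "(At least) 100 good dies per wafer"
gddr7_module_cost = 28       # USD per 2 GB module, ~$1 per Gbps (assumed)
modules = 8                  # 8 x 2 GB = 16 GB on a 256-bit bus

die_cost = wafer_cost / good_dies_per_wafer   # ~$150
memory_cost = modules * gddr7_module_cost     # ~$224
partial_bom = die_cost + memory_cost          # ~$374

msrp = 750
print(f"die ~${die_cost:.0f} + memory ~${memory_cost:.0f} = ~${partial_bom:.0f}")
print(f"${msrp} MSRP is about {msrp / partial_bom:.1f}x this partial BOM")
```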
 
Joined
Dec 12, 2012
Messages
811 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
Lots of games use greater than 16GB, especially at 4k. Some 1440p.

By use, do you mean allocate or actually need?

I don't think I've ever seen anyone showcasing VRAM-related stuttering on a 16 GB card, and definitely not at settings offering playable framerates. You're not going to be enabling path tracing in native 4K on a 5080, even if it had 24 GB.
I've seen a few examples of games spilling over 12 GB (maxed out Hogwarts Legacy would be one), but also not at settings offering playable performance.
The only inexcusable amount is 8 GB. A 5060 with this amount of memory should not even be allowed to launch. They should wait for 3 GB modules and make it 12 GB.

Yes, we should be getting more VRAM at these prices, but to say that we actually need more than 16 GB is not accurate. Even when the PS6 comes out, it'll take a few years before games start fully utilizing more memory. All the PS5 cross-gen games had no problems running on 8 GB.
 
Joined
May 13, 2008
Messages
918 (0.15/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
By use, do you mean allocate or actually need?

I don't think I've ever seen anyone showcasing VRAM-related stuttering on a 16 GB card, and definitely not at settings offering playable framerates. You're not going to be enabling path tracing in native 4K on a 5080, even if it had 24 GB.
I've seen a few examples of games spilling over 12 GB (maxed out Hogwarts Legacy would be one), but also not at settings offering playable performance.
The only inexcusable amount is 8 GB. A 5060 with this amount of memory should not even be allowed to launch. They should wait for 3 GB modules and make it 12 GB.

Yes, we should be getting more VRAM at these prices, but to say that we actually need more than 16 GB is not accurate. Even when the PS6 comes out, it'll take a few years before games start fully utilizing more memory. All the PS5 cross-gen games had no problems running on 8 GB.

I mean making it so the game doesn't stutter and struggle and/or have problems with swapping (which can sometimes show up as graphical glitches/dithering), which people sometimes blame on the games.

If you think about it for a second, you'll get it. You've likely seen it, especially in Hogwarts. Of course there are ways to limit these things from happening (one or the other), but there's almost always a limitation.

You are absolutely not correct that 12GB is a good amount for current feature sets. Even W1zzard changed his monologue in the current conclusion to reflect this. Pretty sure it used to be 8, then 12, now 16GB.

Of course, all of that depends upon the actual resolutions and quantity of the textures, not to mention other features that use VRAM (like FG/RT/etc.).

W1zzard, again, PURPOSELY DOES NOT SHOW THIS BECAUSE HE THINKS IT MEANS A GAME IS BADLY OPTIMIZED. In reality, his settings choices mean the game is using low-res textures and/or swapping more often, if it even can.

Go look it up; pretty sure HUB showed this a long while back. Again, some people just go 'why does that texture look funny?'. Because RAM, that's why.

I'll ask you this: when next-gen GPUs arrive, what configs do you think they will be? I would assume 16GB (low-end, 128-bit), 18GB (192-bit), and 24GB+ (256-bit+). That is a 16GB minimum. Do you disagree?
Do you expect those 128-bit cards to be running maxed-out 4K RT? Or rather baseline 1080p? Maybe that, maybe less than that upscaled? 1440p upscaled is 960p. Again, watch for this on the 5070.
Mins. Not averages, W1zzard.
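To make the resolution math concrete, a minimal sketch of the internal render resolution behind the usual upscaler presets (the 2/3 "Quality" scale is the standard DLSS/FSR factor; the other presets are approximate and only there to illustrate the point):

```python
# Output resolution vs internal render resolution for common upscaler presets.
# The 2/3 "Quality" scale is the standard DLSS/FSR factor; Balanced and
# Performance values are approximate, included only for illustration.

PRESET_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def render_resolution(width, height, preset="Quality"):
    s = PRESET_SCALE[preset]
    return round(width * s), round(height * s)

print(render_resolution(2560, 1440))   # (1707, 960)  -> "1440p upscaled is 960p"
print(render_resolution(3840, 2160))   # (2560, 1440) -> 4K Quality renders at 1440p
```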

Now think of a PS6. How much RAM do you think it will have? How much will be devoted to graphics? Do you think it will run at 1440p all the time?
Keep dreaming... considering this card can't even hold 1440p RT minimums... It'll probably be 1080p upscaled... and use at least 16GB for the GPU. Again, you can see this in stuff like Spider-Man. 1080p? Yes. 1440p? No.

Current consoles run games like this, like Star Wars Outlaws (which are likely set up for decent settings/resolution on next-gen), at 720p. That's seven two zero p. 8GB. Seven twenty pee.
One thousand two hundred and eighty by seven hundred and twenty pixels.

Again, this will make more and more sense as more and more engines are adapted to take advantage of next-gen consoles (as Snowdrop already is, apparently).
Those times are fast approaching, as already shown by RT options on PC that aren't in the current-gen console versions.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,249 (3.72/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
I noticed the remark about memory overclocking and the 375 MHz cap. Is this also the case for the 5080 and 5090?
Yes

Did you switch from allocation to usage in game reviews? I know it was something like that. Because *both* are important. You can't just go by usage, because there are too many variables from swapping (or not).
In game reviews? I've been testing VRAM since forever, but only allocation. This is a graphics card review, not a game review though.

There is no way to track usage. Each frame is different; each one touches different resources, many of which come from the various caches in the GPU, and this changes from frame to frame, even when standing still. GPU vendors have software that can capture the state of the GPU and everything related, and they can replay the command buffers and analyze a single frame. This is how they design their next-gen GPUs. None of that is accessible outside of these companies.
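To illustrate the distinction, here is a minimal sketch (assuming the pynvml package, i.e. nvidia-ml-py, and an NVIDIA GPU) of the device-level counter that external monitoring tools can read. It reports allocated/resident VRAM, not the per-frame working set described above:

```python
# Read the device-level VRAM counter that monitoring tools expose. This is
# allocation/residency, not per-frame "usage", which is only visible to
# vendor-internal frame-capture tooling.
# Assumes the pynvml package (nvidia-ml-py) and an NVIDIA GPU/driver.

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)        # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"allocated: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```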
 
Joined
Feb 18, 2005
Messages
6,144 (0.84/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) Dell S3221QS(A) (32" 38x21 60Hz) + 2x AOC Q32E2N (32" 25x14 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G604
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
I mean making it so the game doesn't stutter and struggle and/or have problems with swapping (which can sometimes show up as graphical glitches/dithering), which people sometimes blame on the games.
Your whole take on this is quite literally retarded. Instead of blaming crap developers for making s**tty PC ports of console games, you blame W1zz for blaming those developers. I cannot stress enough how mind-bogglingly braindead your view is; I've come across some dumb takes in my 2 decades online, but this one is up there with the stupidest.

It is not NVIDIA's fault that developers make s**tty ports, nor is NVIDIA required - and nor should they be - to build their hardware to cater for s**tty ports.

Stop posting dumb crap and go sit in the dunce corner and for once THINK about what you're posting, BEFORE you post it.
 
Joined
Dec 31, 2020
Messages
1,176 (0.78/day)
System Name Dust Collector Mower 50
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case MATREXX 50
Power Supply SF450
In reality, it is using low-rez textures and/or swapping more often, if it even can.

Go look it up, pretty sure HUB showed this a long while back? Again, some people just be like 'why texture funny-lookin'?'. BC RAM, that's why.
That may have been just one game, and it could be that the textures simply weren't preloaded yet. And the whole picture is washed out, as if the viewing distance is low.
Most of the time I can't even hit 6-8 GB of usage while getting 100 FPS.
 
Joined
Mar 27, 2018
Messages
171 (0.07/day)
Processor AMD Ryzen 5 3600
Motherboard Asus ROG Strix X470-F
Cooling Reeven RC-1205
Memory G.Skill F4-3200C16D-16GTZKW TridentZ 16GB (2x8GB)
Video Card(s) Powercolor x470 red devil
Storage Mushkin MKNSSDPL500GB-D8 Pilot 500GB
Display(s) Samsung 23"
Case Phanteks PH-EC300PTG
Audio Device(s) SupremeFX S1220A
Power Supply Super Flower SF-650F14MT(BK) Leadex 650W 80 Plus Silver
Mouse Cooler master m530
Keyboard Cheapo
People are putting way too much faith in AMD. We're talking about the AMD that loves shooting itself in the foot every time. So I expect AMD to screw up the prices, then quickly drop them again after 1-2 months, just to piss off the early buyers and sour the relationship with the people who were holding out and hoping for the best.
 
Joined
May 13, 2008
Messages
918 (0.15/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Yes


In game reviews? I've been testing VRAM since forever, but only allocation. This is a graphics card review, not a game review though.

There is no way to track usage. Each frame is different; each one touches different resources, many of which come from the various caches in the GPU, and this changes from frame to frame, even when standing still. GPU vendors have software that can capture the state of the GPU and everything related, and they can replay the command buffers and analyze a single frame. This is how they design their next-gen GPUs. None of that is accessible outside of these companies.
I know that, which is why trusting usage figures is sketchy. Isn't that literally what PresentMon/RTSS attempt to do though, usage/allocation?
Your whole take on this is quite literally retarded. Instead of blaming crap developers for making s**tty PC ports of console games, you blame W1zz for blaming those developers. I cannot stress enough how mind-bogglingly braindead your view is; I've come across some dumb takes in my 2 decades online, but this one is up there with the stupidest.

It is not NVIDIA's fault that developers make s**tty ports, nor is NVIDIA required - and nor should they be - to build their hardware to cater for s**tty ports.

Stop posting dumb crap and go sit in the dunce corner and for once THINK about what you're posting, BEFORE you post it.

I find this quite humorous. A port using next-gen features the current consoles can't support is a crappy port? Or one using higher-res textures and keeping them loaded? Which is it? I'm lost.
It's not the game. It's the cards. You're blaming the wrong thing.

I'm sorry you literally can't see the limitations imposed on each card. They're very apparent if you're looking.
Again, I challenge W1zzard to post 1440p RT minimums for games that average ~60 fps at 1440p RT, and 4K quality-upscale minimums.
What is less than 1440p? 1080p. That's correct. Or 960p if you use quality upscaling to 1440p, which is less than 1080p. Do you need to run those at 1440p/60fps? You do not. Are they selling it this way? Yes.
Is W1zzard capitulating to this? You be the judge.
Will AMD be able to run 1080p? Probably yes. This is why their card makes just as much sense.

Is this the devs' fault, or nVIDIA keeping you on a string to upgrade when, again, the thresholds are very apparent? They literally inch everything along. If you can't see this, you're blind.

You can see where 45TF and 16GB both become requirements (the 4070 Ti/5070 are both below that on purpose). This is, and will soon become, more apparent.
You can see where 60TF and >16GB both become requirements across multiple scenarios. Are basically all their cards over/under this? Yeeeeppppp. Can you OVERCLOCK a 5080 above 60TF? Yes. Is it RAM-limited? Yes.
Is the 9070 XT literally hopping over the 45TF/12GB limitation (that AMD themselves put on the 7800 XT at 45TF, even with 16GB)? YEEEEEP. Will this make playable a lot of settings that are beyond 12GB cards but the same as GB203? PROBABLY.
Is it still below 60TF raster? Yep. Is the 7900 XT limited to ~60TF? YEEPPP. Is that where you need more than 16GB? YEEEP. Again, it's very purposeful product segmentation... nvidia's is just more gross.

I don't like the idea of AMD selling 1080p->4k upscaling as 4k either. That's my point. But literally nothing can do 1440pRT 60mins outside of incredibly expensive 90 cards. This will eventually change.
But when it does, do you not expect the bottom to rise? Would it not make sense if >60TF/>16GB then becomes what you want for even 1080pRT upscaling w/ FG etc? I think this makes tremendous sense.
This is why their 'middle-ground' card may not hold up for the settings you want there, either. This is where (for right now) you would want a higher-clocked 24GB 5080, which doesn't exist RN.
But it surely will, because they will create that gap, and then you will see it more clearly. Before long you will need more raster (like a 4090+) for 1440p. It's pretty clear how things will probably evolve if you look at it.
And nVIDIA will be there, to sell you every damn card they can below that (and those) threshold(s) until they have to do it.
Ask yourself why an 8-cluster (~12288 SP) 24GB nvidia card does not exist when it would clear many hurdles AD103/GB203 do not. 9728, 10240, 10752. All 16GB. All with different limitations.
AMD too apparently...but at least they're trying to make a well-balanced card for what's affordably possible this generation. They are more a victim of circumstance, than anything else. For nvidia; clearly planned.
 
Joined
May 11, 2018
Messages
1,456 (0.59/day)
No, it's 750 dollars; the remaining amount is AI-generated.


"DLSS Multi Price Generation generates up to three additional card values per traditionally calculated price, working in unison with the complete suite of scalping technologies to multiply price by up to 8X over traditional brute-force MSRP. This massive price improvement on GeForce RTX 50 series graphics cards unlocks stunning multiples of original value for a $1000 - $5000 Generally Unavailable Gaming Experience!"
 
Joined
Dec 31, 2020
Messages
1,176 (0.78/day)
System Name Dust Collector Mower 50
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case MATREXX 50
Power Supply SF450
Ask yourself why an 8-cluster (~12288 SP) 24GB nvidia card does not exist when it would clear many hurdles AD103/GB203 do not. 9728, 10240, 10752. All 16GB. All with different limitations.
AMD too apparently...but at least they're trying to make a well-balanced card for what's affordably possible this generation. They are more a victim of circumstance, than anything else. For nvidia; clearly planned.

N3P brings +60% logic density, therefore we're looking at 7680->12288C 18GB for the 6070/Ti, 6080 with 16384C 24GB. Rubin is released earlier, like Pascal, so there's no need for a super on N4. What for.
 
Joined
Dec 14, 2011
Messages
1,303 (0.27/day)
Location
South-Africa
Processor AMD Ryzen 9 5900X
Motherboard ASUS ROG STRIX B550-F GAMING (WI-FI)
Cooling Noctua NH-D15 G2
Memory 32GB G.Skill DDR4 3600Mhz CL18
Video Card(s) ASUS GTX 1650 TUF
Storage SAMSUNG 990 PRO 2TB
Display(s) Dell S3220DGF
Case Corsair iCUE 4000X
Audio Device(s) ASUS Xonar D2X
Power Supply Corsair AX760 Platinum
Mouse Razer DeathAdder V2 - Wireless
Keyboard Corsair K70 PRO - OPX Linear Switches
Software Microsoft Windows 11 - Enterprise (64-bit)
"MSRP Promised"

I take it they will never increase the price and will keep stocking them going forward? They won't "magically" disappear from the product stack, right? :)
 
Joined
May 13, 2008
Messages
918 (0.15/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
N3P brings +60% logic density, therefore we're looking at 7680->12288C 18GB for the 6070/Ti, 6080 with 16384C 24GB. Rubin is released earlier, like Pascal, so there's no need for a super on N4. What for.
60%? My understanding is ~33% (according to listed metrics).
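Taking both density figures at face value (and assuming the same die area with everything scaling like logic, which is optimistic), a quick sketch of the implied core counts:

```python
# Arithmetic check on the two density claims. The quoted post assumes +60%
# logic density (hence its 7680 -> 12288 figure); ~33% is the figure I've
# seen listed. Both are assumptions, not confirmed specs, and density gains
# never translate 1:1 into core counts anyway.

base_cores = 7680
for gain in (0.60, 0.33):
    print(f"+{gain:.0%} density -> ~{base_cores * (1 + gain):.0f} cores")
```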

I still think nvidia will do 12288/9216/6144. This is because their cache is set up for each of those (256/192/128-bit) and for 3780 MHz / 36 Gbps; 3780 MHz being the clock speed of Apple's first 3nm chip, and 36 Gbps because they could use Micron.
Yes, Apple used N3B and nVIDIA will probably use N3E, but nvidia also likes to keep clock speeds conservative for power/size and/or to have the ability to sell that clock improvement across more products over time.
Also, again, at ~3700 MHz you could argue 24GB of RAM could become a limitation in some respects, but we use more and more compute for things like upscaling, so it'll probably be fine most of the time.

Also, again, 12288 is 8 clusters... whereas the 4080 and 4080 Super were kinda like 6.5ish (6*1536 plus 512 or 1024) and the 5080 is 7 (7*1536). Why, when the 4090/5090 are 16? I dunnnnnnoooo. Totally not because people would keep that.
Had AMD made a 7900 XTX with good RT and/or good RT at high clocks, it might've happened on 4/5nm. Instead we have to settle for just N48, but it still makes just as much sense as most of nVIDIA's stack right now, imho.
And much more sense than the 4070 Ti/5070.

You ask 'what for'? And I would say 'because they didn't do it in the first place' and 'so they could sell it again and not have it suck'.
Like I said, there are a ton of instances where you can see a 5080 would benefit from 18-20GB of RAM. >60TF/>16GB is a key metric for a lot of things, and that card would not suck long-term (theoretically).
It also won't be as high-end long-term as people might think, but it should age gracefully. Current performance scaling at high clocks is bad because of the 16GB. This is why the stock clock is low (2640 MHz when it's capable of at least ~3165).
It'll probably be replaced by 192-bit/18GB cards next-gen that will, for all intents and purposes, perform the same, but it's a stop-gap. All of these things are stop-gaps, imo. nVIDIA's just dragging them out. Because $.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,249 (3.72/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
The headline has been changed and you have been exposed as an extreme Nvidia brand loyalist. Good job!
lol i'm just messing with you people. 7 more minutes until stores open

edit:

3:04pm
[Screenshot: GeForce RTX 5070 Ti listings on Newegg.com]

3:19pm
[Screenshot: GeForce RTX 5070 Ti listings on Newegg.com]

3:35pm
[Screenshot: GeForce RTX 5070 Ti listings on Newegg.com]

3:50pm
no more cards in stock
 

Joined
May 18, 2009
Messages
3,029 (0.53/day)
Location
MN
System Name Personal / HTPC
Processor Ryzen 5900x / Ryzen 5600X3D
Motherboard Asrock x570 Phantom Gaming 4 /ASRock B550 Phantom Gaming
Cooling Corsair H100i / bequiet! Pure Rock Slim 2
Memory 32GB DDR4 3200 / 16GB DDR4 3200
Video Card(s) EVGA XC3 Ultra RTX 3080Ti / EVGA RTX 3060 XC
Storage 500GB Pro 970, 250 GB SSD, 1TB & 500GB Western Digital / lots
Display(s) Dell - S3220DGF & S3222DGM 32"
Case CoolerMaster HAF XB Evo / CM HAF XB Evo
Audio Device(s) Logitech G35 headset
Power Supply 850W SeaSonic X Series / 750W SeaSonic X Series
Mouse Logitech G502
Keyboard Black Microsoft Natural Elite Keyboard
Software Windows 10 Pro 64 / Windows 10 Pro 64
Joined
Feb 21, 2006
Messages
2,298 (0.33/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5013 AM4 AGESA V2 PI 1.2.0.Cc.
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) XFX Radeon RX 7900 XTX Magnetic Air (24.12.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 20TB
Display(s) LG 34GP83A-B 34 Inch 21: 9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c 5800X3D https://valid.x86.fr/b7d
Joined
Apr 26, 2008
Messages
1,150 (0.19/day)
Location
Berkshire
System Name Staggered
Processor 7800X3D (XSPC Rasa)
Motherboard MSI B650 Carbon WiFi
Cooling RX360 (3*Scythe GT1850) + RX240 (2*Scythe GT1850) + Laing D5 Vario (with EK X-Top V2)
Memory 32GB (2x16GB) DDR5 6000MHz CL30 1.4v
Video Card(s) XFX RX 6800XT 16GB Speedster MERC319
Storage Samsung 970 EVO Plus 1TB + Crucial P3 Plus SSD 2TB
Display(s) Flatron W3000H 2560*1600
Case Cooler Master ATCS 840 + 1*120 GT1850 (exhaust) + 1*230 Spectre Pro + Lamptron FC2 (fan controller)
Power Supply NZXT C1000 (1000W)
Software Windows 10 Pro 64bit
Is it me or are people more disingenuous nowadays when discussing review results? Linking to a page that supports their argument, when the literal next or previous page disagrees with it. I usually read comments in a popcorn capacity but that's just becoming more depressing. Or I'm getting old, maybe both.

According to this review:
- It's equivalent to a 7900XTX in raster (bit better 1080p, bit worse 1440p, similar 4k for some reason). 15-28% faster than 4070ti
- 33-42% faster than 7900xtx in raytracing. 7-15% faster than 4070 ti (meh, guess rt really does rely on fixed function hw)
- MSRP is irrelevant. The graph people link to, to say "look how good/bad the price/performance is", literally has multiple bars for different prices. The cognitive dissonance is real.

My thoughts:
- GPU prices have beaten me into submission over the years. I went from £200-300 in the early 2000s to £380 for a GTX 1070 (2016) and £500 for my current 6800 XT (2023), and I think I'm willing to go to £750 next time...
- I don't care about RT unless a game's conventional lighting is actually trash*, in which case I'll turn on the minimum required settings, performance willing. Performance willing being key; RT is useless outside of the top SKUs (in my games). The 5070 Ti just ekes out 62 fps, so RT might be slightly growing on me. Still a 50% performance hit, though, on what is supposed to be the RT industry leader's latest card.
- Don't care for upscaling or frame generation. When it first came out I thought it might be good enough (for me) around DLSS5/FSR5. Looks like we're on track for that. Anything other than MSAA is already "fake", after all. So I'm not too tribal. RAM and latency improvements have been seen; I expect more next gen.
- Not sure where I stand on VRAM. More is of course better, but that's assuming equal speed and price. For gaming, 8GB still seems fine with some performance compromise, 12GB has none outside of edge cases, and 16GB still isn't enough for those edge cases. Not sure what to think; I only occasionally use the GPU for other things. I've religiously kept up with Wiz's VRAM usage tests since I got the 6800 XT (to validate my purchase decision xD), and I'm not really feeling a push for more. Also, DirectStorage exists and SSDs keep getting faster.
- 7900 XTs (13% slower) are in stock for £650 and XTXs for £850. RTX 40xx cards don't exist anymore, so they're irrelevant to my decision-making process. Not up in the UK yet, but at £750 it would net me 56% more performance for 50% more money, versus the 153%/32% going from the 1070 to the 6800 XT (see the quick sketch after this list)...
- More appealing than an MSRP 5080 (+12% fps, +33% $) but still not that appealing.
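A quick sketch of that performance-per-price arithmetic, using the relative performance estimates and GBP prices from the bullets above (rough personal figures, not review data):

```python
# Rough perf-vs-price arithmetic behind the bullets above. The relative
# performance numbers are this post's estimates, not review measurements.

def uplift_pct(old, new):
    return (new / old - 1) * 100

# (relative performance old -> new), (price in GBP old -> new)
upgrades = {
    "GTX 1070 -> RX 6800 XT": ((100, 253), (380, 500)),
    "RX 6800 XT -> RTX 5070 Ti": ((100, 156), (500, 750)),
}

for name, (perf, price) in upgrades.items():
    print(f"{name}: +{uplift_pct(*perf):.0f}% performance for "
          f"+{uplift_pct(*price):.0f}% price")
```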

I'm not willing to upgrade for less than a 50% performance uplift; this barely hits that, and the 9070 XT is expected to land between the 7900 XT and XTX, so it won't hit that. I think this gen is a bust for me. I bought a Quest 3 last year; it's "only" 2064 x 2208 per eye, but VR pixels are warped, so you want to render at higher resolutions to essentially AA the edges. Upscaling artifacts are "slightly" more noticeable when next to your eyeball, so if I already don't like it on a flatscreen... Also, AV1 would be nice. Two years until next-gen GPUs, this'll be fun.

I don't think the future is much better. Nvidia stopped "old" gen production sooner than ever before (?), so prices won't go down over time. AMD won't do that for RDNA3 this gen, but most likely will for RDNA4 once UDNA rolls around. Currently their AI and gaming architectures are separate, so they have to produce both. A big reason for unifying is to cut costs, and cutting RDNA4 to streamline their production chain makes sense.

I don't think Nvidia cut 40xx early just to make 50xx look like better value, although that is part of the reason. Going forward I expect the decision to cut or not to be based on how many people jump on the latest AI cards arch and how many people prefer to save money with older ones. No point making low profit gaming GPUs for an arch that no one's buying AI GPUs for. Expect the same decision making process from AMD come UDNA.

* Specifically reflections in CP2077, which reflect your nephew's Minecraft world with RT off (got to save money somewhere, I guess). Not too much of a hit on the 6800 XT for just reflections (last I played, around the 2.0 launch).
 

Mindweaver

Moderato®™
Staff member
Joined
Apr 16, 2009
Messages
8,355 (1.44/day)
Location
Charleston, SC
System Name Tower of Power / Delliverance
Processor i7 14700K / i9-14900K
Motherboard ASUS ROG Strix Z790-A Gaming WiFi II / Z690
Cooling CM MasterLiquid ML360 Mirror ARGB Close-Loop AIO / Air
Memory CORSAIR - VENGEANCE RGB 32GB (2x16GB) DDR5 7200MHz / DDR5 2x 16gb
Video Card(s) ASUS TUF Gaming GeForce RTX 4070 Ti / GeForce RTX 4080
Storage 4x Samsung 980 Pro 1TB M.2, 2x Crucial 1TB SSD / NVM3 PC801 SK hynix 1TB
Display(s) Samsung 32" Odyssy G5 Gaming 144hz 1440p, 2x LG HDR 32" 60hz 4k / 2x LG HDR 32" 60hz 4k
Case Phantek "400A" / Dell XPS 8960
Audio Device(s) Realtek ALC4080 / Sound Blaster X1
Power Supply Corsair RM Series RM750 / 750w
Mouse Razer Deathadder V3 Hyperspeed Wireless / Glorious Gaming Model O 2 Wireless
Keyboard Glorious GMMK with box-white switches / Keychron K6 pro with blue swithes
VR HMD Quest 3 (512gb) + Rift S + HTC Vive + DK1
Software Windows 11 Pro x64 / Windows 11 Pro x64
Benchmark Scores Yes
Oh look at that, all of the ones for $749 are gone on Newegg... lol, but if you really want one you can get one off of eBay in a few minutes...

EDIT: It's early... did they ever show in stock? Because they were always showing out of stock.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,249 (3.72/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
This card is $909.99 at MC now. Did MSI lie or is MC price gouging?
$829 at Newegg and in stock... still much more than the promised $750
 
Joined
Sep 1, 2009
Messages
1,283 (0.23/day)
Location
CO
System Name 4k
Processor AMD 5800x3D
Motherboard MSI MAG b550m Mortar Wifi
Cooling ARCTIC Liquid Freezer II 240
Memory 4x8Gb Crucial Ballistix 3600 CL16 bl8g36c16u4b.m8fe1
Video Card(s) Nvidia Reference 3080Ti
Storage ADATA XPG SX8200 Pro 1TB
Display(s) LG 48" C1
Case CORSAIR Carbide AIR 240 Micro-ATX
Audio Device(s) Asus Xonar STX
Power Supply EVGA SuperNOVA 650W
Software Microsoft Windows10 Pro x64
Oh look at that, all of the ones for $749 are gone on Newegg... lol, but if you really want one you can get one off of eBay in a few minutes...

EDIT: It's early... did they ever show in stock? Because they were always showing out of stock.
No, they never went in stock; they stayed out of stock. There never were any cards for $749, and if they did show up, it was for 30 seconds or less.
 