
RDNA4 (RX 9070XT / 9070) launch announced for (delayed to) March 2025

Joined
Jun 11, 2019
Messages
659 (0.32/day)
Location
Moscow, Russia
Processor Intel 12600K
Motherboard Gigabyte Z690 Gaming X
Cooling CPU: Noctua NH-D15S; Case: 2xNoctua NF-A14, 1xNF-S12A.
Memory Ballistix Sport LT DDR4 @3600CL16 2*16GB
Video Card(s) Palit RTX 4080
Storage Samsung 970 Pro 512GB + Crucial MX500 500gb + WD Red 6TB
Display(s) Dell S2721qs
Case Phanteks P300A Mesh
Audio Device(s) Behringer UMC204HD
Power Supply Fractal Design Ion+ 560W
Mouse Glorious Model D-
Probably because it's just a flippant Twitter post with no actual date in it. The guys here at TPU are probably waiting for something solid. (Just my guess)
MLID's, Kepler's and others' solids, liquids and occasional gases have been counted as good enough for a long time around here, so the news guys are probably just busy.
 
Joined
May 13, 2008
Messages
773 (0.13/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
I can guarantee that it is, which is another example of how higher marketshare works for the entity controlling it, via economies of scale.


Now is absolutely not the time for AMD to be pulling arbitrary product segmentation bulls**t. They absolutely cannot afford to piss off consumers with a bait-and-switch, especially when they have no idea how the competition will perform, because if they guess wrong and end up with a card that's now too slow, they'll have to adjust something else (price) to compensate.

Yeah...nVIDIA making 100% margins...no cap.

As for AMD, we'll see. :D

FWIW, there is already product segmentation bullshit. Odds are, the bullshit will very much continue. New world, new AMD.

9070xt comp is absolutely 5070 (and it really shouldn't be)...and it'll never catch a 5070ti.
Again, I really don't think 9070xt should exist. They should just clock 9070 higher as it truly should be able to compete with 5070.

They are in fact already making artificial tiers, clearly shown by the clocks of the 9070 being so low ootb. We'll have to see if they limit either of them or not. 3.2/3.3/3.4 for the 9070....none would surprise me.
Its competition is really nothing....its competition is the 9070 XT.

IMHO, this stack is ridiculous and clearly intended to hit certain prices and/or margins rather than actually being good ootb designs (who knows about oc/pl).

The sad part is I think they really wanted 9070xt to be competitive with $600 5070 and a future card to be a cheaper alternative to a $800 5070ti. I think nVIDIA's pricing screwed them up.

I think they wanted a future card to replace 7900xt pricing, and in reality it'll probably have to be $600 or so (imho). All the more reason (having to lower prices) to create that segmentation.

Look, listen...I don't want it either! I'm just sayin'...wouldn't surprise me given what they've done recently with other products. Remember when 7900gre couldn't overclock the ram (to give it 10% extra perf)?

This is why I want you all to call them on this if they try it. I think the community got that memory clock unlocked...never know...if the PLs suck on these cards maybe AMD will listen again.

This is why these forums matter.
 
Joined
May 29, 2017
Messages
484 (0.17/day)
Location
Latvia
Processor AMD Ryzen™ 7 5700X
Motherboard ASRock B450M Pro4-F R2.0
Cooling Arctic Freezer A35
Memory Lexar Thor 32GB 3733Mhz CL16
Video Card(s) PURE AMD Radeon™ RX 7800 XT 16GB
Storage Lexar NM790 2TB + Lexar NS100 2TB
Display(s) HP X34 UltraWide IPS 165Hz
Case Zalman i3 Neo + Arctic P12
Audio Device(s) Airpulse A100 + Edifier T5
Power Supply Sharkoon Rebel P20 750W
Mouse Cooler Master MM730
Keyboard Krux Atax PRO Gateron Yellow
Software Windows 11 Pro
Can't wait for the RTX 5080 reviews (January 30); they will tell a lot about the RTX 5070 and Ti.
 
Joined
May 13, 2008
That's the Red Devil. They've also got the Hellhound and Reaper, both dual slot, both with 2 8-pins, both present at CES.

It switched the power design from 3x8-pin to 2x8-pin. That was the point. Why would they do that if not for a lowered PL/clock limit in the BIOS? :confused:

edit: I guess they said it could have been a mistake and used a 9070 for the pic. My bad. Still...Who knows. Nothing would surprise me.
 
Joined
May 29, 2017
9070xt comp is absolutely 5070 (and it really shouldn't be)...and it'll never catch a 5070ti.
Again, I really don't think 9070xt should exist. They should just clock 9070 higher as it truly should be able to compete with 5070.
Current rumors suggest that the RTX 5070 12GB will be slightly weaker than the RTX 4070 Ti 12GB when DLSS4 is off. I think the RX 9070 XT will be somewhere between the RTX 5070 and RTX 5070 Ti if we look at current rumors.....

RTX 5070 12GB 192bit is a clear mid-range **60 class GPU (GTX 1060, $199-249)
 
Joined
May 13, 2008
Can't wait for the RTX 5080 reviews (January 30); they will tell a lot about the RTX 5070 and Ti.

The 5070ti will be another wonderful bait and switch just like the 4070Ti Super. The card that will last exactly one gen.

Why, you ask? Well, you see...the transformer model of DLSS costs ~15% extra perf, versus ~8% for the CNN model (going from 1440p to 4k quality).

Why is that important? Well, to keep the 4070 Ti Super from hitting 60fps targets.
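For illustration, here's a rough back-of-envelope of that overhead math. The 68fps baseline is invented; only the ~8%/~15% costs come from the claim above.

```python
# Back-of-envelope: how a heavier upscaler pass can push a card under a
# 60fps target. The 68fps baseline is invented for illustration; only the
# ~8% (CNN) and ~15% (transformer) costs come from the discussion above.

def upscaled_fps(internal_fps: float, overhead: float) -> float:
    """Framerate after paying a fixed fractional cost for the upscaler pass."""
    return internal_fps * (1.0 - overhead)

base = 68.0  # hypothetical internal-res (1440p) framerate before upscaling to 4k

cnn = upscaled_fps(base, 0.08)          # older CNN model, ~8% cost
transformer = upscaled_fps(base, 0.15)  # transformer model, ~15% cost

print(f"CNN:         {cnn:.1f} fps")          # ~62.6 -> still clears 60
print(f"Transformer: {transformer:.1f} fps")  # ~57.8 -> now misses 60
```

Point being: a card that sat just above a 60fps target with the old model can land just below it with the new one, without getting any slower in absolute terms.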

Likewise, the 5070ti will likely have 5 SM....the same as 4070Ti. The difference is clusters could be 1792sp instead of 1536sp.

Sooooo.....Guess what the new RT standard will be come Rubin? If you guessed exactly out of the reach of the clocks of a 5SM nvidia design...you would be right.

Also, you know, games will start requiring ~18GB when using RT/RR/FG/DLSS.....but we don't talk about that yet bc 5080 owners gonna be pissed..

Well, outside Outlaws, which is a pretty good example of what to expect from next-gen console usage (I expect a general ~14/18 split between CPU/GPU using 32GB of shared GDDR7).

Or well, at least games on Snowdrop...like my beloved eventual Division 3.

Search your heart, you know it to be true. If you don't believe it, you clearly haven't bought a nVIDIA product before.

RTX 5070 12GB 192bit is a clear mid-range **60 Class card.
Correct. But...you know...so is (at least the potential of) the 9070. That's why it's clocked so low....so it will compete directly in some cases with the 5070 while allowing them to make it as cheap as possible.

Lowest clocks we really saw this gen (that weren't limited) were roughly 2715MHz from nVIDIA and 2718MHz from AMD...so 2700MHz should be pretty much every damn chip that kinda-sorta works on the wafer.
 
Joined
Jun 14, 2020
Messages
4,031 (2.40/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Also, you know, games will start requiring ~18GB when using RT/RR/FG/DLSS.....but we don't talk about that yet bc 5080 owners gonna be pissed..
Can you explain exactly what you mean by "games will start requiring 18gb of vram"? Cause I've got an 8gb card, are you telling me it won't play those games? Or you just mean I have to...drop some settings?
 
Joined
Feb 18, 2005
Messages
5,889 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) Dell S3221QS(A) (32" 38x21 60Hz) + 2x AOC Q32E2N (32" 25x14 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G604
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Can you explain exactly what you mean by "games will start requiring 18gb of vram"? Cause I've got an 8gb card, are you telling me it won't play those games? Or you just mean I have to...drop some settings?
Ignore that drivel, it's just the standard "less VRAM is bad therefore NVIDIA is evil" nonsense that has been thoroughly debunked time and time again here. Last time I checked, doing more with less was called "good engineering".
 
Joined
May 13, 2008
Can you explain exactly what you mean by "games will start requiring 18gb of vram"? Cause I've got an 8gb card, are you telling me it won't play those games? Or you just mean I have to...drop some settings?

Sorry, I'm generally speaking of 1440p->4k upscaling. The norm for a lot of enthusiasts...and both of the companies in question know this is a lot of people's targets.

That's why the goalpost moves in lots of different ways...and these cards are purposely bottlenecked for those future scenarios compared to future designs (and their relative clockspeeds).

Also, a 192-bit card with 18GB is cheaper to make than a 256-bit one with 16gb...

IMO, 8GB is not enough even for 1080p gaming...but one can make anything work, depending on whether you can handle upscaling to 1080p and/or lowering settings.
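The bus-width arithmetic behind the "192-bit with 18GB" point works out like this. Quick sketch: each GDDR chip sits on a 32-bit slice of the bus, and the 2GB/3GB densities are the common GDDR6 and larger GDDR7 module sizes.

```python
# Why a 192-bit card can ship 18GB while a 256-bit card ships 16GB:
# each GDDR chip occupies a 32-bit slice of the memory bus, so capacity
# is (bus width / 32) * per-chip density. 2GB is the common GDDR6/GDDR6X
# density; 3GB modules are the larger GDDR7 density.

def vram_capacity_gb(bus_width_bits: int, chip_gb: int) -> int:
    chips = bus_width_bits // 32  # one 32-bit channel per memory chip
    return chips * chip_gb

print(vram_capacity_gb(192, 3))  # 6 chips x 3GB = 18GB on a 192-bit bus
print(vram_capacity_gb(256, 2))  # 8 chips x 2GB = 16GB on a 256-bit bus
print(vram_capacity_gb(192, 2))  # 6 chips x 2GB = 12GB (a 5070-style config)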
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
9,384 (4.04/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
Joined
Nov 13, 2024
Messages
124 (1.77/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5950X
Motherboard ASRock B550 Phantom Gaming 4
Cooling Peerless Assassin 120 SE
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVME & 2 x 4 TB SATA SSD in Raid 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
Sorry, I'm generally speaking of 1440p->4k upscaling. The norm for a lot of enthusiasts...and both the companies in question know this is a lot of peoples' targets.
Yet to have problems with my 3080 10GB at 1440p native (technically DLDSR 2.1) so.... not sure VRAM needs will jump that high. Or are we talking RT/ultra everything?
 
Joined
Feb 18, 2005
Joined
Aug 3, 2006
Messages
224 (0.03/day)
Location
Austin, TX
Processor Ryzen 6900HX
Memory 32 GB DDR4LP
Video Card(s) Radeon 6800m
Display(s) LG C3 42''
Software Windows 11 home premium
Really? According to who?

Are you guys really splitting hairs just to be argumentative? Would you really call someone with a 1080p pixio monitor and an 8gb 4060 an enthusiast? We all know what enthusiast means.
 
Joined
May 13, 2008
Ignore that drivel, it's just the standard "less VRAM is bad therefore NVIDIA is evil" nonsense that has been thoroughly debunked time and time again here. Last time I checked, doing more with less was called "good engineering".
LOL. If you call stuttering from running out of VRAM before running out of compute resources (which, granted, some don't mind as much as others) 'good engineering'.

Where I'm from we call it planned obsolescence.

Last time I checked W1z was still saying 8GB was enough. Yes, we've had this fight before, and yes I was right then too.

8GB is outdated, 1440p will need more than 12gb. Can't you see the intended separation? 16GB for 1440p. 18GB for all the crap we can turn on now. 24GB for actual 4k.

It took the baby blue Aussies to call bullshit and make people listen last time...which is weird, but whatever works. You can wait for them again IYW. Pretty sure they wised up and recommend 16GB now.

But like I said, it's all up to what you find 'acceptable'. I'm just saying for a maxed experience and general perf targets.

It's so amusing to me that every single generation people still fall for these obvious tells from nvidia. You don't hear about them, well...bc people like DF don't cover old products aging. Only the new shiny.

It's so predictable. Some people get it. Some people don't. I don't judge. Just trying to look out for y'all.

edit: I didn't mean to start an 'enthusiast' term fight. I apologize. I meant the 'common usage of 1440p->4k quality upscaling a lot of people use' for a high-end experience. No, that isn't everyone.

Most know what I mean, I'd gather.
 
Joined
Jun 14, 2020
IMO, 8GB is not enough even for 1080p gaming...but one can make anything work, depending on whether you can handle upscaling to 1080p and/or lowering settings.
Can you give an example of a game that can't be played at 1080p with 8gb vram? I wanna try it.

LOL.

8GB is outdated, 1440p will need more than 12gb. Can't you see the intended separation? 16GB for 1440p. 18GB for all the crap we can turn on now. 24GB for actual 4k.
So I should stop playing POE2 (just came out) and veilguard on my 8gb vram card cause it's outdated and can't play it. Even though it does in fact play them both, at 1440p...
 
Joined
Jun 14, 2020
Try Hogwarts with ray tracing and frame gen.
Finished it day one with RT maxed out and everything ultra bar the textures (high) at 3440x1440 with DLSS Q. That's on a 3060 Ti, and day one before the patches even.

Ignore that drivel, it's just the standard "less VRAM is bad therefore NVIDIA is evil" nonsense that has been thoroughly debunked time and time again here. Last time I checked, doing more with less was called "good engineering".
I don't get it, do people actually think that you need more than 16gb of vram to play games? I think they really don't have the slightest idea of what the word NEED even means. No other explanation.

Need, definition : require (something) because it is essential or very important rather than just desirable.
 
Joined
Feb 18, 2005
We all know what enthusiast means.
I've been in the PC space for over two decades at this point and would very much consider myself an enthusiast, yet am currently using a 4060 Ti. Does that make me an enthusiast or not? I can guarantee you that if you ask 20 people you'll get 20 different answers.

That's why I asked for clarification in this context, so that I can respond to the argument being made with the same understanding as the person making it. Without that basic shared understanding it is impossible to have meaningful discourse.

8GB is outdated, 1440p will need more than 12gb. Can't you see the intended separation? 16GB for 1440p. 18GB for all the crap we can turn on now. 24GB for actual 4k.
When NVIDIA released the 4060 Ti everyone called it a worthless product because it was. The GPU simply isn't powerful enough to be able to make meaningful use of that amount of memory, and framegen/upscaling don't change this basic fact. So why should NVIDIA waste money on a larger framebuffer when they know it's pointless and consumers won't pay for it? They marry their GPUs to the amount of memory that those GPUs need to run the resolutions that GPU is designed for; how exactly is this a bad thing?

The fact that AMD offers more VRAM at a similar price point is completely, utterly, pathetically irrelevant because their GPUs certainly don't seem to be able to turn that extra capacity into meaningful performance uplifts! Why are you allowing yourself to be suckered by the oldest marketing trick of "more = better" when that more does nothing for you and therefore isn't actually better? It's the same as putting Formula 1 tires on your cheap sedan, they might make you feel better than the guy with the same model car who doesn't have them - but both sedans will still perform the same.
 
Joined
May 13, 2008
Can you give an example of a game that can't be played at 1080p with 8gb vram? I wanna try it.


So I should stop playing POE2 (just came out) and veilguard on my 8gb vram card cause it's outdated and can't play it. Even though it does in fact play them both, at 1440p...
Cutting edge games intended for future console platforms with max settings. I apologize. Not every game, not every instance.

You know exactly what I'm saying...again, one can make anything 'work', and not every title is equally demanding.

I'm not trying to fight. I don't want to be a jerk either...sometimes I honestly don't know if people are trying to just be contrarian bc they don't understand trends.

A personal flaw of mine: not always explaining the details and/or flagging the sarcasm.
Try Hogwarts with ray tracing and frame gen.

Yep yep. Good example, but I don't think that game even has high-rez textures? Imagine if it did, and the scaling it would require. That's what I'm saying.

I think Outlaws is a great example as I'm fairly certain Snowdrop is in the state to be optimized for next-gen platforms. Given Div3 prolly won't launch until the next-gen, it makes sense to test with SWO.
 
Joined
Jun 14, 2020
Cutting edge games intended for future console platforms with max setting. I apologize. Not every game, not every instance.
Sure, now we agree. If you want to max every setting at 4k (even with DLSS / FSR in use), 16gb will not be enough for much longer. That's not a need, that's a want, which is why I was trying to get at exactly what you meant by the word need. Now the problem with that argument is that the 4090 has 24gb of vram and it's still not able to do maxed 4k anyways. What's the point of adding more vram to a 5070ti - it won't be able to max those settings regardless. And let's not even start talking about amd gpus; they lack the raw power anyways, since you mentioned max RT.

So yes, if you wanna max every setting under the sun, a midrange card won't be enough regardless of the amount of vram. Give it 2TB of vram, it's not gonna get there. That's the whole point that cards faster than midrange exist. If an xx70 could max out everything, what would be the point of an xx80? Or a 90?
 
Joined
Aug 3, 2006
I've been in the PC space for over two decades at this point and would very much consider myself an enthusiast, yet am currently using a 4060 Ti. Does that make me an enthusiast or not? I can guarantee you that if you ask 20 people you'll get 20 different answers.

You are deliberately being difficult. You have a Threadripper and a 4k monitor. Why you have a 4060 Ti is an enigma in and of itself. The fact that you are on a tech site discussing tech would be proof enough as well.
 
Joined
May 13, 2008
In what world does 4K use 6-8GB more vram than 1440p ?

Many? 21-22GB is not an uncommon occurrence in some higher-end/open world games with 4k textures and things cranked up. Please don't make me google examples....they're out there.

Yes, we can argue allocation versus usage. You will argue usage for the minimum, and you're right. That will work.

I will argue allocation so there isn't swap/stutter (the 'stutter struggle')....which most people think is an engine problem.

It's actually often a RAM problem.

I think at some point we're arguing semantics because people have different tolerances and/or don't even understand when there is a 'problem'. I'm talking the ideal.

Again, you can make ANYTHING work...or you can play different games. But this is the reality we're headed towards.

I honestly apologize...I thought there were more people that cared about the high-end scenario than there are in this thread/perhaps this forum now. I misjudged, and perhaps this isn't the appropriate crowd.
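To put a number on the pixel-count side of the 1440p-vs-4k argument, here's a simplified sketch; the 64 bytes/pixel total is an assumed figure, not from any specific engine.

```python
# Sketch of the pixel-count side of the argument: per-frame render targets
# scale linearly with resolution, and 4k has 2.25x the pixels of 1440p.
# The 64 bytes/pixel figure is a hypothetical total for a deferred
# renderer's G-buffer, depth, HDR output, motion vectors, TAA history, etc.

def buffers_mb(width: int, height: int, bytes_per_pixel: int) -> float:
    return width * height * bytes_per_pixel / (1024 ** 2)

BPP = 64  # assumed total bytes of per-pixel targets; varies by engine

print(f"1440p: {buffers_mb(2560, 1440, BPP):.0f} MB")  # 225 MB
print(f"4k:    {buffers_mb(3840, 2160, BPP):.0f} MB")  # 506 MB
```

Per-frame targets alone only account for hundreds of MB; the multi-GB gap comes on top of that from higher-resolution texture mips, RT acceleration structures, and larger streaming pools, which also grow with the target resolution.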
 