
ICYMI: 16GB Arc A770 Priced Just $20 Higher than 8GB; A770 Starts Just $40 Higher Than A750

Joined
Nov 6, 2016
Messages
1,740 (0.59/day)
Location
NH, USA
System Name Lightbringer
Processor Ryzen 7 2700X
Motherboard Asus ROG Strix X470-F Gaming
Cooling Enermax Liqmax Iii 360mm AIO
Memory G.Skill Trident Z RGB 32GB (8GBx4) 3200Mhz CL 14
Video Card(s) Sapphire RX 5700XT Nitro+
Storage Hp EX950 2TB NVMe M.2, HP EX950 1TB NVMe M.2, Samsung 860 EVO 2TB
Display(s) LG 34BK95U-W 34" 5120 x 2160
Case Lian Li PC-O11 Dynamic (White)
Power Supply BeQuiet Straight Power 11 850w Gold Rated PSU
Mouse Glorious Model O (Matte White)
Keyboard Royal Kludge RK71
Software Windows 10
Once they start selling, I hope they get some traction, and hopefully that will be enough for their next-gen GPUs to compete with
team red and green in the upper GPU segments. I'm tempted to try their GPUs, but I already have a 3090.
Based on history, I have a feeling that Intel will only be cutting into AMD's marketshare and not Nvidia's, and if this occurs, Intel's presence in the GPU market will do absolutely nothing to improve conditions for consumers.

In the past (as far back as 2008), even when AMD offered GPUs that performed better, and even at a lower price, everyone still bought Nvidia. Say what you want about AdoredTV, but a couple of years ago he did a great three- or four-part video that used market research, sales figures, etc. to demonstrate this very phenomenon, describing it as Nvidia's "mentia", or mindshare. Basically, it goes like this: the vast majority of PC hardware consumers are not enthusiasts who compare specs and benchmarks for hours or days before making a purchase. Just like consumers in any other market, they base their purchasing decisions on a lot of non-empirical, irrational factors. These consumers are therefore more influenced by the fact that they notice more people own Nvidia than AMD, and that Nvidia has more, and more vocal, fans online. Those fans constantly repeat that AMD runs hot and has bad drivers, despite this not being true since the 290X (and despite there being no real comprehensive empirical data to back up such claims), and because these consumers are not the type to do research, they just accept the accusations as fact. Now, in their mind, they've associated Nvidia with the winning side and therefore want to associate themselves with the winning side as well.

These consumers will not look at the benchmarks, see that Intel is performing better in their price tier, and buy an Intel video card. They will only consider Intel when they see a bunch of other people willing to buy Intel, or when, psychologically speaking, they've come to associate Intel video cards with the "winning side". The same irrational considerations that prevent them from buying an AMD video card will also prevent them from buying an Intel video card, and therefore Intel's sales will predominantly come at AMD's expense, because the people most willing to take a chance on an Intel video card are the ones willing to buy an AMD video card... most likely because their willingness to do the research and look at benchmarks is probably what brought them to buy an AMD video card, and would therefore also make them open to the idea of buying an Intel video card.

Your diehard types who only buy Nvidia and will not even consider AMD (whether by habit or active choice), who I feel make up a large portion of Nvidia's marketshare and the consumer GPU market as a whole, are probably not going to consider Intel either; they are only hoping that Intel's entry into the market will allow them to continue buying Nvidia, but at a lower price. If Intel stays in the market, then years down the road this may change, but for the first six months or year, or even Intel's first couple of generations, they'll predominantly take marketshare from AMD, and this will do absolutely nothing to improve the conditions of the GPU market for consumers. As long as Nvidia holds on to 80% marketshare (or probably anything over 50%), they're going to have the monopolistic power to keep their prices high and even maintain the trend of increasing prices every generation (like how the 4080 12GB is basically a 4070, so now xx70-tier cards are priced around $800; and because AMD's shareholders will expect the same profits and profit margin, AMD will follow suit to some degree).

Bottom line: everyone hoping that Intel will correct the course of the GPU market is going to be disappointed, because if that correction happens at all, it won't happen any time in the immediate future.
 
Joined
Jan 14, 2019
Messages
12,334 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Interesting choice. Will anybody buy the A750 and A770 8 GB when you can have the 16 GB model for basically the same price?
 
Joined
Sep 17, 2014
Messages
22,358 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Somebody's got blinders on, holy crap... seeing this in 2022 is mind-blowing.

AdoredTV is a clown, and you only confirmed it, the reason he exists is because of the cult following feeding his ad revenue ;)

AMD still hasn't got feature parity with Nvidia; you can twist that however you like, but 'cheaper and faster' is not the criterion that confirms anything. It's just one of many. AMD had an inferior product that they sold at a lower price. But inferior it has been for a long time, and on the feature front, it really still is. Historically, we also know that stuff needs to 'just work' for the vast majority; the niche that likes to tweak is just that, a niche, and Nvidia caters to the former group best. Still does.

Ryzen is the practical proof of what I'm saying. It took several generations of 'beating' the competition to catch Intel and actually gain market share. The platform needed to mature, the arch needed refinements. Yes, brand recognition exists, and it's only logical: Rome isn't built in a day, and neither is trust that stuff just works. And how is AMD doing on that front today, wrt pricing? Saving us? It once again confirms that everything is more of the same: AMD prices its product to the maximum of what the market can/might bear. So why did they price lower on GPUs back in the day? Let's see... 1+1= ? You're saying that isn't true because AMD 'was better and cheaper' at times. That is some real mental gymnastics!

Now don't get me wrong, I'm absolutely NOT pro Nvidia. I've hated their guts since Turing and haven't liked a single thing they've released since then, except DLSS, and even so I prefer AMD's agnostic FSR implementation to a much greater degree. So I'm 'with AMD' on their approach of less proprietary bullshit. I never once considered spending even a dime on G-Sync, but I have a FreeSync monitor on an Nvidia card. But we have to keep calling things what they are. Misplaced favoritism is just that: misplaced. RDNA2 is technically impressive, power efficient, a good architecture. But it misses features the competition does have. If they add those in RDNA3 and have reasonable price/perf to compete, it's an insta-buy. But at the same time, I stand by what I've said up here about Nvidia vs AMD and the battle they fight. AMD has been cheapskating it and it shows; the moment they got serious, they had a competitive product again. Go figure...

As for Intel cutting into AMD's share, you might be right about that for all the reasons mentioned here. Intel similarly has an offering that isn't 'just working' everywhere, anytime, no matter what game you play. Ironically, DX11 performance takes a back seat, just like it did for AMD. Feature parity isn't there, because half the features don't even work. Trust is zero, or sub-zero. But let's take a long look at Nvidia too; that's definitely not all sunshine and rainbows lately either, most notably on power usage - but that is actually new, and not their usual MO. AMD has a real opportunity here to capture share, if RDNA3~5 are great.
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.55/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GBSSD 2x2TBGB|256GB sata|512GB nvme
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
I think they'll sell fine when (if) they get released.
I'm not looking to buy a GPU at the moment, but if I were, I'd be really tempted just to satisfy my curiosity.

I really don't see the point of having both an 8 GB and a 16 GB 770 card with only a $20 price difference... very odd.
They'll probably perform exactly the same, and with the highest price being $350, there is not much room to price it right.
Most people will gladly pay those extra $20 for the 16 GB version, with or without an effect on performance, so once those sell out, people will buy the 8 GB version and still not feel bad, because it performs the same in real life. So actually, it's priced perfectly. :D
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.55/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GBSSD 2x2TBGB|256GB sata|512GB nvme
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
Lucky for AMD, ray tracing comes at a huge performance cost, and most of the time people can't decide whether it makes things look better or worse.
It's definitely a good idea, but game devs have become so good at faking lighting that ray tracing doesn't have the same effect as other technologies in the past, like DX9 with the realistic water in Far Cry and HL.
Plus, it's not supported in all games. DLSS is the same: it works better, but it's not supported in all games.
All companies are the same; they are here to maximize their profit from any product.
Whoever has the fastest CPU/GPU has the right to charge more for the whole lineup. Not something I agree with, but the biggest part of the customer base will just read that AMD released the fastest CPU, which of course they can't afford, but in their mind that means that at any price range, any AMD will be better than Intel. Same for the GPUs.

Waiting for reviews... want to get an upgrade for my wife's GTX 1650.
I see what you did there. :D
Recently I bought my wife a woofer as a surprise, and the other day she surprised me (without her knowledge) with a Steam Deck for my upcoming birthday.
The secret is knowing how to put the right spin on it. :D
 
Joined
Feb 14, 2012
Messages
1,845 (0.40/day)
Location
Romania
~2 to 3-years from now? Nah, those titles are already here... today. e.g. Godfall, FC6, etc.

[Screenshots: Chaos Gate (2022-09-30), Warhammer 3 (2022-08-11)]
Maybe with some extra-mega-super-giga texture packs; other than that, on ultra or high textures, you are fine with 6-8 GB in 2022. Of the two games I play ATM, neither uses more than 8 GB of VRAM. So yes, it will be another 10 years before 8 GB of VRAM is a minimum system requirement.
 
Joined
Sep 18, 2017
Messages
198 (0.08/day)
If Intel's GPUs do match the 3060 in performance and price, then I think it will be a success. While I won't be buying these GPUs, and probably won't for several generations, releasing their first GPUs and hitting the mainstream segment is pretty good. If it were possible to come out and seriously compete with Nvidia and AMD right away, then Intel or another company would already have done it.

Hopefully they will be successful and we can get more competition.
 
Joined
Jul 13, 2016
Messages
3,262 (1.07/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
16 GB is probably barely any better, if at all. Considering the RTX 3070 is fine with 8 GB of VRAM, I don't see how a card in the 3060's performance tier would need more. It's not a 4K card anyway.

 
Joined
Sep 18, 2017
Messages
198 (0.08/day)
Additional VRAM does nothing for you but make your GPU more expensive; any unused VRAM just sits idle and does nothing.

Look at 1080p, which has been around for a while now. 4 GB of VRAM was enough for games made in 2010, and it's still enough for games made in 2022.

Obviously, higher resolutions will have a higher floor for VRAM use, and 4 GB is not enough for higher resolutions.

The floor won't change, but peak VRAM use never moved past 4 GB for 1080p, so why would it keep climbing at 1440p or 4K?
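A back-of-the-envelope sketch of the resolution argument: render targets are the part of VRAM that actually scales with resolution, and even a generous buffer count stays small next to the texture budget. The buffer count and bytes-per-pixel below are illustrative assumptions for a hypothetical deferred renderer, not measurements from any real game:

```python
def render_target_mb(width, height, buffers=6, bytes_per_pixel=8):
    """Rough VRAM used by full-screen render targets.

    buffers and bytes_per_pixel are hypothetical figures for a
    deferred renderer (G-buffer, depth, HDR target, etc.).
    """
    return width * height * buffers * bytes_per_pixel / 2**20  # MiB

print(round(render_target_mb(1920, 1080)))  # 1080p: ~95 MB
print(round(render_target_mb(3840, 2160)))  # 4K: ~380 MB
```

Even quadrupling the pixel count only adds a few hundred megabytes of render-target memory; the multi-gigabyte remainder is textures and geometry, which depend on asset quality rather than output resolution - which is why the floor moves less with resolution than people expect.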
 
Joined
Apr 19, 2017
Messages
71 (0.03/day)
You make some good points, but what are all these features that gamers need to know about or have that Nvidia has and AMD doesn't? I honestly don't know. People mention features, but never say which features are so important that one would be missing out by purchasing an AMD card.
 
Joined
Apr 8, 2012
Messages
270 (0.06/day)
Location
Canada
System Name custom
Processor intel i7 9700
Motherboard asrock taichi z370
Cooling EK-AIO 360 D-RGB
Memory 24G Kingston HyperX Fury 2666mhz
Video Card(s) GTX 2080 Ti FE
Storage SSD 960GB crucial + 2 Crucial 500go SSD + 2TO crucial M2
Display(s) BENQ XL2420T
Case Lian-li o11 dynamic der8auer Edition
Audio Device(s) Asus Xonar Essence STX
Power Supply corsair ax1200i
Mouse MX518 legendary edition
Keyboard gigabyte Aivia Osmium
VR HMD PSVR2
Software windows 11
I bet they will sell like hotcakes.

What a missed opportunity by Nvidia, not releasing something new and competitive in this segment (under $400).

$1,500+ cards are not gonna sell well, and crypto isn't good ATM.
 
Joined
Dec 28, 2012
Messages
3,844 (0.89/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
Stability, for one thing. People seem to forget AMD's long history of garbage drivers and their (still present) habit of letting problems annoy users until the media gets involved (Ryzen 5000 compatibility, Ryzen 3000 compatibility, RDNA 1 downclocking issues, GCN black screens, etc.).

Performance is another issue. Scream "fine wine" all you want; people want the performance they paid for now, not 5 years from now.

You could also get into features like ShadowPlay, which took AMD years to copy, or NVENC encoding, which AMD still has no answer for.

Then all those who talk of "ngreedia mindshare" conveniently forget that Evergreen sold really well. The HD 5000 series hit 49% market share. Then AMD left their product to rot while they dumped money into the failure that was Bulldozer, and was caught completely off guard when Nvidia actually fixed Fermi instead of re-releasing it. AMD would do the same thing with Hawaii (to waste cash on SeaMicro).

I could go on. (Hey, remember how AMD launched Ryzen mobile and then abandoned driver development to OEMs?)

AMD's biggest opponent is not Nvidia. It's AMD. Their rocky launches and the copium their community produces do not translate to sales. Strong launches with solid drivers and strong product software make sales.
 
Joined
Apr 22, 2021
Messages
249 (0.19/day)
Clearly, you haven't played enough games... again... with *eye-candy* as I've stated above. :shadedshu:


Same for you. You need to get out more and play past ancient 1080p. :shadedshu:
 
Joined
Sep 17, 2014
Messages
22,358 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Eventually, AMD copied what Nvidia brought in terms of feature set. Every time.

- Shadowplay
- DLSS
- Hairworks (lol! more of a joke than anything else, but hey, AMD had to follow with TressFX)
- TXAA / temporal AA
- FXAA
- RT
- Gsync
- etc.

The gist of it is, Nvidia was exercising thought and innovative leadership in the gaming segment, and AMD was not. They are now building on that; I hope they keep the momentum. But Nvidia hasn't stopped trying to lead. And these are no small things - many of the pushes Nvidia initiated have been pretty neat ones that brought gaming graphics further.
 
Joined
Feb 13, 2017
Messages
143 (0.05/day)
As someone who replaces their GPU yearly, multiple times on the same build (recently a 6800, 6600 XT, 3060 and 3080), and who has been building PCs since 2004, it's clear to me that you are an Nvidia shill.

"Stability" my ass; over the past 2 years I've had more issues with GeForces than AMDs. Nvidia's HDMI 2.1 implementation on TVs is a joke.

And as someone who has had pretty much one Radeon of every gen (plus GeForces) since the X000 series (before the HD 2000 series), I haven't had issues gaming with Radeons since AMD bought them. And yes, fine wine does exist, while I've experienced the contrary with Nvidia (old cards getting lower performance with newer drivers after some years).

All the features you mentioned only affect streamers, not vanilla gamers.

"I could go on" - you can't; there's no killer feature that makes GeForces better for gamers. There was one, DLSS, which is great, but AMD's open-source FSR 2.0 just killed it, like G-Sync was killed by VRR.

The guy's post about AdoredTV is correct. I have lots of friends who won't even consider an AMD card for their builds; they won't even check benchmarks. And they are IT guys, not stupid 12-year-olds.

Ryzen took 2-3 generations to start gaining mindshare, and was helped by the (justified) hatred people have for Intel over their shady practices and forever-4-core CPUs. Nvidia doesn't sit on its hands like that, so there's no vacuum for AMD or Intel to fill.

All in all, to me this is actually advantageous. People flock to buy Nvidia, so I can usually get AMD cards for a lower price. But these two are neck and neck on performance and driver quality, and any drastic comment on that is being a fanboy.
 
Joined
Jan 14, 2019
Messages
12,334 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I would agree if it was 2014, but it's not. AMD has come a long way since then. The B550 USB bug was the last hiccup I've heard of with the Zen 2/3/4 platform. AMD systems have been as stable as Intel since then. Their GPUs work perfectly, too. Some 5700 series cards had heat issues, but the 6000 series is just as solid as Nvidia.
 