
AMD Plays the VRAM Card Against NVIDIA


Winssy

New Member
Joined
Mar 31, 2023
Messages
21 (0.03/day)
Yeah, the £1,000 7900 XTX is only 2-3% faster than the £1,300 4080. How pitiful, really!
1. $1000 vs $1200. 2. The 4080 has way better RT performance, better upscale quality (DLSS) and frame generation capability. So yes, in my opinion paying $200 more is absolutely worth it.
 
Joined
Jan 14, 2019
Messages
12,353 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
1. $1000 vs $1200. 2. The 4080 has way better RT performance, better upscale quality (DLSS) and frame generation capability. So yes, in my opinion paying $200 more is absolutely worth it.
1. I quoted UK prices at Scan. The 7900 XTX is £1,000 and the 4080 is £1,300. That's a 30% price increase for a 2% reduction in performance. Even in RT, the 4080 is only 16% faster. You must be crazy to think it's a good deal.
2. The difference between DLSS 2.0 and FSR 2.0 is placebo at best. Personally, I prefer native resolution to either of the two. I don't care about magic frames, either. The only valid argument is ray tracing, but even that isn't worth 30% extra money when it's only 16% faster.
[attached: relative performance chart]


Even at $1,200, the 4080 offers terrible value:
[attached: performance-per-dollar chart]
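For what it's worth, the arithmetic behind that chart is easy to sanity-check. A minimal sketch in Python; the relative raster figures (1.00 vs. 0.98) are an assumption based on the ~2% gap discussed in this thread, not numbers from any single review:

Code:
# Price vs. performance check for the UK prices quoted above.
# Relative raster performance is an assumption from this thread's
# discussion, not a figure from a specific review.
cards = {
    "RX 7900 XTX": {"price_gbp": 1000, "raster": 1.00},
    "RTX 4080":    {"price_gbp": 1300, "raster": 0.98},
}

for name, c in cards.items():
    per_1000 = c["raster"] / c["price_gbp"] * 1000
    print(f"{name}: {per_1000:.2f} relative performance per £1000")

price_delta = 1300 / 1000 - 1   # +30% price
perf_delta = 0.98 / 1.00 - 1    # -2% raster performance
print(f"Paying {price_delta:.0%} more for {perf_delta:.0%} performance")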
 
Joined
Apr 2, 2011
Messages
2,813 (0.56/day)
Translation: our cards suck, but we put a lot of VRAM in them because that was the only thing we could do to make them look more valuable.

You know, I love reading about all this hate for AMD, and reading statements without substance, and then trying to parse out whether this is a fanboy post or someone with an actual point.

If for a moment you start to compare the two companies:
1) AMD does have a history of releasing bad drivers... but in the last decade that has turned around.
2) Both AMD and Nvidia have released bad hardware. AMD has shipped poorly built coolers; Nvidia, poorly specified power connectors. Neither company is free from that.
3) Nvidia and AMD don't compete at the high end. But if you can put the flagship numbers aside, Nvidia and AMD both compete across the $100 to $1000 market. This is kind of silly to have to state, but the 7900 XTX just isn't built to compete with the 4090... and that's alright for anyone looking to spend south of a grand on a GPU.
4) Nvidia favors new tech and closed-source solutions. AMD favors old-school performance and large numbers. Both are viable strategies. If Nvidia wants to be the best at edge-case tech, that's fine; if AMD runs the 99.99% of games that don't use that new tech better, then it's a win for them.


Let me follow this up with a war chest of stupidity: linked physics accelerator cards, TressFX, SLI, Crossfire. Good lord, I'm already depressed. My point is that both AMD and Nvidia have released some absolutely bizarre stuff. AMD rightfully earned a reputation for bad drivers (this from the owner of a 5000 series card), but Nvidia has also rightfully earned a reputation for charging super-premium prices. So we are clear, this is a permutation of the Apple vs. PC debate... or, put more succinctly, the ongoing debate between absolute ease of use and features.



Now that I've said all of this, let me end with a real response. Yes, AMD is obviously going to point out the unique features which make their product more competitive when social media suddenly brings the topic to the front. Nvidia does the exact same with their streaming integration and absolutely silly ray tracing videos... and somehow that's not called out the same way? I'm a firm believer that both companies offer value in different ways, so seeing this level of criticism for what is standard practice is baffling. Long live both team Red and team Green; may they actually compete and get card prices back down to a place where a good mid-tier card (x070 and x800 respectively) costs less than an entire game console. That's the win I'm hoping for, and getting there will take both of them competing harder.
 
Joined
Sep 6, 2013
Messages
3,354 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later, got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
AMD is desperately trying to sell their cards. According to all the tests, on average the difference between the 4080 and the 7900 XTX in regular rasterization is only 2-3%, but they are trying to make it seem like their product dominates with a 10-20% lead everywhere except for five games with ray tracing. It's a pitiful attempt.
You see things from a totally wrong perspective here. Remember: Nvidia already won. They don't need to, and shouldn't, get the rest of the market share. Why?
Let's see what would happen if AMD's DESPERATE, as you say, marketing succeeds.

- People might start buying AMD cards. That's good for competition. It could force Nvidia to lower prices or come up with more models addressing the reasons people are turning to AMD.
- People don't buy AMD, but they don't buy Nvidia either. And I don't mean they skip the 8GB-12GB models and turn to the 16GB-24GB ones, which is what Nvidia, and probably some posters here, would want to see. I mean they DON'T BUY NVIDIA EITHER. Nvidia would have to address the reasons that make consumers hesitate to buy their cards. So we could see new 4070 models coming out with 16GB of VRAM, and the 4060 and 4060 Ti coming out with a minimum of 12GB instead of 8GB. Models with less VRAM could see a price reduction or even slowly get removed from the market.

So why call out "desperate AMD" with their "pitiful attempts" instead of hoping they succeed?
 

bug

Joined
May 22, 2015
Messages
13,793 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Yeah, the £1,000 7900 XTX is only 2-3% faster than the £1,300 4080. How pitiful, really! :rolleyes:
He only said it's pitiful how AMD is trying to paint it as faster than it is. Pricing is a different aspect.

As far as I'm concerned, whether it's $1,000 or $1,000,000, it's all the same to me. I'm never-ever going to pay that much for a video card.
 

Winssy

New Member
Joined
Mar 31, 2023
Messages
21 (0.03/day)
I prefer native resolution to either of the two. I don't care about magic frames, either.
If you're not interested in upscaling or frame generation, that's your personal choice. I am interested in those features. Currently, I'm playing Cyberpunk with the new path tracing at 1440p on ultra settings, with DLSS Quality and FG enabled, and I'm getting 95-115 fps with great smoothness and image quality on my 4080. The 7900 XTX is not even capable of coming close to that level of performance. In my opinion, paying 20% more, or even 30% more in your case, is absolutely worth it.
Regarding prices in specific countries: here in Poland, the cheapest 4080 (Gainward) costs $1287, and the cheapest 7900 XTX (Biostar) costs $1170. The price difference is 10%. And such a "cheap" 7900 XTX is available from only one manufacturer, Biostar; prices for this card from other brands start at $1287.
Your "performance per dollar" graph doesn't take into account either RT or FG.
 

Joined
Jan 3, 2023
Messages
51 (0.07/day)
Expect this trend to continue to grow instead of diminish. AMD is in both the Xbox and the PS5, which means games made for all three platforms are already optimized for AMD. One of the things I have noticed about AMD in new games is how consistent the clocks are, translating to ultra-smooth gameplay. This is why, for me, averages mean nothing and 1% lows are everything. Once you go X3D, you don't go back.


1. The trend will stop once they reach the limit on consoles.
2. The hardware in consoles is entirely different from the hardware in PCs; not to mention, consoles use a different API to access it and don't rely on drivers. With AMD's upcoming CPU and GPU architectures, they will lose what little indirect optimisation advantage they had, because those will be almost wholly different architectures from the ones currently in the consoles. It's the same story every console generation, and it's getting really dull.

The 7800X3D only matches the Intel 13900K's lows depending on the game, and there are still plenty of titles in which Intel is considerably faster, including 1% and 0.1% lows, so I don't know what you are talking about. AMD has no particular advantage in the lows, and especially with the two dual-CCD models, the 7900X3D and 7950X3D, it is actually a weakness if a game lands on the CCD without the extra cache.
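For anyone following the averages-versus-lows argument: a "1% low" is just a statistic over frame times, so the two metrics can diverge sharply on the same run. A quick illustrative sketch in Python with made-up frame-time data; note that some outlets report the 99th-percentile frame time instead, so the exact definition here is an assumption:

Code:
# Average FPS vs. "1% low" FPS from frame times. Data is made up:
# 995 smooth ~7 ms frames plus 5 stutters at 14 ms.
frame_times_ms = sorted([6.9, 7.1, 7.0, 7.2, 6.8] * 199 + [14.0] * 5,
                        reverse=True)

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))

# Here "1% low" = average FPS over the slowest 1% of frames; some
# outlets use the 99th-percentile frame time instead.
worst = frame_times_ms[: max(1, len(frame_times_ms) // 100)]
low_fps = 1000 / (sum(worst) / len(worst))

print(f"Average: {avg_fps:.0f} fps, 1% low: {low_fps:.0f} fps")
# Prints roughly "Average: 142 fps, 1% low: 94 fps" - a gap the
# average alone completely hides.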
 

Winssy

New Member
Joined
Mar 31, 2023
Messages
21 (0.03/day)
So we could see new 4070 models coming out with 16GB of VRAM, and the 4060 and 4060 Ti coming out with a minimum of 12GB instead of 8GB.
We will never see a 4070/4070 Ti with 16GB of VRAM. It's not as simple as soldering on two more memory chips; the entire GPU would need to be redesigned. It would only be possible if these cards were based on the AD103 chip from the 4080, which is not practical. The 4060/4060 Ti also cannot have more than 8GB for the same reason.

I personally believe that prices for all graphics cards are unreasonably high right now. This is a very bad trend. At one time I used to build SLI configurations with two 470/770/970 cards, and it was 50% cheaper than buying a single 4080 now. I advocate for healthy competition, but when I see AMD's latest slides, I am appalled by their lies. Do they think people don't look at real tests before buying? If they want to compete properly, they need to do only one thing: lower prices. Right now, the recommended prices for the 4080 and 7900 XTX seem fair relative to each other. They have roughly the same performance in regular rasterization, and Nvidia has an advantage in RT and technologies, which is why the 4080 is 20% more expensive. But this doesn't change the fact that both cards are unreasonably expensive. I would like to see the 4080 at $900 and the 7900 XTX at $800, at most.
 
Joined
Jun 2, 2017
Messages
9,210 (3.36/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
1. The trend will stop once they reach the limit on consoles.
2. The hardware in consoles is entirely different from the hardware in PCs; not to mention, consoles use a different API to access it and don't rely on drivers. With AMD's upcoming CPU and GPU architectures, they will lose what little indirect optimisation advantage they had, because those will be almost wholly different architectures from the ones currently in the consoles. It's the same story every console generation, and it's getting really dull.

The 7800X3D only matches the Intel 13900K's lows depending on the game, and there are still plenty of titles in which Intel is considerably faster, including 1% and 0.1% lows, so I don't know what you are talking about. AMD has no particular advantage in the lows, and especially with the two dual-CCD models, the 7900X3D and 7950X3D, it is actually a weakness if a game lands on the CCD without the extra cache.
Where are your system specs? You are telling me that my 7900X3D does not always drive my GPU? And if the CCD without the cache is used, what is the all-core maximum clock speed? One of the assumptions people make is that CPUs do not improve after Day 1 reviews and that Microsoft updates do not bring performance improvements.

Now let's get into your opinions.

1. Once the limit is reached? Consoles have 16GB of VRAM, and I have not heard one owner of a 6700 or above complaining.
2. The hardware is different? Explain to me how, in hardware, the AM4 CPU and RDNA2 GPU in the PS5 and Xbox are different from AM4 on PC. Is AMD going to abandon its partners? Will RDNA3 not work on AM4?
3. Do you think Xbox Game Pass on PC is going to become incompatible when the next Xbox launches on AMD hardware? Did you see the post about Game Pass being in 40 countries?

Yep, you can talk about the 13900K, but what happens this time next year, when I can buy whatever AMD releases that will likely make a 13900K look like an 11900K? It does not matter, though, because all of that is just oxygen for the culture wars. I am of the strong opinion that if you put 4 people in front of 4 systems and don't tell them what is inside, they will not know which one has AMD or Intel without being told.

I do have to say, though, that X3D is that good, and you can't convince me, or likely anyone that owns one, not to appreciate it. The 13900K is also an absolute beast of a CPU, and we should be glad that there are real choices to make you smile, instead of preaching opinions from people who may have different reasons than you to say what they say.
 
Joined
Oct 18, 2019
Messages
413 (0.22/day)
Location
NYC, NY
You pay for what you get.

I paid for a 4090.

I got 24GB.

I think the funny thing is that most people still using 1080p and 1440p monitors are the ones so emotionally concerned with 4K performance.

My local Microcenter has PLENTY of 4090s in stock right now (WESTBURY). There's no way I'd buy any card below the "top model". My 3090 served me extremely well during Cyberpunk's release week and continues to serve me well in my DCS WORLD flight sim with Oculus VR.

The 3090 and the 4090 were UNCOMPROMISING. If they'd put more VRAM in the 4080, I could say the same thing there as well.

You need to upgrade your rig to a 1000W PSU and get a bigger tower case. I got the Strix 4090 at launch, and that behemoth is as large as my PS5. I AM SADDENED that EVGA stepped out and I couldn't get a Kingpin 4090... but I may consider selling my card and getting a Liquid Suprim X, or waiting for the 4090 Ti version of it. I would wish people waiting on line for a 4070 luck - BUT THEY WON'T NEED IT. The economic downturn and the lack of stimulus checks are keeping demand low.
 
Joined
May 15, 2020
Messages
697 (0.42/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
We will never see a 4070/4070 Ti with 16GB of VRAM. It's not as simple as soldering on two more memory chips; the entire GPU would need to be redesigned. It would only be possible if these cards were based on the AD103 chip from the 4080, which is not practical. The 4060/4060 Ti also cannot have more than 8GB for the same reason.
Technically it is possible when higher-density memory chips become available; on the last gen, there was a guy offering to make "custom" 3080 20GB cards (desoldering the 1GB chips and soldering on 2GB ones). This doesn't require any modification to the die; the bus width stays the same. The probability that we see this from Nvidia is very, very small, though.
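The constraint both posts are circling is that capacity is tied to bus width: each GDDR6/GDDR6X chip occupies a 32-bit slice of the bus, so VRAM = (bus width / 32) x chip density, doubled only in a clamshell layout (two chips per slice, as on the 3090). A rough sketch in Python; the bus widths are assumptions taken from public spec listings:

Code:
# Possible VRAM capacities for a given memory bus width. Each GDDR6/
# GDDR6X chip sits on a 32-bit slice; clamshell mounts two chips per
# slice. Bus widths are assumptions from public spec listings.
def vram_options(bus_bits, densities_gb=(1, 2)):
    chips = bus_bits // 32
    normal = [chips * d for d in densities_gb]
    clamshell = [2 * n for n in normal]
    return normal, clamshell

for gpu, bus in [("AD103 (4080), 256-bit", 256),
                 ("AD104 (4070 Ti), 192-bit", 192),
                 ("AD106 (4060 Ti), 128-bit", 128)]:
    normal, clam = vram_options(bus)
    print(f"{gpu}: {normal} GB normal, {clam} GB clamshell")
# A 192-bit AD104 lands on 12 GB with 2 GB chips; 16 GB isn't reachable
# without a wider bus, denser chips, or a clamshell board.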
 
Joined
Dec 24, 2022
Messages
86 (0.12/day)
Processor Ryzen 5 5600
Motherboard ASRock B450M Steel Legend
Cooling bequiet! Pure Rock Slim (BK008)
Memory 16GB DDR4 GoodRAM
Video Card(s) ASUS Expedition RX570 4GB
Storage WD Blue 500GB SSD
Display(s) iiyama ProLite T2252MTS
Case CoolerMaster Silencio 352
Power Supply bequiet! Pure Power 11 CM 400W
Mouse Logitech M590
Keyboard Logitech K270
Software Linux Mint
AMD ought to stop making dumb comments and concentrate on developing their hardware. Their comment looks pathetic after the RDNA3 release and Zen 4 pricing.

That's another big part; people forget that console settings are perfectly playable even on an RX 5600 XT now. You just gotta accept low/medium settings at 1080p, maybe 1440p with resolution scaling turned down.

Consoles are not running 4K ultra at native res at 60+ FPS.
They forget console games must be well optimised. The hardware is fixed, and the game has to run smoothly on both the current and previous generation of consoles. The games are also limited by the controller; I don't see the Civilization series being playable on a PS5 or Xbox Series.
Besides, I had a PS3. It was fun, but the games got old quickly and the internet browser was horrible, so it just sat there unused for months. If I could have used it like an all-in-one PC, albeit one impossible to upgrade, it would have made more sense. I've considered buying a console numerous times; it just doesn't seem viable, for many reasons.
 

Winssy

New Member
Joined
Mar 31, 2023
Messages
21 (0.03/day)
Technically it is possible when higher-density memory chips become available; on the last gen, there was a guy offering to make "custom" 3080 20GB cards (desoldering the 1GB chips and soldering on 2GB ones). This doesn't require any modification to the die; the bus width stays the same. The probability that we see this from Nvidia is very, very small, though.
No. All RTX 4000 graphics cards already use 2GB chips, and there definitely won't be any 3GB chips in the next year or two. If 3GB chips do appear, it will only be with the release of the RTX 5000 series.
 
Joined
Sep 6, 2013
Messages
3,354 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later, got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
We will never see a 4070/4070 Ti with 16GB of VRAM. It's not as simple as soldering on two more memory chips; the entire GPU would need to be redesigned. It would only be possible if these cards were based on the AD103 chip from the 4080, which is not practical. The 4060/4060 Ti also cannot have more than 8GB for the same reason.

I personally believe that prices for all graphics cards are unreasonably high right now. This is a very bad trend. At one time I used to build SLI configurations with two 470/770/970 cards, and it was 50% cheaper than buying a single 4080 now. I advocate for healthy competition, but when I see AMD's latest slides, I am appalled by their lies. Do they think people don't look at real tests before buying? If they want to compete properly, they need to do only one thing: lower prices. Right now, the recommended prices for the 4080 and 7900 XTX seem fair relative to each other. They have roughly the same performance in regular rasterization, and Nvidia has an advantage in RT and technologies, which is why the 4080 is 20% more expensive. But this doesn't change the fact that both cards are unreasonably expensive. I would like to see the 4080 at $900 and the 7900 XTX at $800, at most.

Obviously the data bus could be a reason not to see cards with more VRAM, but we could see, as you say, cut-down versions of AD103 going into 4070s and AD104 going into a 4060, if not this year then maybe next year. Nvidia does this constantly, and while we could argue those are just cases where chips are scarce, or where there are plenty of defective chips, disappointing sales are also a reason for Nvidia to bring out new models with more VRAM. They have plenty of profit margin to play with, and I doubt the cost of the silicon is prohibitive; the cost of the silicon is mostly the R&D associated with its development, not what TSMC charges.

While you give a detailed explanation of how you see things, you don't really answer my question.
So why call out "desperate AMD" with their "pitiful attempts" instead of hoping they succeed?
I do understand that you talk about lies. Well, which company doesn't lie? Nvidia? You say that AMD is lying by promoting the importance of having more VRAM. How about saying nothing and promoting $600-$800+ cards to the general public as cards capable of maxing out every AAA game at 1440p? Isn't that a lie? And no, people don't look at real tests before buying. You think people have the time to check whether every product they buy is the best on the market? No one does that.

And let me repeat the reply I gave to another member a few days ago on why AMD can't see lowering prices as a viable strategy today:

AMD profit margin 45%, Nvidia profit margin 65%
AMD card A at $700, Nvidia card A at $800 (same performance) - people buy Nvidia
AMD drops price at $600, Nvidia follows at $700 - people buy Nvidia
AMD drops price at $500, Nvidia follows at $600 - people buy Nvidia
AMD drops price at $400, Nvidia follows at $500 - people buy Nvidia
AMD loses interest because of low or no profit; Nvidia still makes a good profit, because of the higher margin it began with.
The next Nvidia card A comes at $1000, and AMD follows with their next card A at $900. AMD NEVER drops price, knowing from the beginning they would lose. Nvidia never drops price, because it sees consumers buying. Consumers blame AMD for the high, stable prices, because they expect AMD to fix Nvidia's pricing. Things only get worse from there.
Only Nvidia can fix Nvidia's pricing.
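To put rough numbers on that spiral: a sketch in Python, where the 45%/65% margins are this post's own assumptions and unit costs are held fixed purely for illustration:

Code:
# Rough sketch of the price-war arithmetic above. Margins and starting
# prices are this post's assumptions; unit cost is held fixed while
# prices fall in $100 steps.
amd_cost = 700 * (1 - 0.45)   # $385/card at a 45% starting margin
nv_cost = 800 * (1 - 0.65)    # $280/card at a 65% starting margin

for amd_price, nv_price in [(700, 800), (600, 700),
                            (500, 600), (400, 500)]:
    print(f"AMD ${amd_price}: ${amd_price - amd_cost:.0f}/card profit | "
          f"Nvidia ${nv_price}: ${nv_price - nv_cost:.0f}/card profit")
# By the last step AMD clears $15 a card while Nvidia still clears $220.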
 
Joined
May 15, 2020
Messages
697 (0.42/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
And let me repeat the reply I gave to another member a few days ago on why AMD can't see lowering prices as a viable strategy today:


Only Nvidia can fix Nvidia's pricing.
That's basically what Jim says in this video: selling cheaper was a losing strategy for AMD in the past, as smaller and smaller margins led to less and less R&D budget. If they are to gain market share, they must do it while also keeping healthy margins. The same goes for Intel's GPU adventure, btw.
 
Joined
Sep 6, 2013
Messages
3,354 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later, got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
That's basically what Jim says in this video: selling cheaper was a losing strategy for AMD in the past, as smaller and smaller margins led to less and less R&D budget. If they are to gain market share, they must do it while also keeping healthy margins. The same goes for Intel's GPU adventure, btw.
I haven't seen the video, but it's just logic. People hope for AMD to lower prices so they can pay less to Intel and Nvidia. Well, that's not sustainable. They will buy cheaper Intel and/or Nvidia hardware one, two, three times, and that's it. With AMD getting neither the income it needs nor the market share increase that would at least let it hope to gain customers for its next products, it will lose the ability, or even the interest, to compete. Then a monopoly happens, and whatever people saved thanks to AMD's price-cutting strategy, they end up paying double or more to Intel and Nvidia, because they are a monopoly now. Fun part: they blame AMD, when it's them who didn't give AMD the money to put into R&D and keep making competitive products. They were expecting others, less smart, to buy AMD hardware; they were the smart ones, taking advantage of AMD's strategy to buy cheaper Intel and Nvidia hardware. And yes, the same goes for Intel's GPU adventure. I didn't pay any money for an ARC GPU. Did anyone in here do so? Maybe one or two. How many expect Intel to save them from Nvidia's pricing? Probably many more than one or two. Well, it will never happen if Intel sees that department losing money every quarter. Of course Intel enjoys strong ties with OEMs, but that is probably enough to empty its inventory, not to make money.
Anyway, all those trolls, fanboys, or just plain posters posting their own objective opinions about how bad AMD is, how bad their drivers are, what a bad idea it is to buy an RX 6600 over an RTX 3050: all of them brought us to a situation where AMD can't compete in GPUs. And they can't even start a price war, because they know they will lose. And maybe they don't care, because they know Nvidia's whole price-raising strategy also helps them. GPUs will become more expensive, and even integrated GPUs will become more expensive, at least the "strong" ones in the APUs. Everything will become more expensive in the end, but rejoice: people will avoid those bad Radeon drivers and those bad AMD graphics cards.

Damn, people shooting themselves in the foot and never realizing it... Only a couple of YouTubers out there seem to realize it, and they are constantly getting attacked: Hardware Unboxed, which is of course "AMD Unboxed", even when their videos support Nvidia products and technologies, and Linus of LTT, who sees that we are heading into a monopoly and tries to push AMD as an alternative at least. Other YouTubers still drive all of their audience's focus to Nvidia. In their last video, Gamers Nexus also realized that AMD makes GPUs too; they were promoting Nvidia constantly for the last 12 months. JayzTwoCents still makes videos that Nvidia's marketing department would envy. JMO
 

Winssy

New Member
Joined
Mar 31, 2023
Messages
21 (0.03/day)
You say that AMD is lying by promoting the importance of having more VRAM.
No. I said that AMD is lying when they show slides where their 7900 XTX outperforms the 4080 by 10-20% in all games except for five games with RT. In reality, these cards are roughly equal in regular rasterization, as seen in the tests I have provided. As for VRAM: yes, when it comes to the 3070 8GB and the 6700 XT 12GB, AMD looks good. Although the GPU's performance is lower, having 12GB of VRAM allows for better textures in horribly optimized console ports like TLOU. But in the case of the 4080 16GB and the 7900 XTX 24GB, this is absolutely unimportant, because 16GB is enough everywhere and will be for a long time.
I understand your point of view on reducing prices; it's quite logical. But there's one thing: wouldn't it be better for AMD to put 16GB of VRAM in the 7900 XTX, saving money and selling more cards at a lower price while maintaining their profit margin and strengthening competition with Nvidia? Instead, they decided to pay people to make useless slides about the usefulness of 24GB in the distant future, and about how, in their fantasy, the 7900 XTX outperforms the 4080 in 95% of games.
And look at FSR 2: this upscaling would never have existed if Nvidia hadn't made DLSS. The same story with FSR 3; without Nvidia's FG, AMD wouldn't have even thought about it. The ray tracing support on the RX 7000 also surprises me. Couldn't they have implemented it better after the poor results of the RX 6000? Even Intel's first generation of cards handles RT better. When attempts to compete come down to playing catch-up and publishing false slides, I logically cannot hope for any success. That is why I am calling out AMD for their pitiful attempts instead of hoping they succeed.
 
Joined
Sep 6, 2013
Messages
3,354 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later, got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
No. I said that AMD is lying when they show slides where their 7900 XTX outperforms the 4080 by 10-20% in all games except for five games with RT. In reality, these cards are roughly equal in regular rasterization, as seen in the tests I have provided. As for VRAM: yes, when it comes to the 3070 8GB and the 6700 XT 12GB, AMD looks good. Although the GPU's performance is lower, having 12GB of VRAM allows for better textures in horribly optimized console ports like TLOU. But in the case of the 4080 16GB and the 7900 XTX 24GB, this is absolutely unimportant, because 16GB is enough everywhere and will be for a long time.
OK, but every company manipulates its charts. That's why everyone says "wait for independent benchmarks". You think Nvidia doesn't test under best-case conditions for its cards when it makes charts? Don't use double standards.
I understand your point of view on reducing prices; it's quite logical. But there's one thing: wouldn't it be better for AMD to put 16GB of VRAM in the 7900 XTX, saving money and selling more cards at a lower price while maintaining their profit margin and strengthening competition with Nvidia? Instead, they decided to pay people to make useless slides about the usefulness of 24GB in the distant future, and about how, in their fantasy, the 7900 XTX outperforms the 4080 in 95% of games.
You are still missing my point. AMD CANNOT sell for much less. If the 7900 XTX had come with 16GB at $900, the RTX 4080 could have seen a $100 price drop. And I say COULD and not WOULD, because in today's reality, people would keep paying the extra money to Nvidia anyway. But let's say people were willing to buy an AMD card. Then the RTX 4080 WOULD have seen a $100 price drop, and you would be here saying "Why didn't AMD name the 7900 XTX the 7800 XT and sell it for $700? It's their fault they don't sell." That's how we slide down to price levels where Nvidia makes money and AMD does not. AMD runs much lower profit margins, and Nvidia is the brand that sells 9 times more cards. Even if their margins were the same, with Nvidia selling 9 times more, AMD doesn't make money, or even loses money.
And look at FSR 2: this upscaling would never have existed if Nvidia hadn't made DLSS. The same story with FSR 3; without Nvidia's FG, AMD wouldn't have even thought about it. The ray tracing support on the RX 7000 also surprises me. Couldn't they have implemented it better after the poor results of the RX 6000? Even Intel's first generation of cards handles RT better. When attempts to compete come down to playing catch-up and publishing false slides, I logically cannot hope for any success.
I was always saying that Nvidia innovates like no other and that Huang is a genius. I was also shouting about the RT performance when the RX 7000 came out, and people were calling me an Nvidia fanboy. They did have a point: RTX 3000-level RT performance was great from Nvidia cards, yet a few months later the same RTX 3000-equivalent performance is trash from AMD cards. As for DLSS, that would have been called cheating a few years ago. But Nvidia has the market share and the tech press support to promote these kinds of tricks. Upscaling isn't an Nvidia invention; it was out there all those years. It was just considered cheating. And DLSS 3/FSR 3 frame generation are further tricks based on one simple reality: the eye can NOT spot the difference, and even if it spots it, the brain chooses to ignore it in favor of the higher framerate. You get inferior image quality, but you can NOT tell. The only thing you understand is that you set "Ultra" in the settings and 4K as the resolution, so in your mind you get Ultra settings and 4K at the same time. Right? Well, no, but let's all agree to say yes.
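For reference, the "you set 4K but don't render 4K" point comes straight from the upscalers' fixed scale factors. A small sketch in Python; the per-axis factors below are as commonly documented for DLSS 2 and FSR 2 quality modes, so treat them as assumptions rather than vendor-confirmed specs:

Code:
# Internal render resolution behind a "4K" output for common DLSS 2 /
# FSR 2 modes. Per-axis scale factors as commonly documented; treat
# them as assumptions, not vendor-confirmed specs.
modes = {"Quality": 1 / 1.5, "Balanced": 1 / 1.7, "Performance": 1 / 2.0}
out_w, out_h = 3840, 2160

for mode, scale in modes.items():
    w, h = round(out_w * scale), round(out_h * scale)
    print(f"{mode}: renders {w}x{h}, presents {out_w}x{out_h}")
# "Quality" at 4K actually renders 2560x1440 and upscales the rest.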
GSync? That was also a good idea from Nvidia. Again, not their invention, but using it in desktop PC monitors for smoother gaming was their idea. And even there, there was a way to implement it free for everyone, but no: let's put a hardware board in there, make money even from that board, and make it Nvidia-exclusive. Do you remember PhysX? Don't get me started on PhysX. Anyway, Nvidia innovates, but for its own profit. So even if AMD follows, you should be grateful that there is someone following, and also giving away the results of their efforts as FREE alternatives. Would it be better if there was NO ONE following Nvidia? My new TV is FreeSync Premium and GSync-compatible through HDMI 2.1. It's GSync-compatible thanks to AMD, not Nvidia. Nvidia followed AMD here.

You ask AMD to compete while at the same time you will probably do anything to stop people from buying their cards. So why compete, if no one will buy their cards? You demand that AMD's marketing slides be perfect, but do you demand that Nvidia NOT lie in its marketing slides? It's not just you. There are so many people out there expecting AMD to build the best hardware, offer it at the lowest prices, and market it in ways that show only its disadvantages against competing hardware, never its strengths. So many that I could copy this post and repost it 1000 times over the next 10 years.
That is why I am calling out AMD for their pitiful attempts instead of hoping they succeed.
But you will never call out Nvidia. Only about pricing, another thing that is somehow 100% AMD's fault, NOT Nvidia's...
 
Joined
Feb 23, 2011
Messages
467 (0.09/day)
System Name Gaming PC
Processor Intel Core i5 12400f at 5.4Ghz
Motherboard Asrock B660 Riptide PG (Eternal clock gen)
Cooling ARCTIC Liquid Freezer II
Memory 32GB Kingston FURY Renegade
Video Card(s) Zotac RTX 4070ti Trinity
Storage WD SN750 SE 1TB M.2 + 2x Kioxia Exceria 480GB
Display(s) BenQ EX2780Q 1440p/144Hz
Case Lian Li 205M
Power Supply Corsair RM850x SHIFT
I think the funny thing is that most people still using 1080p and 1440p monitors are the ones so emotionally concerned with 4K performance.

You hit the nail on the head!
 
Joined
Oct 19, 2022
Messages
21 (0.03/day)
Location
Sweden
Processor Ryzen 5 5600
Motherboard MSI B350M Mortar
Memory 2x8 Gb DDR4 HyperX Black 2133 @ 3200 CL16 + 2x8Gb Corsair DDR4 Vengeance @ 3200 CL16
Video Card(s) Asus Dual Radeon RX 6700 XT
Storage 1x Crucial P2 512Gb, 1x WD "old" Blue 1Tb 7.2k, 1x Seagate ST2000 2Tb 7.2k
Display(s) AOC 24G2 144Hz
Case Fractal Pop Air
Mouse Kone Aimo modded with JPN switches
Keyboard Logitech G 413 or Steelseries 6Gv2 depending on the mood
AMD could also have played on their (currently) better long-term support. I had a Vega 56, which was on par with the GTX 1070... and I regularly checked YouTube videos here and there comparing the two over time. I was astonished to see that the Vega was still playable (1080p Very High/Ultra in some titles) while GTX owners had to scale back on some settings. (But all good stories have to end, and I recently replaced it with an RX 6700 XT after over 5 years...)

As for the equipped-VRAM question, we can all be sure that RXs with 10-12 GB will age better than their 8 GB RTX counterparts; the experience should be smoother in demanding games.

And talking about FSR/DLSS, don't get on this slippery path. A few years ago (was it for the GeForce 5xx or 6xx series? I honestly can't recall), there was proof that nVidia started playing with rendering quality to take the lead at Ultra settings. Nowadays, this type of cheat is considered a feature.

But yes, I can say that nVidia can innovate, RT being one nice piece of eye candy, even if it is not mature yet (I still remember the fuss about the first anti-aliasing techniques back in the late '90s :D)

And don't get me wrong, I've had a lot of cards from various vendors over the last 30 years or so: from Trident to S3, to Matrox, to 3dfx, then a relatively balanced mix of ATi/AMD and nVidia (and a Kyro II, I must say :))
 
Joined
Feb 23, 2011
Messages
467 (0.09/day)
System Name Gaming PC
Processor Intel Core i5 12400f at 5.4Ghz
Motherboard Asrock B660 Riptide PG (Eternal clock gen)
Cooling ARCTIC Liquid Freezer II
Memory 32GB Kingston FURY Renegade
Video Card(s) Zotac RTX 4070ti Trinity
Storage WD SN750 SE 1TB M.2 + 2x Kioxia Exceria 480GB
Display(s) BenQ EX2780Q 1440p/144Hz
Case Lian Li 205M
Power Supply Corsair RM850x SHIFT
As for the equipped-VRAM question, we can all be sure that RXs with 10-12 GB will age better than their 8 GB RTX counterparts; the experience should be smoother in demanding games.

Not necessarily; having the VRAM is not the same as having the processing power required.

The 8GB cards could end up running lower texture settings but higher quality effects.
 
Joined
May 15, 2020
Messages
697 (0.42/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
Not necessarily; having the VRAM is not the same as having the processing power required.

The 8GB cards could end up running lower texture settings but higher quality effects.
Try playing a good-looking game with textures set to low. Observe the effect.
Textures are what you see everywhere in a game, and they are the cheapest (performance-wise) boost to visual quality, almost free as long as you have enough VRAM. Volumetric clouds and other performance-intensive effects don't make anywhere near as much of a difference visually. Since most games are made for consoles, most effects look fine between medium and high, because that's how they were conceived by the game designers; ultra/epic is just added in the PC port to give high-end gear something to chew on. But textures are meant to be used at max, because they are free performance.

Hardware Unboxed used to make videos finding the best settings for games, balancing performance and visual quality. Most games look good starting from medium/high effects, except for textures, which should be at maximum.
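The "textures are nearly free as long as VRAM lasts" point is easy to put numbers on. A back-of-the-envelope sketch in Python; the bytes-per-texel figures and the mip-chain overhead are ballpark assumptions:

Code:
# Back-of-the-envelope VRAM cost of one texture, uncompressed vs.
# block-compressed. Bytes-per-texel and mip overhead are ballpark
# assumptions.
def texture_mb(width, height, bytes_per_texel, mipmapped=True):
    size = width * height * bytes_per_texel
    if mipmapped:
        size *= 4 / 3   # a full mip chain adds about a third
    return size / 2**20

print(f"4096x4096 RGBA8, uncompressed: {texture_mb(4096, 4096, 4):.0f} MB")
print(f"4096x4096 BC7-compressed:      {texture_mb(4096, 4096, 1):.0f} MB")
# Sampling either costs the GPU roughly the same per pixel, which is
# why higher-resolution textures are nearly "free" until VRAM runs out.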
 
Joined
Feb 23, 2011
Messages
467 (0.09/day)
System Name Gaming PC
Processor Intel Core i5 12400f at 5.4Ghz
Motherboard Asrock B660 Riptide PG (Eternal clock gen)
Cooling ARCTIC Liquid Freezer II
Memory 32GB Kingston FURY Renegade
Video Card(s) Zotac RTX 4070ti Trinity
Storage WD SN750 SE 1TB M.2 + 2x Kioxia Exceria 480GB
Display(s) BenQ EX2780Q 1440p/144Hz
Case Lian Li 205M
Power Supply Corsair RM850x SHIFT
Try playing a good-looking game with textures set to low. Observe the effect.
Textures are what you see everywhere in a game, and they are the cheapest (performance-wise) boost to visual quality, almost free as long as you have enough VRAM. Volumetric clouds and other performance-intensive effects don't make anywhere near as much of a difference visually. Since most games are made for consoles, most effects look fine between medium and high, because that's how they were conceived by the game designers; ultra/epic is just added in the PC port to give high-end gear something to chew on. But textures are meant to be used at max, because they are free performance.

Hardware Unboxed used to make videos finding the best settings for games, balancing performance and visual quality. Most games look good starting from medium/high effects, except for textures, which should be at maximum.

Well-designed, good-looking games have good textures below the maximum texture setting.

RTGI will offer a higher visual boost than higher texture settings will.
 
Joined
Jan 14, 2019
Messages
12,353 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
RTGI will offer a higher visual boost than higher texture settings will.
Better lighting can make one shit one's pants in joy, but higher model quality and better textures make the overall image look more real. I prefer realism to eye candy.

Try playing a good-looking game with textures set to low. Observe the effect.
Textures are what you see everywhere in a game, and they are the cheapest (performance-wise) boost to visual quality, almost free as long as you have enough VRAM. Volumetric clouds and other performance-intensive effects don't make anywhere near as much of a difference visually. Since most games are made for consoles, most effects look fine between medium and high, because that's how they were conceived by the game designers; ultra/epic is just added in the PC port to give high-end gear something to chew on. But textures are meant to be used at max, because they are free performance.
Exactly! I can live with fewer light sources or non-volumetric fog, but I can't live with mushy textures.
 