
Possible Radeon RX 6700 XT Specs Surface, 12GB the New Mid-Range Memory Size?

Here are the MSRP prices at the time; I've also included inflation-adjusted prices in brackets.
The HD 5870 MSRP was $379 (inflation: $460), the Eyefinity version was $479 (inflation: $582), and the dual-GPU HD 5970 was $599 (inflation: $727).
The GTX 970 MSRP was $329 (inflation: $362), the GTX 980 cost $549 (inflation: $604), the GeForce GTX 980 Ti cost $649 (inflation: $714), and the GTX Titan X cost $999 (inflation: $1,099).
There were already graphics cards that cost more than $500, and that's without adding inflation to the price.
Wanting the TOP cards (RTX 3090 or RX 6900 XT) to sell for only $500 is insane.
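For reference, the bracketed figures are just each MSRP scaled by the ratio of consumer price indices. A minimal sketch of that arithmetic in Python, assuming approximate annual US CPI-U values (which may differ slightly from whatever index produced the numbers above):

```python
# Inflation-adjust an old MSRP by the ratio of CPI values.
def adjust_for_inflation(msrp, cpi_then, cpi_now):
    """Return a launch-year MSRP expressed in later-year dollars."""
    return msrp * cpi_now / cpi_then

# Example: the $329 GTX 970 (2014) in ~2020 dollars, using approximate
# annual-average US CPI-U values (2014: ~236.7, 2020: ~258.8).
print(round(adjust_for_inflation(329, 236.7, 258.8)))  # -> 360, close to the $362 quoted above
```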

I don't really care about MSRP, I only care how much I pay in the end, and I provided the prices I paid.

And the inflation argument that gets brought up all the time is BS and does not apply to every region.
 
I don't really care about MSRP, I only care how much I pay in the end, and I provided the prices I paid.

And the inflation argument that gets brought up all the time is BS and does not apply to every region.
And I don't really care what you paid for your card when I don't even know if you bought it at release or a year later, or if it was on discount at the time!

That inflation figure was for the USA, because both AMD and Nvidia are US companies, and it's not BS - show me a country where the chips or cards are designed and produced that didn't have inflation over the past 10 years!
Just making a 250mm2 chip at TSMC today costs a lot more than it did 5 or 10 years ago; just look at some info about the wafer cost of different processes. Link
HD 5870 -> 40nm and 334mm2 -> foundry sale per wafer: $2,274 -> cost per chip: 2274/111 (good chips per wafer) = $20.5
GTX 970 -> 28nm and 398mm2 for the full GM204 chip -> foundry sale per wafer: $2,981 -> cost per chip: 2981/93 (good chips per wafer) = $32.1
RX 5700 XT -> 7nm and 251mm2 -> foundry sale per wafer: $9,346 -> cost per chip: 9346/173 (good chips per wafer) = $54
The smallest chip costs by far the most to make from a wafer.
You can't blame the price increase on corporate greed alone.
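The per-chip figures are simply the wafer price divided by the number of good dice. A quick sketch reproducing the arithmetic above (the "good chips per wafer" counts already fold in die size and yield):

```python
# Cost per good die = wafer price / good dice per wafer.
# Figures taken from the post above; "good chips per wafer" already
# accounts for both die-per-wafer geometry and yield losses.
def cost_per_chip(wafer_price, good_dice):
    return wafer_price / good_dice

chips = {
    "HD 5870 (40nm, 334mm2)":   (2274, 111),
    "GTX 970 (28nm, 398mm2)":   (2981, 93),
    "RX 5700 XT (7nm, 251mm2)": (9346, 173),
}
for name, (price, dice) in chips.items():
    print(f"{name}: ${cost_per_chip(price, dice):.1f}")
# -> $20.5, $32.1 and $54.0 respectively
```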
 
For sure not JUST on that, but surely ALSO on that.
It's hard to tell, because I don't know what margins they sell each GPU at now versus what they sold at in the past.
I won't say every single GPU is or will be overpriced (RDNA2 and Ampere), but some certainly look like they are.
 

ARF

And I don't really care what you paid for your card when I don't even know if you bought it at release or a year later, or if it was on discount at the time!

That inflation figure was for the USA, because both AMD and Nvidia are US companies, and it's not BS - show me a country where the chips or cards are designed and produced that didn't have inflation over the past 10 years!
Just making a 250mm2 chip at TSMC today costs a lot more than it did 5 or 10 years ago; just look at some info about the wafer cost of different processes. Link
HD 5870 -> 40nm and 334mm2 -> foundry sale per wafer: $2,274 -> cost per chip: 2274/111 (good chips per wafer) = $20.5
GTX 970 -> 28nm and 398mm2 for the full GM204 chip -> foundry sale per wafer: $2,981 -> cost per chip: 2981/93 (good chips per wafer) = $32.1
RX 5700 XT -> 7nm and 251mm2 -> foundry sale per wafer: $9,346 -> cost per chip: 9346/173 (good chips per wafer) = $54
The smallest chip costs by far the most to make from a wafer.
You can't blame the price increase on corporate greed alone.

HD 5870 = $20.5 from $379 = 5.4% of the retail price
GTX 970 = $32.1 from $329 = 9.8% of the retail price
RX 5700 XT = $54 from $399 = 13.5% of the retail price

You are right - the individual die's contribution to the final consumer price keeps increasing, which means that if all the other components in the bill of materials cost the same as they did in 2010, the manufacturer's profit margin is smaller today.
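The share is just die cost over launch MSRP; the same arithmetic in a couple of lines:

```python
# Die cost as a share of launch MSRP, using the figures above.
for name, die_cost, msrp in [("HD 5870", 20.5, 379),
                             ("GTX 970", 32.1, 329),
                             ("RX 5700 XT", 54.0, 399)]:
    print(f"{name}: {die_cost / msrp:.1%} of the retail price")
# -> 5.4%, 9.8% and 13.5%
```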
 
Not too sure about this one... VRAM requirements are heavily exaggerated these days, and its RT performance will not be enough to sway 3060/Ti/70 buyers, especially for Cyberpunk. It will have to be shoved down at least one budget tier ($300-350) to duke it out with the 3050.
Exactly what I thought about the 6800/XT cards as well; I have no idea why they didn't release 8GB versions of those, and a 6GB version of this. It only works against them.

Imagine a considerably cheaper 6800 XT with 8GB of RAM up against Nvidia: most gamers play either 1080p high refresh or 1440p, 8GB is plenty, and the price drop would make it a vastly better product.
The same goes for this 6700 XT.
 

ARF

Exactly what I thought about the 6800/XT cards as well; I have no idea why they didn't release 8GB versions of those, and a 6GB version of this. It only works against them.

Imagine a considerably cheaper 6800 XT with 8GB of RAM up against Nvidia: most gamers play either 1080p high refresh or 1440p, 8GB is plenty, and the price drop would make it a vastly better product.
The same goes for this 6700 XT.

No, 8GB would make it a mid-range product - there are already titles that require 10GB or more, and with true next-gen games the VRAM requirements will only rise.

Nvidia has it wrong, because it sacrifices other things like texture resolution and makes games look terrible.
 
40CU means we're just getting another 5700 XT with some arch/clock improvements and raytracing. If that's the case, I'm expecting a $399 2080S rival with 192-bit RAM using 14Gbps GDDR6.

I'd like to be wrong about Navi 22 having only 40CU, but AMD has historically made very big cuts to its next-smallest silicon. The only hope that it's >40CU is that the XBSX has 52CU, and presumably that's not cut down from Navi 21's 80CU.
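For context, peak memory bandwidth follows directly from bus width and per-pin data rate; a quick check of what a 192-bit/14Gbps configuration would deliver (a speculated spec, not a confirmed one):

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
def peak_bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    """Return theoretical peak bandwidth in GB/s."""
    return bus_width_bits / 8 * gbps_per_pin

print(peak_bandwidth_gb_s(192, 14))  # -> 336.0 GB/s, vs 448 GB/s on the 256-bit 5700 XT
```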
 
You guys are wishful-thinking these prices and how product segmentation and competition work.
Right now the cheapest next-gen products sit at the $500-ish level. Next, Nvidia will launch the 3060 Ti at $399, so AMD will have to counter. So if the 6700 XT has 3060 Ti (2080 Super) performance, it will be priced around the $399 mark, probably a little less. Then both will launch $299 products, and then something around $200 - and there could be more tiers below that, as those are price-oriented customers.
Bottom line: both Nvidia and AMD will end up with products competing in every performance/price segment. They will not leave a $200 price gap between lines, as they would be losing a whole tier of sales.
This is how commerce and competition work in capitalism.

I suppose you automatically assume everyone's salaries also increase to compensate for the magic thing called "inflation".

I remember when there was none: everyone who worked a job could afford a home, everyone could afford a car, etc. I wonder who was behind this? *hand rubbing intensifies*
 
No, 8GB would make it a mid-range product - there are already titles that require 10GB or more, and with true next-gen games the VRAM requirements will only rise.

Nvidia has it wrong, because it sacrifices other things like texture resolution and makes games look terrible.
I disagree with that, but it doesn't even matter; what I asked for was simply the option at a lower price - not replacing the current 16GB ones, but offering both versions.
 
I suppose you automatically assume everyone's salaries also increase to compensate for the magic thing called "inflation".

I remember when there was none: everyone who worked a job could afford a home, everyone could afford a car, etc. I wonder who was behind this? *hand rubbing intensifies*

How on Earth did you assume that from my post? I was just coldly assessing the two-player GPU market and its product segmentation.
I do not condone the prices that Nvidia and AMD are asking - but the blame lies with the people who pay them. You can see it in the scalper situation: it exists because people are willing to pay even more than the absurd launch prices.
 
So technically, it's a 5700 (XT) with moderate raytracing support. I guess it was a good idea to buy a 5700 XT last week, despite my issues with its heat output.
 
So technically, it's a 5700 (XT) with moderate raytracing support. I guess it was a good idea to buy a 5700 XT last week, despite my issues with its heat output.
An overclocked 5700XT with potentially 64-128MB of cache. Until we get a 6700XT and a 5700XT and clock them the same, nobody outside of AMD will know for sure what the architectural improvement between RDNA1 and RDNA2 really is.
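A minimal sketch of the clock-for-clock comparison described above, with hypothetical scores purely to show the normalisation (no real 6700 XT numbers exist yet):

```python
# Estimate the per-clock ("IPC") uplift by locking both cards to the same
# clock and comparing scores. All numbers below are hypothetical.
def per_clock_uplift(score_new, score_old, clock_new_mhz, clock_old_mhz):
    """Return the fractional performance gain after normalising for clock speed."""
    return (score_new / clock_new_mhz) / (score_old / clock_old_mhz) - 1

# e.g. both locked to 1800 MHz, 6700 XT scoring 105 vs the 5700 XT's 100:
print(f"{per_clock_uplift(105, 100, 1800, 1800):.1%}")  # -> 5.0% architectural gain
```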
 
An overclocked 5700XT with potentially 64-128MB of cache. Until we get a 6700XT and a 5700XT and clock them the same, nobody outside of AMD will know for sure what the architectural improvement between RDNA1 and RDNA2 really is.
Worst case scenario: if it only has the same IPC, but runs at higher clocks, even that's an achievement, I guess.

I'm also curious about the actual power consumption / heat output figures, as I think that's the main area where the 5700 series needs some serious improvement (mine especially).
 
Now THIS is what I'm talkin' about. I'm excited. Hope it turns out to be true AND affordable/readily available...

Yeah good luck with that :D

No, 8GB would make it a mid-range product - there are already titles that require 10GB or more, and with true next-gen games the VRAM requirements will only rise.

Nvidia has it wrong, because it sacrifices other things like texture resolution and makes games look terrible.

Lmao, no there aren't.

AC Valhalla uses 6GB at 4K completely maxed out.

Next-gen consoles don't even have 10GB for the GPU; they have 16GB shared, and much of it is used by the system and background/OS, meaning 4-8GB tops will be used for the GPU.

The 6700 XT is not a 4K-capable card or anywhere close; it's a 1440p card at best, so 12GB of VRAM will be pointless in 99.9% of games anyway - the GPU is too weak to max settings out, and that lowers VRAM usage.
 
Yeah good luck with that :D



Lmao, no there aren't.

AC Valhalla uses 6GB at 4K completely maxed out.

Next-gen consoles don't even have 10GB for the GPU; they have 16GB shared, and much of it is used by the system and background/OS, meaning 4-8GB tops will be used for the GPU.

The 6700 XT is not a 4K-capable card or anywhere close; it's a 1440p card at best, so 12GB of VRAM will be pointless in 99.9% of games anyway - the GPU is too weak to max settings out, and that lowers VRAM usage.
More people play on 1080p screens, many have moved up to the better 1440p, and a very small niche plays at 4K.
I think AMD made a mistake in not launching the RX 6700 XT (the 1440p mainstream GPU) alongside the 6800 XT and 6900 XT. If AMD wanted only three GPUs at launch, they should have held back the 6800 non-XT, because Nvidia is eating the mainstream market and AMD has no GPUs available to appease it. A major miscalculation.
 
Worst case scenario: if it only has the same IPC, but runs at higher clocks, even that's an achievement, I guess.

I'm also curious about the actual power consumption / heat output figures, as I think that's the main area where the 5700 series needs some serious improvement (mine especially).
Oof, I missed this from November. I think you just need to undervolt. My 5700 XT managed to drop 40W for a 50MHz (2.5%) clock reduction, and at that power draw it's a pretty good performance-per-watt match for the 2060S I had, too.

I've saved a few clock/voltage profiles for my 5700 XT, but I commonly run at a near-silent 1666MHz/914mV, which uses 115W for the GPU core, so I'm guessing the total board power is 145-150W or so - that's 75W less than stock. I've probably lost 15% performance, but it doesn't bother me, as I'm usually capped by my TV's vsync rather than by the GPU.

AMD's preference, reflected in its guidelines to partners, is that they don't care about cool or quiet - every GPU they've spat out in the last decade has been clocked to within an inch of its life, right at the ugly steep end of the voltage curve. Most samples don't need anything like that voltage anyway, and the voltage curve is so steep at AMD's recommended clocks that even a 5% clock reduction can bring absolutely massive reductions in power consumption.
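That steep end of the curve follows from dynamic power scaling roughly with frequency times voltage squared. A rough first-order sketch (illustrative operating points, leakage ignored):

```python
# First-order dynamic power model: P is proportional to f * V^2 (leakage ignored).
def relative_power(f_new_mhz, v_new, f_old_mhz, v_old):
    """Power at the new operating point relative to the old one."""
    return (f_new_mhz * v_new**2) / (f_old_mhz * v_old**2)

# e.g. 1666 MHz at 914 mV vs an illustrative ~1900 MHz at 1100 mV stock point:
print(f"{relative_power(1666, 0.914, 1900, 1.100):.0%}")  # -> ~61% of stock core power
```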

More people play on 1080p screens, many have moved up to the better 1440p, and a very small niche plays at 4K.
I think AMD made a mistake in not launching the RX 6700 XT (the 1440p mainstream GPU) alongside the 6800 XT and 6900 XT. If AMD wanted only three GPUs at launch, they should have held back the 6800 non-XT, because Nvidia is eating the mainstream market and AMD has no GPUs available to appease it. A major miscalculation.
You seem to be mistaken about how companies operate. They are not trying to make the best graphics card for us; they are trying to make the most dollars for themselves.

The profit margins on a vanilla 6800 are probably three times the margins on a 6700-series card. AMD pays TSMC for a wafer and can get, say, 100 Big Navi dies out of it, or 150 Little Navi dies. Except AMD can sell Big Navi to partners for $300 to go on $100 PCBs with $25 coolers, or it can sell Little Navi to partners for $100 to go on $75 PCBs with $15 coolers. What would you do if you were AMD?

Just be thankful we're getting GPUs at all - AMD can get about three 5950Xs from a wafer for every Big Navi, and those sell for way more money and don't require the same cost of PCB or cooling hardware either.
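A back-of-the-envelope version of that trade-off, using the made-up numbers from the paragraphs above (none of these are real AMD figures):

```python
# Rough wafer-allocation comparison using the illustrative numbers above:
# dies per wafer and AMD's hypothetical sell price per die to partners.
wafer_options = {
    "Big Navi":    {"dies": 100, "asp": 300},
    "Little Navi": {"dies": 150, "asp": 100},
}
for name, o in wafer_options.items():
    print(f"{name}: ~${o['dies'] * o['asp']:,} revenue per wafer")
# -> Big Navi ~$30,000 vs Little Navi ~$15,000: same wafer, half the revenue.
```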
 
Oof, I missed this from November. I think you just need to undervolt. My 5700 XT managed to drop 40W for a 50MHz (2.5%) clock reduction, and at that power draw it's a pretty good performance-per-watt match for the 2060S I had, too.

I've saved a few clock/voltage profiles for my 5700 XT, but I commonly run at a near-silent 1666MHz/914mV, which uses 115W for the GPU core, so I'm guessing the total board power is 145-150W or so - that's 75W less than stock. I've probably lost 15% performance, but it doesn't bother me, as I'm usually capped by my TV's vsync rather than by the GPU.

AMD's preference, reflected in its guidelines to partners, is that they don't care about cool or quiet - every GPU they've spat out in the last decade has been clocked to within an inch of its life, right at the ugly steep end of the voltage curve. Most samples don't need anything like that voltage anyway, and the voltage curve is so steep at AMD's recommended clocks that even a 5% clock reduction can bring absolutely massive reductions in power consumption.
To be honest, with a -25% power target I only lose about 7% performance, which is barely noticeable. The fan curve is a bigger issue: it is so relaxed that I get the same temperatures regardless of clock speed or power consumption. Things would probably be much better with a custom curve - if only I didn't use all my PC time to play games. :rolleyes::D
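A custom curve is just a temperature-to-duty mapping with interpolation between set points; a minimal sketch of the idea (the points are arbitrary examples, not tuned recommendations):

```python
# Piecewise-linear fan curve: map GPU temperature (C) to fan duty (%).
# The points below are arbitrary examples, not tuned recommendations.
CURVE = [(40, 20), (60, 35), (75, 55), (90, 80), (105, 100)]

def fan_duty(temp_c):
    """Linearly interpolate the fan duty between curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

print(round(fan_duty(82), 1))  # -> 66.7 (% duty at 82C)
```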
 
To be honest, with a -25% power target I only lose about 7% performance, which is barely noticeable. The fan curve is a bigger issue: it is so relaxed that I get the same temperatures regardless of clock speed or power consumption. Things would probably be much better with a custom curve - if only I didn't use all my PC time to play games. :rolleyes::D
I mean, if you are already underclocking, just let the card run hot. The silicon is rated for 105C (hot spot) and I let mine run at 90C+.

Having a cool card in the 60-70C range is nice if you're aiming for max performance, as the boost algorithms will see thermal headroom and use it to boost more. If your GPU isn't going to boost higher because of manually reduced power limits, let it run up to its rated temperature and enjoy the slower fans.
 
I mean, if you are already underclocking, just let the card run hot. The silicon is rated for 105C (hot spot) and I let mine run at 90C+.

Having a cool card in the 60-70C range is nice if you're aiming for max performance, as the boost algorithms will see thermal headroom and use it to boost more. If your GPU isn't going to boost higher because of manually reduced power limits, let it run up to its rated temperature and enjoy the slower fans.
The GPU isn't an issue. It runs in the low-to-mid 70s (hot spot never exceeding 95C) even on default settings. I'm more worried about the VRAM, which can get above 90C after some extended gaming.
 
The GPU isn't an issue. It runs in the low-to-mid 70s (hot spot never exceeding 95C) even on default settings. I'm more worried about the VRAM, which can get above 90C after some extended gaming.
That's absolutely fine. 106C is the temperature target for the VRAM, just like 105C is the temperature target for the hotspot. If it's not running that hot, you're either leaving performance on the table or running the fans louder than they need to be.

Nothing wrong with either of those, but I'm just saying that if the noise bothers you, there's no need to worry about the temps you're talking about. It took me a few months with Navi to get used to temperatures 20C higher than we were used to seeing from Nvidia or previous-gen AMD, but those cards also ran this hot - they just didn't report it.
 
That's absolutely fine. 106C is the temperature target for the VRAM, just like 105C is the temperature target for the hotspot. If it's not running that hot, you're either leaving performance on the table or running the fans louder than they need to be.

Nothing wrong with either of those, but I'm just saying that if the noise bothers you, there's no need to worry about the temps you're talking about. It took me a few months with Navi to get used to temperatures 20C higher than we were used to seeing from Nvidia or previous-gen AMD, but those cards also ran this hot - they just didn't report it.
I thought 95C was the max rated temperature for GDDR6 chips. In that case, I really have nothing to complain about with the "dreaded" Asus Strix 5700 XT. :)
 
Oof, I missed this from November. I think you just need to undervolt. My 5700 XT managed to drop 40W for a 50MHz (2.5%) clock reduction, and at that power draw it's a pretty good performance-per-watt match for the 2060S I had, too.

I've saved a few clock/voltage profiles for my 5700 XT, but I commonly run at a near-silent 1666MHz/914mV, which uses 115W for the GPU core, so I'm guessing the total board power is 145-150W or so - that's 75W less than stock. I've probably lost 15% performance, but it doesn't bother me, as I'm usually capped by my TV's vsync rather than by the GPU.

AMD's preference, reflected in its guidelines to partners, is that they don't care about cool or quiet - every GPU they've spat out in the last decade has been clocked to within an inch of its life, right at the ugly steep end of the voltage curve. Most samples don't need anything like that voltage anyway, and the voltage curve is so steep at AMD's recommended clocks that even a 5% clock reduction can bring absolutely massive reductions in power consumption.

You seem to be mistaken about how companies operate. They are not trying to make the best graphics card for us; they are trying to make the most dollars for themselves.

The profit margins on a vanilla 6800 are probably three times the margins on a 6700-series card. AMD pays TSMC for a wafer and can get, say, 100 Big Navi dies out of it, or 150 Little Navi dies. Except AMD can sell Big Navi to partners for $300 to go on $100 PCBs with $25 coolers, or it can sell Little Navi to partners for $100 to go on $75 PCBs with $15 coolers. What would you do if you were AMD?

Just be thankful we're getting GPUs at all - AMD can get about three 5950Xs from a wafer for every Big Navi, and those sell for way more money and don't require the same cost of PCB or cooling hardware either.
I'll repeat myself: AMD made a calculated mistake in neglecting the mainstream market for both CPUs and graphics. All they had to do was launch one 8-core/16-thread Ryzen 7 5700X mainstream CPU and one RX 6700 XT mainstream GPU to appease that segment. Instead they opened it up to Intel and Nvidia. The 6-core/12-thread Ryzen 5 doesn't count in this scenario.

AMD would have been better off releasing the RX 6800 XT, the RX 6900 XT, and an RX 6700 XT for the mainstream market. Right now they've left that segment vulnerable, and Nvidia is eating it right up with its entire RTX 3060 lineup. Mainstream is where you grow market share and a positive market image. Anyhow, seeing how Sony & MS have 80% of AMD's 7nm capacity, that explains why mobile chips, RDNA2 graphics cards, and Zen3 CPUs are rather difficult to find. All three are fighting over less than 20% of that capacity.
 
I'll repeat myself: AMD made a calculated mistake in neglecting the mainstream market for both CPUs and graphics. All they had to do was launch one 8-core/16-thread Ryzen 7 5700X mainstream CPU and one RX 6700 XT mainstream GPU to appease that segment. Instead they opened it up to Intel and Nvidia. The 6-core/12-thread Ryzen 5 doesn't count in this scenario.

AMD would have been better off releasing the RX 6800 XT, the RX 6900 XT, and an RX 6700 XT for the mainstream market. Right now they've left that segment vulnerable, and Nvidia is eating it right up with its entire RTX 3060 lineup. Mainstream is where you grow market share and a positive market image. Anyhow, seeing how Sony & MS have 80% of AMD's 7nm capacity, that explains why mobile chips, RDNA2 graphics cards, and Zen3 CPUs are rather difficult to find. All three are fighting over less than 20% of that capacity.
They haven't neglected the mainstream market - they're clearing inventories of Zen2. Can't buy a Ryzen 7 5700X because it doesn't exist yet? Buy a 3700X until something better comes along. The discounts are deep, and it's not like this is new behaviour - AMD, Intel, and Nvidia have been following this pattern pretty solidly for almost 20 years.

Why should they appease the low-margin mainstream segment? You can't buy a product that competes with Zen3 yet, and old Zen2 inventory is doing just fine against 10th-gen Intel. As for mainstream GPUs, it's not like you can buy anything faster than a vanilla 1650 for reasonable money right now, no matter what segment it's released in, so asking AMD to sacrifice the huge margins on high-end CPUs and GPUs is basically begging for charity.

When they launch midrange first, it's the exception to the rule: sacrificing profits to regain market share, spurred on by fierce competition. At the moment AMD has the highest market share they've had in ages, perhaps ever, and the weakest competition. They don't need to compromise their profits by selling premium-cost, constrained TSMC output at low-margin mainstream prices, especially when they have Zen2 inventory they've already paid for just sitting in warehouses waiting to be turned into revenue.

Capitalism is what brought us CPU progress, and you can't have it both ways: for nice things to come to us at great prices, companies have to be competitive and profitable. Right now AMD is instantly selling everything they make at pretty much whatever price they ask. That is their reward for pulling ahead of the competition. It is the foundation of free-market capitalism, and yes, for plebs like us it temporarily sucks - but it's the natural course of things, powered by fundamental rules of economics. It's happened before and it'll happen again; you just have to look at the 30-year history of the x86 home PC to see this is nothing new and nothing special.
 