
ASUS GeForce RTX 4060 Ti Dual OC

Joined
Feb 20, 2019
Messages
8,118 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
What discrete GPU are you using atm? It's not listed in your system specs.
"I love my 6800XT but it's power hungry", referring to a discussion between you and @bug about the 6800 XT's 139 W higher power consumption;

I thought that was pretty unambiguous ;)

Edit:
I should probably update my system specs, but I have something like five ever-evolving main PCs at three locations at any point in time, and as an SI and datacenter architect with a rack of hardware to set up VTEs on, I'm not sure which one to put in the spec. I guess I could just put "yes" for every field. Whatever it is, if I don't currently have one in my possession, I can probably get my hands on one within an hour.
 
Last edited:

bug

Joined
May 22, 2015
Messages
13,682 (3.98/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Fair comment, when the card drops to £250.
Maybe upcoming cards from AMD will put pressure on Nvidia to cut prices on these. Though going by some of what I've read, Nvidia makes only as few GPUs as it can without getting in trouble with TSMC, and is happy to stockpile them and trickle a few units into the channel to keep prices up... They probably make enough money from the AI business that they can just ignore gaming for a year or two.
 
Joined
Nov 16, 2020
Messages
43 (0.03/day)
Processor i7-11700F, undervolt 3.6GHz 0.96V
Motherboard ASUS TUF GAMING B560-PLUS WIFI
Cooling Cooler Master Hyper 212 Black Edition, 1x12cm case FAN
Memory 2x16GB DDR4 3200MHz CL16, Kingston FURY (KF432C16BBK2/32)
Video Card(s) GeForce RTX 2060 SUPER, ASUS DUAL O8G EVO V2, 70%, +120MHz core
Storage Crucial MX500 250GB, Crucial MX500 500GB, Seagate Barracuda 2.5" 2TB
Display(s) DELL P2417H
Case Fractal Design Focus G Black
Power Supply 550W, 80+ Gold, SilentiumPC Supremo M2 SPC140 rev 1.2
Mouse E-BLUE Silenz
Keyboard Genius KB-110X
@Chrispy_ I didn't test 30/40-series cards, but I think the boost logic is the same.
Can you confirm or test this scenario? Set the power limit to 70%, increase the core clock to the maximum (just below the level where the GPU freezes or crashes), start video encoding (a low-power task) and check the max boost clock at the start of encoding and again at the end, when the temperature is highest. Also set the fan to 0 rpm to make sure the temperature keeps rising. If the clocks are +/- the same at 40°C and at 80°C, then the gen-20 boost logic is gone; otherwise it is still there.
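If anyone wants to run this experiment, here's a minimal sketch of how the resulting log could be evaluated, assuming clocks and temperature are captured once per second with `nvidia-smi --query-gpu=clocks.gr,temperature.gpu --format=csv,noheader,nounits -l 1`; the 45°C/75°C cutoffs and the sample figures below are arbitrary illustrations, not measurements:

```python
# Compare the average boost clock in the cool part of a run vs the hot part.
# Log format: one "clock_mhz, temp_c" line per sample (nvidia-smi CSV output).
def parse(log_text):
    rows = [line.split(",") for line in log_text.strip().splitlines()]
    return [(int(clock), int(temp)) for clock, temp in rows]

def boost_drop(samples, cool_below=45, hot_above=75):
    """Return MHz lost between the cool and hot phases of the run."""
    cool = [clock for clock, temp in samples if temp <= cool_below]
    hot = [clock for clock, temp in samples if temp >= hot_above]
    if not cool or not hot:
        raise ValueError("need samples in both temperature ranges")
    avg = lambda xs: sum(xs) / len(xs)
    return avg(cool) - avg(hot)

# Made-up numbers resembling a gen-20 card losing boost bins as it heats up:
log = "2130, 40\n2130, 42\n2085, 62\n1995, 76\n1965, 80\n"
print(boost_drop(parse(log)))  # 150.0 -> clocks fell as temperature rose
```

A result near zero at matched load would suggest the temperature-stepped boost behaviour is gone; a consistently positive drop suggests it's still there.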
 
Joined
Feb 20, 2019
Messages
8,118 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
@Chrispy_ I didn't test 30/40-series cards, but I think the boost logic is the same.
Can you confirm or test this scenario? Set the power limit to 70%, increase the core clock to the maximum (just below the level where the GPU freezes or crashes), start video encoding (a low-power task) and check the max boost clock at the start of encoding and again at the end, when the temperature is highest. Also set the fan to 0 rpm to make sure the temperature keeps rising. If the clocks are +/- the same at 40°C and at 80°C, then the gen-20 boost logic is gone; otherwise it is still there.
There is a tiny amount of truth to "lower temperatures result in higher clocks": when the power limit is the bottleneck, the silicon's leakage increases with temperature, so knocking 20°C off the temperature will contribute a tiny amount to the clockspeed by improving efficiency. We're talking margin of error though, and it's insignificant compared to the clock headroom an undervolt will achieve within the same power limit, if the power limit is even the reason. For these 4060 Ti cards, it always seems to be VRel.

I'm not sure what exactly your test is trying to prove, but to support my point about temperature not affecting boost, here's a 4060 Ti 16GB at a 59°C hotspot with very high fan speeds, after running long enough for all the temperatures to level off. You can see that it's pegged at max clocks; the limiting factor is the voltage reliability limit (VRel) at 81% TDP.

1691758416619.png


Then I turned the fans down to minimum (30%) and covered up the case intake to cook the GPU, getting it as close as possible to the 83°C temperature threshold that reduces boost clocks without going over it. The result? Still VRel, still max clocks, no change despite a 23°C increase in temperature right up against the 83°C target. Power consumption has gone up by a tiny 1.2%, likely because of the silicon leakage at higher temperatures I just mentioned. So yes, at 100% TDP, where you hit a power limit rather than a voltage limit, you might see 0.25-0.5% clockspeed improvements from the almost negligibly small efficiency improvement at lower temperatures, but that's completely insignificant compared to the order-of-magnitude larger gains from an undervolt, which can give you 5%+ clock speed gains when power limited.

1691759502091.png


The 40-series is crippled by power and voltage limits, not temperature. It's as fast at <83°C as it is at any lower temperature, and I'll continue to believe that until someone posts hard evidence proving otherwise. (I'm not saying such evidence doesn't exist, but with my limited 40-series experience of only 3 different models, I haven't seen all the options yet, especially not any non-FE flagship 4090s like the Strix or SUPRIM.)
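For what it's worth, the 0.25-0.5% estimate above is consistent with a crude scaling model: if dynamic power goes roughly as f·V² and voltage rises roughly linearly with frequency along the V/F curve, then P ~ f³, and a 1.2% power saving from running cooler buys only a fraction of a percent of clock under a hard power limit. This is a back-of-envelope sketch, not Nvidia's actual boost algorithm:

```python
# Back-of-envelope: clock headroom bought by a 1.2% power saving under a
# hard power limit, assuming P ~ f^3 (dynamic power ~ f * V^2 with V
# roughly proportional to f along the V/F curve). Crude model only.
power_saving = 0.012                       # the 1.2% measured above
clock_gain = (1 + power_saving) ** (1 / 3) - 1
print(f"{clock_gain:.2%}")                 # ~0.40%
```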
 
Last edited:
Joined
Nov 16, 2020
Messages
43 (0.03/day)
Processor i7-11700F, undervolt 3.6GHz 0.96V
Motherboard ASUS TUF GAMING B560-PLUS WIFI
Cooling Cooler Master Hyper 212 Black Edition, 1x12cm case FAN
Memory 2x16GB DDR4 3200MHz CL16, Kingston FURY (KF432C16BBK2/32)
Video Card(s) GeForce RTX 2060 SUPER, ASUS DUAL O8G EVO V2, 70%, +120MHz core
Storage Crucial MX500 250GB, Crucial MX500 500GB, Seagate Barracuda 2.5" 2TB
Display(s) DELL P2417H
Case Fractal Design Focus G Black
Power Supply 550W, 80+ Gold, SilentiumPC Supremo M2 SPC140 rev 1.2
Mouse E-BLUE Silenz
Keyboard Genius KB-110X
Power hungry is better than VRAM limited. You definitely don't want any 8GB card
Yes, that's why we have the 16GB variant. 16GB + frame gen is the perfect combo; there is no other card at this price. But if frame gen means nothing to you, then the Nvidia 40-series cards are not for you. The same applies to power consumption: if you don't care about it, AMD cards are best for you.
It will be interesting to see the HW surveys from 2026-2030.

There is a tiny amount of truth to "lower temperatures result in higher clocks", since power limits are the bottleneck and the silicon's leakage increases with temperature, so knocking 20°C off the temperature will contribute a tiny amount to the clockspeed. We're talking margin of error though, and it's insignificant compared to the clock headroom an undervolt will achieve within the same power limit.
Yes, but the differences on my 2060 Super are bigger, ~1950-2130 MHz (so not margin of error), and the first drop I see at exactly ~60-61°C. So it would be interesting to see how it behaves on gen-30/40 cards... if somebody can test this I would be glad.

Fair comment, but the 4060/Ti isn't much of an upgrade over the previous gen, due to the willfully stunted specs.
Can you imagine what would happen if the gen-40 cards had had better specs (memory bus, core configuration) and, on top of that, frame gen at the prices users wanted? It would be devastating for AMD.
If frame gen didn't exist it would be another story, but fortunately it exists, and it is a good step forward, especially when RT is the future. It is the fastest way to massively increase resolution and framerate; it would not be possible just by increasing raw perf... even on TSMC 4. The players are eager :) ... big monitors, high framerates, ultra settings, ... everything... and better today, not tomorrow.
 
Joined
Feb 20, 2019
Messages
8,118 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Yes, but the differences on my 2060 Super are bigger, ~1950-2130 MHz (so not margin of error), and the first drop I see at exactly ~60-61°C. So it would be interesting to see how it behaves on gen-30/40 cards... if somebody can test this I would be glad.
I got you; I've edited my previous post with tests 23°C apart.
It's not exactly what you suggested, but it does test boost clocks at different temperatures.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.71/day)
Location
Ex-usa | slava the trolls
Yes, therefore we have 16GB variant. 16GB + frame gen is perfect combo. There is no other card for this price.

I am not convinced that this GPU works properly with 16 GB. The reviews showed a 1% performance difference against the 8GB variant, while the RX 6800 16GB is 22% faster.
Your choice is the Radeon RX 6800 16GB; use the AMD Software settings for further adjustments.
 
Joined
Feb 18, 2005
Messages
5,701 (0.79/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
I am not convinced that this GPU works properly with 16 GB. The reviews showed a 1% performance difference against the 8GB variant, while the RX 6800 16GB is 22% faster.
BECAUSE THE NVIDIA GPU DOESN'T NEED AS MUCH MEMORY AS THE AMD ONES. Which is why all this crying about "8GB isn't enough" is just that: crying.
 
Joined
Dec 31, 2020
Messages
952 (0.69/day)
Except this could manifest itself more in the future. It looks broken here; however, there's no problem at 1440p.

1691765751010.png


Can you imagine what would happen if the gen-40 cards had had better specs (memory bus, core configuration) and, on top of that, frame gen at the prices users wanted? It would be devastating for AMD.

All it would take to add 64 bits is 12 mm²: 12 GB, 4608 CUDA cores and a 16% performance uplift. But we don't live in that universe. Next in line is the 4070, and it just needs to drop to $499, and it will.
 
Last edited:

bug

Joined
May 22, 2015
Messages
13,682 (3.98/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Except this could manifest itself more in the future. It looks broken here; however, there's no problem at 1440p.

View attachment 308543



All it would take to add 64 bits is 12 mm²: 12 GB, 4608 CUDA cores and a 16% performance uplift. But we don't live in that universe. Next in line is the 4070, and it just needs to drop to $499, and it will.
Not just 12 mm². It also needs more traces on the PCB, which drives the price up. Still, a 128-bit bus for $400 is an insult, cache or no cache.
 
Joined
Nov 26, 2021
Messages
1,574 (1.49/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Not just 12 mm². It also needs more traces on the PCB, which drives the price up. Still, a 128-bit bus for $400 is an insult, cache or no cache.
The extra memory devices are a bigger concern than the PCB. PCBs are cheap at the scale of ASUS or MSI.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.71/day)
Location
Ex-usa | slava the trolls
Can you imagine what would happen if gen 40 cards have had better specs(mem bus, core configuration) and on top of that frame gen at the user wanted prices?? it would be devastating for AMD.

AMD could have just put the RX 7600 in the right box, label it where it belongs as the RX 7400 XT, and call it a day.
Lower prices and lower margins per item, but more sales, and the revenue inflow would be pretty much the same.

Today the situation is bad; it's as if the scalping from the mining craze has returned: absolutely unjustified prices and labels for the cards.
 
Joined
Aug 21, 2015
Messages
1,716 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marathon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
AMD could have just put the RX 7600 in the right box, label it where it belongs as the RX 7400 XT, and call it a day.
Lower prices and lower margins per item, but more sales, and the revenue inflow would be pretty much the same.

Today the situation is bad; it's as if the scalping from the mining craze has returned: absolutely unjustified prices and labels for the cards.

The RX 6400 averaged 38 fps at launch at 1080p. The 7600 manages the same average at 4K. In what universe can they even be remotely considered the same class?
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.71/day)
Location
Ex-usa | slava the trolls
The RX 6400 averaged 38 fps at launch at 1080p. The 7600 manages the same average at 4K. In what universe can they even be remotely considered the same class?

In the universe of the truth:

RX 6400: RDNA 2 generation, 2020-2022, 6 nm process, 5.4 billion transistors, 107 mm²
RX 7400 XT: RDNA 3 generation, 2023, 4-5 nm process, 13.3 billion transistors, 204 mm² (normalised to ~142 mm² for the half-node die shrink from 7/6 to 5/4)

Since AMD lags badly in this market segment, it needs to offer more - a jump similar to the RX 5700 XT -> RX 6900 XT.
 
Joined
Aug 21, 2015
Messages
1,716 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marathon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
In the universe of the truth:

RX 6400: RDNA 2 generation, 2020-2022, 6 nm process, 5.4 billion transistors, 107 mm²
RX 7400 XT: RDNA 3 generation, 2023, 4-5 nm process, 13.3 billion transistors, 204 mm² (normalised to ~142 mm² for the half-node die shrink from 7/6 to 5/4)

Since AMD lags badly in this market segment, it needs to offer more - a jump similar to the RX 5700 XT -> RX 6900 XT.

I don't know why you keep throwing out transistor counts and die sizes. Product segment is driven by performance and, to a lesser extent, power envelope. 6-series should fall between 120-150W and push excellent performance at 1080p and be decent at 1440p. The 7600 does this. The 6400 did not.

Supporting your point by comparing the 5700 XT to the 6900 XT is useful, but not in the way you think. The former, as a 7-segment part, was meant for and achieved solid 1440p and passable 4K at an almost-appropriate (for AMD) 225W and an intended $400. The 6900 XT was a legit 4K chip, ran a 300W TDP, and launched with AMD asking a cool grand.

So, we've got (using the 6900 XT's current USD800 street price),
6400 --> 7600: 2.8X performance, 3X power, 1.7X price
5700 XT --> 6900 XT: 2X performance, 1.3X power, 2X price

Other than efficiency, it sure looks like the 7600's actually the better improvement.
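Those ratios can be reproduced with a small helper; the performance numbers are the indexed values from this post, while the wattages and launch prices are the commonly cited figures (53 W/$159 for the 6400, 165 W/$269 for the 7600), so treat them as approximate:

```python
# Gen-on-gen uplift ratios from rough perf/power/price figures.
# perf is indexed to the older card; power in watts, price in USD.
def uplift(old, new):
    return {key: round(new[key] / old[key], 1) for key in old}

rx6400   = {"perf": 1.0, "power": 53,  "price": 159}
rx7600   = {"perf": 2.8, "power": 165, "price": 269}
rx5700xt = {"perf": 1.0, "power": 225, "price": 400}
rx6900xt = {"perf": 2.0, "power": 300, "price": 800}

print(uplift(rx6400, rx7600))     # {'perf': 2.8, 'power': 3.1, 'price': 1.7}
print(uplift(rx5700xt, rx6900xt)) # {'perf': 2.0, 'power': 1.3, 'price': 2.0}
```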
 

bug

Joined
May 22, 2015
Messages
13,682 (3.98/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
The extra memory devices are a bigger concern than the PCB. PCBs are cheap at the scale of ASUS or MSI.
It depends. If you can add traces without increasing the number of layers, you're fine. But if you have to add another layer, that usually hurts.
But the argument stands: for $400 I expect more than a capable GPU hamstrung in every possible way.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.71/day)
Location
Ex-usa | slava the trolls
I don't know why you keep throwing out transistor counts and die sizes. Product segment is driven by performance and, to a lesser extent, power envelope. 6-series should fall between 120-150W and push excellent performance at 1080p and be decent at 1440p. The 7600 does this. The 6400 did not.

Supporting your point by comparing the 5700 XT to the 6900 XT is useful, but not in the way you think. The former, as a 7-segment part, was meant for and achieved solid 1440p and passable 4K at an almost-appropriate (for AMD) 225W and an intended $400. The 6900 XT was a legit 4K chip, ran a 300W TDP, and launched with AMD asking a cool grand.

So, we've got (using the 6900 XT's current USD800 street price),
6400 --> 7600: 2.8X performance, 3X power, 1.7X price
5700 XT --> 6900 XT: 2X performance, 1.3X power, 2X price

Other than efficiency, it sure looks like the 7600's actually the better improvement.

Then look at the Radeon RX 5600 XT -> RX 6600 XT -> RX 7600 progression to understand the issue I am trying to explain.
The issue is that AMD doesn't offer the progress expected and demanded by gamers:
RX 5600 XT = 100%, January 2020, $280
RX 6600 XT = 128%, August 2021, $380
RX 7600 = 141%, May 2023, $270

You got 41% higher performance for $10 less, 3 years and 4 months later.
 

bug

Joined
May 22, 2015
Messages
13,682 (3.98/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Then look at the Radeon RX 5600 XT -> RX 6600 XT -> RX 7600 progression to understand the issue I am trying to explain.
The issue is that AMD doesn't offer the progress expected and demanded by gamers:
RX 5600 XT = 100%, January 2020, $280
RX 6600 XT = 128%, August 2021, $380
RX 7600 = 141%, May 2023, $270

You got 41% higher performance for $10 less, 3 years and 4 months later.
It's not AMD's problem that gamers expect and demand things as if prices for everything hadn't gone through the roof between January 2020 and May 2023.
 
Joined
Aug 21, 2015
Messages
1,716 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marathon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
Then look at the Radeon RX 5600 XT -> RX 6600 XT -> RX 7600 progression to understand the issue I am trying to explain.
The issue is that AMD doesn't offer the progress expected and demanded by gamers:
RX 5600 XT = 100%, January 2020, $280
RX 6600 XT = 128%, August 2021, $380
RX 7600 = 141%, May 2023, $270

You got 41% higher performance for $10 less, 3 years and 4 months later.

Averaged out, that's... pretty standard: +20% or so gen-on-gen on an 18-month release cadence. Yeah, Navi 3 has been pretty disappointing relative to Navi 2 given how far Navi 2 blew past Navi 1, and it'd be nice if the 7600 cost $30-50 less than it does, but we can't exactly expect a return to pre-COVID pricing all at once.
 
  • Like
Reactions: bug
Joined
Nov 16, 2020
Messages
43 (0.03/day)
Processor i7-11700F, undervolt 3.6GHz 0.96V
Motherboard ASUS TUF GAMING B560-PLUS WIFI
Cooling Cooler Master Hyper 212 Black Edition, 1x12cm case FAN
Memory 2x16GB DDR4 3200MHz CL16, Kingston FURY (KF432C16BBK2/32)
Video Card(s) GeForce RTX 2060 SUPER, ASUS DUAL O8G EVO V2, 70%, +120MHz core
Storage Crucial MX500 250GB, Crucial MX500 500GB, Seagate Barracuda 2.5" 2TB
Display(s) DELL P2417H
Case Fractal Design Focus G Black
Power Supply 550W, 80+ Gold, SilentiumPC Supremo M2 SPC140 rev 1.2
Mouse E-BLUE Silenz
Keyboard Genius KB-110X
@Chrispy_ OK, thanks for testing. What was the voltage during VRel? Will the frequencies also be rock stable if you OC the core to ~2900 MHz like in the review, if no VRel appears?

I am not convinced that this GPU works properly with 16 GB. The reviews showed a 1% performance difference against the 8GB variant, while the RX 6800 16GB is 22% faster.
Your choice is the Radeon RX 6800 16GB; use the AMD Software settings for further adjustments.
The 16GB version should work exactly the same as the 8GB version; VRAM is just fast storage for the game.
The RX 6800 is faster because it has double the ROP count (ROPs are responsible for finalising everything computed during the rendering process) and, of course, more memory bandwidth. The rest of it is weaker.
I will buy the chip that works better internally at the same power consumption and has more SW/HW/AI features.

I don't care if there is more or less FPS per price; I am buying hardware. If I want higher fps, I will put pressure on the game developers; they are responsible for the fps number. Nvidia/AMD/Intel can affect fps only slightly with driver changes.

All it would take to add 64 bits is 12 mm²: 12 GB, 4608 CUDA cores and a 16% performance uplift. But we don't live in that universe. Next in line is the 4070, and it just needs to drop to $499, and it will.
This 40 gen is there to test how players accept DLSS 3. If they had also increased what you mentioned and DLSS 3 had not been accepted by gamers, it would have been a disaster for gen 50.
So it is good as it is. We get a very nice perf increase for free in titles where the AI features are already implemented, or will be in the near future.

On gen 50 the memory bus can be returned to the desired/expected widths, and we should see overall perf improvements thanks to the next TSMC manufacturing process.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.71/day)
Location
Ex-usa | slava the trolls
Averaged out, that's... pretty standard: +20% or so gen-on-gen on an 18-month release cadence. Yeah, Navi 3 has been pretty disappointing relative to Navi 2 given how far Navi 2 blew past Navi 1, and it'd be nice if the 7600 cost $30-50 less than it does, but we can't exactly expect a return to pre-COVID pricing all at once.

+20% is way below the average generational improvement. The average is ~+45%, which is why two generations warrant the standard upgrade cycle: you don't upgrade every generation, but you do during the next one.
A +41% performance improvement between the RX 5600 XT and the RX 7600 won't make an owner of the former buy the latter, simply because it is a sidegrade.
Hence I am saying: the RX 7600 is not a true card, it is a fake one; the true one would be either an RX 7400 XT or an RX 7500 XT.

Speaking of which, where is the RX 7500 XT?

And why does the RX 5500 XT -> RX 6500 XT step deliver exactly 0% performance improvement? Where is the claimed "average"?
 
Joined
Nov 16, 2020
Messages
43 (0.03/day)
Processor i7-11700F, undervolt 3.6GHz 0.96V
Motherboard ASUS TUF GAMING B560-PLUS WIFI
Cooling Cooler Master Hyper 212 Black Edition, 1x12cm case FAN
Memory 2x16GB DDR4 3200MHz CL16, Kingston FURY (KF432C16BBK2/32)
Video Card(s) GeForce RTX 2060 SUPER, ASUS DUAL O8G EVO V2, 70%, +120MHz core
Storage Crucial MX500 250GB, Crucial MX500 500GB, Seagate Barracuda 2.5" 2TB
Display(s) DELL P2417H
Case Fractal Design Focus G Black
Power Supply 550W, 80+ Gold, SilentiumPC Supremo M2 SPC140 rev 1.2
Mouse E-BLUE Silenz
Keyboard Genius KB-110X
  • Haha
Reactions: ARF
Joined
Aug 21, 2015
Messages
1,716 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marathon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
Going to use Nvidia as an example here, since they're much more consistent with names and segments than AMD. Using TPU reviews and the GTX 260 as a starting point (1):

460: +28%
560: +18%
660: +30%
760: +25%
960: +11% (2)
1060: +100%
1660: +20%

The 1060 is clearly the outlier here. Pascal kind of blew the lid off, and was the kind of generational leap we see maybe once a decade. The 1660 didn't have a proper direct successor, so I cut things off there. With the 1060 included, we have an average of +33%/gen (I think; stats aren't my strongest suit), +22% without. Again, the 7600 could be better. But it's not the garbage product you're making it out to be.

(1) The 260 was in a way different bracket at launch, $450 and 185W, making the 460 a much bigger advance than it appears. This is where they went from Tesla to Fermi
(2) Nvidia kind of sandbagged a bit here, but the 960 came in at $50 less than the 760 and had a 50W lower TDP
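For what it's worth, the two averages quoted above do check out; a quick mechanical verification of the per-generation percentages:

```python
# Mean gen-on-gen uplift for the x60 cards listed above (percent).
uplifts = {"460": 28, "560": 18, "660": 30, "760": 25,
           "960": 11, "1060": 100, "1660": 20}

mean = lambda values: sum(values) / len(values)
with_1060 = mean(list(uplifts.values()))
without_1060 = mean([v for card, v in uplifts.items() if card != "1060"])
print(round(with_1060), round(without_1060))  # 33 22
```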
 
Joined
Dec 31, 2020
Messages
952 (0.69/day)
560/460 OC to 660 to 760 OC: ~33%
No real consistency for Nvidia after that:
1060: 100%
2060: 75%
3060: 25%
4060: 15%
 
Last edited:

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.71/day)
Location
Ex-usa | slava the trolls
See how the midrange tends to lag behind, and then at some point there is suddenly a generation that makes it look good again...
Because if they didn't do that, at some point the midrange would end up at 1% of the performance of the current flagship...


GTX 280
+57% performance improvement GTX 480
+52% performance improvement GTX 680
+54% performance improvement GTX 780 Ti
+28% performance improvement GTX 980 Ti
+67% performance improvement GTX 1080 Ti
+31% performance improvement RTX 2080 Ti
+78% performance improvement RTX 3090 Ti
+45% performance improvement RTX 4090
________________
average: 51.5%


Radeon HD 4870
+70% performance Radeon HD 5870
+19% performance Radeon HD 6970
+44% performance Radeon HD 7970
+50% performance Radeon R9 290X
+31% performance Radeon R9 Fury X
+32% performance Radeon RX Vega 64
+22% performance Radeon VII
+95% performance Radeon RX 6900 XT
+47% performance Radeon RX 7900 XTX
________________
average: 45.5%
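A mechanical check of the two flagship averages, using the percentages exactly as listed (note the AMD mean actually comes out to ~45.6 rather than 45.5):

```python
# Average flagship gen-on-gen improvements from the two lists above (percent).
nvidia = [57, 52, 54, 28, 67, 31, 78, 45]
amd = [70, 19, 44, 50, 31, 32, 22, 95, 47]

avg = lambda xs: sum(xs) / len(xs)
print(round(avg(nvidia), 1))  # 51.5
print(round(avg(amd), 1))     # 45.6
```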
 