
So what does the Ada release by Nvidia mean to you?

Has the Ada release by Nvidia been good, in your opinion?


  • Total voters
    101
Joined
Aug 14, 2023
Messages
310 (0.60/day)
Processor AMD Ryzen 7900X
Motherboard MSI MPG X670E Carbon WiFi
Cooling Custom Loop (Watercool/HWLabs)
Memory G.Skill Trident Z5 DDR5-6000 64GB (F5-6000J3040G32GX2-TZ5K)
Video Card(s) Gainward RTX 4090 Phantom GS
Storage 7 x M.2, 4 x SSD, 2 x HDD.
Display(s) Alienware AW3423DW
Case Corsair 7000D Airflow
Audio Device(s) Logitech Z207, Shanling UA1 Plus
Power Supply Corsair HX1200
Mouse Logitech MX Master
Keyboard Logitech k360
Software Windows 11 Pro
Benchmark Scores None, but I think they'd be fairly decent.
I very much doubt it's that easy, or even close. Things are rarely, if ever, that clear cut in the real world.

It is a bad thing because it lowers your image quality and promotes developer laziness.

Just like fevgatos above, I very much beg to differ. You need look no further than Starfield for an example.
 
Joined
Jun 14, 2020
Messages
3,781 (2.26/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
I very much doubt it's that easy, or even close. Things are rarely, if ever, that clear cut in the real world.



Just like fevgatos above I very much beg to differ.
Me, you and every reviewer on planet earth :banghead:
 
Joined
Aug 14, 2023
Messages
310 (0.60/day)
Yeah, them too, no doubt barring a few fanbois.
 
Joined
Apr 30, 2020
Messages
1,020 (0.59/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
Do people not understand there is a factual way to compare image quality via overlaid pictures?

Here is what Bing AI said on how to do so:
Yes, it is possible to overlay frames and show the errors/differences between them. One way to do this is by using OpenCV and Python. You can use the Structural Similarity Index (SSIM) to compare two images and determine if they are identical or have differences due to slight image manipulations, compression artifacts, or purposeful tampering. To visualize the differences between images using OpenCV and Python, you can draw bounding boxes around regions in the two input images that differ. You can also use the cv2.subtract() function to compute the difference between two images and color the mask red.

The difference between objective and subjective is that objective information is based on facts, while subjective information is based on feelings or opinions.

Currently it seems most reviewers have only put out subjective reviews of these technologies that increase in-game fps.
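For the curious, the kind of objective comparison described above can be sketched in a few lines. This is a minimal, pure-Python illustration of a global SSIM score and a pixel-diff mask; real tooling like OpenCV or scikit-image computes SSIM over sliding windows, so treat this as a simplified toy, not the library implementation.

```python
# Toy global SSIM and pixel-diff mask over flat lists of grayscale
# pixel values in [0, L]. Real libraries use a windowed SSIM; this
# global variant is a simplification for illustration only.

def mean(xs):
    return sum(xs) / len(xs)

def ssim_global(x, y, L=255):
    # Stabilizing constants from the standard SSIM definition.
    c1, c2 = (0.01 * L) ** 2, (0.03 * L) ** 2
    mx, my = mean(x), mean(y)
    vx = mean([(p - mx) ** 2 for p in x])          # variance of x
    vy = mean([(q - my) ** 2 for q in y])          # variance of y
    cov = mean([(p - mx) * (q - my) for p, q in zip(x, y)])
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

def diff_mask(x, y, threshold=10):
    # 1 where pixels differ by more than `threshold`, similar in spirit
    # to cv2.absdiff followed by a binary threshold.
    return [1 if abs(p - q) > threshold else 0 for p, q in zip(x, y)]
```

Identical images score exactly 1.0; any brightness shift or artifact pulls the score below 1, and the mask shows where.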
 
Joined
Aug 14, 2023
Messages
310 (0.60/day)
While it may not be scientific or fully objective, anyone can view any number of side-by-side comparisons on YouTube. And while fairly un-scientific due to YouTube compression et al, that goes equally for both sides. For my own part I only needed to play the same games again with my 4090 as I had played with my 6800 XT. That settled the matter rather nicely.
 
Joined
Jan 14, 2019
Messages
13,391 (6.11/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
Just like fevgatos above I very much beg to differ. You need look no further than to Starfield for an example.
I play at 1080p. Any kind of upscaling looks like shit at 1080p.

While it may not be scientific or fully objective, anyone can view any number of side-by-side comparisons on YouTube. And while fairly un-scientific due to YouTube compression et al, that goes equally for both sides. For my own part I only needed to play the same games again with my 4090 as I had played with my 6800 XT. That settled the matter rather nicely.
I don't believe in YouTube. I believe in playing games and seeing for myself.
 
Joined
Apr 12, 2013
Messages
7,582 (1.77/day)
And while fairly un-scientific due to YouTube compression et al, that goes equally for both sides.
Huh, what? You do know that regular 4K compressed on YT will look appreciably worse than 1440p or 1080p uploads. Also not sure if there's an enhanced-bitrate version available for 4K, but even at 1080p the image quality isn't that much better.

Worse as in the quality drops a lot more, just in case it wasn't clear.
 
Joined
Jun 14, 2020
Messages
3,781 (2.26/day)
Do people not understand there is a factual way to compare image quality via overlaid pictures?

Here is what Bing AI said on how to do so.




Currently it seems most reviewers have only put out subjective reviews of these technologies that increase in-game fps.
Is it subjective to claim that 4k is superior to 320p in image quality? Do I need to use a tool to objectively measure that?
 
Joined
Feb 24, 2023
Messages
3,308 (4.80/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Any kind of upscaling looks like shit at 1080p.
Turn VSR on, change your resolution to 4K, play with FSR: Quality. Try believing your eyes.

No need to thank me.
 
Joined
Jan 14, 2019
Messages
13,391 (6.11/day)
Turn VSR on, change your resolution to 4K, play with FSR: Quality. Try believing your eyes.

No need to thank me.
I can't do VSR because switching resolutions messes up the window positions on my secondary screen.
 
Joined
Aug 14, 2013
Messages
2,373 (0.57/day)
System Name boomer--->zoomer not your typical millenial build
Processor i5-760 @ 3.8ghz + turbo ~goes wayyyyyyyyy fast cuz turboooooz~
Motherboard P55-GD80 ~best motherboard ever designed~
Cooling NH-D15 ~double stack thot twerk all day~
Memory 16GB Crucial Ballistix LP ~memory gone AWOL~
Video Card(s) MSI GTX 970 ~*~GOLDEN EDITION~*~ RAWRRRRRR
Storage 500GB Samsung 850 Evo (OS X, *nix), 128GB Samsung 840 Pro (W10 Pro), 1TB SpinPoint F3 ~best in class
Display(s) ASUS VW246H ~best 24" you've seen *FULL HD* *1O80PP* *SLAPS*~
Case FT02-W ~the W stands for white but it's brushed aluminum except for the disgusting ODD bays; *cries*
Audio Device(s) A LOT
Power Supply 850W EVGA SuperNova G2 ~hot fire like champagne~
Mouse CM Spawn ~cmcz R c00l seth mcfarlane darawss~
Keyboard CM QF Rapid - Browns ~fastrrr kees for fstr teens~
Software integrated into the chassis
Benchmark Scores 9999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999
Peeps complain about Ada pricing, but things were so bad with Ampere and the cryptocurrency mining shortage that even the pricing for the 4090 looks cheap by comparison.
This + inflation. Even only going back 5 years, it's kind of insane.
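The inflation point is easy to put in rough numbers. The ~4%/year average rate used below is an assumed illustrative figure for 2018-2023, not an official CPI number; the $699 is the RTX 2080's launch MSRP.

```python
# Back-of-the-envelope inflation adjustment for a GPU launch price.
# annual_rate is an assumption for illustration, not official CPI data.

def inflate(price, annual_rate, years):
    # Compound the price forward by the assumed annual rate.
    return price * (1 + annual_rate) ** years

# RTX 2080 launched at $699 in 2018; in ~2023 dollars, assuming ~4%/yr:
adjusted = inflate(699, 0.04, 5)
print(round(adjusted))  # -> 850
```

So even before any market-condition effects, a like-for-like card would be expected to cost noticeably more five years later.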

 
Joined
Jun 27, 2019
Messages
2,110 (1.04/day)
Location
Hungary
System Name I don't name my systems.
Processor i5-12600KF 'stock power limits/-115mV undervolt+contact frame'
Motherboard Asus Prime B660-PLUS D4
Cooling ID-Cooling SE 224 XT ARGB V3 'CPU', 4x Be Quiet! Light Wings + 2x Arctic P12 black case fans.
Memory 4x8GB G.SKILL Ripjaws V DDR4 3200MHz
Video Card(s) Asus TuF V2 RTX 3060 Ti @1920 MHz Core/@950mV Undervolt
Storage 4 TB WD Red, 1 TB Silicon Power A55 Sata, 1 TB Kingston A2000 NVMe, 256 GB Adata Spectrix s40g NVMe
Display(s) 29" 2560x1080 75Hz / LG 29WK600-W
Case Be Quiet! Pure Base 500 FX Black
Audio Device(s) Onboard + Hama uRage SoundZ 900+USB DAC
Power Supply Seasonic CORE GM 500W 80+ Gold
Mouse Canyon Puncher GM-20
Keyboard SPC Gear GK630K Tournament 'Kailh Brown'
Software Windows 10 Pro
I would love to build a desktop, but there's no way in hell I'm going to spend a week's paycheck on a 4090 to get good performance vs last generation. That money goes into my retirement account so I can retire at 55. A 4080 is good but still expensive, and nothing under that is cheap either; the 4060 Ti isn't any better than a 3060 Ti.
Make that 2-3 months where I live. :laugh: (If we are talking about a brand new card, the 4090 starts around $2000 here.)
Though the entire product stack is overpriced, so there's that.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
It is really for 4K. But the RTX 4070 Ti is a wrong card, with wrong specifications and wrong market position.
The reason for the good performance upgrade is the change from Samsung's 8nm to TSMC's 4nm.

And already memory starved in certain titles:


Memory starved LMAO. Educate yourself :laugh:
RE4 and TLOU1 had massive VRAM issues that have since been fixed in drivers. Even 8GB cards run just fine now.

Cyberpunk 2077 can't even run in Overdrive mode on my 4090. Good example :laugh:

Btw, heard of VRAM allocation? Many game engines use all the VRAM available.
There's not a single game that has VRAM issues with 12GB at 4K.

One of the best looking games recently released, Atomic Heart, uses 7.5GB maxed out in 4K -> https://www.techpowerup.com/review/atomic-heart-benchmark-test-performance-analysis/5.html

Let's have a look at 4060 Ti 8GB vs 16GB in 4K gaming -> https://www.techpowerup.com/review/nvidia-geforce-rtx-4060-ti-16-gb/31.html

Pretty much identical performance, even in terms of minimum fps. VRAM doesn't matter when the GPU is the weak link. You won't be maxing out games anyway when the GPU is weak, meaning VRAM won't matter because you won't be running high settings.

Logic 101.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.89/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
The release got screwed over by them taking a cheaper route with the memory setup, and relying on DLSS to make it work.

GDDR6 came out with 2GB per chip (16Gb), and at higher speeds than the older, smaller modules.
That means you can now get the same VRAM capacity from half as many chips... but, on the cut-down cards, with half the bus width.

Energy efficiency: Sure, this came out great.

But did it come out great because it's good, or because the last gen were over-volted to the moon (Yes), and because they cut back on the RAM bus size massively due to moving to higher capacity VRAM modules? (Also yes)

The 3090Ti used a ton more power than the 3090 but was also more efficient
But it's more efficient? How, asks the voice in my head?


Why, little voicey... because they used half as many memory modules, so more of that power went to the GPU and not the VRAM.
The difference on the Ti is that each memory chip has twice the capacity, so only half as many are required, which means no more memory chips on the back side of the card.
During the eth mining boom, my 3090 would hit its 375W limit with the GPU clocked at 1.4GHz - super far down vs the 2.1GHz it can boost to when the ram is hardly in use.
Undervolting the GPU let me reach 1.6GHz at 370W, so it's easy to imagine the 3090Ti's higher capacity VRAM would have let it clock even higher with no other changes.
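The board-power argument above can be sketched with simple arithmetic. The 375W limit comes from the post itself; the per-chip wattage below is an assumed illustrative figure, not a measurement, and the chip counts (24x1GB on the 3090, 12x2GB on the 3090 Ti) are the known module configurations.

```python
# With a fixed board power limit, halving the number of memory chips
# leaves more watts for the GPU core. WATTS_PER_CHIP is an assumed
# illustrative figure for GDDR6X under load, not a measured value.

BOARD_LIMIT_W = 375      # 3090 board power limit, per the post above
WATTS_PER_CHIP = 2.5     # assumption for illustration

def gpu_power_budget(limit_w, n_chips, w_per_chip=WATTS_PER_CHIP):
    # Whatever the VRAM doesn't draw is available to the GPU core.
    return limit_w - n_chips * w_per_chip

print(gpu_power_budget(BOARD_LIMIT_W, 24))  # 3090:    315.0 W for the core
print(gpu_power_budget(BOARD_LIMIT_W, 12))  # 3090 Ti: 345.0 W for the core
```

Under these assumed numbers, the Ti's core gets ~30W more headroom at the same board limit, which is the clock-headroom effect described above.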



These new GPUs have the same deal:
3060 Ti: 256-bit bus, 8x GDDR6 modules
4060 Ti: 128-bit bus, 4x GDDR6 modules

It's almost like halving the number of VRAM modules freed up more wattage, which massively boosts efficiency - but harms performance in bandwidth-intensive situations.
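The bandwidth side of that trade-off in numbers: peak memory bandwidth is bus width (in bytes) times per-pin data rate. The figures below are the published specs for both cards.

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin rate (Gbps).

def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(256, 14))  # 3060 Ti: 448.0 GB/s (256-bit, 14 Gbps)
print(bandwidth_gb_s(128, 18))  # 4060 Ti: 288.0 GB/s (128-bit, 18 Gbps)
```

Despite the faster 18 Gbps memory, the halved bus leaves the 4060 Ti with roughly 64% of its predecessor's raw bandwidth; Ada's much larger L2 cache masks some of that at lower resolutions, which is consistent with the results described below.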



At lower resolutions these changes don't impact them, and they're pretty good.


But at 4K, it's worse than what it replaced. All VRAM-heavy titles will behave this way, and the gap will be bigger if you aren't running a top-of-the-line system like TPU's review rig with its stupid-fast DDR5-6000 CL36 and PCIe 5.0 - you need every nanosecond of lower latency to feed the card new data, since it doesn't have the raw bandwidth to get the data across faster.

But who cares right, DLSS saves the day by rendering at a lower resolution.



Edited in - didn't even notice an 8GB variant existed. I cared so little about these cards.
The 8GB uses 4x 2GB modules; the 16GB doubles up to 8x 2GB in clamshell mode on the same 128-bit bus.
The power consumption barely changed between them because of this.

The 3060 Ti needs a heck of a lot more power - but even ~15W moved from that RAM to the GPU would make a world of difference.


Throw in the smaller gains from a newer GPU design (even if it was just a refresh), a less complex PCB due to less wiring to VRAM, cheaper VRMs - all those things SHOULD have added up to a much cheaper card that was good value for money... and it wasn't. If you gamed at higher resolutions, you were paying more for less.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
Not worth the jump from the previous generation; not that huge a leap. The 4060 loses to the 3060 Ti in raw power, and the 3070 Ti beats the 4070... The 4000 series is good only with DLSS, except for the two most powerful cards. I have used DLSS a few times, and I saw a difference (in lower quality) compared to all the advertising. With Nvidia's mega-duper 8GB VRAM size, I have run into a wall with a few games already (Far Cry 6, Harry Potter, Resident Sleeper), where fps goes from above 60 to a slideshow :love:. It ain't the VRAM, it ain't the VRAM, goddammit.

The main point of disappointment is that Nvidia pushes useless DLSS in games, markets it as a groundbreaking innovation, and tries to cash in from people who believe everything they have read, seen, and been told.

Sad that I got my 3060 Ti in the mining boom, when a 6700 XT cost almost twice as much as the 3060 Ti, which was already overpriced crap.



Short answer - meh. My next GPU will be from RED.
The 3070 Ti doesn't beat the 4070, haha. The 4070 performs like a 3080 at ~200 watts on average, which is 125 watts less than a 3080, and it has more VRAM and way more cache. The whole 4000 series has a lot more cache than Ampere.

The 3070 Ti even uses around 300 watts because of GDDR6X, and delivers pretty much nothing over the regular 3070 with GDDR6, which consumes about 75 watts less, because bandwidth was never a problem for the 1440p gaming these cards are meant for. The only reason we got the 3070 Ti was that GDDR6 supply was running low and GDDR6X was in abundance.

Useless DLSS? Hahah... I bet you are using AMD right now, because DLSS/DLAA/DLDSR is superior to what AMD has, and DLSS/DLAA offers insanely good antialiasing that easily beats TAA and native AA methods in pretty much all games. Read any TechPowerUp comparison of FSR vs DLSS, or look at TechSpot's comparison where DLSS easily won across 26 different games.

I am using DLAA whenever I can. Best AA no doubt. Zero shimmering or jaggies.

AMD doesn't even have an answer to DLAA, DLDSR or DLSS 3/3.5.

Also: ShadowPlay, RTX Video Super Resolution, CUDA, Reflex, way better RT performance, and I could probably go on. AMD drivers are also wonky once you leave the most popular games and benchmarks. AMD tends to have garbage performance in early-access games and less popular titles; 9 out of 10 early-access games are optimized for Nvidia GPUs.

All these features are the reason I am not even considering an AMD GPU these days. They are years behind on features and seem to focus only on raster performance. Hence the price.
 
Joined
Sep 27, 2008
Messages
1,215 (0.20/day)
Generally good cards, not-so-great pricing. 4060 Ti is the underwhelming card of the stack in both price/performance and gen-on-gen improvement (or lack thereof).
 
Joined
Jun 14, 2020
Messages
3,781 (2.26/day)
The release got screwed over by them taking a cheaper route with the memory setup, and relying on DLSS to make it work. [...]
Are you going to play at 4K with a 4060 or a 4060 Ti? Even if they had a 512-bit bus and 48 GB of VRAM, they would still be kind of terrible for that resolution. I mean, my 4090 struggles in the newest games at 4K, so...

Generally good cards, not-so-great pricing. 4060 Ti is the underwhelming card of the stack in both price/performance and gen-on-gen improvement (or lack thereof).
That's pretty insane considering the 4060 Ti is the 2nd-best card in terms of price-to-performance, only losing to the just-released 7800 XT, and by less than 9%, mind you.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
Imagine how disruptive the 4080 would have been this generation had they priced it like the 3080, or even like the 4070 Ti really... It would have been interesting. I'm sure when AMD saw the actual price, they were like: thank god a $900 7900 XT isn't so bad, lol.


I think the bottom line is that as consumers we really got screwed this generation; there are a bunch of cards that perform similarly, with different strengths and weaknesses. Anyone who thinks either lineup is all that consumer-friendly drinks too much Red/Green Kool-Aid, but that's just me. It just doesn't seem as bad because the last 3+ years have been crap, just different levels depending on when.

Inflation is a big reason the 4000 series was priced higher. AMD's 7900 series was priced high as well; stop acting like it is only Nvidia. Last-gen cards also still had to be sold. Both Nvidia and AMD had huge stock because of mining demand and then the mining crash - AMD especially. Which is why they waited a loooong time with the 7700 and 7800 series, still selling the 6000 series deep into 2023.

People are screaming about prices because they feel the inflation as well, on all fronts. People can't afford new hardware and hold on to their dated hardware while talking crap about new stuff, because they can't afford to upgrade and are trying to justify waiting. Human nature, nothing new.


The only meh cards this generation are the 4060 series and the 7600 series. These were almost pointless, at least until last-gen cards sell out.
The 4060 series will probably beat the entire Radeon 7000 series on the Steam HW Survey anyway in terms of market share. Why? Because x60 series cards always sell like hotcakes, and 9 out of 10 people who are not into hardware won't even consider AMD because of reputation or an earlier bad experience.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.58/day)
Location
Ex-usa | slava the trolls
What and where do you imagine it would lead to? It's already very much a niche market in terms of Nvidia's revenue stream. It's a bit sad, I guess, but the simple fact is that gamers don't really mean a thing to their bottom line.

Only if you think Nvidia-centric. EVGA left the sinking ship, XFX left the sinking ship... who's next? :D
Actually, I will happily applaud when Nvidia leaves this market, because I don't see its products as competitive - let them give that space to someone who makes better graphics cards.

Do you even remember all the shenanigans by the greens?

3.5GB GTX 970: https://www.extremetech.com/gaming/...-lawsuit-over-the-gtx-970s-3-5gb-memory-issue
Then the RTX 4080 12GB rebranded to RTX 4070 Ti, when this card should be no more than an RTX 4060 Ti: https://www.gsmarena.com/nvidia_rtx...ti_available_january_5_for_799-news-57062.php
Then the infamous GeForce Partner Program, which was heavily criticised for its monopolistic nature and eventually cancelled in 2018.

The Nvidia GeForce Partner Program was a marketing program designed to provide partnering companies with benefits such as public relations support, video game bundling, and marketing development funds. The program proved to be controversial, with complaints about it possibly being an anti-competitive practice. Nvidia canceled the program in May 2018.
 
Joined
Jun 14, 2020
Messages
3,781 (2.26/day)
Only if you think nvidia-centric. EVGA left the sinking ship, XFX left the sinking ship... who's next? :D [...]
Think about it clearly. If the 4070 Ti should have been named a 4060 Ti, can you explain to me why it's beating the snot out of the 7800 XT, which is an xx80 competitor?

Nothing wrong with the 3.5GB 970 or the NGPP. Companies have been doing this on their own on both AMD and Intel platforms. You know, for example, Hero and Apex mobos are Intel-only, Crosshair mobos are AMD-only, etc. It has been happening for at least 15 years now. How it is anticompetitive, I've no idea.

Think about it: everything described in your link, AMD is already doing.

The Nvidia GeForce Partner Program was a marketing program designed to provide partnering companies with benefits such as public relations support, video game bundling, and marketing development funds.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
Only if you think nvidia-centric. EVGA left the sinking ship, XFX left the sinking ship... who's next? :D
Actually, I will happily appluad when nvidia leaves this market because I don't see its products competitive - let them give that occupied space to someone who makes graphics cards better.

Do you even remember all the shenanigans by the greens?

3.5GB GTX 970 https://www.extremetech.com/gaming/...-lawsuit-over-the-gtx-970s-3-5gb-memory-issue
Then the RTX 4080 12 GB was rebranded to RTX 4070 Ti, when this card should be no more than an RTX 4060 Ti. https://www.gsmarena.com/nvidia_rtx...ti_available_january_5_for_799-news-57062.php
Then there was the infamous GeForce Partner Program, which was heavily criticised for its monopolistic nature and was eventually cancelled in 2018.

The Nvidia GeForce Partner Program was a marketing program designed to provide partnering companies with benefits such as public relations support, video game bundling, and marketing development funds.[1] The program proved to be controversial, with complaints about it possibly being an anti-competitive practice.[2] Nvidia canceled the program in May 2018.[3][4]
Sinking ship :laugh: :laugh: Meanwhile, Nvidia sits at 80-85% gaming marketshare according to Steam, pretty much owns the enterprise market, and completely owns the high-end AI GPU market, which is worth billions upon billions. Nvidia stock is higher than ever for a reason. Stop trolling. Nvidia is doing superb right now. They will make hundreds of billions over the next 5+ years. Nvidia is predicted to generate $300 billion in AI revenue by 2027; AMD can only dream about numbers like this, but they are years behind on AI.

Even with huge AI focus, Nvidia can beat AMD on both enterprise and gaming on the side with ease.

AMD's big business is CPUs and APUs, not GPUs. They have been losing GPU marketshare for years for a reason. They earn more per wafer by making CPUs/APUs.

If AMD actually had the best GPUs I would be using an AMD GPU, but they don't, and they lack features like crazy, plus wonky drivers when you step outside of popular games (the ones that get benchmarked often - this is what AMD prioritizes). Most early-access games run like crap on AMD GPUs in comparison to Nvidia. Why? Developers optimize for the 80-85% of users instead of the 10-15%. Most developers are using Nvidia as well.
 
Last edited:
Joined
Sep 27, 2008
Messages
1,215 (0.20/day)
Are you going to play at 4K with a 4060 or a 4060 Ti? Assuming they had a 512-bit bus and 48 GB of VRAM, they would still be kinda terrible for that resolution. I mean, my 4090 struggles in the newest games at 4K, so...


That's pretty insane considering the 4060 Ti is the 2nd best card in terms of price to performance, only losing to the just-released 7800 XT, and by less than 9%, mind you.

IMO a problem for the 4060 Ti is that the still-extant 3060 Ti is much better value at this time, whereas the other cards in the stack don't have the same kind of self-competition to deal with. It has newer features than the 3060 Ti and lower power consumption, but is that worth $130 more?

($130, based on this https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/33.html)
 
Joined
Jun 14, 2020
Messages
3,781 (2.26/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
IMO a problem for the 4060 Ti is that the still-extant 3060 Ti is much better value at this time, whereas the other cards in the stack don't have the same kind of self-competition to deal with. It has newer features than the 3060 Ti and lower power consumption, but is that worth $130 more?

($130, based on this https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/33.html)
Maybe those prices apply in the US; in the EU that is not the case. The cheapest 3060 Ti I can find in Germany is 370€, the cheapest 4060 Ti is 409€. The cheapest 7800 XT is 549€, the cheapest 4070 is 599€.

Is FG and lower power draw worth the 40€ between the two 60 Ti's? Don't know, honestly. Maybe. But the 4070 is definitely worth 50€ over the 7800 XT.
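A quick sanity check on those gaps, using only the German prices quoted above (a throwaway sketch - the listings themselves will of course drift over time):

```python
# Cheapest German listings quoted above, in EUR
prices = {
    "RTX 3060 Ti": 370,
    "RTX 4060 Ti": 409,
    "RX 7800 XT": 549,
    "RTX 4070": 599,
}

# Premium Nvidia asks in each matchup
delta_60ti = prices["RTX 4060 Ti"] - prices["RTX 3060 Ti"]
delta_70 = prices["RTX 4070"] - prices["RX 7800 XT"]
print(delta_60ti, delta_70)  # → 39 50
```

So the "40€" gap between the two 60 Ti's is really 39€, and the 50€ figure for the 4070 over the 7800 XT is exact.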

If AMD actually had the best GPUs I would be using an AMD GPU, but they don't, and they lack features like crazy, plus wonky drivers when you step outside of popular games (the ones that get benchmarked often - this is what AMD prioritizes). Most early-access games run like crap on AMD GPUs in comparison to Nvidia. Why? Developers optimize for the 80-85% of users instead of the 10-15%. Most developers are using Nvidia as well.
They have wonky drivers even in popular games. Take Starfield, for example: AMD cards don't render... the sun.

Starfield on AMD

1694506086148.png


Starfield on nvidia

1694506102839.png
 

ixi

Joined
Aug 19, 2014
Messages
1,451 (0.38/day)
It gives you moar frames when or if you need them.


That's merely fanboi-talk and has exactly nothing whatsoever to do with the technology. I didn't ask what you think about that horrid Green Monster's money-grabbing, I asked why DLSS is useless. Not the same thing.

Indeed it is, just as long as that particular aspect is all you care about.

Again, that has nothing whatsoever to do with the technology in itself. All extraneous factors excepted, is DLSS a good thing or a bad thing?


Well, if you are after more numbers in the corner of your screen rather than image quality, I guess it works out well. If you are just after a higher number on your screen, use FSR, which is free and can run on all three brands, whoopsy.

Money grabbing? Read again; even better, do it twice.

"additional" and as a customer you just needs to purchase new gpu from next gen to get next gen dlss while it can work on 3xxx series. I' sorry, but fanboysm talk is coming from you as now you clearly don't see the limitations.

You are defending limitations and are happy that the company makes you buy stuff just to get the newest DLSS version, meanwhile the 3060 Ti eats the 4060 alive in games unless you use DLSS 3, which, by the way, is locked to the 4xxx series specifically to make the 4xxx more appealing in games where DLSS 3 is implemented. How many games are there with DLSS 3 support?

DLSS gives you worse quality than native res. Versions are deliberately locked behind GPU generations. I'll say it again - Nvidia's logic: give the customer a less powerful GPU and let DLSS make up for it, because the number (FPS) on people's screens is higher than without it. Who cares about quality?

I like these benchmark pissing contests. Fun to read when fanboyism takes over.

Yep, I'm an AMD fanboy who uses an RTX GPU :D.


Is DLSS good or bad? Go read my previous post again and you'll see the answer.
 
Last edited:

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Well, if you are after more numbers in the corner of your screen rather than image quality, I guess it works out well. If you are just after a higher number on your screen, use FSR, which is free and can run on all three brands, whoopsy.

Money grabbing? Read again; even better, do it twice.

"Additional", and as a customer you just need to purchase a new GPU from the next gen to get next-gen DLSS, even though it could work on the 3xxx series. I'm sorry, but the fanboyism talk is coming from you, as you clearly don't see the limitations.

You are defending limitations and are happy that the company makes you buy stuff just to get the newest DLSS version, meanwhile the 3060 Ti eats the 4060 alive in games unless you use DLSS 3, which, by the way, is locked to the 4xxx series specifically to make the 4xxx more appealing in games where DLSS 3 is implemented. How many games are there with DLSS 3 support?

DLSS gives you worse quality than native res. Versions are deliberately locked behind GPU generations. I'll say it again - Nvidia's logic: give the customer a less powerful GPU and let DLSS make up for it, because the number (FPS) on people's screens is higher than without it. Who cares about quality?

I like these benchmark pissing contests. Fun to read when fanboyism takes over.

Yep, I'm an AMD fanboy who uses an RTX GPU.


Is DLSS good or bad? Go read my previous post again and you'll see the answer.

Yeah DLSS is so bad -> https://www.techpowerup.com/review/starfield-dlss-community-patch/
:roll:

"The default anti-aliasing method in Starfield is TAA (Temporal Anti-Aliasing), which results in a very blurry image at all resolutions, including 4K. It also fails to render small object details like thin steel structures, power lines, transparent materials, tree leaves, and vegetation well. Additionally, there are noticeable shimmering issues across the entire image, even when you're not moving, especially at lower resolutions like 1080p."

"The official implementation of FidelityFX Super Resolution (FSR) in Starfield addresses most of these problems but still exhibits excessive shimmering on thin steel objects, transparent materials, tree leaves, and vegetation. Enabling DLSS resolves these shimmering problems and ensures a stable image quality, even at lower resolutions like 1080p."

"The community patch also supports DLAA (Deep Learning Anti-Aliasing), which takes image quality a step further, surpassing the quality of native in-game TAA, FSR, or DLSS. Enabling DLSS/DLAA also enhances the quality of in-game particle effects, providing a more comprehensive and stable appearance, particularly during motion, compared to FSR."

Also, AMD GPUs don't even render the sun, like fevgatos stated. Even though it is AMD-sponsored and optimized for AMD CPUs and GPUs (which is why DLSS support is not present - AMD hates when FSR is compared with DLSS, because FSR always loses).

The game simply looks best on Nvidia, especially with the DLSS/DLAA mod, easily beating native-resolution visuals and FSR. TAA looks horrible.

I bet you don't even use Nvidia GPU :laugh:

TAA is garbage in most games. Most devs prioritize DLSS/DLAA/FSR now. DLSS 3 is in tons of games. You can easily mod most games by simply swapping the DLL; there are several tools for this, very easy. Pretty much all new games have DLSS/DLAA (DLAA is a preset of DLSS now, meaning DLSS games have DLAA - making TAA useless).

DLAA is the best AA method, hands down. DLSS works wonders as AA while improving performance and beating FSR in 9 out of 10 games; the last one is a draw, meaning FSR wins in zero games.

Aaaaaand this is why AMD GPUs are generally cheaper: fewer and worse features, wonky drivers, and lower resale value.

AMD doesn't even have an answer to DLAA or DLDSR, which are both awesome features. 4K DLDSR on a 1080p or 1440p display looks insanely good - close to 4K visuals without the performance hit of 4K, and you can even use DLSS Quality on top to bump performance even higher while retaining most of the IQ.
 
Last edited: