
NVIDIA GeForce RTX 4070 and RTX 4070 Ti Detailed Specs Sheet Leaks

Joined
Sep 6, 2013
Messages
3,329 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later I got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
I like how people are desperate to rename AMD's products just to defend Nvidia.
The same people will say it's AMD's fault that Nvidia prices its top products very high while butchering the lower products in performance, but not in price. Prices are moving up across the whole product stack, to avoid offering high performance/$ options to those consumers who dare to pay less than $1500.
 
Joined
Sep 17, 2014
Messages
22,437 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
WTF... 4070 and 4070 Ti both 'endowed' with 504 GB/s?

Nvidia, what?! Your initial '4080' is equal in bandwidth to a full-blown 4070 that should have been an x60?!

That's an official pass on Ada for me
 
Joined
Nov 8, 2022
Messages
59 (0.08/day)
System Name Windows 11 Pro 64-bit
Processor AMD Ryzen 5 5600G
Motherboard MSI B550M PRO-VDH Wifi
Cooling AMD Wraith Stealth Stock Cooler
Memory 32GB(2x16GB) DDR4 3200 MHz
Video Card(s) AMD Radeon Vega 7 iGPU
Storage 512GB M.2 SSD, 1TB SATA SSD
Display(s) Dell S2422HG 1080p 165Hz Curved Gaming Monitor
Case Thermaltake Versa H18
Audio Device(s) Not telling you
Power Supply MSI MAG A550BN 80+ Bronze 550W PSU(Planning to get Corsair RM750e soon)
Mouse Logitech G502 Hero
Keyboard Logitech G213 Prodigy RGB Gaming Keyboard
VR HMD None
Software AMD Ryzen Master, Logitech Gaming Software, Steam, NVIDIA GEFORCE NOW
Benchmark Scores 40FPS in American Truck Simulator, 1080p ULTRA preset, 125% Scaling. More to come later.
Um, don't you mean smash on Ada?

Also, the RTX 4070 should have had 9728 CUDA cores and 16GB GDDR6X in the first place! The RTX 4080 should have had 12288 CUDA cores and 20GB GDDR6X!
 
Joined
Jan 20, 2019
Messages
1,552 (0.73/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail

The sadness of it all: whether it's rumour-filled speculation, releases, benchmarks, etc., when anything mid-to-high performance is mentioned (you know, 4070~4080 territory) I'm usually overly enthusiastic to get some take on it. For the first time the feeling is just gloomily dull. With current 4080/4090 prices I'm not expecting anything even remotely worthwhile from the mid-tier or lower-tier 40 series. Going to stick with my 2080 Ti for now unless RDNA3 punches well above its weight for a reasonable asking price.
 
Joined
Sep 3, 2019
Messages
3,506 (1.84/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 220W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (375W current) PowerLimit, 1060mV, Adrenalin v24.10.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2161), upgraded from Win10 to Win11 on Jan 2024
WTF... 4070 and 4070 Ti both 'endowed' with 504 GB/s?

Nvidia, what?! Your initial '4080' is equal in bandwidth to a full-blown 4070 that should have been an x60?!

That's an official pass on Ada for me
I don't want to defend Nvidia by any means, just stating some facts…
Memory bandwidth these days is misleadingly measured by simply multiplying the GDDR bus width by its speed. Other things play a big role, like L0/L1/L2/L3 cache sizes. Take RDNA2/3 for example: their effective memory bandwidth is well above 1 TB/s, up to a few TB/s on RDNA3.
You can't really calculate it from the spec sheet alone.
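For anyone who wants the back-of-the-envelope math: the quoted 504 GB/s figure is just bus width times per-pin data rate, and once a big on-die cache is bolted on, the "effective" figure becomes a blended number. A minimal sketch below; the 192-bit/21 Gbps pairing matches the leaked 504 GB/s figure, while the cache bandwidth and hit rate are made-up illustrative values, not measured ones.

```python
def raw_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak GDDR bandwidth in GB/s: bus width (bits) x per-pin data rate
    (Gbps), divided by 8 to convert bits to bytes."""
    return bus_width_bits * data_rate_gbps / 8

# The leaked 4070 / 4070 Ti figure: a 192-bit bus at 21 Gbps works out to 504 GB/s.
print(raw_bandwidth_gbs(192, 21))  # 504.0

def effective_bandwidth_gbs(raw_gbs: float, cache_gbs: float, hit_rate: float) -> float:
    """Toy model of what a large on-die cache does: requests that hit the
    cache are served at cache bandwidth, misses fall through to GDDR."""
    return hit_rate * cache_gbs + (1 - hit_rate) * raw_gbs

# Purely illustrative: a 2 TB/s cache pool with a 50% hit rate already more
# than doubles the blended figure versus raw GDDR bandwidth.
print(effective_bandwidth_gbs(504, 2000, 0.5))  # 1252.0
```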
 
Joined
Dec 31, 2020
Messages
980 (0.69/day)
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
Um, don't you mean smash on Ada?

Also, the RTX 4070 should have had 9728 CUDA cores and 16GB GDDR6X in the first place! The RTX 4080 should have had 12288 CUDA cores and 20GB GDDR6X!

Yes, in my dreams the 4080 also had 12288 CUDA cores.

Yes, for the most part the 70 tier had almost the same config as the previous gen's 80:
2070 vs 1080: 2304 vs 2560 CUDA cores
1070 vs 980: 1920 vs 2048 CUDA cores
The 3070 has the same SM count as the 2080, with Ampere's dual-issue FP32 doubling the listed CUDA cores for up to 100% more compute.

But go figure: since the 40 series clocks roughly 50% higher, they decided to keep roughly the same count, and the 4080 is only a little better.
I expected the 4070 closer to 8704 CUDA cores, but fair enough, we have to be more realistic here and set the delusions aside.
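To put a number on "almost the same config", here is a tiny sketch comparing each 70-class card's unit count to the previous generation's 80-class part. The CUDA figures are the ones quoted above; the SM counts in the 3070/2080 row are my own addition (Ampere dual-issues FP32 per SM, so raw CUDA counts aren't directly comparable across that boundary).

```python
# Ratio of each 70-class card's unit count to the previous gen's 80-class card.
pairs = [
    ("GTX 1070 vs GTX 980",  1920, 2048),  # CUDA cores, as quoted above
    ("RTX 2070 vs GTX 1080", 2304, 2560),  # CUDA cores, as quoted above
    ("RTX 3070 vs RTX 2080", 46,   46),    # SM counts (added assumption)
]
for label, new_70, old_80 in pairs:
    print(f"{label}: {new_70 / old_80:.0%} of the previous 80-class unit count")
# GTX 1070 vs GTX 980:  94% of the previous 80-class unit count
# RTX 2070 vs GTX 1080: 90% of the previous 80-class unit count
# RTX 3070 vs RTX 2080: 100% of the previous 80-class unit count
```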
 
Joined
Sep 28, 2005
Messages
3,322 (0.47/day)
Location
Canada
System Name PCGR
Processor 12400f
Motherboard Asus ROG STRIX B660-I
Cooling Stock Intel Cooler
Memory 2x16GB DDR5 5600 Corsair
Video Card(s) Dell RTX 3080
Storage 1x 512GB Mmoment PCIe 3 NVME 1x 2TB Corsair S70
Display(s) LG 32" 1440p
Case Phanteks Evolve itx
Audio Device(s) Onboard
Power Supply 750W Cooler Master sfx
Software Windows 11
- Games are so poorly optimized that they need fake resolution rendering (upscaling) just to be playable

- The biggest selling point for new GPUs is having to use fake scaling and fake frame technology to make unoptimized games playable

- Charge abysmal amounts for said fake rendering technology just to make these unoptimized games playable.

Man, we are definitely the biggest cucks in the world. AMD is no better either.

I guess future budget gaming rigs will be a generation or two behind in terms of GPUs, while the current gen won't offer anything of real value. Unless you are going to sell your wife and firstborn to pay for said hardware and fake-frame-generating tech.

Let's just hope game developers can remove their heads from their own asses and optimize their games. But who am I kidding?
 
Joined
Sep 17, 2014
Messages
22,437 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I don't want to defend Nvidia by any means, just stating some facts…
Memory bandwidth these days is misleadingly measured by simply multiplying the GDDR bus width by its speed. Other things play a big role, like L0/L1/L2/L3 cache sizes. Take RDNA2/3 for example: their effective memory bandwidth is well above 1 TB/s, up to a few TB/s on RDNA3.
You can't really calculate it from the spec sheet alone.
Still doesn't eliminate the fact that these two GPUs are identical in the memory department, while the only thing that scales along with the shader count is 12MB of L2.

Once again, bad balance in the Nvidia stack, showing us these aren't built to be optimal products but are planned to go obsolete in a few years. Cut down for the market, not for the product. We have seen this before in the upper midrange. Staying far away.
 
Joined
Dec 31, 2020
Messages
980 (0.69/day)
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
The 4070 is actually shaping up to be the weakest 70-class card to date, ever. It can't even match, let alone surpass, the previous 80-tier.

Every 70-class card since Maxwell did:
970 was 25% faster than the GTX 780
1070 40% faster than the 980
2070 15% faster than the 1080
3070 25% faster than the 2080

And now this 4070 is 10% slower than the 3080. A glorified 4050 Ti.

3060 Ti 4864 vs 4070 5888: +21% CUDA, -20% ROPs
3070 Ti 6144 vs 4070 Ti 7680: +25% CUDA, -17% ROPs
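The percentages in those last two lines are simple deltas on the unit counts. A quick sketch of the arithmetic, using the CUDA counts quoted above; the ROP counts of 80→64 and 96→80 are my assumption inferred from the stated -20%/-17%, not taken from the leak itself.

```python
def pct_change(new: int, old: int) -> float:
    """Percentage change from old to new."""
    return (new - old) / old * 100

# CUDA core counts as quoted in the post above.
print(f"3060 Ti -> 4070:    {pct_change(5888, 4864):+.0f}% CUDA")  # +21%
print(f"3070 Ti -> 4070 Ti: {pct_change(7680, 6144):+.0f}% CUDA")  # +25%

# ROP counts assumed from the stated deltas (80 -> 64 and 96 -> 80).
print(f"3060 Ti -> 4070:    {pct_change(64, 80):+.0f}% ROPs")      # -20%
print(f"3070 Ti -> 4070 Ti: {pct_change(80, 96):+.0f}% ROPs")      # -17%
```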
 
Joined
Jun 10, 2014
Messages
2,985 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
I can see the Nvidia bashing is in full swing based on nothing but speculated specs…

I did that for a while. But since I wasn't seeing any difference (I'm not a pixel peeper), I kinda forgot about it.

For those not in the know, this isn't about artificial patterns, but about a trilinear optimization that translated to texture shimmering and, iirc, visible transitions between various LOD levels.
Are you talking about texture filtering or about interpolating to a higher resolution?
For texture filtering I just assume people run 16x AF, as it is so cheap. I remember it being a thing almost 20 years ago, but haven't seen much of it since. Running without AF would usually be a blurry mess, unless the far textures are very high resolution, then you'll get a headache-inducing flickering nightmare. AF isn't perfect though, you can get a very visible "banding effect", where it's either flickering or blurry. Games have the ability to control this, but success will vary depending on the configuration.
Are there particular games which are known to do this badly?

I haven't yet found any problems with using brilinear, but I anticipate a lack of pixel throughput from the RTX 4070 having only 64 ROPs instead of the full 80. :)
It's called bilinear and trilinear, which refers to how many axes it interpolates with. Read it like bi-linear and tri-linear, and it makes sense. ;)

I wouldn't be too worried about ROP count. Throughput is what matters, and I haven't seen Nvidia bottleneck their cards with raster throughput yet.
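Since the bi-/tri- naming keeps coming up: a minimal sketch of what the extra interpolation axis buys you. Bilinear interpolates along the two texture axes within one mip level; trilinear additionally blends between the two nearest mip levels, which is what hides the visible LOD transitions mentioned earlier (and is what the "brilinear" optimization skimps on). This is illustrative code only, not any driver's actual filtering path.

```python
import numpy as np

def bilinear_sample(tex: np.ndarray, u: float, v: float) -> float:
    """Sample a 2D texture (H x W) at normalized coords (u, v): linear
    interpolation along two axes, hence 'bi-linear'."""
    h, w = tex.shape
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = tex[y0, x0] * (1 - fx) + tex[y0, x1] * fx
    bot = tex[y1, x0] * (1 - fx) + tex[y1, x1] * fx
    return top * (1 - fy) + bot * fy

def trilinear_sample(mips: list, u: float, v: float, lod: float) -> float:
    """'Tri-linear': bilinear samples from the two nearest mip levels,
    blended by the fractional LOD so mip transitions stay smooth."""
    lo = int(np.floor(lod))
    hi = min(lo + 1, len(mips) - 1)
    f = lod - lo
    return bilinear_sample(mips[lo], u, v) * (1 - f) + bilinear_sample(mips[hi], u, v) * f
```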
 
Joined
Dec 6, 2018
Messages
342 (0.16/day)
Location
Hungary
Processor i5-9600K
Motherboard ASUS Prime Z390-A
Cooling Cooler Master Hyper 212 Black Edition PWM
Memory G.Skill DDR4 RipjawsV 3200MHz 16GB kit
Video Card(s) Asus RTX2060 ROG STRIX GAMING
Display(s) Samsung Odyssey G7 27"
Case Cooler Master MasterCase H500
Power Supply SUPER FLOWER Leadex Gold 650W
Mouse BenQ Zowie FK1+-B
Keyboard Cherry KC 1000
Software Win 10
I can see the Nvidia bashing is in full swing based on nothing but speculated specs…


Are you talking about texture filtering or about interpolating to a higher resolution?
For texture filtering I just assume people run 16x AF, as it is so cheap. I remember it being a thing almost 20 years ago, but haven't seen much of it since. Running without AF would usually be a blurry mess, unless the far textures are very high resolution, then you'll get a headache-inducing flickering nightmare. AF isn't perfect though, you can get a very visible "banding effect", where it's either flickering or blurry. Games have the ability to control this, but success will vary depending on the configuration.
Are there particular games which are known to do this badly?


It's called bilinear and trilinear, which refers to how many axes it interpolates with. Read it like bi-linear and tri-linear, and it makes sense. ;)

I wouldn't be too worried about ROP count. Throughput is what matters, and I haven't seen Nvidia bottleneck their cards with raster throughput yet.
Why are AMD cultists so loud everywhere on the internet? Most people preferring Nvidia products simply buy them and enjoy them, without going on pro-Nvidia or anti-AMD propaganda crusades. AMD cultists, on the other hand, feel an elemental urge to bash Nvidia everywhere on the internet, while having zero criticism of AMD, as if AMD cards were being handed out for free. I just don't get it. Any video-card-related discussion on the internet ends up becoming an AMD sermon, and honestly I don't really like reading forums and comments anymore.
 
Joined
May 11, 2018
Messages
1,254 (0.53/day)
I see a lot of Nvidia bashing from long-time Nvidia users. These "bashings" are often focused on one aspect, and often AMD doesn't even enter the picture - I mean, why should they? They command about 8% of the discrete graphics card market, and I think the first 15 most-represented cards in the Steam survey are from Nvidia. To brand everyone who objects to or criticizes Nvidia's choices regarding products, pricing, etc. as the work of "AMD cultists" is a conspiracy theory.
 
Joined
Dec 6, 2018
Messages
342 (0.16/day)
Location
Hungary
Processor i5-9600K
Motherboard ASUS Prime Z390-A
Cooling Cooler Master Hyper 212 Black Edition PWM
Memory G.Skill DDR4 RipjawsV 3200MHz 16GB kit
Video Card(s) Asus RTX2060 ROG STRIX GAMING
Display(s) Samsung Odyssey G7 27"
Case Cooler Master MasterCase H500
Power Supply SUPER FLOWER Leadex Gold 650W
Mouse BenQ Zowie FK1+-B
Keyboard Cherry KC 1000
Software Win 10
I see a lot of Nvidia bashing from long-time Nvidia users. These "bashings" are often focused on one aspect, and often AMD doesn't even enter the picture - I mean, why should they? They command about 8% of the discrete graphics card market, and I think the first 15 most-represented cards in the Steam survey are from Nvidia. To brand everyone who objects to or criticizes Nvidia's choices regarding products, pricing, etc. as the work of "AMD cultists" is a conspiracy theory.
I wish you were right. AMD has a cult following, and it's getting worse and worse over the years. They are a loud minority overtaking every discussion. I've been silently raising my eyebrows over this for years.
 
Joined
May 11, 2018
Messages
1,254 (0.53/day)
I've only encountered this type of behavior in WCCFTech discussions or other similarly unmoderated forums where everything devolves into name-calling and fanboyism. But nobody really reads those.
 
Joined
Sep 17, 2014
Messages
22,437 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Why are AMD cultists so loud everywhere on the internet? Most people preferring Nvidia products simply buy them and enjoy them, without going on pro-Nvidia or anti-AMD propaganda crusades. AMD cultists, on the other hand, feel an elemental urge to bash Nvidia everywhere on the internet, while having zero criticism of AMD, as if AMD cards were being handed out for free. I just don't get it. Any video-card-related discussion on the internet ends up becoming an AMD sermon, and honestly I don't really like reading forums and comments anymore.
The misconception here is that you say they are AMD cultists. That is effectively flame baiting with no basis; and it colors your own glasses.

The fact is, whatever company was producing a shit line up, got flak for it at its time. There are indeed strong followings on both brands. But the overwhelming, vast majority is simply considering options every gen that gets released. Compares the offers. Talks about the changes from one gen to the next. Compares perf/$ and featureset. Is, or is not affected by past experiences. And then makes a choice.

I've been on the Nvidia boat for a long time, for the simple fact that they had better products. Since Turing, that 'norm' has been fading away, slowly but surely - perf/$ has been taking a nosedive since RTX, TDPs go up, performance is achieved by stacking many proprietary technologies on top of games, and some shit just won't even run at all now without extensive Nvidia TLC. That's pretty much where I conclude this company doesn't offer me what I want anymore. As much as I didn't want to get tied into Nvidia's profit scheme bullshit with Gsync by buying a monitor with it, I have the exact same sentiment wrt all the proprietary must-haves they push today to feed the insatiable RT monster for a few realtime rays. Matter of fact, f*ck that whole strategy entirely. Their GPUs get worse every gen and I'm not supporting that nonsense.

Meanwhile, AMD's overall quality on GPUs has massively improved since RDNA1, RDNA2 was near perfect that way + baked on a much better node, and RDNA3 seems to continue that mode. At the same time, the products seem to be priced a little less into insanity, the feature set is more than sufficient, and they're not 3 slot behemoths that require a spiderweb of adapters. More importantly though, it is AMD that truly pushes the gaming market forward, for the simple fact that they own the consoles and Nvidia does not. The most important gamer market share is effectively with AMD. Nvidia has taken over most of the PC share (last I saw was 80+%), and yet they still can't define it properly. RT support still isn't commonplace and most implementations are lackluster at best; console ports won't have it and they just got refreshed, more games are still released without it than with it, and it's just a small set of effects every time. When we get fully path-traced games, the performance takes an immense nosedive (=Portal) and requires yet more proprietary nonsense. We're still solidly in early adopter land - who cares if AMD loses 10% FPS with RT on compared to Nvidia. I really don't.

Far more interesting for the long-term advancement of graphics cards is a technology like MCM/chiplets. That's where the real movement in the market will come from. And it's also an approach that enables enough horsepower to run RT; right now the chiplets are identical, but they probably don't have to remain so. There's a lot of new fruit on this tree; there is none left on the CUDA tree in a monolithic floor plan. It's like Intel's Core - way past EOL, but still pushed forward.

If you don't like that honesty, that is entirely your problem, but I would take a long look in the mirror for the real fix.
The underlying emotion there is that you need peer pressure to support your idea that Nvidia is still king of the hill, but that principle seems to be under pressure. It feels uncomfortable to you; you prefer ignorance = bliss, or not hearing an opinion that doesn't align with yours. But how is that relevant to anyone else?
 
Last edited:
Joined
Jun 10, 2014
Messages
2,985 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
I see a lot of Nvidia bashing from long-time Nvidia users. These "bashings" are often focused on one aspect, and often AMD doesn't even enter the picture - I mean, why should they? They command about 8% of the discrete graphics card market, and I think the first 15 most-represented cards in the Steam survey are from Nvidia.
How could AMD gain a larger market share when they don't produce enough cards to truly compete with Nvidia? Availability has been an issue at least since the Fury series; RX Vega was mostly nowhere to be found, the Radeon VII was limited to a few thousand copies, and the 400/500 series weren't that great in availability either. The availability of AMD cards varies a lot by region, and this also affects retail pricing, as stores that only get limited supplies of AMD cards are going to price them at a premium.
I really miss the pre-GCN days of AMD/ATI, when they offered great value in the upper mid-range to lower high-end segments, with great availability and often priced below MSRP.
 
Joined
Jan 29, 2021
Messages
1,846 (1.32/day)
Location
Alaska USA
I wish you were right. AMD has a cult following, and it's getting worse and worse over the years. They are a loud minority overtaking every discussion. I've been silently raising my eyebrows over this for years.
This ^^ ... they've been the scourge of the internet ever since the release of C2D. That lot reminds me of what the offspring would turn out to be if you crossed a Bolshevik with an anarchist. All I ever see is 'Ngreedia' and 'Intel is bad'.
 
Joined
Sep 17, 2014
Messages
22,437 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
This ^^ ... they've been the scourge of the internet ever since the release of C2D. That lot reminds me of what the offspring would turn out to be if you crossed a Bolshevik with an anarchist. All I ever see is 'Ngreedia' and 'Intel is bad'.
The hardcore fanbase indeed does exist, and ironically... they damage AMD more than they help it. They're basically making it repulsive, because a lot of what's said is just plain untrue.

But I think we're pretty low on that fanbase on TPU.
 
Joined
Jan 29, 2021
Messages
1,846 (1.32/day)
Location
Alaska USA
The hardcore fanbase indeed does exist, and ironically... they damage AMD more than they help it. They're basically making it repulsive, because a lot of what's said is just plain untrue.

But I think we're pretty low on that fanbase on TPU.
I agree with everything in your post other than the bolded part. The TPU poll results speak for themselves.
 
Joined
Sep 17, 2014
Messages
22,437 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I agree with everything in your post other than the bolded part. The TPU poll results speak for themselves.
See, that's exactly the point; the poll doesn't lie, numbers don't lie - the sentiment exists, simple.

People voice an opinion; we have yet to see how it materializes in sales. The fact is, though, that Ada is on shelves, not sold out. And AMD is going to be pretty competitive with the 4080, which isn't being bought anyway.
 
Joined
Dec 6, 2018
Messages
342 (0.16/day)
Location
Hungary
Processor i5-9600K
Motherboard ASUS Prime Z390-A
Cooling Cooler Master Hyper 212 Black Edition PWM
Memory G.Skill DDR4 RipjawsV 3200MHz 16GB kit
Video Card(s) Asus RTX2060 ROG STRIX GAMING
Display(s) Samsung Odyssey G7 27"
Case Cooler Master MasterCase H500
Power Supply SUPER FLOWER Leadex Gold 650W
Mouse BenQ Zowie FK1+-B
Keyboard Cherry KC 1000
Software Win 10
The misconception here is that you say they are AMD cultists. That is effectively flame baiting with no basis; and it colors your own glasses.
Guilty as charged, I'm indeed wearing glasses, but I've been forced to do so.

The fact is, whatever company was producing a shit line up, got flak for it at its time.

Shit product? Why, by what metric? Every new generation offers better performance than the last one; nothing else is important. Covid and the war messed with production costs, and everything is getting more expensive, if you haven't noticed. Sure, there is some corpo greed in the equation, but demanding cheaper and cheaper generations is simply utopia. It's the other way around, and stuff will always get more expensive unless someone invents robots that can grow tech on trees. Inflation is hardcoded into our current wonderful economy; even the most peaceful golden years of happiness have 2-5% inflation in them.

performance is achieved by stacking many proprietary technologies on top of games,

Welcome to the corporate world. Big tech is forced to invent new ways to increase profits for extremely demanding shareholders. This is not exclusive to big tech.

profit scheme bullshit with Gsync by buying a monitor with it

Execution might not be perfect, but it's a nice innovation. I like innovation. Nvidia probably spent a truckload of time and money developing it; I'm not an idealist, and I understand that they try to monetize it as much as they can.

to feed the insatiable RT monster for a few realtime rays.

Ray tracing, and hardware ray tracing in particular, is one of the most awesome revolutions in computer graphics of the last few decades. I get that everyone loves to hate it for various reasons, but again, I love innovation. I can't wait to see it implemented perfectly. On some occasions it's eye-watering already; just look at Atomic Heart's RT shadows. That's what I want to see in gaming: innovation, not stupid price wars over 50-dollar differences between red and green.

Their GPUs get worse every gen and I'm not supporting that nonsense.

Strange, my 2060 is worse than the 3060, and the 3060 is worse than what the 4060 will be. Everything else is biased or emotional nonsense. Prices don't matter, because the world economy is hardcoded to make us pay more as time goes on.

Meanwhile, AMD's overall quality on GPUs has massively improved since RDNA1, RDNA2 was near perfect that way, and RDNA3 seems to continue that mode. At the same time, the products seem to be priced a little less into insanity, the feature set is more than sufficient, and they're not 3 slot behemoths.

I agree, AMD has made some efforts, but it's still doing the old "buy me, I have big memory that never gets utilized!" trick. Their software is still slow and abysmal, and their answers to Nvidia's innovations are still lackluster. I'm waiting for AMD innovation that finally gets copied by Nvidia, for once. Getting AMD saves you 50 bucks at the time of purchase but closes so many doors to cool possibilities and useful tools: RTX Voice, CUDA, the excellent NVENC hardware H.264/H.265 encoder, etc.

I'd be pleased to switch to AMD if they weren't always behind, panting and trying to catch up.
 
Joined
Sep 17, 2014
Messages
22,437 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Guilty as charged, I'm indeed wearing glasses, but I've been forced to do so.

Shit product? Why, by what metric? Every new generation offers better performance than the last one; nothing else is important. Covid and the war messed with production costs, and everything is getting more expensive, if you haven't noticed. Sure, there is some corpo greed in the equation, but demanding cheaper and cheaper generations is simply utopia. It's the other way around, and stuff will always get more expensive unless someone invents robots that can grow tech on trees. Inflation is hardcoded into our current wonderful economy; even the most peaceful golden years of happiness have 2-5% inflation in them.

Welcome to the corporate world. Big tech is forced to invent new ways to increase profits for extremely demanding shareholders. This is not exclusive to big tech.



Execution might not be perfect, but it's a nice innovation. I like innovation. Nvidia probably spent a truckload of time and money developing it; I'm not an idealist, and I understand that they try to monetize it as much as they can.



Ray tracing, and hardware ray tracing in particular, is one of the most awesome revolutions in computer graphics of the last few decades. I get that everyone loves to hate it for various reasons, but again, I love innovation. I can't wait to see it implemented perfectly. On some occasions it's eye-watering already; just look at Atomic Heart's RT shadows. That's what I want to see in gaming: innovation, not stupid price wars over 50-dollar differences between red and green.



Strange, my 2060 is worse than the 3060, and the 3060 is worse than what the 4060 will be. Everything else is biased or emotional nonsense. Prices don't matter, because the world economy is hardcoded to make us pay more as time goes on.



I agree, AMD has made some efforts, but it's still doing the old "buy me, I have big memory that never gets utilized!" trick. Their software is still slow and abysmal, and their answers to Nvidia's innovations are still lackluster. I'm waiting for AMD innovation that finally gets copied by Nvidia, for once. Getting AMD saves you 50 bucks at the time of purchase but closes so many doors to cool possibilities and useful tools: RTX Voice, CUDA, the excellent NVENC hardware H.264/H.265 encoder, etc.

I'd be pleased to switch to AMD if they weren't always behind, panting and trying to catch up.
Nobody forces you to do anything. You've always been free to make your own choices based on your own rationale. It's interesting to read and weigh that rationale and compare it to one's own. When it stops being interesting, that's a sign you're better off doing something else ;)

About corporations having to do what they have to do... yeah. Okay. So we as customers have to be complacent and beg for the next iteration of ass rape? I'll pass, thanks. Markets function because customers vote with wallets and convert their sentiment into action (or inaction).

As for being 'behind', I agree, AMD was always playing catch-up. But they're not today - it's a mistake to think so. They're technologically leaps and bounds ahead of their two largest competitors, having built experience in the future of chip technology / scaling options. Whatever happened to Nvidia's MCM whitepaper? And Intel's stacking technologies? They're still pushing monolithic behemoths. And even if they do shrink, they still need to expand TDP to meet their performance targets - that's not really progress in my book, that's just pushing more volts through ever bigger chips, and it's reaching the end of the line.

As for RT development: sure, the innovation is neat. At the same time it's a tool to create demand and pull people into 'adoption'. If GPUs remained sanely priced (or even just aligned to inflation, fine!), I would be all-in on supporting it with my wallet too.

But that's not what happened. This is what happened: You're even paying through the nose for something as shitty as a 3050; this isn't inflation here. This is what happens when a market/niche is cornered by a single company; practices that don't benefit us in the slightest.

Progress in perf/$ is deeply negative from Ampere to Ada, and Ampere is still sold at a premium on top of it.
If you want to see progress in that direction... you do you. I don't.

Also, there is another long-term consideration. If you value gaming, it would help you and us if it weren't getting priced out of reach for the smaller wallets. When a 3050 starts at 300,- that's quickly moving into territory where gaming is for the haves, and the rest are have-nots. What do you think comes next? More RT content? You can safely forget about it. What you'll get is predatory practices aimed at those last idiots still doing PC gaming on their >1k GPUs, and far fewer games that push the boundary (there is no market left). State-of-the-art games cost money, so they need big market adoption. The games that do still get released are going to be console ports (driven by AMD) or lackluster altogether.

Nvidia is actively damaging everything we value with its current strategy. But wee innovation! They have a few more frames in RT!

[attached image: perf/$ chart]
 
Last edited:
Joined
Dec 6, 2018
Messages
342 (0.16/day)
Location
Hungary
Processor i5-9600K
Motherboard ASUS Prime Z390-A
Cooling Cooler Master Hyper 212 Black Edition PWM
Memory G.Skill DDR4 RipjawsV 3200MHz 16GB kit
Video Card(s) Asus RTX2060 ROG STRIX GAMING
Display(s) Samsung Odyssey G7 27"
Case Cooler Master MasterCase H500
Power Supply SUPER FLOWER Leadex Gold 650W
Mouse BenQ Zowie FK1+-B
Keyboard Cherry KC 1000
Software Win 10
Nobody forces you to do anything. You've always been free to make your own choices based on your own rationale. It's interesting to read and weigh that rationale and compare it to one's own. When it stops being interesting, that's a sign you're better off doing something else ;)

About corporations having to do what they have to do... yeah. Okay. So we as customers have to be complacent and beg for the next iteration of ass rape? I'll pass, thanks. Markets function because customers vote with wallets and convert their sentiment into action (or inaction).

As for being 'behind', I agree, AMD was always playing catch-up. But they're not today - it's a mistake to think so. They're technologically leaps and bounds ahead of their two largest competitors, having built experience in the future of chip technology / scaling options. Whatever happened to Nvidia's MCM whitepaper? And Intel's stacking technologies? They're still pushing monolithic behemoths. And even if they do shrink, they still need to expand TDP to meet their performance targets - that's not really progress in my book, that's just pushing more volts through ever bigger chips, and it's reaching the end of the line.

As for RT development: sure, the innovation is neat. At the same time it's a tool to create demand and pull people into 'adoption'. If GPUs remained sanely priced (or even just aligned to inflation, fine!), I would be all-in on supporting it with my wallet too.

But that's not what happened. This is what happened: You're even paying through the nose for something as shitty as a 3050; this isn't inflation here. This is what happens when a market/niche is cornered by a single company; practices that don't benefit us in the slightest.

Progress in perf/$ is deeply negative from Ampere to Ada, and Ampere is still sold at a premium on top of it.
If you want to see progress in that direction... you do you. I don't.

Also, there is another long-term consideration. If you value gaming, it would help you and us if it weren't getting priced out of reach for the smaller wallets. When a 3050 starts at 300,- that's quickly moving into territory where gaming is for the haves, and the rest are have-nots. What do you think comes next? More RT content? You can safely forget about it. What you'll get is predatory practices aimed at those last idiots still doing PC gaming on their >1k GPUs, and far fewer games that push the boundary (there is no market left). State-of-the-art games cost money, so they need big market adoption.

Nvidia is actively damaging everything we value with its current strategy. But wee innovation! They have a few more frames in RT!

[attached image: perf/$ chart]
That single company may abuse its position, but everyone does that until competition arrives. Where is the competition? I want AMD to break green prices, I want AMD to out-innovate green. All I see is that they both cost almost the same in my country, and one has ABYSMAL software, with internet forums showered with game-crash and performance complaints from AMD users whenever a new game comes out. Overwhelmingly from AMD users; just remember this and see for yourself. What's going on with AMD software? Why the hell can't they fix this shit already? It's getting stale after a decade of horrible software and slow game support.

And I'm not an idealist like the aforementioned "AMD cultists" are. I'm not going to support the small guy just to give the finger to the big guy. I might consider AMD if they get their shit together and do some miracle innovation, or if Nvidia seriously f*cks up.

Regarding your perf/$ chart: are you sure 1440p and an RTX 3050 are a fair comparison? Who in their right mind would buy a 3050 for 1440p?
 