
NVIDIA GeForce RTX 4070 and RTX 4070 Ti Detailed Specs Sheet Leaks

Joined
Dec 31, 2020
Messages
980 (0.69/day)
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
The 3050 exists for the same reason the 4070 exists: buyers decide not to pay more for the extra 30% of CUDA cores they don't need anyway.

They also occupy almost the same place in the lineup, that of the half-sized die, as opposed to the 384-bit, 600 mm² flagship:

192-bit, 276 mm², 12,000 million transistors
192-bit, 295 mm², 35,800 million transistors, triple the count.

Ah, I forgot the 3050 is 128-bit, but a card like that based on AD104 may exist, who knows.
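For perspective, the density gap those figures imply can be sanity-checked with a quick back-of-the-envelope script (the die sizes and transistor counts above are the commonly quoted approximate ones):

```python
# Back-of-the-envelope transistor density from the figures quoted above.
# Treat the exact numbers as approximate, published-spec values.

def density_mtr_per_mm2(transistors_millions: float, die_area_mm2: float) -> float:
    """Return transistor density in millions of transistors per mm^2."""
    return transistors_millions / die_area_mm2

ga106_class = density_mtr_per_mm2(12_000, 276)  # ~43.5 Mtr/mm^2
ad104_class = density_mtr_per_mm2(35_800, 295)  # ~121.4 Mtr/mm^2
print(f"ratio: ~{ad104_class / ga106_class:.1f}x")  # roughly the "triple" above
```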
 

bug

Joined
May 22, 2015
Messages
13,759 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Are you talking about texture filtering or about interpolating to a higher resolution?
For texture filtering, I just assume people run 16x AF, as it's so cheap. I remember it being a thing almost 20 years ago, but I haven't seen much of it since. Running without AF would usually be a blurry mess; unless the far textures are very high resolution, in which case you'll get a headache-inducing flickering nightmare instead. AF isn't perfect though: you can get a very visible banding effect, where it's either flickering or blurry. Games have the ability to control this, but success will vary depending on the configuration.
Are there particular games which are known to do this badly?
I meant going into the Nvidia control panel and switching to high quality, to force the use of trilinear instead of brilinear.
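For anyone unfamiliar with the trick, here's a toy sketch of the difference: trilinear always blends the two nearest mip levels, while "brilinear" snaps to the nearest level outside a narrow transition band to save texture samples (the band width below is an illustrative knob, not an actual driver value):

```python
def trilinear_weight(lod: float) -> float:
    """Blend fraction between mip floor(lod) and floor(lod)+1: always the
    fractional part, so the two levels are mixed everywhere."""
    return lod - int(lod)

def brilinear_weight(lod: float, band: float = 0.25) -> float:
    """'Brilinear' optimization: only blend inside a narrow band around the
    mip transition; elsewhere snap to one level (weight 0 or 1), which can
    show up as visible banding. The band width is purely illustrative."""
    f = lod - int(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if f < lo:
        return 0.0          # finer mip only
    if f > hi:
        return 1.0          # coarser mip only
    return (f - lo) / band  # compressed blend inside the band

# At lod = 3.1, trilinear still mixes mips 3 and 4 a little; brilinear doesn't:
print(trilinear_weight(3.1), brilinear_weight(3.1))
```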
 
Joined
Apr 8, 2008
Messages
339 (0.06/day)
System Name Xajel Main
Processor AMD Ryzen 7 5800X
Motherboard ASRock X570M Steel Legened
Cooling Corsair H100i PRO
Memory G.Skill DDR4 3600 32GB (2x16GB)
Video Card(s) ZOTAC GAMING GeForce RTX 3080 Ti AMP Holo
Storage (OS) Gigabyte AORUS NVMe Gen4 1TB + (Personal) WD Black SN850X 2TB + (Store) WD 8TB HDD
Display(s) LG 38WN95C Ultrawide 3840x1600 144Hz
Case Cooler Master CM690 III
Audio Device(s) Built-in Audio + Yamaha SR-C20 Soundbar
Power Supply Thermaltake 750W
Mouse Logitech MK710 Combo
Keyboard Logitech MK710 Combo (M705)
Software Windows 11 Pro
Remember, you can't compare Ada MSRPs with Ampere MSRPs, because those were fake and street prices were much higher for the majority of that generation.

So don't forget to compare against scalped prices, and only imagination limits how expensive you want them to appear, so the Ada cards will look inviting!

Imagination!

Remember, this is exactly what NV wants you to believe. They just couldn't stand watching scalpers and e-tailers take so much profit over MSRP while NV got none. They say consumers are simply willing to pay, so they priced the 4080 accordingly, grabbing as much profit as they can.
 
Joined
Jan 20, 2019
Messages
1,552 (0.73/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
Prices don't matter, because the world economy is hardcoded to make us pay more as time goes on.

For the 1% minority I agree. For the 99% majority, prices absolutely matter. Yep, the world economy and recent events have affected the product and services market, but recent GPU MSRPs are hardly a result of that... more like Nvidia's and AMD's in-house world economies, with shareholders as governing benefactors, are the cause of the current milked-dry MSRPs (NVIDIA clearly being the bigger offender). As much as businesses are free to push up profits like there's no tomorrow and completely disregard the popular customer base at the low and mid end, no one should be surprised by the growing counter-criticism that goes hand in hand with it.

The shocking part of it all: you are either a "consumer" who wants the best or reasonable value for your hard-earned cash (or will even pay a little on top for a satisfying experience), or you're a brand affiliate who has a problem with broader consumer sentiment... you can't be both. I've been accused of being an AMD-pusher for resenting the current 4080/4090 inflation-standardised MSRPs, which is absolutely absurd... I haven't bought an AMD card since Adam and Eve. I would still throw myself at a 4080 should Nvidia axe it down to $800, but that ain't gonna happen, well, not anytime soon.

I'm now ready to pull the trigger on anything for ~$800, whether RTX or RDNA3, provided the performance uplift widely surpasses my current 2080 Ti. In your opinion, which would be more likely for the ~$800 grab, RTX or RDNA3? (Keep in mind, $800 for a mid-range high-performance card is already a f'ing piss-take.)
 
Joined
Dec 6, 2018
Messages
342 (0.16/day)
Location
Hungary
Processor i5-9600K
Motherboard ASUS Prime Z390-A
Cooling Cooler Master Hyper 212 Black Edition PWM
Memory G.Skill DDR4 RipjawsV 3200MHz 16GB kit
Video Card(s) Asus RTX2060 ROG STRIX GAMING
Display(s) Samsung Odyssey G7 27"
Case Cooler Master MasterCase H500
Power Supply SUPER FLOWER Leadex Gold 650W
Mouse BenQ Zowie FK1+-B
Keyboard Cherry KC 1000
Software Win 10
For the 1% minority I agree. For the 99% majority, prices absolutely matter. Yep, the world economy and recent events have affected the product and services market, but recent GPU MSRPs are hardly a result of that... more like Nvidia's and AMD's in-house world economies, with shareholders as governing benefactors, are the cause of the current milked-dry MSRPs (NVIDIA clearly being the bigger offender). As much as businesses are free to push up profits like there's no tomorrow and completely disregard the popular customer base at the low and mid end, no one should be surprised by the growing counter-criticism that goes hand in hand with it.

The shocking part of it all: you are either a "consumer" who wants the best or reasonable value for your hard-earned cash (or will even pay a little on top for a satisfying experience), or you're a brand affiliate who has a problem with broader consumer sentiment... you can't be both. I've been accused of being an AMD-pusher for resenting the current 4080/4090 inflation-standardised MSRPs, which is absolutely absurd... I haven't bought an AMD card since Adam and Eve. I would still throw myself at a 4080 should Nvidia axe it down to $800, but that ain't gonna happen, well, not anytime soon.

I'm now ready to pull the trigger on anything for ~$800, whether RTX or RDNA3, provided the performance uplift widely surpasses my current 2080 Ti. In your opinion, which would be more likely for the ~$800 grab, RTX or RDNA3? (Keep in mind, $800 for a mid-range high-performance card is already a f'ing piss-take.)
Don't get me wrong, but you're on a pretty decent card; you have the choice to rebel against the MSRP prices. But imagine if you ran a 2060 6GB card like me. When I bought it in 2019, I thought it would be future-proof for 5-6 years, but PC games in the last couple of years became increasingly demanding, pretty steeply actually. I was fine with a 750 Ti for many years before that. The days of PC gaming as a cheap or affordable hobby are over. Some people need more time to accept this. It's not only Nvidia's fault; the whole system sucks. PC gaming is a niche market, and prices will continue to creep up. New generations of gamers will start buying hardware, and they won't even know how cheap this stuff used to be. After a while, only us "boomers" will remember the good old times when you could hop generations effortlessly. PC gaming will become a luxury hobby.

How in the hell is it possible that a simple case fan that doesn't ear-rape you costs 40 bucks? A stupid piece of plastic for 40 bucks. And there is enormous competition in the fan market, and they still treat us like fools. So what's so surprising about the GPU giants' shenanigans, where there is almost non-existent competition?

Only two things can mitigate the problem:
- Extreme competition to break down prices (unlikely, because it's a niche market and very few entities can enter it)
- Reaching a point where it's no longer necessary to upgrade, where you can run a card for 15 years, so a one-time investment makes the prices look much better
 
Joined
Sep 17, 2014
Messages
22,437 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Don't get me wrong, but you're on a pretty decent card; you have the choice to rebel against the MSRP prices. But imagine if you ran a 2060 6GB card like me. When I bought it in 2019, I thought it would be future-proof for 5-6 years, but PC games in the last couple of years became increasingly demanding, pretty steeply actually. I was fine with a 750 Ti for many years before that. The days of PC gaming as a cheap or affordable hobby are over. Some people need more time to accept this. It's not only Nvidia's fault; the whole system sucks. PC gaming is a niche market, and prices will continue to creep up. New generations of gamers will start buying hardware, and they won't even know how cheap this stuff used to be. After a while, only us "boomers" will remember the good old times when you could hop generations effortlessly. PC gaming will become a luxury hobby.

How in the hell is it possible that a simple case fan that doesn't ear-rape you costs 40 bucks? A stupid piece of plastic for 40 bucks. And there is enormous competition in the fan market, and they still treat us like fools. So what's so surprising about the GPU giants' shenanigans, where there is almost non-existent competition?

Only two things can mitigate the problem:
- Extreme competition to break down prices (unlikely, because it's a niche market and very few entities can enter it)
- Reaching a point where it's no longer necessary to upgrade, where you can run a card for 15 years, so a one-time investment makes the prices look much better
Did you know your 2060 6GB is a direct match for the 1080 8GB in core performance? That right there is the writing on the wall: GPUs moved to a tighter fit the moment RT got introduced. And the 2060 was effectively one of the best offers in the (early) Turing stack as well. That was the first time perf/dollar came to a near-complete standstill over the two years between generations. Meanwhile, it couldn't, can't, and will never do any kind of meaningful RT. Although, fair's fair, the x60 was never endowed with enough VRAM to last longer than 3-4 years. But that reduction right there is a painful one. The same thing happened with the 3080 10GB: trimmed down below par. The 1080 I have now is getting long in the tooth too, but I'm still pushing 3440x1440 on it, see games exceed 6GB VRAM, and they run well, especially with FSR. Small difference: the latter card is now reaching the age of 6-6.5 years.

TL;DR: your expectations weren't wrong, but you bought the wrong card to meet them. When you're looking to last 4+ years with a GPU, you want VRAM headroom and enough bandwidth, the exact things Nvidia has been cutting down since Turing.

That's, in a nutshell, what has been happening across the Nvidia stack at large. Today we see a 4090 that already struggles on release in two notable games: Cyberpunk at full tilt, and Portal, the latter being a horribly simplified fully path-traced application; after all, its geometric and texture simplicity offers major optimization opportunities. We're like those donkeys chasing the carrot. We'll never eat it, but we'll sniff it from time to time, if only we keep running after the latest and greatest. I've never been a donkey like that; it just doesn't feel right to me. I feel like I'm being taken for a ride.

Innovation, progress? I guess so, but all I see is pretty limited progress for three generations' worth of innovation... at an extreme performance cost. Even with DLSS3 on, the FPS gets a factor of 4-5 worse in Portal. Without DLSS3, it's a factor of 20 worse or even more. Is it a better game for it? I'm really not seeing it, tbh...

It's worth questioning this push.
 
Joined
Jul 1, 2011
Messages
362 (0.07/day)
System Name Matar Extreme PC.
Processor Intel Core i9-12900KS 5.3GHZ All P-Cores ,4.2GHZ All E-Cores & Ring 4.2GhZ
Motherboard NZXT N5 Z690 Wi-Fi 6E
Cooling CoolerMaster ML240L V2 AIO with MX6
Memory 4x16 64GB DDR4 3600MHZ CL16-19-19-36-55 G.SKILL Trident Z NEO
Video Card(s) Nvidia ZOTAC RTX 3080 Ti Trinity + overclocked 100 core 1000 mem. Re-pasted MX6
Storage WD black 1GB Nvme OS + 1TB 970 Nvme Samsung & 4TB WD Blk 256MB cache 7200RPM
Display(s) Lenovo 34" Ultra Wide 3440x1440 144hz 1ms G-Snyc
Case NZXT H510 Black with Cooler Master RGB Fans
Audio Device(s) Internal , EIFER speakers & EasySMX Wireless Gaming Headset
Power Supply Aurora R9 850Watts 80+ Gold, I Modded cables for it.
Mouse Onn RGB Gaming Mouse & Logitech G923 & shifter & E-Break Sim setup.
Keyboard GOFREETECH RGB Gaming Keyboard, & Xbox 1 X Controller & T-Flight Hotas Joystick
VR HMD Oculus Rift S
Software Windows 10 Home 22H2
Benchmark Scores https://www.youtube.com/user/matttttar/videos
HAPPY with my RTX 3070
 
Joined
Feb 20, 2021
Messages
311 (0.23/day)
System Name Office,Home and Game PC
Processor Intel Core i5 12600k Up to 4.9 GHz
Motherboard Z690 Gaming X Gigabyte DDR4 Version
Cooling Fuma 2 Air Cooler
Memory 32GB DDR4 2x16 3600 MHz Patriot Viper Steel RAM
Video Card(s) NVIDIA GeForce GTX 1080 and RTX 3070
Storage 512 GB M2 PCI Ex 3.0 NVMe SX6000 Pro, 1TB NV2 Kingston M2 PCI Ex 4.0 and 4TB WD Blue SATA 3.0 HDD
Display(s) 27 inç 75 Hz LG
Case Cooler Master MB511
Audio Device(s) Creative 2+1
Power Supply 750W 80+ Bronze PSU High Power Element
Mouse Logitech Wireless
Keyboard Microsoft
VR HMD N/A
Software Windows 10-11

Hxx

Joined
Dec 5, 2013
Messages
303 (0.08/day)
NVidia is really crippling that bus. 192-bit on cards this powerful is such a dick move. If they want market segmentation, they can just play with the number of cores; I'm not sure what they're doing. Both of those cards will take a huge hit at 4K and beyond. I would skip both if you play at a resolution higher than 1440p.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
NVidia is really crippling that bus. Even a 256-bit bus will get choked at 4K, probably the main reason a 4080 will lose the 4K benchies against a 7900 XTX. Not sure what Nvidia is thinking; there are plenty of gamers who want 4K high refresh.

The 4080 should lose all the benchies against the XTX. If it doesn't, AMD is in big, big trouble, because then it has no competitor for the 4090. I thought the XTX targets the 4090, not the pitiful, lower-grade 4080.

Navi 21 was relatively slower at 2160p exactly because of its limited memory throughput, which the Infinity Cache of only 128 MB couldn't compensate for.

i would skip both if you play at a resolution higher than 1440p.

Not really. They will be very fast even for 2160p, because you can set the in-game settings to medium, high, very high, or ultra depending on the demands of the particular game engine.
Hell, you'll even be able to play CS:GO at 4320p, no worries :D
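For reference, raw bandwidth falls straight out of bus width and per-pin data rate. These are the commonly cited launch figures, and they ignore the large L2/Infinity Cache that both vendors now lean on, so treat them as upper bounds on DRAM traffic rather than a full picture:

```python
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Raw memory bandwidth in GB/s: bus width in bits times per-pin
    data rate, divided by 8 bits per byte."""
    return bus_bits * gbps_per_pin / 8

# Commonly cited launch specs (approximate):
print(bandwidth_gbs(256, 22.4))  # RTX 4080: 716.8 GB/s
print(bandwidth_gbs(384, 20.0))  # 7900 XTX: 960.0 GB/s
print(bandwidth_gbs(192, 21.0))  # RTX 4070 Ti: 504.0 GB/s
```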
 

Hxx

Joined
Dec 5, 2013
Messages
303 (0.08/day)
The 4080 should lose all the benchies against the XTX. If it doesn't, AMD is in big, big trouble, because then it has no competitor for the 4090. I thought the XTX targets the 4090, not the pitiful, lower-grade 4080.

Navi 21 was relatively slower at 2160p exactly because of its limited memory throughput, which the Infinity Cache of only 128 MB couldn't compensate for.

Sorry, I edited my post, but yeah, I agree. I'm so pissed at Nvidia releasing a $1.2k card on a 256-bit bus... like, wtf. That's why the gap between the XTX and the 4080 will be much bigger at 4K than at lower resolutions.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
Sorry, I edited my post, but yeah, I agree. I'm so pissed at Nvidia releasing a $1.2k card on a 256-bit bus... like, wtf. That's why the gap between the XTX and the 4080 will be much bigger at 4K than at lower resolutions.

Unfortunately, the mining and scalper-ruled times are not in the past :(
You vote with your wallet: you could buy a faster $0.9k card from AMD, the 7900 XT 20 GB. You also get 4 GB more VRAM.
 
Joined
Dec 6, 2018
Messages
342 (0.16/day)
Location
Hungary
Processor i5-9600K
Motherboard ASUS Prime Z390-A
Cooling Cooler Master Hyper 212 Black Edition PWM
Memory G.Skill DDR4 RipjawsV 3200MHz 16GB kit
Video Card(s) Asus RTX2060 ROG STRIX GAMING
Display(s) Samsung Odyssey G7 27"
Case Cooler Master MasterCase H500
Power Supply SUPER FLOWER Leadex Gold 650W
Mouse BenQ Zowie FK1+-B
Keyboard Cherry KC 1000
Software Win 10
Did you know your 2060 6GB is a direct match for the 1080 8GB in core performance? That right there is the writing on the wall: GPUs moved to a tighter fit the moment RT got introduced. And the 2060 was effectively one of the best offers in the (early) Turing stack as well. That was the first time perf/dollar came to a near-complete standstill over the two years between generations. Meanwhile, it couldn't, can't, and will never do any kind of meaningful RT. Although, fair's fair, the x60 was never endowed with enough VRAM to last longer than 3-4 years. But that reduction right there is a painful one. The same thing happened with the 3080 10GB: trimmed down below par. The 1080 I have now is getting long in the tooth too, but I'm still pushing 3440x1440 on it, see games exceed 6GB VRAM, and they run well, especially with FSR. Small difference: the latter card is now reaching the age of 6-6.5 years.

TL;DR: your expectations weren't wrong, but you bought the wrong card to meet them. When you're looking to last 4+ years with a GPU, you want VRAM headroom and enough bandwidth, the exact things Nvidia has been cutting down since Turing.

That's, in a nutshell, what has been happening across the Nvidia stack at large. Today we see a 4090 that already struggles on release in two notable games: Cyberpunk at full tilt, and Portal, the latter being a horribly simplified fully path-traced application; after all, its geometric and texture simplicity offers major optimization opportunities. We're like those donkeys chasing the carrot. We'll never eat it, but we'll sniff it from time to time, if only we keep running after the latest and greatest. I've never been a donkey like that; it just doesn't feel right to me. I feel like I'm being taken for a ride.

Innovation, progress? I guess so, but all I see is pretty limited progress for three generations' worth of innovation... at an extreme performance cost. Even with DLSS3 on, the FPS gets a factor of 4-5 worse in Portal. Without DLSS3, it's a factor of 20 worse or even more. Is it a better game for it? I'm really not seeing it, tbh...

It's worth questioning this push.
No idea what's going on with the 4090; my only comment would be that Cyberpunk is a POS programmed by idiots, so maybe it's not only the card's fault. And hey, remember the October driver update that handed out double-digit performance boosts for all DX12 cards? Maybe they need to work some more on the 4090 drivers; just guessing here. I never cared much about the top end. But I agree, RT needs work, and I personally hate DLSS and FSR; they both look awful.
 
Joined
Feb 20, 2019
Messages
8,277 (3.94/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Wow, that sucks!
I'm glad I bought a 3070 now....
 
Joined
Feb 20, 2019
Messages
8,277 (3.94/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Rtx 4070 is a scam.
Don't we need to see performance/$ to make that call?
I'm not saying you're wrong, and if I were a betting man, I'd bet you'll be right - but it's way too early to know at the moment.
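The perf/$ math itself is trivial once reviews land. With made-up numbers, purely to show the shape of the comparison:

```python
def perf_per_dollar(avg_fps: float, price_usd: float) -> float:
    """Frames per second per dollar spent; higher is better value."""
    return avg_fps / price_usd

# Hypothetical review numbers (not real benchmarks), to illustrate how a
# faster card can still be the worse deal:
old_card = perf_per_dollar(60, 500)   # 0.12 fps/$
new_card = perf_per_dollar(90, 800)   # 0.1125 fps/$
print(new_card > old_card)  # the newer, faster card offers worse value here
```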
 
Joined
Sep 3, 2019
Messages
3,506 (1.84/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 220W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (375W current) PowerLimit, 1060mV, Adrenalin v24.10.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2161), upgraded from Win10 to Win11 on Jan 2024
Yeah... people should just stop judging products by the individual numbers they carry alone. That's a false mindset, IMHO, that (some) people have carried over the years.
Legit benchmarks are the only way to judge, and eventually the perf/$ (or perf/€) and/or the features every individual user is interested in.

Bus width is just a single number in an equation, as are speeds, core/stream-processor/MB counts, and all the other bits and pieces of modern PCs.
Wait until all of them are put to real use together, and then make up your minds.

It's only logical and rational.
 

bug

Joined
May 22, 2015
Messages
13,759 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Yeah... people should just stop judging products by the individual numbers they carry alone. That's a false mindset, IMHO, that (some) people have carried over the years.
Legit benchmarks are the only way to judge, and eventually the perf/$ (or perf/€) and/or the features every individual user is interested in.

Bus width is just a single number in an equation, as are speeds, core/stream-processor/MB counts, and all the other bits and pieces of modern PCs.
Wait until all of them are put to real use together, and then make up your minds.

It's only logical and rational.
Tbh, you can actually tell the performance of a card going by its technical specs. If you know the typical game and compute loads (which part acts as a bottleneck, which part sits underutilized), you can tell how increasing this or narrowing that will affect performance. Somehow, a surprising number of posters over here seem to be familiar with those details. I'm ashamed not to be part of that club.
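A toy version of that reasoning is a roofline-style bound: estimate a frame's cost in arithmetic and in memory traffic, and whichever of compute throughput or bandwidth is slower sets the ceiling. All the workload numbers below are invented for illustration:

```python
def fps_bound(tflops: float, bandwidth_gbs: float,
              flops_per_frame: float, bytes_per_frame: float) -> float:
    """Toy roofline-style FPS bound: a frame needs both arithmetic and
    memory traffic, so the slower of the two limits throughput.
    The per-frame costs are made-up illustrative values."""
    fps_compute = tflops * 1e12 / flops_per_frame   # compute-limited FPS
    fps_memory = bandwidth_gbs * 1e9 / bytes_per_frame  # bandwidth-limited FPS
    return min(fps_compute, fps_memory)

# Same hypothetical GPU core, two bus configurations: widening the bus
# helps only while the workload is memory-bound.
narrow = fps_bound(40, 500, 2e11, 6e9)  # bandwidth is the ceiling
wide = fps_bound(40, 700, 2e11, 6e9)    # still bandwidth-bound, but higher
print(round(narrow), round(wide))
```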
 