| Component | Details |
|---|---|
| Processor | Various Intel and AMD CPUs |
| Motherboard | Micro-ATX and mini-ITX |
| Cooling | Yes |
| Memory | Overclocking is overrated |
| Video Card(s) | Various Nvidia and AMD GPUs |
| Storage | A lot |
| Display(s) | Monitors and TVs |
| Case | The smaller the better |
| Audio Device(s) | Speakers and headphones |
| Power Supply | 300 to 750 W, bronze to gold |
| Mouse | Wireless |
| Keyboard | Mechanical |
| VR HMD | Not yet |
| Software | Linux gaming master race |
> Features and VRAM have got little to do with the whole card's power consumption, as long as you have the same number of RAM chips of the same kind. Wait, the 1060 actually has more than the 960.

Transistor count has steadily increased, features have increased, and VRAM has increased. It's not really a fair comparison, considering that there's no real power constraint nor demand for one on desktop systems. The desktop RTX 3060 is still about 20% faster than the GTX 1080, as already mentioned above.
> Then there should be no reason for nvidia and AMD to shoot their desktop cards' TDPs through the roof. Sure, the extra 5% will convince…

If, on the other hand, you check out the mobile versions (where clocks are lower, allowing more efficient operation because there's a practical need for lower power), it becomes clearer that smaller nodes do in principle lead to better efficiency, which should be an obvious statement anyway:
GTX 1080 Mobile: 150W
RTX 2070 Mobile: 115W
RTX 3060 Mobile: 80W
Performance-wise, these three should all be within a few percent of each other.
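To put rough numbers on it, here's a quick back-of-the-envelope calculation in Python. The equal-performance assumption is just the estimate above (all three normalized to 1.0), so the perf/W ratios simply mirror the TDPs:

```python
# Back-of-the-envelope perf-per-watt comparison for the mobile parts above.
# rel_perf treats all three cards as ~equal (the rough estimate above),
# so this is a sketch, not measured data; TDPs are the figures quoted.
mobile_gpus = {
    "GTX 1080 Mobile": {"tdp_w": 150, "rel_perf": 1.0},
    "RTX 2070 Mobile": {"tdp_w": 115, "rel_perf": 1.0},
    "RTX 3060 Mobile": {"tdp_w": 80,  "rel_perf": 1.0},
}

baseline = mobile_gpus["GTX 1080 Mobile"]
for name, gpu in mobile_gpus.items():
    perf_per_watt = gpu["rel_perf"] / gpu["tdp_w"]
    gain = perf_per_watt / (baseline["rel_perf"] / baseline["tdp_w"])
    print(f"{name}: {gain:.2f}x the perf/W of the GTX 1080 Mobile")
# -> roughly 1.0x, 1.3x and 1.9x: each node shrink buys efficiency
#    when clocks aren't pushed up the voltage/frequency curve.
```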
> I guess I'm in the minority with my love for small form factor / passively cooled hardware. I'm happier to see a modern game run on an iGPU or an old / low-profile PC than to see a hundred-core CPU with a 3090 in action.

Because most desktop gamers don't care as long as power stays within reasonable levels, and manufacturers have realized this. There's no need to artificially gimp performance when end-users can do that themselves if they want or need to.
> I hope you're right.

Once current midrange GPUs come back down to the inflation-adjusted price midrange GPUs had when your 1050 Ti was released, low-power, passive GPUs might start appearing as well.
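To make "inflation-adjusted" concrete, a toy Python sketch. The $139 launch MSRP of the 1050 Ti is the one real number here; the 25% cumulative-inflation factor is purely a placeholder, so plug in actual CPI data for your region and year:

```python
# Hypothetical sketch: expressing a past launch MSRP in today's money.
# $139 was the GTX 1050 Ti's 2016 launch price; the inflation factor
# below is a placeholder assumption, not real CPI data.
def inflation_adjusted(msrp: float, cumulative_inflation: float) -> float:
    """Return a past MSRP scaled by cumulative inflation since launch."""
    return msrp * (1 + cumulative_inflation)

print(f"${inflation_adjusted(139, 0.25):.0f}")  # ~$174 with a 25% assumption
```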
> Actually, I think we live in a time when it makes perfect sense. Low-power cards need less electricity, which isn't only good for the green movements but also counteracts rising energy prices. They also need smaller heatsinks that are cheaper to manufacture. Nvidia / AMD shouldn't blame the high price of their products on resource costs when they themselves design them to need bigger coolers and beefier VRMs than they practically should. I mean, sure, copper and aluminium are expensive, but who said you must have 5 kg of them on your graphics card when, with a little (factory) tweaking, it would work just fine with a lot less? Also, having fewer fans (or no fans) on your graphics card significantly decreases the chance of failure, which also decreases the amount of e-waste on the planet.

Until then, it won't make economic sense for either manufacturers or end-users, especially since the latter can adjust power themselves and probably already run their cards passively or semi-passively, given the massive coolers they generally come with nowadays.
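For anyone who wants to try adjusting power themselves on Linux, here's roughly what it looks like, shelling out to nvidia-smi from Python. A sketch, not a turnkey script: the 120 W target is just an example, and the valid range depends on the card:

```python
# Sketch: capping an Nvidia card's board power limit on Linux via nvidia-smi.
# Needs root; -pl is nvidia-smi's --power-limit flag. The driver then
# downclocks as needed to stay under the cap.
import subprocess

def show_power_info() -> None:
    """Print the current, default, and min/max enforceable power limits."""
    subprocess.run(["nvidia-smi", "-q", "-d", "POWER"], check=True)

def set_gpu_power_limit(watts: int) -> None:
    """Clamp the board power limit to the given wattage."""
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

if __name__ == "__main__":
    show_power_info()
    set_gpu_power_limit(120)  # e.g. run a 170 W card at 120 W, example value
```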
> Then maybe generations shouldn't come so soon after each other, either. At least to me, it doesn't make any sense to release a product that's barely better than the last one.

A 100% improvement after one generation is never going to happen (or: never again, if you were referring to some cases from the late 1990s / early 2000s).
> True. Though x80 Ti cards have always been halo products. It's just that nvidia decided to change that to x90 with Ampere - or decided to release several halo products within the same generation (whichever way one looks at it). Also, you're comparing across two generational gaps again. If I saw the same improvement coming from Turing, I'd say it's great.

It's a bit of an unfair comparison, if only because the 3060 is probably the worst Nvidia offering. It's just a bad card. If you compare the 3060 Ti to a 1080 Ti, the difference is massive: the 1080 Ti consumes 25% more while being 15% slower. That's without even including all the goodies of the 3060 Ti (DLSS / RT etc.). So yeah...
Even a 3070 is around 35-40% faster at a lower TDP.
Not that I'm complaining, though. Like I said, with baby steps like these, I can probably keep my 2070 for a long time.
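As a footnote, a quick sanity check on the 3060 Ti vs 1080 Ti figures quoted above. Both percentages are the rough numbers from the post, so treat the result as approximate:

```python
# What "25% more power while 15% slower" implies for efficiency.
# Both percentages are the rough figures quoted above, not measurements.
power_ratio = 1.25      # 1080 Ti draws ~25% more power than a 3060 Ti
perf_ratio = 1 / 0.85   # 15% slower means the 3060 Ti is ~1.18x as fast

perf_per_watt_gain = perf_ratio * power_ratio
print(f"3060 Ti perf/W advantage: ~{perf_per_watt_gain:.2f}x")  # ~1.47x
```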