Somebody here even calculated that the difference in resources between the 4090 and the 4080 12GB is like the difference between a 3090 and a 3060 Ti. Imagine that. I need to check this, and obviously check the performance of the new stuff, but I'm shocked by the arrogance of the company.
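For what it's worth, here's a quick back-of-the-envelope check of that claim, using the CUDA core counts TechPowerUp lists for these cards and treating core count as a rough stand-in for "resources" (this ignores clocks, cache and bandwidth):

```python
# Compare the shader-count gap within each lineup.
# Core counts as published; "resources" here means CUDA cores only.
cards = {
    "RTX 4090":      16384,
    "RTX 4080 12GB":  7680,
    "RTX 3090":      10496,
    "RTX 3060 Ti":    4864,
}

ada_ratio    = cards["RTX 4080 12GB"] / cards["RTX 4090"]
ampere_ratio = cards["RTX 3060 Ti"]  / cards["RTX 3090"]

print(f"4080 12GB vs 4090: {ada_ratio:.0%} of the cores")    # ~47%
print(f"3060 Ti   vs 3090: {ampere_ratio:.0%} of the cores") # ~46%
```

By that crude measure the two gaps really do line up almost exactly.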
I hope AMD will do better, and their pricing will probably be OK-ish, but not good. Prices have been going up since Turing, and the increases are accelerating.
Honestly, I kind of feel like Nvidia went too far with the 4090, and this is part of why they're making themselves look so damn terrible right now. Sure, it's not an absurdly large die in and of itself - it's even smaller than its predecessor. But the density jump from Samsung 8 to TSMC N4 is so large that in filling that space they just made it too big, making everything else look ridiculously bad in comparison.
I can see two reasons for choosing this path: competitive pressure from AMD, and a desire to sell hyper-expensive ultra-flagships. Without either of those, they could instead have scaled AD102 down to a die size like the GP102's, making a much smaller, more reasonable die that would still have delivered tons of performance. But no - they chose to go balls-to-the-wall on a super expensive node.
I mean, what if they had gone a bit more moderate with this instead - or at least hadn't tried to make a smooth pricing gradient from a 380 mm² die to a 600 mm² one? If TPU's die size for the AD103 is correct, charging $1200 for that and $1600 for the 4090 - at 60% more die area and 50% more VRAM - is downright absurd. But Nvidia has clearly chosen the path of "we'll sell on mindshare and flagship cred alone, screw any idea of value".
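To put rough numbers on that (assuming TPU's die-size figures of about 379 mm² for AD103 and 608 mm² for AD102, and ignoring VRAM, board and cooler costs):

```python
# Crude price-per-die-area comparison; die sizes are TPU's figures.
ad103_price, ad103_area = 1200, 379   # AD103 card, mm²
ad102_price, ad102_area = 1600, 608   # 4090 (AD102), mm²

print(f"AD103 card: ${ad103_price / ad103_area:.2f}/mm²")  # ~$3.17
print(f"4090:       ${ad102_price / ad102_area:.2f}/mm²")  # ~$2.63
```

By that measure the flagship is actually the better value per unit of silicon, which is exactly the inversion that makes the rest of the stack look absurd.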
Nonsense, really? I have friends in Germany and Poland, and they pay peanuts for energy - practically nothing per month. To give you an example:
In Poland, my friend pays 320 NOK for TWO MONTHS of electricity, using around 250-300 kWh per month (you get it? Two fucking months). You know how much I paid for 300 kWh last month? 2200 NOK.
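Taking those (anecdotal) figures at face value, the per-kWh math works out roughly like this:

```python
# Rough per-kWh comparison from the numbers quoted above (all in NOK).
poland_bill, poland_months = 320, 2
poland_kwh_per_month = 275            # midpoint of the 250-300 kWh estimate
norway_bill, norway_kwh = 2200, 300

poland_rate = poland_bill / (poland_months * poland_kwh_per_month)
norway_rate = norway_bill / norway_kwh

print(f"Poland: ~{poland_rate:.2f} NOK/kWh")        # ~0.58
print(f"Norway: ~{norway_rate:.2f} NOK/kWh")        # ~7.33
print(f"Ratio:  ~{norway_rate / poland_rate:.0f}x") # ~13x
```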
In Germany, the last time I checked, they were only concerned about the gas price. When I mentioned electricity, they asked me: "What electricity prices?"
Energy meaning GAS PRICE, not electricity per se. How do I know? My wife lived there for 7 months before moving to Norway. Of course they have had a price hike, but not ten times like in Norway. No way you're going to tell me that the price in Germany or Poland, for instance (also France and Spain), went up ten times over the last 2 years.
So no, it is not nonsense to me; the real nonsense is the price of electricity here in Norway, when you clearly produce electricity from renewable sources, which is supposed to be free.
Sorry, but where are you getting the idea that energy from renewable resources is supposed to be free? Do you imagine that building, running and maintaining a power plant doesn't have a cost? They're cheap in the long run, but not free.
Also, did you read anything at all of the post I responded to? The countries you mention do not use electricity for heating; they use gas. Gas is their major domestic energy expenditure, and gas is traded in the same system as electricity in the EU. There are of course exceptions - there is a move towards less reliance on natural gas for heating, but pricing hasn't followed yet, meaning anyone there using electricity for heating is in a pretty good position.
As for gas prices: more than 5x higher according to this source (and historically, EU energy prices have been significantly higher than Norwegian prices, which makes any increase here look bigger).
Here are reports of recent energy pricing protests in Germany; energy prices are a big part of the ongoing and hotly protested cost-of-living crisis across the continent. It's not getting much exclusive attention because it's tied into rising prices for food and other necessities as well, but energy prices are a central part of this crisis.
I mean, you're explicitly contradicting yourself here: on the one hand you say "they pay peanuts for energy", and on the other you say "they pay for gas, not electricity" - which, well, if you read my previous post, maybe you'd understand why this is effectively the same thing. Gas is the core domestic source of energy there, is traded within the same system, and is used for the same things. Norway uses mainly electricity, and that's where prices are high; continental Europe uses mostly gas, and that's where prices are high. Trying to separate the two is actively misleading and downright misrepresents reality - the current situation in Norway is in no way unique.
Nord Pool, the exchange for energy trading, is a private institution, not an EU exchange - it's 66% owned by ENX (Euronext) and 34% by TSO Holding.
At least compared to Sweden and Denmark, Norway has a much higher percentage of customers on spot-price contracts instead of fixed long-term prices. I'm assuming it's similar in the places mentioned. I've seen people from the UK nervous about what their new fixed-price plan will be when their current one expires.
Of course energy prices will increase when demand is close to or above supply.
Yes, it is a private exchange - just as most stock markets are private - but it's still subject to EU trade regulations, and the EU could quite easily mandate a change in its trading policies. Crucially, nobody here is arguing that it's unnatural for prices to rise; it's the magnitude of the rise, and how it's intrinsically tied into the mechanisms of the trading system rather than any even remotely sensible system of price setting, that makes this problematic. The solution is simple: price regulation targeted towards averaging out prices closer to the average cost of producing/sourcing the energy. That will of course eat into corporate profits, but, well, tough luck. Corporations do not have a right to exploit people in a crisis.
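For anyone wondering how cheap hydro ends up priced like expensive gas in the first place: European day-ahead markets use pay-as-clear (marginal) pricing, where the most expensive producer needed to cover demand sets the price that every accepted producer gets paid. A toy sketch, with entirely made-up numbers:

```python
# Toy pay-as-clear auction. Every accepted offer is paid the price of the
# most expensive offer needed to meet demand. All numbers are invented.
offers = [             # (source, capacity in MWh, offer in EUR/MWh)
    ("hydro", 50, 10),
    ("wind",  30, 15),
    ("gas",   40, 200),
]
demand = 100           # MWh needed this hour

supplied, clearing_price = 0, 0
for source, capacity, price in sorted(offers, key=lambda o: o[2]):
    if supplied >= demand:
        break
    supplied += capacity
    clearing_price = price   # last accepted (priciest) offer sets the price

print(f"Clearing price: {clearing_price} EUR/MWh")  # 200 - gas sets it
# The hydro plant that offered 10 EUR/MWh is also paid 200 EUR/MWh.
```

That mechanism is why electricity in hydro-rich Norway tracks continental gas prices once the markets are coupled.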
If that's the case, then I guess it'll be just another feature for the "300+ fps gaming ftw" mob. I'm fine with 30-45 fps 99% of the time, but if it comes with a 15 fps input response, then it'll be just as unplayable as 15 fps on screen.
This might well be true - but then, games that are playable at 30-45 fps are typically not all that reliant on smooth and rapid input, so the difference will also be less noticeable. But if the "source" framerate is indeed as low as 15 fps, this will most likely be unplayable, yes.
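As a rough illustration of why the source framerate matters so much for feel - this assumes a simple interpolation model where one extra source frame must be buffered before an in-between frame can be generated, which is an assumption, not a description of any specific vendor's pipeline:

```python
# Simplified frame-generation latency model. Assumes interpolation holds
# back one source frame, so input latency grows by one source frame time
# even though the displayed framerate doubles. Purely illustrative.
def frame_time_ms(fps: float) -> float:
    return 1000 / fps

for source_fps in (15, 30, 45):
    shown_fps = source_fps * 2               # interpolated output
    added_ms = frame_time_ms(source_fps)     # one buffered source frame
    print(f"{source_fps:>2} fps source -> {shown_fps:>2} fps shown, "
          f"~{added_ms:.0f} ms extra latency")
# e.g. 15 fps source -> 30 fps shown, ~67 ms extra latency
```

In other words, under this model a 30 fps image generated from a 15 fps source responds worse than native 15 fps, even though it looks smoother.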
Let's get to the point: no circus trick can beat good old native resolution gaming.
True to some extent, but native resolution gaming is also a rather silly brute-force solution as resolutions scale higher, simply because the perceptible increase in detail and sharpness is pretty much inversely proportional to the resolution increase at this point. 4K has 4x the pixels of 1080p, and is clearly sharper even at 27", but it's not night and day. 8K is 4x the pixels of 4K, and the increase in sharpness is essentially imperceptible unless you're sitting very close to a very large TV. And as new nodes and increased transistor density become more difficult, we need to abandon simplistic brute-force solutions for improved visual fidelity - they're getting too expensive. If moving up one step in resolution has a 4x compute cost but a, let's say, 50% increase in perceptible detail/sharpness, then that is terrible, and never worth it. Upscaling is really the only viable way forward - though precisely how said upscaling will work is another question entirely.
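The pixel math behind that argument, for reference:

```python
# Pixel counts per resolution step; each step is ~4x the pixels, and thus
# roughly 4x the shading work to a first approximation.
resolutions = {
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

prev = None
for name, (w, h) in resolutions.items():
    pixels = w * h
    note = f" ({pixels / prev:.0f}x previous)" if prev else ""
    print(f"{name}: {pixels:>10,} pixels{note}")
    prev = pixels
```

Each step quadruples the work while the perceptible gain shrinks - a textbook diminishing-returns curve.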
I think cables are a good analogy for this: the signal requirements for the excessive bandwidth of DP 2.0 and HDMI 2.1 aren't bringing with them thumb-thick cables, but rather a shift to active cabling instead of passive copper. This takes us from a simple, brute-force solution to a more complex one. Where the analogy falls apart is that active cabling is easily 10x the BOM cost of passive, while upscaling is about as close to a free performance upgrade as you'll find. But it's another example of needing to find more complex, smarter solutions to a problem as the older brute-force ones fail.