CPUs, mobos, RAM and PSUs should lower their prices too. Even boards with the weakest chipsets are going above €100 now. What a time to be alive... then again, a Z68 board used to be €100... sometimes it would be great to go back in time.
Pretty sure most people understand the Nvidia issue. Memory capacity >< Bus size >< Greed.
Or do you want to tell me that Nvidia, the absolute market leader in GPUs, is incapable of developing GPUs with adequate bus widths for adequate memory sizes? Come on.
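For what it's worth, the bus/capacity link isn't mysterious: each GDDR6 channel is 32 bits wide and carries one chip (two in clamshell mode), and GDDR6 chips come in 1 GB and 2 GB densities, so the bus width dictates which capacities are even possible. A minimal sketch of that arithmetic in Python (the function name and the density list are just mine for illustration):

```python
# Which VRAM capacities a given GDDR6 bus width allows,
# assuming 32-bit channels and 1 GB / 2 GB chip densities.
def gddr6_capacity_options(bus_width_bits: int) -> list[int]:
    channels = bus_width_bits // 32              # one 32-bit chip per channel
    options = set()
    for density_gb in (1, 2):                    # common GDDR6 die densities
        options.add(channels * density_gb)       # normal mode
        options.add(channels * density_gb * 2)   # clamshell: two chips per channel
    return sorted(options)

for bus in (128, 192, 256):
    print(f"{bus}-bit -> {gddr6_capacity_options(bus)} GB")
# 128-bit -> [4, 8, 16] GB   (why a 128-bit 4060 Ti ends up at 8 or 16 GB)
# 192-bit -> [6, 12, 24] GB
# 256-bit -> [8, 16, 32] GB
```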
Good choice if the 4060 Ti 8GB is $399 and the 16GB version is $449; otherwise, uninteresting.
At this performance class, 16GB makes exactly as much sense as 8GB did on the RX 580 six years ago, which means a lot of sense... (The only problem is that the RX 580 had a $239 SRP and the 4060 Ti will be $449 at best.)
If the 4060 Ti is only ~10% slower in QHD raster than the RX 6800 (starting at $479 atm), I will certainly prefer it at $449 over the RX 6800, given all the other advantages it has.
I won't believe it until the product is out on the market. We were led to believe that the RTX 3070 Ti might sport a 16GB VRAM config, but instead we got a meaningless GDDR6X upgrade while still being stuck at 8GB. If 16GB is true, it is great, but it will depend on the cost. It also doesn't change the fact that the GPU specs are very underwhelming for an xx60 Ti class card that will likely struggle with new titles at 1440p (forget about RT).
16GB on a 4060 Ti is just wasted potential. You pay extra for something that benefits you in maybe 0.01% of games. Nvidia's lineup doesn't make any sense.
If they copied AMD (who has been more generous for years, without senseless overstacking), their lineup would look like this:
The M1 Ultra has two M1 Max dies. Each M1 Max has 400 GB/s. With LPDDR5-6400, that needs a 512-bit bus per M1 Max.
Apple attaches the LPDDR5-6400 chips to the chip package, like HBM implementations do.
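Quick sanity check on those numbers with the plain peak-bandwidth formula (nothing Apple-specific assumed, just bytes per transfer times transfer rate; the helper name is my own):

```python
# Peak bandwidth (GB/s) = bus width (bits) / 8 * transfer rate (MT/s) / 1000
def peak_bandwidth_gbps(bus_width_bits: int, transfer_rate_mtps: int) -> float:
    return bus_width_bits / 8 * transfer_rate_mtps / 1000

print(peak_bandwidth_gbps(512, 6400))    # 409.6 -> ~400 GB/s per M1 Max
print(peak_bandwidth_gbps(1024, 6400))   # 819.2 -> ~800 GB/s for the dual-die M1 Ultra
```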
-----------
The recent GDDR6W chips have 64 I/O pins instead of GDDR6 / GDDR6X's 32 I/O pins.
12 GDDR6W chips can offer 768-bit and 48 GB.
8 GDDR6W chips can offer 512-bit and 32 GB.
4 GDDR6W chips can offer 256-bit and 16 GB.
Not factoring in clamshell configurations.
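Those three configurations are just the per-package numbers multiplied out (64 I/O pins and 4 GB per GDDR6W package, per the figures above, no clamshell). A small sketch, with the helper name being my own:

```python
# Bus width and capacity for n GDDR6W packages
# (64 I/O pins and 4 GB per package, as listed above; no clamshell).
def gddr6w_config(chips: int) -> tuple[int, int]:
    return chips * 64, chips * 4   # (bus width in bits, capacity in GB)

for n in (12, 8, 4):
    bus, cap = gddr6w_config(n)
    print(f"{n} chips -> {bus}-bit, {cap} GB")
# 12 chips -> 768-bit, 48 GB
#  8 chips -> 512-bit, 32 GB
#  4 chips -> 256-bit, 16 GB
```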
GDDR6W didn't make it in time for Ada's initial release window.
Intel, AMD, NVIDIA, Hynix, Micron, and Samsung need to gang up and figure out a way to significantly advance GDDRx standards in a timely manner. We need a VESA-type group for PC memory standards instead of the slower JEDEC. The stakeholders in the PC clone industry cooperated when they crushed IBM's PS/2.
What about the 4070 and 4070 Ti? It looks to me like you'll have to choose between VRAM and performance once again, just like with the 3060 and 3070 (Ti).
16GB at the lower end of the spectrum, alongside the generous memory budgets of current consoles, sounds about right.
It doesn't matter how effective these SKUs will be by current norms; more VRAM standardised across the board is no longer preventable, it's inevitable. The earlier we adopt it, the greater the possibilities going forward. While some of us are more than content with the compromised graphics made available today, others simply fancy tapping into something north of "made available": the enhanced visual eye candy that never lands on our screens. The possibilities are endless; all that's needed is a wider paint palette, alongside compute progression (which is already abundant) and a boatload more time (years) for things to develop into a more immersive, stunning picture. Oh, and the big elephant in the room, price... nothing's perfect!
Let's just hope Leaky Leaks is taking a leak with precision (no wet trousers, pls).
Yes, I know... He asked whether "Nvidia is incapable of developing GPUs with adequate bus widths for adequate memory sizes?"
and I answered that it's possible, and with cheap RAM, but Nvidia doesn't want to. Nothing else.
I'm not saying that Nvidia has to redesign AD106 or AD107 now; I know that's not possible.
And yes, I know they would need wider buses and more memory controllers in the GPU = a bigger, more expensive chip.
But what is better and cheaper: big silicon + lots of cheap RAM, or small silicon + a little expensive RAM?
Now why, pray tell, did Nvidia not do that with the 3070 Ti? It only has an 8 gig version, while it seems all the other midrange cards have two flavours of VRAM. Nvidia cheaped out because they knew the 40 series was coming and just couldn't be bothered. Shame on you, Nvidia, lol. I did pay a lot for that GPU, so I kind of have a right to be a little pissed about it. That being said, the 4060 Ti 16 gig version would be OK for higher resolutions. But I still think the bus widths for the 40 series on everything below the 4090 should match the specs of the previous generation. Maybe they did it this way so the 50 series will get more "wows" when it goes back to matching the 30 series' bandwidth at each tier, with more RT cores and DLSS 3 revamped with more AI involvement. I think the 50 series will be a smashing success. But if it beats the 40 series by as much as the 40 series beat the 30 series, we could have a problem, as the 8 gig GPUs will be useless in a couple more years. My next upgrade will be a GPU with at least 16 gigs of VRAM.
I know, but show me a 3070 Ti with 16 gigs of VRAM. I was unable to locate one, and I was told NVIDIA hadn't made one and probably wouldn't either. Guess what, only the 8 gig version was ever produced. Not long after I purchased my 3070 Ti, Nvidia announced the 40 series. And if games are made to need more than 8 gigs, then GPU manufacturers should be producing video cards with enough VRAM, and what do you know, now they are. At least AMD and Intel figured it out and cared enough to put the necessary amount of VRAM on their cards, whereas Nvidia figured they'd wait for a few thousand more people to spend exorbitant amounts of money to play the games. Look where it got them today. Now they listen to us gamers, sort of.
It's just damage control for Nvidia and business as usual. They will be back to their usual tricks.
A while back, Nvidia was caught shipping performance-lowering drivers to push users to upgrade.
Oh wow, I forgot about that. Not like they got in trouble for it. I am seriously leaning towards AMD for my next upgrade. At least they try to be honest with us. Maybe even Intel, if they can manage to get some more cards out with competitive performance against NVIDIA.
That's just the normal Nvidia product cycle since the 20-series:
1. Release nerfed product as the new high-end for ridiculous prices. Make sure everybody knows it's the best of the best.
2. Release slightly less nerfed refresh for ludicrous prices. Tell everybody that it's even better than the last one.
3. Generation change: rinse and repeat.
The 16 GB 3070 Ti would have broken this cycle by not being nerfed. That cannot happen these days.
Indeed. Don't get me wrong, I like NVIDIA. Over the years they have produced some amazing technology and applied it to their GPUs, and voilà, you have a blazing-fast video card that rocks. But AMD has been able to produce some great cards and technology themselves, and that has allowed them to stay competitive with Nvidia. Something we seem to forget is that AMD is fighting two different wars on two fronts. On one side, they are competing with Intel in the CPU market and doing a bang-up job of it. Yay AMD, go get them. But on the other side they are also competing with NVIDIA in the graphics market, and not doing a bad job of that either. Admittedly, they could do better if they could put more people-hours into R&D for the GPU division. I believe that, given enough time, they will persevere and give NVIDIA a run for its money in GPU land.
For my use case (older MMOs at 3840x2160; no RT, no DLSS, but 4K with all options maxed) I plan to get the cheapest 16GB card of this gen with decent horsepower that I can. My 2070 8GB bogs down pretty hard regularly, although my ancient CPU is probably part of the problem (I'm planning to build a whole new rig). This might do the trick, depending on price/perf vs the higher tiers or the upcoming offerings from team red. C'mon AMD, step up already!
Literally nothing I play supports either RT or DLSS, nor is it likely to be added. And for anything other than games (office, browsing, etc), even an igp is good enough, so the card is really just for my MMOs. And I doubt I'm the only one.
After the precedent they set with the 3060 12 GB and 2060 12 GB editions having more VRAM than cards higher up the stack, nothing surprises me anymore.
A 4060 Ti/4060 with 16 GB of VRAM could unironically be the most future-proof budget card in the Ada lineup, compared to the 4070/4070 Ti with just 12 GB, once games require 16 GB of VRAM for maximum texture quality.
Which will be the reality in a few years time.
What happened with the 3070 will happen with the 4070 again down the road.
Lovelace has the same buses as previous generations, just new names that don't correspond to each die's percentage of CUDA cores (TMUs, RT cores, Tensor cores, etc.). If you look at the real GPU instead of the inflated new name, then you'll understand it. For example:
All I can tell you is that Nvidia works in mysterious ways. They do oddball things from time to time and their logic is unfathomable when they do it.
Edit: I could reference all the way back to the Kepler series. The GTX 780 came in a 6 GB VRAM version, while the GTX 780 Ti, despite being considerably faster, only came with 3 GB of VRAM.
My theory on that is they wanted to sell more 780s for SLI. Giving 6GB to the Titans was also a selling point. The $1000 Titan Black was basically a 780 Ti with 6GB.