Thursday, May 19th 2022
NVIDIA GeForce GTX 1630 Launching May 31st with 512 CUDA Cores & 4 GB GDDR6
The NVIDIA GeForce GTX 1630 graphics card is set to launch on May 31st, according to a recent report from VideoCardz. The GTX 1630 is based on the same 12 nm Turing TU117 GPU as the GTX 1650, here in a cut-down TU117-150 configuration with 512 CUDA cores and 4 GB of GDDR6 memory on a 64-bit memory bus. This is a reduction from the 896 CUDA cores and 128-bit memory bus found in the GTX 1650; however, clock speeds are increased, with a boost clock of 1800 MHz at a TDP of 75 W. The memory configuration results in a maximum theoretical bandwidth of 96 GB/s, exactly half of what is available on the GDDR6 GTX 1650. The NVIDIA GeForce GTX 1630 may be announced during NVIDIA's Computex keynote next week.
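For reference, the quoted bandwidth figure follows directly from the bus width and the effective memory data rate. A minimal back-of-the-envelope sketch in Python, assuming the GTX 1630 uses the same 12 Gbps GDDR6 as the GDDR6 GTX 1650 (an assumption consistent with the 96 GB/s figure above):

```python
def gddr6_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical bandwidth in GB/s: (bus width in bits / 8 bits per byte) * effective data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# GTX 1630: 64-bit bus; 12 Gbps GDDR6 is an assumption, not a confirmed spec
print(gddr6_bandwidth_gb_s(64, 12.0))   # 96.0 GB/s
# GDDR6 GTX 1650: 128-bit bus at 12 Gbps
print(gddr6_bandwidth_gb_s(128, 12.0))  # 192.0 GB/s -- exactly double
```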
Source: VideoCardz
140 Comments on NVIDIA GeForce GTX 1630 Launching May 31st with 512 CUDA Cores & 4 GB GDDR6
Edit: Also, unlike AMD's 6400/6500, there won't be a stupid PCIe x4 limitation across the entire stack; rather, it's left as an option for AIB makers building SFF/HTPC cards.
What's next, a re-release of the 550 Ti? After all, it was super popular, why not? Apparently people buy anything.
At this point, apparently anything goes. I mean, what is this, honestly? Is this Nvidia releasing a single Intel Arc killer here? :D Spec-wise it seems about right...
The integrated graphics on the 9700K tend to choke when using HLSL CRT shaders, which can be too bandwidth-heavy when running at 4K native resolution, but one of these would be able to do the job quite nicely.
All depends on the price.
A budget card without RTX, so you can't play Quake II RTX on it. You could on the RX 6400, but that one has limited features in its own right.
Is it that hard to make a fully featured entry-level card?
EDIT: Just rechecked the RGHD video with it. Mea culpa, I was way in over my head lol; the RX 6400 had awful FPS on the RTX path.
The most prominent 'feature' there is that it shows your old card lacks the hardware to run something you probably didn't care about anyway, something that a proper rasterized lighting implementation has every potential to match in looks while running faster.
AMD might have a much stronger long-term strategy here by limiting their dedicated RT hardware, seeing as card TDPs are going through the roof at its competitor, which is adamant about continuing to push them, and with prolonged and increasing pressure on resources and production capacity... It's really going to be interesting how this develops. RT for only the happy few will die a certain death, that much is certain.
Let's be honest, most great games of the past will never see any work done on them; it's too time-consuming and it can go horribly wrong in the wrong hands. XIII, for example: it was best they just added RTX to it (not that it's the best game for RTX, but it serves as an example of a remake gone wrong).
Are we going to get a new chip based on Turing down the line (123-132 mm²?), or is this just a stopgap product with very minor stock and lifetime? Because I don't see Nvidia throwing away potential 1650 profits for an extended amount of time.
The performance will be much slower than the 6400; it will be slower than the 2016 $139 1050 Ti. The RX 560 4GB was $119 back in the day, but it will be slower than the 1630, so if the 1630 is based on TU117, Nvidia probably won't go below $119?
I was hoping for a new smaller chip eventually in order to hit the magic $99 price point.
You are making an apples-to-oranges comparison.
~650 Ti Boost. The minimum I would buy is a 4050.
Define "people" or the workloads; the 1630 should be fine for a workstation.
The original reference-clocked 1650 (1665 MHz boost) had a very high actual boost clock (1890 MHz), so I don't think that a ~3% increase in actual frequency (1950 MHz at most, if it's even 3% and not lower...) will do much to the TDP.
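To put a number on that, a quick sanity check of the percentage, using the observed 1890 MHz figure and the 1950 MHz ceiling assumed in the comment above:

```python
# Clocks in MHz from the comment above; 1950 is an assumed maximum, not a confirmed spec
gtx1650_actual_boost = 1890
gtx1630_assumed_max = 1950

uplift = (gtx1630_assumed_max - gtx1650_actual_boost) / gtx1650_actual_boost
print(f"{uplift:.1%}")  # 3.2%
```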
Shouldn't the design be at most 60 W (or less) in order to offer single-slot, low-profile solutions like the 1050 Ti or 6400?
www.techpowerup.com/review/evga-gtx-1650-sc-ultra-black/33.html