Thursday, May 19th 2022

NVIDIA GeForce GTX 1630 Launching May 31st with 512 CUDA Cores & 4 GB GDDR6

The NVIDIA GeForce GTX 1630 graphics card is set to launch on May 31st, according to a recent report from VideoCardz. The GTX 1630 is derived from the GTX 1650, featuring a 12 nm Turing TU117-150 GPU with 512 CUDA cores and 4 GB of GDDR6 memory on a 64-bit memory bus. This is a reduction from the 896 CUDA cores and 128-bit memory bus found in the GTX 1650; clock speeds rise, however, with a boost clock of 1800 MHz at a TDP of 75 W. The memory configuration results in a maximum theoretical bandwidth of 96 GB/s, exactly half of what is available on the GDDR6 GTX 1650. The NVIDIA GeForce GTX 1630 may be announced during NVIDIA's Computex keynote next week.
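That bandwidth figure follows directly from bus width and memory data rate. A minimal sketch of the arithmetic (the 12 Gbps GDDR6 data rate is inferred from the quoted 96 GB/s and matches the GDDR6 GTX 1650; it is not a confirmed spec):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    # Each pin transfers data_rate_gbps gigabits per second;
    # divide the bus width by 8 to convert bits to bytes.
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(64, 12.0))   # GTX 1630: 96.0 GB/s
print(peak_bandwidth_gbs(128, 12.0))  # GDDR6 GTX 1650: 192.0 GB/s
```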
Source: VideoCardz

140 Comments on NVIDIA GeForce GTX 1630 Launching May 31st with 512 CUDA Cores & 4 GB GDDR6

#1
Chaitanya
Shame about the prices, but hopefully those 730s and 1030s will be put to rest for good.
Edit: Also, unlike AMD's 6400/6500, there won't be a stupid PCIe 4x limitation across the entire stack; rather, it's an option left to AIB makers for SFF/HTPC cards.
Posted on Reply
#2
Vayra86
LOL.

What's next, a re-release of the 550ti? After all it was super popular, why not? Apparently people buy anything.
Posted on Reply
#3
ramjithunder24
Nvidia with its "rereleases"... IDK how to approach the GPU market at this point lmao
Posted on Reply
#4
Ruru
S.T.A.R.S.
Vayra86LOL.

What's next, a re-release of the 550ti? After all it was super popular, why not? Apparently people buy anything.
The fake card sellers have got that already. :laugh:
Posted on Reply
#5
Vayra86
LenneThe fake card sellers have got that already. :laugh:
Yeah I'd not even be surprised if Nvidia bought those, gave them a refurb treatment and sold them as new again.

At this point apparently, anything goes. I mean, what is this, honestly? Is this Nvidia releasing a single Intel Arc Killer here? :D Spec wise it seems about right...

Posted on Reply
#6
Ruru
S.T.A.R.S.
Vayra86Yeah I'd not even be surprised if Nvidia bought those, gave them a refurb treatment and sold them as new again.

At this point apparently, anything goes. I mean, what is this, honestly? Is this Nvidia releasing a single Intel Arc Killer here? :D Spec wise it seems about right...

They couldn't even release the GT 1010 even though it was announced :laugh:
Posted on Reply
#7
Bomby569
ramjithunder24Nvidia with its "rereleases"... IDK how to approach the GPU market at this point lmao
It's just leftover chips that can't handle more than that. Certainly better than not doing anything with them.
Posted on Reply
#8
Valantar
I guess it's good to see the 1030 and 730 and all that crap get replaced? Though I can't imagine this being a particularly attractive proposition, considering that an RX 6400 matches the 1650 and the 6500 XT beats it soundly. Even accounting for both of those losing ~10% performance on PCIe 3.0 this will be slower than the 6400. And at least here in Sweden, you can get an RX 6500 for less money than any 1650.
Posted on Reply
#9
The Quim Reaper
Might be interested in one of these for a SFF arcade emulator build, using MAME, with an old 9700K CPU I have lying around doing nothing.

The integrated graphics on the 9700K tend to choke when using the HLSL CRT shaders, which can be too bandwidth-heavy when running at 4K native resolution, but one of these would be able to do the job quite nicely.

All depends on the price.
Posted on Reply
#10
Ruru
S.T.A.R.S.
ValantarI guess it's good to see the 1030 and 730 and all that crap get replaced? Though I can't imagine this being a particularly attractive proposition, considering that an RX 6400 matches the 1650 and the 6500 XT beats it soundly. Even accounting for both of those losing ~10% performance on PCIe 3.0 this will be slower than the 6400. And at least here in Sweden, you can get an RX 6500 for less money than any 1650.
My guess is that they have leftover TU117s, so better to get rid of them.
Posted on Reply
#12
ExcuseMeWtf
Sigh.

A budget card without RTX, so you can't play Q2 RTX on it. You could on the RX 6400, but that one has limited features in its own right.
Is it that hard to make a fully featured entry-level card?
Posted on Reply
#13
Ruru
S.T.A.R.S.
ExcuseMeWtfSigh.

A budget card without RTX, so you can't play Q2 RTX on it. You could on the RX 6400, but that one has limited features in its own right.
Is it that hard to make a fully featured entry-level card?
Would you use RTX with a low-end card anyway..?
Posted on Reply
#14
ExcuseMeWtf
Yes, for the specific game I mentioned in that post.
EDIT: Just rechecked the RGHD video on it. Mea culpa, I was in way over my head lol; he had awful fps on the RX 6400 on the RTX path.
Posted on Reply
#15
Valantar
_A.T.Omix_Will this feature an NVENC chip?
NVENC is a feature of the GPU, not a discrete chip. And as this will in all likelihood (99.9%) use the TU117 chip, it will most likely get the same Volta-era NVENC support.
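If you want to check what NVENC encoders your own stack exposes, here's a minimal sketch, assuming an ffmpeg build compiled with NVIDIA support is on the PATH (the string filtering is just illustrative):

```python
import subprocess

# 'ffmpeg -encoders' lists every encoder the build was compiled with.
# NVENC-backed encoders show up with names like h264_nvenc / hevc_nvenc.
out = subprocess.run(
    ["ffmpeg", "-hide_banner", "-encoders"],
    capture_output=True, text=True, check=True,
).stdout

nvenc_lines = [line.strip() for line in out.splitlines() if "nvenc" in line]
print("\n".join(nvenc_lines) if nvenc_lines else "No NVENC encoders found")
```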
Posted on Reply
#16
Bomby569
ExcuseMeWtfSigh.

A budget card without RTX, so can't play Q2 RTX on that. Could on RX6400, but that one has limited features in its own right.
Is it that hard to make a completely featured entry level card?
Q2 RTX is not an easy game on any card. A card like this would not handle it.
Posted on Reply
#17
Vayra86
Q2 RTX is a fine display of everything that is wrong with real-time RT. The magic wears off fast and the processing power required is ridiculous for the end result you get. The cherry on top? Nine times out of ten the so-called accurate lighting of RT is plain wrong, out of place, and reveals itself as a calculation like any other, albeit a very expensive one.

The most prominent 'feature' of that is that it shows your old card lacks the hardware to run something you probably didn't care about, when a proper rasterized lighting implementation has every potential to look just as good and run faster.

AMD might have a much stronger long-term strategy here by limiting their dedicated RT hardware. Seeing as card TDPs are going through the roof at its competitor, which is adamant about pushing RT, and with prolonged and increasing pressure on resources and production capacity... it's really going to be interesting to see how this develops. RT only for the happy few will die a certain death, that much is certain.
Posted on Reply
#18
Bomby569
Vayra86Q2 RTX is a fine display of everything that is wrong with real-time RT. The magic wears off fast and the processing power required is ridiculous for the end result you get. The cherry on top? Nine times out of ten the so-called accurate lighting of RT is plain wrong, out of place, and reveals itself as a calculation like any other, albeit a very expensive one.
RTX on older games will never run like newer games unless they redesign the game. I still think it is awesome, and most people I know loved it too, and I would hope they use it on more games from the past. I happily live with the trade-offs.
Let's be honest, most great games of the past will never see any work done on them; it's too time-consuming, and it can go horribly wrong in the wrong hands. XIII, for example: it would have been better if they had just added RTX to it (not that it's the best game for RTX, but it serves as an example of a remake gone wrong).
Posted on Reply
#19
ModEl4
This is a mainstream part, and at this price level the volume (demand) isn't small, but TU117 cut down in half seems a bit excessive as the main release.
Are we going to get a new Turing-based chip down the line (123-132 mm²?), or is this just a stopgap product with very limited stock and lifetime? Because I don't see Nvidia throwing away potential 1650 profits for an extended amount of time.
Performance will be much lower than the 6400's, and slower than the 2016 $139 1050 Ti. The RX 560 4GB was $119 back in the day but will be slower than the 1630, so if it is based on TU117, Nvidia probably won't go below $119?
I was hoping for a new, smaller chip eventually in order to hit the magic $99 price point.
Posted on Reply
#20
Valantar
Vayra86Q2 RTX is a fine display of everything that is wrong with real-time RT. The magic wears off fast and the processing power required is ridiculous for the end result you get. The cherry on top? Nine times out of ten the so-called accurate lighting of RT is plain wrong, out of place, and reveals itself as a calculation like any other, albeit a very expensive one.

The most prominent 'feature' of that is that it shows your old card lacks the hardware to run something you probably didn't care about, when a proper rasterized lighting implementation has every potential to look just as good and run faster.

AMD might have a much stronger long-term strategy here by limiting their dedicated RT hardware. Seeing as card TDPs are going through the roof at its competitor, which is adamant about pushing RT, and with prolonged and increasing pressure on resources and production capacity... it's really going to be interesting to see how this develops. RT only for the happy few will die a certain death, that much is certain.
As with all tools, it takes training, practice, and familiarity to use them correctly. It's hardly a wonder that the tool that's been used for decades is handled more competently than the one that's barely been accessible for a few years, and in a very limited state at that. We'll see how this plays out in the future, but I don't foresee any type of RT takeover any time soon. Selective, limited, but well planned and executed use where it has the greatest effect seems far more sensible and likely to me.
ModEl4This is a mainstream part, and at this price level the volume (demand) isn't small, but TU117 cut down in half seems a bit excessive as the main release.
Are we going to get a new Turing-based chip down the line (123-132 mm²?), or is this just a stopgap product with very limited stock and lifetime? Because I don't see Nvidia throwing away potential 1650 profits for an extended amount of time.
Performance will be much lower than the 6400's, and slower than the 2016 $139 1050 Ti. The RX 560 4GB was $119 back in the day but will be slower than the 1630, so if it is based on TU117, Nvidia probably won't go below $119?
I was hoping for a new, smaller chip eventually in order to hit the magic $99 price point.
I guess we could hope that it replaces the 1030 at its original $80 MSRP? That seems doubtful though.
Posted on Reply
#21
The Quim Reaper
Vayra86Q2 RTX is a fine display of everything that is wrong with real-time RT
Quake 2 RTX is fully path-traced RT... which is on a whole different level of GPU/CPU requirements. It's FAR more demanding to run than the likes of CP 2077, for example.

You are making an apples-to-oranges comparison.
Posted on Reply
#22
ModEl4
ValantarAs with all tools, it takes training, practice, and familiarity to use them correctly. It's hardly a wonder that the tool that's been used for decades is handled more competently than the one that's barely been accessible for a few years, and in a very limited state at that. We'll see how this plays out in the future, but I don't foresee any type of RT takeover any time soon. Selective, limited, but well planned and executed use where it has the greatest effect seems far more sensible and likely to me.


I guess we could hope that it replaces the 1030 at its original $80 MSRP? That seems doubtful though.
Doubtful indeed. GP108 was 74 mm² on a cheaper Samsung 14 nm process (it started at $70); I don't see an above-100 mm² TSMC 12 nm design matching it in price, let alone the 200 mm² TU117-based one.
Posted on Reply
#23
Arkz
Vayra86LOL.

What's next, a re-release of the 550ti? After all it was super popular, why not? Apparently people buy anything.
Pretty sure it's for HTPC users who would usually use a 1030 or 710.
ExcuseMeWtfSigh.

A budget card without RTX, so you can't play Q2 RTX on it. You could on the RX 6400, but that one has limited features in its own right.
Is it that hard to make a fully featured entry-level card?
Well, it's really a 16-series card. Hence GTX, not RTX. And it's obviously pretty low-end, aimed at HTPC use, running entirely off slot power. I wouldn't expect it to have RT features. And even if it did, it would run Q2 RTX like crap.
Posted on Reply
#24
ppn
Vayra86550ti? … people buy anything.
It's 50% of TU117, roughly a 650 Ti Boost. The minimum I would buy is a 4050.

Define "people", or the workloads; the 1630 should be fine for a workstation.
Posted on Reply
#25
ModEl4
Also, the 75 W TDP hasn't changed despite the cut to 512 CUDA cores?
The original reference-clocked 1650 (1665 MHz boost) had a very high actual boost clock (1890 MHz), so I don't think a ~3% increase in actual frequency (1950 MHz at most, if it's even 3% and not about the same...) will do much to the TDP.
Shouldn't the design be at most 60 W (or less) in order to offer single-slot, low-profile solutions like the 1050 Ti or 6400?


www.techpowerup.com/review/evga-gtx-1650-sc-ultra-black/33.html
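For reference, a quick sketch of the frequency arithmetic above (the 1950 MHz figure is the poster's speculation, not a confirmed spec):

```python
# Observed boost on reference-clocked GTX 1650 cards vs. a speculated GTX 1630 boost.
actual_1650_boost_mhz = 1890
speculated_1630_boost_mhz = 1950  # assumption, not a confirmed spec

increase = (speculated_1630_boost_mhz / actual_1650_boost_mhz - 1) * 100
print(f"~{increase:.1f}% higher")  # ~3.2%

# At fixed voltage, dynamic power scales roughly linearly with frequency,
# so a ~3% clock bump alone shouldn't move a 75 W TDP by much.
```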
Posted on Reply