Thursday, May 19th 2022

NVIDIA GeForce GTX 1630 Launching May 31st with 512 CUDA Cores & 4 GB GDDR6

The NVIDIA GeForce GTX 1630 graphics card is set to launch on May 31st, according to a recent report from VideoCardz. The GTX 1630 is based on the GTX 1650, featuring a 12 nm Turing TU117-150 GPU with 512 CUDA cores and 4 GB of GDDR6 memory on a 64-bit memory bus. This is a reduction from the 896 CUDA cores and 128-bit memory bus found in the GTX 1650; however, clock speeds increase, with a boost clock of 1800 MHz at a TDP of 75 W. This memory configuration results in a maximum theoretical bandwidth of 96 GB/s, exactly half of what is available on the GDDR6 GTX 1650. The NVIDIA GeForce GTX 1630 may be announced during NVIDIA's Computex keynote next week.
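The bandwidth figure follows directly from the bus width and the memory data rate; a quick sketch of the arithmetic (the 12 Gbps data rate is inferred from the reported 96 GB/s, not confirmed by NVIDIA):

```python
# Peak theoretical memory bandwidth = bus width (bits) / 8 * effective data rate (Gbps)
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

# GTX 1630 (reported): 96 GB/s on a 64-bit bus implies 12 Gbps GDDR6
print(bandwidth_gb_s(64, 12.0))   # 96.0
# GDDR6 GTX 1650: the same 12 Gbps on a 128-bit bus -> exactly double
print(bandwidth_gb_s(128, 12.0))  # 192.0
```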
Source: VideoCardz

140 Comments on NVIDIA GeForce GTX 1630 Launching May 31st with 512 CUDA Cores & 4 GB GDDR6

#26
Bomby569
ModEl4Also, the 75W TDP hasn't changed despite the cut to 512 CUDA cores?
The original reference-clocked 1650 (1665 MHz boost) had a very high actual boost clock (1890 MHz), so I don't think a 3% increase in actual frequency (1950 MHz at most, if it's even 3%...) will do much to TDP.
Shouldn't the design be at most 60W (or less) in order to offer 1-slot low-profile solutions like the 1050 Ti or 6400?


www.techpowerup.com/review/evga-gtx-1650-sc-ultra-black/33.html
they just downclock it; low-profile cards are usually less efficient at cooling anyway
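For context on why a small clock bump barely moves TDP: to a first order, dynamic power scales roughly with frequency times voltage squared. A minimal sketch with illustrative numbers (not measured values):

```python
# First-order dynamic-power model: P ~ f * V^2 (capacitance folded into a constant)
def relative_power(freq_scale: float, volt_scale: float) -> float:
    """Power relative to baseline, given frequency and voltage scaling factors."""
    return freq_scale * volt_scale ** 2

# ~3% higher boost clock at unchanged voltage -> only ~3% more dynamic power
print(relative_power(1.03, 1.00))  # 1.03
# A low-profile downclock: -10% frequency with -5% voltage -> ~19% less power
print(relative_power(0.90, 0.95))  # ~0.81
```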
Posted on Reply
#27
Baum
Would be nice as a low-profile card to replace the GT 1030 or 1050... :p
Posted on Reply
#28
ModEl4
Bomby569they just downclock it; low-profile cards are usually less efficient at cooling anyway
Sure, but will the downclock be enough (within spec) for 1-slot solutions if the TDP is 75W?
Also, it will be a little slower than the 60W 1050 Ti; will Nvidia accept lower performance/W vs the Pascal-based solution?
I'm just examining the possibilities; it may well be exactly like the report says.
Posted on Reply
#29
Ruru
S.T.A.R.S.
ExcuseMeWtfYes, for the specific game I mentioned in that post...
Even the fastest GTX card aka 1080 Ti can barely run it at 720p :(
Posted on Reply
#30
Bomby569
LenneEven the fastest GTX card aka 1080 Ti can barely run it at 720p :(
not sure, but aren't they worse at RTX than even a 3050? no dedicated RTX hardware, they can't do miracles
Posted on Reply
#31
catulitechup
At a quick glance this seems more of a rival for the Arc A310 than the RX 6400; maybe the Arc A3xx ends up closer, but then only price is left to decide it

:)
Posted on Reply
#32
Ruru
S.T.A.R.S.
Bomby569not sure, but aren't they worse at RTX than even a 3050? no dedicated RTX hardware, they can't do miracles
Yeah... though among the few games where RT on GTX works, in Shadow of the Tomb Raider it doesn't hurt THAT much. Though it's only the shadows that are ray-traced...
Posted on Reply
#33
Bomby569
LenneYeah... though among the few games where RT on GTX works, in Shadow of the Tomb Raider it doesn't hurt THAT much. Though it's only the shadows that are ray-traced...
SoTR is not a good RTX example/game; it's more like a gimmick in that game. Not to say it's perfect in most games, but in that one it's at its worst (the gimmick factor, I mean)
Posted on Reply
#34
Ruru
S.T.A.R.S.
Bomby569SoTR is not a good RTX example/game; it's more like a gimmick in that game. Not to say it's perfect in most games, but in that one it's at its worst (the gimmick factor, I mean)
Well, it doesn't ruin the performance with a 1080 Ti.
Posted on Reply
#35
ThrashZone
Hi,
Guessing Nvidia missed the TPU poll, @W1zzard.
"Would you buy a 4 GB GPU in 2022?" I believe the results were a resounding no :laugh:
Posted on Reply
#36
Ruru
S.T.A.R.S.
ThrashZoneHi,
Guessing Nvidia missed the TPU poll, @W1zzard.
"Would you buy a 4 GB GPU in 2022?" I believe the results were a resounding no :laugh:
I could grab a Fury in my 2nd rig tho :cool:
Posted on Reply
#37
Bomby569
ThrashZoneHi,
Guessing Nvidia missed the TPU poll, @W1zzard.
"Would you buy a 4 GB GPU in 2022?" I believe the results were a resounding no :laugh:
GDDR6 is expensive for a card that I think is meant to be on the cheap side; 8 GB on this card would make no financial sense, I think. I think that's why it's 4.

they'll release a later version with 8 GB of GDDR3, and another with less bandwidth, etc... you know Nvidia :D
Posted on Reply
#38
chrcoluk
I wonder if Nvidia will ever make a 30 W TDP, sub-£/$50 card again.

pats my GT 1030
Posted on Reply
#39
ThrashZone
LenneI could grab a Fury in my 2nd rig tho :cool:
Hi,
Yeah, I've seen furry animals and even dolls in rigs too :laugh:
Posted on Reply
#40
Durvelle27
ValantarI guess it's good to see the 1030 and 730 and all that crap get replaced? Though I can't imagine this being a particularly attractive proposition, considering that an RX 6400 matches the 1650 and the 6500 XT beats it soundly. Even accounting for both of those losing ~10% performance on PCIe 3.0 this will be slower than the 6400. And at least here in Sweden, you can get an RX 6500 for less money than any 1650.
10% is a mere indication. Most reviews put the performance loss closer to 20%, with some games seeing losses of 40%.
PCIe 3.0 is not the way to go with the 6400/6500.

So hopefully with this card they give it more PCIe lanes, at least x8.
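The x4 vs. x8 point comes down to raw link bandwidth. A quick sketch of the per-direction peak figures (accounting only for 128b/130b line encoding; other protocol overhead ignored):

```python
# Peak PCIe bandwidth per direction: transfer rate (GT/s) * 128/130 encoding / 8 bits, per lane
def pcie_gb_s(gt_per_s: float, lanes: int) -> float:
    """Approximate one-direction peak bandwidth in GB/s for PCIe 3.0 and newer."""
    return gt_per_s * (128 / 130) / 8 * lanes

print(pcie_gb_s(8, 4))   # PCIe 3.0 x4 -> ~3.9 GB/s (6400/6500 XT on an older board)
print(pcie_gb_s(16, 4))  # PCIe 4.0 x4 -> ~7.9 GB/s
print(pcie_gb_s(8, 8))   # PCIe 3.0 x8 -> ~7.9 GB/s (what an x8 link would restore)
```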
Posted on Reply
#41
medi01
ExcuseMeWtfRX6400, but that one has limited features in its own right
Such as?
Posted on Reply
#42
Logoffon
Surprised that they didn't use the configurations from the T400 (384 CUDA cores) or T600 (640 CUDA cores), and instead went with making another cut-down chip that sits between those...
Posted on Reply
#43
ExcuseMeWtf
medi01Such as?
Video encoding/decoding formats...
Posted on Reply
#44
Valantar
LenneI could grab a Fury in my 2nd rig tho :cool:
Want one? I've got a spare :p
Durvelle2710% is a mere indication. Most reviews put the performance loss closer to 20%, with some games seeing losses of 40%.
PCIe 3.0 is not the way to go with the 6400/6500.

So hopefully with this card they give it more PCIe connects atleast x8
... That's not how averages work. 10% was a bit generous, but TPU says 14%. That roughly indicates half of the tested games are at or worse than 14%, just like half are at or better than 14% - barring significant outliers. There are definitely outliers where these GPUs are absolutely trounced at 3.0, but those are an exception, not the rule. TPU's testing shows a lot of titles where the bandwidth difference is none or very minor (and some of the worse results, like AC:V and CP2077 aren't playable even on 4.0, raising the question of how lower quality settings would affect this difference). I still think AMD made a dumb mistake limiting these chips to x4, but they aren't quite as bad as some people make them out to be.
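To make the outlier point concrete, here is a minimal sketch with made-up per-game loss figures (illustrative, not TPU's data) showing how a couple of badly bottlenecked titles drag the average well above the typical game:

```python
import statistics

# Hypothetical FPS losses (%) going from PCIe 4.0 x4 to 3.0 x4 -- illustrative only
losses = [2, 3, 4, 5, 6, 8, 10, 12, 15, 20, 40, 70]

print(statistics.mean(losses))    # 16.25: two big outliers pull the average up
print(statistics.median(losses))  # 9.0: the typical game loses far less
```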
Posted on Reply
#45
Ruru
S.T.A.R.S.
ValantarWant one? I've got a spare :p


... That's not how averages work. 10% was a bit generous, but TPU says 14%. That obviously means half of the tested games are at or worse than 14%, just like half are at or better than 14%. There are definitely outliers where these GPUs are absolutely trounced at 3.0, but those are an exception, not the rule. TPU's testing shows a lot of titles where the bandwidth difference is none or very minor (and some of the worse results, like AC:V and CP2077 aren't playable even on 4.0, raising the question of how lower quality settings would affect this difference). I still think AMD made a dumb mistake limiting these chips to x4, but they aren't quite as bad as some people make them out to be.
You live next to me, so I may actually be interested, as there's no customs or other BS :D
Posted on Reply
#46
nienorgt
Why is Nvidia stuck on the GTX 1XXX series naming these last few years? Are we talking about the same Nvidia that rebranded old GPU dies as "newer series" for years in an attempt to woo customers?
Posted on Reply
#47
Durvelle27
ValantarWant one? I've got a spare :p


... That's not how averages work. 10% was a bit generous, but TPU says 14%. That roughly indicates half of the tested games are at or worse than 14%, just like half are at or better than 14% - barring significant outliers. There are definitely outliers where these GPUs are absolutely trounced at 3.0, but those are an exception, not the rule. TPU's testing shows a lot of titles where the bandwidth difference is none or very minor (and some of the worse results, like AC:V and CP2077 aren't playable even on 4.0, raising the question of how lower quality settings would affect this difference). I still think AMD made a dumb mistake limiting these chips to x4, but they aren't quite as bad as some people make them out to be.
You forgot Doom and F1 as well, which lose up to 70% performance at a mere 1080p. Rainbow Six Siege also lost over 20%, plus more. We can look at the averages, but these are current games players are actually playing that suck badly on PCIe 3.0.
Posted on Reply
#48
Vayra86
The Quim ReaperQuake 2 RTX is fully path-traced RT... which is on a whole different level of GPU/CPU requirements. It's FAR more demanding, for example, than the likes of CP 2077 to run.

You are making an Apples to Oranges comparison.
You say that, but it still doesn't show correct lighting everywhere.

Apples to oranges is exactly the point: nobody ever asked for oranges, but somehow that's all we must eat now in the green camp. You even get oranges if you never intend to eat them.
Posted on Reply
#49
Valantar
Durvelle27You forgot Doom and F1 as well, which lose up to 70% performance at a mere 1080p. Rainbow Six Siege also lost over 20%, plus more. We can look at the averages, but these are current games players are actually playing that suck badly on PCIe 3.0.
Yes, as I said, there are outliers, some of which are downright unplayable. Yet even when accounting for those, the average loss is 14%, which tells us that for the majority of games the loss is significantly less than 14%. As for whether these are games people are currently playing, that applies pretty broadly across the test suite, no? Or are you claiming that people play the more bottlenecked games more than the ones that perform okay?
nienorgtWhy is Nvidia stuck on the GTX 1XXX series naming these last few years? Are we talking about the same Nvidia that rebranded old GPU dies as "newer series" for years in an attempt to woo customers?
Naming is arbitrary after all, and Nvidia wants to differentiate these from their more prestigious RTX cards, making those look more attractive. Nothing more nefarious than that going on.
Posted on Reply
#50
Bomby569
nienorgtWhy is Nvidia stuck on the GTX 1XXX series naming these last few years? Are we talking about the same Nvidia that rebranded old GPU dies as "newer series" for years in an attempt to woo customers?
customers at these price points don't know/care about model numbers, and to an outsider/someone who doesn't follow closely, Nvidia's GTX branding is probably better recognized than RTX anyway
Posted on Reply