Monday, June 27th 2022

NVIDIA GeForce GTX 1630 Set To Launch Tomorrow

The NVIDIA GeForce GTX 1630 is set to be officially unveiled tomorrow as a successor to the GTX 1050 Ti, with Colorful already listing one such model on its website. The GTX 1630 will be an entry-level card featuring a TU117-150 GPU with 512 CUDA cores running at 1785 MHz, paired with 4 GB of GDDR6 memory on a 64-bit memory bus for a total bandwidth of 96 GB/s. The leaked Colorful GTX 1630 BattleAx features a dual-fan cooling solution, triple display connectors, and an additional 6-pin power input, essentially copying the company's GTX 1650 model. The NVIDIA GeForce GTX 1630 will be available from multiple board partners when it launches tomorrow and could reportedly retail for ~150 USD, according to some Chinese retailers.
Source: VideoCardz
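For reference, the quoted 96 GB/s works out from the 64-bit bus if the GDDR6 runs at a 12 Gbps effective data rate; the data rate itself is not stated in the listing, so treat that figure as an assumption. A minimal sanity check:

# Memory bandwidth sanity check for the specs quoted above.
# Assumption: 12 Gbps effective GDDR6 data rate per pin (not stated in the
# listing); it is simply the rate that reproduces the reported 96 GB/s.
bus_width_bits = 64
data_rate_gbps = 12
bandwidth_gb_s = (bus_width_bits / 8) * data_rate_gbps  # bytes per transfer x transfer rate
print(f"Theoretical bandwidth: {bandwidth_gb_s:.0f} GB/s")  # -> 96 GB/s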

70 Comments on NVIDIA GeForce GTX 1630 Set To Launch Tomorrow

#1
P4-630
Finally!! I've been waiting for this for years!!! :D
Posted on Reply
#2
natr0n
I don't think I have seen a pure aluminium cooler in forever. Kinda cool/neat.
Posted on Reply
#3
Daven
Too little, too late, as the Nvidia fans would say. Or at least the ones who commented on the launch of the RX 6700.

FYI, companies can launch any products at any time, regardless of those sitting at home reading tech sites. You can call it Daven's law.

:)
Posted on Reply
#4
FreedomEclipse
~Technological Technocrat~
That shroud design reminds me of MSI's Gaming line of graphics cards.

Posted on Reply
#5
bonehead123
.....w.T.f....

Maybe I'm just missing something here, but why on earth would such a low-end card need a FULL aluminum heatsink PLUS 2 big fans to cool it... ????

Is there something inherently different about this card that makes it run extra hot, or were the designers just extra lazy and didn't give a sh*t about the thermals when they laid out all of the parts, or are the parts of such low quality that they create runaway temps ???
Posted on Reply
#6
PLAfiller
I wonder if we will get an LP version of this, since there is already one for the GTX 1650.
Posted on Reply
#7
agatong55
bonehead123: .....w.T.f....

Maybe I'm just missing something here, but why on earth would such a low-end card need a FULL aluminum heatsink PLUS 2 big fans to cool it... ????

Is there something inherently different about this card that makes it run extra hot, or were the designers just extra lazy and didn't give a sh*t about the thermals when they laid out all of the parts, or are the parts of such low quality that they create runaway temps ???
Use what you already have: why make a new heatsink or fan design when it's cheaper to use what you already have in stock?
Posted on Reply
#8
DeathtoGnomes
bonehead123: .....w.T.f....

Maybe I'm just missing something here, but why on earth would such a low-end card need a FULL aluminum heatsink PLUS 2 big fans to cool it... ????

Is there something inherently different about this card that makes it run extra hot, or were the designers just extra lazy and didn't give a sh*t about the thermals when they laid out all of the parts, or are the parts of such low quality that they create runaway temps ???
Yes, this card should have been the size of a Nano.
agatong55: Use what you already have: why make a new heatsink or fan design when it's cheaper to use what you already have in stock?
That's a thought: overstocked on coolers and fans.
Posted on Reply
#9
80-watt Hamster
There are going to be slot-powered models of the 1630, right?
Posted on Reply
#12
defaultluser
Well, at least they stepped up to the gold with GDDR6 memory. I was concerned that they would continue the castrated 64-bit card for one more generation!

Now we only need to see how many devices sporting DDR4 show up.
Posted on Reply
#13
Chaitanya
So a 6-pin connector is necessary for an "entry level" GPU; what a great time to be alive.
Posted on Reply
#14
catulitechup
$150 for this outdated Turing GPU: capped, don't know if NVIDIA cut encode capabilities like on the GT 1030, 64-bit memory, fewer shaders compared to the GTX 1050 Ti... I can only respond with this:



:)
Posted on Reply
#15
ModEl4
It doesn't make sense to impose a 6-pin requirement given the market positioning/specs; it is like they are doing it on purpose just to make the public get used to the high-TBP era, with no alternative solution at any price level except unnecessarily overpowered versions. There is no logic (no logic, just like what the US Open committee is doing in Djokovic's case, making unreasonable impositions/requirements without any relevant scientific basis, but that is another subject, sorry).
Posted on Reply
#16
Vayra86
A dual-slot cooler and you still need a PCIe power connector for this utter piece of junk?!

What the hell is the market for this? Russia? There's even a reintroduction of DVI on this baby; man, I half expected a VGA port next to it. What's next, GDDR5 making a return in the midrange?

It truly is one step forward and two steps back with Nvidia ever since they adopted RTX. Holy shit, what a cesspool. Luckily, 450 W Ada is around the corner, boys!
Posted on Reply
#18
ThrashZone
Hi,
The NV 3050 is $250; if they really wanted to do some good, they'd drop its price to $150, which sounds about right.
Maybe they just wanted to feed the memes some more after the pitiful 1650 drop :laugh:
Posted on Reply
#19
ARF
I know what they want :D

Posted on Reply
#20
Chrispy_
WHOA HOLD UP A MINUTE

512 CUDA cores? I was expecting 1280 CUDA cores or something. I know the way a Turing "core" is counted compared to Ampere makes this better than it sounds next to the 3050's 2560 CUDA cores, but this thing is going to be slower than the vanilla 1650, which you can sometimes find on clearance for $125 or so and which is flooding the used market at ~$100 (converting to USD, but I'm looking at EUR and GBP markets).

If it's 512 Turing CUDA cores and it boosts to the moon (like, >2.6 GHz), it's still only going to just match the woeful laptop 1650 Max-Q (1024 cores at 1290 MHz), and that was slated for being too slow compared to the non-Max-Q variant to be worthy of the name. So if the desktop 1650 offered disappointing performance for its price and power draw, the laptop variant was even more disappointing, and the Max-Q variant was bad even by those doubly-disappointing standards. Oh, and we're talking about disappointing in the market three years ago, not tomorrow.

I'm guessing that this is still TSMC 12 nm, not Samsung 8 nm, and therefore boosting to the moon isn't a reality. At a more realistic 1700 MHz, this thing isn't going to distinguish itself much from a GTX 960, something you can pick up on eBay for $50.
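Quick back-of-the-envelope check of that, using only the figures quoted above (CUDA cores x 2 FLOPs per clock x boost clock, so purely theoretical shader throughput, nothing measured; the 2.6 GHz entry is hypothetical):

# Naive shader-throughput estimate: CUDA cores x 2 FLOPs/clock x boost clock.
# Figures are the ones quoted in this post, not measured results; the 2.6 GHz
# entry is hypothetical, just to show what it would take to catch the Max-Q.
cards = {
    "GTX 1630 @ rated 1785 MHz": (512, 1785),
    "GTX 1630 @ 2600 MHz (hypothetical)": (512, 2600),
    "Laptop GTX 1650 Max-Q": (1024, 1290),
}
for name, (cores, mhz) in cards.items():
    gflops = cores * 2 * mhz / 1000  # theoretical FP32 GFLOPS
    print(f"{name:38s} {gflops:7.0f} GFLOPS")
# ~1828, ~2662 and ~2642 GFLOPS respectively: the 1630 only catches the
# Max-Q part if it really does boost past ~2.6 GHz.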
ARF: What does it target? :kookoo:
"Intel HD graphics" in a 3-year-old Dell/HP office PC
ARF: NVIDIA GeForce GTX 1630 Specs | TechPowerUp GPU Database
OH SHIT
Has @W1zzard broken the review embargo and spoiled "the surprise" by uploading his review data to the database too early, or is this just estimated data based on (core count*clocks) until otherwise tested?
Posted on Reply
#21
ModEl4
ARF: What does it target? :kookoo:


NVIDIA GeForce GTX 1630 Specs | TechPowerUp GPU Database
Probably we will get versions, like the 75 W TBP 1650, that don't need a 6-pin.
If Nvidia is greedy, maybe they will just match the RX 6400's performance/$ for those who want to upgrade (an old Skylake system, for example), pricing it at $119, who knows? (It won't be slower than -38% vs the GTX 1650, irrespective of what the TPU database says; I hope @W1zzard tests it soon.)




At the $149 price point, probably the below (due to currency, lol):
Posted on Reply
#22
Chrispy_
I found this: the T400, which is 384 CUDA cores on the same silicon as the GTX 1630, but with 80 GB/s of memory bandwidth instead of 96 GB/s.

So, it's probably very slightly nicer to game on than this dumpster fire, which basically can't handle any games made in the last four years.
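On paper (again, only the numbers quoted here, not benchmark results), the gap between the two looks like this:

# GTX 1630 vs T400 on paper, using the figures quoted above.
t400_cores, t400_bw_gb_s = 384, 80
gtx1630_cores, gtx1630_bw_gb_s = 512, 96
print(f"Core advantage:      {gtx1630_cores / t400_cores:.2f}x")      # ~1.33x
print(f"Bandwidth advantage: {gtx1630_bw_gb_s / t400_bw_gb_s:.2f}x")  # ~1.20x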

Posted on Reply
#23
AusWolf
First thoughts: Dual fan design and 6-pin power? Isn't that excessive for a card like this?

Second thoughts after looking at the heatsink: Oh... It'll probably be noisy as hell anyway.
Posted on Reply
#24
Alan Smithee
The big question is the NVENC/NVDEC feature set. We sorely need a low-end card with decent memory bandwidth for non-gaming (business & entry level content creation, e.g. training session re-edits and basic YouTube). The 1050 had a full encode/decode feature set for its time, but the 1030 did not. The "1630" name thus worries me. Is this really a replacement for the 1050 or the 1030?
Posted on Reply
#25
Dr. Dro
ARF: What does it target? :kookoo:


NVIDIA GeForce GTX 1630 Specs | TechPowerUp GPU Database
It's an unreleased product, so don't take that estimate as gospel; besides, you're comparing a full-die AMD card vs. a hilariously crippled last-generation product. The full TU117 wouldn't fare as poorly. This graphics card is a GT 1030 replacement, if the name didn't make that abundantly clear by now. These are intended to add multimedia support to computers with no (or obsolete) integrated graphics, with at best very light gaming.

Navi 24 is certainly faster, but it's an incompetent HTPC GPU due to its poor display engine (inability to drive more than two displays) and limited support for media handling (no encoding capabilities whatsoever, limited decoding support). Pick your poison: do you want to play games, or do you want multiple display outputs and the ability to transcode and watch movies? If it's the latter, the 1630 will be the better product to own.
Alan Smithee: The big question is the NVENC/NVDEC feature set. We sorely need a low-end card with decent memory bandwidth for non-gaming (business & entry level content creation, e.g. training session re-edits and basic YouTube). The 1050 had a full encode/decode feature set for its time, but the 1030 did not. The "1630" name thus worries me. Is this really a replacement for the 1050 or the 1030?
It's only an assumption, but I feel it's a safe one that this card would carry the same NVENC/NVDEC capabilities as the GTX 1650. It's the same die, and it's being marketed as a GTX, not a GT. Besides, being slower, NVIDIA needs this advantage against AMD, especially since Navi 24 inherently lacks the capability to do this even in the 6500 XT; the hardware simply cannot do it.
Posted on Reply