Thursday, May 19th 2022

NVIDIA GeForce GTX 1630 Launching May 31st with 512 CUDA Cores & 4 GB GDDR6

The NVIDIA GeForce GTX 1630 graphics card is set to launch on May 31st, according to a recent report from VideoCardz. The GTX 1630 is based on the same silicon as the GTX 1650, featuring a 12 nm Turing TU117-150 GPU with 512 CUDA cores and 4 GB of GDDR6 memory on a 64-bit memory bus. This is a reduction from the 896 CUDA cores and 128-bit memory bus of the GTX 1650; however, clock speeds rise, with a boost clock of 1800 MHz at a TDP of 75 W. The memory configuration yields a maximum theoretical bandwidth of 96 GB/s, exactly half of what is available on the GDDR6 GTX 1650. The card may be announced during NVIDIA's Computex keynote next week.
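The 96 GB/s figure follows directly from the bus width and the per-pin data rate; a quick sanity check (assuming the 12 Gbps GDDR6 chips the report mentions):

```python
def memory_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical bandwidth in GB/s: (bus width in bits / 8 bits per byte) * per-pin rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# GTX 1630: 64-bit bus with 12 Gbps GDDR6
print(memory_bandwidth_gb_s(64, 12))   # 96.0 GB/s
# GDDR6 GTX 1650: 128-bit bus with the same 12 Gbps chips
print(memory_bandwidth_gb_s(128, 12))  # 192.0 GB/s, exactly double
```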
Source: VideoCardz

140 Comments on NVIDIA GeForce GTX 1630 Launching May 31st with 512 CUDA Cores & 4 GB GDDR6

#126
catulitechup
ModEl4The TPU estimation is wrong.
It should be at least +15% from where they place it.
Around 89% of the 1050 Ti, and that's the worst-case scenario imo.
(Not to mention that the 1050 isn't going to be only -20% vs the 1050 Ti in today's TPU test, due to its 2 GB of RAM.)


Isn't the Intel Arc A310 supposedly a 512-shader-core design?
It should be a little slower than the 1630 despite the frequency advantage.
But if someone isn't interested in older games, and if the SRP is competitive, the Arc A310 is an interesting alternative due to DX12 Ultimate support and the better media engine; plus, the performance difference will grow smaller each year as Intel's drivers mature and future games gain better support for the Arc architecture.
Curiously, the GTX 1630 seems weaker than the GTX 1050 in some areas, for example:

the GTX 1050 has 32 ROPs while the GTX 1630 has 16 ROPs

the GTX 1050 has 40 TMUs while the GTX 1630 has 32 TMUs

the GTX 1050 has 640 shaders while the GTX 1630 has 512 shaders

however, the GTX 1630 has 4 GB of VRAM while the non-Ti GTX 1050 has only 2 GB

In my case I'm very interested in Arc because I don't want to give more money to NVIDIA or AMD; I only use the card for basic things like video playback, light gaming and emulators.

But the GTX 1630's price can give a better idea of what the Intel Arc A310 could cost. As you said, drivers can be trouble, especially for Windows users, but I use Linux most of the time, and the Linux drivers are normally better than the Windows ones.

As others said, these companies make huge profits yet would have us believe they can't offer products at $100 or lower. They have no excuse not to offer $100 products after taking huge advantage of the market during the mining era, when many of us couldn't buy any GPU.

:)
Posted on Reply
#127
80-watt Hamster
Something to remember is that the 1630 isn't even out yet. It's not going to be great (it's not meant to be), and it will almost certainly be overpriced (at this point, how could it not be?). To those calling this a "garbage" card and the like: so what? Should this not exist because it doesn't meet your performance requirements? That NVIDIA is launching this card at all suggests there's a market for graphics that are stronger than integrated and don't cost USD 200. Not being capable of 30 fps at 1080p High in CP2077 doesn't mean there isn't a use case.

But the 64-bit memory bus seems like a BS call. It'd be a shame for another card to contain a chip held back by its own memory bus. Looking at you, 6500 XT.
Posted on Reply
#128
R0H1T
trsttteThey don't necessarily need bad chips to sell a lower-end product; they can just fuse off perfectly fine chips to fill up shelf space and the product lineup (as stupid as it sounds, this is a pretty normal practice). They also probably have leftover parts from the mobile MX550, which doesn't make much sense even for the marketing value of an NVIDIA sticker on the laptop.
Sure, I was talking about one (of the few) scenarios where NVIDIA could sell it rather cheaply, mainly if they were forced to.
ModEl4I explained the reasons why imo it won't match the GT 1030's price; let's agree to disagree.
I didn't say they would; it was about how (if they needed to) they could price it really low, i.e. instead of throwing away lots of defective parts, sell them cheap.
Posted on Reply
#129
ModEl4
catulitechupCuriously, the GTX 1630 seems weaker than the GTX 1050 in some areas, for example:

the GTX 1050 has 32 ROPs while the GTX 1630 has 16 ROPs

the GTX 1050 has 40 TMUs while the GTX 1630 has 32 TMUs

the GTX 1050 has 640 shaders while the GTX 1630 has 512 shaders

however, the GTX 1630 has 4 GB of VRAM while the non-Ti GTX 1050 has only 2 GB

In my case I'm very interested in Arc because I don't want to give more money to NVIDIA or AMD; I only use the card for basic things like video playback, light gaming and emulators.

But the GTX 1630's price can give a better idea of what the Intel Arc A310 could cost. As you said, drivers can be trouble, especially for Windows users, but I use Linux most of the time, and the Linux drivers are normally better than the Windows ones.

As others said, these companies make huge profits yet would have us believe they can't offer products at $100 or lower. They have no excuse not to offer $100 products after taking huge advantage of the market during the mining era, when many of us couldn't buy any GPU.

:)
In some areas the 1630 is indeed weaker than the GTX 1050, but not by as much as the on-paper differences suggest.
For example, the ROP count is indeed double, but the GTX 1050 design is bandwidth-limited, so if you measure the pixel fillrate with an old 3DMark or the Beyond3D suite or whatever, it won't be anywhere near the theoretical difference; Turing also has better compression.
Or take the shaders into account: although the GTX 1630 has 20% fewer, Turing can execute floating-point and integer operations concurrently, so the effective IPC is not the same.
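The bandwidth-limit point above can be put into rough numbers. The sketch below uses the ROP counts quoted in this thread plus reference boost clocks and memory specs that are assumptions for illustration (GTX 1050: ~1455 MHz boost, 7 Gbps GDDR5 on a 128-bit bus):

```python
def pixel_fillrate_gpix_s(rops: int, boost_mhz: int) -> float:
    """Theoretical pixel fillrate = ROPs * clock; real cards fall short when bandwidth-limited."""
    return rops * boost_mhz / 1000

def bandwidth_gb_s(bus_bits: int, rate_gbps: float) -> float:
    """Theoretical memory bandwidth in GB/s."""
    return bus_bits / 8 * rate_gbps

# On paper the GTX 1050 has double the ROPs, but writing 32-bit pixels at
# ~46.6 GPix/s would need far more than the ~112 GB/s its memory provides,
# so the measured fillrate gap vs. the 1630 is much smaller than 2x.
print(pixel_fillrate_gpix_s(32, 1455), bandwidth_gb_s(128, 7))  # GTX 1050
print(pixel_fillrate_gpix_s(16, 1800), bandwidth_gb_s(64, 12))  # GTX 1630
```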
VideoCardz reported that the memory is 12 Gbps GDDR6 (GDDR6 is also what TPU's news post says), while the TPU database mentions 8 Gbps GDDR5, which is probably wrong.
The GTX 1650 officially had only a 1665 MHz boost clock, while the 1630's is much higher at 1800 MHz, and the TDP is reportedly the same 75 W despite the 1630 being so extensively cut down. So I assumed the real median frequency we will see in a similar design will be higher than the 1890 MHz a reference-clocked dual-fan 1650 can achieve, and I went with +3% (1950 MHz), saying that in this case it will be at worst 89% of the GTX 1050 Ti.
1950 MHz is a little on the high side, so it's not my preferred frequency prediction, but even if the actual median frequency ends up the same as the GTX 1650's, it will be at worst 86% of the GTX 1050 Ti.
In any case, it will be a lot faster than the 1050 2GB in today's TPU game suite. Even in 2016, when 2 GB of memory wasn't as much of a handicap as in 2022, the GTX 1050 was only 79-80% of the GTX 1050 Ti at 1080p High; so despite the on-paper advantages, the GTX 1050 will be slower than the 1630 imo.
If you ask me, I would give Intel a chance if the pricing is competitive (even though I have many older DX11 games I still haven't had the chance to play, so I don't think I'd have a good experience in some of them), because I want to support them; we need a third player imo.
Posted on Reply
#130
budget_Optiplex
As a low-end gamer running an ancient system with a 280 W power supply, this card looks interesting to me as a low-power upgrade over my current GTX 745, which performs well enough for me on my 1360x768 TV. I'm limited to PCIe 2.0, so the RX 6400 is out. I suspect this one will still be PCIe x16, and I'm hoping it will be reasonably priced. It would be nice to buy a new card for a change instead of my usual well-used ones. :p

Looking forward to seeing how this card performs and is priced!
Posted on Reply
#131
catulitechup
ModEl4If you ask me, I would give Intel a chance if the pricing is competitive (even though I have many older DX11 games I still haven't had the chance to play, so I don't think I'd have a good experience in some of them), because I want to support them; we need a third player imo.
I think the same way about the Arc GPUs; as you said, the market really needs another player to improve it.

:)
Posted on Reply
#132
Tartaros
budget_OptiplexI'm limited to PCI-E 2.0 so the RX6400 is out
I doubt pcie 2.0 will be that noticeable. You get upgraded video decoding too.
Posted on Reply
#133
trsttte
TartarosI doubt pcie 2.0 will be that noticeable. You get upgraded video decoding too.
In this case it very much is, because the RX 6400 runs only 4 lanes (noticeable in intensive applications; as a pure display output you should not see any difference).
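To put numbers on why the x4 link hurts on older boards: PCIe bandwidth scales per lane and roughly doubles each generation. A small sketch, where the per-lane figures are the commonly cited approximate effective rates (assumptions for illustration):

```python
# Approximate effective bandwidth per lane, in GB/s, per PCIe generation
PER_LANE_GB_S = {"2.0": 0.5, "3.0": 0.985, "4.0": 1.969}

def link_bandwidth_gb_s(gen: str, lanes: int) -> float:
    """Total host-link bandwidth for a given PCIe generation and lane count."""
    return PER_LANE_GB_S[gen] * lanes

# The RX 6400 is wired for only 4 lanes:
print(link_bandwidth_gb_s("4.0", 4))   # ~7.9 GB/s on a PCIe 4.0 board
print(link_bandwidth_gb_s("2.0", 4))   # 2.0 GB/s on a PCIe 2.0 board
# A full x16 card still gets 8 GB/s even on PCIe 2.0:
print(link_bandwidth_gb_s("2.0", 16))  # 8.0 GB/s
```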
Posted on Reply
#134
AusWolf
TartarosI doubt pcie 2.0 will be that noticeable. You get upgraded video decoding too.
It is noticeable with the 6400 because it only uses x4 lanes.
Posted on Reply
#135
AleXXX666
ArkzPretty sure it's for HTPC users that usually use a 1030 or 710.


Well, it is a 10-series card really, hence GTX not RTX. And it's obviously pretty low-level for HTPC use, running entirely from board power. I wouldn't expect it to have RT features, and even if it did, it would run Quake II RTX like crap.
An HTPC is NOT a gaming PC; come on, why do you need "power" there lmfao
ThrashZoneHi,
Guessing nvidia missed the TPU poll @W1zzard
"Would you buy a 4 GB GPU in 2022?" I believe the results were a resounding no :laugh:
Not every single person builds a PC for gaming lol. And having an HDMI output for TV watching is better than buying an overpriced motherboard for its HDMI port, paired with a trashy iGPU lol
Posted on Reply
#136
lexluthermiester
AleXXX666An HTPC is NOT a gaming PC; come on, why do you need "power" there lmfao
Why not? Most people do both with an HTPC. A Home Theater Personal Computer is usually going to be an entertainment device, and there is no reason why it can't or shouldn't do gaming.
AleXXX666Not every single person builds a PC for gaming lol.
While true, most people would like to have that option.
Posted on Reply
#137
AleXXX666
lexluthermiesterWhy not? Most people do both with an HTPC. A Home Theater Personal Computer is usually going to be an entertainment device, and there is no reason why it can't or shouldn't do gaming.
Well, one could build an i9-12900K-powered thing and call it an "HTPC". Let's call things by their proper names: an HTPC is usually more multimedia-oriented, with a good-enough APU, not a beefy, performance-focused build. Some people are strange, putting a 3060 Ti inside but still calling it an "HTPC" :D
Posted on Reply
#138
kapone32
AleXXX666Well, one could build an i9-12900K-powered thing and call it an "HTPC". Let's call things by their proper names: an HTPC is usually more multimedia-oriented, with a good-enough APU, not a beefy, performance-focused build. Some people are strange, putting a 3060 Ti inside but still calling it an "HTPC" :D
I had a 5600G-based system attached to my TV. I would call that an HTPC, as I was using it to watch Disney+, DAZN, my TV provider, YouTube and my easy-to-run Steam games. The motherboard has an HDMI 2.0 connector, so I was unable to get 120 Hz at 4K. I had an RX 570, but that was the same. I found a 6500 XT and now enjoy all of those things at a 120 Hz refresh rate with VRR. I have no problem turning it down to 1080p for games I like to play on my TV, but it is still an HTPC. I will admit that since the TV is smart, it will be my daughter's as soon as she starts Grade 1.
Posted on Reply
#139
Arkz
AleXXX666An HTPC is NOT a gaming PC; come on, why do you need "power" there lmfao


Not every single person builds a PC for gaming lol. And having an HDMI output for TV watching is better than buying an overpriced motherboard for its HDMI port, paired with a trashy iGPU lol
To have the GPU do HW decoding probably.
Posted on Reply