Thursday, May 19th 2022

NVIDIA GeForce GTX 1630 Launching May 31st with 512 CUDA Cores & 4 GB GDDR6

The NVIDIA GeForce GTX 1630 graphics card is set to launch on May 31st, according to a recent report from VideoCardz. The GTX 1630 is based on the same 12 nm Turing TU117 GPU as the GTX 1650 (here in its TU117-150 variant), with 512 CUDA cores and 4 GB of GDDR6 memory on a 64-bit memory bus. This is a reduction from the 896 CUDA cores and 128-bit memory bus of the GTX 1650; however, clock speeds increase, with a boost clock of 1800 MHz at a TDP of 75 W. The memory configuration results in a maximum theoretical bandwidth of 96 GB/s, exactly half of what is available on the GDDR6 GTX 1650. The NVIDIA GeForce GTX 1630 may be announced during NVIDIA's Computex keynote next week.
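For reference, the quoted figures are self-consistent: peak bandwidth is the bus width in bytes times the effective per-pin data rate, and the 96 GB/s figure implies 12 Gbps GDDR6 (the data rate is inferred from the numbers above, not stated in the report). A quick sketch:

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s.

    bandwidth = (bus width in bytes) x (effective per-pin data rate)
    """
    return bus_width_bits / 8 * data_rate_gbps

# GTX 1630: 64-bit bus; 12 Gbps is inferred from the reported 96 GB/s
print(peak_bandwidth_gbs(64, 12.0))   # -> 96.0
# GDDR6 GTX 1650: the same memory on twice the bus width, hence double
print(peak_bandwidth_gbs(128, 12.0))  # -> 192.0
```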
Source: VideoCardz

140 Comments on NVIDIA GeForce GTX 1630 Launching May 31st with 512 CUDA Cores & 4 GB GDDR6

#101
Durvelle27
ARFNo, the GTX 1630 is a direct competitor to the RX 6400..
Maybe Nvidia fears that AMD will sell many RX 6400s and feels that the launch should get a response.

The red circle will be the performance estimate for a GTX 1630..

[chart: AMD Radeon RX 6500 XT Specs | TechPowerUp GPU Database]
Nope, the RX 6400 trades blows with the GTX 1650 non-Super.

From where it stands, there will be no direct competitor to the GTX 1630, as nothing falls in that segment; performance can't be too close to the GTX 1650.
Vayra86The reasons are, for most intents and purposes, non-issues; that is the point people are trying to make.

It matters when and if you are looking for those specific features, and if you were, you wouldn't buy said card missing them. If you aren't, gaming performance compares them best. It's like expecting it to do RT: obviously it won't, and even if it did, why would anyone care; much the same would apply if the 6500 XT were the subject.

See, comparisons matter because things are available at similar price points; that is the primary concern for a customer: what can I get for X money, or, if you value the feature over the expense, what does it cost to get feature X? Everything at any price point you pick is by definition comparable, and you compare on the features that you need and/or the performance that you'd use.

No need to keep going on about it, but I hope that explains better why you guys have talked past each other for the last page :D



Yeah man, I bought a 1080 for 420 EUR in 2017... feels close to winning the hindsight lottery. The card's still relevant; if you don't do RT, everything you want to run runs on it... I'm even on 3440x1440 and haven't been backing down below High. It's that fact that keeps me solidly in the "RT = stick it somewhere I don't see it" camp. I don't miss it for a second, and any tech that inflates GPU pricing across the board should be a fat no-no right now; and that's not even considering the climate impact of making processing jobs like this more expensive for very little reason other than shareholder gains. All of this was clear when mining surged and took GPUs away from people, not the last surge but the previous one(s). I honestly don't understand how blind people can be; or perhaps ignorance is bliss.

Could probably still sell it today for the initial price. A clear sign you don't want to even think of buying a GPU at this point...

All I can say is, I hope the upcoming gen is a good one. Otherwise it's definitely going to be a retro PC for me, full of emulators and legacy stuff that'll keep me and others going for, I dunno... a few dozen years lol... that bucket list of pre-2020 content is still pretty long :)

But did you sell all three for 450 together?! Whuuu...
Not as a bundle, but just the combined total.
#102
ARF
Durvelle27Nope, the RX 6400 trades blows with the GTX 1650 non-Super.

From where it stands, there will be no direct competitor to the GTX 1630, as nothing falls in that segment; performance can't be too close to the GTX 1650.
RX 6400 in PCIe 3.0 boards.
#103
catulitechup
ARFNo, the GTX 1630 is a direct competitor to the RX 6400..
Maybe Nvidia fears that AMD will sell many RX 6400s and feels that the launch should get a response.

The red circle will be the performance estimate for a GTX 1630..

[chart: AMD Radeon RX 6500 XT Specs | TechPowerUp GPU Database]
Sadly, according to the GPU database, the card's performance seems much worse than the RX 6400's; apparently it runs like my current GTX 1050 non-Ti 2 GB, but the GTX 1630 has 4 GB of VRAM.

www.techpowerup.com/gpu-specs/geforce-gtx-1630.c3916
This card seems more a response to the Intel Arc A310 than to the RX 6400.

:)
#104
mechtech
I remember years ago when entry-level cards were under $100. And they usually had a 128-bit bus.

I almost feel like things are regressing instead of progressing.
#105
TheinsanegamerN
mechtechI remember years ago when entry-level cards were under $100. And they usually had a 128-bit bus.

I almost feel like things are regressing instead of progressing.
well you know, gamers will consoom anything, and pay royally to do so. I'm surprised it took this long for card makers to catch on.
#106
TheoneandonlyMrK
I wouldn't be suggesting this or the 6400 to gamers; I'll leave it at that.
#107
TheinsanegamerN
TheoneandonlyMrKI wouldn't be suggesting this or the 6400 to gamers; I'll leave it at that.
The problem is that those of us with SFF builds whose PSUs don't have PCIe power connectors have no real upgrade path available, and especially those of us on PCIe 3.0 systems don't want to reward AMD for the disaster that is the 6400.
#108
ThrashZone
TheinsanegamerNThe problem is that those of us with SFF builds whose PSUs don't have PCIe power connectors have no real upgrade path available, and especially those of us on PCIe 3.0 systems don't want to reward AMD for the disaster that is the 6400.
Hi,
Sure you do, it's called a console; they too have dropped in price.
#109
AusWolf
ThrashZoneHi,
Sure you do, it's called a console; they too have dropped in price.
A console is never an upgrade path for a PC user. It's not even an upgrade, imo. It's just a box that you game on and then sell, or throw in the bin after a couple of years.
TheinsanegamerNwell you know, gamers will consoom anything, and pay royally to do so. I'm surprised it took this long for card makers to catch on.
It's not that simple. Game technologies haven't evolved so much that you couldn't game on a 4-5 year old graphics card. As it's been said here, the GTX 10-series is still fine. If AMD and Nvidia can make the same 4-5 year old card at lower cost (the same performance from a smaller GPU, the same bandwidth from 64-bit GDDR6 as from 128-bit GDDR5), then they will, and I can't blame them. We can drool over reviews of 6800s and 3080s, but the thing is, the average gamer doesn't need them. Those cards are the product of the death of CrossFire and SLI, which is reflected in chip design as well: a Navi 23 is two Navi 24s, and a Navi 21 is two Navi 22s.
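The 64-bit-GDDR6-versus-128-bit-GDDR5 point checks out arithmetically: halving the bus width is offset by doubling the per-pin data rate. A quick sketch (the 8 and 16 Gbps rates are typical per-pin speeds for each memory type, used here for illustration):

```python
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    # peak bandwidth = (bus width in bytes) x (per-pin data rate)
    return bus_width_bits / 8 * data_rate_gbps

gddr5_128bit = peak_bandwidth_gbs(128, 8)   # common GDDR5 speed, wide bus
gddr6_64bit = peak_bandwidth_gbs(64, 16)    # fast GDDR6 on half the bus
print(gddr5_128bit, gddr6_64bit)  # -> 128.0 128.0, identical peaks
```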
#110
ARF
TheinsanegamerNThe problem is that those of us with SFF builds whose PSUs don't have PCIe power connectors have no real upgrade path available, and especially those of us on PCIe 3.0 systems don't want to reward AMD for the disaster that is the 6400.
Can't you use integrated graphics as in Ryzen 7 5700G?
AMD Ryzen™ 7 5700G | AMD
#111
AusWolf
ARFCan't you use integrated graphics as in Ryzen 7 5700G?
AMD Ryzen™ 7 5700G | AMD
The iGPU in that chip is around GT 1030 level in performance. If you want something better, you have the used 1050 Ti LP, the rare 1650 LP, or the 6400.
#116
KLiKzg
They would be better off with GA-based 3030 versions, as Turing with the 1630 is about a year or two late! :cool:
#117
ARF
AusWolfDifferent graphical settings exist to tweak performance in such cases.
I know..
AusWolfNo one said that you need the highest of the highest even with a relatively cheap graphics card.
How much does lowering the settings help, though? You see a miserable 13 FPS on an RX 6400 on PCIe 3.0. I wouldn't have high expectations...
#118
Valantar
ARFI know..

How much does lowering the settings help, though? You see a miserable 13 FPS on an RX 6400 on PCIe 3.0. I wouldn't have high expectations...
That's at 1080p Ultra. Ultra gaming is dumb even on high-end GPUs, as it is extremely wasteful relative to the visual quality gains over the next step down, and there are plenty of performance gains to be had going even lower. 1080p Medium would most likely be perfectly playable for AC:V on a 6400 on PCIe 3.0.
ARFIf you want to seriously game, you don't use a SFF.
Lol, tell that to my 5800X + 6900 XT in 13 L. And you can quite easily fit something like an RX 6600 in a ~5-6 L case if you know what you're doing, but it's not necessarily cheap or easy. The challenge is if your case only supports LP GPUs, or you have a low-power PSU without PCIe power connectors, which is what @AusWolf brought up. There's significant value in improving the performance of <75 W GPUs for those cases.
#119
catulitechup
ARFWhen I see this kind of framerate, I lose motivation to look for that awful graphics card:

[chart: AMD Radeon RX 6400 Tested on PCI-Express 3.0 - Assassin's Creed Valhalla | TechPowerUp]
Sadly, this card's performance isn't enough for current games, and it's most noticeable at 1080p (maybe at 720p it can be better); this includes other cards with similar performance, like the RX 6500 XT.

For current games, as you said, an RX 6600 is the beginning of 60 FPS.

In summary, the GTX 1630 and similar cards are more for media PCs (decode capabilities) and light/older games.

:)
#120
ModEl4
catulitechupSadly, according to the GPU database, the card's performance seems much worse than the RX 6400's; apparently it runs like my current GTX 1050 non-Ti 2 GB, but the GTX 1630 has 4 GB of VRAM.

This card seems more a response to the Intel Arc A310 than to the RX 6400.

:)
TPU's estimation is wrong.
It should be at least +15% above where they place it.
Around 89% of a 1050 Ti, and that's the worst-case scenario imo (not to mention that the 1050 isn't going to be only -20% vs. the 1050 Ti in today's TPU test suite, due to its 2 GB of RAM).

Isn't the Intel Arc A310 supposed to be a 512-shader-core design?
It should be a little slower than the 1630 despite its frequency advantage.
But if someone isn't interested in older games, and if the SRP is competitive, the Arc A310 is an interesting alternative due to DX12 Ultimate support and the better media engine; plus, the performance difference will get smaller each year as Intel's drivers mature and future games gain better support for the Arc architecture.
#121
chrcoluk
80-watt HamsterNope, though I'd love to be wrong. Sub-100 isn't viable. 1030 launched at 80 because that's how much it needed to cost to make anyone any money. There's no reason to expect its successor to cost the same or less five years later.
They must have got costs down a lot then; I paid £30 for mine brand new. It was also the GDDR5, not the DDR4, version.
#122
Valantar
chrcolukThey must have got costs down a lot then; I paid £30 for mine brand new. It was also the GDDR5, not the DDR4, version.
Either that was a pricing error or some kind of clearance, as that was most definitely sold at a loss for someone. £30 isn't a viable sales price for any GPU at retail; most likely the BOM plus production costs are higher than that, never mind shipping, amortization of design costs, etc.
#123
R0H1T
Should've released this 2 years back during Covid, but Nvidia's gonna do Nvidia :shadedshu:
ModEl4I don't see an above 100mm² 12nm TSMC design being able to match it in price, let alone the 200mm² TU117 based one.
Depends on the binning; if they have lots of TU117s to throw away, then they'd better sell them as this instead. Not to mention, (manufacturing) prices must've come down a lot for this old node.
#124
ModEl4
R0H1TShould've released this 2 years back during Covid, but Nvidia's gonna do Nvidia :shadedshu:

Depends on the binning; if they have lots of TU117s to throw away, then they'd better sell them as this instead. Not to mention, (manufacturing) prices must've come down a lot for this old node.
I explained the reasons imo why it won't match gt 1030 price, let's agree to disagree.
Your argument is that if they have lots of TU117 to through away (what, they had secretly building so much excessively defective stock the last 3 years, as if 12nm TSMC wasn't one of the highest yield processes of the post FinFet era and especially Nvidia custom 12nm (12FFN) that emphasizes more on density and yield instead of frequency, allowing Nvidia to attempt a consumer 754mm² monolith design while at the same time TU117 is only 200mm², so whatever excessively defective stock there is, it's so small that the workstation market would have easily absorbed it, but forget all about that because I don't want to have meaningless arguments for nothing) they better sell them, OK let's suppose it's that case, they have lot's of defective stock (cut in half defective) and they must sell them, what will force them to match GT1030 SRP, is there competition? Back in the day at Q2 2017 when gt 1030 launched we had RX 560 2GB at $99 and RX560 2GB is 1.8X faster or around that range vs gt 1030 on 1080p high, now RX 6400 is $160 and it will be at best case scenario 1.5X faster than gt 1630 4GB in today's 1080p TPU setup and let's not talk about the -14% PCI-E 3 deficit, so is there a reason for Nvidia to sell it less than $119? (unless it takes account the upcoming price compression current ≤$499 model lineup will face after Ada Lovelace and RDNA3 launch and of course let's not forget Intel's ARC series that with the delay they facing, Intel will be forced to price their entire lineup based on next gen competition, but still $99 is the limit due to die size and cost process differences imo, although the die size difference is enough support my case, I would argue that the node price situation isn't what you have in mind but again let's agree to disagree)
#125
trsttte
R0H1TDepends on the binning, if they have lots of TU117 to throw away then they'd better sell them as this instead not to mention (manufacturing) prices must've come down a lot for this old node.
They don't necessarily need bad chips to sell a lower-end product; they can just fuse off perfectly fine chips to fill up shelf space and the product lineup (this is a pretty normal practice, as stupid as it sounds). They also probably have leftover stock from the mobile MX550, which doesn't make much sense even for the marketing value of an Nvidia sticker on a laptop.

They have competition from AMD and, if leaks are to be believed, will have strong competition from Intel at this price point; it doesn't make sense to design a new chip for the lower end of the market when they have a well-established one ready to fill the gap. They could also discount the 1650, which would be a lot more appealing. I guess it will all depend on how the market moves.

They are already selling an even more cut-down version of this chip as the Quadro T400, and a slightly better one as the Quadro T600; search for benchmarks of those cards for a closer approximation of the performance on offer. This card will land in the middle.