Thursday, May 19th 2022
NVIDIA GeForce GTX 1630 Launching May 31st with 512 CUDA Cores & 4 GB GDDR6
The NVIDIA GeForce GTX 1630 graphics card is set to launch on May 31st, according to a recent report from VideoCardz. The GTX 1630 is based on the GTX 1650, featuring a 12 nm Turing TU117-150 GPU with 512 CUDA cores and 4 GB of GDDR6 memory on a 64-bit memory bus. This is a reduction from the 896 CUDA cores and 128-bit memory bus found in the GTX 1650; however, clock speeds increase, with a boost clock of 1800 MHz at a TDP of 75 W. This memory configuration results in a maximum theoretical bandwidth of 96 GB/s, exactly half of what is available on the GDDR6 variant of the GTX 1650. The NVIDIA GeForce GTX 1630 may be announced during NVIDIA's Computex keynote next week.
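The halved bandwidth figure follows directly from the bus width. A quick back-of-the-envelope sketch, assuming the 12 Gbps effective GDDR6 data rate implied by the article's 96 GB/s figure (not an officially confirmed spec):

```python
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s: bytes per transfer times data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# GTX 1630: 64-bit bus at an assumed 12 Gbps effective GDDR6 rate
print(peak_bandwidth_gb_s(64, 12.0))   # 96.0 GB/s
# GDDR6 GTX 1650: 128-bit bus at the same rate -> exactly double
print(peak_bandwidth_gb_s(128, 12.0))  # 192.0 GB/s
```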
Source: VideoCardz
140 Comments on NVIDIA GeForce GTX 1630 Launching May 31st with 512 CUDA Cores & 4 GB GDDR6
From where things stand, there will be no direct competitor to the GTX 1630, as nothing else falls in that segment; its performance can't be too close to the GTX 1650's.
Not as a bundle, but just the combined total.
This card seems more a response to the Intel Arc A310 than to the RX 6400.
:)
I almost feel like things are regressing instead of progressing.
Sure you do, it's called a console; those have dropped in price too.
AMD Ryzen™ 7 5700G | AMD
Normal gaming graphics begins from Radeon RX 6600...
AMD Radeon™ RX 6600 Graphics Card | AMD
AMD Radeon™ RX 6600 XT Graphics Card | AMD
ASRock RX 6600 Challenger ITX Specs | TechPowerUp GPU Database
AMD Radeon RX 6400 Tested on PCI-Express 3.0 - Assassin's Creed Valhalla | TechPowerUp
For actual games, as you said, the RX 6600 is where 60 fps begins.
Summing up, the GTX 1630 and similar cards are more for media PCs (decode capabilities) and light / older games.
:)
It should be at least +15% above where they place it.
Around 89% of a 1050 Ti, and that's the worst-case scenario imo.
(Not to mention that the 1050 wouldn't be only -20% vs. the 1050 Ti in today's TPU test, due to its 2 GB of RAM.)
Isn't the Intel Arc A310 supposedly a 512-shader-core design?
It should be a little slower than the 1630 despite the frequency advantage.
But if someone isn't interested in older games, and if the SRP is competitive, the Arc A310 is an interesting alternative due to its DX12 Ultimate support and better media engine. Plus, the performance gap will get smaller each year as Intel's drivers mature and future games gain better support for the Arc architecture.
Your argument is that if they have lots of TU117 dies to throw away, they had better sell them. What, have they secretly been building that much excessively defective stock for the last 3 years? TSMC 12 nm was one of the highest-yield processes of the post-FinFET era, especially Nvidia's custom 12FFN, which emphasizes density and yield over frequency and allowed Nvidia to attempt a 754 mm² consumer monolith. TU117 is only 200 mm², so whatever excessively defective stock exists is so small that the workstation market would have easily absorbed it. But forget all that, because I don't want to have meaningless arguments over nothing.

OK, let's suppose that is the case: they have lots of defective stock (cut in half, even) and they must sell it. What would force them to match the GT 1030's SRP? Is there competition? Back in Q2 2017, when the GT 1030 launched, we had the RX 560 2GB at $99, and the RX 560 2GB is around 1.8X faster than the GT 1030 at 1080p high. Now the RX 6400 is $160 and will be, in the best case, 1.5X faster than a GTX 1630 4GB in today's 1080p TPU test setup, and let's not even talk about its -14% PCIe 3.0 deficit. So is there a reason for Nvidia to sell it for less than $119? (Unless Nvidia is accounting for the price compression the current ≤$499 lineup will face after the Ada Lovelace and RDNA3 launches. And of course let's not forget Intel's Arc series: with the delays Intel is facing, it will be forced to price its entire lineup against next-gen competition. Still, $99 is the floor due to die size and process cost differences imo. Although the die size difference alone is enough to support my case, I would argue the node pricing situation isn't what you have in mind, but again, let's agree to disagree.)
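The pricing argument above boils down to a performance-per-dollar comparison. A minimal sketch using the commenter's own figures (the RX 6400's 1.5X lead and the $99 / $119 GTX 1630 price points are taken from the post, not official data):

```python
def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Relative performance per dollar (higher is better value)."""
    return relative_perf / price_usd

# RX 6400: ~1.5x a GTX 1630 at 1080p (per the post), priced at $160
rx6400 = perf_per_dollar(1.5, 160)
# Hypothetical GTX 1630 price points discussed in the post
gtx1630_at_99 = perf_per_dollar(1.0, 99)
gtx1630_at_119 = perf_per_dollar(1.0, 119)

# At $99 the GTX 1630 edges out the RX 6400 on value; at $119 it no longer does.
print(gtx1630_at_99 > rx6400)    # True
print(gtx1630_at_119 > rx6400)   # False
```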
They have competition from AMD, and if the leaks are to be believed, they'll face strong competition from Intel at this price point. It doesn't make sense to design a new chip for the lower end of the market when they have a well-established one ready to fill the gap. They could also discount the 1650, which would be a lot more appealing; I guess it will all depend on how the market moves.
They are already selling an even more cut-down version of this chip as the Quadro T400, plus the slightly better Quadro T600; search for benchmarks on those cards for a closer approximation of the performance on offer. This card will land in the middle.