# Gainward GeForce GTX 1630 Ghost



## W1zzard (Jun 28, 2022)

The NVIDIA GeForce GTX 1630 launches today. The new card is targeted at the sub-$200 segment and goes head-to-head with AMD's Radeon RX 6400 and Intel's ARC A380. Unlike the Radeon, the GTX 1630 has support for the full PCIe x16 interface.



----------



## ARF (Jun 28, 2022)

Thank you for the review - very thorough and nicely done.

This card should have a new price: $49.


----------



## Quicks (Jun 28, 2022)

They could have at least given it a 128-bit interface. Paying more than $60 is too much for this.


----------



## Dr. Dro (Jun 28, 2022)

It's a flop at anything above $79. It needs to match the street price of the aging RX 550, as it's better than that card in practically every regard. It will find its market if the price is right.


----------



## birdie (Jun 28, 2022)

Pros:

- Three times as fast as the GT 1030 it replaces

Cons:

- Three times as expensive as the GT 1030 it replaces

Summary:

Would be a great card for people with no iGPU if not for the exorbitant price.


----------



## ExcuseMeWtf (Jun 28, 2022)

This is worse than I expected, lmao, and I didn't even expect it to match the RX 6400.


----------



## watzupken (Jun 28, 2022)

The performance is not unexpected, since we can take a cue from how the GTX 1650 performs. The price is bad, and it basically makes the RX 6400 look better instead of challenging it. To be honest, this card will have its place in systems with CPUs that have no iGPU, where graphics performance is completely unimportant. As for whether an AV1 decoder is important, it really depends on the use case. For people looking to build a multimedia PC that can last a number of years, I think it may be good to have this feature.


----------



## mahirzukic2 (Jun 28, 2022)

Absolutely bad. Even at 75$ performance per dollar is OK-ish at best. This is the sign of what kind of prices to expect for the next gen cards.



watzupken said:


> The performance is not unexpected, since we can take a cue from how the GTX 1650 performs. The price is bad, and it basically makes the RX 6400 look better instead of challenging it. To be honest, *this card will have its place* in systems with CPUs that have no iGPU, *where graphics performance is completely unimportant*. As for whether an AV1 decoder is important, it really depends on the use case. For people looking to build a multimedia PC that can last a number of years, I think it may be good to have this feature.


Not really, because you already have that with the 1030. This thing costs $150, only $20 below the 1650's MSRP. A 12% reduction in price for a 40% reduction in performance. Yaaaaaay. NOT.
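The value math in the post above can be sketched in a few lines. Note that the prices are the post's figures ($150 for the 1630, a $170 street MSRP for the 1650), not official launch MSRPs:

```python
def value_ratio(price_new, price_old, perf_new, perf_old=1.0):
    """Performance-per-dollar of the new card relative to the old one."""
    return (perf_new / price_new) / (perf_old / price_old)

price_cut = 1 - 150 / 170          # ~12% cheaper than the 1650
perf_new = 1 - 0.40                # ~40% slower than the 1650
ratio = value_ratio(150, 170, perf_new)
print(f"price cut: {price_cut:.0%}, perf/$ vs GTX 1650: {ratio:.2f}x")
```

With these numbers the 1630 delivers about 0.68x the performance per dollar of the 1650, which is the point the post is making.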


----------



## renz496 (Jun 28, 2022)

Made to replace the GT 1030; NVIDIA can't make the GT 1030 forever. As for the price, partners will most likely offer big discounts soon.


----------



## Blaeza (Jun 28, 2022)

We were expecting bad, but THIS bad?  Oh dear...


----------



## defaultluser (Jun 28, 2022)

Quicks said:


> They could have at least given it a 128 bit interface. Paying more than 60$ is too much for this.




Yeah. If you thought the 6500 XT's pathetically small Infinity Cache was bad, just imagine pulling off the same trick without said cache.

They could have added a single memory chip and bumped things to 96-bit/6 GB, for maybe $10 more on the price.


----------



## ARF (Jun 28, 2022)

The only sensible explanation I have is that NVIDIA is forced (a contract for a number of wafers?) to keep producing something on the aging TSMC process, hence they don't stop production and move everything to 7 nm and 4 nm. It is ridiculous, and the bad thing is that it's always the user who loses.


----------



## Oberon (Jun 28, 2022)

*waits for the wailing and gnashing of teeth over things like the gimped PCIe interface that we saw with the low-end Radeons...*


----------



## docnorth (Jun 28, 2022)

Ouch, that’s indeed worse than the 1050 Ti. Still, given the low power consumption and the 16 PCIe lanes, it can have a place in the market _if_ the retail price falls to 1030 levels...


----------



## 80-watt Hamster (Jun 28, 2022)

AMD:  "You know, anything will sell in this market.  Let's take these harvested chips that our partners don't seem to want, and configure them as a discrete product in a way that doesn't even make the best use of its own resources, and sell it for too much money."  (releases RX 6400/6500)

Nvidia:  "I can't believe we haven't done that already.  Hey, intern; get on that."  (releases GTX 1630)

Everybody (mostly) hated the 1030, but at least it initially landed at a reasonable price point, particularly with any kind of discount, which wasn't usually too hard to find.  Counter to popular opinion, they _could_ game if one stuck to less-recent titles at low-to-medium detail and resolutions. The 1630 looks like it could do the same, but needs a USD30 price cut at minimum. To those asking for sub-$100, it ain't gonna happen outside of rebates or clearances, since less than that won't even cover production/distribution costs (speculating there, of course).


----------



## ShurikN (Jun 28, 2022)

Imagine releasing a card that makes the RX 6400 look amazing.


----------



## Punkenjoy (Jun 28, 2022)

ARF said:


> The only sensible explanation that I have is that nvidia is forced (contract? for a number of wafers?) to produce something using the aging TSMC process, hence they don't stop the production and move to 7 nm and 4 nm. It is ridiculous, and the bad thing is that it is always the user who loses.


Well, I suspect they wanted a super-high-margin card for the low end to benefit from the GPU price hike due to crypto. The bubble just burst a little bit too early.

But does it matter? Nah. Many people will buy this instead of a 6400/6500 XT just for the Nvidia name. In the end, it's just a refresh of the 1030.


----------



## Denver (Jun 28, 2022)

Realistically the price should be $99-120. People tend to suggest much lower prices, but those are probably impossible; the memory chips alone must cost about $50.


----------



## Pumper (Jun 28, 2022)

NVIDIA must have worked really hard to make a turd like this on purpose, just to beat AMD for the title of worst GPU on the market.


----------



## thewan (Jun 28, 2022)

There is no automagic fallback for 8K YouTube videos, dear Mr. W1zzard. If you try playing an 8K YouTube video, at least on my system, it uses AV1 and doesn't fall back to VP9. Since I have a GTX 1050, it uses my rather ancient CPU to do the decoding, which more or less brings my poor PC to a halt.
So please don't use the word "magic" without doing proper research. AV1 decode is a necessity, not in the future, but right now.

I'm assuming that since this has DP 1.4a, it supports 8K TVs. There will be people with 8K TVs (please don't talk about practicality; people will buy them regardless of whether they can tell the difference), and using this card in an HTPC is a big no-no based on the above.


----------



## ixi (Jun 28, 2022)

Intel A380 doesn't look that terrible at that price.


----------



## john_ (Jun 28, 2022)

AMD came out with the RX 6500 and everyone cursed that card. AMD's idea was great in a period of scalpers and miners: offer a card no scalper or miner would want to get. But the price, the performance and the cut-down feature set were definitely good reasons for the negative reaction.

What follows is ridiculous and ANTI-CONSUMER, because we now have Intel and Nvidia producing cards based on DESKTOP GPUs - so no limitations there that couldn't be avoided - that are even slower than the RX 6400. Today the RX 6500 XT looks like a good option, and IT IS NOT. It is a regression. Just look at where the RX 570 4GB sits. I mean, WHAT THE H?


----------



## r9 (Jun 28, 2022)

ARF said:


> Thank you for the review - very deep and nice.
> 
> This card should have a new price - 49$.


.... and then buy it when it's on sale for $29.99


----------



## Al Chafai (Jun 28, 2022)

EVGA is listing this card for $199, and the 1650 is selling for $179 with a rebate, lol.


----------



## Chaitanya (Jun 28, 2022)

Didn't expect to see the day when even AMD's heavily castrated 6500XT and 6400 look like decent value propositions.


----------



## r9 (Jun 28, 2022)

Pumper said:


> nvidia must have worked really hard to make a turd like that on purpose just to beat AMD as the one having the worst GPU on the market.


Nah the only true golden turd competitor is Intel ARC.


----------



## AnarchoPrimitiv (Jun 28, 2022)

I would love to see how this card performs against AMD's integrated 680M. Just doing a quick check of the 680M on NotebookCheck shows that it looks to be substantially faster.


thewan said:


> There is no automagic fallback for 8K YouTube videos, dear Mr. W1zzard. If you try playing an 8K YouTube video, at least on my system, it uses AV1 and doesn't fall back to VP9. Since I have a GTX 1050, it uses my rather ancient CPU to do the decoding, which more or less brings my poor PC to a halt.
> So please don't use the word "magic" without doing proper research. AV1 decode is a necessity, not in the future, but right now.
> 
> I'm assuming that since this has DP 1.4a, it supports 8K TVs. There will be people with 8K TVs (please don't talk about practicality; people will buy them regardless of whether they can tell the difference), and using this card in an HTPC is a big no-no based on the above.


Is there a demographic that is in the market for a cheap 1630 BUT can afford an 8K display?


----------



## Chrispy_ (Jun 28, 2022)

Ignoring the almost uselessly low performance (even at 1080p it fails to hit 30 fps in most titles tested), why is it such a tall card?

It already slightly violates the PCI card specifications, and on top of that it has a top-mounted PCIe power connector, adding another 30 mm or so to the violation....

There are ATX, mATX, mITX cases, enclosures etc that do not have room for out-of-spec expansion cards. HTPC cases are a prime example of this, often needing not just an in-spec card, but a low-profile or single-slot one as well.

There are two standard card heights in the official spec:

- Standard (full-height): bracket height 120 mm, PCB height 107 mm from the edge connector (effectively flush with the top of the bracket)
- MD1 & MD2 low-profile: bracket height 79.2 mm, PCB height 64.41 mm (MD1 and MD2 vary only in card length)

Anything above the 107 mm maximum height is a violation of the spec, and for such a low-power card there's absolutely no need for that nonsense. This isn't a gaming card, because it barely runs games; it's a media encode/decode card, _far more likely_ than most other dGPUs to be put into a space-constrained HTPC case or SFF build!
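The height limits above boil down to a one-line check. A tiny sketch (the 111.15 mm example card height is a made-up illustration, not a measured figure for this card):

```python
# Full-height and low-profile PCB height limits quoted in the post above,
# measured from the edge connector.
FULL_HEIGHT_MAX_MM = 107.0
LOW_PROFILE_MAX_MM = 64.41

def fits(card_height_mm, low_profile=False):
    """True if a card's PCB height is within the quoted spec limit."""
    limit = LOW_PROFILE_MAX_MM if low_profile else FULL_HEIGHT_MAX_MM
    return card_height_mm <= limit

print(fits(107.0))    # exactly at the full-height limit -> True
print(fits(111.15))   # a hypothetical card ~4 mm over spec -> False
```

A top-mounted power connector would add its cable bend radius on top of the PCB height, which is why the post counts it against the same limit.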


----------



## ThrashZone (Jun 28, 2022)

Hi,
The engineers must have had a good laugh at this GPU request.
I'm sad for W1zzard, who has to take time out of his likely busy life to test these POS GPUs.


----------



## Anymal (Jun 28, 2022)

What a waste of materials. It's sad and irresponsible.


----------



## ARF (Jun 28, 2022)

r9 said:


> Nah the only true golden turd competitor is Intel ARC.



Arc is the only chip that supports both AV1 encode and decode, which makes it a really good choice for HTPC builds.

Intel ARC GPUs trump AMD and Nvidia with full hardware AV1 codec support as game streaming demo vs HEVC shows - NotebookCheck.net News



AnarchoPrimitiv said:


> I would love to see how this card performs against AMD's integrated 680m
> 
> Is there a demographic that exists that is in the market for a cheap 1630, BUT can afford an 8K display?



No, but at least AV1 decode is recommended.
Neither this card nor Navi 24 (RX 6400 and RX 6500 XT) supports it.


----------



## bobalazs (Jun 28, 2022)

For Christ's sake, at least make an RX 480 equivalent for that price.


----------



## r9 (Jun 28, 2022)

They should make a 16GB version of the card; it would be a best seller, as people love their gigabytes.


----------



## LabRat 891 (Jun 28, 2022)

Credit where it's due:
A. It performs correctly for the model number assigned.
B. It's at least as 'efficient' as Navi 24.

Now, for the overall picture:
If the 6500 XT was so badly panned, this thing should be hated even more ravenously.
Realistically though, this could end up 'priced right' and be an acceptable option for some. Gotta keep in mind how many consumers aren't comfortable with buying used, even when it's by far the better value proposition (sometimes even including "buy twice, cry twice").


----------



## droopyRO (Jun 28, 2022)

Thanks for the review, but I wish you had tested this type of card at 720p and 900p too. Knowing that the 1630 gets 3 fps at 4K in some game is not useful.


----------



## looniam (Jun 28, 2022)

Chrispy_ said:


> There are two standard card heights in the official spec:
> 
> - Standard (full-height): bracket height 120 mm, PCB height 107 mm from the edge connector (effectively flush with the top of the bracket)
> - MD1 & MD2 low-profile: bracket height 79.2 mm, PCB height 64.41 mm (MD1 and MD2 vary only in card length)



Not sure where you got those "official" specs from.

Nor does it account for the tolerances allowed... just saying.


----------



## ARF (Jun 28, 2022)

droopyRO said:


> Thanks for the review, but I wish you had tested this type of card at 720p and 900p too. Knowing that the 1630 gets 3 fps at 4K in some game is not useful.



It will probably run CS: Source at 4K with medium settings.


----------



## jesdals (Jun 28, 2022)

Nice review - but "Challenging the AMD RX 6400 in the race to the bottom" should have been the title. I find it hard to get excited about a 4GB card with a 64-bit bus.


----------



## Athlonite (Jun 28, 2022)

This review should have started and ended with these words:

IT'S A PIECE OF SHIT, SO DON'T BUY IT

with a small addendum: if you are this desperate for a dGPU, then buy a second-hand RX 570/580.


----------



## defaultluser (Jun 28, 2022)

Al Chafai said:


> EVGA is listing this card for 199$ and the 1650 is selling for 179 with rebate lol.





If you're already being forced into a 6-pin connector, why not a 130 W card?



https://www.evga.com/products/product.aspx?pn=08G-P5-3551-KR


----------



## ModEl4 (Jun 28, 2022)

Like a Swiss watch, it performs exactly as I expected: -38% versus the GTX 1650, or -15% versus a GTX 1050 Ti, based on the average frequency it achieved in TPU's tests (1868 MHz). I did think it would at least hit the GTX 1650's real tested average frequency (1890 MHz, which would have put it at -14% versus the GTX 1050 Ti), since the TDP is the same, the advertised boost is 1785 MHz versus the GTX 1650's 1665 MHz, and the chip is cut down by half.
The unknown factor is the SRP. If Nvidia doesn't issue a press release with an SRP, just boycott this model (also if the SRP is above $119).
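The post's estimate implicitly assumes performance scales linearly with the achieved average clock at a fixed shader count. A minimal sketch of that assumption, using the post's numbers (TPU-measured averages, not anything I've benchmarked):

```python
def scale_perf(perf_at_ref_clock, clock_mhz, ref_clock_mhz):
    """Linear clock-scaling estimate: perf is assumed proportional to clock."""
    return perf_at_ref_clock * clock_mhz / ref_clock_mhz

# Measured: 0.85x a GTX 1050 Ti (-15%) at a 1868 MHz average. If the chip
# instead sustained the GTX 1650's measured 1890 MHz average, the same
# scaling predicts roughly 0.86x (-14%), matching the post's figures.
est = scale_perf(0.85, 1890, 1868)
print(f"estimated perf vs 1050 Ti at 1890 MHz: {est:.3f}")
```

It's only a back-of-envelope model (memory bandwidth doesn't scale with core clock), but it shows where the -15% vs. -14% figures come from.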


----------



## mechtech (Jun 28, 2022)

Ummmm. Lol?

Also:
"Neither Gainward nor NVIDIA were willing to provide any pricing information"

Yeah, waiting for reviews and seeing the comments.

Looks like it should be half the cost of a 6400.


----------



## Valantar (Jun 28, 2022)

Pumper said:


> nvidia must have worked really hard to make a turd like that on purpose just to beat AMD as the one having the worst GPU on the market.


Hey, they're the biggest actor around; they likely feel the need to truly offer a _full range_ of GPU performance - from the best of the best down to the worst of the worst. That's what a market leader should do, right?


thewan said:


> I'm assuming that since this has dp 1.4a, it supports 8k TVs. There will be people having 8k TVs (please don't talk about practicality, people will buy them regardless they can tell the difference or not) and using this card in a htpc is a big nono based on the above.


Even if this is true, the niche use case you're talking about is extremely unlikely, and, well, if you've got an 8k TV and buy this to run it ... that's PEBKAC, not a product issue. If you buy a Honda Accord and sign up for the 24h Le Mans race, you don't get to complain that you finish last. There's no such thing as an idiot-proof product.


----------



## TheoneandonlyMrK (Jun 28, 2022)

They ran out of 1030s and didn't see the dip coming, IMHO, because this shit would have passed as doable a year ago, way after these chips would have been ordered for production.

Now, at that price, this is a joke.

F$£K, this existing as a new product now is a joke. The first 16-series card was in 2019, wasn't it?


----------



## usiname (Jun 28, 2022)

It wouldn't be a bad GPU if it were 35 W and $60-80, but the power consumption is like a 1650's, the price is like a 1650's, and the performance is 45% worse than a 1650, three years later...


----------



## catulitechup (Jun 28, 2022)

droopyRO said:


> Thanks for the review, but I wish you had tested this type of card at 720p and 900p too. Knowing that the 1630 gets 3 fps at 4K in some game is not useful.


@W1zzard

Thanks for the test, but as others have said, a 720p test would be good, because this card is an embarrassment above 720p in games, especially in newer titles.

And with respect to the GTX 1630 at its $150 price...


----------



## Garrus (Jun 28, 2022)

The RTX 3050 is 3 times faster, LOL. This has got to be the worst GPU ever released. The Radeon RX 6400 and 6500 XT look like a steal in comparison.

This GPU will keep the GTX 1060 in people's computers for another half a decade, lol... 24 fps average...


----------



## GhostRyder (Jun 28, 2022)

OK, I expected this to be a competitor to the RX 6400... How is it completely blown away by the RX 6400, which can be had cheaper or at the same price in most cases I see right now (it's not like we can even argue about ray tracing)? At this performance level it should be sub-$100 at a bare minimum, with a much smaller cooler. It's a quiet card, but you can get a single-slot, low-profile RX 6400, while this looks like it should be at least an RTX 3050 given the size it sits at.

Honestly, a disappointing release in my opinion.


----------



## _Flare (Jun 28, 2022)

Performs like a 4GB 1050 non-Ti.
@W1zzard are you interested in checking that?


----------



## ModEl4 (Jun 28, 2022)

_Flare said:


> Performs like a 4GB 1050 non-Ti.
> @W1zzard are you interested in checking that?


Why even bother? Anyway, there wasn't a 4GB 1050 non-Ti for desktop, if I remember correctly.
Still, the GTX 1630 would be around +7% over a theoretical 1050 4GB; if he tested the 2GB version on today's TPU testbed, the gap would logically be a lot bigger than 7% due to the memory limitation.


----------



## 80-watt Hamster (Jun 28, 2022)

Al Chafai said:


> EVGA is listing this card for 199$ and the 1650 is selling for 179 with rebate lol.



Something... not-good is happening at EVGA behind the scenes, I feel.  B-stock used to track pretty closely with used prices, but they've REALLY de-coupled over the past several months.  Now they want one to fork over an extra fifty for the privilege of buying the EVGA version of something that's not worth the $150 it's "supposed" to be?  I am disappoint.


----------



## AusWolf (Jun 28, 2022)

Gamers: "the 6400 is the worst graphics card of the decade"

Nvidia: "hold my beer"


----------



## sLowEnd (Jun 28, 2022)

It's a shame about the price. It's priced way too close to the GTX 1650.


----------



## GoldenX (Jun 28, 2022)

Man NVENC is expensive.
Remember when I said the 6500 was setting an awful precedent for the low end market?


----------



## Lew Zealand (Jun 28, 2022)

The 6400 has at least a few things going for it:

- <50 W, for really weak PSUs
- True single-slot, low-profile card

So it can slot in _anywhere_ there's a PCIe slot, even an 8-year-old i5 SFF office PC. And it'll still be wayyy faster than the 1630, which can't even fit in the majority of these.

The problem is that the 6400 should be $99, _maybe_ $129 for the fit-anywhere convenience. So the 1630 should be $69-89, right where the 1030 was. These products can be OK, but only at the right price.


----------



## catulitechup (Jun 28, 2022)

AusWolf said:


> Gamers: "the 6400 is the worst graphics card of the decade"
> 
> Nvidia: "no gtx 1630 is the worst gpu in 2022"



Intel: "arc a310 hold my beer"


----------



## WeeRab (Jun 28, 2022)

And they said the RX6500 was dog.......


----------



## InVasMani (Jun 28, 2022)

Why is Nvidia launching a GPU at this performance level that gets trounced by AMD's RX 6400 with its x4 PCIe link, despite having an x16 PCIe interface it can't even leverage!? They've got a far greater R&D budget, and this is what they decide to launch? Are they overcompensating for their next GPU launch and its expected atrocious power draw?


----------



## ModEl4 (Jun 28, 2022)

Lew Zealand said:


> The 6400 has at least a few things going for it:
> 
> - <50 W, for really weak PSUs
> - True single-slot, low-profile card
> ...


To tell you the truth, I don't even remember Nvidia launching a single-slot, low-profile competitor in the last 5 years, except GP108-based ones at 30 W or less, so the RX 6400 essentially has no competition for what it offers performance-wise.
But at 53 W TBP it wouldn't be a good gaming experience in an SFF chassis (decoding is another matter). A friend of mine had an HP 8200 Elite i5-2400 SFF back in the day and tried a low-profile GTX 1050 Ti (dual-slot), and it had some heat problems, if I remember.


----------



## Lew Zealand (Jun 28, 2022)

ModEl4 said:


> To tell you the truth, I don't even remember Nvidia launching a single-slot, low-profile competitor in the last 5 years, except GP108-based ones at 30 W or less, so the RX 6400 essentially has no competition for what it offers performance-wise.
> But at 53 W TBP it wouldn't be a good gaming experience in an SFF chassis (decoding is another matter). A friend of mine had an HP 8200 Elite i5-2400 SFF back in the day and tried a low-profile GTX 1050 Ti (dual-slot), and it had some heat problems, if I remember.



I can give you an update on that in a week if you'd like. I have a pile of SFF OptiPlex 9020s back at work (I'm offsite this week), and hopefully one of 'em has an i7 in it, but I'll take the i5 if that's what I get.

Oh yeah, and a 6400. Also a slot-power-only 1050 Ti, but that's full height and has been in the MT OptiPlex 9020 I'm testing with now alongside the 6400, which even at PCIe 3 is better than expected. I expected notably worse than the 1050 Ti, but that hasn't been the case at all: generally 20% faster at playable settings (1080p ultra need not apply). Still early days with it, though, and I have the Sapphire 6400, which has a better cooler, though still in that tiny form factor. It gets to 72°C in the Dell MT at continuous 100% load, but just because that case has space inside doesn't mean it has decent airflow. Can't wait until next week to see how the SFF setup goes.


----------



## Jism (Jun 28, 2022)

I cant wait for what Intel Arc brings to the table, lol.


----------



## InVasMani (Jun 28, 2022)

The bar is low enough that even Intel can beat it at graphics.


----------



## Shatun_Bear (Jun 28, 2022)

This should be 25 quid.


----------



## efikkan (Jun 28, 2022)

InVasMani said:


> Why is Nvidia launching a GPU at this performance level that gets trounced by AMD's RX 6400 with its x4 PCIe link, despite having an x16 PCIe interface it can't even leverage!? They've got a far greater R&D budget, and this is what they decide to launch? Are they overcompensating for their next GPU launch and its expected atrocious power draw?


You, like so many others in here, see this product through the eyes of a gamer. Well, there are many non-gamers who need GPUs too. There are many PC owners who want to upgrade a fully working desktop with better display outputs, more displays or newer codecs. Or they just want a smooth desktop experience, or a non-gaming HTPC build. Products like this are perfect for that, as long as gaming isn't the intent.

My only objection here is the price. An MSRP of $99 would be much better.


----------



## Lew Zealand (Jun 28, 2022)

birdie said:


> @W1zzard
> 
> Could you please explain this? What's wrong with GTX 1660 Super and RTX 2060 both of which have 6GB of VRAM which is exactly the same as RX 5600 XT? GTX 1060 is suddenly 4 times faster than GTX 1660? Um, what?


You could almost argue it's an AMD vs. Nvidia driver bug/quirk, except the 1060 seems fine here. Inconsequential since it's 4K, but still an odd occurrence.


----------



## Chrispy_ (Jun 28, 2022)

looniam said:


> not sure where you got those "official" specs from.
> 
> nor does it account for the tolerances allowed . . just saying.


I got mine from the ATX spec. TBH the first result Google returned was ATX 2.2, and I didn't spend long hunting; I just hit Ctrl+F.

I think the tiny discrepancy is the height of the slot a card sits in. The PCI slot is physically taller than a PCIe slot, so I guess the card sits proud of the motherboard PCB by a few extra mm as a result. The spec remains unchanged and consistent either way: the top edge of the bracket is the max height according to the ATX spec, and by proxy also the PCI, PCIe, and AGP standards - all of which are valid cards for use in an ATX expansion slot.





Going back a few years, the reference/standard cards were all close to identical in height, which seemed to be about 2 mm taller than the slot cover. Possibly they manufactured them as big as the OEMs they were making them for would allow, with the extra couple of mm over the spec covering the heads of the screws on the slot covers or something. It can't be a coincidence that they're all practically identical over the course of the last 15 years, +/- 2 mm or so...

Nvidia: (reference card images)

ATI/AMD: (reference card images)
It's also worth noting that for many of the reference cards pictured above, the power connectors were on the short edge of the card, to prevent the cables violating the height restriction. That's something that has only changed in the last 5 years or so, and only on some cards. Ampere and RDNA2 are the first generations from both companies where the reference/Founders models have just completely given up on trying to adhere to the ATX size specification. Both the 6900 and 3090 are so far over spec that they're not even pretending to be close.


----------



## RedBear (Jun 28, 2022)

Thanks for the review, and congratulations on being the first to review this, let's euphemistically call it "weird", GPU.


InVasMani said:


> Why is Nvidia launching a GPU with this level of performance getting trounced by AMD RX6400 with it's x4 PCIE link bus width with a x16 PCIE bus width it can't even leverage!!? They've got a far greater R&D budget and this is what decide to launch? Are they overcompensating for their next GPU's launch and the atrocious power draw expected?


IMO it's simply because this GPU was designed before the recent collapse of the crypto-mining market and the consequent fall in GPU prices (so much for all the poor souls who said crypto wasn't the reason we couldn't find a GPU at MSRP). Even if it's worse than the RX 6400, it might still have been somewhat viable at $150 in 2021, partly because of Nvidia's brand. It's pure nonsense now, but it was nearly ready for release at this point.


----------



## AusWolf (Jun 28, 2022)

catulitechup said:


> Intel: "arc a310 hold my beer"


At least there's competition not just for the position of the best graphics card on the market, but also for the worst one.


----------



## ModEl4 (Jun 28, 2022)

Lew Zealand said:


> I can give you an update to that in a week if you'd like.  I have a pile of SFF Optiplex 9020s back @work (I'm offsite this week) and hopefully one of 'em has an i7 in it but I'll take the i5 if that's what I get.
> 
> Oh yeah, and a 6400.  Also a 1050Ti slot-power only, but that's full height and been in the MT Optiplex 9020 I'm testing with now with the 6400, which even at PCIe 3 is better than expected.  I expected notably worse than the 1050Ti but that hasn't been the case at all.  Generally 20% faster at playable settings (1080p ultra need not apply).  Still early days with it though and I have the Sapphire 6400, which has a better cooler, though still in that tiny form factor.  It gets to 72C in the Dell MT at continuous 100% load but just because that case has space inside doesn't mean it has decent airflow.  Can't wait until next week to see how the SFF setup goes.


It would be interesting for people who want to upgrade an old commercial SFF PC. (Although the 9020 was Dell's flagship chassis, above the 7020/3020, and they also had DT versions between the MT and SFF models - but not for the 9020 series, if I remember - so depending on the model, the results may not apply to every SFF case. Still very useful.)


----------



## AusWolf (Jun 28, 2022)

ModEl4 said:


> To tell you the truth, I don't even remember Nvidia launching a single-slot, low-profile competitor in the last 5 years, except GP108-based ones at 30 W or less, so the RX 6400 essentially has no competition for what it offers performance-wise.
> But at 53 W TBP it wouldn't be a good gaming experience in an SFF chassis (decoding is another matter). A friend of mine had an HP 8200 Elite i5-2400 SFF back in the day and tried a low-profile GTX 1050 Ti (dual-slot), and it had some heat problems, if I remember.


I have a low-profile 6400 in my HTPC, and I can confirm it's doing okay. Sure, it heats up to 70+ °C if I fire up a game, but who cares? The case actually has an 8 cm fan slot right under it; if I put a fan there, the card never exceeds 65 °C. Decoding is also fine, as it can handle all formats except AV1.


----------



## ModEl4 (Jun 28, 2022)

AusWolf said:


> I have a low-profile 6400 in my HTPC, and I can confirm it's doing okay. Sure, it heats up to 70+ °C if I fire up a game, but who cares? The case actually has an 8 cm fan slot right under it; if I put a fan there, the card never exceeds 65 °C. Decoding is also fine, as it can handle all formats except AV1.


Excellent!
What's the chassis/case?


----------



## looniam (Jun 28, 2022)

Chrispy_ said:


> I got mine from the ATX spec, *{SNIP}*


Huh. I posted the ATX spec, so... I don't think you did.

Actually, it's PCI-SIG, since ATX covers PSUs, not add-in cards (section 9 in the attachment).


----------



## AusWolf (Jun 29, 2022)

ModEl4 said:


> Excellent!
> What's the chassis/case?


It's an AeroCool CS-101.

Oh, and I keep it on the bottom shelf of an open cabinet, so it doesn't even enjoy completely unrestricted airflow.


----------



## Udyr (Jun 29, 2022)

ModEl4 said:


> To tell you the truth, i don't even remember Nvidia having a single slot low profile competitor that launched the last 5 years except GP108 based ones at 30W or less, so RX6400 has no competition essentially for what it offers performance-wise.


PNY's got your back, homie.

PNY GeForce® GTX 1650 XLR8 Gaming Overclocked Edition
The GeForce GTX 1650 is built with the breakthrough graphics performance of the award-winning NVIDIA Turing™ architecture. With up to 2X the performance of the GeForce GTX 950, it’s a supercharger for today’s most popular games, and even faster with modern titles.
www.pny.com


----------



## AusWolf (Jun 29, 2022)

Udyr said:


> PNY's got your back, homie.
> 
> 
> 
> ...


That's not low profile.


----------



## Udyr (Jun 29, 2022)

AusWolf said:


> That's not low profile.


Closest to what you can get in single slot for these GPUs.


----------



## AusWolf (Jun 29, 2022)

Udyr said:


> Closest to what you can get in single slot for these GPUs.


It's not even single slot. It only has a single slot backplate, but the cooler needs two slots, which is the worst construction, imo, as the backplate offers no structural support whatsoever (not that you need it for a plastic shroud, but still).


----------



## ModEl4 (Jun 29, 2022)

Udyr said:


> PNY's got your back, homie.
> 
> 
> 
> ...


For commercial SFF you will need low profile versions, like the MSI GTX 1650 GT V809-3250R (168x69x37mm) but the problem is the TBP imo (especially if you don't have warranty anymore for your system)


----------



## Lew Zealand (Jun 29, 2022)

Udyr said:


> PNY's got your back, homie.
> 
> 
> 
> ...



That's neither single slot nor low profile, which is the point about the RX 6400.  It does serve a specific set of use cases that this 1650 can't, even with its technical compromises.  It should just be $30-60 cheaper.


----------



## Udyr (Jun 29, 2022)

ModEl4 said:


> For commercial SFF you will need low profile versions, like the MSI GTX 1650 GT V809-3250R (168x69x37mm) but the problem is the TBP imo (especially if you don't have warranty anymore for your system)


You're correct, and the lowest TBP I've seen for these is 75W. However, depending on the SFF size, you could argue the PNY would fit some of them, because unfortunately it's either low profile but dual slot, or single slot but not low profile.

And to those saying this is not single slot/low profile: I'm aware of it. The original comment was more of a tongue-in-cheek (the homie should've given it away).


----------



## simlife (Jun 29, 2022)

this card is massively weaker than the ps4... from 2013.... a memory bandwidth of 96 GB/s... the ps4 was 176...


----------



## AusWolf (Jun 29, 2022)

Udyr said:


> And to those saying this is not single slot/low profile: I'm aware of it. The original comment was more of a tongue-in-cheek (the homie should've given it away).


Ah, so you're saying that the closest you can get to a single slot, low profile, passively cooled, sub-75 W GTX 1630 is a dual slot, normal height one with a fan and a 6-pin power connector. I get it.


----------



## Udyr (Jun 29, 2022)

AusWolf said:


> Ah, so you're saying that the closest you can get to a single slot, low profile, passively cooled, sub-75 W GTX 1630 is a dual slot, normal height one with a fan and a 6-pin power connector. I get it.


And where in ModEl4's comment does it say it has to be passively cooled? 
But we're getting unnecessarily off-topic now. Have a good one


----------



## grammar_phreak (Jun 29, 2022)

The POWER of a GTX 660ti..... 10 years later.


----------



## AusWolf (Jun 29, 2022)

Udyr said:


> And where in ModEl4's comment does it say it has to be passively cooled?
> But we're getting unnecessarily off-topic now. Have a good one


I didn't get your joke, you didn't get mine. I guess we're even now.



grammar_phreak said:


> The POWER of a GTX 660ti..... 10 years later.


Yep. It could be fine for an HTPC if it came in low profile without a power connector, and for $50-80 max. Though I have a feeling that all of this is a pretty steep thing to ask for.


----------



## ModEl4 (Jun 29, 2022)

simlife said:


> this card is massively weaker then the ps4... from 2013.... a memory bandwith of 96 GB/s... the ps4 was 176...


Probably you're joking, but actually the GTX 1630 is at least 35% faster than the PS4's GPU, and if you take into account the potential CPU difference (e.g. i5 8400 or i3 10100F vs. a 1.6 GHz 7-core Jaguar) at PS4 resolution (mostly 900p in recent years) & settings, the difference can be huge for any game that isn't optimized well for Jaguar (very few, i would guess)
Bandwidth isn't comparable: the PS4 has shared memory, and the GCN architecture needs a lot more memory bandwidth (the GTX 1650 Super achieves the same performance level as the RX 580 with half the bandwidth, and the RX 580 has far more bandwidth-saving features implemented than the 2013 PS4's APU)
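For what it's worth, the headline bandwidth numbers both posts quote follow directly from the published memory specs; here's a quick sketch (I'm going by the commonly listed figures of 12 Gbps GDDR6 on a 64-bit bus for the GTX 1630 and 5.5 Gbps GDDR5 on a 256-bit bus for the PS4):

```python
def mem_bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: data rate per pin times bus width in bytes."""
    return gbps_per_pin * bus_width_bits / 8

gtx1630 = mem_bandwidth_gbs(12.0, 64)   # 96.0 GB/s, dedicated to the GPU
ps4 = mem_bandwidth_gbs(5.5, 256)       # 176.0 GB/s, shared between CPU and GPU
```

So yes, the PS4's raw number is higher, but the PS4 figure is shared system memory, which is the whole point about GCN needing more of it.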


----------



## GeorgeMan (Jun 29, 2022)

That GPU could have been great if (a) it had released a couple of years ago, (b) it cost ~$79, (c) it had similar power requirements to the GT 1030 it replaces.
But unfortunately none of these conditions is met, so it's yet another case of stagnation: "You want 213% performance increase over the old gen? You pay 213% more money", just for the low-end GPU market now. Disgusting.
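To put numbers on that complaint, here's a tiny perf-per-dollar sketch; the prices are assumptions for illustration (a ~$79 GT 1030 and a ~$169 GTX 1630), not quotes:

```python
# Hypothetical illustration of "same performance per dollar, generations later".
def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Relative performance delivered per dollar spent."""
    return relative_perf / price_usd

old = perf_per_dollar(1.00, 79.0)    # GT 1030 as the 1.0x baseline (assumed price)
new = perf_per_dollar(2.13, 169.0)   # ~213% of the 1030's performance (assumed price)
# new / old ≈ 1.0, i.e. the generational jump buys you nothing per dollar
```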


----------



## Frick (Jun 29, 2022)

AusWolf said:


> Gamers: "the 6400 is the worst graphics card of the decade"
> 
> Nvidia: "hold my beer"



Gamers absolutely suck. The 6400 has problems, but performance is not one of them. It's basically a replacement for the RX 550, and seen as such, it's pretty great.

This card though... Not great. It uses way too much power, and it's way too expensive.


----------



## W1zzard (Jun 29, 2022)

birdie said:


> Could you please explain this? What's wrong with GTX 1660 Super and RTX 2060 both of which have 6GB of VRAM which is exactly the same as RX 5600 XT? GTX 1060 is suddenly 4 times faster than GTX 1660? Um, what?


I suspect it has to do with the memory management techniques used by these cards. Maybe, for performance reasons, they allocate memory in a certain way that's faster, but less space efficient, etc.


----------



## AusWolf (Jun 29, 2022)

For anyone interested (please don't be), I've just found the first card you can pre-order in the UK. It's more expensive than a 6400, a 6500 XT, or even a 1650!


----------



## progste (Jun 29, 2022)

How is this worse than a gtx 1650 while costing 150$??


----------



## laszlo (Jun 29, 2022)

Nvidia HQ: "the special 'gpu launch' operation is going as planned; we'll end this operation when AMD will stop fighting and surrender"


----------



## Chrispy_ (Jun 29, 2022)

looniam said:


> huh. i posted the ATX* spec, so . . . i don't think you did.
> 
> actually PCI-SIG since atx covers PSUs not add in cards.  (section 9 in attached)


You linked the PCI-Express SIG spec, which is nice if you're a motherboard maker, but my beef is with case compatibility and enclosures, which is clearly the realm of the ATX spec. Cases are built to ATX/mATX/mITX standards, not PCI-SIG standards.

It's potayto potahto anyway - the 4mm difference in slot height between ATX 2.2 and PCIe SIG is referring to the other end of the height dimension, which I'm not interested in since it isn't relevant for case/enclosure compatibility. All that matters (and my original point) is that cards aren't supposed to peek out above the expansion bracket. That's the spec, whether you use ATX, PCIe, VESA, or anything else; All standards agree on that.



progste said:


> How is this worse than a gtx 1650 while costing 150$??



I think if it costs more than a 1650, people will just buy the 1650 instead, right? It's the superior product by nearly a factor of two, there's a surplus of them on the used market for ~$100 each, and it has the same silicon and feature set.

If you're worried about new vs used, I doubt any of these 1630s are made to high-quality standards, they all look like bottom-of-the-barrel dirt quality using very basic cooling, probably the cheapest sleeve-bearing fans they could source, and likely the bare minimum of PCB quality/components because margins are slim at this end of the market. I'd take a well-made used 1650 over a brand new 1630 any day of the week. Hell, with a bit of searching I could probably afford to buy _two_ used 1650s for the price of a new 1630 because that's how bad the MSRP is on these....

These are past listings, so this is a random selection (ebay's ordering) of what people actually paid for working GTX 1650s:


----------



## prtskg (Jun 29, 2022)

Never thought the RX 6400 would start looking so good so soon. 
Considering Intel Arc supports AV1 decode and encode, I think AMD and Nvidia should release new low-end GPUs with such capability.


----------



## Tom Yum (Jun 29, 2022)

I don't understand the point of saying 'well, at least it has PCIe x16'. This thing is so slow that even a hobbled RX 6400 on PCIe 3.0 x4 would absolutely trounce it, while using less power and costing the same. This thing has zero redeeming features; what a bizarre launch from NVIDIA!


----------



## AusWolf (Jun 29, 2022)

progste said:


> How is this worse than a gtx 1650 while costing 150$??


Nvidia had to cut more GPU components this time. More effort = higher price.  



Chrispy_ said:


> I think if it costs more than a 1650, people will just buy the 1650 instead, right? It's the superior product by nearly a factor of two, there's a surplus of them on the used market for ~$100 each, and it has the same silicon and feature set.
> 
> If you're worried about new vs used, I doubt any of these 1630s are made to high-quality standards, they all look like bottom-of-the-barrel dirt quality using very basic cooling, probably the cheapest sleeve-bearing fans they could source, and likely the bare minimum of PCB quality/components because margins are slim at this end of the market. I'd take a well-made used 1650 over a brand new 1630 any day of the week. Hell, with a bit of searching I could probably afford to buy _two_ used 1650s for the price of a new 1630 because that's how bad the MSRP is on these....
> 
> These are past listings, so this is a random selection (ebay's ordering) of what people actually paid for working GTX 1650s:


In the UK (at the moment), you can buy a new 1650, or even a 6500 XT for the price of the 1630. Nvidia will have to drop the price if they want to sell any of these.


----------



## Ruined Mind (Jun 29, 2022)

Hello. I need a low-profile dual-slot card. (Gigabyte has a low-profile version of the 1630.)

I'm afraid to buy a used card, because of the lack of a warranty. Even if I'd consider a used one, I won't buy a 1650, because I've noticed reviews about the low-profile versions of the 1650 from people who said the fans were so loud that they decided to return them and stay with their much quieter 1050 Ti cards. So, if I'd consider a used one, I'd choose the 1050 Ti. I have a PCI-Express 2.0 slot. Please don't tell me to buy a new computer. My i5-2400 processor hasn't been a bottleneck in the games I've played. I've checked with MSI Afterburner.

Regardless of whether I'd buy a used one, I still want an answer to this question from the professionals of this website, especially the leader called "W1zzard":
Would an RX 6400 be so limited in a PCI-Express 2.0 x4 configuration that the 1630 would be faster, or would the RX 6400 still be faster than the 1630, even in a nasty 2.0 x4 situation?


----------



## Chrispy_ (Jun 29, 2022)

Ruined Mind said:


> Hello. I need a low-profile dual-slot card. (Gigabyte has a low-profile version of the 1630.)
> 
> I'm afraid to buy a used card, because of the lack of a warranty. Even if I'd consider a used one, I won't buy a 1650, because I've noticed reviews about the low-profile versions of the 1650 from people who said the fans were so loud that they decided to return them and stay with their much quieter 1050 Ti cards. So, if I'd consider a used one, I'd choose the 1050 Ti. I have a PCI-Express 2.0 slot. Please don't tell me to buy a new computer. My i5-2400 processor hasn't been a bottleneck in the games I've played. I've checked with MSI Afterburner.
> 
> ...


The fans on the 1630 aren't likely to be much quieter. The 1650 is the same piece of silicon running at similar clockspeeds and voltages so you can see from the power consumption chart in the review that it pulls a good fraction of the 1650's power despite providing far less performance.

What are you trying to do with your low-profile card - 3D gaming or just encode/decode? If you're not after 3D performance - and that's going to suffer in a PCIe 2.0 x4 slot anyway, then a GT 1030 can be had with a far lower power draw than anything in the 16-series. There are low-profile fully-passive variants, even.

EDIT:
Oh hey, I just had a thought - if you're thinking about spending $150 on a 1630, but your board (and CPU) are so old that PCIe 2.0 is all you have, then perhaps you should consider replacing the whole thing with a newer CPU that has integrated graphics. You can pick up Intel 10th Gen on clearance discount for a very low price. The IGP sucks for gaming, but it'll do your display outputs and has modern codec support for hardware encode/decode. Alternatively, $150 is about the going rate for a Ryzen 4600G which is pretty capable in both the CPU and IGP department. You'll still need a motherboard and RAM but presumably a $250 budget for the whole PC isn't unreasonable if you were willing to part with $150 for a miserable little GTX 1630.


----------



## Ruined Mind (Jun 29, 2022)

I want to play games. My GT 1030 has been enough to play all of the games I've wanted so far, but it won't be enough for future games. It has been enough because I use low settings and a low resolution. My screen's resolution is 1360 x 768. A speed boost to around 2 to 2.5 times the power of the GT 1030 would be great. The RX 6400 would be even faster, but I need to know whether it would actually be slower than the 1630 because of the PCI-E 2.0 x4 limitation.

To be clear, I have a 2.0 x16 slot, but the RX 6400 has only four lanes, while the 1630 could use all 16 lanes.


----------



## ppn (Jun 29, 2022)

I don't understand why he reviewed this. There is no review of the 1030, and the same should go for the 1630.

you can play chess on it i guess, but anything else is causing pain, and some people may enjoy this, but I wouldn't touch anything less than 1660S.

GTX 3030 based on the never released GA107 that was supposed to be a 3050 would have been a far better placeholder for the lower end until 4030.


----------



## ModEl4 (Jun 29, 2022)

ppn said:


> I don't understand why he reviewed this. there is no review of 1030, same should go for 1630
> 
> you can play chess on it i guess, but anything else is causing pain, and some people may enjoy this, but I wouldn't touch anything less than 1660S.
> 
> GTX 3030 based on the never released GA107 that was supposed to be a 3050 would have been a far better placeholder for the lower end until 4030.


There are a lot of people who are interested in a sub-$160 VGA product, me included.
Either they live in low/middle-income countries, or are poor, or they just don't want to pay much because they game casually, or they just want to know in order to recommend something to their parents/friends, or whatever.
The only problem with this product is pricing; no one is surprised by the performance level of such a cut-down iteration of TU117. If this product were at $119 it would be an alternative solution in the current market conditions, and at $99 the demand would be great.
Depending on the games tested, it should be nearly 1.35X faster than a 5700G at 1080p Ultra and at least 1.1X faster at 1080p minimum settings.
Edit: actually it's more than the above; on average at least 1.4X faster at FHD Ultra and at least 1.32X at FHD min with a same-level CPU, but tested without the older esports titles.


----------



## Chrispy_ (Jun 29, 2022)

Ruined Mind said:


> I want to play games. My GT 1030 has been enough to play all of the games I've wanted so far, but it won't be enough for future games. It has been enough because I use low settings and a low resolution. My screen's resolution is 1360 x 768. A speed boost to around 2 to 2.5 times the power of the GT 1030 would be great. The RX 6400 would be even faster, but I need to know whether it would actually be slower than the 1630 because of the PCI-E 2.0 x4 limitation.
> 
> To be clear, I have a 2.0 x16 slot, but the RX 6400 has only four lanes, while the 1630 could use all 16 lanes.











						AMD Radeon RX 6500 XT PCI-Express Scaling
					

The AMD Radeon RX 6500 XT comes with only a narrow PCI-Express x4 interface. In this article, we took a closer look at how performance is affected when running at PCI-Express 3.0; also included is a full set of data for the academically interesting setting of PCI-Express 2.0.




					www.techpowerup.com
				



The RX 6400 is likely to also lose about one-third of its performance in a PCIe 2.0 slot, making it a close match for the GTX 1630 in your situation. Those charts show that the lower the resolution is, the bigger the performance drop from using a PCIe 2.0 slot!

Here's a new question for you - do you have cooling/airflow limitations, power supply limitations, or physical size limitations? What's your reasoning for wanting such a low-power card?

If it's cooling - a 1060 founders edition or RX480 reference card are good candidates that dump all their heat outside the case and therefore shouldn't get hot+noisy.
If it's power, there are plenty of QUIET GTX 1650 cards that do not require additional power plugs.
If it's physical size and you *must* have a low-profile card, the Gigabyte 1630 low profile _may actually be_ your best bet with a PCIe 2.0 slot.
There are plenty of people here who would be willing to suggest something - maybe worth starting a new forum thread for it with more details...
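For a sense of the raw link budget behind that one-third performance drop, here's a quick sketch using the nominal per-direction PCIe figures (real-world throughput is lower than these published rates):

```python
# Nominal per-lane, per-direction PCIe bandwidth in GB/s
# (Gen 2: 5 GT/s with 8b/10b encoding; Gen 3: 8 GT/s and Gen 4: 16 GT/s with 128b/130b).
PER_LANE_GBPS = {"2.0": 0.5, "3.0": 0.985, "4.0": 1.969}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Total nominal one-way bandwidth for a PCIe link."""
    return PER_LANE_GBPS[gen] * lanes

rx6400_pcie2 = link_bandwidth("2.0", 4)    # x4 card in a Gen 2 slot -> 2.0 GB/s
gtx1630_pcie2 = link_bandwidth("2.0", 16)  # x16 card in the same slot -> 8.0 GB/s
rx6400_pcie4 = link_bandwidth("4.0", 4)    # the 6400 at its design point -> ~7.9 GB/s
```

In a PCIe 2.0 board, the x4-limited RX 6400 gets roughly a quarter of the host bandwidth the x16 GTX 1630 does, and only about a quarter of what it was designed for, which is why the scaling article above matters so much here.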


----------



## looniam (Jun 29, 2022)

Chrispy_ said:


> You linked PCI-Express SIG spec, which is nice if you're a motherboard maker, but my beef is with case compatibility and enclosures which is clearly the realm of the ATX spec. Cases are built to ATX/mATX/MITX standards, not PCI-SIG standards.


what? no. add-in cards fall under PCI-SIG. ATX specs the PSU only, and Wikipedia is most certainly not a source.


Chrispy_ said:


> All that matters (and my original point) is that cards aren't supposed to peek out above the expansion bracket. That's the spec, whether you use ATX, PCIe, VESA, or anything else; All standards agree on that.


as above, there is one standard for add-in cards and that's PCI-SIG, period. maybe look at them:





your info is clearly wrong once you account for the space difference where there is no edge connector:


 
114.55-16.15 is 98.4 as illustrated here:





cases and their openings (where you have your beef) are:



 

i'll also point out how every schematic that has a bracket attached to the pcb shows the pcb is higher than the bracket. PCIE-SIG covers all that. _you are more than welcome to look at the latest ATX standards (attached) and see add-in cards are nowhere to be found._ read pages 14-15 and you'll see all the other related PCIE-SIG documents/standards.







problem is, PCIE-SIG keeps all that behind a paywall, which leads to what you are doing: relying on unreliable "popular wisdom" information. read the specs - there is a whole section that covers full/half-height I/O brackets.


----------



## defaultluser (Jun 29, 2022)

Ruined Mind said:


> I want to play games. My GT 1030 has been enough to play all of the games I've wanted so far, but it won't be enough for future games. It has been enough because I use low settings and a low resolution. My screen's resolution is 1360 x 768. A speed boost to around 2 to 2.5 times the power of the GT 1030 would be great. The RX 6400 would be even faster, but I need to know whether it would actually be slower than the 1630 because of the PCI-E 2.0 x4 limitation.
> 
> To be clear, I have a 2.0 x16 slot, but the RX 6400 has only four lanes, while the 1630 could use all 16 lanes.



Yeah, then you will probably see similar performance to the 6400. Or, you can step things up, and likely go significantly faster than the 1630 with a full-fat 1650!


----------



## W1zzard (Jun 29, 2022)

Added a test run at 1080p lowest possible settings in all games









						Gainward GeForce GTX 1630 Ghost Review - Challenging the AMD RX 6400
					

The NVIDIA GeForce GTX 1630 launches today. The new card is targeted at the sub-$200 segment and goes head-to-head with AMD's Radeon RX 6400 and Intel's ARC A380. Unlike the Radeon, the GTX 1630 has support for the full PCIe x16 interface.




					www.techpowerup.com


----------



## Chrispy_ (Jun 29, 2022)

looniam said:


> what? no. add-in cards fall under PCI-SIG. ATX specs the PSU only, and Wikipedia is most certainly not a source.
> 
> as above, there is one standard for add-in cards and that's PCI-SIG, period. maybe look at them:
> 
> problem is, PCIE-SIG keeps all that behind a paywall, which leads to what you are doing: relying on unreliable "popular wisdom" information. read the specs - there is a whole section that covers full/half-height I/O brackets.


So what you're saying is that this 1630 might _actually_ be compliant with the spec if you look at some paywalled info? Fine.

That's great, but it doesn't change a damn thing in the real world, because that top-edge connector effectively negates the point of the spec even having a maximum height in the first place. Intel and the PCI-SIG can pedantically disagree by a few millimeters as much as they want, but the end result is still the same: the GPU won't fit in someone's case because of where the connector is oriented, and you're saying that is omitted from the spec altogether?

What a dumpster fire of a spec if it imposes height restrictions but then fails to provide any guidelines for plugs and connectors that are *mandatory* for the card to operate, can be literally _anywhere_ on a card (*cough* 3000-series FE cards), and can completely violate those very locked-down height restrictions that the spec covers in so much detail.

If this were a mainstream gaming GPU I'd give it a pass; Gaming PC cases are typically huge and fit almost anything - but this card is a niche product that brings nothing attractive to the table other than its low power consumption and modern encode/decode hardware. It's therefore of far more interest to the HTPC crowd than the gaming crowd, and the HTPC market is chock-full of slim, low-profile, space-constrained, cooling-constrained, incompatible cases that cannot fit your typical gaming GPU these days.

For what it's worth, all is not lost with dumb PCIe power connector locations. Sometimes, you can make an HTPC work using these, assuming the backplate on the card isn't too chunky:


----------



## looniam (Jun 29, 2022)

Chrispy_ said:


> So what you're saying is that this 1630 might _actually_ be compliant with the spec if you look at some paywalled info? Fine.


no. i am saying it is compliant to specs per the documentation that _i have obtained over the years of scouring the internet._ i know firsthand google/wikipedia will not give accurate info on what is normally behind a paywall, which is why there is so much misinformation. (education/student subscriptions come in handy!)
paywalls<determination=FTW! all it takes is time.  

btw, i have attached them to each post mentioned, so there's no paywall in your way. so what's the problem?


Chrispy_ said:


> That's great, but it doesn't change a damn thing in the real world, because that top-edge connector effectively negates the point of the spec even having a maximum height in the first place. Intel and the PCI-SIG can pedantically disagree by a few millimeters as much as they want, but the end result is still the same: the GPU won't fit in someone's case because of where the connector is oriented, and you're saying that is omitted from the spec altogether? What a dumpster fire of a spec!


it is all listed in the PCIE-SIG standards; again, intel's ATX PSU specs have NOTHING to do with this! 

it's all there, but even trying to spoon-feed you is to no avail


----------



## Chrispy_ (Jun 29, 2022)

W1zzard said:


> Added a test run at 1080p lowest possible settings in all games
> 
> 
> 
> ...


Big Oof.
Even at the lowest possible settings it sucks.
Let's assume you have a budget FreeSync monitor and you can accept 48 fps instead of 60 fps. That's still only a third of the games tested, and that's an AVERAGE, not a 99th percentile, so you're still going to get dropped frames and a stutterfest all over the place :\



looniam said:


> no. i am saying it is compliant to specs per the documentation that _i have obtained over the years of scouring the internet._ i know firsthand google/wikipedia will not give accurate info on what is normally behind a paywall, which is why there is so much misinformation. (education/student subscriptions come in handy!)
> paywalls<determination=FTW! all it takes is time.
> 
> btw, i have attached them to each post mentioned, so there's no paywall in your way. so what's the problem?
> ...


Alright captain pedantic, your years of trying to find obscure information have merited you the technical win but I still don't know what point you're trying to make.

My point is crystal clear, I hope; Overly-tall cards, or cards that become tall once you plug in the PCIe connector simply don't fit in many slim cases, which are prevalent in HTPC use.

I'm not sure why you're so obsessed about a 4mm discrepancy between the ATX spec and the PCIE-SIG spec when it doesn't answer any questions that people have asked, or solve any problems that real-world examples encounter.

So, politely, what is your point please?


----------



## looniam (Jun 29, 2022)

Chrispy_ said:


> Alright captain pedantic, your years of trying to find obscure information have merited you the technical win but I still don't know what point you're trying to make.


first of all let me remind you of your first post before resorting to name calling,







Chrispy_ said:


> My point is crystal clear, I hope; Overly-tall cards, or cards that become tall once you plug in the PCIe connector simply don't fit in many slim cases, which are prevalent in HTPC use.


if you would read further (notes are important!), you would see how that is all addressed. you can have the view that the card is too tall for some use cases, but saying it doesn't comply with specs is wrong.


Chrispy_ said:


> I'm not sure why you're so obsessed about a 4mm discrepancy between the ATX spec and the PCIE-SIG spec when it doesn't answer any questions that people have asked, or solve any problems that real-world examples encounter.


i am obsessed with 4mm? refer to the above image. and again, _ATX specs have nothing to do with this_ - there are several pages in PCIE-SIG's CEM on brackets and designs to fit in cases, and it points out what to avoid. and i'll point out that not all case manufacturers adhere to the specs themselves, or that the mobo standoff heights are not always appropriate, if you want real-world problems.


Chrispy_ said:


> So, politely, what is your point please?


you're not correct? the info you are relying on is not accurate? i thought that was clear.


----------



## ModEl4 (Jun 29, 2022)

Hi @W1zzard, thanks for the effort!
The 1080p min test is absolute minimum in every setting with AF at 16X, on a 5800X, correct?
It gives nearly 2.35X vs. Ultra in average fps score. I wonder, when the cross-gen period ends, how it will affect the Ultra/min difference (it will probably shrink a little bit)


----------



## Chrispy_ (Jun 29, 2022)

looniam said:


> first of all let me remind you of your first post before resorting to name calling,
> 
> 
> 
> ...


I think in this case, "captain pedantic" isn't name calling but an accurate title. If you are offended rather than pleased by that title then you have some reflecting to do because you have been going out of your way to earn it in this thread. Everyone appreciates accuracy and meticulous attention to detail sometimes, but this is not one of those times.

Look, I understand that those 4mm of clearance above the red line are actually within spec, and that *I was wrong* to use an older, deprecated ATX spec but it doesn't really change anything about the point I'm trying to make, and it doesn't solve the problem for the many of us using slim HTPC cases in the real world. Whether the blame lies with the case manufacturer or the SIG doesn't matter. Trying to get a case that fits your aesthetics under a TV with limited dimensions and cooling in what is typically a short-depth piece of furniture is not easy. We buy what's on the market and if the manufacturer doesn't 100% comply with a specific spec, we can't just send it back and say "hey, please re-tool your entire production line because this card that was released 4 years after your case was released doesn't fit any more".

You still seem hung up on this pedantic irrelevance that a minor deviation in height is the issue - and it has nothing to do with my point. Real world height is what it is, including the airgap required for the heatsink to exhaust and the additional height any mandatory cables require. We're already off-topic enough and bickering about minor differences in spec really isn't adding anything to this review.


----------



## W1zzard (Jun 29, 2022)

ModEl4 said:


> The 1080p min test, is absolutely minimum in every setting and the AF at 16X on a 5800X, correct?


absolutely minimum everything including AF. So AF set to trilinear where available. Exact same test system


----------



## looniam (Jun 29, 2022)

Chrispy_ said:


> I think in this case, Captain pedantic isn't name calling but an accurate title. If you are offended rather than pleased by that title then you have some reflecting to do because you have been going out of your way to earn it in this thread. Everyone appreciates accuracy and meticulous attention to detail sometimes, but this is not one of those times.


you do not speak for everyone, i am more than sure the peanut gallery appreciates getting docs that are not publicly accessible and how to get them.


Chrispy_ said:


> Look, I understand that those 4mm of clearance above the red line are actually within spec, and that *I was wrong* to use an ATX spec


gee would that have been so hard to realize in the beginning?


Chrispy_ said:


> but it doesn't really change anything about the point I'm trying to make, and it doesn't solve the problem for the many of us using HTPC cases in the real world.
> 
> *You *are hung up on this pedantic irrelevance about a 4mm deviation between the spec I found and the actual current 2022 spec, not me, and it has nothing to do with my point.


look pal, i pointed everything out in my first reply; this whole conversation since then is also on you. if you really like titles, i like to hand out knucklehead from time to time.

and btw, no i cannot find the newest CEM since what i posted,_ so if you have it_, then by all mean share.   

until then nice chat.


----------



## Chrispy_ (Jun 29, 2022)

looniam said:


> look pal, i pointed everything out in my first reply; this whole conversation since then is also on you. if you really like titles, i like to hand out knucklehead from time to time.


LOL what? My first response to you was this:


Chrispy_ said:


> TBH the first result Google returned was ATX 2.2 and I didn't spend long hunting, I just hit CTRL+F


I'm saying right there, at the beginning of this entertaining discussion that I didn't really look too hard and admitting it's an old spec. I'm not pretending that I care or that I know better. If that makes me a knucklehead according to you then I can live with that.

Out of interest, what's a CEM? I'm 99.9% certain I don't have the newest one of them.


----------



## catulitechup (Jun 29, 2022)

W1zzard said:


> Added a test run at 1080p lowest possible settings in all games
> 
> 
> 
> ...



@W1zzard 

thanks for your results, but this only reconfirms that 1080p is too much for this card and others like it; a 720p test would be more interesting and more usable than 1080p


----------



## W1zzard (Jun 29, 2022)

catulitechup said:


> 720p


Are people actually playing games at 720p? try it, it looks terrible


----------



## looniam (Jun 29, 2022)

Chrispy_ said:


> LOL what? My first response you was this:
> 
> I'm saying right there, at the beginning of this entertaining discussion that I didn't really look too hard and admitting it's an old spec. I'm not pretending that I care or that I know better. If that makes me a knucklehead according to you then I can live with that.


well i don't really get into he said/he said narratives of what is already posted. just reread the comments for any questions. 


Chrispy_ said:


> Out of interest, what's a CEM? I'm 99.9% certain I don't have the newest one of them.


*C*ard *E*lectro*M*echanical Specifications, which are listed in the newest ATX 3.0 guide:




maybe read it?



that's the problem: you had argued as if you did have them.

glad we got that cleared up have a good day.


----------



## Ruined Mind (Jun 29, 2022)

W1zzard said:


> Are people actually playing games at 720p? try it, it looks terrible


I always use either the 1280 x 720 resolution or the 1360 x 768 resolution. My screen's resolution is 1360 x 768. Using the 1280 x 720 resolution and enlarging it to fill the 1360 x 768 canvas looks fine to me.


----------



## Chrispy_ (Jun 29, 2022)

W1zzard said:


> Are people actually playing games at 720p? try it, it looks terrible


_Just about_ acceptable on a 13" laptop, which is also the only class of device likely to be running solely UHD Graphics 620 with a single-channel of soldered DDR4-2400.
It's not pretty, but sometimes it's _necessary_.

Thankfully most games have resolution scaling these days which at least renders the HUD and UI at native resolution. FSR/DLSS are better but if you have to run at 720p your hardware probably isn't powerful enough to use either of those options.
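The point about dropping render resolution on weak hardware comes down to simple pixel arithmetic: shader workload scales roughly with the number of pixels rendered per frame. A back-of-the-envelope sketch (standard 16:9 resolutions, not tied to any particular game or GPU):

```python
# Rough pixel-count comparison: per-frame shading cost scales roughly
# with the number of pixels rendered, so 720p is less than half the
# workload of 1080p.

def pixels(width: int, height: int) -> int:
    """Total pixels rendered per frame at a given resolution."""
    return width * height

native = pixels(1920, 1080)  # 2,073,600 pixels at 1080p
low = pixels(1280, 720)      # 921,600 pixels at 720p

print(f"720p renders {low / native:.0%} of the pixels of 1080p")
# -> 720p renders 44% of the pixels of 1080p
```

This is why resolution scaling (rendering the 3D scene at 720p while keeping the HUD at native resolution) recovers so much performance on entry-level cards.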


----------



## AusWolf (Jun 29, 2022)

Ruined Mind said:


> Hello. I need a low-profile dual-slot card. (Gigabyte has a low-profile version of the 1630.)
> 
> I'm afraid to buy a used card, because of the lack of a warranty. Even if I'd consider a used one, I won't buy a 1650, because I've noticed reviews about the low-profile versions of the 1650 from people who said the fans were so loud that they decided to return them and stay with their much quieter 1050 Ti cards. So, if I'd consider a used one, I'd choose the 1050 Ti. I have a PCI-Express 2.0 slot. Please don't tell me to buy a new computer. My i5-2400 processor hasn't been a bottleneck in the games I've played. I've checked with MSI Afterburner.
> 
> ...


Hi there. 

Low profile 1650 variants can be good or bad. I used to have an Asus one that was great. I still regret selling it to this day. I've heard Gigabyte and MSI are crap, but can't confirm it.

As for the 6400 in a pci-e 2.0 board, its performance will hugely depend on the game you wish to play. Some of them will suffer no performance hit, some of them might be crippled badly. If you let me know what you want to play, and if I happen to have it, I'll gladly test it for you and PM you the results.


----------



## catulitechup (Jun 29, 2022)

W1zzard said:


> Are people actually playing games at 720p? try it, it looks terrible



yeah, many people are still using 720p (old machines, where the gtx 1630 is supposed to offer an improvement), especially with weak gpus like the gtx 1630 and similar cards

1080p is too much for this hardware


----------



## Athlonite (Jun 30, 2022)

@Chrispy_  & @looniam if you two want to argue, take it to your DMs; otherwise behave like grown-ups and stick to the topic


----------



## Mussels (Jun 30, 2022)

On the AV1 and video streaming

Literally, on PC only YouTube lets you do 4K content easily.
Netflix requires an add-on HEVC codec to enable 4K, which has its own conditions and limitations - and they will sooner or later do the same with AV1.
Disney+ is still locked at 720p even in the Windows Store app.

It's a nightmare situation where modern CPUs block you from playing Blu-ray and modern GPUs can block you from playing streaming content in 4K, so anything that makes that situation more complex is just not fun.

(intel 6th gen through 10th gen work for 4k blu ray and 4k streaming, but nothing older or newer can do both)


----------



## the ram (Jun 30, 2022)

Is it possible to flash the BIOS with a gtx 1650 one? Or something?


----------



## Udyr (Jun 30, 2022)

W1zzard said:


> Are people actually playing games at 720p? try it, it looks terrible


That +/- 12% under 1080p should greatly benefit from a product like this, at the right price. Unfortunately it's a no-go for them at the moment.


----------



## Ruined Mind (Jun 30, 2022)

Udyr said:


> That +/- 12% under 1080p should greatly benefit from a product like this, at the right price. Unfortunately it's a no-go for them at the moment.


The percentage may be higher. I haven't ever received the notification from the Steam program that asks people to contribute to the Hardware Survey, so I'm not represented in Steam's statistics. Imagine how many other very active Steam users haven't received Steam's survey.


----------



## AusWolf (Jun 30, 2022)

Udyr said:


> That +/- 12% under 1080p should greatly benefit from a product like this, at the right price. Unfortunately it's a no-go for them at the moment.


Nearly 6% on 1366 (which is basically a laptop 720p), and 2% on 900p. That's 8%. While not the majority, I wouldn't write them off, yet.


----------



## Udyr (Jun 30, 2022)

Ruined Mind said:


> The percentage may be higher. I haven't ever received the notification from the Steam program that asks people to contribute to the Hardware Survey, so I'm not represented in Steam's statistics. Imagine how many other very active Steam users haven't received Steam's survey.


It is definitely higher because of this, and there are certainly more people out there who don't use Steam at all and aren't represented in this statistic.



AusWolf said:


> Nearly 6% on 1366 (which is basically a laptop 720p), and 2% on 900p. That's 8%. While not the majority, I wouldn't write them off, yet.


I wouldn't either. That's a high number considering the amount of gamers in the world.


----------



## ppn (Jun 30, 2022)

In some games - Odyssey and Warzone - it seems that the lowest settings give worse FPS than one step above, maybe because the game tries to downscale and that requires additional work, making things worse. I noticed this on both my GTX 760 and 2070, I think. On the GTX 760 I now play mostly at 720p or 900p whenever possible, but it's not worth it, simply torture. This card replaces integrated video and should be priced accordingly to make sense - like the difference between the i5-12100 and 12100F, plus the 4GB buffer.


----------



## kiakk (Jun 30, 2022)

I expected slightly better 4K performance.


----------



## AusWolf (Jun 30, 2022)

ppn said:


> In some games - Odyssey and Warzone - it seems that the lowest settings give worse FPS than one step above, maybe because the game tries to downscale and that requires additional work, making things worse. I noticed this on both my GTX 760 and 2070, I think. On the GTX 760 I now play mostly at 720p or 900p whenever possible, but it's not worth it, simply torture. This card replaces integrated video and should be priced accordingly to make sense - like the difference between the i5-12100 and 12100F, plus the 4GB buffer.


It's torture on a higher resolution monitor, but it's normal on a small (laptop) panel that has it as its native resolution.


----------



## GeorgeMan (Jun 30, 2022)

It's always good to see a low-end product review here. Not everyone is interested in playing the latest AAA games in 4K; there are many more casual gamers, or company workers who hold LAN parties after their working hours, at work  
I, personally, swore never to buy a GPU again as long as they are this expensive. Everything seems pointless to me after buying a brand new 1080 Ti FTW3 with a 5-year warranty extension directly from EVGA for less than 600€ in 2018. So if I were in the GPU market right now, I'd be very interested in buying a low-end GPU just to play casual stuff until this paranoia ends.


----------



## W1zzard (Jun 30, 2022)

the ram said:


> Is it possible to flash the BIOS with a gtx 1650 one? Or something?


It might be possible to flash, but you can't unlock the GPU cores or increase the memory bus


----------



## the ram (Jun 30, 2022)

And would flashing the card change anything, like improve performance?


----------



## Valantar (Jun 30, 2022)

the ram said:


> And would flashing the card change anything, like improve performance?


Why would it?


----------



## AusWolf (Jun 30, 2022)

the ram said:


> And would flashing the card change anything, like improve performance?


If you can't enable extra shader cores, then it'll do absolutely nothing. Even if you can, you'll be limited by the 64-bit memory bus.

People, please stop believing the fairy tale that BIOS flashing does anything good for your graphics card! In the best case it does nothing, and in the worst case it can completely break the card. The factory BIOS is fine, leave it alone!
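The memory-bus ceiling is easy to quantify: peak bandwidth is just bus width times per-pin data rate. A minimal sketch using the published figures for the GTX 1630 (64-bit, 12 Gbps GDDR6) and the GDDR5 GTX 1650 (128-bit, 8 Gbps):

```python
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) times per-pin
    data rate (Gbps), divided by 8 to convert bits to bytes."""
    return bus_width_bits * data_rate_gbps / 8

gtx_1630 = mem_bandwidth_gbs(64, 12.0)   # 96.0 GB/s
gtx_1650 = mem_bandwidth_gbs(128, 8.0)   # 128.0 GB/s

print(f"GTX 1630: {gtx_1630:.0f} GB/s vs GTX 1650 (GDDR5): {gtx_1650:.0f} GB/s")
```

Even with every shader enabled, a flashed card would still feed them through a 96 GB/s pipe, a third less than the 1650's, so the bottleneck simply moves rather than disappearing.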


----------



## mahirzukic2 (Jun 30, 2022)

Leave Britney I mean BIOS alone!!!


----------



## AdmiralThrawn (Jul 7, 2022)

27 FPS in The Witcher 3 (2015).

This is a total embarrassment. The fact that this card is in the GeForce lineup is a joke. You cannot even play 7-year-old games at 1080p. I guess if you like playing Minesweeper in 4K for 200 dollars, this card is for you.

I hate being so negative, but it's hard to find something positive here.


----------



## ixi (Jul 7, 2022)

Nvidia to AMD. Here, hold my beer for some time.


----------



## Valantar (Jul 7, 2022)

ixi said:


> Nvidia to AMD. Here, hold my beer for some time.


Given the performance of this GPU, AMD would struggle to catch Nvidia's beer as the arm holding it would be moving so choppily and unevenly


----------



## Forza.Milan (Jul 8, 2022)

This is blasphemy! This is madness!


----------



## Udyr (Jul 8, 2022)

Forza.Milan said:


> This is blasphemy! This is madness!


----------



## Athlonite (Jul 9, 2022)

Valantar said:


> Given the performance of this GPU, AMD would struggle to catch Nvidia's beer as the arm holding it would be moving so choppily and unevenly


----------



## Minxie (Jul 9, 2022)

AdmiralThrawn said:


> I hate being so negative, but it's hard to find something positive here.


Ah don't worry, while forums like these usually love to be cynical about everything, this is a valid thing to be negative about. This thing shouldn't even exist.


----------



## rvalencia (Jul 11, 2022)

Low-end GPUs need Genshin Impact benchmarks.


----------

