# NVIDIA RTX A2000



## W1zzard (Mar 3, 2022)

NVIDIA's RTX A2000 is the fastest graphics card that fits into a low-profile form factor. It's fast enough for 1080p Full HD gaming at maximum details and achieves that on 75 W of slot power alone, which makes it even more energy-efficient than AMD's 6 and 7 nanometer RDNA2 graphics cards.

*Show full review*


----------



## DeeJay1001 (Mar 3, 2022)

And you can't buy them anywhere because it's one of the best mining cards on the market right now


----------



## lexluthermiester (Mar 3, 2022)

DeeJay1001 said:


> And you can't buy them anywhere because it's one of the best mining cards on the market right now


Nonsense. You can't buy them anywhere because they are not consumer cards. They are meant for small form-factor workstations and enterprise customers.



W1zzard said:


> Fan speed can't be reduced below 30%


I do not count this as a downside. To me, that is a positive feature: in cramped spaces it will prevent overheating.

Now NVidia needs to make a 3050 low-profile, slot powered GPU and NOT gimp on specs.


----------



## DeeJay1001 (Mar 3, 2022)

lexluthermiester said:


> Nonsense. You can't buy them anywhere because they are not consumer cards. They are meant for small form-factor workstations and enterprise customers.


Do a search on YouTube for this card; all the top reviews are either using it for mining or gaming benchmarks. When these cards were released, I saw countless posts in a mining group of people buying stacks of them. ANYONE can buy these cards from CDW, shopBLT, Lenovo, or HP. I have a business rep with Lenovo for bulk orders and inquired about these cards before they were even released. I was unable to order more than two without a manual review and serial numbers from a compatible machine, because Lenovo placed these cards on a special reserve list due to demand from large mining operations. I can guarantee more of these cards are being used for mining than for any professional use.


----------



## Dorek (Mar 3, 2022)

I've seen 8 of these in stock in a Finnish PC web shop, though they are PNY-made.


----------



## lexluthermiester (Mar 3, 2022)

DeeJay1001 said:


> Do a search on YouTube for this card; all the top reviews are either using it for mining or gaming benchmarks. When these cards were released, I saw countless posts in a mining group of people buying stacks of them. ANYONE can buy these cards from CDW, shopBLT, Lenovo, or HP. I have a business rep with Lenovo for bulk orders and inquired about these cards before they were even released. I was unable to order more than two without a manual review and serial numbers from a compatible machine, because Lenovo placed these cards on a special reserve list due to demand from large mining operations. I can guarantee more of these cards are being used for mining than for any professional use.


Context is important and you missed it. The AXXXX series cards from NVIDIA are NOT consumer cards. Your statement shows this really well. The vast majority of buyers are NOT going to have accounts at resale distributors, nor are they going to buy direct from OEMs.


----------



## DeeJay1001 (Mar 3, 2022)

lexluthermiester said:


> Context is important and you missed it. The AXXXX series cards from NVIDIA are NOT consumer cards. Your statement shows this really well. The vast majority of buyers are NOT going to have accounts at resale distributors, nor are they going to buy direct from OEMs.


I think you missed it. Literally anyone can make an account on any of those sites and buy a single card. Sure, they aren't the go-to retailers for the average consumer, but they aren't restricted either.


----------



## Kissamies (Mar 3, 2022)

Looks amazing when thinking about its efficiency.


----------



## W1zzard (Mar 3, 2022)

DeeJay1001 said:


> And you can't buy them anywhere because it's one of the best mining cards on the market right now


I bought my card on Amazon Germany, with my personal account (not a company acc), there seems to be plenty of supply at other stores, too



Dorek said:


> though they are PNY made.


Exact same thing as the NVIDIA card


----------



## Dorek (Mar 3, 2022)

W1zzard said:


> I bought my card on Amazon Germany, with my personal account (not a company acc), there seems to be plenty of supply at other stores, too
> 
> 
> Exact same thing as the NVIDIA card


Guessing these are all built to spec regardless of maker, but it's a very cool idea for some mini PC gaming.


----------



## lexluthermiester (Mar 3, 2022)

DeeJay1001 said:


> I think you missed it.


No, I didn't. I'm a retailer, and my wholesale supplier can supply these cards, but only in 5-packs.


DeeJay1001 said:


> Literally anyone can make an account on any of those sites and buy a single card.


Not sure where in the world you are, but things stateside seem to work a little differently. Wholesale distributors generally don't sell to the public. You need proof of a business license and proof of a tax ID (because wholesale is sans tax), and you generally have to buy in bulk.

These cards might be out there, I'm not arguing against that. What I'm saying is that they are NOT generally available to the public. THAT is why few can buy them. Example?...



W1zzard said:


> I bought my card on Amazon Germany


...I just looked: not one example on Amazon.com or eBay, and only a few on Newegg (not sold by Newegg themselves). Things seem to work differently in Deutschland.



DeeJay1001 said:


> Sure, they aren't the go-to retailers for the average consumer


Thank you for conceding the point.


----------



## Selaya (Mar 3, 2022)

wow, you madman. you really did it. nice.
that being said


> Installation requires *three* slots in your system.


https://www.techpowerup.com/review/nvidia-rtx-a2000/2.html
that can't be right


----------



## londiste (Mar 3, 2022)

> In gaming, power consumption is extremely low. Since the card is designed to run with PCIe slot power only, which can supply up to 75 W, NVIDIA set a board power limit of 70 W.


Technically, the spec says 5.5 A (66 W) at 12 V, which is what most graphics cards take as input, plus 3 A (9.9 W) at 3.3 V, combined up to 75 W.
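Those rail limits add up the way the post says; here's the arithmetic as a tiny sketch (the per-rail currents are the PCIe CEM figures quoted above, and the 75 W combined cap is the slot limit):

```python
# Per-rail and combined slot power budget for a 75 W PCIe slot.
RAILS = {
    "12V": {"volts": 12.0, "max_amps": 5.5},  # 66 W, what most GPUs draw from
    "3.3V": {"volts": 3.3, "max_amps": 3.0},  # 9.9 W
}
SLOT_LIMIT_W = 75.0  # combined cap is lower than the sum of the rails

per_rail = {name: round(r["volts"] * r["max_amps"], 1) for name, r in RAILS.items()}
combined = min(sum(per_rail.values()), SLOT_LIMIT_W)

print(per_rail)   # {'12V': 66.0, '3.3V': 9.9}
print(combined)   # 75.0
```

Note the sum of the rails (75.9 W) slightly exceeds the 75 W combined cap, which is why the cap is applied separately.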


----------



## damric (Mar 3, 2022)

@W1zzard  I wonder if this card can be firmware-modded like I did with that M2000 to exceed the power and clock limits. I've been folding on that one 24/7 since that experiment, and the PCIe slot occasionally seeing a 120 W spike hasn't yet caused any worry, with typical power draw around 70-80 W on a cheap ASRock B450 board.


----------



## W1zzard (Mar 3, 2022)

Selaya said:


> wow, you madman. you really did it. nice.
> that being said
> 
> https://www.techpowerup.com/review/nvidia-rtx-a2000/2.html
> that can't be right


fixed


----------



## ricoh (Mar 3, 2022)

> The real ace up the A2000's sleeve has to be its full 192-bit GDDR6 memory bus in line with that of the RTX 3060. The RTX 3050 uses a 50% narrower 128-bit bus.


No, the RTX 3050 has a 33% narrower 128-bit bus than the A2000, while the A2000 has a 50% wider 192-bit bus than the RTX 3050.

If the RTX 3050 had a 50% narrower bus than the A2000, it would have been 96-bit.
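This bookkeeping is easy to get backwards because the two percentages use different bases; a tiny sketch of the arithmetic (bus widths are from the review, the function names are mine):

```python
def percent_narrower(a_bits: int, b_bits: int) -> float:
    """How much narrower bus b is than bus a, as a percentage of a."""
    return (a_bits - b_bits) / a_bits * 100

def percent_wider(a_bits: int, b_bits: int) -> float:
    """How much wider bus a is than bus b, as a percentage of b."""
    return (a_bits - b_bits) / b_bits * 100

a2000, rtx3050 = 192, 128
print(percent_narrower(a2000, rtx3050))  # 33.33...: the 3050 is 33% narrower
print(percent_wider(a2000, rtx3050))     # 50.0: the A2000 is 50% wider
print(192 * (1 - 0.50))                  # 96.0: a 50% narrower bus would be 96-bit
```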


----------



## Mussels (Mar 3, 2022)

Page 1 fails to have the phrase "can it run crysis?"


0/10 bad review, IGN


The fact it matches a 3050 despite having 1/3 the base clock says that either this is a super efficient design, or that nvidia's base/boost clocks make even less sense than they used to
(cores, ROPs, base, boost)


----------



## Selaya (Mar 3, 2022)

could also be binning, they could've saved the best dies for this since they get to charge quadro tax here


----------



## mechtech (Mar 3, 2022)

Interesting little card.  What codecs are included in the media engine?  W1zz, on the chart you have 562 MHz for core clock speed?

It's too bad AMD didn't make the 6500xt and 6600/xt in stubby versions and low profile versions.


----------



## Beertintedgoggles (Mar 3, 2022)

It's available at Micro Center:  https://www.microcenter.com/product...00-single-fan-6gb-gddr6-pcie-40-graphics-card  In fact, they have two available at the time of this post. You have to buy in person, with a limit of 1 per customer... but that's a far cry from "Nonsense. You can't buy them anywhere because they are not consumer cards."


----------



## Selaya (Mar 3, 2022)

okay, suggestion time:
add the 1650 (vanilla non-Super) and maybe the 1050 Ti to the charts, because this is an SFF card (and really shouldn't be bought for any other use case) and the 1650/1050 Ti are its main uh _competitors_


----------



## docnorth (Mar 3, 2022)

lexluthermiester said:


> Now NVidia needs to make a 3050 low-profile, slot powered GPU and NOT gimp on specs.


Exactly, except it would be better to have both low-profile and full height options.


----------



## Lew Zealand (Mar 3, 2022)

If Nvidia and AMD would clock their cards at reasonable speeds with reasonable voltages, all cards of this generation would have similarly high efficiency.  As most of my games don't need the full performance of my 6600 XT, I usually run it at 2300 MHz (-350) core and 2200 MHz (+200) memory, and games average around 90-95 W with full GPU usage.  On some benchmarks I can get it to top out at ~105 W, but I haven't seen that in-game yet.  The FPS drop from full speed is about 10%.

That's comparable to what's seen with the A2000 here and shows that Ampere and RDNA2 can run much more efficiently if one chooses to.
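A rough perf-per-watt check on those numbers (the ~160 W stock board power for the 6600 XT is my assumption, not from the post; the 90% performance at ~92 W average is):

```python
# Rough efficiency gain from the undervolt/underclock described above.
stock_perf, stock_watts = 1.00, 160.0  # assumed stock board power
tuned_perf, tuned_watts = 0.90, 92.0   # ~10% FPS drop, 90-95 W average

gain = (tuned_perf / tuned_watts) / (stock_perf / stock_watts) - 1
print(f"{gain:.0%} better perf/W")  # ~57% better
```

Under those assumptions, a ~10% performance sacrifice buys roughly half again more performance per watt, which is the same trade the A2000 makes at the factory.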


----------



## docnorth (Mar 3, 2022)

Selaya said:


> okay, suggestion time:
> add 1650 (vanilla non-super) and maybe 1050ti to the charts bc this is an SFF card (and really shouldnt be bought for any other usecase) and the 1650/1050ti are its main uh _competitors_


Well, we can use the 1650 Super or RX 570 as guides and estimate the 1650 non-Super's performance at 55-57% of the A2000's. Of course someone else's calculations might differ slightly. Interesting comparison anyway.


----------



## Camm (Mar 4, 2022)

I want one for my home server to accelerate decode and provide GPU acceleration for any VM's running. But the price is still a bit ludicrous right now for me to consider it.


----------



## Patriot (Mar 4, 2022)

I would say the fastest ITX card would be the 3060ti mini from asus.... this isn't even as fast as my 2070 mini...

Might be the fastest slot powered current gen card though.


----------



## wolf (Mar 4, 2022)

Damn that was quick @W1zzard ! Firstly let me say a massive thank you for taking a look at this card just because you were asked by forum members, what a community leader. 

Many thanks too for covering this with the usual depth and thoroughness. The price is yikes (what isn't atm), but damn I'd love it in an ultra small SFF build, like well under 5L.

Just look at that energy efficiency. Holy cow.



Patriot said:


> I would say the fastest ITX card would be the 3060ti mini from asus.... this isn't even as fast as my 2070 mini...
> 
> Might be the fastest slot powered current gen card though.


Fastest half height / low profile card


----------



## nguyen (Mar 4, 2022)

Been running my 3090 at 750mV undervolt too, amazing efficiency


----------



## wolf (Mar 4, 2022)

nguyen said:


> Been running my 3090 at 750mV undervolt too


What clocks? I've gone as low as 775mv @ 1725mhz, was running 750mv but the heaviest craziest games (Quake-RTX basically) couldn't take that 100% stable.


----------



## tabascosauz (Mar 4, 2022)

@W1zzard is it possible to get a lower than default idle fan speed from Afterburner custom curve? Or is it like 20 series FE where it refuses to listen to any program


----------



## nguyen (Mar 4, 2022)

wolf said:


> What clocks? I've gone as low as 775mv @ 1725mhz, was running 750mv but the heaviest craziest games (Quake-RTX basically) couldn't take that 100% stable.



1725 MHz/750 mV, uses around ~260 W in games @ 4K. I use CP2077 with RT as a stress test; that game crashes very quickly with unstable clocks (while other games are stable at 1755 MHz/750 mV)


----------



## Nater (Mar 4, 2022)

DeeJay1001 said:


> Do a search on youtube for this card, All the top reviews are either using this card for mining or gaming benchmarks. When these cards were released, I saw countless posts on a mining group of people buying stacks of these. ANYONE can buy these cards from CDW, shopBLT, Lenovo, or HP. I have a business rep with Lenovo for bulk orders and inquired about these cards before they were even released, I was unable to order more than 2 without a manual review and serial numbers from compatible machine due to lenovo placing these cards on a special reserve list due to the demand from large mining operations. I can guarantee more of these cards are being used for mining than any professional use.


He's right ^  I've had ONE on order at multiple websites for MONTHS.  Lenovo flat-out cancelled my order when they were supposed to come in stock.

I don't know where W1zzard is getting the $700 price tag from.  If you're lucky you can snag one at Microcenter for $650; otherwise they're $900+ from what I've seen.  And I've literally been checking every single day since November-ish.



Beertintedgoggles said:


> It's available at Micro Center:  https://www.microcenter.com/product...00-single-fan-6gb-gddr6-pcie-40-graphics-card  In fact, they have two available at the time of this post.  Have to buy in person and limit of 1 per....   but that's a far cry from "Nonsense. You can't buy them anywhere because they are not consumer cards."


This is only the second time they've been in stock at MC that I know of.  They'll all be gone again by the weekend.


----------



## Mussels (Mar 4, 2022)

wolf said:


> What clocks? I've gone as low as 775mv @ 1725mhz, was running 750mv but the heaviest craziest games (Quake-RTX basically) couldn't take that 100% stable.


That's very similar to mine, around 1650 is the max at 750mv
I run 1600 @725mv for my everyday clocks
(sorry for off topic)


----------



## AnotherReader (Mar 4, 2022)

Thanks for the quick review. This shows what's possible by binning for efficiency. I'm surprised by the clocks and TDP of the laptop GPUs from both AMD and Nvidia. This card shows that in the 165 W TGP of some RTX 3080 mobile laptops, you could get a real RTX 3080 or 6900 XT as long as they were clocked low enough.


----------



## renz496 (Mar 4, 2022)

Dorek said:


> Seen 8 of these in stock in some finnish pc web shop, though they are PNY made.


Not sure how it is right now, but from what I remember PNY is the only partner that makes NVIDIA professional cards like this.


----------



## wolf (Mar 4, 2022)

Mussels said:


> That's very similar to mine, around 1650 is the max at 750mv
> I run 1600 @725mv for my everyday clocks


What makes it acceptable is that they're just such a big powerhouse that sacrificing 5-15% on clocks barely makes a difference to the playability of any game.


----------



## nguyen (Mar 4, 2022)

AnotherReader said:


> Thanks for the quick review. This shows what's possible by binning for efficiency. I'm surprised by the clocks and TDP of the laptop GPUs from both AMD and Nvidia. This card shows that in the 165 W TGP of some RTX 3080 mobile laptops, you could get a real RTX 3080 or 6900 XT as long as they were clocked low enough.



Workstation GPUs like this A2000 and mobile parts use low-leakage chips (low ASIC quality); they can't reach clocks as high at a given voltage as high-leakage parts, but they have much better efficiency.

I have a desktop 2080 Ti and a 2070 Super Max-Q. The 2070S Max-Q can't reach the same clock/voltage as the desktop 2080 Ti (150 MHz lower at the same voltage), but the 2070S Max-Q would surely beat the 2080 Ti if both were locked to 90 W.

So not only is it a waste to use a GA102 die (or Navi 21) in a laptop, those chips would also perform worse than GA104 or Navi 22 when used in a laptop with a constrained TDP


----------



## W1zzard (Mar 4, 2022)

tabascosauz said:


> @W1zzard is it possible to get a lower than default idle fan speed from Afterburner custom curve? Or is it like 20 series FE where it refuses to listen to any program


Not possible, the fan speed slider goes from 30% to 100% (not 0% to 100%)


----------



## Taraquin (Mar 4, 2022)

Very interesting card. Unfortunately it costs 850 USD in Norway :/ Several aspects of this card are intriguing:

- Efficiency: if you compare the EVGA 3060 to this card, they both need 975 mV to reach 1850 MHz core. This card probably has the same voltage/clock curve as all other Ampere cards, but in general they are much more efficient at 700-750 mV than at the stock 950-1080 mV. Tuning this card with Afterburner could be fun, as even OC has some voltage headroom. My 3060 TUF used less than 100 W in gaming when running at 1650 MHz @ 743 mV (the lowest voltage before VRAM downclocks from 15 to 10 GHz). This card seems to be able to overclock VRAM at lower voltages than regular Ampere cards (725-750 mV, depending on binning, being the minimum for full VRAM speed). Something like 1600 MHz @ 700 mV might be doable on this card within 75 W. My 3060 could do that and used 70-80 W, but VRAM dropped to 10 GHz so performance tanked.
- Size is awesome; this is by far the best-performing low-height ITX card I've seen. Excellent for mini PCs with an ITX motherboard.
- Noise is surprisingly good given the blower design; with Afterburner tuning/undervolting it can probably drop below 30 dB at full load.


----------



## qubit (Mar 4, 2022)

Pah! It can't run Crysis, so useless!


----------



## W1zzard (Mar 4, 2022)

qubit said:


> Pah! It can't run Crysis, so useless!


Runs perfectly fine, but why bother with Crysis, the remake is terrible


----------



## WonkoTheSaneUK (Mar 4, 2022)

Hope our office IT department can source a few of these. Our PCs are on Kepler GPUs (now EOL by Nvidia) & we need "certified" cards for our CAD software.


----------



## DeeJay1001 (Mar 4, 2022)

lexluthermiester said:


> Not sure where in the world you are, but things stateside seem to work a little differently. Wholesale distributors generally don't sell to the public. You need proof of business license, proof of a tax ID(because wholesale is sans tax) and you generally have to buy in bulk.
> 
> These cards might be out there, I'm not arguing against that. What I'm saying is that they are NOT generally available to the public. THAT is why few can buy them. Example?...


Every single one of these listings is accessible and free to be purchased by anyone without any special credentials. All of them are out of stock because they have been scooped up by miners.

NVIDIA RTX A2000 - graphics card - RTX A2000 - 6 GB - VCNRTXA2000-PB - - (cdw.com)
NVIDIA RTX A2000 - graphics card - RTX A2000 - 12 GB - VCNRTXA200012GB-PB - - (cdw.com)
NVIDIA RTX A2000 - graphics card - RTX A2000 - 6 GB - 340L0AA - - (cdw.com)
ShopBLT.com: HP Hewlett Packard Nvidia Rtx A2000 6gb 4mdp Video Graphics Card
Nvidia RTX A2000 6GB miniDP*4 Graphics card with HP Bracket | Lenovo US
PNY Technologies RTX A2000 Graphics Card VCNRTXA2000-PB B&H (bhphotovideo.com)
HP RTX A2000 Graphics Card 340L0AA B&H Photo Video (bhphotovideo.com)

EDIT: I've just now seen that my local Microcenter and the two next closest stores all have them in stock on the shelves, available for anyone who walks in.


----------



## MentalAcetylide (Mar 4, 2022)

DeeJay1001 said:


> Do a search on youtube for this card, All the top reviews are either using this card for mining or gaming benchmarks. When these cards were released, I saw countless posts on a mining group of people buying stacks of these. ANYONE can buy these cards from CDW, shopBLT, Lenovo, or HP. I have a business rep with Lenovo for bulk orders and inquired about these cards before they were even released, I was unable to order more than 2 without a manual review and serial numbers from compatible machine due to lenovo placing these cards on a special reserve list due to the demand from large mining operations. I can guarantee more of these cards are being used for mining than any professional use.


I don't know, I think it's more a case of the cards themselves being restricted as to who can buy them. My RTX A6000 that I use for iray rendering is a Lenovo, and those cards are usually reserved for government/research/enterprise workstations, so it can be much harder (and more expensive) to get those professional workstation cards. The few A6000s that I was able to find a few months ago were crappy PNY brand. So I don't think cripple currency has as much of an impact when it comes to these particular cards. Gaming cards are a different story, since they don't have the same restrictions and aren't being prioritized to businesses/enterprise systems.


----------



## lexluthermiester (Mar 4, 2022)

qubit said:


> Pah! It can't run Crysis, so useless!


Nonsense, it'll run Crysis with its eyes closed.



W1zzard said:


> Runs perfectly fine, but why bother with Crysis, the remake is terrible


The original is still fine though.



DeeJay1001 said:


> Every single one of these listings is accessible and free to be purchased by anyone without any special credentials. All of them are out of stock because they have been scooped up by miners.
> 
> NVIDIA RTX A2000 - graphics card - RTX A2000 - 6 GB - VCNRTXA2000-PB - - (cdw.com)
> NVIDIA RTX A2000 - graphics card - RTX A2000 - 12 GB - VCNRTXA200012GB-PB - - (cdw.com)
> ...


You've got a point with MicroCenter, but CDW is not very well known, and neither are ShopBLT or Lenovo. B&H is a little better known, but again you missed the bigger point.


----------



## Assimilator (Mar 4, 2022)

Mussels said:


> The fact it matches a 3050 despite having 1/3 the base clock says that either this is a super efficient design, or that nvidia's base/boost clocks make even less sense than they used to
> (cores, ROPs, base, boost)


This is a slightly cut-down 3060 running at stupidly low voltages and power limits... why would you be surprised it can match a 3050, a card that is miles down the ladder compared to the 3060?

Base clock ceased to mean anything the moment that boost clock became a thing, and base clock has become ever less relevant as boost algorithms have become smarter. The only reason base clocks are still quoted is for marketing/e-peen reasons. This 562MHz is likely a carefully-considered value that is the worst-case scenario this card could ever see if it was 100% loaded by a power virus like Furmark, while remaining inside that 70W budget.


----------



## LupintheIII (Mar 4, 2022)

Lew Zealand said:


> If Nvidia and AMD would clock their cards at reasonable speeds with reasonable voltages, all cards of this generation would have similar high efficiency.  As most of my games don't need the full performance of my 6600xt, I usually run it at 2300 MHz (-350) cores and 2200MHz (+200) memory and games average around 90-95W with full GPU usage.  On some benchmarks I can get it to top out at ~105W but I've haven't seen that in-game yet.  FPS drop from full speed is about 10%.
> 
> That's comparable to what's seen with the A2000 here and shows that Ampere and RDNA2 can run much more efficiently if one chooses to.


Exactly. I have both an RX 6800 and an RTX A2000 (yes, I like power-efficient cards) and while the A2000 is impressive, in God of War at 1440p max settings my RX 6800 sits at around 80 W of power consumption if I cap the framerate to 60 fps, which actually makes it more power efficient than my A2000, considering the tiny Quadro can't even reach 60 fps in that title at the same settings.



wolf said:


> Damn that was quick @W1zzard ! Firstly let me say a massive thank you for taking a look at this card just because you were asked by forum members, what a community leader.
> 
> Many thanks too for covering this with the usual depth and thoroughness. The price is yikes (what isn't atm), but damn I'd love in in an ultra small SFF build, like well under 5L.
> 
> ...


Totally agree, stellar review and this card really is the SFF dream... in fact I got it day one to put inside my In-Win Chopin


----------



## wolf (Mar 5, 2022)

LupintheIII said:


> Totally agree, stellar review and this card really is the SFF dream... in fact I got it day one to put inside my In-Win Chopin


Heck yeah! What's the rest of the system specs. Little backpack beast you got there.

Have you played with the voltage / frequency curve? Seems like there is a good chunk left to gain while still using 70w, pushing performance and efficiency even higher.


----------



## DeeJay1001 (Mar 5, 2022)

lexluthermiester said:


> Nonsense, It'll run Crysis with it's eye's closed.
> 
> 
> The original is still fine though.
> ...


I walked into Microcenter this afternoon as any normal person and walked out with an A2000; it's currently mining away @ 40 MH/s. If these were in stock, anyone could buy them from a plethora of stores; it is a consumer card. The whole "professional use" stigma is simply to justify the cost.


----------



## lexluthermiester (Mar 5, 2022)

DeeJay1001 said:


> I walked into Microcenter this afternoon as any normal person and walked out with an A2000, it's currently mining away @ 40 MH/s. If these were in stock, anyone could buy them from a plethora of stores, it is a consumer card. The whole "professional use" stigma is simply to justify the cost.


Good for you mate! Rather silly attempt at rubbing my nose in it, but the reality is this: the vast majority of people won't be doing that. And BTW, 40 MH/s is going to earn you about $0.95 (if you're lucky) per day, which means your ROI on that purchase will not break even for nearly two years. Yay for you...
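The break-even arithmetic behind that claim, using the thread's own numbers (~$650 Micro Center price, ~$0.95/day at 40 MH/s; electricity cost is ignored, so the real break-even would be even longer):

```python
# Back-of-envelope mining break-even using figures from this thread.
card_price_usd = 650.0     # Micro Center price mentioned earlier in the thread
daily_revenue_usd = 0.95   # claimed earnings at 40 MH/s

days_to_break_even = card_price_usd / daily_revenue_usd
print(round(days_to_break_even))            # 684 days
print(round(days_to_break_even / 365, 2))   # 1.87 years
```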


----------



## DeeJay1001 (Mar 5, 2022)

lexluthermiester said:


> Good for you mate! Rather silly attempt at rubbing my nose in it, but the reality is this: the vast majority of people won't be doing that. And BTW, 40 MH/s is going to earn you about $0.95 (if you're lucky) per day, which means your ROI on that purchase will not break even for nearly two years. Yay for you...


It's a drop in the bucket. You had nothing to do with the purchase, by the way; don't flatter yourself too much. I've been looking for one of these for a long while.


----------



## Nater (Mar 5, 2022)

DeeJay1001 said:


> I walked into Microcenter this afternoon as any normal person and walked out with an A2000, it's currently mining away @ 40 MH/s. If these were in stock, anyone could buy them from a plethora of stores, it is a consumer card. The whole "professional use" stigma is simply to justify the cost.



I'm so disappointed my "local" (2 hr drive) MicroCenter didn't get any this time apparently.  I wonder if they just didn't update the website, or somebody snagged them all day one.


----------



## TheHughMan (Mar 5, 2022)

It's a workstation card, yet TPU only benchmarks games on it.


----------



## wolf (Mar 5, 2022)

TheHughMan said:


> It's a workstation card yet TPU only benchmark games on it.


W1zzard was petitioned to review this from a gaming perspective, as it's the fastest low profile GPU on the market and appealing to SFF builders.


----------



## LowProfileDegenBuild (Mar 5, 2022)

I hope to get one of these, or a passively cooled 1650, for my HP Compaq 8200 Elite SFF; it's currently on a ZOTAC GT 1030 (fan version, 2 GB GDDR5)


----------



## Haile Selassie (Mar 5, 2022)

This is the same approach as Max-Q mobile graphics cards. Those dies also run at about 0.7-0.8 V and fit in a 75-110 W power budget. Nothing new to see here.


----------



## lexluthermiester (Mar 5, 2022)

wolf said:


> W1zzard was petitioned to review this from a gaming perspective, as it's the fastest low profile GPU on the market and appealing to SFF builders.


This. From this perspective, the card does well.


----------



## dyonoctis (Mar 5, 2022)

MentalAcetylide said:


> I don't know, I think its more of a case with just the cards themselves being restricted as to who can buy them. My RTX A6000 that I use for iray rendering is a Lenovo, and those cards are usually reserved for government/research/enterprise workstations, so it can be much harder(and expensive) to get those professional workstation cards. The few A6000's that I was able to find a few months ago were crappy PNY brand. So I don't think cripple currency has as much of an impact when it comes to these particular cards. Gaming cards are a different story since they don't have the same restrictions and aren't being prioritized to businesses/enterprise systems.


Isn't PNY the only authorized manufacturer for NVIDIA professional GPUs?


----------



## Jolly (Mar 5, 2022)

I've been waiting for 3 months for one of these from ShopBLT. Also, I'd love to know more about how it compares to the W6400 (SFF build, no gaming, but I do want to drive 4K 144 Hz monitors)


----------



## Selaya (Mar 5, 2022)

The W6400 uses the same die as the 6500 XT, so you'll be ... _inheriting_ all its bad ideas (2 outputs, a PCIe 4.0 x4 link, lack of hardware de- and encoding), just clocked lower to hit the 75 W limit


----------



## lexluthermiester (Mar 5, 2022)

dyonoctis said:


> Isn't PNY the only authorized manufacturer for Nvidia professional GPU ?


Don't think so. But anything is possible.


----------



## LupintheIII (Mar 5, 2022)

wolf said:


> Heck yeah! What's the rest of the system specs. Little backpack beast you got there.
> 
> Have you played with the voltage / frequency curve? Seems like there is a good chunk left to gain while still using 70w, pushing performance and efficiency even higher.


Rest of specs are:

Asus H370 Strix
i7 8700
RTX A2000
Cooler Master Masterair G200P
Vengeance lpx 2666 CL12 16GB
HDPLEX 160W DC-ATX
WD Blue 250GB M.2
Seagate Barracuda 1TB 2.5"
Added a picture to show how the back looks.

Tried some OC following the handy graph provided by @W1zzard, got the second-highest graphics score in Time Spy for now (the first two are with 2x A2000 in some kind of SLI).

To be honest, I don't think there would be much benefit to messing around with the voltage/frequency curve, since the card is very much at its limit and I don't want to fry my PCIe slot... the little GA106 can have pretty huge spikes, as per Ampere tradition.

Another strange thing is that Ampere cards like to spit out 1000+ frames during menu screens for some reason (it's not only New World, it's just that with that game you actually sit in the menu for hours). Took me some time to figure out what was causing my system to hard reset... I knew it was power related, which is to be expected since the whole thing runs on a 160 W PSU, but it was happening only in menus or in some specific transition areas of the benchmark; probably the CPU was also drawing a ton of power trying to keep up with 1000+ fps, even though I've limited it to 45 W. Setting the max framerate to 144 fps in the NVIDIA control panel fixed the issue. Also had this kind of thing with my RTX 3070, but with a 650 W PSU the system didn't crash (never had any of this on my RX 6800 system, though).

On GPUs which are severely power limited like the A2000 or 3070 it's not the biggest of issues, but I'm not sure how a 400 W 3080 would cope.

In the end I reached 60 fps avg in Horizon Zero Dawn at 1440p max settings, so mission complete


----------



## AusWolf (Mar 5, 2022)

Why can't this be the RTX 3050?


----------



## Lew Zealand (Mar 6, 2022)

AusWolf said:


> Why can't this be the RTX 3050?



Because either people would complain about it being undervolt/clock limited to 75W in the locked/slot-power models, and/or people would OC it to match a 3060 in the unlocked/6-pin power models.


----------



## lexluthermiester (Mar 6, 2022)

Lew Zealand said:


> Because either people would complain about it being undervolt/clock limited to 75W in the locked/slot-power models, and/or people would OC it to match a 3060 in the unlocked/6-pin power models.


However, there are a great many people who want a low-profile, single-slot card that runs from slot power, for small form factor systems where better-than-IGP performance is needed. I recently went hunting for such a card and discovered a shocking lack of recent examples. I had to settle for a GTX 750 Ti because a GTX 1050 or 1650 could not be found, or not at a sensible cost, to say nothing of anything from recent GPU generations. This A2000 is an excellent performer for its size, but it is not something most consumers will be willing to pay for. So AusWolf's suggestion was spot on, though I think NVidia would be more likely to release something like a GT 3040...


----------



## nguyen (Mar 6, 2022)

LupintheIII said:


> Rest of specs are:
> 
> Asus H370 Strix
> i7 8700
> ...



Try overclocking + undervolting your A2000 to 700 mV or less to prevent the dangerous peak power draw. Capping FPS (or enabling V-Sync) works, but for online games you want the highest FPS possible (without negatively affecting system stability).

Here is how to overclock + undervolt (in your case, type in a 330 MHz core clock offset and undervolt to 700 mV)


----------



## AusWolf (Mar 6, 2022)

lexluthermiester said:


> However, there are a great many people that want a card that is low profile, single slot and runs from slot power, for small form factor systems where performance better than an IGP is needed. I recently went hunting for such a card and discovered a shocking lack of recent examples. I had to settle for a GTX 750 Ti because a GTX 1050 or 1650 could not be found, or not at a sensible cost, to say nothing of anything from recent GPU generations. This A2000 is an excellent performer for its size, but it is not something most consumers will want to pay for. So AusWolf's suggestion was spot on, though I think NVidia would be more likely to release something like a GT-3040...


Exactly! It's sad to think about how many high-end cards nvidia has released in the Ampere range (3070, 3070 Ti, 3080, 3080 Ti, 3090, 3090 Ti), yet there isn't a single option for small form factor gaming / HTPC from either nvidia or AMD.  On AMD's side, the RX 6500 XT / 6400 could have saved the segment if AMD hadn't decided on using a PCI-e x4 laptop GPU with no video decode unit whatsoever. On nvidia's side, there is absolutely no reason for the 3050 to be a full-height card with a power connector, as it is clearly demonstrated here by the A2000. -2% relative performance at 1080p with nearly half the power consumption. GeForce cards are ridiculously overpowered, which is just another sad fact. On the other hand, the 6500 XT delivers 75% of its performance while consuming 50% more, which makes it an even worse contender.


----------



## Cutechri (Mar 6, 2022)

AusWolf said:


> GeForce cards are ridiculously overpowered, which is just another sad fact.


Yep, I undervolted my 3070 and set a static core clock of 2100 MHz and it works just fine while consuming 60W less. From 240W to 180W. Insane.


----------



## AusWolf (Mar 6, 2022)

Cutechri said:


> Yep, I undervolted my 3070 and set a static core clock of 2100 MHz and it works just fine while consuming 60W less. From 240W to 180W. Insane.


Even just playing with the power target is quite interesting. My 2070 performs only 7% worse if I lower the power target by 29%.

Why can't nvidia set humanly workable voltages and power targets, and save the extra for overclockers?
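Those two figures make for a quick perf-per-watt sanity check; a minimal sketch using the numbers above:

```python
# Perf-per-watt change from cutting the power target by 29%
# at a cost of 7% performance (figures from the 2070 example above).
perf = 1.00 - 0.07    # relative performance after the cut
power = 1.00 - 0.29   # relative power after the cut

efficiency_gain = perf / power  # perf per watt vs. stock
print(f"perf/W vs stock: {efficiency_gain:.2f}x")  # ~1.31x
```

So the last ~29% of the power budget buys only ~7% performance; run power-limited and the card is roughly 31% more efficient.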


----------



## Selaya (Mar 6, 2022)

AusWolf said:


> Why can't this be the RTX 3050?


because it'd make little sense from a binning perspective. The A2000 has about as many cores active as the 3060 - the _point_ of the 3050 is to use up the dies that didn't quite make it to a 3060 or A2000, but aren't like totally broken to the point of being wholly useless.
Since the vast majority of users aren't doing SFF, the 3050 consuming 150W for higher performance's just fine, if it means more availability through binning.
After all, that's why the A2000 exists - _an option_ for those who cannot use a 3050 instead.


----------



## londiste (Mar 6, 2022)

AusWolf said:


> Why can't nvidia set humanly workable voltages and power targets, and save the extra for overclockers?


Saving the extra for overclockers is rather bad business sense.
Other than that - the competitive landscape. And not only AMD vs Nvidia, but also previous-generation cards. A new card needs to be xx% better than the competition, and to get that edge, the silicon is pushed as far as reasonably possible. Improvements from manufacturing process updates have slowed down considerably, giving even more incentive to maximize gains.


----------



## Lew Zealand (Mar 6, 2022)

lexluthermiester said:


> However, there are a great many people that want a card that is low profile, single slot and runs from slot power, for small form factor systems where performance better than an IGP is needed. I recently went hunting for such a card and discovered a shocking lack of recent examples. I had to settle for a GTX 750 Ti because a GTX 1050 or 1650 could not be found, or not at a sensible cost, to say nothing of anything from recent GPU generations. This A2000 is an excellent performer for its size, but it is not something most consumers will want to pay for. So AusWolf's suggestion was spot on, though I think NVidia would be more likely to release something like a GT-3040...



The A2000 has a wider memory bus and far fewer cores cut than the current 3050, making it in most aspects a 3055, more expensive to make than the current 3050, but downclocked to fit a specific wattage target.  Which is a type of card Nvidia hasn't made for consumers in a decade or more.  That's why I answered that it could never be a 3050 and instead, a cheaper to make, higher clocked, but consequently higher power card is a better match for Nvidia to call the 3050, which they did.  IMO a downclocked 2560 (or lower) core 3050, called a 3040 or whatever, is a more likely 75W SFF target.  I don't expect to see this product made.

No argument that there is a demand for a current-gen, well specced SFF slot-power only card, I'd love one.  And while there may be a great many people interested in this product, it's not nearly a great many enough to convince Nvidia or AMD yet to release something current-gen to fill that niche.  Or apparently even for the AIBs to produce enough SFF 1650s to fulfill demand during the current CF situation.  I wanted one about a year ago but went a different direction instead.



AusWolf said:


> Even just playing with the power target is quite interesting. My 2070 performs only 7% worse if I lower the power target by 29%.
> 
> Why can't nvidia set humanly workable voltages and power targets, and save the extra for overclockers?



That's not optimized for benchmark comparisons; throwing 50% more power at 10% more performance is. Those of us who care about heat/longevity of components/power usage etc. just undervolt/underclock and use our PCs that way. I learned this by necessity, as I started with power- and cooling-restricted PCs and GPUs, and just got used to doing this optimization for every one I own.


----------



## tabascosauz (Mar 6, 2022)

W1zzard said:


> Not possible, the fan speed slider goes from 30% to 100% (not 0% to 100%)



I meant more the custom fan curve functionality in Afterburner, where you can manually draw the entire curve. Does it still limit you to 30% at the lowest point?

Though I guess it could just as easily accept a < 30% speed on the curve but just ignore it


----------



## lexluthermiester (Mar 6, 2022)

Lew Zealand said:


> The A2000 has a wider memory bus and far fewer cores cut than the current 3050, making it in most aspects a 3055, more expensive to make than the current 3050, but downclocked to fit a specific wattage target.


I'd be happy with a Geforce branded card like that, but it needs to be single-slot.


----------



## Mussels (Mar 7, 2022)

AusWolf said:


> Even just playing with the power target is quite interesting. My 2070 performs only 7% worse if I lower the power target by 29%.
> 
> Why can't nvidia set humanly workable voltages and power targets, and save the extra for overclockers?





Cutechri said:


> Yep, I undervolted my 3070 and set a static core clock of 2100 MHz and it works just fine while consuming 60W less. From 240W to 180W. Insane.



Because AMD is genuinely competing this time around.
When there's competition, use up all the headroom.
No competition? Focus on efficiency, use that headroom for reliably known future products (for the shareholders)
(See: Intel running quad cores almost the same for 10+ years, with only memory clocks really changing)


----------



## AusWolf (Mar 7, 2022)

Mussels said:


> Because AMD is genuinely competing this time around.
> When there's competition, use up all the headroom.
> No competition? Focus on efficiency, use that headroom for reliably known future products (for the shareholders)
> (See: Intel running quad cores almost the same for 10+ years, with only memory clocks really changing)


I see what you mean. Though on my end, it's sad to see competition being more important than consumer needs. I get it, competition is good for the market. It's just not good for the final product. I mean, OK, everybody cried about Intel releasing extremely similar quad cores for such a long time, but was anybody forced to buy them? Not really. You could get away with using Sandy or Ivy bridge for a decade, which I think was great. I have no intention of upgrading my PC every year unless I need to, and I have no intention of swapping out a perfectly good PSU just to be able to feed a 300+ W graphics card with 5 power connectors.


----------



## LupintheIII (Mar 7, 2022)

Selaya said:


> because it'd make little sense from a binning perspective. The A2000 has about as many cores active as the 3060 - the _point_ of the 3050 is to use up the dies that didn't quite make it to a 3060 or A2000, but aren't like totally broken to the point of being wholly useless.
> Since the vast majority of users aren't doing SFF, the 3050 consuming 150W for higher performance's just fine, if it means more availability through binning.
> After all, that's why the A2000 exists - _an option_ for those who cannot use a 3050 instead.


The only reason the RTX 3050 is configured that way is to allow Nvidia to switch to the GA107 chip as soon as this craze goes away and prices return to normal (you can see the GA106 chip is cut down to exactly match a full GA107); in fact there is no chip so defective that 40% of the cores, 50% of the memory controller and 50% of the PCIe lanes need to be disabled while the rest still works fine.
Yes, chips going into the RTX 3050 are pretty bad bins, but the GA106 in the RTX 3060 is already cut down quite a bit, and the A2000 even more. Perfect chips go to laptops, defective chips which can clock high but need more power go to the 3060, and defective chips not clocking as fast but with no power leakage go to the A2000.
The 3050 is an artificially cut-down chip made to meet a performance target - a marketing stunt to rain on the 6500 XT's parade, if you will.


----------



## lexluthermiester (Mar 7, 2022)

Selaya said:


> Since the vast majority of users aren't doing SFF, the 3050 consuming 150W for higher performance's just fine, if it means more availability through binning.
> After all, that's why the A2000 exists - _an option_ for those who cannot use a 3050 instead.


You seem to be missing some important context..


----------



## Selaya (Mar 7, 2022)

Do enlighten me then.
The 75W/SFF crowd is an incredibly tiny one, and guess what, an option (the RTX A2000) actually exists for them.
Generally speaking, it is just much more profitable to jam way more power into your dies to increase performance vs jamming more cores/CUs/w/e into your die, so from a business perspective all the current SKUs make perfect sense.
I don't necessarily agree w/ how nv/AMD(/intel) operate at all, but from their perspective all of this is perfectly logical and makes perfect sense. And I can see that.

So yeah, deal w/ it.


----------



## AusWolf (Mar 8, 2022)

Selaya said:


> Do enlighten me then.
> The 75W/SFF crowd is an incredibly tiny one, and guess what, an option (the RTX A2000) actually exists for them.
> Generally speaking, it is just much more profitable to jam way more power into your dies to increase performance vs jamming more cores/CUs/w/e into your die, so from a business perspective all the current SKUs make perfect sense.
> I don't necessarily agree w/ how nv/AMD(/intel) operate at all, but from their perspective all of this is perfectly logical and makes perfect sense. And I can see that.
> ...


I've just had a look... the A2000 isn't available through UK retailers. I've managed to find one on ebay for £660. So no, SFF people don't have an option.

Other than that, I see your point.


----------



## Selaya (Mar 8, 2022)

AusWolf said:


> I've just had a look... the A2000 isn't available through UK retailers. I've managed to find one on ebay for £660. So no, SFF people don't have an option.
> 
> Other than that, I see your point.


I mean, I feel you, but technically that's still _availability_. 'Twas no different w/ uhm _ordinary_ GPUs until a (short) while ago, after all.


----------



## Mussels (Mar 8, 2022)

AusWolf said:


> I see what you mean. Though on my end, it's sad to see competition being more important than consumer needs. I get it, competition is good for the market. It's just not good for the final product. I mean, OK, everybody cried about Intel releasing extremely similar quad cores for such a long time, *but was anybody forced to buy them? *Not really. You could get away with using Sandy or Ivy bridge for a decade, which I think was great. I have no intention of upgrading my PC every year unless I need to, and I have no intention of swapping out a perfectly good PSU just to be able to feed a 300+ W graphics card with 5 power connectors.



Yes, many people were forced to buy them.
Anyone who needed a new PC with warranty was forced to, anyone upgrading and wanting warranty, and so on.

We had almost 10 years where if a part failed, you had to risk the second hand market, or buy a new mobo/CPU combo even if you had working parts. There's a reason my 2500K sat in a box for 5+ years: I couldn't get a reliable mobo to use it with, and since buying new still got me a quad core, I kept it in case I DID find a mobo


This GPU wasn't made for the home user, no matter their intended use - it's for those half-height Dell and HP prebuilts that need to reach minimum spec for 'workstation' applications and get the certified professional drivers


----------



## chrcoluk (Mar 8, 2022)

AusWolf said:


> I see what you mean. Though on my end, it's sad to see competition being more important than consumer needs. I get it, competition is good for the market. It's just not good for the final product. I mean, OK, everybody cried about Intel releasing extremely similar quad cores for such a long time, but was anybody forced to buy them? Not really. You could get away with using Sandy or Ivy bridge for a decade, which I think was great. I have no intention of upgrading my PC every year unless I need to, and I have no intention of swapping out a perfectly good PSU just to be able to feed a 300+ W graphics card with 5 power connectors.



Competition can be good, but it can also be bad.

If we go back to the Intel stagnation from Sandy Bridge to the start of Coffee Lake, the benefit of that era was that game devs were forced to rein in their coding to work on what was out there, and people with Sandy Bridge kept their CPUs for the majority of a decade without needing to upgrade.

Now CPUs are progressing rapidly from generation to generation again - and not just the CPU but also the chipset. Tech has accelerated, which basically means PCs will feel obsolete quicker and makes PC gaming more expensive, at least in theory. I have excluded the current pricing crisis, as that seems to have other causes, so my cost argument is based on frequency of upgrading.

With GPUs, it seems the pressure to compete with a competitive AMD is going to lead to a power-hungry 4000 series which might need a new PSU standard. Another cost to add to the equation. It remains to be seen whether PCI Express Gen 3 will be OK on 4080s and 4090s.


----------



## AusWolf (Mar 8, 2022)

Mussels said:


> Yes, many people were forced to buy them.
> Anyone who needed a new PC with warranty was forced to, anyone upgrading and wanting warranty, and so on.


If you want warranty, you always have to buy new parts. That was the case 10 years ago, and that's the case today as well. What I meant is, people on Sandy / Ivy Bridge were settled for a decade. My brother still rocks a first gen Core i5 (Westmere, I think?), and only now is he starting to feel the need to upgrade.



Mussels said:


> We had almost 10 years where if a part failed, you had to risk the second hand market, or buy a new mobo/CPU combo even if you had working parts.


How is that different from any other era in IT history (or even nowadays)?



Mussels said:


> This GPU wasn't made for the home user, no matter their intended use - it's for those half-height Dell and HP prebuilts that need to reach minimum spec for 'workstation' applications and get the certified professional drivers


I get that.  All I'm saying is that it could very well be a consumer (gamer) product too. I see no reason why my new GeForce should require a kilowatt power supply and a new power connector simply because its designation isn't A-something.


----------



## Mussels (Mar 8, 2022)

AusWolf said:


> If you want warranty, you always have to buy new parts. That was the case 10 years ago, and that's the case today as well. What I meant is, people on Sandy / Ivy Bridge were settled for a decade. My brother still rocks a first gen Core i5 (Westmere, I think?), and only now is he starting to feel the need to upgrade.
> 
> 
> How is that different from any other era in IT history (or even nowadays)?
> ...


Question was asked: did intel force you to buy them

The answer was yes, people were forced to buy devices, basically the same, year after year.

School IT department? Every year, new purchases... saaaaame as last year. Except higher prices and more bundled garbage.


----------



## AusWolf (Mar 8, 2022)

Mussels said:


> Question was asked: did intel force you to buy them
> 
> The answer was yes, people were forced to buy devices, basically the same, year after year.
> 
> School IT department? Every year, new purchases... saaaaame as last year. Except higher prices and more bundled garbage.


Who was forced? I wasn't.

My school certainly didn't buy tech every year. They jumped from 486 and Pentium 1 all the way to Core 2 Duo and Quad. I graduated in that era, but I wouldn't be surprised if they still used those PCs even now.

I'm talking about home gaming anyway. The deals Intel cut with companies and other agencies don't concern me.


----------



## MentalAcetylide (Mar 8, 2022)

dyonoctis said:


> Isn't PNY the only authorized manufacturer for Nvidia professional GPU ?


No, but you probably wouldn't be able to outright purchase a Lenovo-brand RTX A5000/6000 from some place like Best Buy or most other places where you would typically buy a PNY or other similar consumer-grade card. To begin with, it's a ball ache for the average Joe like myself to get just one unless you're with a company that orders enterprise systems/parts directly from Lenovo. I'm not 100% certain, but I think one of the main reasons for this is simply the level of quality control. If Lenovo is catering to customers such as the US government, which pays a lot more for better QC on the products it buys in bulk, I don't think they're going to produce extra to sell to the rest of us and expect to make much of a profit from it, since they're not going to make the same product with two different levels of QC.


----------



## Mussels (Mar 9, 2022)

AusWolf said:


> Who was forced? I wasn't.
> 
> My school certainly didn't buy tech every year. They jumped from 486 and Pentium 1 all the way to Core 2 Duo and Quad. I graduated in that era, but I wouldn't be surprised if they still used those PCs even now.
> 
> I'm talking about home gaming anyway. The deals Intel cut with companies and other agencies don't concern me.


Oops sorry, forgot that you as an individual matter more than all other metrics

Yes, many places are forced to buy new hardware. Lucky you for not being one of them.
School students here? Forced to buy laptops  and iPads (yep, MUST be apple) - and if intel release shit, you get shit for your money.


----------



## AusWolf (Mar 9, 2022)

Mussels said:


> Oops sorry, forgot that you as an individual matter more than all other metrics
> 
> Yes, many places are forced to buy new hardware. Lucky you for not being one of them.
> School students here? Forced to buy laptops  and iPads (yep, MUST be apple) - and if intel release shit, you get shit for your money.


We never had any Apple stuff in my school!  Maybe I would have grown to like it a bit more if we did. We'll never know now.


----------



## Selaya (Mar 9, 2022)

Okay, a different question since apparently it hadn't been asked before but:
As w/ all Quadros this uses the Quadro driver, yes? Any issues/things of note w/ them?

Game benchmarks seem to have run fine. I guess the GeForce Experience's out of the question (not that I care, but yea), or is there some sorcery & witchcraft you could do w/ nvcleaninstall to convince that it's actually a GeForce card or something?


----------



## LupintheIII (Mar 9, 2022)

Selaya said:


> Okay, a different question since apparently it hadn't been asked before but:
> As w/ all Quadros this uses the Quadro driver, yes? Any issues/things of note w/ them?
> 
> Game benchmarks seem to have run fine. I guess the GeForce Experience's out of the question (not that I care, but yea), or is there some sorcery & witchcraft you could do w/ nvcleaninstall to convince that it's actually a GeForce card or something?


Yes, Quadro drivers, no GeForce Experience (which to me is a benefit, also because after Nvidia got hacked there are fake drivers floating around).


----------



## Mussels (Mar 9, 2022)

AusWolf said:


> We never had any Apple stuff in my school!  Maybe I would have grown to like it a bit more if we did. We'll never know now.


Big long story incoming about dealing with decades of forced hardware choices. Bonus cookies for anyone who can actually read it all.


*Work part:*
Once a school or business gets offered cheaper prices to go exclusive, they do. Always. The true cost is that it means that's all those people learn to use: and it has a long, ongoing snowball effect (look at how graphics designers think mac is the only way, because those art schools got their cheap macs - and then the students had to sell kidneys to get the software for themselves later)

I wont go into details (legally cant) but i've worked at pizza stores and burger stores as IT support, and we were contractually forced to buy certain hardware only, from certain brands only - and the entire store at a time. No single purchases. Owners didn't usually notice that part,

Running decades old linux variants off 400Mhz Pentium's in 2021 was... Look i still have trauma. Not like it was forced over a few hundred stores or anything, nah cant possibly be.
(When a store upgraded one machine they had to upgrade ALL machines - so the old ones got traded around to try and prevent stores needing full upgrades)
We had a store get new hardware and the new OS that came with it, and rather than train the old staff for the new software they fired everyone and trained new people because the old system was just too far apart from the new one, it was impossible to unlearn.



Spoiler: longer education/school story



Small country town with literally two high schools, across the road from each other. If one school fell behind in something - they just shared.

My generation in that town was forced to use either Dell pieces of poop (Mmmm pentium 4), or apple iMac G3's.
We had two high schools next to each other, both forced by contract to use the one supplier only.
Catholic college had to use Apple, the government one had to use Dell.
Us aussies had invented wifi back then, but it wasnt commercially available - so no laptops or ipads in that era.

What this meant is the catholic college kids grew up using apple and it's all they knew, while the public school kids grew up using windows.
Oh, except that the catholic college had to send their kids over to our school to ~~break~~ use the dells, because the macs couldn't get any spare parts, and were not compatible with the early online systems and word processing apps of the era the government required.
This was pre-wikipedia, pre google, pre wifi... look i'm old. You wanted information? You had to fire up Encyclopedia Britannica, search the info, print it, quit EB, open up... was it still word and office back then? retype it, print it, hand it in. It was rather shite.
So we had one school with working computers and one with MSpaint and CD-ROM drives, but no software - because you had to buy apples approved specialty mac software... which was simply not available, or really highly priced (cheap hardware, expensive software)

So y'know - an entire school forced by contract to use computers that didn't work with any of the required software or systems. Most of the kids from that apple school i still talk to gave up on tech and can't even do a copy paste. Yeah, it's f*cking sad and i wouldn't believe it if they didn't keep asking me for help.

Why does this happen? Because companies give out huge discounts and rebates if you buy into their ecosystem exclusively. Apple will give their stuff out free to schools if it means kids grow up using it, so their parents buy it at home, and they get a lifetime of software purchase income. Once they spend money and find it cant be transferred OUT of that ecosystem, they tend to stick around even longer.

Skipping on the decades, we had '*One Laptop per Child Australia'* which of course used the cheapest shittiest laptops out there. These things were under spec and cheaply made - they did technically do what was required for the classrooms but they were fragile, broke easily on their own and the moment the software requirements changed (like adding in a mandated bloated antivirus like mcafee or nortons) they became near useless.

Following that, some schools thought: oh, let's use something EASIER and just get a bulk deal for ipads.
Apple gave huge discounts, locked the ipads into school accounts only with total oversight (so zero resale value, locked and disabled outside the agreed upon apps and initial school accounts)
kids bring em home... find they cant do half what they need. Oh sure they can login to submit their work, but they'd have to do the work on a fully working PC or mac, email it to the ipad , then submit it to the school websites... except half the time they couldnt do that since apple didnt allow apps to share files, and had no file browser.

Every year, they had to buy the new series. If kids broke any the previous year (they're kids, they broke tons) they couldn't be replaced with the original - making classes split between different tech, needing different instructions for the kids.
So that turned fast into every year, every kid in every class needs the new model ipad (i swear we did this with the Ti-8x calculators in my day too) at school prices - and if they managed to keep it working, they couldnt log out of the school account and use it for personal use anyway - and within a few years it'd suddenly stop getting updated apps and become useless for the school approved apps anyway.




My kid's 8, he's been trained (via gaaaaames) to use iOS, windows 7 10 and 11, Edge, chrome, firefox, discord, and basic file browsing/copy-pasting.

Their solution this year? One country-wide app that only works on iOS and Android (phone only; no tablet, PC, or web browser support) that is a glitchy fucking mess. I'm meant to download homework, print it, have him complete it, and upload photos of it.




TL;DR: Tech companies play politics forcing sales of what they want sold, and when people grow up using it or spend their entire education using it - they're locked in for life.


----------



## Nater (Mar 10, 2022)

^  Yep.  Microsoft bought out Michigan State University back in 2000/2001.  I spent my first year in CSE learning on Sun Unix systems.  2nd year it was "here's a free volume license of Windows XP, and Visual Studio"  ALL new PC's in the entire engineering building.  I was totally lost.

In other news - still no ETA on my PNY A2000 from ShopBLT ordered months ago.  I cancelled my A4000 order the other day after it was pushed back to July.  I'm content to sit and wait now, the A4000 is in stock everywhere else with prices crashing.  $1600 a little over a month ago, and I'm seeing brand new Dell pulls for $1150 on e-bay now.  NiB cards for $1250-$1300.


----------



## uuee (Mar 14, 2022)

Available for 880EUR in Hungary, while 3060 starts at around 600EUR. 



Mussels said:


> We had almost 10 years where if a part failed, you had to risk the second hand market, or buy a new mobo/CPU combo even if you had working parts.


We have a very strong second hand market here (much more reasonable prices than ebay), so it was never a problem.



Mussels said:


> School IT department? Every year, new purchases... saaaaame as last year. Except higher prices and more bundled garbage.


In the 3rd world, schools don't have such contracts. Instead it meant that their purchases hadn't become obsolete by the time they got the resources for another upgrade.


----------



## omerfak (Apr 6, 2022)

What I'm surprised by is that no game in your benchmark suite requires more than 6 GB of VRAM, even at 4K. Is that normal? Do no games actually require more than 6 GB of VRAM, or am I missing something, like the card not being powerful enough for the VRAM to be a limiting factor?


----------



## lexluthermiester (Apr 6, 2022)

omerfak said:


> or am I missing something


Yup. A lot of games will run on 6GB. They just don't run as well as they would on a card with more VRAM.


----------



## omerfak (Apr 6, 2022)

lexluthermiester said:


> Yup. A lot of games will run on 6GB. They just don't run as well as they would on a card with more VRAM.


I watched a few videos saying even 8 GB cards throttle in some games at 4K with max textures, like Doom Eternal. So I assumed 6 GB would be an issue at 4K. Not that the A2000 is a 4K card, but with DLSS it's possible to get decent performance.


----------



## lexluthermiester (Apr 6, 2022)

omerfak said:


> So I assumed 6 GB would be an issue at 4K.


It really would, unless you turn some settings down. The A2000 should really be considered a great 1080p raytracing card for small-form-factor PCs if used for gaming, though that is not its primary intended focus from NVidia.


----------



## W1zzard (Apr 6, 2022)

omerfak said:


> but with dlss


DLSS uses less memory than native 4K due to its lower render resolution, so that shouldn't be a problem
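For reference, a quick sketch of the VRAM argument; the 2/3 per-axis factor below is the commonly cited render scale for DLSS Quality mode (other modes render even lower) and is an assumption here, not a number from the review:

```python
# DLSS renders internally at a reduced resolution, then upscales.
# Quality mode uses roughly a 2/3 scale per axis (assumed here).
OUTPUT_W, OUTPUT_H = 3840, 2160   # 4K output
SCALE = 2 / 3                      # assumed DLSS Quality per-axis scale

render_w = round(OUTPUT_W * SCALE)
render_h = round(OUTPUT_H * SCALE)
pixel_ratio = (render_w * render_h) / (OUTPUT_W * OUTPUT_H)

print(f"internal render resolution: {render_w}x{render_h}")  # 2560x1440
print(f"pixels vs native 4K: {pixel_ratio:.0%}")             # ~44%
```

At roughly 44% of the native 4K pixel count, render targets and many intermediate buffers shrink accordingly, which is why VRAM pressure drops.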


----------



## LowProfileDegenBuild (Apr 7, 2022)

lexluthermiester said:


> It really would, unless you turn some settings down. The A2000 should really be considered a great 1080p raytracing card for small-form-factor PCs if used for gaming, though that is not its primary intended focus from NVidia.


That's good to know thank you


----------



## The red spirit (Jul 11, 2022)

Mussels said:


> School students here? Forced to buy laptops  and iPads (yep, MUST be apple) - and if intel release shit, you get shit for your money.


WTF is this shit? Somebody should sue them ASAP. Outside of IT class, students don't need a computer at all and definitely shouldn't be lugging one around either. And any attempt to force students into a single brand should guarantee some jail time and permanent removal of the permit to work in education.


----------



## Mussels (Jul 11, 2022)

The red spirit said:


> WTF is this shit? Somebody should sue them ASAP. Outside of IT class, students don't need a computer at all and definitely shouldn't be lugging one around either. And any attempt to force students into a single brand should guarantee some jail time and permanent removal of the permit to work in education.


Try this one from the UK
Student Punished for Arriving With 93 Percent Battery in Her iPad at UK School (news18.com)

(But lets keep this on topic, my fault for derailing originally)


----------



## qubit (Jul 11, 2022)

Mussels said:


> Try this one from the UK
> Student Punished for Arriving With 93 Percent Battery in Her iPad at UK School (news18.com)
> 
> (But lets keep this on topic, my fault for derailing originally)


That's insane! 

Sorry, won't say anything more about this.


----------



## SOAREVERSOR (Jul 11, 2022)

The red spirit said:


> WTF is this shit? Somebody should sue them ASAP. Outside of It class student's don't need any computer at all and definitely shouldn't be lugging one either. And any attempt to force students into some single brand should guarantee some jail time and permanent removal of permit to work in education.



People have been forced into a brand of calculator and other items for decades. That's how it works: the school standardizes on item X and you need to go out and get it.

Like it or not, the iPad has taken over the tablet market. And for higher-end art classes, compsci, or straight science, the MacBook Pro is what's supported at most schools now, and a Windows laptop means get thine ass back to the lower-level classes with the stupids.


----------



## 95Viper (Jul 11, 2022)

Keep it on topic!
Stop the off-color remarks (insults).


----------



## Nater (Jul 12, 2022)

So apparently I forgot to cancel one of my RTX A2000 orders.  It showed up in the mail the other day.  $582 new-in-box, figure I'll go with it.
And yeah, it's twice as fast as an RTX 3080 w/ reg hack in SolidWorks 2022 w/ FSAA and Enhanced Graphics option turned on.


----------



## Tomgang (Sep 18, 2022)

So now I have had the pleasure of trying out the RTX A2000 this weekend. A pleasant experience.

I have only tested one game (Wolfenstein II: The New Colossus) so far, as I have only had time for gaming and testing this weekend. But I will share my personal experiences.

Performance is definitely good for the size and form factor. I have compared it to my old GTX 1650 with GDDR5 memory, a low-profile cooler, and a 75 W TDP rating. So the slowest original GTX 1650 version, but since that card also has a low-profile cooler and a similar TDP rating, it's a very fair comparison.

So with game settings at 1440p and medium, my GTX 1650 could barely manage 90 FPS at best. I could not go higher due to the VRAM limit. The RTX A2000 at 1440p with all settings at high or ultra manages at least 110 FPS, hovered between 120 and 130 FPS most of the time, and hit up to 145 FPS at best. So a significant jump, with even 5 W less used.

Fan noise was quite pleasant as well. Idle is 3000 RPM, and at 100 % fan speed my card goes to 6500 RPM. In normal use it settles in at around 4000 RPM, or 50 % fan speed, with the temperature at around 72 to 75 degrees Celsius. You can manually lock the fan to a given speed it will keep, from 30 % to 100 %.

Overclocking is quite good. In the game I tried, I dialed in +280 on the GPU core clock and +1300 on the memory in MSI Afterburner. That results in the GPU clock jumping from 1170 MHz to between 1300 and 1350 MHz just by OCing it, despite the power limit. In an area where I got 130 FPS at stock, the OC raised that to 141 FPS. So there is some OC potential in this card. I might have to dial this down, as it might not be stable in other games.

The power target can be adjusted from 100 % down to 14 %, or from 70 W down to 10 W. But despite that, the card can't be limited to less than 45 W: at around a 60 % power target or lower, it will keep pulling 45 W. You can still save around 25 W of power by lowering the power target, though. And you can't go above 70 W; it's locked.
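As a quick sanity check on those numbers (assuming the 70 W ceiling and 45 W floor above):

```python
# Power-target arithmetic from the figures above: 70 W maximum, 45 W floor.
max_w, floor_w = 70, 45

# The floor expressed as a power-target percentage; below this, draw stops falling.
floor_pct = round(floor_w / max_w * 100)
print(floor_pct)  # 64, matching the "around 60 % or lower" observation

# Maximum power savings available from lowering the target.
print(max_w - floor_w)  # 25 W
```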

The good things:
Fast for a mini-ITX-sized card.
Very power efficient.
Despite the high idle fan RPM, idle is close to silent. At 4000 RPM you can hear it, but it's far from annoying.

The bad things:
It's expensive.
I wish NVIDIA had used all 75 W PCIe can deliver, to give the card as much power headroom as possible.
8 GB of VRAM would have been more fitting for this card, at least when we talk about gaming.
6 GB of VRAM is still a limiting factor for this card.

So if you can live with the price and you need a small card for your mini-ITX build: yes, I will recommend the card. It's definitely going to be a nice upgrade for me over the GTX 1650.


----------



## Lew Zealand (Sep 19, 2022)

Thanks for the report, that was great.  I have a slot-power-only GTX 1050 Ti and it prefers to draw about 68.5 W max, though it will occasionally stray to 69 W and very rarely momentarily touch 70 W.  I assume your A2000 does the same.  My card is already exiting its most efficient range, so those extra watts wouldn't make much of a difference, but your card is likely operating well within its max-efficiency range, so every extra watt would have been a nice boost.  Even so, the times when those extra FPS are actually noticeable in-game will probably be quite rare, and +280/1300 is already pretty nice.

What CPU/mobo are you using?  Mine is just in an OptiPlex 9020, i7-4790, 16 GB Dell RAM, so losing a little efficiency on my 1050 Ti is a minor thing, as the rest of the system fails to be beastly.


----------



## AusWolf (Sep 19, 2022)

Lew Zealand said:


> Thanks for the report, that was great.  I have a slot power only GTX 1050 Ti and it prefers to draw about 68.5W max though it will occasionally stray to 69W and very rarely will momentarily touch 70W.  I assume your A2000 does the same.  With my card, it's already exiting it's most efficient range so those extra W wouldn't make much of a difference but your card is likely operating well within its max efficiency range so every extra W would have been a nice boost.  Even so, the times when those extra FPS will be actually noticeable in-game will probably be quite rare and +280/1300 is already pretty nice.
> 
> What CPU/Mobo are you using?  Mine is just in an Optiplex 9020, i7-4790, 16GB Dell RAM so losing any little efficiency on my 1050 Ti is a minor thing as the rest of the system fails to be beastly.


Somebody correct me if I'm wrong, but I think those cards only touch south of 70 W, even with an official TDP of 75 W, to accommodate potential power spikes (70 W being an average, not a maximum). If the average consumption were 75 W with spikes of up to 80-85 W, the PCIe slot wouldn't be able to handle it. By spikes, I mean the kind that last milliseconds and so can't be detected by software.


----------



## qubit (Sep 19, 2022)

AusWolf said:


> Somebody correct me if I'm wrong, but I think those cards only touch South of 70 W even with an official TDP of 75 W to accommodate any potential power spikes (70 W is an average - not maximum). If you had an average consumption of 75 W with spikes of up to 80-85 W, the PCI-e slot wouldn't be able to handle it. By spikes, I mean the kind that last milliseconds and so can't be detected by software.


Indeed, graphics card manufacturers have to be very careful when pulling power from only the PCIe slot, as drawing more could damage the motherboard. There's so much more headroom and leeway with a PCIe power connector.


----------



## Tomgang (Sep 19, 2022)

Lew Zealand said:


> Thanks for the report, that was great.  I have a slot power only GTX 1050 Ti and it prefers to draw about 68.5W max though it will occasionally stray to 69W and very rarely will momentarily touch 70W.  I assume your A2000 does the same.  With my card, it's already exiting it's most efficient range so those extra W wouldn't make much of a difference but your card is likely operating well within its max efficiency range so every extra W would have been a nice boost.  Even so, the times when those extra FPS will be actually noticeable in-game will probably be quite rare and +280/1300 is already pretty nice.
> 
> What CPU/Mobo are you using?  Mine is just in an Optiplex 9020, i7-4790, 16GB Dell RAM so losing any little efficiency on my 1050 Ti is a minor thing as the rest of the system fails to be beastly.


Yes, the RTX A2000 pulls the exact same wattage as your GTX 1050 Ti: around 67 to 69 watts.

I am using the card in a dual system, meaning two computers in one case. The A2000 is hooked to a Ryzen 5 5600X CPU, an ASUS ROG Strix B550-I Gaming mini-ITX motherboard, and 32 GB of DDR4 RAM in a custom dual-system build. You can see my system under "project logs".


----------



## kraiggers (Nov 10, 2022)

I'm trying to determine if I can use this card to drive two 5K displays (with TB3 inputs).  The review says 4x 4K is supported, but this PDF from NVIDIA says 4x 5K60 is ALSO supported.  Which one is mistaken, the article or the info sheet?  Anyone have any real-world experience with 5K on this card?


----------



## Mussels (Nov 11, 2022)

kraiggers said:


> I'm trying to determine if I can use this card to drive two 5k displays (with TB3 inputs).  The review says 4x 4k are supported, but this PDF from Nvidia says 4x 5k60 is ALSO supported.  Which one of those is mistaken, the article or the info sheet?  Anyone have any real-world experience with 5k on this card?


Trust your NVIDIA source; it says this:

*(NVIDIA spec-sheet screenshot not preserved)*

The fine print says you'll need two DP 1.4a monitors to do it.


----------



## kraiggers (Nov 11, 2022)

Mussels said:


> the fine print says you'll need two DP 1.4a monitors to do it



Well, I can't find a 5K display that I like and can afford two of.  The Apple Studio Display is *baller* but doesn't really play that nicely with Windows/PC.  The LG UltraFine is… not well made, even though the panel is great.  I'll just use the LG ultrawide 5K2K I already have.  It's really nice, even though it doesn't have the 220 ppi of a good 5K display.

What I really want is a 37"-diagonal 21:9 display, 7680 (8K) wide by 2880 (3K) tall, at 220 ppi.  That would be _*PERFECT*_.  But this does not exist, as far as I can tell...


----------



## lexluthermiester (Nov 11, 2022)

kraiggers said:


> Well, I can't find a 5k display that I like and can afford two of.


Why 5k? Is 4k not a large enough resolution for what you do?



kraiggers said:


> What I really want, is a 37" diagonal 7680 8k wide by 2880 3k tall 21:9 display, at 220 ppi. That would be _*PERFECT*_. But, this does not exist, as far as I can tell...


Video editing maybe?


----------



## kraiggers (Nov 12, 2022)

lexluthermiester said:


> Why 5k? Is 4k not a large enough resolution for what you do?
> 
> Video editing maybe?


No video editing. I just really like big high-DPI screens. Ideally 220 dpi.

For a mid-30s-ish ultrawide at 220 dpi, do some math and you end up at 8K3K. I'd pay $3-4k for that. (I mean, preferably less, but lots of pixels costs $$.)

Integrated GPUs can now drive full 8K over DisplayPort 1.4, so it's just cost and market problems.
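The "do some math" step can be sketched quickly (a check using the 220 ppi and 8K3K figures from the post):

```python
import math

def diagonal_inches(width_px: int, height_px: int, ppi: float) -> float:
    """Diagonal screen size implied by a resolution at a given pixel density."""
    return math.hypot(width_px, height_px) / ppi

# 8K3K (7680 x 2880) at 220 ppi works out to roughly a 37-inch diagonal,
# i.e. the mid-30s ultrawide described above.
print(round(diagonal_inches(7680, 2880, 220), 1))  # 37.3
```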


----------



## lexluthermiester (Nov 12, 2022)

kraiggers said:


> no video editing. I just really like big high dpi screens. Ideally 220 dpi.
> 
> For a mid 30s-ish ultrawide, at 220 dpi, do some math, and you end up at 8k3k. I’d pay $3-4k for that. (I mean preferably less, but lots of pixels costs $$).
> 
> integrated gpus can now drive full 8k over DisplayPort 1.4, so it’s just cost and market problems.


Fair enough. If you want high DPI, get yourself a 32" 4k display. That should satisfy you.


----------



## Nater (Nov 13, 2022)

lexluthermiester said:


> Fair enough. If you want high DPI, get yourself a 32" 4k display. That should satisfy you.


Doubt it.  The only thing that comes close to what he's wanting is the 32" 6K Retina display from Apple or the 8K screen from Dell.  At the rate things are going, I don't think we'll see the monitor he wants within the next 4 years, if at all.  And if you're buying $4000 monitors, why are you dinking around w/ a $450 entry-level workstation card?

*edit* almost forgot, found this neat little calculator for Size/Resolution/PPI:
Screen size calculator · toolstud.io


----------



## lexluthermiester (Nov 13, 2022)

Nater said:


> Doubt it.  The only thing that comes close to what he's wanting is the 32" 6K Retina Display from Apple or the 8K screen from Dell.  The rate things are going I don't  think we see the monitor he wants within the next 4 years, if at all.  And if you're buying $4000 monitors, why are you dinking around w/ a $450 entry level workstation card?
> 
> *edit* almost forgot, found this neat little calculator for Size/Resolution/PPI
> 
> ...


My point was that for the size and resolution it would be close enough and good for what they want, though a 27" 4K display would be closer. The only display that gets really close would be the following:

LG 27MD5KL-B Ultrafine 27" IPS LCD 5K UHD Monitor - Newegg.com

27" 5120x2880 at 217 PPI
https://toolstud.io/video/screensiz...unit=inch&resolution_w=5120&resolution_h=2880
But it's also $1300 and the pixel response time is 14 ms! I wouldn't touch that if they paid me.


----------



## Mussels (Nov 13, 2022)

32" 4K is great.

I run at 150 % scale to get exactly (and it is mathematically exact) the same size as my 1440p 32" display.

At 100 % scale, everything's too damn small to read and use anyway; higher DPI doesn't benefit you at that level. (And it's why 28" and smaller 4K displays are useless: you can't use them without the high-DPI scaling negating a lot of the benefits.)
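The "mathematically exact" bit checks out; 4K at 150 % scaling yields the same effective workspace as native 1440p:

```python
# Effective (logical) resolution of a 3840x2160 panel at 150 % display scaling.
scale = 1.5
print(3840 / scale, 2160 / scale)  # 2560.0 1440.0, identical to native 1440p
```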


----------

