
More GeForce GTX 465 Details Surface

btarunr

Editor & Senior Moderator
More details have surfaced about NVIDIA's upcoming GPU at the low end of the GeForce GTX 400 series, the GTX 465. Chinese website eNet.com.cn published some lesser-known specifications about the GPU which, pieced together with known details, more or less complete the picture. The GTX 465 has not four, but five streaming multiprocessors (SMs) disabled from the GF100 core, yielding 352 CUDA cores (against the earlier known number of 384). With a 256-bit wide GDDR5 memory interface, the GPU has 32 of its 48 ROPs enabled, and makes use of 1 GB of memory.
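As a sanity check on those core counts: GF100 ships with 16 SMs of 32 CUDA cores each, so the reported figures follow directly from how many SMs are fused off. A quick sketch:

```python
# GF100 die layout: 16 streaming multiprocessors (SMs), 32 CUDA cores each.
CORES_PER_SM = 32
TOTAL_SMS = 16

def cuda_cores(disabled_sms):
    """CUDA cores remaining after fusing off the given number of SMs."""
    return (TOTAL_SMS - disabled_sms) * CORES_PER_SM

print(cuda_cores(5))  # GTX 465 with 5 SMs disabled -> 352
print(cuda_cores(4))  # the earlier rumored config  -> 384
```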

In the few benchmarks the GTX 465 was put through, it scored 5488 points in the eXtreme preset of 3DMark Vantage, which is roughly 20% less than what a GTX 470 would manage. It is said to have outperformed the ATI Radeon HD 5870 in Far Cry 2, while it was slower than the Radeon HD 5830 in Crysis Warhead. The GeForce GTX 465 will be launched on the 3rd of June, at the Computex 2010 event.



Goody Goody, I've been waiting for a while.

Interesting how it's low-end
 
Looks like a lemon... SLOWER than a 5830 in anything is bad.
 
lol @ binge, I was just gonna say I hope it's cheap as chips or it isn't going far ;)
 
hahahah faster than an HD 5870 in Far Cry 2, hahaha
 
Is it just me or do these new shaders perform worse clock for clock than those on the 2xx series? Clock speeds are about the same.
 
Too crippled. I'll pass.
 
Swinging the old Far Cry 2 bat again; that game is so Nvidia-optimized up the ass it's not worth shit. A real game like Crysis, which demands absolute GPU power, separates the men from the boys.

If it is priced correctly, at around £140 here in the UK, it may do well, but as Nvidia/AIBs and their resellers in the UK are currently ripping off consumers with their extortionate pricing, I very much doubt it and can see it coming in at £200 or above, in which case they can take a flying f**k.
 
Swinging the old Far Cry 2 bat again; that game is so Nvidia-optimized up the ass it's not worth shit. A real game like Crysis, which demands absolute GPU power, separates the men from the boys.

If it is priced correctly, at around £140 here in the UK, it may do well, but as Nvidia/AIBs and their resellers in the UK are currently ripping off consumers with their extortionate pricing, I very much doubt it and can see it coming in at £200 or above, in which case they can take a flying f**k.

The problem is that it's priced like a 5850...

http://www.overclockers.co.uk/showproduct.php?prodid=GX-215-AS&groupid=701&catid=56&subcat=411

Problem no. 2:
It uses way more power than a 5850, and more than a 5870??

GG!
 
Looks like a lemon... SLOWER than a 5830 in anything is bad.

When life gives you lemons, make lemonade! I think the question we're all asking ourselves is: "Will it blend?" :pimp:
 
v.good card for mario bro.
 
why the hell do they even go with 256-bit GDDR5 if they're going to clock it that horribly slow? I really don't understand this; it isn't really hard or expensive to put 5 Gbps GDDR5 on these cards like the 5850s have... sheez

this thing won't end up much faster (if at all) than a GTX 285; in fact, maybe even slower at first. I guess we'll see, but at this stage it looks like a useless card, unless drivers work serious magic.
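On the bus-width point above: peak memory bandwidth is just the bus width (in bytes) times the effective data rate, so a 256-bit bus is only as good as the clocks behind it. A rough sketch; the GTX 465's actual memory clock isn't in the article, so the 3.2 Gbps figure below is an assumption purely for illustration:

```python
def bandwidth_gb_s(bus_width_bits, effective_gbps):
    """Peak memory bandwidth in GB/s: (bus width / 8 bytes) * data rate (Gbps)."""
    return bus_width_bits / 8 * effective_gbps

# Illustrative data rates, not confirmed specs:
print(bandwidth_gb_s(256, 3.2))  # slowly clocked 256-bit GDDR5 -> ~102 GB/s
print(bandwidth_gb_s(256, 4.0))  # HD 5850-class 4.0 Gbps       -> 128 GB/s
```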
 
I just facepalmed. What is the point of this? Okay, it's for low-end power consump... I'm done with this card
 
But can it play Lemmings?
 
These Fermi cards are looking more and more FAIL with every card that comes out. :(
 
Swinging the old farcry 2 bat again, that game is so Nvidia optimized up the ass it's not worth shit, a real game like crysis that demands absolute GPU power, separates the men from the boys.

And by "separates the men from the boys" you really just mean you picked it because it is the exact opposite of Far Cry 2, being ATi-optimized up the ass until it's not worth shit, right? :laugh:

I think I'll wait for a W1z review that gives an overall impression of the card, and not just two games, before passing judgement.

Hell, the GTX 470 actually turned out to be a decent card, but no one cared until W1z did the review; everyone just lumped it in with the horribly power-hungry GTX 480.
 
I just facepalmed. What is the point of this? Okay, it's for low-end power consump... I'm done with this card

Point is to use all the dud Fermis. Wait for the GTX 460, it'll be "better". Anyhow, I'm almost done with ATI; you simply can't buy a 5850, 6 months after release they aren't available (for a decent price). So I might just get one of these or a GTX 460.

But the wait.. ..come on be June already :)
 
Guess the performance will be superior to, but very close to, an HD 5830, becoming the substitute for the GTX 275. nVidia is smartly filling the price/performance gaps between ATi's offerings.
 
If it is priced correctly, at around £140 here in the UK, it may do well, but as Nvidia/AIBs and their resellers in the UK are currently ripping off consumers with their extortionate pricing, I very much doubt it and can see it coming in at £200 or above, in which case they can take a flying f**k.

The cheapest 5830 I can find is £180, so that's the price point it will try to compete at.
I'll pass
 
i love when the details surface *waves hands* and disappear with a smoke bomb
 
The problem with all these "disabled" GPUs is that they get less and less efficient. Less performance per watt. Less green. Less cool. Less interesting.

It's like taking a top-of-the-line V12 engine, then making cheaper cars by disabling cylinders. It just doesn't work. Yes, you get a weaker performer at a lower price point, but at the same time it's less efficient to boot.

Fermi is failing. While the top-end might have had its "top player" credits, these weaker siblings are embarrassing. Really, a 200 W card with only mid-range performance. Nasty.
 
So how much power do you think this card will draw? 65 80?
 
The problem with all these "disabled" GPUs is that they get less and less efficient. Less performance per watt. Less green. Less cool. Less interesting.

It's like taking a top-of-the-line V12 engine, then making cheaper cars by disabling cylinders. It just doesn't work. Yes, you get a weaker performer at a lower price point, but at the same time it's less efficient to boot.

Fermi is failing. While the top-end might have had its "top player" credits, these weaker siblings are embarrassing. Really, a 200 W card with only mid-range performance. Nasty.

You sure about that? Because the GTX 470 is one of those "disabled" GPUs, and it has better performance per watt than the GTX 480, is more green (uses less power), and is cooler (when fan speed and noise are the same).

And hey, while we are on the subject, the HD 5850 is one of these "disabled" GPUs too. Let's look at it. Yep, more performance per watt than the HD 5870, more green (uses less power), and cooler (again at the same fan speed/noise level).

Hmmm...

And really, the GTX 470 is probably the least embarrassing of the Fermi cards right now, with very reasonable performance-per-watt numbers actually. If you set aside the simply amazing HD 5000 series, which goes way beyond what has been considered normal for performance per watt up until now, and look at all the other cards in recent history, the GTX 470 is pretty good for a top-tier card. It beats out the previous generation's top-tier cards, actually: pretty much the entire HD 4800 series in performance per watt, and the GTX 200 series. That probably would have been considered pretty damn good if it wasn't for the HD 5000 series simply rocking in power consumption.
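For anyone wanting to make that comparison concrete, performance per watt is just an average-FPS-to-board-power ratio. The numbers below are made-up placeholders to show that a cut-down card can come out ahead, not measured results for any real card:

```python
def perf_per_watt(avg_fps, board_power_w):
    """Frames per second delivered per watt of board power."""
    return avg_fps / board_power_w

# Placeholder figures purely for illustration, not real measurements:
full_die = perf_per_watt(60.0, 250.0)  # hypothetical full-die card
cut_down = perf_per_watt(50.0, 200.0)  # hypothetical cut-down card
print(cut_down > full_die)             # a cut-down part can still be more efficient
```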
 