# ASUS Designs Own Monster Dual-GTX 285 4 GB Graphics Card



## btarunr (May 28, 2009)

ASUS has just designed a new monster graphics card that breaks the mold of the reference GeForce GTX 295 design, called the ASUS MARS 295 Limited Edition. Although the card retains the "GeForce GTX 295" name, the same device ID, and compatibility with existing NVIDIA drivers, it carries two huge innovations from ASUS that go far beyond yet another overclocked GeForce GTX 295: the company used two G200-350-B3 graphics processors, the same ones that make up the GeForce GTX 285. The GPUs have all 240 shader processors enabled, along with the complete 512-bit GDDR3 memory interface. This dual-PCB monstrosity holds 32 memory chips and 4 GB of total memory (each GPU accesses 2 GB of it). Apart from these changes, each GPU system uses the exact same clock speeds as the GeForce GTX 285: 648/1476/2400 MHz (core/shader/memory).






Each PCB holds 16 memory chips, a 6-phase digital PWM power circuit that draws auxiliary power from an 8-pin PCI-E power connector, the GeForce GTX 285-class GPU, and its companion NVIO2 processor. The PCB holding the PCI-Express bus interface also holds the bridge chip. ASUS broke away from the nForce 200 chip and is instead using a yet-to-be-disclosed third-party bridge chip; PLX and IDT are two likely sources for such a chip. The memory consists of high-density 0.77 ns chips made by Hynix.



 

 

Electrical management on each PCB is handled by a Volterra VRM controller that supports the I2C interface, which means the card supports software voltage control, potentially a big plus for ASUS' increasingly popular Voltage Tweak feature. A fused power circuit provides over-current protection while also facilitating extreme overclocking.



 



Internally, the cooler has the same basic construction as the reference cooler, using a single leaf-blower. The card spans two expansion slots and is slightly taller than the reference design. ASUS also used slightly longer internal bridges that make more room for third-party coolers and the like. Our source at ASUS EMEA ran a quick 3DMark Vantage test demonstrating the card's seamless compatibility with existing drivers, along with a significant performance boost over existing GTX 295 cards. Being Quad-SLI capable, this card finally makes (effective) GeForce GTX 285 quad-SLI possible, making for the most powerful desktop multi-GPU setup ever conceived. ASUS designed this card despite pressure from NVIDIA, which enforces a rigid policy restricting partners from custom-designing the GeForce GTX 295. If everything goes smoothly through development, the card might make it to a gala launch at Computex.



 



*View at TechPowerUp Main Site*


----------



## kyle2020 (May 28, 2009)

Is it just me or did this seem to happen "over night"? No prior warning, nothing?!


----------



## Disparia (May 28, 2009)

Bout time we saw something unique. Go Asus!

nVidia, know your place! Get back to the fab and cook me more GPUs!


----------



## El_Mayo (May 28, 2009)

what's the point?
will the extra... 2-ish GB ram actually give any performance increase?


----------



## Cheeseball (May 28, 2009)

ASUS has created a god among video cards.


----------



## kyle2020 (May 28, 2009)

Imagine the trouble 32bit users would have running this.


----------



## El_Mayo (May 28, 2009)

kyle2020 said:


> Imagine the trouble 32bit users would have running this.



how come you say that?


----------



## btarunr (May 28, 2009)

El_Mayo said:


> what's the point?
> will the extra... 2-ish GB ram actually give any performance increase?



Crank up the texture quality in GTA4, and watch that GTX 285 1GB cry.


----------



## El_Mayo (May 28, 2009)

btarunr said:


> Crank up the texture quality in GTA4, and watch that GTX 285 1GB cry.



oh.. amount of memory directly relates to texture quality?


----------



## kyle2020 (May 28, 2009)

El_Mayo said:


> how come you say that?



32bit OS's use a max of 3.2GB (I think?) of ram, and that includes GPU ram, so if you have 2GB of ram and then a 4GB card in, it may cause conflicts.

Anyone feel free to correct me if Im wrong.


----------



## Necrofire (May 28, 2009)

El_Mayo said:


> how come you say that?



32-bit OS = 4GB max of memory address space. If you had, say, 2GB of ram, then there's only 2 left for addressing, which means that your shiny gpu can use a max of 2GB of ram.
If you had 4GB of ram, then to fit all that vram in, it takes away from system ram, leaving you with way less than 3GB most people get from a 32-bit OS with 4GB of ram.

Typed in a hurry, does it make sense though?
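Necrofire's arithmetic can be sketched as a quick back-of-the-envelope calculation. The numbers are illustrative only, and this assumes the worst case where all of the card's video memory is mapped into the 32-bit physical address space; real drivers typically map only a smaller aperture, as a later post in this thread points out.

```python
GiB = 1024 ** 3

address_space = 2 ** 32          # 4 GiB addressable by a 32-bit OS
video_ram     = 4 * GiB          # this card's total VRAM (2 GB per GPU)

# Worst case: carve the whole VRAM mapping out of the address space,
# leaving this much room for system RAM.
usable_ram = max(0, address_space - video_ram)
print(usable_ram / GiB)          # 0.0 -- nothing left for system RAM

# With a typical 1 GB card instead:
usable_ram = max(0, address_space - 1 * GiB)
print(usable_ram / GiB)          # 3.0 -- the familiar "~3 GB usable" figure
```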


----------



## El_Mayo (May 28, 2009)

kyle2020 said:


> 32bit OS's use a max of 3.2GB (I think?) of ram, and that includes GPU ram, so if you have 2GB of ram and then a 4GB card in, it may cause conflicts.
> 
> Anyone feel free to correct me if Im wrong.



ahh right yeah
i didn't know video RAM counted you see
(i'm noobish)



Necrofire said:


> Typed in a hurry, does it make sense though?



yes.. indeed it does.


----------



## alexp999 (May 28, 2009)

btarunr said:


> Crank up the texture quality in GTA4, and watch that GTX 285 1GB cry.



I have run GTA IV maxed out (turned restrictions off) on my old GTX 260; it ran fine for a while 

lol.

I want to know where this came from, TPU rarely is the one breaking the news.


----------



## El_Mayo (May 28, 2009)

alexp999 said:


> TPU rarely is the one breaking the news.



oh no he di-int!


----------



## h3llb3nd4 (May 28, 2009)

and the cost?
lets hope that it is not $600....


----------



## laszlo (May 28, 2009)

i think is a collector's edition


----------



## CyberDruid (May 28, 2009)

I'm a collector


----------



## btarunr (May 28, 2009)

alexp999 said:


> I have run GTA IV maxed out (turned restrictions off) on my old GTX 260, it run fine for while
> 
> lol.
> 
> I want to know where this came from, TPU rarely is the one breaking the news.



Yeah, your card is awesome. Especially when an app is asking for 1.5 GB of video memory. 



alexp999 said:


> I want to know where this came from, TPU rarely is the one breaking the news.



Straight from ASUS EMEA.


----------



## kyle2020 (May 28, 2009)

btarunr said:


> Yeah, your card is awesome. Especially when an app is asking for 1.5 GB of video memory.



I dont think that was the point of his post?


----------



## btarunr (May 28, 2009)

kyle2020 said:


> I dont think that was the point of his post?



I don't think you got the point of my post either. 

An app requiring 1.5 GB of video memory on a card with 896 MB = at least 100% more read-backs; you can't play the game beyond a point. Hence he said "for a while".


----------



## alexp999 (May 28, 2009)

You don't have to be so arrogant, bta. I wasn't being.

My point was just that my GTX 260 didn't scream, and just because the game settings say that's how much it's using doesn't mean that's what it actually uses. Also, drivers assign around 2 GB of shared system memory as an overflow for the on-board memory.

When I said a while, it went for about an hour before crashing, and that was with tons of vehicles and explosions: no pop-ins or crazy textures, nothing.


----------



## btarunr (May 28, 2009)

alexp999 said:


> You dont have to be so arrogant bta. I wasnt being.



I wasn't being either. Maybe I should decorate my posts with more smilies now on.


----------



## El_Mayo (May 28, 2009)

btarunr said:


> I wasn't being either. Maybe I should decorate my posts with more smilies now on.



good idea


----------



## werez (May 28, 2009)

El_Mayo said:


> what's the point?
> will the extra... 2-ish GB ram actually give any performance increase?



In games, NO! Since it's GDDR3, performance might actually drop in some games. Take the HD4870 for example: the card uses GDDR5, and you barely see any difference between the 512 MB and 1024 MB versions, except at really high resolutions when you run out of buffer. But again, no big difference.
This card is made just to satisfy the "need for more", a syndrome common in our days. I want to buy a Ferrari too; I don't like the car, but it's expensive and there are not many who own it, so I must be some kind of god if I have it. This is actually just a "stock" card, tuned on the outside. I don't think the cooling is better, because I see the same blower in there, just like the reference design. BIG? Yes. POWERFUL? Not really... Expensive? YOU BET! If you want something special, design your own card and use the original PCB. Save money.


----------



## El_Mayo (May 28, 2009)

werez said:


> In games, NO! Since it's GDDR3, performance might actually drop in some games. Take the HD4870 for example: the card uses GDDR5, and you barely see any difference between the 512 MB and 1024 MB versions, except at really high resolutions when you run out of buffer. But again, no big difference.
> This card is made just to satisfy the "need for more", a syndrome common in our days. I want to buy a Ferrari too; I don't like the car, but it's expensive and there are not many who own it, so I must be some kind of god if I have it. This is actually just a "stock" card, tuned on the outside. I don't think the cooling is better, because I see the same blower in there, just like the reference design. BIG? Yes. POWERFUL? Not really... Expensive? YOU BET! If you want something special, design your own card and use the original PCB. Save money.




that's a damn fine first post


----------



## Scrizz (May 28, 2009)

Holy $h1zzz
Unstoppable


----------



## allen337 (May 28, 2009)

$700 at least, how many do they think they're gonna sell?


----------



## Animalpak (May 28, 2009)

Wow now Crysis 100 fps at 2560 x 1600



allen337 said:


> $700 at least, how many they think there gonna sell?



No more than 1000.


----------



## El_Mayo (May 28, 2009)

Animalpak said:


> Wow now Crysis 100 fps at 2560 x 1600



you have more ridiculous expectations than Jizzler


----------



## Animalpak (May 28, 2009)

I wanna know: is this developed for playing games, or for professional users?


----------



## Animalpak (May 28, 2009)

El_Mayo said:


> you have more ridiculous expectations than Jizzler



I was not serious mate.  

I think this graphics card is just an exaggeration.


----------



## btarunr (May 28, 2009)

Animalpak said:


> I wanna know if is that developed for playing games or for professional users ?



General enthusiast consumers. Yes, there are only 1000 pieces in the making.


----------



## tzitzibp (May 28, 2009)

I don't know how many will buy this beast, but it seems to me that Asus is experimenting with certain aspects of the graphics card... sort of like a test bed for small adjustments and improvements for future releases (car companies do it all the time).

in any case...


----------



## El_Mayo (May 28, 2009)

Animalpak said:


> I was not serious mate.
> 
> I think it is just an exaggeration this graphics card.



i dont think jizzler  is ever serious either xD



tzitzibp said:


> I dont know how many will buy this beast, but it seems to me that Asus is experimenting on certain aspects of the graphics card... sort of like a test base of small adjustments, improvements for future releases (car companies do it all the time).



a concept card?


----------



## werez (May 28, 2009)

It's better to buy two GTX 275s from Zotac, Galaxy, or Inno3D, the new ones that come with the Accelero Xtreme cooler, and SLI them. The Accelero Xtreme lets you push the cards to the max, and you actually kick ass with just 896 MB of RAM per card. One good deal right there. Oh, I almost forgot: they look sexy.

Do you remember this one? Bet you don't...
http://cache.gizmodo.com/assets/resources/2008/03/asustrinity.jpg

The ASUS EAH3850 TRINITY, a tri-core card.
Now tell me please, what can I do with this card in our days? I think you get the point...


----------



## tzitzibp (May 28, 2009)

El_Mayo said:


> a concept card?



No.... that's a different category!

this falls somewhere in the middle.... (can't think of the right word for it; Mercedes-Benz releases such a car about 6 months before a new release)
an existing product with added and/or enhanced features that gets real-life tested... and depending on acceptance and performance, those added or enhanced features become standard on new products....

sorry about that, alex...


----------



## alexp999 (May 28, 2009)

El_Mayo said:


> i dont think jizzler  is ever serious either xD
> 
> 
> 
> a concept card?



Please stop double posting, use the edit button.


----------



## theorw (May 28, 2009)

El_Mayo said:


> that's a damn fine first post



+1


----------



## El_Mayo (May 28, 2009)

alexp999 said:


> Please stop double posting, use the edit button.



my bad
can i still quote again using the edit button?


----------



## Deleted member 24505 (May 28, 2009)

More fodder for the rich show-offs, I think.


----------



## zithe (May 28, 2009)

El_Mayo said:


> my bad
> can i still quote again using the edit button?



You can use multi-quote. Or, hit quote, copy the text, go into edit, paste it to your post, and comment on it.

You get this: 





alexp999 said:


> Hit the quote button, copy it, edit last post and paste it in.


----------



## alexp999 (May 28, 2009)

El_Mayo said:


> my bad
> can i still quote again using the edit button?



Hit the quote button, copy it, edit last post and paste it in.


----------



## Disparia (May 28, 2009)

El_Mayo said:


> you have more ridiculous expectations than Jizzler



WHAT!!!! SOMEONE MORE RIDICULOUS THAN ME?!!? 

Well, my belief that technology moves as slowly as it does because we as consumers accept it is serious. I may go about expressing it in an exaggerated way though.

Case in point today: nVidia stifling board partners. Asus' show of rebellion (although small) brings me some happiness, and consumers a unique option. Now people can ask themselves: will this Asus GTX 295 or a standard GTX 295 better serve me? I accept nearly all answers and try not to judge others. e-peen points? Folding? A Quadro alternative? Heck, I'll even accept gaming as a reason to buy one!


----------



## alexp999 (May 28, 2009)

Is it just me, or are the overclocks extremely unrealistic?

There is no way a GT200 is capable of 833 MHz core and 1861 MHz shaders. That, and the memory is far too slow for a GTX 285.


----------



## [I.R.A]_FBi (May 28, 2009)

title has a typow


----------



## Mussels (May 28, 2009)

El_Mayo said:


> ahh right yeah
> i didn't know video RAM counted you see
> (i'm noobish)
> 
> ...



read the link in my sig about 64 bit OS - not many people were aware of it, so i spread the word.


----------



## wolf (May 28, 2009)

i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want i want


----------



## Mussels (May 28, 2009)

werez said:


> In games, NO! Since it's GDDR3, performance might actually drop in some games. Take the HD4870 for example: the card uses GDDR5, and you barely see any difference between the 512 MB and 1024 MB versions, except at really high resolutions when you run out of buffer. But again, no big difference.
> This card is made just to satisfy the "need for more", a syndrome common in our days. I want to buy a Ferrari too; I don't like the car, but it's expensive and there are not many who own it, so I must be some kind of god if I have it. This is actually just a "stock" card, tuned on the outside. I don't think the cooling is better, because I see the same blower in there, just like the reference design. BIG? Yes. POWERFUL? Not really... Expensive? YOU BET! If you want something special, design your own card and use the original PCB. Save money.



every generation, the monster cards are for people with monster resolutions. This gen, thats 1080P and above.

2GB cards like this (its 2GB effectively since SLI isnt additive) are for people at resolutions like 2560x1600.

The thing is, when you DO run out of memory cause of high res and high AA... the performance doesnt just degrade, it goes instant slideshow.


----------



## TechnicalFreak (May 28, 2009)

I tell you this, my next system will be Intel based, and I must... *must* have one of them cards!

Asus fan boy?

H*ll yes!


----------



## laszlo (May 28, 2009)

btarunr said:


> only 1000 pieces in the making.




i was right about purpose 

another cookie for people with too much $


----------



## Fitseries3 (May 28, 2009)

3 of these would fold like theres no tomorrow..... let alone the 45k in 3dm vantage i'll have....

expect a few of these to show up at my house pretty quickly.


----------



## Mussels (May 28, 2009)

TechnicalFreak said:


> I tell you this, my next system will be Intel based, and I must... *must* have one of them cards!
> 
> Asus fan boy?
> 
> H*ll yes!



quad socket nehalem EX (64 threads) minimum 24GB of ram, and two of those in SLI.
Oh and 8 256GB SSD's in raid.
and then buy me one.


anything less would starve this card of its required awesome factor.
(and dont forget the 3000W PSU)


----------



## alexp999 (May 28, 2009)

Has anyone still not noticed the unrealistic clocks?


----------



## Bjorn_Of_Iceland (May 28, 2009)

Fitseries3 said:


> 3 of these would fold like theres no tomorrow..... let alone the 45k in 3dm vantage i'll have....
> 
> expect a few of these to show up at my house pretty quickly.


Dang. youve got an oil rig at the back o yer house?


----------



## Disparia (May 28, 2009)

alexp999 said:


> Has anyone still not noticed the unrealistic clocks?



We're too caught up in the awesomeness tornado!!!! WHIRL WHIRL WHIRL!


----------



## wolf2009 (May 28, 2009)

alexp999 said:


> Is it just me or are the overclocks extremely unrealistic.
> 
> There is no way a GT200 is capable of 833 core and 1861 shaders. That and the memory is far too slow for a GTX 285





alexp999 said:


> Has anyone still not noticed the unrealistic clocks?



Voltage tweaked probably. 

There could be other reasons too, like change in stock cooler, using WC or something.


----------



## Mussels (May 28, 2009)

Jizzler said:


> We're too caught up in the awesomeness tornado!!!! WHIRL WHIRL WHIRL!



if you think its a tornado now, wait til people start getting 4 of em in a PC  WHIRRRRRRRRRRRRRRRRRRRRR goes the fans.


----------



## Disparia (May 28, 2009)

Time to strap a box fan to the side of your case 

"Yeah, my case is cooled entirely by one fan"


----------



## btarunr (May 28, 2009)

Yes, those clocks are unreal for that cooler.


----------



## kyle2020 (May 28, 2009)

I suspect water / a dice pot on that with some daft voltage tweaks. It's like they do with car adverts: "the brand new Ford Crazymobile from only £6,995". They show you a model with gorgeous alloys, metallic paint, and a sporty looking spoiler, yet the small print says "Model shown: Ghia 1337, starting from £9,995". It's just a way of tempting buyers in.


----------



## GSG-9 (May 28, 2009)

You think it has dice on it? You don't customize cards before you buy them though. I bet its air/water with voltage tweaks, and if it runs hot... *shrugs.


----------



## Mussels (May 28, 2009)

alright i'm going to admit this, due to a little confusion.

is this just a custom 285, or is this a DUAL 285 - four GPU's. cause the screenshot says 4 GPU's on the GPU-Z page.


----------



## W1zzard (May 28, 2009)

Mussels said:


> alright i'm going to admit this, due to a little confusion.
> 
> is this just a custom 285, or is this a DUAL 285 - four GPU's. cause the screenshot says 4 GPU's on the GPU-Z page.



the screenshot says 4 because two of these cards are running in quad sli. imagine this card's physical appearance like a gtx 295


----------



## GSG-9 (May 28, 2009)

Mussels said:


> alright i'm going to admit this, due to a little confusion.
> 
> is this just a custom 285, or is this a DUAL 285 - four GPU's. cause the screenshot says 4 GPU's on the GPU-Z page.



Good question, 4 would be rather insane, I thought a 285 had 2 physical gpus though, I only see 2 gpus on the die.


----------



## FreedomEclipse (May 28, 2009)

werez said:


> In games NO ! since it`s GDDR3 , the performance might actually drop in some games .Take HD4870 for example . The card uses GDDR5 , and you barely see any difference between the 512 and 1024 version , just in really high resolutions when you run out of buffer . But again , no big difference.



Somewhat true, but there are small differences between the 512 MB and 1024 MB cards, and it all depends on how you run your games. From my understanding, running with 16xAA is better on a 1024 MB card due to the space available to buffer/store all the textures.

Depending on the resolution, settings and games you play, you may see up to a 3-5 fps increase. Whether anything above 1024 MB boosts performance is still debatable, but from what I have read it hardly makes any difference at all unless you play at super high resolutions on a 40-60" monitor. It's all debatable.


----------



## Mussels (May 28, 2009)

512MB to 1GB isn't about gaining FPS, it's about preventing the sudden slideshow when you run out.

Run a gaming PC with 4GB of ram and then drop to 1GB: everything goes slow and stuttery when you run out. it's not about boosting FPS over 512MB cards, it's about preventing the slowdown when you run out of memory.

same goes for 2/4GB cards.


----------



## W1zzard (May 28, 2009)

alexp999 said:


> Has anyone still not noticed the unrealistic clocks?



that's what can happen if you give the card to a hardcore pro overclocker who's not afraid of the rma department. and no, i doubt he's using air. note how it says "done by kinc"


----------



## kinc (May 28, 2009)

GSG-9 said:


> You think it has dice on it? You don't customize cards before you buy them though. I bet its air/water with voltage tweaks, and if it runs hot... *shrugs.



This is with a liquid nitrogen thermal solution installed on the cards. I didn't have time to insulate well enough, so the cards are only running at -40C with 1.175V set via software (up from 1.15V). There is headroom

http://www.kinc.se/mars3.jpg


----------



## W1zzard (May 28, 2009)

kyle2020 said:


> Imagine the trouble 32bit users would have running this.



read: http://support.microsoft.com/kb/940105


----------



## alexp999 (May 28, 2009)

kinc said:


> This is with liquid nitrogen thermal solution installed on the cards. I didnt have time to insluate well enough so cards are only running -40C with 1.175V set via software (up from 1.15V). There is headroom
> 
> http://www.kinc.se/mars3.jpg



Well now that explains it 

Any chance of a real-world bench (i.e. stock clocks, or an air OC)?

Maybe just one card?

That pic screams of e-penis.


----------



## Polarman (May 28, 2009)

laszlo said:


> i think is a collector's edition



My guess too. You can see 1/1000 on the picture.

Reminds me of the 1950XTX Uber edition that had only 500 cards made.


----------



## El_Mayo (May 28, 2009)

Jizzler said:


> Time to strap a box fan to the side of your case
> 
> "Yeah, my case is cooled entirely by one fan"



Aerocool S9 Pro xD
http://www.sanalmarketim.com/_prod/_img/l/f8fa597d27_AeroCool_S9-Pro.jpg


----------



## Mussels (May 28, 2009)

kinc said:


> This is with liquid nitrogen thermal solution installed on the cards. I didnt have time to insluate well enough so cards are only running -40C with 1.175V set via software (up from 1.15V). There is headroom
> 
> http://www.kinc.se/mars3.jpg



thanks man. good to see you around these parts.


----------



## GSG-9 (May 28, 2009)

kinc said:


> This is with liquid nitrogen thermal solution installed on the cards. I didnt have time to insluate well enough so cards are only running -40C with 1.175V set via software (up from 1.15V). There is headroom
> 
> http://www.kinc.se/mars3.jpg



Yeah, kyle2020 suggested dice; I was kinda questioning the suggestion since there was no space for it. It makes me oogle though.


----------



## newtekie1 (May 28, 2009)

Why do I have to be broke?!?!


----------



## Roph (May 28, 2009)

How much power will this beast suck up?


----------



## 3870x2 (May 28, 2009)

kyle2020 said:


> 32bit OS's use a max of 3.2GB (I think?) of ram, and that includes GPU ram, so if you have 2GB of ram and then a 4GB card in, it may cause conflicts.
> 
> Anyone feel free to correct me if Im wrong.



I'm sure this has already been replied to, but it is 2x2 GB, each GPU addressing only 2 GB, meaning there are two different pools of memory. 32-bit will address the whole 4 GB, unless I've missed something.


----------



## Mussels (May 28, 2009)

3870x2 said:


> Im sure this has already been replied to, but it is 2x2gb, each GPU addressing only 2, meaning there are 2 different pots of memory.  32-bit will address the whole 4GB.



32 bit applications have a 2GB address space limit per application, so... no. Sure, you'll have 2GB for the card and 2GB usable for the OS, but once it starts duplicating into ram (assuming you're using a DX9 app, 32 bit OS likely means XP) then you're totally screwed.


----------



## 3870x2 (May 28, 2009)

Mussels said:


> 32 bit applications have a 2GB address space limit per application, so... no. Sure, you'll have 2GB for the card and 2GB usable for the OS, but once it starts duplicating into ram (assuming you're using a DX9 app, 32 bit OS likely means XP) then you're totally screwed.



I see your point, here are my counterpoints:

It is like saying that only a total of 3.2 GB of RAM can ever be addressed. So if you already have 3 GB of RAM, you can only address 0.2 GB of your graphics card's RAM?
Also, who ever needs more than 3.2 GB in any game? I'm sure no game needs more than 2 GB at its max.

I'm sure you are probably right, but it just doesn't seem right. If it is per application, that is just bad programming, as games can, and have before, run as more than 1 application (check the "Processes" tab while your game is running).


----------



## mlee49 (May 28, 2009)

kinc said:


> This is with liquid nitrogen thermal solution installed on the cards. I didnt have time to insluate well enough so cards are only running -40C with 1.175V set via software (up from 1.15V). There is headroom
> 
> http://www.kinc.se/mars3.jpg



Nice to meet you Kinc, I'm glad to see an Asus rep here.  Any chance for more pics?


----------



## Mussels (May 28, 2009)

3870x2 said:


> I see your point, here are my counterpoints:
> 
> It is like saying that a total of 3.2 GB of ram can only be addressed.  So if you already have 3GB of ram, you can only address .2GB of your graphics card ram?
> Also, who ever needs more than 3.2GB in any game? im sure no game needs more than 2GB at its max.
> ...



check the link in my sig and take the conversation over there.


----------



## 3870x2 (May 28, 2009)

Mussels said:


> check the link in my sig and take the conversation over there.



As said before, you have more knowledge than I on this matter, ill  take your word for it


----------



## W1zzard (May 28, 2009)

Mussels said:


> 32 bit applications have a 2GB address space limit per application, so... no. Sure, you'll have 2GB for the card and 2GB usable for the OS, but once it starts duplicating into ram (assuming you're using a DX9 app, 32 bit OS likely means XP) then you're totally screwed.



did nobody read the link?

any normal app won't need access to the full video memory. you tell directx, "give me that", and you get a memory pointer back to the object. then you do the same with the next object. vista/directx are smart enough to shuffle stuff around, and use memory pages so you can use all these 2 gb.
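W1zzard's description can be illustrated with a toy sketch. This is purely an illustration, not actual DirectX/WDDM code: `VideoMemoryManager`, the budget figure and the oldest-first eviction policy are all invented for this example. The idea is that the app asks for objects one at a time and gets opaque handles back, while the manager pages objects in and out so the app never needs the whole video memory mapped at once.

```python
from collections import OrderedDict

class VideoMemoryManager:
    """Toy model of a driver paging resources in and out of a budget."""

    def __init__(self, resident_budget_mb):
        self.budget = resident_budget_mb
        self.resident = OrderedDict()   # handle -> size in MB, oldest first
        self.next_handle = 0

    def create_resource(self, size_mb):
        """App says "give me that"; gets an opaque handle back."""
        handle = self.next_handle
        self.next_handle += 1
        # Evict the oldest resident objects until this one fits.
        while sum(self.resident.values()) + size_mb > self.budget:
            self.resident.popitem(last=False)
        self.resident[handle] = size_mb
        return handle

vmm = VideoMemoryManager(resident_budget_mb=512)
handles = [vmm.create_resource(128) for _ in range(8)]  # 1024 MB requested
# Only the most recently created objects stay resident in the 512 MB budget:
print(len(vmm.resident))  # 4
```

The app still holds all eight handles; the manager just shuffles which objects are resident, which is the "smart enough to shuffle stuff around" part of the post above.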


----------



## werez (May 28, 2009)

FreedomEclipse said:


> some what true but there are small differences between th 512mb > 1024mb cards & it all depends on how you run your games. from my understanding running with 16xAA is better with a 1024mb card due to the space available to buffer/store all the textures.
> 
> Depending on the resolutions, settings u play at & also games. you may see upto 3-5fps increase. but for performance boosts on anything above 1024mb then thats still debatable but from what I have read it hardly makes any difference at all unless u play at super high resolutions on a 40-60" monitor. - its all debatable.



Yes.
Picture this: I'm an average gamer who goes for the "gameplay" and not the "eyecandy".
I do own a high-end video card (well, it was kinda high end when I bought it), a Sapphire Radeon HD 4870 2GB version. Now, I game at different resolutions depending on the game.
I'm playing Counter-Strike (yes, I'm oldschool), FIFA, Crysis, Unreal Tournament, COD4, COD5, StarCraft and so on. FIFA, StarCraft and Counter-Strike don't use the true power of my video card, since StarCraft is 2D and Counter-Strike works on really low-end video cards. Now let's talk about Crysis. OK: all settings high, higher resolution, you name it.
There is absolutely no difference between my video card and the card a friend owns, a Sapphire Radeon HD 4870 1GB version, at 1920x1200 with AA enabled. Now, if people are buying a video card with 4 GB of RAM, I believe they are not playing Counter-Strike, Day of Defeat or StarCraft. And if you are going to buy that card to be "future proof" for games based on engines like the one Crysis uses, you won't have any performance increase; you need a better GPU. 2 fps more in Crysis is not noticeable. In COD4 you will probably get 10 fps more at a higher resolution, but that won't really matter, since you already get 100 and you can't see any difference. Now, in the EU the ASUS ENGTX295 is about 430 EUR, but the starting price point was 460. An extra 2 GB of GDDR3 and a custom cooler design increase manufacturing costs, and don't forget that ASUS prices higher on the market than other competitors...
If a GTX295 was 460 Euros, I expect this card to be around 550-560. So you pay extra money for that cool-looking plastic shroud? Well, what happens when you realize you need a water block for your card? That shroud must come off, right?
I believe ASUS just wanted to show the world they can do more to improve a reference video card. Other manufacturers could do that too, but they don't, because they know that 2 GB of extra RAM is useless. ASUS knows that too, since they are only making 1000 pieces, don't you think?
Sorry for my bad English, but I hope you get the point...


----------



## Mussels (May 28, 2009)

W1zzard said:


> did nobody read the link?
> 
> any normal app won't need access to the full video memory. you tell directx, "give me that", and you get a memory pointer back to the object. then you do the same with the next object. vista/directx are smart enough to shuffle stuff around, and use memory pages so you can use all these 2 gb.



i did. it doesnt use the full memory, but only whats required. but if you ask it to use 2GB of textures in a 32 bit app, somethings gotta give.

you have a better understanding than i do, cause you've programmed 3D apps. You're the w1zzard damnit, its not fair to argue with you. you need to share your knowledge more!


----------



## 3870x2 (May 28, 2009)

Wow werez...
regardless of the addressing, I don't see games using that much any time soon, at least until Modern Warfare 2 or a round of 8-player BGH on StarCraft II.


----------



## W1zzard (May 28, 2009)

Mussels said:


> i did. it doesnt use the full memory, but only whats required. but if you ask it to use 2GB of textures in a 32 bit app, somethings gotta give.
> 
> you have a better understanding than i do, cause you've programmed 3D apps. you need to share your knowledge more!



nobody has a single 3D object that consumes 2 GB of memory. so you have lots of small thingies that are processed individually. imagine moving everything in your apartment, which takes up so much space, through the little door when you move out.
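W1zzard's description - many small objects, with the runtime paging them in and out of video memory as needed - can be sketched as a toy model. This is purely illustrative Python, not real DirectX; the class and names are made up:

```python
from collections import OrderedDict

class ToyVram:
    """Toy model of a driver managing a fixed VRAM budget.

    Objects are created one at a time; when the budget is exceeded,
    the least-recently-used object is paged out to system memory.
    """
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # name -> size in MB, oldest first
        self.paged_out = {}

    def create(self, name, size_mb):
        # Evict least-recently-used objects until the new one fits.
        while sum(self.resident.values()) + size_mb > self.budget_mb:
            victim, vsize = self.resident.popitem(last=False)
            self.paged_out[victim] = vsize
        self.resident[name] = size_mb

    def use(self, name):
        # Touching a paged-out object brings it back in (the slow path).
        if name in self.paged_out:
            self.create(name, self.paged_out.pop(name))
        else:
            self.resident.move_to_end(name)

vram = ToyVram(budget_mb=1024)
for i in range(8):
    vram.create(f"texture_{i}", 256)   # 8 x 256 MB = 2 GB of textures
print(len(vram.resident), len(vram.paged_out))  # only 4 fit at once
```

No single object has to fit the whole 2 GB; the working set just gets shuffled through the budget, which is the apartment-and-little-door analogy.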


----------



## Mussels (May 28, 2009)

W1zzard said:


> nobody has a single 3d object that consumes 2 GB of memory. so you have lots of small thingies that are processed individually. imagine how you move ALL that is in your apartment and takes up so much space through the little door when moving



i can imagine an epic fail.

so to summarise your analogy: it's not like it's going to suck from the get-go, but once the address space limit fills up, things are going to slow down until you get the old furniture out and the new furniture in.

Running x64 here just gives you a bigger door.
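The "door" sizes can be put into numbers. A back-of-the-envelope sketch; the 2 GiB figure assumes a default 32-bit Windows process without the /3GB switch:

```python
GIB = 1024 ** 3

space_32 = 2 ** 32   # full 32-bit address space: 4 GiB
space_64 = 2 ** 48   # x86-64 implementations expose 48 bits: 256 TiB

# A 32-bit Windows process normally gets only half of its 4 GiB for
# itself (the kernel reserves the rest), and mapped GPU resources
# compete with code, heap and stack inside that half.
user_space_32 = space_32 // 2

texture_set = 2 * GIB   # the "2GB of textures" from the thread
print(user_space_32 // GIB)          # 2
print(texture_set >= user_space_32)  # True: no room left for anything else
print(space_64 // space_32)          # 65536: the x64 door is far wider
```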


----------



## werez (May 28, 2009)

I think we are forgetting something:
RV870 and DirectX 11 are just around the corner. Nvidia is moving to GDDR5. GPUs are currently hitting 1000 MHz. Imagine the performance boost. So do we need 4GB of VRAM? How do we benefit from it? We don't... This is like the last line of defense; other manufacturers won't respond to this "monstrosity". A stock GTX 295 is more than you need. And like I first said, two GTX 275s in SLI will be a better solution to improve performance, and you will save money.


----------



## Mussels (May 28, 2009)

if no one ever came out with top models with ridiculous amounts of ram, we'd all still be on 16MB cards werez.


----------



## Kitkat (May 28, 2009)

sounds cool, but also sounds pointless. haven't we been down this road before, only to see no real result and just bragging rights?



Jizzler said:


> Bout time we saw something unique. Go Asus!
> 
> nVidia, know your place! Get back to the fab and cook me more GPUs!



That's funny, I thought their place was childish arguments with Intel???


----------



## werez (May 28, 2009)

yes, but I can play Counter-Strike with my 16MB video card, and it's still the most played game out there, isn't it?
That was my point, Mussels...


----------



## GSG-9 (May 28, 2009)

Mussels said:


> if no one ever came out with top models with ridiculous amounts of ram, we'd all still be on 16MB cards werez.






werez said:


> yes , but i can play Counter-strike with my 16mb video card , and it`s still the most played game out there , isnt it ?
> That was my point ,  Mussels ...


Sadly I think it's surpassed by WoW these days, and you have to count 1.5/1.6/CS:CZ/CSS all separately; you can't count them as the same product. I love CS...


----------



## 3870x2 (May 28, 2009)

He kinda has a point. I can play Crysis at full settings, 1920x1200, at 20+ FPS on my 4870 while having Warcraft III in the background. 512 MB of RAM. 4GB isn't really needed. 2GB isn't really needed. By the time you need more than 1.5GB of VRAM, the GTX 295 will be obsolete, like the 7950GX2.


----------



## Weer (May 28, 2009)

Calling it a "Limited Edition" card = they can charge a boatload for it.


----------



## btarunr (May 28, 2009)

Weer said:


> Calling it a "Limited Edition" card = They can charge a boat for it.



I agree. 2x GTX 285 + digital PWM  + 2x memory + third-party bridge chip..should cost something.


----------



## W1zzard (May 28, 2009)

usually i am convinced that there is no point in going to 2 GB VRAM, simply because any card would be too slow (fps-wise) at such settings. no difference between 7 and 14 fps.. neither is playable. with this card, however, there might be a chance that we could see some benefits.

the only exception might be GTA4, which seems to benefit nicely from more memory even at less demanding resolutions/AA. read: shit engine. not to mention the terrible DRM - i boycott it for benchmarks


----------



## Weer (May 28, 2009)

3870x2 said:


> He kinda has a point, I can play crysis full settings 1920x1200 at 20+ FPS on my 4870 while having warcraft III in the background.  512 MB ram.  4GB isnt really needed.  2GB isnt really needed.  By the time you need any more than 1.5GB of VRAM, the GTX295 will be obsolete like the 7950GX2.



That's not completely false, but think 2560x1600. It needs 1024MB to run at 20 FPS. Now, are you going to get 20 FPS running Crysis at 2560x1600 on this card? Yes. So 1024-1536MB has a point. And the extra 512MB is always good to have: sandbox editors, running multiple games, games with texture issues that take up tons of RAM but look great... I know all the issues that exist because I have an 8800 GTS 512 - a card that had too little vRAM.


----------



## h3llb3nd4 (May 28, 2009)

W1zzard said:


> usually i am convinced that there is no point in going 2 GB VRAM. simply because any card would be too slow (fps wise) at such settings. no difference between 7 and 14 fps.. both isnt playable. with this card however, there might be chance that we could see some benefits.
> 
> only exception might be gta4 which seems to benefit nicely from more memory even at less demanding resolutions/AA. read: shit engine. not to mention the *terrible DRM* - i boycott it for benchmarks



Now you know why there is extreme piracy...

@ weer: But 2560x1600 is rare and hardly anyone uses it... I bet even rich ppl won't buy those screens until they're more popular...


----------



## Weer (May 28, 2009)

btarunr said:


> I agree. 2x GTX 285 + digital PWM  + 2x memory + third-party bridge chip..should cost something.



Oh no, I just meant the two words. Pricey!

But this is technically how it goes. In order to produce a card better than the maximum, they'd _have_ to call it something like a "Limited Edition", because otherwise no one would buy the "regular" GTX 295. Now that they have that title.. they can go wild.. and they have.

Do you remember the "Limited Edition" X1950XTX Limited-Crossfire-Edition? I'm sure you know when it came out. Two super-charged X1950XTXs, in both looks and performance, with the best GDDR4, that came in a collector's silver-brushed metallic case and cost about twice the price of the already not-worth-it X1950XTX.

But this card is definitely worth it.. for me and everyone else who wants to game this summer without waiting for the GTX 380. If it costs $600, I'll get it. But that's extremely unlikely, says me.


----------



## Weer (May 28, 2009)

h3llb3nd4 said:


> Now you know why there is extreme piracy...
> 
> @ weer: But 2560x1600 is rare and hardly anyone uses it...I bet you rich ppl wont buy those screens until they're more popular...



What are you talking about, African brother? 750 US$ for the 30-incher.


----------



## h3llb3nd4 (May 28, 2009)

Weer said:


> What are you talking about, African brother? 750 US$ for the 30-incher.



Not African....I just immigrated there.
I'm trying to say 2560x1600 is not as popular as 1920x1080


----------



## Weer (May 28, 2009)

W1zzard said:


> usually i am convinced that there is no point in going 2 GB VRAM. simply because any card would be too slow (fps wise) at such settings. no difference between 7 and 14 fps.. both isnt playable. with this card however, there might be chance that we could see some benefits.
> 
> only exception might be gta4 which seems to benefit nicely from more memory even at less demanding resolutions/AA. read: shit engine. not to mention the terrible DRM - i boycott it for benchmarks



Yup. That's just about what I said/thought. And I got some back-up.

Ever since graphics cards first had 1024MB of vRAM glued onto them without needing it, because they were entry-level cards, this is the only card that makes a ridiculous amount of memory seem cool.

Oh, and I hate the crappy porting job those idiots did on GTA 4. Isn't it fun to laugh at them?


----------



## Mussels (May 28, 2009)

h3llb3nd4 said:


> Not African....I just immigrated there.
> I'm trying to say 2560x1600 is not as popular as 1920x1080



and if even 1/10th of the people with those 30" screens want these monster cards, ASUS would sell out.

it's not like these are going to be mass-produced and sold at Best Buy for $2K for the next 2 years. they're a brief thing for those who want the best, and want it now.


----------



## Weer (May 28, 2009)

h3llb3nd4 said:


> Not African....I just immigrated there.
> I'm trying to say 2560x1600 is not as popular as 1920x1080



So, you're white living in a black man's world?! White power, brotha!

Yes, and gold is not as popular as bronze. I used to have bronze.. then I gave it away to someone who wouldn't be able to use that past tense as quickly as me. Hey, if they ever came out with a 37" ~3400x2130 resolution, I'd do something else with the 30". Although, since it was refurbished from Dell when I got it, I may not just give it away. Besides, it's fucking awesome.


----------



## Weer (May 28, 2009)

Mussels said:


> and if even 1/10th of the people with those 30" screens want these monster cards, asus would sell out.
> 
> its not like these are going to be mass produced and sold at best buy for $2K for the next 2 years. they're a brief thing for those that want the best, and want it now.



My.. god. You just described me. Except, I don't want it for that high of a price. No can do, huh?

This is going to be as expensive as the Ultra. Gosh, I wonder if there's someone who doesn't know what I am talking about.


----------



## Duncan1 (May 28, 2009)

Great news, btarunr!

Fantastic cooler, but I think an ''all-black'' one would be better.. Are there any pictures of the cooler itself?


----------



## btarunr (May 28, 2009)

Duncan1 said:


> Great news, btarunr!
> 
> Fantastic cooler, but I think the ''all-black'' would be better.. Is there any picture with the cooler itself?



The first picture is the complete card  

Edit. it doesn't have the expansion bracket in the first picture.


----------



## h3llb3nd4 (May 28, 2009)

Weer said:


> So, you're white living in a black man's world?! White power, brotha!
> 
> Yes, and gold is not as popular as bronze. I used to have bronze.. then I gave it away to someone who wouldn't be able to use that past tense as quickly as me. Hey, if they ever came out with a 37" ~3400x2130 resolution, I'd do something else with the 30". Although, since it was refurbished from Dell when I got it, I may not just give it away. Besides, it's fucking awesome.



Guess again!
Im Taiwanese


----------



## Duncan1 (May 28, 2009)

btarunr said:


> The first picture is the complete card
> 
> Edit. it doesn't have the expansion bracket in the first picture.



I meant a picture from the back of the cooler...


----------



## trt740 (May 28, 2009)

*get ready to mortgage your home for it, lol!!*

Still, Holy Mother of Moses, is that a beast.


----------



## W1zzard (May 28, 2009)

h3llb3nd4 said:


> @ weer: But 2560x1600 is rare and hardly anyone uses it...I bet you rich ppl wont buy those screens until they're more popular...



we will be adding 2560x1600 to future VGA reviews. all praise Zotac for providing us with an LG 30"

using 2560x1600 is like going from weed to cocaine. i really really have to buy one for my desktop.


----------



## h3llb3nd4 (May 28, 2009)

I will when I have cash...
those 30"ers are Frikkin expensive!!


----------



## ZoneDymo (May 28, 2009)

How can this beast work with 1 x 8-pin?


----------



## GSG-9 (May 28, 2009)

ZoneDymo said:


> How can this beast work with 1 x 8-pin?



I was impressed by this as well.


----------



## btarunr (May 28, 2009)

That's one 8-pin connector per PCB. Two in all.


----------



## El_Mayo (May 28, 2009)

i swear this thread was about a 4GB GTX 295?
now it says "Dual GTX 285"?


----------



## GSG-9 (May 28, 2009)

btarunr said:


> That's one 8-pin connector per PCB. Two in all.



Ah, that's more like what I expected.


----------



## Weer (May 28, 2009)

W1zzard said:


> we will be adding 2560x1600 to future vga reviews. all praise zotac for providing us an LG 30"
> 
> using 2560x1600 is like going from weed to cocaine. i really really have to buy one for my desktop.



That's actually a pretty accurate metaphor.

Too bad they don't have such awesome deals in Germany.


----------



## WarEagleAU (May 28, 2009)

Too bad it's only a collector's edition. I'd like to see something like this for ATI cards as well.


----------



## El Fiendo (May 28, 2009)

Wow. Now the only burning question remaining for me is where's the part that allows me to sex it?


----------



## btarunr (May 28, 2009)

El Fiendo said:


> Wow. Now the only burning question remaining for me is where's the part that allows me to sex it?



The SmartDoctor app comes with some sliders.


----------



## h3llb3nd4 (May 28, 2009)

I hate ASUS S/D...
Looks bad.... EVGA precision FTW!!


----------



## 1Kurgan1 (May 28, 2009)

Another ASUS innovation. Seems they should almost be making their own GPUs, as they're also the only company that makes 4850 X2s.


----------



## ZoneDymo (May 28, 2009)

btarunr said:


> That's one 8-pin connector per PCB. Two in all.



Ah yes, you are right, thanks.


----------



## erocker (May 28, 2009)

With only 1000 being made, I would expect workstation-card prices. $1,000 minimum.


----------



## locoty (May 28, 2009)

1Kurgan1 said:


> Another Asus innovation, seems they should almost be making their own GPU's as they also are the only company who makes 4850x2's.



Did you forget Sapphire?

In fact, I think Sapphire is the only company that makes the 4850 X2.


----------



## method526 (May 28, 2009)

damn, that thing looks like a C4 brick. i hope the cooling will be sufficient; i don't think performance will be a problem.


----------



## soldier242 (May 28, 2009)

what a beast


----------



## INSTG8R (May 28, 2009)

Meh, just another "I have the biggest e-penis card"....


----------



## happita (May 28, 2009)

I would so have gotten this if it had been released a few months ago, but now I'm just waiting on DX11 cards. That's too bad, because that card looks like a freakin' tank!
4GB makes my pee-pee go, da doing doing doing!!


----------



## qubit (May 28, 2009)

Frakking awesome card and sell my GTX285 coz me want!

I might just get one and burn out my credit card with the expense.  Question is, which card??! 

And then the GT300 will come out and I won't want this toy any more, but the shiny new one. hehehehehe


----------



## buggalugs (May 28, 2009)

qubit said:


> Frakking awesome card and sell my GTX285 coz me want!
> 
> I might just get one and burn out my credit card with the expense.  Question is, which card??!
> 
> And then the GT300 will come out and I won't want this toy any more, but the shiny new one. hehehehehe



Don't do it, my friend. DX11 cards are less than 6 months away, and this card will look much less impressive and lose most of its value.


----------



## btarunr (May 28, 2009)

There's always something better 6 months down the line, and will always be so, unless, God forbid, North Korea sends something flying towards California.


----------



## El_Mayo (May 28, 2009)

btarunr said:


> There's always something better 6 months down the line, and will always be so. Unless, God forbid, North Korea sends something flying towards California.



roflmao
that's highly unlikely


----------



## 1Kurgan1 (May 28, 2009)

locoty said:


> Do you forget Sapphire?
> 
> in fact i think Sapphire is the only company who makes 4850x2



Good eye, forgot it was Sapphire, hmmm... Just seems weird that these PCB manufacturers are going crazy. Kinda cool, though.


----------



## qubit (May 28, 2009)

buggalugs said:


> Dont do it my friend. DX 11 cards are less than 6 months away and this card will look much less impressive and lose most of its value.



Indeed, that's good advice my friend.  While I would very much like to have it, I wasn't actually being serious. It's just that my GTX 285 is very nice, so double that in a special edition card is twice as desirable!

Can you imagine the shock to the system when the GT300 comes out, DX11 support, 4GB GDDR5 _as standard_ (and crucially, all visible) and 3 times the power of a GT200 - the Asus will go from what, $1000 to 150, 100, 50? 

While one would still be paying it off....

Doesn't bear thinking about, does it?


----------



## Assassin48 (May 28, 2009)

I'm pretty sure one of ATI's partners will bring out a 4890 X2 to compete against this thing, no matter what ATI says, and people are willing to pay for the latest and greatest.

WE WANT POWER !


----------



## BOSE (May 28, 2009)

This card is for the sheer purpose of bragging rights for ASUS: that they were the first to slap on 4GB of RAM.

Next thing, we will see someone make a card with 6GB of RAM, then 12GB, and so on and so on.

I'll bet 4GB will be standard on all performance cards within a year.


----------



## El Fiendo (May 28, 2009)

Um, it also uses GTX 285 processors instead of GTX 275s. It has a 512-bit interface and the full 240 shader processors, so it's not just more memory.


----------



## erocker (May 28, 2009)

El Fiendo said:


> Um, it also uses GTX285 processors instead of GTX275. It has a 512 bit interface and the full 240 shader processors so its not just more memory.



Both the GTX 285 and 275 have 240-shader GPUs. The GTX 275 just uses a 448-bit bus and 896MB of RAM.


----------



## alexp999 (May 28, 2009)

El Fiendo said:


> Um, it also uses GTX285 processors instead of GTX275. It has a 512 bit interface and the full 240 shader processors so its not just more memory.



The GTX 295 uses the GTX 260 GPU, so 216 SPs


----------



## BOSE (May 28, 2009)

It doesn't matter. The point is that ASUS is trying to be first at something that no one has done yet.

Soon it will be standard among all VGA makers. Thus, no reason to piss all over yourself like a school girl in some cheap porn.


----------



## Hayder_Master (May 28, 2009)

best performance card in the world right now


----------



## Kitkat (May 28, 2009)

BOSE said:


> It doesnt matter. The point, is that Asus trying to be first at something that no one has done it yet.
> 
> Soon it will standard among all VGA makers.



Hardly. The next couple of years will be all about process SIZE and RAM speed - efficiency over RAM amounts. We are on the brink of 22, 28, 32, 45 and 55 nm (which ppl are running away from, even though it would be just as awesome with new architecture). 4GB as a "soon standard" is silly. The card is a gimmick, like most 2GB cards now that are the same speed (YES, EVEN TEXTURE-WISE) as their 1GB counterparts. This all goes back to good, solid code and efficiency. Look at Crysis 1 as today's standard benchmark - also silly. Compared side by side with the second one, it was "poorly coded". Advancements in CODE were made, and the engine itself and its performance the second time around were MUCH better. Believing that Crysis would "soon become the standard demand for vid cards" was completely SILLY. Running to higher (unnecessary) RAM amounts on vid cards leads to looser code that uses UNNECESSARY resources. With smaller chip and RAM processes coming easier and sooner than they did before, they are the future. Personally, on the side - I dunno if it's just me, but as a coder I think CONSTRAINTS make better, tighter and more efficient code, which brings out performance that's expected and sometimes even UNEXPECTED from hardware. Sure, Crysis caused an arms race that gave us the stuff we have now, but when they came back the second time - and even from the previews I've seen of the 3rd - we saw CODE do what hardware couldn't. Always CODE over hardware. I love hardware, but if code doesn't utilize ANY of this $hi+, what's the point?


----------



## BOSE (May 28, 2009)

Kitkat said:


> Hardly. The next couple of years will be all about process SIZE and RAM speed - efficiency over RAM amounts. We are on the brink of 22, 28, 32, 45 and 55 nm (which ppl are running away from, even though it would be just as awesome with new architecture). 4GB as a "soon standard" is silly. The card is a gimmick, like most 2GB cards now that are the same speed (YES, EVEN TEXTURE-WISE) as their 1GB counterparts. This all goes back to good, solid code and efficiency. Look at Crysis 1 as today's standard benchmark - also silly. Compared side by side with the second one, it was "poorly coded". Advancements in CODE were made, and the engine itself and its performance the second time around were MUCH better. Believing that Crysis would "soon become the standard demand for vid cards" was completely SILLY. Running to higher (unnecessary) RAM amounts on vid cards leads to looser code that uses UNNECESSARY resources. With smaller chip and RAM processes coming easier and sooner than they did before, they are the future. Personally, on the side - I dunno if it's just me, but as a coder I think CONSTRAINTS make better, tighter and more efficient code, which brings out performance that's expected and sometimes even UNEXPECTED from hardware. Sure, Crysis caused an arms race that gave us the stuff we have now, but when they came back the second time - and even from the previews I've seen of the 3rd - we saw CODE do what hardware couldn't. Always CODE over hardware. I love hardware, but if code doesn't utilize ANY of this $hi+, what's the point?




And there was a time when we were happy with 32MB of RAM in Win 95. Now cell phones have as much RAM as desktop PCs used to have.

Point is, you can't stop hardware progress, no matter how good the code is. There is always money to be made from newer, bigger, better, faster hardware. Today it's Crysis; tomorrow it's the new Pac-Man that will raise the bar.


----------



## DeathTyrant (May 28, 2009)

alexp999 said:


> The GTX 295 uses the GTX 260 GPU, so 216 SPs


 No, the GTX 295 is exactly 2x275.

http://www.nvidia.com/object/product_geforce_gtx_295_us.html


> Processor Cores: 480 (240 per GPU)


----------



## alexp999 (May 28, 2009)

DeathTyrant said:


> No, the GTX 295 is exactly 2x275.
> 
> http://www.nvidia.com/object/product_geforce_gtx_295_us.html



Hmm, I stand corrected


----------



## El_Mayo (May 28, 2009)

what if..
i think i already said this a few pages back,
but what if nVidia made 40nm versions of the 9800GTX
and stuck four of those on one card?
wouldn't that be a tiny bit better?
cheaper and less power hungry!


----------



## Marineborn (May 28, 2009)

im an ATI fanboy, but this is extremely tempting... possibly 2 of them SLI'd! BWHAHWAHAWHAWHAW HWAHAW HW H!!!!  ! ! V@#@!*&^)$  *goes mad with power*


----------



## El_Mayo (May 28, 2009)

Marineborn said:


> im a ati fanboy, but this is extremly temping...possible 2 of them slid!BWHAHWAHAWHAWHAW HWAHAW HW H!!!!  ! ! V@#@!*&^)$  *goes mad with power*



you must be joking. lmfao
just why in god's name would you want that? 
is your monitor a cinema screen?


----------



## alexp999 (May 28, 2009)

El_Mayo said:


> you must be joking. lmfao
> just why in god's name would you want that?
> is your monitor a cinema screen?



Its called e-penis.


----------



## zithe (May 28, 2009)

You'd think an e-penis so big would be painful and require some medical attention. Mental help for those who see it.


----------



## El Fiendo (May 28, 2009)

Thanks Erocker.



alexp999 said:


> Hmm, I stand corrected



Yea, I got confused by their SP-per-chip thing, thinking it had 216 SPs. Mind you, I knew it was 275s in there; I was simply mistaken as to what the 275 and 295 had under the hood. Cripes, nVidia's lineup is tough sometimes.


----------



## Kitkat (May 28, 2009)

BOSE said:


> And there was time when we were happy with 32MB of ram in Win 95. Now cell phones have as much ram as desktop PC used to have.
> 
> Point is, you cant stop hardware progress, no matter how good the code is. There is always money to be made from something newer, bigger, better, faster hardware. Today its Crysis, tomorrow its the new Packman that will raise the bar.



No, you sure can't, which I stated clearly. But that has nothing to do with 4MB > 1GB. Those "hardware progressions" will come from process size and RAM speed for the next 10-15 years. 4GB is for pissing rights; it won't become standard, raise speed or make textures go any faster. It's not progress.


----------



## alexp999 (May 29, 2009)

El Fiendo said:


> Thanks Erocker.
> 
> 
> 
> Yea, I got confused by their SP per chip thing thinking that it had 216 SP. Mind you I knew it was 275s in there I was simply mistaken as to what the 275 and 295 had under the hood. Cripes nVidia's lineup is tough sometimes.



Both companies are as bad as each other. AMD's naming structure for their ATI HD 4xxx series makes no sense, especially with a 4730 coming out.


----------



## El_Mayo (May 29, 2009)

alexp999 said:


> Both companies are as bad as each other. AMD naming structure for their ATI HD4xxx series makes no sense. Especially with a 4730 coming out.


not confusing at all to me
when is the 4730 coming out?


----------



## El Fiendo (May 29, 2009)

alexp999 said:


> Both companies are as bad as each other. AMD naming structure for their ATI HD4xxx series makes no sense. Especially with a 4730 coming out.



Oh I'm sure, I'm not saying nVidia is the worst. I just don't have enough knowledge of ATI. I actually haven't had any ownership contact with ATI since the ATI Rage Pro in my parents' IBM, and I haven't kept up with their lineup at all.


----------



## BUCK NASTY (May 29, 2009)

Oh, I'll be in line for that if it's ever released. oh yeah..


----------



## entropy13 (May 29, 2009)

El Fiendo said:


> Oh I'm sure, I'm not saying nVidia is the worst. I just don't have enough knowledge of ATI. I actually haven't had any ownership contact with ATI since my ATI Rage Pro in my parents IBM, and haven't kept up with their lineup at all.



Nvidia's slightly worse, because there's GT 1xx (OEMs), GTS 250, and GTX 2xx. Then of course there's still the 9xxx GT's.

For ATi right now it's just the Radeon HD 4xxx. 4500, 4600, 4700, 4800 series.


----------



## eidairaman1 (May 29, 2009)

i expect Fitseries to have 4 of these in his Core i7 machine; that way his power supply gets utilized properly.



entropy13 said:


> Nvidia's slightly worse, because there's GT 1xx (OEMs), GTS 250, and GTX 2xx. Then of course there's still the 9xxx GT's.
> 
> For ATi right now it's just the Radeon HD 4xxx. 4500, 4600, 4700, 4800 series.



4500? Never heard of it.


----------



## BOSE (May 29, 2009)

Kitkat said:


> No u sure cant which i stated clearly. but has nothing to do with 4mb > 1G. Those "hardware progressions" will come from the process size and ram speed for the next 10 -15 years. 4GB is for pissing rights it wont be a standard, raise speed or make textures go any faster. Its not progress.



As I said before: at some point, 256MB of RAM was for pissing rights; now 1GB is standard. Tomorrow it's 4GB. The code is getting bigger and bigger and thus will require more space. The code can be optimized only so much.


----------



## Wile E (May 29, 2009)

BOSE said:


> As i said before. At some point, 256MB of RAM was for pissing rights, now 1GB is standard. Tomorrow its 4GB. The code is getting bigger and bigger, thus will require more space. The code can be optimized only so much.



And if you want to get technical, this is more like two 2GB cards, not a single 4GB one. Only 2GB of framebuffer can be used at any given moment.

I'd love to have this card. 2GB will even max GTA IV seamlessly.
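Wile E's distinction - mirrored memory, not pooled - can be written out in numbers. This is a sketch of the usual alternate-frame-rendering model, using the figures from this thread:

```python
# In alternate-frame rendering, each GPU renders whole frames on its
# own, so every texture and buffer must be duplicated per GPU:
# memory mirrors rather than adds.
per_gpu_gb = 2
gpu_count = 2

advertised_gb = per_gpu_gb * gpu_count  # what the box says: 4 GB
effective_gb = per_gpu_gb               # usable framebuffer: 2 GB

print(advertised_gb, effective_gb)
```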


----------



## 1c3d0g (May 29, 2009)

I wonder how good this card can Fold/Crunch with the extra memory, as some BOINC projects can use well over 1.6 GB of system RAM alone...


----------



## a_ump (May 29, 2009)

lol, beast for sure, but a waste imo. as said many times, just for those that desire e-peen and the utmost best, even if it's 2% better for 200% the price.

bout that memory addressing stuff with 32-bit: i ran an 8800GT 512MB, then it died, and i have my 7800GTX 256MB now. but i didn't gain 256MB in available memory - i've always had 2.8GB out of 4GB, even though the 7800GTX has half the memory of the 8800GT. so can any of you explain why i didn't get a jump in available system memory when switching to my 7800GTX?


----------



## Mussels (May 29, 2009)

a_ump said:


> lol beast for sure, but a waste imo. as said many time, just for those that desire e-peen and the utmost best even if it's 2% better for 200% the price.
> 
> bout that memory addressing stuff with 32-bit, i ran an 8800GT 512mb and then it died and i have my 7800GTX 256mb now. But i didn't gain 256mb in available memory, i've always had 2.8gb out of 4gb, even though the 7800GTX has half the memory of the 88GT. So can any of you explain why i didn't get a jump in avail system memory when switching to my 7800GTX?



it's not directly linked to how much video RAM you have, nor is video RAM what actually limits it. it's related to the motherboard, in my findings - i've seen some systems do 3.25GB and others 3.5GB, but those numbers never changed when different video cards were installed.
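Mussels' observation fits the usual explanation: below the 4 GiB line, the board reserves address ranges for device mappings (GPU aperture, chipset, PCI), and RAM behind those ranges is invisible to a 32-bit OS. A rough model - the reservation sizes here are made up for illustration:

```python
GIB = 1024 ** 3
MIB = 1024 ** 2

def usable_ram_32bit(installed_ram, mmio_reservations):
    """Rough model: below 4 GiB, address space taken by device
    mappings (MMIO) can't also map RAM, so that RAM is lost to a
    32-bit OS without PAE remapping."""
    space = 4 * GIB
    reserved = sum(mmio_reservations)
    return min(installed_ram, space - reserved)

# Hypothetical reservation maps for two boards (sizes are made up):
board_a = [512 * MIB, 256 * MIB]              # GPU aperture + chipset/PCI
board_b = [512 * MIB, 256 * MIB, 256 * MIB]   # same, plus extra devices

print(usable_ram_32bit(4 * GIB, board_a) / GIB)  # 3.25
print(usable_ram_32bit(4 * GIB, board_b) / GIB)  # 3.0
```

The video card's own VRAM size doesn't appear anywhere in the calculation; only the aperture the board maps for it does, which is why swapping cards didn't change the numbers.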


----------



## a_ump (May 29, 2009)

Mussels said:


> its not directly linked to how much video ram you have, nor is video ram what actually limits it. its related to the motherboard in my findings - i've seen some systems do 3.25GB and others 3.5GB, but those numbers never changed when different video cards were installed.



i always assumed the motherboard was the main factor in usable memory, simply because my friend has an 8800GTX, 2 more HDDs than me, and a sound card (i use onboard), and his system allows 3.5GB usable. thx


----------




## entropy13 (May 29, 2009)

eidairaman1 said:


> 4500, never heard about it.



Actually, there isn't a 4500 series... because there's only one card, the 4550. I forgot to include the 4350 as well.


----------



## PCpraiser100 (May 29, 2009)

If only I had kept my crappy Honda for another year.


----------



## Kitkat (May 29, 2009)

BOSE said:


> As i said before. At some point, 256MB of RAM was for pissing rights, now 1GB is standard. Tomorrow its 4GB. The code is getting bigger and bigger, thus will require more space. The code can be optimized only so much.



Thanks on the first part, but the second part is totally backwards. As I said, we are in a much DIFFERENT time than yesterday. Code > HW. Processes are now developed MUCH faster; as I said in the very first post, amounts WON'T mean much, just as they don't now. Yesterday, all they could do was raise the amount. Now we get to smaller processes faster - to repeat myself: 22, 28, 32, 40, 55. Same with RAM: RAM speeds are about to go UP, and 2 gigs will do X times more than they used to, and FASTER. This card will be no faster than the 1GB version in ANY field. By 3 frames maybe... over the 100 it was already getting?? Will you notice? NO. Chip/RAM speeds have WAY more influence than RAM amount. We should be over amounts and on to speed.


----------



## cragdor (May 29, 2009)

*GPU 4 Gig of RAM*



El_Mayo said:


> ahh right yeah
> i didn't know video RAM counted you see
> (i'm noobish)
> 
> ...



It doesn't matter what the host OS is running. 32-bit systems don't register the VRAM; the GPU does, and it runs its own architecture, which is neither 32-bit nor 64-bit. The host CPU doesn't understand what the VRAM is; it just sends commands over the common PCIe interface to the GPU. It's like saying that if I have a home network with a 64-bit computer and a 32-bit computer, then because the 32-bit computer is connected to the 64-bit one, the 64-bit one can only see 3.2GB of its own RAM. And for those of you with integrated graphics, 4GB of RAM and a 32-bit OS: assign the spare RAM to the integrated chip, as the BIOS will split the RAM, so you don't have memory sitting around doing nothing.


----------



## Tony (May 30, 2009)

Kitkat said:


> Thanks on the first part, but the second part is totally backwards. As I said, we are in a much DIFFERENT time than yesterday: code > HW. Processes are shrinking MUCH faster now, so as I said in the very first post, amounts WON'T mean much, just as they don't now. Yesterday all they could do was raise the amount; now we get faster development to smaller processes - to repeat myself: 22, 28, 32, 40, 55 nm. Same with RAM: RAM speeds are about to go UP, and 2 GB will do X times more than it used to, and FASTER. This card will be no faster than the 1 GB version in ANY field. Maybe 3 frames... over the 100 it was already getting? Will you notice? NO. Chip and RAM speeds have WAY more influence than the RAM amount. We should be over amounts and on to speed.



At the moment code > HW may be the case when it comes to games, but I can tell you it's the other way around when it comes to design software, like when I am using something like SolidWorks or Illustrator. Once my work starts getting complex, the HW chokes; there is no system out there that you can't choke (and not just for the sake of it, either - I'm talking real, comprehensive creative work). Even with 64-bit Windows 7, which removes the 4 GB system limit (what is it, about 80 GB?), I could still choke the system using such a card. For productive creative work it's all about workflow, and the current software can more than handle it - but the hardware is still limited.


----------



## Snipermonkey2 (May 30, 2009)

Necrofire said:


> 32-bit OS = 4GB max of memory address space. If you had, say, 2GB of ram, then there's only 2 left for addressing, which means that your shiny gpu can use a max of 2GB of ram.
> If you had 4GB of ram, then to fit all that vram in, it takes away from system ram, leaving you with way less than 3GB most people get from a 32-bit OS with 4GB of ram.
> 
> Typed in a hurry, does it make sense though?



A: There is a patch from Microsoft to allow the use of more memory on a 32-bit system, but in turn it might eat your boot loader; there is a list showing which motherboards it works with.

B: I don't think that's going to be an issue, because the hardware can see that you have more RAM than your OS says you do, so I think it would work without any issues.
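Necrofire's point in the quote above is just address-space arithmetic: a 32-bit OS has 2^32 bytes of address space, and whatever the video card's memory aperture and other device MMIO claim below 4 GB is no longer available for mapping system RAM. A minimal sketch of that arithmetic (the aperture and MMIO sizes here are made-up example figures, not measurements from any real board):

```python
# Sketch of the 32-bit address-space arithmetic discussed above.
# All device sizes are illustrative assumptions, not real measurements.
GB = 1024**3
TOTAL_ADDRESS_SPACE = 4 * GB  # 2**32 bytes addressable by a 32-bit OS


def usable_system_ram(installed_ram, vram_aperture, other_mmio):
    """RAM the OS can actually map after device apertures claim their share."""
    reserved = vram_aperture + other_mmio
    # You can never use more than what's installed, nor more than what
    # remains of the address space after reservations (floor at zero).
    return max(0, min(installed_ram, TOTAL_ADDRESS_SPACE - reserved))


# 4 GB installed, a 1 GB card mapped below 4 GB, ~0.5 GB of other MMIO:
print(usable_system_ram(4 * GB, 1 * GB, GB // 2) / GB)  # → 2.5
```

Swap in a bigger aperture to see why a 4 GB card worries people on a 32-bit OS: the more address space the card claims, the less of the installed RAM survives.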


----------



## Wile E (May 30, 2009)

Kitkat said:


> Thanks on the first part, but the second part is totally backwards. As I said, we are in a much DIFFERENT time than yesterday: code > HW. Processes are shrinking MUCH faster now, so as I said in the very first post, amounts WON'T mean much, just as they don't now. Yesterday all they could do was raise the amount; now we get faster development to smaller processes - to repeat myself: 22, 28, 32, 40, 55 nm. Same with RAM: RAM speeds are about to go UP, and 2 GB will do X times more than it used to, and FASTER. This card will be no faster than the 1 GB version in ANY field. Maybe 3 frames... over the 100 it was already getting? Will you notice? NO. Chip and RAM speeds have WAY more influence than the RAM amount. We should be over amounts and on to speed.



I have to disagree. Sure, only a couple games will benefit from a buffer over 1GB, but they do exist. These games may not get a huge increase in average framerates, but they most certainly do benefit in terms of smoothness and playability. The more you swap out to system memory, the more stutters, pauses and texture anomalies you get when you lack sufficient frame buffer, even with OCed GDDR5. These issues don't show up in benchmarks where average framerates are taken, but they sure as hell are noticeable while playing the game. Furthermore, the gfx ram speed makes no difference in the issue, as it's already much faster than the system memory the machine has to swap out to.

Before my current setup, I had a 4870 512MB. I sent it back for a 1GB 4870, and the differences were noticeable in games like Crysis, FC2, and GTA IV. My Fraps averages were roughly the same, but the games were a hell of a lot smoother with the 1GB.

GTA IV, with its huge textures, will see an improvement in gameplay with cards that have over 1GB. Do you honestly believe that no other future game will have large textures like that?


----------



## Johnytxtc (May 30, 2009)

I don't really care about any of this.


----------



## freakytiki4u (May 30, 2009)

*Question*



kinc said:


> This is with a liquid nitrogen thermal solution installed on the cards. I didn't have time to insulate well enough, so the cards are only running at -40C with 1.175V set via software (up from 1.15V). There is headroom.
> 
> http://www.kinc.se/mars3.jpg



Off topic, I know, but what is the motherboard mounted to? Can it be purchased? I change a lot of capacitors on motherboards, and I have been looking for quite a while for a device I can mount a motherboard to so I don't have to hold it while trying to solder caps in and out. If anyone knows anything that is good for this, please advise. I would really prefer something that holds the motherboard in the same upright position as a case does but is open on both sides. Oh yeah, I'm drooling over that video card...


----------



## BOSE (May 30, 2009)

Kitkat said:


> Thanks on the first part, but the second part is totally backwards. As I said, we are in a much DIFFERENT time than yesterday: code > HW. Processes are shrinking MUCH faster now, so as I said in the very first post, amounts WON'T mean much, just as they don't now. Yesterday all they could do was raise the amount; now we get faster development to smaller processes - to repeat myself: 22, 28, 32, 40, 55 nm. Same with RAM: RAM speeds are about to go UP, and 2 GB will do X times more than it used to, and FASTER. This card will be no faster than the 1 GB version in ANY field. Maybe 3 frames... over the 100 it was already getting? Will you notice? NO. Chip and RAM speeds have WAY more influence than the RAM amount. We should be over amounts and on to speed.



We are moving away from 32-bit code and towards solely 64-bit code. So come back and talk to me in a few years, when your code needs more space.


----------



## Hayder_Master (May 30, 2009)

Also, it is a good card for folding, but it seems you need water cooling for it. They did everything great on this card except the cooler.


----------



## MrHydes (Jun 1, 2009)

The results are with 2x MARS 295 (4 GPUs); that's how they achieved X25K, and almost 25K on the GPU score, in Vantage's Extreme preset. I think ASUS should make at least 10 times more than the limited-edition run, because I believe they would all fly off the shelves even at these prices, and knowing that the next gen is popping out any time now.


----------

