# NVIDIA Designs New GTX 260 PCB, Further Reduces Manufacturing Costs



## btarunr (Feb 7, 2009)

The introduction of the new G200b series graphics processors sought to revive NVIDIA's stronghold over the high-end graphics market by reducing manufacturing costs and facilitating high-end graphics cards at unusually low price-points, to compete with rival ATI. The first SKU using the G200b GPU was the new GeForce GTX 260. The PCB design of the new model (P654) saw several drastic changes that also ended up contributing to the cost-cutting: all memory chips were placed on the business end of the PCB, and the VRM area was rearranged. News emerging from Expreview suggests that NVIDIA has worked out an even newer PCB reference design (model: P897) that aims mainly to cut production costs further. The reference-design graphics board based on this PCB will carry the internal name "D10U-20". A short list of changes is as follows:

- The number of PCB layers has been reduced from 10 to 8, perhaps by compressing or removing blank, redundant or rudimentary connections
- A 4+2 phase NVVDD power design using the ADP4100 voltage regulator IC; the FBVDDQ circuit has been reduced from 2 phases to 1, and the MOSFET package has been changed from LFPAK to DPAK to reduce costs. The ADP4100 lacks an I2C interface, which means voltage control will be much more difficult than on current PCBs of the GeForce GTX 260, 280, 285 and 295
- The optional G200b support-brace has been removed
- While the length of the PCB remains the same, the height has been reduced to cut costs
- BIOS EEPROM capacity reduced from 1 Mbit (128 KB) to 512 Kbit (64 KB)
- Cheaper DVI connectors
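As a quick sanity check on the EEPROM figures in the list above (plain bit-to-byte arithmetic, not from the source):

```python
# Sanity-check the quoted EEPROM capacities: capacities are given in
# bits, while BIOS image sizes are given in bytes (8 bits per byte).
def kbit_to_kb(kbit: int) -> int:
    """Convert kilobits of EEPROM capacity to kilobytes."""
    return kbit // 8

old_eeprom_kb = kbit_to_kb(1024)  # 1 Mbit (1024 Kbit) on the older P654 PCB
new_eeprom_kb = kbit_to_kb(512)   # 512 Kbit on the new P897 PCB

print(old_eeprom_kb, new_eeprom_kb)  # 128 64
```

So the halved part does line up with the figures in parentheses: 1 Mbit is 128 KB and 512 Kbit is 64 KB.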


The new PCB is expected to reduce costs by as much as US $15, which will lower the overall product cost and help step up competitiveness. Expreview notes that the new PCB will be available to partners by the third week of this month. Below are the drawing and picture of the PCB. For reference, the second picture is that of the older P654 design.



 

 



*View at TechPowerUp Main Site*


----------



## alexp999 (Feb 7, 2009)

So current gen GT200 PCBs are better, then?

- Voltage Control
- Better DVI ports
- Bigger EEPROM

And do you mean the width of the PCB is 1.5 cm less? Got me confused for a sec 'cause I was thinking, "But the PCB is only about 5mm thick!", lol.


----------



## Disruptor4 (Feb 7, 2009)

It appears they are skimping out on PCB manufacturing which is not good.


----------



## btarunr (Feb 7, 2009)

alexp999 said:


> So current gen GT200 PCBs are better, then?
> 
> - Voltage Control
> - Better DVI ports
> ...


Yes, software voltage control could get difficult, as the VRM IC doesn't support the standard interface through which voltage is controlled by software. The DVI ports really wouldn't make a noticeable difference; all the older ones had extra was EMI shielding. We doubt the G200b made use of the extra 64 KB of EEPROM space on the older PCB; your BIOS .rom file for the GTX 260 always weighed 64 KB. The "height" as in:







Look at the red line. That dimension for a PCB is called its height. For example, "half-height" cards are HTPC or Slim form-factor friendly.


----------



## rpsgc (Feb 7, 2009)

alexp999 said:


> And do you mean the *width* of the PCB is 1.5 cm less? Got me confused for a sec 'cause I was thinking, "But the PCB is only about 5mm thick!", lol.



*Height*.


----------



## Zubasa (Feb 7, 2009)

alexp999 said:


> So current gen GT200 PCBs are better, then?
> 
> - Voltage Control
> - Better DVI ports
> ...


It's called the height because the card always sits perpendicular to the motherboard.


----------



## iamverysmart (Feb 7, 2009)

Why not just call it length.


----------



## rpsgc (Feb 7, 2009)

iamverysmart said:


> Why not just call it length.



Uhm... because it's not?

http://en.wikipedia.org/wiki/Length


----------



## btarunr (Feb 7, 2009)

iamverysmart said:


> Why not just call it length.








^Length.


----------



## DaedalusHelios (Feb 7, 2009)

Good news post and thanks for explaining it too.


----------



## EarlZ (Feb 7, 2009)

Which means we will get a less durable card with less overclocking headroom. Kudos to NVIDIA, then.


----------



## Tatty_One (Feb 7, 2009)

Disruptor4 said:


> It appears they are skimping out on PCB manufacturing which is not good.



It's good if it works and reduces costs.


----------



## DaedalusHelios (Feb 7, 2009)

EarlZ said:


> Which means we will get a less durable card with less overclocking headroom. Kudos to NVIDIA, then.



The only people it will hold back will be the ones trying to use software voltmod techniques. The people serious about volt modding already use the hardware method instead.

I guess some people see all news as bad news. It shouldn't make any difference except lower costs to the manufacturer and customer. Sounds like an upside.


----------



## Jarman (Feb 7, 2009)

Sods up full-cover waterblocks too, I'd imagine?


----------



## EarlZ (Feb 7, 2009)

DaedalusHelios said:


> The only people it will hold back will be the ones trying to use software voltmod techniques. The people serious about volt modding already use the hardware method instead.
> 
> I guess some people see all news as bad news. It shouldn't make any difference except lower costs to the manufacturer and customer. Sounds like an upside.



Maybe, but I can't see how reducing the PCB layers and VRs would not somewhat reduce the overclocking capabilities and durability... also "cheaper" DVI connectors... image quality degradation, perhaps?


----------



## btarunr (Feb 7, 2009)

EarlZ said:


> also "cheaper" DVI connectors... image quality degradation, perhaps?



No. All a DVI connector does is connect the card to the monitor. It's just a piece of plastic with a few sockets and metal conveying the signal. All that's different between the new one and the old is that the old one used an EMI shield, which NVIDIA evidently found unnecessary. Image quality is the care of the NVIO2 processor, which handles display output; the fact that it's isolated from the GPU (and its power-hungry components) shows they've already dealt with EMI and other forms of interference, although the real reason for separating the display logic was that the GPU die had become too big.


----------



## eidairaman1 (Feb 7, 2009)

The only thing I see people really complaining about is the EEPROM size being shrunk; perhaps a larger one can be wired in.


----------



## buggalugs (Feb 7, 2009)

I don't like it. Sounds like poorer quality all round. A $15 saving on a $300-$400 card doesn't sound like it's worth it.


----------



## DrPepper (Feb 7, 2009)

buggalugs said:


> I don't like it. Sounds like poorer quality all round. A $15 saving on a $300-$400 card doesn't sound like it's worth it.



I don't think they will pass the savings on to the consumer; this is probably so they can make more money.


----------



## eidairaman1 (Feb 7, 2009)

More yields = more money for them.


----------



## AddSub (Feb 7, 2009)

With the global economy in such bad shape this was to be expected. In fact, they have already reduced costs with their 55 nm lineup. The coolers on the 55 nm lineup have been cut down: compared to the coolers on the original 65 nm GTX 260/280 lineup, the new ones are lighter, have no back cover, and have shorter, smaller heatpipes, and many benchmarks reveal that in many situations the 55 nm parts actually run hotter than the 65 nm parts. Cutting down on the actual circuitry was the next logical step. They are taking the route AMD took a year ago. Hence the massive jump in Radeon 3xxx RMAs compared to the previous generation, something that also carried on with the Radeon 4xxx lineup. In the following months you can expect a healthy increase in threads with titles such as "OMG! My brand new GTX 260 is DEAD after 2 days" or "My brand new nvidia GPU is artifacting at stock clocks!". Just watch.

Nehalem from Intel and the 65 nm GTX GPUs from nVidia are truly the last quality products we will see from either manufacturer, since due to the worsening global economic conditions they will be cutting down on quality assurance along with everybody else in the industry. Here is an easy prediction: the next massive GPU release from nVidia (the 384SP monster) gets pushed back by at least 6 months.


----------



## newtekie1 (Feb 7, 2009)

If it makes the card cheaper to produce, and allows the manufacturers to reduce the prices on the cards to be more competitive, this can only be seen as a good thing in the consumer's eyes.

Most consumers don't overclock the cards, and even fewer volt-mod them.  So the reductions won't affect the majority of people buying the cards.  And the ones that would be affected by the changes will just buy one of the more expensive versions that use the old PCB, as I'm sure  both will exist side by side on the market.


----------



## LittleLizard (Feb 7, 2009)

IMO this has its pros and cons.

Pros: cheaper good NVIDIA cards.
Cons: worse overclocking, the pre-overclocked cards will have a higher DOA rate, not as durable as the first gen.

Conclusion: as long as you don't overclock, this is good; if you overclock, this is bad.


----------



## newtekie1 (Feb 7, 2009)

LittleLizard said:


> IMO this has its pros and cons.
> 
> Pros: cheaper good NVIDIA cards.
> Cons: worse overclocking, the pre-overclocked cards will have a higher DOA rate, not as durable as the first gen.
> ...



Overclocking shouldn't affect durability with these changes; in fact these changes shouldn't affect durability at all unless more voltage is being run through the card (i.e. volt-mods).


----------



## phanbuey (Feb 7, 2009)

newtekie1 said:


> Overclocking shouldn't affect durability with these changes; in fact these changes shouldn't affect durability at all unless more voltage is being run through the card (i.e. volt-mods).



+1

This is nothing but good news. The cost saving will have minimal, if any, impact on the performance of the card; they are just becoming more skilled with the manufacture of these cards and eliminating unnecessary waste.


----------



## Haytch (Feb 7, 2009)

I would rather pay the $15 extra and not have those features taken away.
The cons of this move too heavily outweigh the pro here. I say pro because it only has one real pro, that $15 saving, which in fact doubles up as a con.

If NVIDIA are unable to wipe $15 off their products without stripping them down, then they are in trouble. But we all know that this is not the case. It's one thing to reduce costs; it's another thing to strip the card of parts.

To me, this is nowhere near a $15 price reduction or an improvement in manufacturing. What I gather is that NVIDIA worked out a way to take off more than $15 worth of parts and still have the card work, which gives the end user a $15 saving with NVIDIA raking in well over $15 in profit.

Nothing they removed was 'unnecessary', unless of course you're a monkey.


----------



## soryuuha (Feb 7, 2009)

IMHO, PCB layer count is important for RAM overclocking; does the same go for graphics cards?

Correct me if I'm wrong.


----------



## DrPepper (Feb 7, 2009)

I think this is a good move on NVIDIA's part. I doubt many people would even notice if it hadn't come up in the news.

As for them removing unnecessary parts: I think if they were necessary they would have kept them there; it's not like NVIDIA to make ridiculous mistakes concerning board design.


----------



## EastCoasthandle (Feb 7, 2009)

btarunr said:


> No. All a DVI connector does is connect the card to the monitor. It's just a piece of plastic with a few sockets and metal conveying the signal. All that's different between the new one and the old is that the old one used an EMI shield, which NVIDIA evidently found unnecessary. Image quality is the care of the NVIO2 processor, which handles display output; the fact that it's isolated from the GPU (and its power-hungry components) shows they've already dealt with EMI and other forms of interference, although the real reason for separating the display logic was that the GPU die had become too big.



Well, I take it you are, for the most part, against this kind of practice.

Well folks, this isn't the first time they did this.
Read here and here


----------



## LittleLizard (Feb 7, 2009)

phanbuey said:


> +1
> 
> This is nothing but good news. The cost saving will have minimal, if any, impact on the performance of the card; they are just becoming more skilled with the manufacture of these cards and eliminating unnecessary waste.



OK, so the only real con would be less overclocking headroom, but I prefer cheaper because it is already a good card.


----------



## btarunr (Feb 7, 2009)

EastCoasthandle said:


> Well, I take it you are, for the most part, against this kind of practice



My personal opinion changed. Thirteen months is sufficient time for people's ways of thinking to change. I'm more informed now, and so are my opinions.

Whether I'm for or against this practice hasn't surfaced in this thread, and is irrelevant anyway.


----------



## R_1 (Feb 7, 2009)

So basically the GTX 260 will become mainstream in NVIDIA's lineup. Probably an HTPC GPU, something like the 9600-9800 GT are now, and I assume a serious drop in price will follow. New parts are coming.


----------



## EastCoasthandle (Feb 7, 2009)

btarunr said:


> My personal opinion changed. Thirteen months is sufficient time for people's ways of thinking to change. I'm more informed now, and so are my opinions.
> 
> Whether I'm for or against this practice hasn't surfaced in this thread, and is irrelevant anyway.


Your posts in this thread gave me the impression that you were giving an opinion. Also, per your own post, your opinion has changed, which is why I inquired. But thanks for the response nonetheless.


----------



## PCpraiser100 (Feb 7, 2009)

Nice plan; now I might have second thoughts about this card. Any chance it could reduce power consumption?


----------



## newtekie1 (Feb 7, 2009)

Haytch said:


> I would rather pay the $15 extra and not have those features taken away.
> The cons of this move too heavily outweigh the pro here. I say pro because it only has one real pro, that $15 saving, which in fact doubles up as a con.
> 
> If NVIDIA are unable to wipe $15 off their products without stripping them down, then they are in trouble. But we all know that this is not the case. It's one thing to reduce costs; it's another thing to strip the card of parts.
> ...




What cons are you talking about exactly?



soryuuha said:


> IMHO, PCB layer count is important for RAM overclocking; does the same go for graphics cards?
> 
> Correct me if I'm wrong.



If the PCB layers are going unused, or are only there to provide redundancy, then no, they are not important and removing them shouldn't affect overclocking.


----------



## Haytch (Feb 7, 2009)

If you're the type of user to plug in and play and never touch anything, there are no cons... Then again, there are no pros either.

I guess what I meant earlier by this card having more cons than pros was more in regard to overclocking capability and less room to play with in the BIOS. Taking away a phase would result in less efficiency and higher temperatures at its weak point.

I do believe that NVIDIA is capable of redesigning their cards to make them more efficient, more powerful and cheaper to produce, but this card doesn't cover all three. Maybe some of the lines are redundant now, maybe they really are...

I think I would like to see this card directly compared. Anyone?


----------



## Kursah (Feb 7, 2009)

I would say wait till the product is released and see what happens. Sure, more power phases sounds good for overclocking, but losing a phase for efficiency doesn't necessarily mean a loss of overclockability. If the GPU runs cooler and faster with fewer phases and can still keep up with its older brethren, then I see no issue; and if we start seeing sub-$200 GTX 260s become commonplace, I really see no issue with that either, as it gives many gamers a chance to enjoy some serious performance out of a truly great card. I've had mine since July; I did step up to a 216-core in September, but mine is still a 65 nm beast. It rocks in every game I play and then some, folds like a champ and runs cool, plus it uses less core voltage for more shader cores and decent clocks, and is just as stable as my original card.

I think this is a good progression of the GTX, though dropping 30-60 shaders, bringing memory down to something like 640 MB/512 MB and calling it a GTS 250 would've been a good move too, IMO. Sell it at a $150-170 price point and gamers would be very happy indeed.


----------



## Tatty_One (Feb 7, 2009)

buggalugs said:


> I don't like it. Sounds like poorer quality all round. A $15 saving on a $300-$400 card doesn't sound like it's worth it.



GTX 260 @ $300-$400? Damn, where do you live?



soryuuha said:


> IMHO, PCB layer count is important for RAM overclocking; does the same go for graphics cards?
> 
> Correct me if I'm wrong.



You may be right, but 95% of graphics card buyers don't overclock, so... if 95% get a better deal, that's good; and as the other 5% haven't bought the card yet, it isn't yet "bad".

At the end of the day, a reduction in costs has got to be good. If with it comes an unacceptable amount of returns, then that's bad and they have failed, though it's not as if either manufacturer has a particularly strong record in that department. I don't quite understand why NVIDIA are doing it at this late stage: both ATI and NVIDIA have new models on the way, and NVIDIA already have the fastest overall card plus the three fastest single-card solutions... makes you wonder why they are doing this, TBH.


----------



## pentastar111 (Feb 7, 2009)

AddSub said:


> With the global economy in such bad shape this was to be expected. In fact, they have already reduced costs with their 55 nm lineup. The coolers on the 55 nm lineup have been cut down: compared to the coolers on the original 65 nm GTX 260/280 lineup, the new ones are lighter, have no back cover, and have shorter, smaller heatpipes, and many benchmarks reveal that in many situations the 55 nm parts actually run hotter than the 65 nm parts. Cutting down on the actual circuitry was the next logical step. They are taking the route AMD took a year ago. Hence the massive jump in Radeon 3xxx RMAs compared to the previous generation, something that also carried on with the Radeon 4xxx lineup. In the following months you can expect a healthy increase in threads with titles such as "OMG! My brand new GTX 260 is DEAD after 2 days" or "My brand new nvidia GPU is artifacting at stock clocks!". Just watch.
> 
> Nehalem from Intel and the 65 nm GTX GPUs from nVidia are truly the last quality products we will see from either manufacturer, since due to the worsening global economic conditions they will be cutting down on quality assurance along with everybody else in the industry. Here is an easy prediction: the next massive GPU release from nVidia (the 384SP monster) gets pushed back by at least 6 months.


Actually, due to the economy I predict a RETURN to better quality, in terms of customer service and reliability. With money tight, manufacturers are going to have to have good products and service in order to get and keep customers. Why would a reputable company make a card so cheaply that it would have to be returned in a few months? You can't keep people buying your stuff if it's crappy and your customer service is the same. One step further: if there were no return policy, why would anyone in their right mind purchase the things in the first place? I don't think we have a thing to worry about in the long run. As far as the 55 nm lineup goes... I upgraded from two 640 MB 8800 GTSs to GTX 285s. Not only do they kick some butt, they don't run any hotter than the older cards.


----------



## LAN_deRf_HA (Feb 7, 2009)

pentastar111 said:


> Actually, due to the economy I can see a RETURN to better quality, in terms of customer service and reliability. With money tight, manufacturers are going to have to have good products and service in order to get and keep customers...



That's not how it usually works... especially when you start cutting your customer service reps.


----------



## raptori (Feb 7, 2009)

No, no, NVIDIA, you killed the best and most popular card on the market... I should go and find another 65 nm GTX 260 before they run out... or change my avatar.


----------



## DarkMatter (Feb 7, 2009)

Tatty_One said:


> GTX260 @ $300 - $400, damn where do you live
> 
> 
> 
> ...



Probably, by now that 5% of people who would volt-mod the card have already bought it or will choose another one.

As for why they are doing this: I think that reducing costs is a good enough reason on its own. Even if they release new cards, the GTX 260 will stay around for a long time IMO, just as the 8800 GT did, and making it cheaper is always good. I don't think this will result in higher returns or crippled overclocking. Many non-reference boards from many vendors are cheaper and simpler, and that doesn't make them worse. Sometimes they're better than the reference ones, because the vendors had time to test many things and correct what's "wrong". This is no different.


----------



## spearman914 (Feb 8, 2009)

Good news, but IMO I would rather spend the $15 for voltage control so you can OC the crap out of it.


----------



## Mussels (Feb 8, 2009)

btarunr said:


> My personal opinion changed. 13 months is sufficient time for peoples' ways of thinking to change. I'm more informed now, so are my opinions.
> 
> My being for or against this practice hasn't surfaced in this thread, and is irrelevant anyway.



I agree with BTA on this. Last time I was all "yay, cheaper!", but then my friends who bought the cheaper cards had heaps of failures, unlike mine, which is still working to this day.

Cheaper is fine if it doesn't affect reliability or performance, but those always seem to get sacrificed.



spearman914 said:


> Good news, but IMO I would rather spend the $15 for voltage control so you can OC the crap out of it.



Check the news page. EVGA is offering software voltage control with theirs, so they're definitely going to stick with the current PCB design.


----------



## EarlZ (Feb 8, 2009)

A $15 saving is waaay too little for all that reduction.


----------



## eidairaman1 (Feb 8, 2009)

Well, complaining about it here won't stop them, so it's set in stone now.


----------



## DaedalusHelios (Feb 8, 2009)

If these had been the specs at release there wouldn't be so much teary-eyed fear going on about it. I doubt it will make any difference to the end user. It's not like they released it with fewer shaders by accident.


----------



## eidairaman1 (Feb 8, 2009)

They're just trying to get greater yields and remove unused layers, is all. If you want ultimate performance, go with a 285 or a 4870.


----------



## DarkMatter (Feb 8, 2009)

EarlZ said:


> A $15 saving is waaay too little for all that reduction.



I think that people overestimate the price of a PCB.


----------



## EarlZ (Feb 8, 2009)

DarkMatter said:


> I think that people overestimate the price of a PCB.



The OP does say $15.


----------



## DarkMatter (Feb 8, 2009)

EarlZ said:


> The OP does say $15.



I meant that $15 is actually a lot. Keep in mind that a good chunk of the retail price goes to the retailer, and I mean 25% or more. Another good chunk goes to the vendor, but I can't estimate how much, because I never worked for one. You also have to subtract the price of the packaging and bundles... In the end, the manufacturing cost of the PCB can't exceed $50 by much. In this case we could be talking about a reduction from around $65 to $50, which is a lot.
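The back-of-the-envelope math above can be sketched like so; the $65 board cost is the post's own rough assumption, not a confirmed NVIDIA figure:

```python
# Back-of-the-envelope PCB saving, using the rough figures assumed in the
# post above (none of these are confirmed numbers).
old_pcb_cost = 65.0  # assumed manufacturing cost of the old PCB, in USD
savings = 15.0       # reported saving from the P897 redesign

new_pcb_cost = old_pcb_cost - savings
relative_cut = savings / old_pcb_cost

print(f"${old_pcb_cost:.0f} -> ${new_pcb_cost:.0f} ({relative_cut:.0%} of PCB cost)")
# -> $65 -> $50 (23% of PCB cost)
```

Seen that way, $15 off is close to a quarter of the board's assumed manufacturing cost, which is the point being made.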


----------



## Hayder_Master (Feb 8, 2009)

The GTX 260 is the best choice from NVIDIA, and if it becomes cheaper it will be the standard choice, like the 8800 GT/GTS before it.


----------



## Mussels (Feb 8, 2009)

hayder.master said:


> The GTX 260 is the best choice from NVIDIA, and if it becomes cheaper it will be the standard choice, like the 8800 GT/GTS before it.




http://store.steampowered.com/hwsurvey/

see video card description.

Almost 12% of Steam users (which is a hell of a lot of people) have 8800-series cards. It's the most popular card overall, and almost 23% of people running DX10 hardware are doing so on an 8800-series card.


----------



## DarkMatter (Feb 8, 2009)

Mussels said:


> http://store.steampowered.com/hwsurvey/
> 
> see video card description.
> 
> Almost 12% of Steam users (which is a hell of a lot of people) have 8800-series cards. It's the most popular card overall, and almost 23% of people running DX10 hardware are doing so on an 8800-series card.



Wow! Successful chip, this G92. If you add the 9800 results it's 15.6% and almost 30% respectively, which is very impressive indeed. One out of three DX10 cards is a G92.


----------



## iamverysmart (Feb 8, 2009)

btarunr said:


> ^Length.



I don't understand why the hell NVIDIA would reduce the height of the card. Unless they are after a low-profile card there's no reason to make it shorter; shrinking it 1.5 cm does nothing except probably make the card unbalanced.

I'm pretty sure Expreview just made a mistake in translation.


----------



## Mussels (Feb 8, 2009)

iamverysmart said:


> I don't understand why the hell NVIDIA would reduce the height of the card. Unless they are after a low-profile card there's no reason to make it shorter; shrinking it 1.5 cm does nothing except probably make the card unbalanced.
> 
> I'm pretty sure Expreview just made a mistake in translation.



Making it smaller in any dimension makes it cheaper to produce. That's all there is to it.


----------



## DaedalusHelios (Feb 8, 2009)

iamverysmart said:


> I don't understand why the hell NVIDIA would reduce the height of the card. Unless they are after a low-profile card there's no reason to make it shorter; shrinking it 1.5 cm does nothing except probably make the card unbalanced.
> 
> I'm pretty sure Expreview just made a mistake in translation.



Makes the card unbalanced??? Are you going to use it as a graphics card or a doorstop?


----------



## btarunr (Feb 8, 2009)

iamverysmart said:


> I don't understand why the hell NVIDIA would reduce the height of the card. Unless they are after a low-profile card there's no reason to make it shorter; shrinking it 1.5 cm does nothing except probably make the card unbalanced.
> 
> I'm pretty sure Expreview just made a mistake in translation.



Both the Expreview En and Cn websites state the figure at 1.5 cm (~0.6"), which is too much height to lose, going by what the pictures show. It can't be 1.5 mm (a typo) either, as that would be too little height to lose. They basically shed the extra height the first iteration of the 55 nm GTX 260 PCB had near the VRM area. So the figure probably isn't exactly 1.5 cm, but it's a significant amount nonetheless.


----------



## Mussels (Feb 8, 2009)

It could be 1.5".

That'd be around 3-4 cm.
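For what it's worth, the three candidate figures being debated convert like this (plain unit arithmetic; the labels are just the guesses made in-thread):

```python
# Normalise the three candidate height-reduction figures to centimetres
# (standard unit conversions, nothing from the article itself).
INCH_TO_CM = 2.54

candidates_cm = {
    "1.5 mm (typo theory)": 0.15,
    "1.5 cm (Expreview figure)": 1.5,
    "1.5 in (guess above)": 1.5 * INCH_TO_CM,
}

for label, cm in candidates_cm.items():
    print(f"{label}: {cm:.2f} cm")  # the 1.5 in case works out to 3.81 cm
```

So 1.5" would indeed be about 3.8 cm, at the top of the "around 3-4 cm" range.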


----------



## Valdez (Feb 8, 2009)

DarkMatter said:


> Wow! Successful chip, this G92. If you add the 9800 results it's 15.6% and almost 30% respectively, which is very impressive indeed. One out of three DX10 cards is a G92.



I don't think it's a good thing. The G92 is not a good chip for DX10.


----------



## DaedalusHelios (Feb 8, 2009)

Valdez said:


> I don't think it's a good thing. G92 is not a good chip for dx10.



LOL, the G92 is as good as or better than the 3870 you have. Unless you don't think that's good for DX10 either.


----------



## Laurijan (Feb 8, 2009)

I hope some manufacturers stay with the old design to allow customers better OC abilities... the 1-phase design probably makes OCing more difficult.


----------



## Mussels (Feb 8, 2009)

Valdez said:


> I don't think it's a good thing. G92 is not a good chip for dx10.



I certainly have no problems with my G92 card in DX10. It had a lot fewer issues at launch than the ATI 3xx0-series cards did with DX10 compatibility.

At least my card does AA properly.


----------



## Valdez (Feb 8, 2009)

DaedalusHelios said:


> LOL The G92 is as good or better than the 3870 you have. Unless you don't think its good for DX 10 either.



The R600 is better for DX10. But we will see when the first DX10 app comes out.


----------



## Valdez (Feb 8, 2009)

Mussels said:


> I certainly have no problems with my G92 card in DX10. It had a lot fewer issues at launch than the ATI 3xx0-series cards did with DX10 compatibility.
> 
> At least my card does AA properly.



There are no native DX10 games on the market as far as I know.


----------



## Laurijan (Feb 8, 2009)

Valdez said:


> There are no native DX10 games on the market as far as I know.



What about Crysis?


----------



## Valdez (Feb 8, 2009)

Laurijan said:


> What about Crysis?


----------



## DaedalusHelios (Feb 8, 2009)

You get extra effects with DirectX 10 enabled in most modern games now.

It's a good feature set to support.

Unless that's removed from the Hungarian versions for some reason.


----------



## btarunr (Feb 8, 2009)

Return to topic.


----------



## Nitro-Max (Feb 8, 2009)

This is not good. OK, they might be able to offer them to us cheaper, which is good, but who wants cheaply made hardware? I'd rather pay that bit more for quality, TBH.

A thinner PCB means more chance of flexing and damage.


----------



## Bjorn_Of_Iceland (Feb 8, 2009)

> You get extra effects with DirectX 10 enabled in most modern games now.


DX10 is a joke. It basically just wrapped the existing DX9 API and added a few non-groundbreaking things; nothing compared to the leap from DX8 to DX9. Clearly it was just a filler API built to lure gamers onto Vista. It has been out for about 3 years already and performance is still a joke; DX9 had been out for only a few months before we saw a leap.


In any case, Galaxy uses this new design on the first-gen GTX 260... it really cuts the price down.


----------



## CrAsHnBuRnXp (Feb 9, 2009)

iamverysmart said:


> Why not just call it length.



I'm not even going to say it...


----------



## Mussels (Feb 9, 2009)

Let's end the DX9-vs-DX10 discussion and keep it about these cards... has anyone seen retail prices for these yet?


----------



## DarkMatter (Feb 9, 2009)

Valdez said:


> I don't think it's a good thing. G92 is not a good chip for dx10.



I would like to see proof of that.
If there are no DX10 apps, how do you know whether it's good or not?



Valdez said:


> r600 is better for dx10. But we will see when the first dx10 app comes out.



3DMark Vantage IS DX10; Unigine, Far Cry 2, Crysis Warhead (let's say Crysis was not, but this one IS) and many other games and apps are DX10. The G92 does just fine in those apps, much better than the R600/RV670 in most cases, so nice try.


----------



## Kursah (Feb 9, 2009)

Valdez said:


> There are no native DX10 games on the market as far as I know.



There aren't enough DX10 users out there to justify the risk of a DX10-only game at this point in time; it would be a bad move for profits.

But with cards as powerful as the 4870s and GTX 260s dropping in price, these could very well end up being next-gen mid-range cards that make DX10+ performance better in games and maybe start to justify the cost of DX10-only games. Sure, I would like to see it, but I also like the fact that you can fall back to DX9 if you don't have the hardware or the performance capabilities for 10. And if the AMD/ATI and NV cards I mentioned earlier were to become a mid-range GTS 350 and HD 5430 or whatever, at an easier price point, the bang for the buck would be there. Pretty much a pipe dream at this point, but the prices these cards are at now for this generation are pretty sweet. I'm curious to see how this card does; I have a feeling it will do quite well overall. There will be some who don't care that it has fewer power phases or can't change voltage via software, when it flat out runs like a champ. The 260 is no slouch at stock.


----------



## spearman914 (Feb 9, 2009)

STALKER Clear Sky is the best DX10 game i've seen so far.


----------



## DarkMatter (Feb 9, 2009)

spearman914 said:


> STALKER Clear Sky is the best DX10 game i've seen so far.



I always forget about Stalker CS.

But I'm not so sure about it being the best at anything, though. It uses almost the same features as Crysis with very few variations, and when everything is enabled it runs worse than Crysis at Very High. It even uses a lot of memory, more than under DX9, and memory utilisation is something DX10 was supposed to improve.

Great game, not so good an engine IMO.


----------



## iamverysmart (Feb 9, 2009)

I don't think there is any such thing as "ATI is better for DX10". DX10 will work just like DX9; it depends on the actual game engine and driver support.


----------



## eidairaman1 (Feb 9, 2009)

I guess you didn't read the previous posts: a moderator intervened and said to get back on topic. This thread is not about ATI or DirectX, it is about the GeForce GTX 260 PCB, so drop the DX and ATI stuff.


----------

