# AMD Starts Shipping ATI Radeon HD 4850 Video Cards



## malware (May 31, 2008)

The ATI Radeon HD 4850 cards are reportedly already shipping to OEM partners and retailers. According to TG Daily, everything is going as planned and AMD/ATI is aiming for a sizable launch of the new product generation, with Radeon HD 4850 512MB boards leading the charge. The final prices for all Radeon HD 4 series cards will be officially announced during the Computex 2008 tradeshow, which opens its doors on Monday. Higher-end Radeon HD 4870 cards with 512MB of onboard GDDR5 memory are expected to ship in volume sometime this summer, with the flagship Radeon HD 4870 X2 to follow soon after.

*View at TechPowerUp Main Site*


----------



## jbunch07 (May 31, 2008)

yes!
good news!


----------



## PVTCaboose1337 (May 31, 2008)

I cannot wait to see real world benchies.


----------



## selway89 (May 31, 2008)

jbunch07, your avatar says it all: 'get in' lol
But yes, very good news; eagerly awaiting benchmarks


----------



## Megasty (May 31, 2008)

...& you can put it on the board....YES!


----------



## mandelore (May 31, 2008)

sweet, can't bleedin' wait for the X2 benchies, as long as the drivers are up to scratch


----------



## FilipM (May 31, 2008)

Come on then, I've been waiting since the stone age now...


----------



## WarEagleAU (May 31, 2008)

Megasty, nice Hawk reference from the White Sox (though I'm a Cubbies fan myself).

This was posted earlier today by Crackerjack, well, something like this. But good news nonetheless. Can't wait to see the prices on the 4870. Hopefully it won't be too much.


----------



## jbunch07 (May 31, 2008)

I'm thinking my 3870 will be for sale within the next month or so : )


----------



## Megasty (May 31, 2008)

WarEagleAU said:


> Megasty, nice Hawk reference from the White Sox (though Im a cubbies fan myself).
> 
> This was posted earlier today by Crackerjack, well something like this. But good news nonetheless. Cant wait to see the prices on the 4870. Hopefully, it wont be too much.



 I think I'm in the minority here but I cheer on both teams & when they play each other I flip a coin 

lol @ off-topic-ness


----------



## newtekie1 (May 31, 2008)

I can't wait for some performance and pricing numbers.  I'm hopeful that we will actually see something worth buying from ATi.


----------



## springs113 (May 31, 2008)

newtekie1 said:


> I can't wait for some performance and pricing numbers.  I'm hopeful that we will actually see something worth buying from ATi.



don't know if I should wait and get the 4870, or get another 3870 from BB for $130 minus 10% and a $5 coupon...


----------



## dataoverride (May 31, 2008)

Suddenly I think I should've waited to upgrade mine. Oh well 

*sigh*


----------



## eidairaman1 (May 31, 2008)

newtekie1 said:


> I can't wait for some performance and pricing numbers.  I'm hopeful that we will actually see something worth buying from ATi.



You have me confused. A while back you were talking crap about ATI and how great Nvidia was; are you neutral or what?


----------



## eidairaman1 (May 31, 2008)

dataoverride said:


> suddenly i think i shuld've waited for the upgrade on mine. Oh well
> 
> *sigh*


What did you upgrade to?

You need to add stuff to your sys specs signature.

If you're that ashamed of your system, you shouldn't be; mine is retro.


----------



## thoughtdisorder (May 31, 2008)

Benchies, can't wait!


----------



## imperialreign (Jun 1, 2008)

eidairaman1 said:


> What you upgrade to?
> 
> you need to add stuff to your sys specs signature.
> 
> If your so ashamed of your system. *You shouldnt be as mine is Retro.*





retro - funny how when we apply that in the tech market we're only talking a couple of years old


----------



## J-Man (Jun 1, 2008)

I think I'm gonna order another 3870 X2 in a few weeks and have those in crossfire for a few months before I upgrade to the 4870 X2.


----------



## 15th Warlock (Jun 1, 2008)

Nice, this summer will bring the long-overdue showdown between nVidia and Ati. Best of all, we consumers can only win as they trade punches and bring the prices of their products down


----------



## newtekie1 (Jun 1, 2008)

eidairaman1 said:


> You have me Confused While back you were talkin crap about ati and how great Nvidia was, are you neutral or what?



Yes, I am neutral.  Most fanboys seem to think that if you say anything bad about a company you are "talkin crap" or "bashing" their beloved company.  Pointing out negative points about a company and its products doesn't mean you don't like them or that you favor another company.  The fact of the matter at the time WAS that nVidia was the better buy; nVidia had ATi beat in both price and performance.  Pointing out that fact doesn't mean I was saying nVidia was great and ATi was crap, it was just the facts at the time (and still is, but let's see what these new cards bring).


----------



## pentastar111 (Jun 1, 2008)

Glad I've waited on my new build...


----------



## suraswami (Jun 1, 2008)

Woo Hoo! 

But the bastards are forcing me to build my gaming machine and pulling me out of the AGP world.

Phenom 9850 + Crossfire board this summer and  from wife.


----------



## Para_Franck (Jun 1, 2008)

Do you guys think that these will CFX with the current 38xx series?


----------



## Psychoholic (Jun 1, 2008)

I'd have to get one of these if it weren't for the BFG Trade-Up program on my 9800GTX; hoping for a 260, don't want to pay the difference for a 280.


----------



## Haytch (Jun 1, 2008)

Excellent news, everything seems to be going according to schedule, if not better.
I don't see why these wouldn't work in Hybrid mode with the 38** series, provided you have the new board.

Hopefully prices will be under $500 AU


----------



## entilza (Jun 1, 2008)

I'd like to see the CrossFire compatibility chart for this new 4800 series...

I hope my 3870 can stay, as I just love it


----------



## Apocolypse007 (Jun 1, 2008)

This card looks very promising. I expect nVidia's competitor card to be more power hungry and inefficient (still GDDR3 on nVidia's flagship... really?).

I'm hoping this will help AMD regain some of its popularity with its gamer fans.

The HD3000 series was nice. My 3870 is serving me well; upgraded to it from my X1800XT and haven't regretted it. It would be nice if you could Xfire this with the 3000 series, but my board only has 1 PCI-e slot :shadedshu .


----------



## robodude666 (Jun 1, 2008)

entilza said:


> I'd like to see the Crossfire Compatibilty chart with this new 4800 series...
> 
> I hope my 3870 can remain as I just love it



Same here. I'd also hope that the Vista atikmdag.sys issues get fixed with the HD 4800 driver release. Wouldn't mind getting an HD 48x0 instead of an 8800GT in late June and CrossFiring it with my 3870.


----------



## jonmcc33 (Jun 1, 2008)

My X1900XT is still going strong. I'll use it until there's a game worth playing all the time for me to upgrade.


----------



## Wile E (Jun 1, 2008)

Apocolypse007 said:


> this card looks very promising. I expect nvidia's competitor card to be more power hungry and inefficient (still ddr3 on Nvidia's flagship... Really?).


The fact that nVidia is using GDDR3 doesn't matter, for two reasons. First, it's rumored their top-end cards will use a 512-bit bus, greatly increasing bandwidth; and second, nothing is really able to use all the available bandwidth anyway.


----------



## jonmcc33 (Jun 1, 2008)

Wile E said:


> The fact that nVidia is using DDR3 doesn't matter for two reasons. First, it rumored their top end cards will be using a 512bit bus, greatly increasing bandwidth, and second, because nothing is really able to use all the bandwidth available anyway.



One reason they're still using it is that ATi helped develop GDDR5. When you're left in the dust on the development of something like that, what do you expect? 

It has already been proven that the memory bus in particular isn't the greatest factor. ATi already tried a 512-bit memory bus and it didn't last. nVIDIA had 320-bit and 384-bit, and notice that the G92 core with 256-bit performs better in AA/AF benchmarks? 

When your memory can run at a clock speed of 1.125GHz, that's massive bandwidth even with a 256-bit bus. Not to mention the additional benefits and performance improvements of the technology.


----------



## Wile E (Jun 1, 2008)

jonmcc33 said:


> One reason they are using it still is because ATi helped develop GDDR5. So when you are left in the dust about development of something like that what do you expect?
> 
> It has already been proven that the memory bus in particular isn't the greatest factor. ATi already tried the 512-bit memory bus and it didn't last well. nVIDIA had 320-bit and 384-bit and notice that the G92 core with 256-bit is better performing in AA/AF benchmarks?
> 
> When you can have a clock speed of 1.125GHz on your memory it's massive bandwidth even with a 256-bit bus. Not to mention the additional benefits and performance improvements of the technology.


I know what the benefits of the bus are. My point is, the wider bus requires less memory clock speed to achieve the same bandwidth, making GDDR5 unnecessary for nVidia in terms of overall throughput.

And if you hadn't noticed, nothing above GDDR3 has shown significant gains on ATI cards, even when all else is equal. So the bandwidth of either GDDR5 on a 256-bit bus or GDDR3 on a 512-bit bus is essentially overkill.
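The bandwidth figures being traded back and forth here are easy to sanity-check. A quick sketch in Python; note the clock numbers are illustrative round figures based on the 1.125GHz mentioned above, not confirmed specs for either card:

```python
# Peak memory bandwidth in GB/s = effective transfer rate (GT/s) x bus width in bytes.
def bandwidth_gbps(effective_gtps: float, bus_bits: int) -> float:
    return effective_gtps * bus_bits / 8

# GDDR5 moves 4 bits per pin per clock: a 1.125 GHz base clock gives 4.5 GT/s.
gddr5_256bit = bandwidth_gbps(4.5, 256)    # 144.0 GB/s on a 256-bit bus
# GDDR3 is double-pumped: the same 1.125 GHz clock gives only 2.25 GT/s,
# so it needs a 512-bit bus to land on the same number.
gddr3_512bit = bandwidth_gbps(2.25, 512)   # 144.0 GB/s

print(gddr5_256bit, gddr3_512bit)
```

Which is the point both sides are circling: the narrow-bus/fast-memory route and the wide-bus/slow-memory route can land on the same raw bandwidth, and the real difference ends up being board complexity and memory price.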


----------



## Silverel (Jun 1, 2008)

GDDR5 is cheaper, and can use a smaller bus for the same bandwidth. The architecture allows for uneven-length trace lines (i.e. shorter and less expensive), fewer PCB layers than a larger bus would need, and it will run cooler as well. It's not just about the clock speed; it's the overall cost of the board that will make the big difference. Not to mention, all it has to do is meet the same speeds to provide the same bandwidth. If it's not going to make a difference performance-wise due to some architecture limitation within ATI cards, at least it will be cheaper... and price/performance will drop accordingly. Or at least their profits will rise.


----------



## tkpenalty (Jun 1, 2008)

Something tells me from Nvidia not using GDDR5/4, that they are holding back...


----------



## Blacklash (Jun 1, 2008)

Good news. If they rock I'll replace my two HD 3850s @ 760|2038 with them.


----------



## DarkMatter (Jun 1, 2008)

Silverel said:


> *GDDR5 will/could be cheaper in 3-6 months or so*, and can use a smaller bus with the same bandwidth. The architecture used allows for uneven length trace lines (i.e shorter and less expensive), less PCB layers to accomodate the larger bus, and will run cooler as well. It's not just about the clock speed, but the overall cost of the board that will make the big difference. Not to mention, all it has to do is meet the same speeds to provide the same bandwidth. If it's not going to make a difference performace-wise due to some architecture limitation within ATI cards, at least it will be cheaper... and price/performance will drop accordingly. Or at least their profits will rise.



Corrected that for you. Right now GDDR5 is way more expensive.


----------



## DarkMatter (Jun 1, 2008)

newtekie1 said:


> Yes, I am neutral.  Most fanboys seem to think that if you say anything bad about a company you are "talkin crap" or "bashing" their beloved company.  Pointing out negative points about a company and it's products doesn't mean you don't like them or you favor another company.  The fact of the matter at the time WAS that nVidia was a better buy, nVidia had ATi beat in both price and performance.  Pointing out that fact doesn't mean I was saying nVidia was great and ATi was crap, it was just the facts at the time(and still is the fact, but lets see what these new cards bring).



Been there. People just don't want to hear the truth when it hurts. That sentiment is extremely intensified in the case of fanbois.


----------



## Esse (Jun 1, 2008)

DarkMatter said:


> Corrected that for you. Right now GDDR5 is way more expensive.



Well, someone needs to tread in the deep end, otherwise no one will ever get in the pool


----------



## candle_86 (Jun 1, 2008)

imperialreign said:


> retro - funny how when we apply that in the tech market we're only talking a couple of years old



That XP of his isn't a couple of years old, nor is his board. NF2 and Athlon XP are 2002 tech, remember. Heck, the 754 A64s are nearly 6 years old now. Processor advancement and performance are slowing down, at least to me.


----------



## yogurt_21 (Jun 1, 2008)

DarkMatter said:


> Corrected that for you. Right now GDDR5 is way more expensive.



I don't think you're paying attention, mate. GDDR5 is cheaper to implement because it uses a less complicated process, and right now the chips are only 10% more expensive than a lower-speed GDDR3 counterpart. The 512-bit bus on the GT200 has made the chip huge and expensive to produce. There's more to the price of memory than chip cost. Check out the GDDR5 thread for more info. GDDR5 is MUCH cheaper to implement, so no, it is not at all more expensive.


----------



## jonmcc33 (Jun 1, 2008)

Wile E said:


> I know what the benefits of the bus are. My point is, the wider bus requires less memory clock speed to achieve the same bandwidth, making GDDR5 unnecessary for nVidia in terms of overall throughput.
> 
> And if you never noticed, nothing above GDDR3 has shown significant gains on ATI cards, even when all else is equal. So the bandwidth of both GDDR5 on a 256b bus, or GDDR3 on a 512b bus is essentially overkill.



No, it is not overkill when you have DX10.1, which requires 4xAA. When you play with AA/AF cranked up, bandwidth becomes very important and is the bottleneck. 

GDDR5 isn't unnecessary; it's unattainable by nVIDIA because they aren't involved in its development. Come to think of it, didn't ATi help develop GDDR3 as well? The same memory technology nVIDIA is using on its flagship cards? 

http://en.wikipedia.org/wiki/GDDR3

Yes it is! ATi and JEDEC developed it! I guess some companies are still followers in the market, eh?


----------



## snuif09 (Jun 1, 2008)

I'm only getting DX10 cards when it becomes a standard. ATM there aren't enough DX10 games that are much better than DX9.


----------



## Disparia (Jun 1, 2008)

Cool. Sign me up for two of them! (as there's a lack of chainable DisplayPort monitors and video cards).


----------



## newtekie1 (Jun 1, 2008)

DarkMatter said:


> Been there. People just don't want to hear the truth when it hurts. That sentiment is extremely intensified in the case of fanbois.



Exactly, and if people would look farther back than just a week or two they would see me complaining about nVidia's utterly shit driver support for the Pre-8000 series cards when the 8800's came out.


----------



## imperialreign (Jun 1, 2008)

candle_86 said:


> that XP of his isnt a couple of years old nor his his board. NF2 and AThlonXP are 2002 tech remember. Heck the 754 A64's are near 6 years old now. *Processors are slowing down in advancement and preformace at least to me.*



that's because AMD is behind and Intel is sandbagging. The CPU market has become stale because of it. Until AMD can re-enter the ring with the thousand-hand slap, we won't see anything remotely brilliant come out of Intel . . .


. . . come to think of it, we haven't really seen anything brilliant or noteworthy come out of Intel since they slapped two P4s together on the same die. IIRC, even the looming-sometime-this-decade Nehalem is still milking the Core 2 architecture.





			
jonmcc33 said:

> No it is not overkill when you have DX10.1 which requires 4XAA. When you play with the AA/AF cranked then bandwidth becomes very important and is the bottleneck.
> 
> GDDR5 isn't unnecessary, it's unattainable by nVIDIA because they aren't involved in the development of it. Come to think of it, didn't ATi help develop GDDR3 as well? The same memory technology that nVIDIA is using on it's flagship cards?
> 
> ...




sad, too: ATI jumps on new technology as soon as they can, and nVidia lags behind. They remind me of Intel in many respects; as long as their current tech works and dominates the market, why change it? If they get ahead of the competition, they sandbag their current tech and milk it for as long as they can.  

ATI was the first to offer GDDR3, GDDR4, GDDR5, DVI, HDMI, PCI-E 2.0, DX10.1 (they also would've had the first DX10 release if the merger with AMD hadn't held up the HD2000 series as long as it did), etc., etc. People will purchase their products simply for the new tech, but it doesn't make up much for their lagging performance.


but, y'know, IMO, if AMD designed GPU cores as massive as nVidia's, we'd have seen a different story the last few series.


----------



## jonmcc33 (Jun 1, 2008)

candle_86 said:


> that XP of his isnt a couple of years old nor his his board. NF2 and AThlonXP are 2002 tech remember. Heck the 754 A64's are near 6 years old now. Processors are slowing down in advancement and preformace at least to me.



You are aware that the architectural improvements of the Penryn core over Conroe actually improved performance at the same clock speed, right? 

Quad-core processors are out now, and soon we'll see 8-core processors, etc. It's still advancing, and heavily. The real issue is applications making use of SMP so they can gain from these multicore processors.


----------



## DarkMatter (Jun 1, 2008)

yogurt_21 said:


> don't think you're paying attention mate. gddr5 is cheaper to implement because it uses a less complicated process. and right now the chips are on 10% more expensive than a lower speed gdd3 counterpart. the 512bit bus on the gt200 has made the chip huge and expensive to produce. there's more than chip cost involved in the price of memory. check out the gddr5 thread for more info. gddr5 is MUCH cheaper to implement. so no it is not at all more expensive.



1- GDDR5 uses a *smaller* process. Smaller != simpler != cheaper. For comparison, the 55nm process in the HD3000 series was not a lot cheaper (if at all, at launch time) than 65nm at the time. We can say the same about the first 45nm Intels. Check your facts.

2- Right now slow GDDR5 is 10-20% more expensive than *higher speed* GDDR3 (0.8 ns). Slower GDDR3, such as 1ns memory, is way cheaper. The GTX 260, the only possible "direct" competitor to Ati, uses the cheaper one and has more or less the same bandwidth as the 770XT. The GTX 280 has significantly higher bandwidth, so we are not comparing apples to apples there. We could easily say that we are comparing extremely fast GDDR3 vs. slow GDDR5, and the actual result is that Nvidia has the higher bandwidth, so Ati's solution being cheaper makes sense; it's what we should expect from the slower part. 512-bit + GDDR3 here is giving more bandwidth; it's overkill, unnecessary IMO, but faster still.

3- I didn't talk about the implementation, but the price of the memory. If you read his post you would notice he *first* says GDDR5 is cheaper and *then* starts talking about the implementation, like this: cheaper GDDR5 + cheaper controller, PCB, etc = WIN WIN. That's what I understand from his statement. And that's not true NOW; it will be in 3-6 months, probably. And that's what I stated. And it's in this same moment, having to justify my statements, that newtekie's post #20 makes even more sense to me...



Esse said:


> Well someone needs to tread in the deep end other wise no one will ever get in the pool



Indeed! Who said the opposite? I only made clear that GDDR5 is not cheaper NOW. I didn't say it was a bad decision or anything. I just say that price-wise, NOW, it may not be better, not by much at least. In reality we don't know if it's cheaper at all: 

1- Memory is more expensive and in short supply NOW*. So much so that Ati halved the frame buffer to 512 MB in the 770XT and uses GDDR3 in the Pro. Never forget this, as it pictures the truth in a "check the reality" fashion.

2- We have no real proof that making a 256-bit GDDR5 controller is cheaper than a 512/448-bit GDDR3 controller RIGHT NOW* while getting the same performance. Remember DDR vs DDR2, DDR2 vs DDR3, GDDR3 vs GDDR4...

3- The PCB is a lot cheaper, but IMO it's the only part with an economic benefit RIGHT NOW*, and it doesn't account for all that much of the retail price.

*Sorry I used it so much, but I have to make clear that I am talking about today. Otherwise some people will think I'm saying something that I am not. I have no doubt it will get better with time. That's why I said 3-6 months.


----------



## DarkMatter (Jun 2, 2008)

jonmcc33 said:


> GDDR5 isn't unnecessary, it's unattainable by nVIDIA because they aren't involved in the development of it. Come to think of it, didn't ATi help develop GDDR3 as well? The same memory technology that nVIDIA is using on it's flagship cards?
> 
> http://en.wikipedia.org/wiki/GDDR3
> 
> Yes it is! ATi and JEDEC developed it! I guess some companies are still followers on the market, eh?



I've been searching for info about this for some time, as I don't have the Wiki on my list of most reliable sources, and it seems the info is right. I never pay as much attention to who develops what as I do to the specs and benchmarks of the thing. But I may do so in the future.

That's the saddest thing I have heard in a long time. It's extremely unfair and bad for free competition. :shadedshu
I had no clue Ati was involved in the development of GDDR. I thought it was JEDEC who does these things, in conjunction with the memory developers in any case (when talking about memory, of course), and not just ONE of the consumers. No wonder Nvidia is not using it! Ati probably has tons of patents it doesn't want to share for cheap, if they want to share them at all. 

Not to mention that this way Ati is kind of imposing the use of the memory the way they want it to be, which may not be the better way, who knows? In any case it surely benefits Ati. It's not like other companies can't develop their own standard, or that they can't make it better; it's surely more based on the relationship with JEDEC. It's like TWIMTBP in hardware development, with the exception that there's no patent involvement in TWIMTBP, and in hardware it's SURE there is. :shadedshu


----------



## eidairaman1 (Jun 2, 2008)

Technically, the PCB has a part to play: usually the more area a single PCB uses, the higher the price goes, but if the boards were smaller they could get greater yields out of them, and the price would go down. Just too bad they can't reuse the wasted PCB platter to make a new PCB platter.


DarkMatter said:


> 1- GDDR5 uses a *smaller* process. smaller != simpler != cheaper. For comparison 55nm process in the HD3000 series was not a lot cheaper (if ay all at launch time) than 65 nm in the time. We can say the same about firts 45 nm Intels. Check your facts.
> 
> 2- Right now slow GDDR5 is 10-20% more expensive than *higher speed* GDDR3 (0.8 ns). Slower GDDR3 such as 1ns memory is way cheaper. GTX260, the only possible "direct" competitor to Ati, uses the cheaper one and has  more or less the same bandwidth as the 770XT. GTX 280 has a significant higher bandwidth, so we are not comparing apples to apples there. We could easily say that we are comparing extreme high GDDR3 vs. slow GDDR5 and the actual result is Nvidia has the higher bandwidth, so Ati solution being cheaper makes sense, it's what we should expect from the slower part. 512 bit + GDDR3 here is giving more bandwidth, it's overkill, unnecerary IMO, but faster still.
> 
> ...


----------



## Assimilator (Jun 2, 2008)

imperialreign said:

> sad, too, ATI jumps on new technology as soon as they can, and nVidia lags behind - they remind me of Intel in many respects, as long as their current tech works and dominates the market, why change it? If they get ahead of the competition, they sandbag their current tech and milk it for as long as they can.



If there's no pressure to innovate, companies don't innovate. That's why AMD/ATI need to get off their asses and provide some decent competition.

And implementing technology for technology's sake is hardly the right way to make money. Did GDDR4 prevent the 2900 XT from being a POS? How is DirectX 10.1 useful if not ONE game in existence uses it? In this industry, performance is king - ATI can provide all the features they want, but if they don't provide the horsepower to go with them, no-one will buy their cards and hence no-one will use those features.



			
imperialreign said:

> but, y'know, IMO, if AMD designed GPU cores as massive as nVidia has, we'd have seen a different story the last few series.



By that logic, all ATI has to do to regain the performance crown is release a GPU with 2 billion transistors.

As for the GDDR3 vs GDDR5 debate, it's been done to death already. The fact of the matter is that nVidia would be suicidal to go with GDDR5 because there's so little of it available, it's hideously expensive, and (almost certainly) because ATI has already purchased most of the stock that's available.


----------



## DarkMatter (Jun 2, 2008)

eidairaman1 said:


> Technically, PCB has a part to play, usually the more Area that is used by a Single PCB, such as larger the Price does go up but if the boards were smaller they could get greater yields out of them, thus price would go down, Just too bad they cant reuse the Wasted PCB Platter to make a new PCB Platter.



What I was saying is that we don't know if the simpler PCB offsets the other differences in price. In fact, I think 1 GB of GDDR5 must be expensive enough compared to 1 GB of GDDR3 that the difference is bigger than the one between 256-bit and 448-bit boards, or close to it, if Ati decided to launch the cards with "only" 512 MB. GTX cards are going to be more expensive at retail (~$100 more, GTX 260 vs. 770XT), but they will pack more memory, which will account for ~$30-50 in production costs. Could that difference in production costs account for a great part of the final retail price difference? Just thinking...


----------



## eidairaman1 (Jun 2, 2008)

DarkMatter said:


> What I was saying is that we don't know if the simpler PCB offsets the other differences in price. In fact I think that 1 GB of GDDR5 must be expensive enough compared to 1 GB GDDR3 that this difference is bigger than the one existing between 256 and 448 bit boards or close to it if Ati decided to launch the cards with "only" 512 MB. GTX cards are going to be more expensive in retail (~$100 GTX260 vs, 770XT), but they will pack more memory that will account for ~$30-50 in production costs. Could that difference in production costs account for a great part of the final retail price difference? Just thinking...



IMO I'd say 512 is the sweet spot.


----------



## DarkMatter (Jun 2, 2008)

eidairaman1 said:


> IMO id say 512 is the sweet Spot.



Debatable, even though I agree with that sweet spot. But often, sweet spot = mainstream, and we are talking about high-end cards, aren't we? The 1 GB 8800s do have some benefits at higher settings, and more with high overclocks = performance. A card 2x faster will easily need the extra memory.  
Anyway, the difference in price is there, and that was my point. Maybe I'm remembering this wrong, but I recall Palit_Guy saying their profit was smaller on 1 GB cards than on 512 MB ones. This means the production cost difference was bigger than the one in retail prices. I mean, the difference in retail price (without the aforementioned profit loss) of the extra 400 MB in the GTX 260 would probably be around $50. A 448 MB GTX 260 could easily sell for $60-80 less, just like the GTS 320, and still be faster than the competition.

EDIT: Also, correct me if I'm wrong, but Wizzard said 1ns 512 Mbit chips were about $4.50 from manufacturers in 1,000 (10,000?) unit quantities. $4.50 x 8 = $36. And 0.8ns chips are a lot more expensive AFAIK.
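That chip math generalizes to any frame buffer size. A rough sketch using the ~$4.50-per-512-Mbit figure quoted above; real contract pricing varies, and the 896 MB buffer used below as a GTX 260-style example is my assumption:

```python
# Frame-buffer bill-of-materials sketch: how many 512 Mbit chips a given
# buffer needs, and roughly what they cost at ~$4.50 per chip.
CHIP_MBIT = 512
PRICE_PER_CHIP = 4.50

def frame_buffer_cost(buffer_mb: int) -> tuple[int, float]:
    chips = buffer_mb * 8 // CHIP_MBIT      # MB -> Mbit, then chips required
    return chips, chips * PRICE_PER_CHIP

print(frame_buffer_cost(512))   # (8, 36.0)  -- matches the $4.50 x 8 = $36 above
print(frame_buffer_cost(896))   # (14, 63.0) -- an assumed 896 MB GTX 260-style buffer
```

So per these (assumed) prices, the larger buffer alone adds roughly $27 in memory cost before the wider bus and bigger PCB are even counted.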

EDIT2: On another note, I've just seen this on Fudzilla (yeah, whatever, I trust them about as much as many other sources, kinda):

http://www.fudzilla.com/index.php?option=com_content&task=view&id=7625&Itemid=1

I have mixed opinions about this. I do think AA is done better outside the shaders, but we'll have to wait and see, because FSAA done in 16 ROPs could be a limiting factor for such a card. I guess they have significantly improved them, but I don't know how it will work in the end. I'm optimistic in that I never think they will fail, but I am worried in the sense that the decision seems based more on developers' desires than on Ati's. That matters in a timely sense: was the chip designed to do it from the start, or is it a last-minute decision?


----------



## pentastar111 (Jun 2, 2008)

Personally I don't give a cr^p if the memory is GDDR3 or GDDR5 or GDDR10...I want a card that  kicks butt...is stable and reliable...and doesn't require the selling of a kidney or my firstborn...


----------



## jonmcc33 (Jun 2, 2008)

DarkMatter said:


> I've been searching info about this for some time, as I don't have the Wiki in my highest reliable sources list, and it seems the info is right. I never pay as much attention to who develops what as I do to the specs and benchmarks of the thing. But I may do it in the future.
> 
> That's the sadest thing I have heard in a long time. It's extrememly unfair and bad for free competition. :shadedshu
> I had no clue Ati was involved in the development of GDDR. I thought it was the JEDEC who does this things in conjunction with memory developers in any case (when talking about memory of course). And not only ONE of the consumers. No wonder why Nvidia is not using it! Ati has probably tons of patents that doesn't want to share for cheap! If they want to share them at all.
> ...



Well, AMD developed x86-64 technology, did they not? Intel was left in the dust with that too, originally. 

But it's good in the long run. These companies keep pushing themselves to advance technology, and it's the consumer that has benefited. 

The reason nVIDIA needs to use a more expensive 512-bit memory bus is that they don't have the capability to advance GDDR memory the way ATi can. Still using GDDR3 because they don't know how to develop anything better.


----------



## eidairaman1 (Jun 2, 2008)

pentastar111 said:


> Personally I don't give a cr^p if the memory is GDDR3 or GDDR5 or GDDR10...I want a card that  kicks butt...is stable and reliable...and doesn't require the selling of a kidney or my firstborn...



Well, don't worry about it then; your cards will obviously last longer than the G80 line will.


----------



## farlex85 (Jun 2, 2008)

I thought they were going to launch a 1GB 4870 at the outset. I mean, I'm sure these cards will be great, nice successors to the 3xxx series, but it seems as though they're not even trying to compete with nVidia at the high end. I was hoping they would........


----------



## DarkMatter (Jun 2, 2008)

jonmcc33 said:


> Well, AMD developed x86-64 technology did they not? Intel was left in the dust with that too originally.
> 
> But it's good in the long run. These companies keep pushing themselves to advance technology and it's the consumer that has benefited.
> 
> The reason nVIDIA needs to use a more expensive 512-bit memory bus is because they don't have the capability to advance GDDR memory the way ATi can. Still using GDDR3 because they don't know how to develop anything better.



The difference is that JEDEC is suposed to be an "open" group to set standards in the industry. If they make one company's development to be an standard that's not fair. It's not a matter of "I can or I can not". It's a matter of "what's the price I have to pay?" I'm 99'9% sure *any chipmaker* can implement anything like that. Maybe not as well, but they can...

EDIT2: I had my brother read this and he said I didn't make my point clear above. Here's further explanation. It's not that I don't like a company owning the rights to something they have developed. It's the fact that supposedly Ati has set standards for the industry for something they don't manufacture: memory chips. They are the customers, not the developers, and IMO shouldn't be the ones setting standards, as those may not fit other customers' needs. Given that it's always Ati who sets the GDDR standards, and knowing how the world works (it's 99% politics), my view is that JEDEC approves Ati's designs because they are "used" to them. That's not good for the industry nor the customer IMHO.

EDIT: Just to show how they can. I studied Telecommunications Engineering. I left in the second course because it was more about programming software than what I liked and first thought it would be. But in the meantime, and without a lot of study in that area (I mean, I left in the second course), I taught myself how to build an SDRAM memory controller with the help of Altera programmable FPGAs, for a project in which I needed more memory than I could fit in the FPGA they gave us. Was it fast? No mama, but it worked, and it was just me. Come on...


Plus AMD developed x86-64 because they have a joint development agreement with Intel. Any improvement made by one party can be "copied" without a patent infringement (more or less, that's it).

This is not the case with Ati - Nvidia AFAIK. Plus JEDEC dictates what the manufacturers are going to do. If JEDEC says GDDR5, it's GDDR5. Let's say Nvidia comes up with a new memory design of their own; it first has to be approved by JEDEC. If the new design is similar to GDDR, it won't get approved. The point is that GDDR5 is the standard, and if Nvidia wants to use it, they probably have to pay Ati, because Ati probably holds the patents on it. Any attempt to make something similar will run into patent infringement.

Following the example of x86: do you really think that nobody in the world can make their own x86 processor? Nvidia, for example, to stick with the same players. Of course they can, but patents prevent them from doing it. You would be surprised how patents can prevent many companies from doing lots of things in this industry. The computer industry is not that much about innovation really; it's more of the sort "for double the performance, double the lines or the clocks", etc. There are some things you can innovate, but often they are easy to think of and obvious, yet not worth it at the time, so you don't implement them or even care about them. So you forget about them; then a month later you think again and say, "Hey, next year this could be handy." You go to the patent office just to find out you are 3 days late and someone else has "taken the lead"... That's how these things work. The worst thing, in addition to this, is that there are plenty of "companies" out there whose only job is finding those "holes" and patenting them, even though they won't ever develop them into reality and never intended to. It sucks.


----------



## jonmcc33 (Jun 2, 2008)

farlex85 said:


> I thought they were going to launch a 1gb 4870 at the outset.  I mean, I'm sure these cards will be great, nice successors to the 3xxx series, but it seems as though they're not even trying to compete w/ nvidia in the high end. I was hoping they would........



Not sure what you mean by this. Did you completely miss the 3870 X2 or something? The price factor makes it an even bigger steal.


----------



## farlex85 (Jun 2, 2008)

jonmcc33 said:


> Not sure what you mean by this. Did you completely miss the 3870 X2 or something? The price factor makes it an even bigger steal.



Nah, I mean I was under the impression the initial card was to be a 4870 1gb single gpu that would at least hopefully compete w/ the offerings from nvidia (1gb and 896mb). It seems, though, that they are instead launching in the mid-range, giving nvidia free rein over the high-end for some period of time. I'm not saying these cards won't be an excellent value, I'm saying I was hoping for competition on all levels at the get-go for this round (for better pricing for us). Although, Nvidia doesn't seem to be launching in the middle-ground where most cards are bought, so maybe it's just a marketing thing. I just wanna see some fighting......

That 3870x2 came long after nvidia was at the top of the hill, and it was only one offering, and it was a dual-gpu. I was just hoping for more competition, one card (even though it is a great card) too far down the road doesn't do it.


----------



## jonmcc33 (Jun 2, 2008)

farlex85 said:


> Nah, I mean I was under the impression the initial card was to be a 4870 1gb single gpu that would at least hopefully compete w/ the offerings from nvidia (1gb and 896mb). It seems, though, that they are instead launching in the mid-range, giving nvidia free rein over the high-end for some period of time. I'm not saying these cards won't be an excellent value, I'm saying I was hoping for competition on all levels at the get-go for this round (for better pricing for us). Although, Nvidia doesn't seem to be launching in the middle-ground where most cards are bought, so maybe it's just a marketing thing. I just wanna see some fighting......
> 
> That 3870x2 came long after nvidia was at the top of the hill, and it was only one offering, and it was a dual-gpu. I was just hoping for more competition, one card (even though it is a great card) too far down the road doesn't do it.



It's been back and forth for years (or did you miss nVIDIA's GeForce FX debacle?). They screwed up with the 2900 series, but the 3800 series has done quite well in performance. The 3870 X2 was their king of the hill, and not only did it take the performance crown temporarily, it was also offered at a real budget price. I cannot believe I paid $500 for my X1900XT back in the day, and a 3870 X2 right now costs $300 and destroys my video card.


----------



## imperialreign (Jun 2, 2008)

Assimilator said:


> If there's no pressure to innovate, companies don't innovate. That's why AMD/ATI need to get off their asses and provide some decent competition.
> 
> And implementing technology for technology's sake is hardly the right way to make money. Did GDDR4 prevent the 2900 XT from being a POS? How is DirectX 10.1 useful if not ONE game in existence uses it? In this industry, performance is king - ATI can provide all the features they want, but if they don't provide the horsepower to go with them, no-one will buy their cards and hence no-one will use those features.



I completely agree.  At the least it shows that a company is willing to keep up with the times.  I've noticed, though, that before the merger with AMD, supporting new tech worked, because they were well on par with nVidia; but a lot of the newer tech support hasn't been of much importance since the merger . . .

anyhow, GDDR4 was never really shown to improve performance over GDDR3, just as GDDR3 wasn't over GDDR2 - the only difference that makes sense to me, IMO, is that it reduces loading stutter, since the memory can move texture files in and out quicker.  But this type of performance isn't captured, nor can it really be, in benchmarks.



> By that logic, all ATI has to do to regain the performance crown is release a GPU with 2 billion transistors.



it's a thought . . . but ATI has never really been about the brute force method


----------



## brian.ca (Jun 2, 2008)

farlex85 said:


> Nah, I mean I was under the impression the initial card was to be a 4870 1gb single gpu that would at least hopefully compete w/ the offerings from nvidia (1gb and 896mb). It seems, though, that they are instead launching in the mid-range, giving nvidia free rein over the high-end for some period of time. I'm not saying these cards won't be an excellent value, I'm saying I was hoping for competition on all levels at the get-go for this round (for better pricing for us). Although, Nvidia doesn't seem to be launching in the middle-ground where most cards are bought, so maybe it's just a marketing thing. I just wanna see some fighting......
> 
> That 3870x2 came long after nvidia was at the top of the hill, and it was only one offering, and it was a dual-gpu. I was just hoping for more competition, one card (even though it is a great card) too far down the road doesn't do it.



The initial cards aren't meant to compete with Nv's high end.. that's the role of the X2, which was slated for late Augustish I think?  There was a rumor about ATI wanting to see what the Nv cards were capable of before releasing their top of the line card. 

I think I heard of the 1gb card as well, but I'd imagine that'd be a no-go until the supply of GDDR5 gets better (it seems they had to cancel GDDR5 on the 4850s b/c of supply issues).  Ultimately these cards very much seem to be meant for the mainstream, where good price/performance ratios should be able to get ATI some market share back.  If 512mb is the sweet spot for most buyers, I'd imagine ATI would want to put all their eggs into that basket from the outset and put the current supply of memory towards those so they can sell more.



> How is DirectX 10.1 useful if not ONE game in existence uses it? In this industry, performance is king - ATI can provide all the features they want, but if they don't provide the horsepower to go with them, no-one will buy their cards and hence no-one will use those features.



More of a nitpick than anything, but Assassin's Creed actually did make use of 10.1, and from what I read it was supposed to make a pretty significant difference (I think I remember reading about a 20% performance gain when using AA).   

Also, "performance is king" is probably not correct either -- it's important, but it doesn't rule all.  I'm sure Nv sold a hell of a lot more 8800 GTs than Ultras or GX2s.  Not to mention the argument can be misleading.  You can have games where an ATI card will outperform a similar NV card that otherwise crushes the same ATI card in game x.  In cases like those, which is really the stronger card?  At times performance can rely as much on market share & cooperative development initiatives as it does on technical design & specs.  Where ATI may have the latter two, they are definitely lacking in the former.  It becomes a chicken & egg thing.  To make their cards perform better, ATI needs games to support their cards' features; to get the support they probably need stronger market share (it makes sense to put more support behind a card your consumer is just more likely to have); to get market share they need....


----------



## Wile E (Jun 2, 2008)

jonmcc33 said:


> No it is not overkill when you have DX10.1 which requires 4XAA. When you play with the AA/AF cranked then bandwidth becomes very important and is the bottleneck.
> 
> GDDR5 isn't unnecessary, it's unattainable by nVIDIA because they aren't involved in the development of it. Come to think of it, didn't ATi help develop GDDR3 as well? The same memory technology that nVIDIA is using on it's flagship cards?
> 
> ...


Ummm, even with 4xAA in DX10.1, GDDR3 on a 512bit bus IS NOT going to be a bottleneck. Again, look at the 2900XT with the 512bit bus. The GDDR3 and GDDR4 cards perform the same, because the bandwidth isn't anywhere near maxed out even on the GDDR3 cards.

And why do you care who developed it? All that matters is the cost and performance of the hardware, not who designed it.


----------



## jcfougere (Jun 2, 2008)

Ati needs to catch up financially.. that is to say, they need to come out of debt.

I believe it is an extremely good move for Ati to stay in the shadow of Nvidia at the ultra high end of the scale.  Why should they manufacture an expensive mega card that only markets to the 5% of users who actually buy the best of the best?

By sticking to affordable almost-high-end cards, they are bringing VERY good cards at FAR more reasonable prices, and starting to win back the mainstream users who buy up most of the cards sold by both companies.

After the 2900XT, people feared Ati might be a done company; now they are on the right track again with affordable high performance cards. Isn't that everyone's dream come true?


----------



## johnnyfiive (Jun 2, 2008)

jbunch07 said:


> yes!
> good news!



Indeed


----------



## jonmcc33 (Jun 3, 2008)

Wile E said:


> Ummm, even with 4xAA in DX10.1, GDDR3 on a 512bit bus IS NOT going to be a bottleneck. Again, look at the 2900XT with the 512bit bus. The GDDR3 and GDDR4 cards perform the same, because the bandwidth isn't anywhere near maxed out even on the GDDR3 cards.
> 
> And why do you care who developed it? All that matters is the cost and performance of the hardware, not who designed it.



When it comes to AA and AF, the bottleneck is memory bandwidth. Not sure what makes you think it isn't. The 2900XT had a 512-bit bus and then what did ATi do? They abandoned it for the 256-bit bus on the 3800 series. But again, the reason they can is because they can advance memory speed and performance, whereas nVIDIA cannot.


----------



## candle_86 (Jun 3, 2008)

imperialreign said:


> I completely agree.  At the least it shows that a company is willing to keep up with the times.  I've noticed, though, that before the merger with AMD, supporting new tech worked, because they were well on par with nVidia; but a lot of the newer tech support hasn't been of much importance since the merger . . .
> 
> anyhow, GDDR4 was never really shown to improve performance over GDDR3, just as GDDR3 wasn't over GDDR2 - the only difference that makes sense to me, IMO, is that it reduces loading stutter, since the memory can move texture files in and out quicker.  But this type of performance isn't captured, nor can it really be, in benchmarks.
> 
> ...



really?

R300 8pp to compete with NV25 with 4pp. 256bit over 128bit. Seems brute force to me. r4xx 16pp SM2b 520/560 mhz core vs NV40 16pp SM3 450mhz. Brute force to me again.
R580 16pp 48shaders vs G71 24pp 24 shaders Brute force once again from ATI. Heck the R600 320 shaders 512bit vs G80 128 shaders 384bit, brute force again

The last non brute force attempt on ATI/AMD's part was the R520.


----------



## HAL7000 (Jun 3, 2008)

Well, this all started with the mention of AMD shipping out the 4850 video cards. I enjoyed the thread, but the bottom line is: will they be worth the wait and the price? I wanted to upgrade a year ago and put it off because I did not feel that the hardware was worth the investment. 

AMD or Nvidia, it's all choices we live with. One leads, the other follows, and vice versa; one works through the front door (Intel), the other through the back door (AMD). But as for video cards, I am for the best implementation of new technology at a fair price. Memory, GPUs, CPUs and performance change quickly. 

I for one look forward to seeing the new 4800 series come to market. My personal gaming rig will own one of these cards. My Biostar TP35D3-A7 Deluxe 5.x and E6850 sit idle waiting for the 4870 X2 to be released. *AMD don't fail me now.*


----------



## Wile E (Jun 3, 2008)

jonmcc33 said:


> When it comes to AA and AF, the bottleneck is memory bandwidth. Not sure what makes you think it isn't. The 2900XT had a 512-bit bus and then what did ATi do? They abandoned it for the 256-bit bus on the 3800 series. But again, the reason they can is because they can advance memory speed and performance, whereas nVIDIA cannot.



You are completely missing my point. My point is that the technology in use does not matter, only the end result. If nVidia can get it done with a wider bus but slower ram, so be it. I am also saying that putting slower GDDR3 on a wider bus will be roughly equivalent in performance to faster GDDR5 memory on a narrower bus. Either way, it will be more bandwidth than the cards will be able to use, so it doesn't matter. All that is going to matter to most people is price and performance.
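The trade-off described here can be sanity-checked with a quick back-of-the-envelope calculation. The clock figures below are illustrative round numbers, not the actual shipping specs of any card:

```python
# Theoretical peak memory bandwidth = bus width (in bytes) x effective data rate.
# The clocks here are hypothetical, chosen only to show how a wide bus with
# slower memory can land in the same ballpark as a narrow bus with faster memory.

def bandwidth_gb_s(bus_bits: int, effective_mhz: float) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

wide_slow = bandwidth_gb_s(512, 2200)    # "GDDR3-class" speed on a 512-bit bus
narrow_fast = bandwidth_gb_s(256, 3600)  # "GDDR5-class" speed on a 256-bit bus

print(f"512-bit @ 2200 MHz effective: {wide_slow:.1f} GB/s")    # 140.8 GB/s
print(f"256-bit @ 3600 MHz effective: {narrow_fast:.1f} GB/s")  # 115.2 GB/s
```

Either configuration yields a comparable amount of theoretical bandwidth, which is the crux of the "only the end result matters" argument.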


----------



## eidairaman1 (Jun 3, 2008)

Dude, what is the problem with you?  :shadedshu



candle_86 said:


> really?
> 
> R300 8pp to compete with NV25 with 4pp. 256bit over 128bit. Seems brute force to me. r4xx 16pp SM2b 520/560 mhz core vs NV40 16pp SM3 450mhz. Brute force to me again.
> R580 16pp 48shaders vs G71 24pp 24 shaders Brute force once again from ATI. Heck the R600 320 shaders 512bit vs G80 128 shaders 384bit, brute force again
> ...


----------



## [I.R.A]_FBi (Jun 3, 2008)

lawl .. calm down guys .. get a beer or some juice and gin. candle is an NV d00d forever.


----------



## eidairaman1 (Jun 3, 2008)

He certainly seems to try to filibuster his point, or beat a dead horse. How much fun is it to beat a dead horse?


----------



## erocker (Jun 3, 2008)

Yeah, I think his opinions are well ingrained into all of us by now.  Let's move forward!


----------



## eidairaman1 (Jun 3, 2008)

erocker said:


> Yeah, I think his opinions are well ingrained into all of us by now.  Let's move forward!



:banghead:


J/K


----------

