Sunday, January 11th 2009
AMD's Response to G200b Slated for March
NVIDIA snatched the performance crown from ATI with the introduction of the GeForce GTX 295 accelerator, and its launch itinerary for CES 2009 includes the GeForce GTX 285, NVIDIA's second-fastest graphics accelerator. NVIDIA's campaign to regain the performance crown was spearheaded by the G200b graphics processor, which, while not offering anything new, helped cut manufacturing costs and reduced the thermal envelope of the GPU, making conditions favourable for a dual-GPU accelerator, the GeForce GTX 295.
AMD, on the other hand, has announced price-cuts to respond to the GeForce GTX 295, lowering the prices of its Radeon HD 4870 X2 accelerator. The G200b is likely to get a competitor from AMD by March, when the company is looking to release the industry's first GPU built on the 40 nm manufacturing process, the RV740. But wait, there seems to be something larger on the cards, according to various sources Hardware-Infos got in touch with: AMD is planning the RV790 graphics processor. It will be a current-generation GPU built on the next-generation 40 nm manufacturing technology. There is a lot of speculation surrounding the RV790's specifications, with some of the more plausible ones hinting it has two additional SIMD clusters (960 SPs) and a total of 48 texture mapping units (TMUs). Both the RV740 and RV790 are slated for March; there is also some indication of AMD using the occasion of CeBIT for its announcements and product launches.
Source:
Hardware-Infos
36 Comments on AMD's Response to G200b Slated for March
I love competition. It brings prices down :)
:shadedshu
Anyway, if it were true I would immediately take the negotiators from both companies and make them lead peace negotiations around the world ASAP. World peace FOREVER!! They must be that good. :roll:
Back to topic, IMO 2 more clusters won't increase performance much unless they also increase ROP performance by the same or almost the same amount. I'm basing this opinion just on how close the HD4830 is, clock for clock, to its bigger brothers, and even the "faulty" 560 SP HD4830s were pretty close. IMHO 2 more SIMD clusters won't help much, and higher clocks would help far more.
Also, Nvidia was in the lead for 2 years and Ati was fighting on price. Ati would never agree to something that took them from 50% of the discrete card market share to around 25% at their lowest. That simply doesn't fit the "conspiracy" theory. I'll never understand that mentality. New products always mean progress.

And your $300 card is as good today as a comparable $300 card was in the past. (Far better in fact: in the past cards couldn't keep up with the games, now it's the opposite.) It also lasts the same, whether the next card comes in 3 months and is 25% faster or comes in a year and is twice as fast. As long as your "old" card can play everything, and never in history has that been more true than now, the card has not lost its value. You don't always need the best of the best; maybe you want it, but you don't need it. If you feel you do, you have a problem, and I'm being serious about that.

So, because it's about wanting and not needing, if you always want the best, even if that "best" is just a little bit faster than what you have, then you should be prepared to pay. No one forces you into buying every card. The current market model makes prices far better and also lets you buy a card whenever you want, and you will always get the best your money can buy. WIN WIN.
these companies need to slow down the releases, bring out more meaningful cards, and STOP RENAMING OLD PRODUCTS AND RESELLING THEM AS NEW ITEMS!!!!!!
at least they could make some changes like those made between the 2900 and 3800 cards (the changes did improve avivo playback acceleration!!!)
crysis=tech demo for cryengine2 that never got patched properly, unlike the techdemo for cryengine1 (farcry!!!)
haha, i got nwn2:soz running in the background because alt+tabbing back in is easier/faster than restarting the game :P
This is hurting the GPU industry, IMO. You've got both companies rushing out tech as soon as it is developed, instead of spending the time to refine it. So we get something like G80 and R600, where both were far from perfect and both companies could have simply waited a couple of months to refine them into what they eventually became, G92 and RV670. We have the same thing with G200b: nVidia could have easily just not released G200 and left G92 to contend with ATi's new offerings until the G200b was ready.
I'm all for progress, and there is no reason that progress can't be made at the same rate it is moving now, without high end product releases every 2 months. One high end product release a year is all that is really necessary, none of this baby step BS. And it will free up some time to work on the mid-range market, which really needs some help, especially on nVidia's side.
- We are able to get a better card for the same money now than when only a card or two a year were released, because competition brings prices down. Maybe it's not going to be the best after a month, but it certainly is comparatively better. If only a few cards were released, competition wouldn't exist; think about the G80 days. So which is it we want: better cards so we can play better games, or the best card for a long period of time (even if it's not the best that could exist), which serves nothing but being able to say you are above others...
- Forget about one major release per year that will make a card 2x as fast; that's a thing of the past. As complexity has increased, the development cycle has gone up, just as with CPUs, and is probably now somewhere around 18 months or more. Companies just can't be the loser for that long; again, remember the G80 days. In that time, new processes can appear, yields can improve and so on, and those things make it possible to release cards that are up to around 50% faster. It'd be stupid not to use those improvements, especially if you are the one behind.
- Related to the above: this industry is a chicken-and-egg thing. Without a card that could run them, games would never improve; developers just can't take the risk. On the other hand, GPU manufacturers can't take the risk of releasing a card that would be overkill either. But in order to improve, someone has to take the risk. Well, Crytek took the risk and we know how that ended up. Yet they knew better cards were coming out soon; imagine if they had had to wait 18 months. That simply wouldn't be profitable, and all developers would just make their releases coincide with card releases. Even then it wouldn't be profitable: only the best games would sell, and developers don't know if their game will be the best one. They can hope, they can put in as much energy as they can, but they never know. And that's unsustainable; no one works for 3 years just to get nothing in return.
That also applies to the technologies behind the GPUs: fab process, RAM, PCBs, everything. If they knew their advancements would not be used until 18 months later, they wouldn't put much effort into them. Who would put money into something so uncertain, happening only every 2 years, without knowing whether you'd get a second chance to have your tech used in a later product? Because if you develop something every 18-24 months and you happen to lose to another company, or if you end up better but you are late, you'd have to wait another 18 months, and by then your product wouldn't be the best one anyway.
The industry advances so fast because the wheel keeps rolling for every link in the chain, and the ones higher up the chain use the best at hand to make the best they can at every moment. Break one link and everything falls apart.
- Sorry for the rant, but there's one more thing to take into account: the market today is not as it was in the past; it's already saturated. In its infancy every market is easier. When only 10% of the target population has your product or one of your competitors' products, you fight to convince buyers to buy your product. You don't care about competition. But when the market is saturated, you have to convince them to upgrade over what they have, so being the loser, even if only by a bit, is unacceptable. People will upgrade to whatever is better. Price wars don't help there, since the competitor with the best product can always fight you on price too: Intel vs AMD.