
January 9 Launch Date for AMD Radeon HD 7900

I'll have to wait for some tests this time to see if the new architecture is really any good for games. Data crunching is fine as it is, as far as I'm concerned.
 
This is earlier than expected.
 
I will be picking up two of these bad boys in February :) ..My 5850s in Crossfire are still a powerhouse, but their time has come :)
 
Hmm .. I just asked for an RMA on my MSI 6970 Lightning ... kind of wish I would have waited a month .. might have lucked out and gotten a 7970.


But on the other hand .. 6970s will drop in price so 2 more would be nice...
 
Donanimhaber is a rumor shotgun; it's bound to hit something right. I've seen too many benches from them that weren't accurate to believe everything they say, though. I think some of those might actually be purposely leaked by AMD.

In any case release around CES seems likely. It looks like AMD already handed out the test cards, which is a little surprising. Perhaps they're keeping the drivers until closer to release.

Just sold my second 6950. I'll make do with one until I can get my hands on one of these babies! :rockout:
 
I think my GTX 580 will serve me for quite some time still... there are no games I can't max, so no need for another GPU upgrade just yet :)
 
So this is how the 2012 apocalypse happens: AMD releases a single GPU so powerful that it simulates the entire universe, causing several wormholes to be created that swallow our galaxy supercluster. Yeah, that makes sense now.
 
Yeah, have you looked at the leaked shot? The Volterra digital PWM, their best-quality part (which nVidia lacks), is gone. They'd been using it since the HD 2900, and now it's analog with two inductors off. Like a non-reference 6970 rather than a reference one. Doesn't seem "super duper mega awesome" to me.
 
So AMD has advanced past the point of the HD2900 and you think the card is going to flop? I fail to see your logic there ;D
 
I'm pretty sure he's just saying that AMD cheaped out on parts, which is apparent from the pictures. Then again, this could be the 7950 and we haven't seen the retail 7970 yet. I'm just waiting to see results first.
 
No, they went from the best Volterra 8-phase digital PWM solution to a cheap analog one. The one they had before gave accurate voltage readings and adjusted voltage on its own, along with fast power-phase shifting. The one they have now is only better than nVidia's exploding VRMs.
 
Good observation there, and if true, what does that mean for power requirements and stability? Is the new architecture able to work within a less stringent power envelope? I mean, I can't see them scaling back a sample or reference design unless they know they can maintain stability with an analog controller and those inductors.
 
Well actually, digital PWM creates more heat and is more expensive to make. You can do the same with analog, though it may not have as high an OC potential (e.g. the 570/590). nVidia uses analog for that reason, while AMD went with digital since their cards run cooler to boot. IDK what it might mean about the GPU itself other than the fact that they cheaped out.
 
Are there such things as high-quality analog PWMs that allow for more OC potential, or is there a physical limit to how good an analog controller can be? If AMD did use one to save money, then maybe that is how they will afford to undercut all of nVidia's cards, and consumers get less expensive products.

Basically, do you know if this will affect the card much, or is it just like having a top-of-the-line stereo system inside a NASCAR? Not really needed, but nice to have.
 
Well, this coincides with my own estimated ETA for these cards. If true, good job, AMD!!
I had hoped for a release this month, but didn't expect cards until January, with the dual-GPU card in March/April.
Yes, seems AMD are on the ball...at least with graphics.
A couple of points:
1.Launch is good. Retail availability is better.
2.New µarch likely means a greater emphasis on getting the drivers right. Having a Ferrari in the garage means squat if the throttle linkage is broken.
2a. With a new µarch, I suspect the driver team(s) will be concentrating their efforts on GCN based cards. Not a good sign with the amount of game related bugs still prevalent with VLIW4/5 based cards. AMD seem to have trouble optimizing for one µarch, and now they're doubling the fun. Does this mean that AMD are now consigning HD4000/5000 to legacy driver status?
New designs every year is almost too fast!
Yes, I too was an uber-enthusiast...until I took an arrow to the knee
 
Yeah, there are many strong analog solutions, like the ones in MSI's non-reference cards. But this one looks no different from the phases they used in the second-revision 6950/70, cheap ones to be precise. We can't be sure how it'd affect the board before it's released. It probably works well, but it might be limited like the 570's. Volterra solutions are always more accurate at power delivery (e.g. reference over non-reference).
 
Why do they look cheap? The Ti VRMs look just fine, maybe better than whatever nVidia is using on their GTX 570 and all their cards in general. For god's sake, the 6870 and the GTX, despite the huge power consumption difference between them, are both using 4 GPU phases... so tell me, who is cheaping out, huh? Both do it to make more profit, but at least I haven't heard of any recent AMD cards blowing up like the GTX 570 or especially the 590 do.
 
Not sure where the Ti or the 6870 came into the discussion, lol. I was on about the high end (7900/6900).
 
Yes, seems AMD are on the ball...at least with graphics.
A couple of points:
1.Launch is good. Retail availability is better.

Yeah, time will tell how much of a paper launch this will be. It's quite normal for me to have to wait about 3 months before being able to buy cards locally. It seems Bioware is good at snapping up new hardware locally. :laugh:

2.New µarch likely means a greater emphasis on getting the drivers right. Having a Ferrari in the garage means squat if the throttle linkage is broken.
2a. With a new µarch, I suspect the driver team(s) will be concentrating their efforts on GCN based cards. Not a good sign with the amount of game related bugs still prevalent with VLIW4/5 based cards. AMD seem to have trouble optimizing for one µarch, and now they're doubling the fun. Does this mean that AMD are now consigning HD4000/5000 to legacy driver status?

I have no idea how to comment on this. What I hope to see is AMD committing a team of programmers to each core design, GCN and VLIW respectively. It seems this would be the best way to do things, but then they are very reliant on having the right people in the right places, if they have them at all.

I'm not too sure what I'm gonna do. If these cards are fast enough to do Eyefinity decently, I'm gonna buy a couple, for sure. If not, my 6950's do a decent job on a single monitor, so I have no interest in buying new cards. I'd rather buy someone's used cards and get 2 more 6950's before buying into a new gen. 4 cards are just so much more visually impressive in how they fill a case, anyway. :laugh:
 
For comparison, here are 3 pics. This is the 7900.

amd_tahiti_nude_li01.jpg


This one's an original, reference 6970.

sapphire-6970-scan-front.jpg


And this is a second-revision, cheaped-out 6970.

Sapphire-Radeon-HD6970-2GB-V2.jpg
 
All the author of DomainHamber :D does is make up crap. That site is a joke. Why would they, from Silicon Valley, CA, send their most important info to some junk site in a 3rd-world country? They never get any info. They just stir it up.

You, my dear, are an idiot.

If you took a few minutes to check out their rumor posts from the past, you would surely notice that they have been correct on pretty much everything. The only exceptions are the inferences people make from those announcements. People like you.
 
I've used Volterra PWM-powered reference AMD boards, and cheapo Sapphire/HIS boards that use the same GPU but cheaper uPI/CHiL/OnSemi + DPAK circuits. I could never tell any quality difference. I guess the Volterra-powered boards only have an edge with extreme cooling... that is, if AMD doesn't screw you over with clock-speed limits.
 
IDK what it might exactly mean about the GPU itself other than the fact that they cheaped out.
Given the price they have for this, including the TSMC 28nm price increase and yields... and since the 28nm GCN silicon doesn't absolutely demand it now, they made concessions.

It also seems they don't intend to push OCing themselves and will leave it up to the AIBs to do what Nvidia has been doing: letting AIBs build the OC units with a beefier power section to achieve their halo products, which is quite a difference from how they did it in the past.

If I understood right, I read several months back that AMD would permit AIBs more leeway right from the initial release, rather than having them put out basically stickered reference designs. Heck, most often the reason to grab a newly released AMD/ATI reference card was that those most often provided the best OCing. Going forward it may be more like Nvidia's AIBs: some generic release cards, and a full stable of über units almost from day one.
 
Well, my game of the moment (Skyrim) is kicking the crap out of my CPU more than my GPU so I don't really care (which I did prior to Skyrim's launch). Wait for Kepler...
 