Friday, June 26th 2015

AMD Didn't Get the R9 Fury X Wrong, but NVIDIA Got its GTX 980 Ti Right

This has been a roller-coaster month for high-end PC graphics. The timing of NVIDIA's GeForce GTX 980 Ti launch had us putting the finishing touches on its review with our bags for Taipei still not packed. When it launched, the GTX 980 Ti set AMD a performance target and a price target. Then began a three-week wait for AMD to launch its Radeon R9 Fury X graphics card. The dance is done, the dust has settled, and we know who has won - nobody. AMD didn't get the R9 Fury X wrong, but NVIDIA got its GTX 980 Ti right. At best, this stalemate yielded a 4K-capable single-GPU graphics option from each brand at $650. You already had those in the form of the $650-ish Radeon R9 295X2, or a pair of GTX 970 cards. Those with no plans for a 4K display already had great options in the form of the GTX 970 and the price-cut R9 290X.

The Radeon R9 290 series launch from Fall 2013 stirred up the high-end graphics market in a big way. The $399 R9 290 made NVIDIA look comically evil for asking $999 for the card it beat, the GTX TITAN; while the R9 290X remained the fastest single-GPU option, at $550, until NVIDIA launched the $699 GTX 780 Ti, to get people back to paying through their noses for the extra performance. Then there were two UFO sightings in the form of the GTX TITAN Black and the GTX TITAN-Z, which made no tangible contributions to consumer choice. Sure, they gave you full double-precision floating point (DPFP) performance, but DPFP is of no use to gamers. So what could have been the calculation at AMD and NVIDIA as June 2015 approached? Here's a theory.
Image credit: Mahspoonis2big, Reddit

AMD's HBM Gamble
The "Fiji" silicon is formidable. It made performance/Watt gains over "Hawaii," despite the lack of significant shader-architecture performance improvements between GCN 1.1 and GCN 1.2 (nothing of the kind seen between NVIDIA's "Kepler" and "Maxwell"). AMD managed a 45% increase in stream processors for the Radeon R9 Fury X at the same typical board power as its predecessor, the R9 290X. The company had to find other ways to bring down power consumption, and one way to do that, without sacrificing performance, was implementing a more efficient memory standard, High Bandwidth Memory (HBM).

Implementing HBM, right now, is not as easy as GDDR5 was when it was new. HBM is more efficient than GDDR5, but it trades clock speed for bus width, and a wider bus entails more pins (connections), which would have meant an insane amount of PCB wiring around the GPU in AMD's case. The company had to co-develop the industry's first mass-producible interposer (a silicon die that acts as a substrate for other dies), relocate the memory to the GPU package, and still make do with the design limitation of first-generation HBM capping out at 8 Gb (1 GB) per stack, or 4 GB across the four stacks on AMD's silicon, after having laid out a 4096-bit wide memory bus. This was a bold move.
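To put that clock-speed-for-bus-width trade in rough numbers, here is a back-of-the-envelope sketch using the commonly cited specs (assumed, not taken from this article): first-generation HBM on "Fiji" runs a 4096-bit bus at about 1 Gbps per pin, while "Hawaii's" GDDR5 runs a 512-bit bus at 5 Gbps per pin.

```python
def peak_bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s: pin count times per-pin rate, divided by 8 bits/byte."""
    return bus_width_bits * gbps_per_pin / 8

# First-gen HBM on "Fiji": very wide bus, modest per-pin rate
hbm1 = peak_bandwidth_gb_s(4096, 1.0)   # 512.0 GB/s
# "Hawaii" GDDR5: one-eighth the width, five times the per-pin rate
gddr5 = peak_bandwidth_gb_s(512, 5.0)   # 320.0 GB/s
print(hbm1, gddr5)
```

The width, not the clock, is what delivers HBM's bandwidth advantage - and the width is exactly what made the interposer necessary.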

Reviews show that 4 GB of HBM isn't Fiji's Achilles' heel. The card still competes in the same league as the 6 GB memory-laden GTX 980 Ti at 4K Ultra HD (the resolution that's most taxing on video memory). The card is just 2% slower than the GTX 980 Ti at this resolution, and its performance/Watt is significantly higher than the R9 290X's. We reckon this outcome would have been impossible had AMD not gambled on HBM and instead stuck to the 512-bit wide GDDR5 interface of "Hawaii," just as it stuck to a similar front-end and render back-end configuration (the front-end is similar to that of "Tonga," while the ROP count is the same as "Hawaii's").

NVIDIA Accelerated GM200
NVIDIA's big "Maxwell" silicon, the GM200, wasn't expected to come out as soon as it did. The GTX 980 and its 5 billion-transistor GM204 silicon are just nine months old in the market, and NVIDIA has sold a lot of these; given how the company milked the GM204's predecessor, the GK104, for a year in the high-end segment before bringing out the GK110 with the TITAN, something similar was expected of the GM200. Its March 2015 introduction - just six months after the GTX 980 - was unexpected. What was also unexpected was NVIDIA launching the GTX 980 Ti as early as it did. This card has effectively cannibalized the TITAN X just three months after its launch. The GTX TITAN X is a halo product, overpriced at $999, and hence not a lot of GM200 chips were expected to be in production. We heard reports throughout Spring that the launch of a high-volume, money-making SKU based on the GM200 could be expected only after Summer. As it turns out, NVIDIA was preparing a welcoming party for the R9 Fury X, with the GTX 980 Ti.

The GTX 980 Ti was more likely designed with R9 Fury X performance, rather than a target price, as the pivot. The $650 price tag is likely something NVIDIA came up with later, after having achieved a performance lead over the R9 Fury X, by stripping down the GM200 as much as it could to get there. How NVIDIA figured out R9 Fury X performance is anybody's guess. It's more likely that the price of R9 Fury X would have been different, if the GTX 980 Ti wasn't around; than the other way around.

Who Won?
Short answer - nobody. The high-end graphics card market isn't as shaken up as it was right after the R9 290 series launch. The "Hawaii" twins held their own, and continued to offer great bang for the buck, until NVIDIA stepped in with the GTX 970 and GTX 980 last September. $300 doesn't get you much more than it did a month ago. At least now you have a choice between the GTX 970 and the R9 390 (which appears to have caught up); at $430, the R9 390X offers competition to the $499 GTX 980; and then there are leftovers from the previous generation, such as the R9 290 series and the GTX 780 Ti, but these aren't really the high-end we were looking for. It was a joy to watch the $399 R9 290 dethrone the $999 GTX TITAN in Fall 2013, as people upgraded their rigs for the holidays. We didn't see that kind of spectacle this month. There is a silver lining, though: a rather big gap between the GTX 980 and GTX 980 Ti is just waiting to be filled.

Hopefully, July will churn out something exciting (and bona fide high-end) around the $500 mark.

223 Comments on AMD Didn't Get the R9 Fury X Wrong, but NVIDIA Got its GTX 980 Ti Right

#101
newtekie1
Semi-Retired Folder
FrickIn 12 of the 22 games of w1z's charts it is actually faster than the 980ti. In that chart I count 12 games. I haven't bothered to see if they are the same games.
That is nice, overall, it is slower so their claim is false. Period.
Frickwww.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,11.html


Obviously if some are getting 49°C and others are getting 60°C in open test benches the claim of <50°C typical load temps isn't true. Since, you know, typically people put their computers in actual computer cases...typically.
FrickI haven't watched the video, but it is the fastest in cherry picked tests. Which is the point of PR, and as far as I'm concerned they haven't told lies. Picking on a company for PR speak is like hating the sky because it's blue, or the sun because it shines.
Yep, and I can say a Prius is the fastest car in the world if I cherry pick tests. I'd be lying, but I could do it.

There is cherry picking tests and showing those in press releases, and then there is flat out making false claims without showing where you are getting those claims from. If you say you have the fastest GPU in the world with no further explanation, it better damn well be faster than every other GPU overall, not just in cherry picked tests, otherwise it is a lie. Now, if they said they have the fastest GPU in the world based on 3DMark Firestrike results, then that would be fine. And if they say they have the most efficient GPU in the world, it had better give the best performance per Watt of anything on the market; there is no debating that claim is a lie, because the 980 Ti simply crushes the Fury X in that department.
Posted on Reply
#102
Frick
Fishfaced Nincompoop
newtekie1That is nice, overall, it is slower so their claim is false. Period.
Well, it means that the 980 ti is faster than the Fury X in 10 games, whereas the Fury X is faster than the 980ti in 12 games. The claim was for 4k, and it seems to hold up pretty well.

I'm not saying the Fury X is better than the 980ti, because it's not, for all the reasons you say (if I was in the market for a high end GPU I'd get the Fury X for the hell of it). But to say AMD is lying and berating them for it is overreacting. If you take a company's PR speak literally, I consider that to be your problem. AMD is doing it too much, sure that I can agree with (because they totally do), but again, if you take every PR slide at face value that is indeed your problem.
Posted on Reply
#103
Folterknecht
64KSteam Hardware Surveys aren't scientific but I look at them to get some general idea what people are gaming on. Very very few are gaming on high end cards. The vast majority surveyed are gaming on entry level and mid range cards or integrated graphics. Neither Fury X nor 980 Ti will constitute much of AMD/Nvidia sales. As far as monitors, between 1368x768 and 1920x1080 combined totals 60% of monitors. At my resolution, 1440p, it's 1.1%, and despite all the talk of 4K becoming mainstream it's 0.06% (about 1 out of 1,600). The vast majority don't need or want a high end card.

store.steampowered.com/hwsurvey
As long as it isn't possible to play 4K on $150-250 cards at reasonable framerates and settings, nothing much will change there.

I mean, look at the 980 Ti and Fury X at 4K: they usually land around 40-45 FPS average at higher settings. No thanks, I'd rather play Full HD or 1440p with everything maxed while keeping minimum FPS at around 60, than take what the current high-end can offer at 4K.

In two years we might get to that point, and the FreeSync/G-Sync nonsense might even be sorted out by then.
Posted on Reply
#104
the54thvoid
Super Intoxicated Moderator
FrickWell, it means that the 980 ti is faster than the Fury X in 10 games, whereas the Fury X is faster than the 980ti in 12 games. The claim was for 4k, and it seems to hold up pretty well.

I'm not saying the Fury X is better than the 980ti, because it's not, for all the reasons you say (if I was in the market for a high end GPU I'd get the Fury X for the hell of it). But to say AMD is lying and berating them for it is overreacting. If you take a company's PR speak literally, I consider that to be your problem. AMD is doing it too much, sure that I can agree with (because they totally do), but again, if you take every PR slide at face value that is indeed your problem.
I can see where you are coming from with that response. Kudos for being rational.

I'm sure Nvidia and AMD had a dual gpu stand off some time back when Nvidia claimed they had the fastest dual card but it really was head to head. Both sides have done it.

Their biggest fat lie was the part where they said it was an overclockers dream. Now that WAS a lie, unless you class a nightmare as a dream. The overclock is what is driving me to a 980ti.
Posted on Reply
#105
Frick
Fishfaced Nincompoop
the54thvoidTheir biggest fat lie was the part where they said it was an overclockers dream.
Wow, they said that? That I can agree on is just plain wrong. :laugh:
Posted on Reply
#106
MrGenius
the54thvoidTheir biggest fat lie was the part where they said it was an overclockers dream. Now that WAS a lie, unless you class a nightmare as a dream.
FrickWow, they said that? That I can agree on is just plain wrong. :laugh:
Based on what? You've got one? You've applied some overvoltage and it resulted in what? What we know as of now is...W1z got 10% with no added volts. Which if you know anything about/have any experience overclocking graphics cards is not bad(AT ALL). My 280X won't do half of that with no added volts(and with an Arctic Accelero Extreme IV installed to boot).

And who knows what the ASIC quality is on the one they gave him? Well...besides him.

And then there's this:
Note: we'll have to wait for proper support in tweak utilities before we can draw a final conclusion. I'm betting that with a little extra voltage we can reach 1250 MHz. But that remains an estimated guess.
www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,37.html

Their sample did 1125MHz. So we can also guesstimate W1z's might do 1275MHz. That's what you call an overclocker's dream. I know, I am one. And I still insist it'll go higher than that actually. Someone, with the skills, will push one well over 1400MHz. Mark my words.
Posted on Reply
#107
Folterknecht
^All reviews indicate slightly above 1100 MHz without voltage tweaks for OC. And given the IR images on Tom's Hardware, I highly suggest keeping your fingers off voltage increases without proper water cooling, if you plan to keep that card for a longer period of time.
Posted on Reply
#108
the54thvoid
Super Intoxicated Moderator
MrGeniusBased on what? You've got one? You've applied some overvoltage and it resulted in what? What we know as of now is...W1z got 10% with no added volts. Which if you know anything about/have any experience overclocking graphics cards is not bad(AT ALL). My 280X won't do half of that with no added volts(and with an Arctic Accelero Extreme IV installed to boot).

And who knows what the ASIC quality is on the one they gave him? Well...besides him.

And then there's this:

www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,37.html

Their sample did 1125MHz. So we can also guesstimate W1z's might do 1275MHz. That's what you call an overclocker's dream. I know, I am one. And I still insist it'll go higher than that actually. Someone, with the skills, will push one well over 1400MHz. Mark my words.
“You’ll be able to overclock this thing like no tomorrow,” AMD CTO Joe Macri says. “This is an overclocker’s dream.”

Direct quote, courtesy of @HumanSmoke's link. Why do you feel the illogical need to jump to attack what I said?
All the reviews had poor overclock results. Just deal with it.

Or is it the world of AMD to release products that don't work as planned at launch? The 980ti didn't have this problem at launch. It over clocked just fine.
You're clinging onto some hope by using the tired line of, "once they get it right it'll be good". But really? Really?

Sorry man, your posts are really quick to attack what is known and evidenced thus far by ALL the reviews. I've praised the card, it's great and its a good implementation but for me the shitty over clock is a deal breaker.

oh, FTR, my power color 7970 LCS hit 1300 core (the other was BIOS locked at 1225 Max). My 780ti hit shy of 1400. I look forward to a 980ti on 1500+.
Posted on Reply
#109
newtekie1
Semi-Retired Folder
FrickWell, it means that the 980 ti is faster than the Fury X in 10 games, whereas the Fury X is faster than the 980ti in 12 games. The claim was for 4k, and it seems to hold up pretty well.

I'm not saying the Fury X is better than the 980ti, because it's not, for all the reasons you say (if I was in the market for a high end GPU I'd get the Fury X for the hell of it). But to say AMD is lying and berating them for it is overreacting. If you take a company's PR speak literally, I consider that to be your problem. AMD is doing it too much, sure that I can agree with (because they totally do), but again, if you take every PR slide at face value that is indeed your problem.
I'm not berating them for it; thinking I am is an overreaction. I'm just stating that that is what they did, and why the launch ended in disappointment.

You are also focusing on just one of the lies they told; I listed several. It is very simple: if the Fury X were faster than the 980 Ti at 4K, then the Fury X would be higher than the 980 Ti on this graph:


But it isn't. If their statement were true, I'd expect the Fury X to have no problem topping the 980 Ti in overall performance.
Posted on Reply
#110
HumanSmoke
MrGeniuswww.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,37.html
Their sample did 1125MHz. So we can also guesstimate W1z's might do 1275MHz. That's what you call an overclocker's dream. I know, I am one. And I still insist it'll go higher than that actually. Someone, with the skills, will push one well over 1400MHz. Mark my words.
How do you guesstimate that?
The overwhelming marketing impact of a new card is its launch where it is the focus of the consumer tech world. Either AMD don't know a thing about selling their own product, or they voltage locked the card for a reason. Maybe they are aware that adding voltage brings adverse results ( a steep ramp in temps affecting GPU/HBM? overwhelming the AIO resulting in GPU throttling producing lower benchmark results than the stock settings? As an engineer don't you concede that both these outcomes might be possible?)
Bearing in mind that the overclocks achieved have been done with no voltage increase, and the power envelope has been raised in accordance with clockspeed...

...maybe AMD already know how the card acts under more extreme conditions.
Posted on Reply
#111
MrGenius
the54thvoid“You’ll be able to overclock this thing like no tomorrow,” AMD CTO Joe Macri says. “This is an overclocker’s dream.”

Direct quote, courtesy of @HumanSmoke's link. Why do you feel the illogical need to jump to attack what I said?
All the reviews had poor overclock results. Just deal with it.

Or is it the world of AMD to release products that don't work as planned at launch? The 980ti didn't have this problem at launch. It over clocked just fine.
You're clinging onto some hope by using the tired line of, "once they get it right it'll be good". But really? Really?

Sorry man, your posts are really quick to attack what is known and evidenced thus far by ALL the reviews. I've praised the card, it's great and its a good implementation but for me the shitty over clock is a deal breaker.

oh, FTR, my power color 7970 LCS hit 1300 core (the other was BIOS locked at 1225 Max). My 780ti hit shy of 1400. I look forward to a 980ti on 1500+.
  1. You weren't the only one I quoted, don't single yourself out on my account.
  2. If asking you questions about your reply in a straight-forward and civil manner is illogical, and/or what you call being attacked...see a psychologist/psychiatrist. You've got some issues to deal with.
  3. Poor overclocking results is a conclusion that one comes to who has no real idea how this works.
  4. Yes really! Really!
  5. ALL the reviews support my statements. It's you who's misinterpreting them.
  6. 7970 liquid cooled @ 1300 core...meh...not bad. Sounds pretty typical.
  7. 7970 BIOS locked @ 1225 Max? Trying to prove you really don't know what the hell you're talking about...or what? Hint: there's no such thing as a BIOS lock on overclocking. It's a hardware thing.
  8. 780 Ti @ ~ 1400. Don't doubt it. Doesn't prove anything.
  9. 980 Ti @ 1500+? Possibly. I'd like to see it. Go for it! Then ask yourself this: How much performance have I gained relative to a similarly overclocked Fury X. And no, not a Fury X @ the same core speed. Since given equal core speeds = YOU LOSE BADLY!!!
Homework people, homework. I'm not really as big a dufus as you think I am. Or that I lead on to be. It's not wise to display one's full power to just anyone and everyone. Then you could prepare yourself for what I actually have in store. And possibly reserve your best tricks for last too. My A-game is saved for ONLY when absolutely required. Shoot...I've said too much. That is all.

@HumanSmoke I'll concede that. But we're still in the realm of suppositions and assumptions here, as opposed to hypotheses. Or better yet, theories.

Theory>Hypothesis>supposition/assumption>conjecture
Posted on Reply
#112
newtekie1
Semi-Retired Folder
MrGeniusPoor overclocking results is a conclusion that one comes to who has no real idea how this works.
So I guess W1z has no real idea how this works then, considering he even said he was disappointed with the overclocking of the Fury X...
Posted on Reply
#113
HumanSmoke
MrGenius980 Ti @ 1500+? Possibly. I'd like to see it. Go for it!
You don't read W1zz's reviews at this site?
As a comparison...
From the Gigabyte GTX 980 Ti G1 review at HardOCP
Without touching any voltage, we began increasing our clock speed. We hit a brick wall when our boost clock hit 1500MHz. The real-time frequency in game was a stable 1524MHz. We also managed to increase the memory frequency by an outstanding 1100MHz. That brought our memory frequency to 8.11GHz........We found by enabling over voltage and increasing the offset maximum to the highest level, which is .087V, that we could squeeze a few MHz more out of the video card. We increased our boost clock from 1500MHz up to 1513MHz. This also altered the actually frequency in game, raising it to 1550MHz.
...and Gigabyte's G1 isn't the fastest 980 Ti out there. Word is that Galaxy's HOF is an overclocking monster, while the HOF LN2 edition is shaping up as a serious contender for the Classified KPE
Posted on Reply
#114
HumanSmoke
MrGenius980 Ti @ 1500+? Possibly. I'd like to see it. Go for it! Then ask yourself this: How much performance have I gained relative to a similarly overclocked Fury X. And no, not a Fury X @ the same core speed. Since given equal core speeds = YOU LOSE BADLY!!!
You're an engineer and making direct comparisons of clockspeed between disparate architectures? That sounds - if you don't mind me saying - pretty stupid. Are you also going to argue that the efficiency of Fiji should be compared on core count? Maybe a salvage-part Fiji with ~2600 cores vs the GTX 980 Ti? That sounds like fun!
BTW, re: the relative overclocking merits, here's Hexus's OC'ed Fury X (8.6% OC w/ no voltage adjustment) and Gigabyte's G1 980 Ti (8.9% OC w/ no voltage adjustment)
Posted on Reply
#115
arbiter
newtekie1AMD can't go back and change the way they hyped up the card, they can't undo the straight up lies they told about the card leading up to its launch. Yes, companies hype up products, they even cherry pick test results to make the product look good before launch. However, AMD made bold claims, with no cherry picked sources to back them up. They just flat out made claims that were simply not true. There is a difference between what AMD did before the launch of Fury X and what other companies do to promote their upcoming products.
Techreport talked a bit about the results AMD had and the settings they used. The way they explained it, the settings were chosen to take full advantage of the shaders on the card, not what would look best. So it would have been like testing games that have PhysX with it on, regardless of whether an AMD or NVIDIA GPU was in the machine. AF being off kinda says a lot, since AF is pretty low-impact and makes most games look a ton better; the fact it was off is a clear sign that only with THOSE settings will it be faster.
AsRockWho takes notice of PR anyways ?, OMG you learned any thing over the years ?, it's not like they can say our card is meh and expect people to look still
Yea, seems like certain groups of people take PR slides as complete truth, even when over the last few years they have been proven to be BS once the product is out.
RejZoRWhat I've heard somewhere about R9 Nano is that it won't be even as powerful as R9-290X. Probably something along R9-280X-ish performance in that tiny package. I hope though that those were just empty rumors and that it'll be a lot more powerful...
AMD said it would be significantly faster than the 290X, but I don't see how that will be the case with that tiny cooler on it without throttling.
BiggieShadyIf anyone wants to recreate conditions in which Fury X beats 980 Ti in every benchmark, do what AMD did in their benches before release: force 0x anisotropic filtering in drivers (basically they used trilinear or bilinear filtering)
Pretty typical of AMD to nitpick conditions or software in their PR benchmark slides which they get a massive advantage in. Kinda like using GPU-accelerated benchmarks to compare their APU to an Intel CPU and say their APU is faster.
Posted on Reply
#116
moproblems99
the54thvoidHell, Nvidia get dogs abuse (rightly so as market leader) for miscommunication and they got off relatively scott free with 970 memory issue (God knows how).
I don't want to harp on the 970 memory issue, because the card performs really well for its price. BUT, do you really believe it was a miscommunication? If it was a miscommunication then they wouldn't have had to invent this magical system that allowed them to have the extra half gigabyte like the talking head was whining about. People are screaming that 4gb is not enough for the Fury so I am sure that people would have been screaming about 3.5 and they knew it. So they hid it. Does the result matter? Not really because the card is faster than a 290X and draws significantly less power. Just please don't say it was a miscommunication.
Posted on Reply
#117
newtekie1
Semi-Retired Folder
moproblems99I don't want to harp on the 970 memory issue, because the card performs really well for its price. BUT, do you really believe it was a miscommunication? If it was a miscommunication then they wouldn't have had to invent this magical system that allowed them to have the extra half gigabyte like the talking head was whining about. People are screaming that 4gb is not enough for the Fury so I am sure that people would have been screaming about 3.5 and they knew it. So they hid it. Does the result matter? Not really because the card is faster than a 290X and draws significantly less power. Just please don't say it was a miscommunication.
I believe it is definitely possible it was a miscommunication. I do believe it is plausible the engineers failed to inform the PR team of exactly how the GPU works and is set up, I also believe that the engineers could have informed the PR Team how the GPU worked and the PR Team didn't understand it. I think both are plausible. The PR team saw 4GB on the card and used 4GB in the advertising.
Posted on Reply
#118
hertz9753
This reminds me of the review of the GTX 480 several years ago. People didn't get what they wanted.
Posted on Reply
#119
64K
newtekie1I believe it is definitely possible it was a miscommunication. I do believe it is plausible the engineers failed to inform the PR team of exactly how the GPU works and is set up, I also believe that the engineers could have informed the PR Team how the GPU worked and the PR Team didn't understand it. I think both are plausible. The PR team saw 4GB on the card and used 4GB in the advertising.
This is just so tiresome. Going on the premise that every engineer at Nvidia are too mentally incompetent to communicate tech to the PR department if that helps you but there were two Nvidia employees that coincidentally appeared here on the forums just after the shit hit the fan with the 970 fiasco. It's a financial mindgame for the most part. Play along with them if you want.
Posted on Reply
#120
newtekie1
Semi-Retired Folder
64KThis is just so tiresome. Going on the premise that every engineer at Nvidia are too mentally incompetent to communicate tech to the PR department if that helps you but there were two Nvidia employees that coincidentally appeared here on the forums just after the shit hit the fan with the 970 fiasco. It's a financial mindgame for the most part. Play along with them if you want.
Actually, my money is on the PR department being the incompetent ones. I mean, AMD's PR department seems to think Fury X is the most efficient GPU in the world... PR departments aren't exactly the brightest bunch.
Posted on Reply
#121
64K
newtekie1Actually, my money is on the PR department being the incompetent ones. I mean, AMD's PR department seems to think Fury X is the most efficient GPU in the world... PR departments aren't exactly the brightest bunch.
Indeed.
Posted on Reply
#122
Bad Bad Bear
Uuuuuuhhhhmmmm Nvidia won. I'm on the fence with either brand; whatever performs the best in its relevant price range. A step in the right direction for AMD. Looking forward to Pascal.
Posted on Reply
#123
the54thvoid
Super Intoxicated Moderator
MrGenius
  1. You weren't the only one I quoted, don't single yourself out on my account.
  2. If asking you questions about your reply in a straight-forward and civil manner is illogical, and/or what you call being attacked...see a psychologist/psychiatrist. You've got some issues to deal with.
  3. Poor overclocking results is a conclusion that one comes to who has no real idea how this works.
  4. Yes really! Really!
  5. ALL the reviews support my statements. It's you who's misinterpreting them.
  6. 7970 liquid cooled @ 1300 core...meh...not bad. Sounds pretty typical.
  7. 7970 BIOS locked @ 1225 Max? Trying to prove you really don't know what the hell you're talking about...or what? Hint: there's no such thing as a BIOS lock on overclocking. It's a hardware thing.
  8. 780 Ti @ ~ 1400. Don't doubt it. Doesn't prove anything.
  9. 980 Ti @ 1500+? Possibly. I'd like to see it. Go for it! Then ask yourself this: How much performance have I gained relative to a similarly overclocked Fury X. And no, not a Fury X @ the same core speed. Since given equal core speeds = YOU LOSE BADLY!!!
Homework people, homework. I'm not really as big a dufus as you think I am. Or that I lead on to be. It's not wise to display one's full power to just anyone and everyone. Then you could prepare yourself for what I actually have in store. And possibly reserve your best tricks for last too. My A-game is saved for ONLY when absolutely required. Shoot...I've said too much. That is all.

@HumanSmoke I'll concede that. But we're still in the realm of suppositions and assumptions here, as opposed to hypotheses. Or better yet, theories.

Theory>Hypothesis>supposition/assumption>conjecture
Yeah. Bring your weapons back to the table when you've actually checked your safety and loaded your gun.
Also, I'm not a hardware or software engineer. Hell, I can't even do basic coding, shit ANY coding. I'm a hobbyist gamer that tinkers with water cooling and like to have a wee push on over clocking now and again.
I bow to your awesome knowledge base, I prostrate myself at your epic magnitude. And we can both watch the over clocking Hall of Fames for which turns out to be the better card.
Posted on Reply
#124
RejZoR
newtekie1Actually, my money is on the PR department being the incompetent ones. I mean, AMD's PR department seems to think Fury X is the most efficient GPU in the world... PR departments aren't exactly the brightest bunch.
Then again, how hard is it to call 2 engineers for a cup of tea and a chat regarding the product you're about to market to the masses? All you have to do is to ask them for an explanation like you're a 12 years old kid.

It's the same crap I (we) are facing in company where I work. The workforce and the leadership seems to be entirely disconnected from one another. When you give them feedback based on experience you have with customers, they don't do shit and 10 minutes later, they again demand you improve things because results aren't good enough. I don't understand this world...

I bet it was the same at AMD...
Posted on Reply
#125
the54thvoid
Super Intoxicated Moderator
RejZoRThen again, how hard is it to call 2 engineers for a cup of tea and a chat regarding the product you're about to market to the masses? All you have to do is to ask them for an explanation like you're a 12 years old kid.

It's the same crap I (we) are facing in company where I work. The workforce and the leadership seems to be entirely disconnected from one another. When you give them feedback based on experience you have with customers, they don't do shit and 10 minutes later, they again demand you improve things because results aren't good enough. I don't understand this world...

I bet it was the same at AMD...
To be fair, my brother designs PCBs and works on board designs for scientific implementations (not for gaming cards; a far smaller tech firm), and he was asking me a while back about cooling solutions for a specific TDP. It was quite funny speaking to him, because the board engineers didn't really have a great idea about cooling the card (being single-slot was a general requirement of the design brief). It was a dual-chip design, but I was showing him pics and sending reviews of gfx cards to give him an idea of what is used.
If the board designers aren't fully aware of certain parameters, it's quite conceivable PR haven't got a clue. PR isn't technical - it's pure smoke and mirrors.
Posted on Reply