Friday, March 11th 2016

NVIDIA "GP104" Silicon to Feature GDDR5X Memory Interface

It looks like NVIDIA's next GPU architecture launch will play out much like its previous two generations: launching the second-biggest chip first, as a well-priced "enthusiast" SKU that outperforms the previous-generation enthusiast product, and launching the biggest chip later, as the high-end enthusiast product. The second-biggest chip based on NVIDIA's upcoming "Pascal" architecture, the "GP104," will be a lean machine that could let NVIDIA win the crucial $550 and $350 price points. NVIDIA will design the chip to keep manufacturing costs low enough to score big on price-performance, and in a potential price war with AMD.

As part of its efforts to keep the GP104 as cost-effective as possible, NVIDIA could give exotic new tech such as HBM2 memory a skip, and go with GDDR5X. Implementing GDDR5X should be straightforward and cost-effective for NVIDIA, given that it has implemented the nearly-identical GDDR5 standard across three previous generations. The new standard doubles densities, so one could expect NVIDIA to build its GP104-based products with 8 GB of memory as standard. GDDR5X gave a new lease of life to GDDR5, which had seen its clock speeds plateau around 7 Gbps/pin. The new standard could come in speeds of up to 10 Gbps at first, and eventually 12 Gbps and 14 Gbps. NVIDIA could reserve HBM2 for its biggest "Pascal" chip, on which it could launch its next TITAN product.
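To put those per-pin speeds in perspective, peak bandwidth scales linearly with the data rate for a given bus width. Here is a minimal sketch of the arithmetic, assuming a hypothetical 256-bit bus purely for illustration (GP104's actual memory interface width is unconfirmed):

```python
# Peak memory bandwidth (GB/s) = bus width in bits / 8 * per-pin rate in Gbps.
def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Return peak memory bandwidth in GB/s for a given bus and data rate."""
    return bus_width_bits / 8 * gbps_per_pin

# Assumed 256-bit bus, for illustration only.
for label, rate in [("GDDR5  @  7 Gbps", 7.0),
                    ("GDDR5X @ 10 Gbps", 10.0),
                    ("GDDR5X @ 14 Gbps", 14.0)]:
    print(f"{label}: {peak_bandwidth_gbs(256, rate):.0f} GB/s")
```

On that assumption, 10 Gbps GDDR5X would deliver 320 GB/s against 224 GB/s for 7 Gbps GDDR5, without widening the bus.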

The GP104 will be built on TSMC's 16 nm FinFET process. NVIDIA is hoping to unveil the first GP104-based products in April, at its annual GPU Technology Conference (GTC), with possible market availability by late May or early June 2016.
Source: Benchlife.info

135 Comments on NVIDIA "GP104" Silicon to Feature GDDR5X Memory Interface

#101
PP Mguire
xenocide: I'm sure Nvidia would love to sell you parts like that, but it's just not possible given manufacturing costs. You can't expect them to shove every new bit of tech into a new manufacturing process and keep it cost-effective. Even GDDR5X is a huge improvement over regular GDDR5, and in most instances memory isn't the bottleneck outside of maybe 4K--which most people don't even play at. The other issue is Nvidia has to go back to competing with AMD in terms of Compute capability, which is a lot of added cost. I think they planned on using HBM2 for GP104 and up but realized when DX12 was in the works they would need to account for Async. Compute and had to drop HBM2 to keep costs in control.
Not at all; HBM is meant for top-tier chips, and the two recent news releases prove this. They won't reduce their pricing and release structure until AMD can hammer them back down with solid competition. So until then we will see midrange chips asking a $500 price until the big chips come out half a year later. Rinse and repeat. Pascal was also roadmapped to be a number cruncher, so you'll see increased FP16 support in all their chips, and HBM2 will be on their big chip, as well as more queues for Async. Nothing really to do with costs or competition; it's how they had it mapped out.
#102
Stefan Payne
I wouldn't expect to see any nVidia HBM chip anytime soon.

They don't know shit about that; they have absolutely NO know-how about HBM at all! They have done shit in this regard. Apart from Jensen's big mouth, they have done nothing at all for HBM! NOTHING!

And to expect that nVidia is able to deliver an HBM chip right off the bat? No way in hell!

The HBM experiment can still be a disaster for nVidia: they have to do a whole chip on 16 nm, and they also have to do HBM. Good chances that something will go horribly wrong...

And then we will see just another GDDR5(X) chip from them, whilst AMD rocks the boat with HBM...
#103
rtwjunkie
PC Gaming Enthusiast
Stefan Payne: I wouldn't expect to see any nVidia HBM chip anytime soon.

They don't know shit about that; they have absolutely NO know-how about HBM at all! They have done shit in this regard. Apart from Jensen's big mouth, they have done nothing at all for HBM! NOTHING!

And to expect that nVidia is able to deliver an HBM chip right off the bat? No way in hell!

The HBM experiment can still be a disaster for nVidia: they have to do a whole chip on 16 nm, and they also have to do HBM. Good chances that something will go horribly wrong...

And then we will see just another GDDR5(X) chip from them, whilst AMD rocks the boat with HBM...
They aren't making the HBM modules, and neither is AMD. All they have to do is implement it once production is underway.

If you've been paying attention, instead of blindly flying the red flag, AMD is also going to do the same thing, only bringing in HBM2 in the NEXT generation after Polaris. This means, given the size of VRAM modules needed for Polaris models, we will likely see GDDR5X from them too.
#104
anubis44
PP Mguire: No, Maxwell has 32 queues compared to the 128 from AMD. Maxwell supports Async Compute but only chokes when too many commands are sent to it as has been explained in detail many times by renowned sources on the net.

Funny you link a site known for FUD and rumors. Even further, the benchmarks they show don't even match. How is it the first set they showed has a 980 Ti (which beats the Fury X) running 76 fps in DX12 @ 1080p (it beats it in all 3 resolutions too) but the other site shows 68? Furthermore, the gap at 1440p for Fury X is higher between the two sites. Who to trust? I'll say nobody. Also, it's been mentioned extensively that DX12 in the game is buggy anyway, as the Computer Base site mentions. Finally, the game is an AMD title and I'd expect it to run better on AMD cards anyway. Of course, one set of benchmarks seems to show otherwise. As for the 390X beating the crap out of the Titan X: yea, and I'd be willing to bet that, like every other site, they aren't letting the card boost. An eVGA Titan X SC will easily boost to 1350 by itself, which is close to the 980 Ti clock speed they have, which means the Titan X and 980 Ti would be on top of the charts for the first set of benches shown. So no, anybody worth their salt that owns a Titan X will have performance actually higher than a 980 Ti, which would handily beat the 390X if the top set of charts is to be believed.

So you had an issue with 2 Nvidia setups and you want to spread BS? What happens when your precious AMD card dies? Are you going to make excuses to continue hating on Nvidia? :rolleyes: Get a grip dude. Your Toshiba laptop came out 11 years ago and you're still whining about it? And who the hell runs Surround on a 670 of all cards? Oh man, my 5770 Crossfire setup was screwed up even after 3 replacement cards because XFX kept giving me Rev B boards.
AMD drivers keep crashing when I test my 295x2 for performance figures for friends. Snap, I might as well hate on AMD and not consider their lineup this year against Pascal and cry about it a decade later. :shadedshu:
The 970 has 4GB of usable VRAM. If you actually OWNED one of these cards you would know that all 4GB is addressable and actually doesn't hamper performance like all the BS misinformation on the net suggests. Want to know how I know? I have one sitting right behind me and have actually tested it for this very reason. Maybe if you took off your green glasses and stopped the fanboyism you'd see that your post here just sounds ridiculous.

Nvidia didn't bother putting a lot of effort into Maxwell regarding DX12 and DX12 features because they are meant to be DX11 1080p/1440p beasts, and that they are. Wait until Pascal is here, you'll see that very statement is beyond true.

The only thing that's a joke here is you and your post bud.
OK, you clearly want a recap, so I'll give it to you.

So I had more than an 'issue' with the dead GPU in my Tecra. It was a known defect in the cooling that nVidia washed their hands of. I don't give a crap if it was 11 years ago or yesterday. That's ALL I NEED TO KNOW about a company to not want to give them any more money in the future. Kinda like Ford calculating that it would cost them more money to fix the Pinto than to pay for the lawsuits from exploding gas tanks. Doesn't exactly make me want to rush out and buy a Ford. When I catch wind of that kind of garbage, yeah, it's an 'issue' for me to give that company money. But you know what? I still gave nVidia another chance. Then I had essentially unworkably buggy drivers screwing around with my three monitor setup, and I gave up and sold the card. And don't give me any crap about using the 670 with 3 monitors. The specs said it supported nVidia Surround, and that's what I expected it to do. And it didn't, at least not without a 19 step procedure with every single driver update.

But what's funny is that even after I'd already decided not to give any more money to nVidia, the worst was yet to come from them. The $200 G-Sync tax, the GTX970 fiasco, the deliberate sabotaging of games with GimpWorks, hell, they're now integrating PhysX into the game engine, so you can't turn it off. Just how much more screwing over do you need to have some sense of self-respect? You yourself were screwed over by the Green Goblin, and you're defending them. The crap about how 'all 4GB is addressable' is the joke. Of course it's addressable, it just runs at 1/7th normal speed. Games actually scale back on textures when they see you're using a 970 so they won't blow over the 3.5GB. Hey, if you're happy with having been lied to, that's your problem, but don't go saying what I've said is a joke. As for 'green glasses', I think you meant to imply I'm wearing 'red glasses'. You're the one wearing green ones. The joke is you arguing to defend them when YOU YOURSELF as a GTX970 owner were sold a bill of goods. You're the one who's deluded for down-playing that so intensely. If AMD had done it, you'd be all over it, and you know it.
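For what it's worth, the "1/7th" figure can be sanity-checked against the GTX 970 partition bandwidths widely reported in reviews; the 196 GB/s and 28 GB/s values below are assumptions taken from those reports, not official specs:

```python
# GTX 970 memory partitions, per widely reported review measurements (assumed).
fast_partition_gbs = 196.0  # 3.5 GB segment: 7 of 8 memory channels
slow_partition_gbs = 28.0   # 0.5 GB segment: the remaining single channel

ratio = fast_partition_gbs / slow_partition_gbs
print(f"slow segment runs at 1/{ratio:.0f} the speed of the fast segment")
# → slow segment runs at 1/7 the speed of the fast segment
```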
#105
anubis44
rtwjunkie: No, you did not read. Reading comprehension is fundamental. What I said was: once you start working for retirement, have people you know and love near death because of heart valve problems, deal with raising and providing for your children, work to pay your bills, and enjoy as well as deal with real life, then getting as worked up and angry as anubis44 is about a GPU company becomes laughable.

What a GPU company does or sells is nothing in the grand scheme of things that matter, and won't actually affect your life.

Nowhere did I go on about starving kids in Africa and whatnot. That was your stretch, not mine.
I am working for retirement. No kids because I went through 2 marriages with women who turned out to have mental health issues they refused to deal with. I pay my bills and enjoy my 'real' life. I'm not getting the least bit worked up, I just call a spade a spade. Screwed over by nVidia several times. Check. OK, don't buy their stuff. End of discussion. That's the calculation. No getting worked up going on here. Enjoy your nVidia card.
#106
rtwjunkie
PC Gaming Enthusiast
anubis44: I am working for retirement. No kids because I went through 2 marriages with women who turned out to have mental health issues they refused to deal with. I pay my bills and enjoy my 'real' life. I'm not getting the least bit worked up, I just call a spade a spade. Screwed over by nVidia several times. Check. OK, don't buy their stuff. End of discussion. That's the calculation. No getting worked up going on here. Enjoy your nVidia card.
Hey, we came to a mutual semi-truce. You are quoting my response to the other guy, who felt he needed to insert himself into the discussion.

For the record, I've owned both camps, and am now considering an AMD card. I don't have a preferred company to buy from. Not that what I purchase should matter.

Do look up sales numbers for the 970, though, since January last year. Hint: phenomenal. Read user reviews. Nearly all buyers were aware of the memory configuration. It hasn't bothered them.
#107
Parn
PP Mguire: Not at all; HBM is meant for top-tier chips, and the two recent news releases prove this. They won't reduce their pricing and release structure until AMD can hammer them back down with solid competition. So until then we will see midrange chips asking a $500 price until the big chips come out half a year later. Rinse and repeat. Pascal was also roadmapped to be a number cruncher, so you'll see increased FP16 support in all their chips, and HBM2 will be on their big chip, as well as more queues for Async. Nothing really to do with costs or competition; it's how they had it mapped out.
Looks like Fury X wasn't enough competition for NV, then, if the new GTX 1080 would start off around $500. I guess I will just have to wait until the 1080 Ti is released before getting my hands on a 1080.
#108
rtwjunkie
PC Gaming Enthusiast
Parn: Looks like Fury X wasn't enough competition for NV, then, if the new GTX 1080 would start off around $500. I guess I will just have to wait until the 1080 Ti is released before getting my hands on a 1080.
Actually, that's the assumed price of the as yet unnamed card using the GP104 chip, which is not the "full" Pascal. It's the upper mid-tier, just like where the 980 sits today in the lineup.
#109
Stefan Payne
Parn: Looks like Fury X wasn't enough competition for NV, then, if the new GTX 1080 would start off around $500. I guess I will just have to wait until the 1080 Ti is released before getting my hands on a 1080.
nVidia can do what they want.

The nVidia fans will buy the shit and say it's good, even if it is utter garbage like the memory of the GTX 970. Why would you do that? There is just one reason for that...

And that's to make it unusable when Pascal's successor is around. You tweak the memory a little, the newly bought game runs like shit, and the nVidia consumer zombie runs to the next store to buy the next nVidia graphics card...

That 'the other one' could have saved money, because it could have lasted longer, does not matter; it has to be an nVidia graphics card to be hip on the schoolyard - and because those nVidia fans say it has to be this way, because they believe strongly in this company...

Well, a couple of years ago, you had your gods and fought over them.

Today you believe in companies that like nothing more than to rip you off, and YOU like being ripped off by them... There is some kind of perversion in that, don't you think?
#110
PP Mguire
anubis44: OK, you clearly want a recap, so I'll give it to you.

So I had more than an 'issue' with the dead GPU in my Tecra. It was a known defect in the cooling that nVidia washed their hands of. I don't give a crap if it was 11 years ago or yesterday. That's ALL I NEED TO KNOW about a company to not want to give them any more money in the future. Kinda like Ford calculating that it would cost them more money to fix the Pinto than to pay for the lawsuits from exploding gas tanks. Doesn't exactly make me want to rush out and buy a Ford. When I catch wind of that kind of garbage, yeah, it's an 'issue' for me to give that company money. But you know what? I still gave nVidia another chance. Then I had essentially unworkably buggy drivers screwing around with my three monitor setup, and I gave up and sold the card. And don't give me any crap about using the 670 with 3 monitors. The specs said it supported nVidia Surround, and that's what I expected it to do. And it didn't, at least not without a 19 step procedure with every single driver update.

But what's funny is that even after I'd already decided not to give any more money to nVidia, the worst was yet to come from them. The $200 G-Sync tax, the GTX970 fiasco, the deliberate sabotaging of games with GimpWorks, hell, they're now integrating PhysX into the game engine, so you can't turn it off. Just how much more screwing over do you need to have some sense of self-respect? You yourself were screwed over by the Green Goblin, and you're defending them. The crap about how 'all 4GB is addressable' is the joke. Of course it's addressable, it just runs at 1/7th normal speed. Games actually scale back on textures when they see you're using a 970 so they won't blow over the 3.5GB. Hey, if you're happy with having been lied to, that's your problem, but don't go saying what I've said is a joke. As for 'green glasses', I think you meant to imply I'm wearing 'red glasses'. You're the one wearing green ones. The joke is you arguing to defend them when YOU YOURSELF as a GTX970 owner were sold a bill of goods. You're the one who's deluded for down-playing that so intensely. If AMD had done it, you'd be all over it, and you know it.
All I see is bitch bitch bitch, tbh. Last time I checked, GPU makers don't create the cooling solution for laptops; the laptop makers do. That would be a Toshiba thing.

Nobody buys a 670 for Surround, dude. It's literally not powerful enough to push it. My best friend has used Surround since he upped to a 780 Ti and now has 3 ROG Swifts (the 9q), and although he has minor issues with games, and the update to Windows 10 really screwed with things, his overall experience is that he likes it enough to keep it. I can't say the same for G-Sync, because I sold my Swift within 6 months. But sure, I'm "wearing green glasses".

No buddy, I just call it like it is, and all I see is a guy who's had some bad experiences and wants to whine to the world about it. I simply use whatever is the best at the time. If I was truly wearing "green glasses" I wouldn't own many products from both sides. Where fanboyism is concerned I'm pretty unbiased, which comes with the territory of being a previous hardware reviewer. I bought a 970 for testing to debunk exactly the kind of misinformation that guys like you spew constantly. 1/7th the normal speed? Even if it did, it sure doesn't hamper games at all, and I'd love for you to provide a legit source (not bullshit like WCCF) saying they "scale back textures". Now you're just pulling shit out of your ass, because if they did that then they'd do it with any other 4GB card (and they certainly don't). :rolleyes: I also like how you went from saying it's not a 4GB card to saying it's addressable. You guys with your one-sided, double-standard, contradicting arguments make me laugh. What you're saying was, and still is, a joke.

Actually no, I don't go posting on AMD threads like guys like you, spewing blatant fanboyism. Go ahead, search. I could easily bring up how they're using yet another Cooler Master AIO system despite the allegations from Asetek. I could go on about how they are or were being sued for their FX series false advertising. I could go on about how they haven't had any competition in the CPU market since 2012. I could go on about how their PR stunts are outright laughable (anybody remember those terrible Youtube videos?). I could go on about how their only form of marketing is hype hype hype while delivering an underwhelming product. Do I? No. I've also had many, many, many bad experiences from the AMD side of things, but lo and behold, it doesn't keep me from buying AMD products for testing. It's called having a sense of understanding, and not getting butthurt about little things. Anyways, this shit's way off topic. :shadedshu:
#111
rtwjunkie
PC Gaming Enthusiast
Stefan Payne: Today you believe in companies that like nothing more than to rip you off, and YOU like being ripped off by them... There is some kind of perversion in that, don't you think?
Man, I'm not sure what world you live in, but being extreme fans of and "believing" in a company are fanboy acts, and most of us on here are way beyond that age in life where anything like that matters. We buy by card for the most part, not card family.
#112
PP Mguire
rtwjunkie: Man, I'm not sure what world you live in, but being extreme fans of and "believing" in a company are fanboy acts, and most of us on here are way beyond that age in life where anything like that matters. We buy by card for the most part, not card family.
This. It's freakin' computer stuff. So-and-so has the best-performing product, I buy that. Idc about name, I just want what performs.
#113
Parn
rtwjunkie: Man, I'm not sure what world you live in, but being extreme fans of and "believing" in a company are fanboy acts, and most of us on here are way beyond that age in life where anything like that matters. We buy by card for the most part, not card family.
Well said.

The only reason I've been buying NV since Kepler was released is their superior energy efficiency compared to AMD cards. If Polaris manages to turn the tables on Pascal, I won't hesitate to join the red camp.
#114
rruff
rtwjunkie: most of us on here are way beyond that age in life where anything like that matters. We buy by card for the most part, not card family.
I'd buy AMD if it was close, up to a coin toss. But at the moment the only AMD GPU or CPU I have is an old Athlon 64 X2.
#115
Slizzo
Stefan Payne: nVidia can do what they want.

The nVidia fans will buy the shit and say it's good, even if it is utter garbage like the memory of the GTX 970. Why would you do that? There is just one reason for that...
Jesus Christ, how many times are people going to harp on that? Here's the thing:

Benchmark and actual game performance DID NOT CHANGE between before we knew how the memory was carved up and afterwards. And the card performs in line with the bracket it's priced in.

Basically, this is a non-issue that continues to have the dead horse beaten.
#116
PP Mguire
Slizzo: Jesus Christ, how many times are people going to harp on that? Here's the thing:

Benchmark and actual game performance DID NOT CHANGE between before we knew how the memory was carved up and afterwards. And the card performs in line with the bracket it's priced in.

Basically, this is a non-issue that continues to have the dead horse beaten.
Gee, how many times have I said this? All Nvidia haters have are the 970 and GameWorks. :roll:
#117
anubis44
PP Mguire: All I see is bitch bitch bitch, tbh. Last time I checked, GPU makers don't create the cooling solution for laptops; the laptop makers do. That would be a Toshiba thing.

Nobody buys a 670 for Surround, dude. It's literally not powerful enough to push it. My best friend has used Surround since he upped to a 780 Ti and now has 3 ROG Swifts (the 9q), and although he has minor issues with games, and the update to Windows 10 really screwed with things, his overall experience is that he likes it enough to keep it. I can't say the same for G-Sync, because I sold my Swift within 6 months. But sure, I'm "wearing green glasses".

No buddy, I just call it like it is, and all I see is a guy who's had some bad experiences and wants to whine to the world about it. I simply use whatever is the best at the time. If I was truly wearing "green glasses" I wouldn't own many products from both sides. Where fanboyism is concerned I'm pretty unbiased, which comes with the territory of being a previous hardware reviewer. I bought a 970 for testing to debunk exactly the kind of misinformation that guys like you spew constantly. 1/7th the normal speed? Even if it did, it sure doesn't hamper games at all, and I'd love for you to provide a legit source (not bullshit like WCCF) saying they "scale back textures". Now you're just pulling shit out of your ass, because if they did that then they'd do it with any other 4GB card (and they certainly don't). :rolleyes: I also like how you went from saying it's not a 4GB card to saying it's addressable. You guys with your one-sided, double-standard, contradicting arguments make me laugh. What you're saying was, and still is, a joke.

Actually no, I don't go posting on AMD threads like guys like you, spewing blatant fanboyism. Go ahead, search. I could easily bring up how they're using yet another Cooler Master AIO system despite the allegations from Asetek. I could go on about how they are or were being sued for their FX series false advertising. I could go on about how they haven't had any competition in the CPU market since 2012. I could go on about how their PR stunts are outright laughable (anybody remember those terrible Youtube videos?). I could go on about how their only form of marketing is hype hype hype while delivering an underwhelming product. Do I? No. I've also had many, many, many bad experiences from the AMD side of things, but lo and behold, it doesn't keep me from buying AMD products for testing. It's called having a sense of understanding, and not getting butthurt about little things. Anyways, this shit's way off topic. :shadedshu:
I bought a 670 to play Diablo 3 using three monitors. If you don't like that, screw yourself. That's why I did it. I used to play the hell out of Diablo 2, and I decided I'd play the hell out of Diablo 3 on three monitors. Contrary to your empty-headed rant, the 670 put out plenty of performance for that game in surround (120FPS). Unfortunately, the crappy GreedForce drivers didn't play well with 3 monitors, so I sold the card in frustration after a month of having to fiddle with it.

All I see coming from you is complete ignorance. If you really think nVidia is innocent and never tried to screw over anybody; that they honestly just 'forgot' to mention for 3 months that the 4GB GTX970 didn't really have 4GB of full-speed memory; that charging a $200 premium for FreeSync with a Green Goblin logo that won't work with anybody else's graphics cards isn't ripping anybody off; that a pathetic attempt at confetti effects deserves to be patented as PhysX; that GameWorks is really good for the PC game industry, and not a transparent rear-guard action to gimp games on all but the latest nVidia cards to force an upgrade; and that you don't need to hold them to account for anything they've done, the lies, the deception, then go right ahead and enjoy taking it from behind. Keep supporting the company that's proven itself happy to screw you over. That's your prerogative. But don't tell me I'm just 'whining'. I've given you true anecdotes from my experience. There's no 'shit' being pulled from my ass, just true accounts.

I'm the one telling it like it is, and you're the one justifying why all that crap is OK, so who's really the one whining, eh?
#118
PP Mguire
anubis44: I bought a 670 to play Diablo 3 using three monitors. If you don't like that, screw yourself. That's why I did it. I used to play the hell out of Diablo 2, and I decided I'd play the hell out of Diablo 3 on three monitors. Contrary to your empty-headed rant, the 670 put out plenty of performance for that game in surround (120FPS). Unfortunately, the crappy GreedForce drivers didn't play well with 3 monitors, so I sold the card in frustration after a month of having to fiddle with it.

All I see coming from you is complete ignorance. If you really think nVidia is innocent and never tried to screw over anybody; that they honestly just 'forgot' to mention for 3 months that the 4GB GTX970 didn't really have 4GB of full-speed memory; that charging a $200 premium for FreeSync with a Green Goblin logo that won't work with anybody else's graphics cards isn't ripping anybody off; that a pathetic attempt at confetti effects deserves to be patented as PhysX; that GameWorks is really good for the PC game industry, and not a transparent rear-guard action to gimp games on all but the latest nVidia cards to force an upgrade; and that you don't need to hold them to account for anything they've done, the lies, the deception, then go right ahead and enjoy taking it from behind. Keep supporting the company that's proven itself happy to screw you over. That's your prerogative. But don't tell me I'm just 'whining'. I've given you true anecdotes from my experience. There's no 'shit' being pulled from my ass, just true accounts.

I'm the one telling it like it is, and you're the one justifying why all that crap is OK, so who's really the one whining, eh?
The only one spewing ignorance here is you, because you're butthurt as hell. Get over yourself, buddy. I don't have to justify anything; I buy what's the best, period. I like how earlier you completely ignored the "and how many devices do you own that have Samsung in them" question, because you're just a hypocrite who wants to take your decade-old frustration out in forum posts. Samsung is by far one of the worst companies to pull shit, yet I bet you give 0 fucks, because all you want to do is bash on Nvidia like all the Nvidia fanboys bash on AMD.

Yea, you're pulling shit out of your ass; that is your opinion, and nobody is winning because it's a forum on the internet. If anybody is winning it's me, because I'm not a biased shit stain whining on the internet because boohoo, Nvidia is soooo bad. I've given you factual arguments and all you're doing is whining about it, saying the same crap over and over again. Nvidia is a company out to make money, same as AMD, same as Intel, same as Alienware/Dell, same as NCIX, same as Samsung, same as Apple, and many others. If you don't like their business practices, take your wallet somewhere else. Don't go onto a board talking shit in an Nvidia thread because you had bad experiences that give you a warped opinion of a company and anything they do. That's like having an ex-gf where anything she does pisses you off, even if it's something as simple as putting a harmless selfie on Facebook. You're being that guy, that ex. It's exactly why I made the sarcastic remarks about how I should hate AMD because I've had so many bad products from them, but nope. Shit happens; you get over it and realize it's a freakin' computer part, dude. That doesn't make the company Hitler, and it definitely doesn't make your opinion fact.
#119
rtwjunkie
PC Gaming Enthusiast
Come on, guys... I repeat my earlier stance. It's not worth getting angry over. It isn't going to impact anyone enough to ruin their life, nor is it going to cure world hunger.

Just agree to disagree with different opinions. :-)
#120
PP Mguire
rtwjunkie: Come on, guys... I repeat my earlier stance. It's not worth getting angry over. It isn't going to impact anyone enough to ruin their life, nor is it going to cure world hunger.

Just agree to disagree with different opinions. :)
Who's angry? I guess maybe him, but this is a usual day in the office for me, and I find it hilarious.
#121
EarthDog
I wish members could vote to send some of these troublemakers to purgatory... like OCN. You get punished and have to go there for a month. :laugh: :nutkick:
#122
PP Mguire
EarthDog: I wish members could vote to send some of these troublemakers to purgatory... like OCN. You get punished and have to go there for a month. :laugh: :nutkick:
Please no massa, I'z be goodz I swearz it!
#123
Prima.Vera
Reading the comments is so much better than the "article" itself... :)))
#124
medi01
PP Mguire: All I see is bitch bitch bitch tbh.
Some people think that nVidia's business practices set a record low mark, AND they openly state that.
You consider them to be "fanbois" and "irrational". Well. So what?

Maybe that's where the problem is? Why should you care about "bitching"? Why not just get over it?
Slizzo: Benchmark and actual game performance DID NOT CHANGE...
Bingo. Most people judge GPU performance based on reviews, not real life experience.

Yep. It's damn subjective. Most things expressed in perf bars are hardly noticeable in real life; that's why we have things like "halo product", more 3xx's being sold after Fiji was released, etc.

And your point was?
#125
Slizzo
medi01: Some people think that nVidia's business practices set a record low mark, AND they openly state that.
You consider them to be "fanbois" and "irrational". Well. So what?

Maybe that's where the problem is? Why should you care about "bitching"? Why not just get over it?


Bingo. Most people judge GPU performance based on reviews, not real-life experience.

Yep. It's damn subjective. Most things expressed in perf bars are hardly noticeable in real life; that's why we have things like "halo product", more 3xx's being sold after Fiji was released, etc.

And your point was?
Pretty sure my point was pretty clear. He was calling the GTX970 garbage based on information learned after release, after the card's performance had been advertised. And the card's performance was the same before we knew this information and afterwards, basically nullifying his point about the card being garbage because of the way the memory subsystem is set up.