
AMD's Vega-based Cards to Reportedly Launch in May 2017 - Leak

Joined
Jun 10, 2014
Messages
2,995 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Still waiting for the 80% difference ...
[chart: polaris_vs_pascal.png]

[chart: polaris_vs_pascal_1.png]

These charts are not that hard to read!
 
Joined
Jun 10, 2014
Messages
2,995 (0.78/day)
No, they are not. But people with common sense don't cut them out and compare different tiers.
Now who's the one cherry-picking? We are comparing efficiency, not price, and you won't find two chips that are exactly the same size. If you don't like a comparison with GP104 and GP106, then you have a problem.

I think you have forgotten what we were discussing here. Some claim Vega will be more efficient than Pascal, and since Vega 10 will compete with GP104, it's a fair comparison. GP106 is 55% more efficient, GP104 ~80% more efficient and GP102 ~85% more efficient, so AMD have their work cut out for them.
 
Joined
Dec 29, 2014
Messages
861 (0.24/day)
But with the 1060 3GB, which is the closest in performance, it's a sub-20% difference. What's your point?

You are comparing the least efficient Pascal card to the most efficient Polaris. Average them all and you'll get Pascal being ~50% more efficient.

EDIT: Actually did the calculations, and Pascal is 59% more efficient on average: 460, 470, and 480 vs. 1050 Ti, 1060 3GB, 1060 6GB, 1070, and 1080.
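For what it's worth, the averaging method being described can be sketched in a few lines of Python. The performance-per-watt figures below are purely illustrative placeholders, not TPU's measured data; only the method (average each family, then take the ratio) is what the post describes.

```python
# Hypothetical relative performance-per-watt figures, chosen only to
# illustrate the averaging method -- these are NOT TPU's measured numbers.
polaris = {"RX 460": 1.00, "RX 470": 1.05, "RX 480": 1.10}
pascal = {
    "GTX 1050 Ti": 1.70,
    "GTX 1060 3GB": 1.75,
    "GTX 1060 6GB": 1.80,
    "GTX 1070": 1.72,
    "GTX 1080": 1.68,
}

# Average each family's perf/W, then express Pascal's lead as a percentage.
avg_polaris = sum(polaris.values()) / len(polaris)
avg_pascal = sum(pascal.values()) / len(pascal)
lead_pct = (avg_pascal / avg_polaris - 1) * 100
print(f"Pascal leads by {lead_pct:.0f}% on average")
```

Note that the result is sensitive to which cards you include, which is exactly the cherry-picking dispute in this thread.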
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,120 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
Stop bitching over the usual crap; all the conjecture is pointless, and there's no point getting worked up and feisty over it. There is no Vega for months.

Edited because there's no point talking performance.
 
Joined
Sep 26, 2012
Messages
871 (0.19/day)
Location
Australia
System Name ATHENA
Processor AMD 7950X
Motherboard ASUS Crosshair X670E Extreme
Cooling ASUS ROG Ryujin III 360, 13 x Lian Li P28
Memory 2x32GB Trident Z RGB 6000Mhz CL30
Video Card(s) ASUS 4090 STRIX
Storage 3 x Kingston Fury 4TB, 4 x Samsung 870 QVO
Display(s) Acer X38S, Wacom Cintiq Pro 15
Case Lian Li O11 Dynamic EVO
Audio Device(s) Topping DX9, Fluid FPX7 Fader Pro, Beyerdynamic T1 G2, Beyerdynamic MMX300
Power Supply Seasonic PRIME TX-1600
Mouse Xtrfy MZ1 - Zy' Rail, Logitech MX Vertical, Logitech MX Master 3
Keyboard Logitech G915 TKL
VR HMD Oculus Quest 2
Software Windows 11 + Universal Blue
You are comparing the least efficient Pascal card to the most efficient Polaris. Average them all and you'll get Pascal being ~50% more efficient.

EDIT: Actually did the calculations, and Pascal is 59% more efficient on average: 460, 470, and 480 vs. 1050 Ti, 1060 3GB, 1060 6GB, 1070, and 1080.

The other thing is: were these calculations done with the newest driver sets? We've already seen AMD gain another 10% performance with Polaris under DX11. But let's face it, Nvidia does so well efficiency-wise because compute features have been stripped off the die since Kepler. That is also half the reason why even 10xx-series Nvidia GPUs are still not fully DX12 and Vulkan spec-level compliant, and why they gain nothing from either API.
 

the54thvoid

The other thing is: were these calculations done with the newest driver sets? We've already seen AMD gain another 10% performance with Polaris under DX11. But let's face it, Nvidia does so well efficiency-wise because compute features have been stripped off the die since Kepler. That is also half the reason why even 10xx-series Nvidia GPUs are still not fully DX12 and Vulkan spec-level compliant, and why they gain nothing from either API.

AMD is not fully DX12 compliant either. The only reason AMD performs better in DX12 and Vulkan is better use of compute and parallelism of sorts.
Too many people wave the DX12 flag as if it's magical. It's not. It brings no better graphics settings and is so far proving hard to code for.
People get pissed off that Nvidia performs so well with such a lean, efficient design, and people get pissed off because AMD is doing better.
People ought to look at things from outside the arena and see how fantastic both are for us consumers in the long run.
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
AMD is not fully DX12 compliant either. The only reason AMD performs better in DX12 and Vulkan is better use of compute and parallelism of sorts.
Too many people wave the DX12 flag as if it's magical. It's not. It brings no better graphics settings and is so far proving hard to code for.
People get pissed off that Nvidia performs so well with such a lean, efficient design, and people get pissed off because AMD is doing better.
People ought to look at things from outside the arena and see how fantastic both are for us consumers in the long run.

There is not one PC game so far that requires the D3D12 runtime; most DX12 games run with feature level 11_1 or 11_3, which are DX12 compliant. It's going to take a few more years for things to sort themselves out with DX12 before we even see a "real DX12" game rather than a DX11 game with a DX12 stamp.

Hardware has to be available
Developers have to be willing to program for it
There has to be enough of a customer base to make it worth it
 

the54thvoid

There is not one PC game so far that requires the D3D12 runtime; most DX12 games run with feature level 11_1 or 11_3, which are DX12 compliant. It's going to take a few more years for things to sort themselves out with DX12 before we even see a "real DX12" game rather than a DX11 game with a DX12 stamp.

Hardware has to be available
Developers have to be willing to program for it
There has to be enough of a customer base to make it worth it

Yes, that's why folk need to stop saying NV isn't compliant, as if AMD is. It's post-truth.

Nobody has it nailed down, and as you rightly say, it's going to take quite a while to get it right. There are still a lot of folk on W7 as well (raises hand), which doesn't help the cause.
 
Joined
Dec 6, 2016
Messages
155 (0.05/day)
System Name The cube
Processor AMD Ryzen 5700g
Motherboard Gigabyte B550M Aorus Elite
Cooling Thermalright ARO-M14
Memory 16GB Corsair Vengeance LPX 3800mhz
Video Card(s) Powercolor Radeon RX 6900XT Red Devil
Storage Kingston 1TB NV2| 2x 1TB 2.5" Hitachi 7200rpm | 2TB 2.5" Toshiba USB 3.0
Display(s) Samsung Odyssey G5 32" + LG 24MP59G 24"
Case Chieftec CI-02B-OP
Audio Device(s) Creative X-Fi Extreme Audio PCI-E (SB1040)
Power Supply Corsair HX1200
Mouse Razer Basilisk X Hyperspeed
Keyboard Razer Ornata Chroma
Software Win10 x64 PRO
Benchmark Scores Mobile: Asus Strix Advantage G713QY | Ryzen 7 5900HX | 16GB Micron 3200MHz CL21 | RX 6800M 12GB |
So a year behind the big green :)

Why are you happy about this? If AMD falls too far behind, we'll be seeing an Nvidia monopoly and GPU prices will skyrocket. Just look at the price of a 1080 vs. the 980 at launch: the 1080 is a whopping $775 (3,100 lei, tax included) in my country, while the 980 was about $550-600 (2,000-2,450 lei depending on the model) soon after launch. That's a HUGE price increase of $200 for a high-end card over a period of what, two years?

This is what happens when there's no competition. Not to mention that a lack of competition is BORING for hardware enthusiasts like myself.

AMD is not fully DX12 compliant either. The only reason AMD performs better in DX12 and Vulkan is better use of compute and parallelism of sorts.
Too many people wave the DX12 flag as if it's magical. It's not. It brings no better graphics settings and is so far proving hard to code for.
People get pissed off that Nvidia performs so well with such a lean, efficient design, and people get pissed off because AMD is doing better.
People ought to look at things from outside the arena and see how fantastic both are for us consumers in the long run.

Agreed.

The thing is, there's a huge improvement in performance using DX12/Vulkan. I've had the opportunity to test the 8GB RX 480 (Asus Strix) and a 4GB RX 470 (PowerColor Red Dragon V2), and I have to say I was impressed.
 
Joined
Mar 24, 2012
Messages
533 (0.11/day)
They've always had a gaming flagship and a separate (compute) flagship ever since the days of Fermi. That they've neutered DP on subsequent Titans is something entirely different; the original Titan had excellent DP capabilities, but every card that's followed has had DP cut down massively, and yet many call it a workstation card.

The GP102 and GP100 are different because of HBM2. Nvidia probably felt that they didn't need "next-gen" memory for their gaming flagship, or that saving a few more dollars with GDDR5X was a better idea, and a single card cannot support these two competing memory technologies.

Not really. With Fermi and Kepler, GF100/GF110/GK110 were still used in gaming cards, but with Pascal, GP100 is not used in any gaming card at all. And although cards like GK110 were more compute-oriented and GK104 more gaming-oriented, the SM arrangement was still the same. That is not the case for GP100 versus the other Pascal chips.

The Maxwell Titan X might not have had the excellent DP capability of the Kepler-based Titan, but its spec is exactly the same as the Quadro M6000's. So if you don't need the certified drivers that come with a Quadro, the Titan X can easily replace the Quadro M6000 at a far cheaper price (~$1k vs. ~$5k). In fact, as AnandTech pointed out, some companies actually built their products on the Maxwell Titan X instead of a Quadro or Tesla. The Pascal Titan X was sold for its unlocked INT8 performance.

No one said it wasn't, but they could've gone the route of AMD and given 16/32 GB of HBM2 to the 1080 Ti/Titan, and yet they didn't; IMO that's down a lot to costs.
The Titan is still marketed as a workstation card, and I do wonder why.

The Maxwell Titan X is a bit weird, since there is no advantage to it compared to a regular GeForce card. The only real advantage it has is the 12GB of VRAM, giving it the same spec as the Quadro M6000. But I heard talk of some people seeking out its FP16 performance; that might be the reason Nvidia ended up crippling FP16 in non-Tesla Pascal cards.
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
So a year behind the big green :)
Feels good, doesn't it? I mean, paying $700 for 314 mm² chips.

On the other hand, if you pull all the data from the Titan X review, we see that both manufacturers get decreasing performance per FLOP as they go up the scale, and it's worse for AMD:
View attachment 82981

So if we plot all of these cards as GFLOPS vs. performance and fit a basic trend line, we get that big Vega would be at around 80% of Titan performance at 12 TFLOPS (this puts it even with the 1080).


View attachment 82980

So for AMD to be at 1080 Ti levels, they'll need to have improved their card efficiency by 10-15 percent for this architecture.

Given the number of changes they've talked about with this architecture, I don't think that's infeasible, but it is a hurdle to overcome.
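The kind of trend-line extrapolation the quoted post describes can be sketched with NumPy. The (TFLOPS, relative performance) points below are made-up placeholders standing in for the cards in the chart, not the review's actual data; only the fit-and-extrapolate method is being illustrated.

```python
import numpy as np

# Made-up (TFLOPS, relative performance) points standing in for the cards
# in the chart -- purely illustrative, not the review's actual data.
tflops = np.array([4.4, 5.2, 6.5, 8.9, 11.0])
perf = np.array([100.0, 115.0, 140.0, 175.0, 200.0])

# Fit perf = a * tflops + b, then extrapolate to a 12 TFLOPS "big Vega".
a, b = np.polyfit(tflops, perf, 1)
predicted = a * 12.0 + b
print(f"Extrapolated relative performance at 12 TFLOPS: {predicted:.0f}")
```

The caveat, as the reply below the quote points out, is that extrapolating a trend fitted to one architecture assumes the new chip scales the same way.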


It's an interesting take, but you need to explain why Vega would have worse perf/TFLOP than Polaris (after AMD explicitly claimed more "non-TFLOP" stuff in it).

Also, the 1060 outperforming the 480 in your chart is... somewhat outdated.


480 = 232 mm² (15% larger than the 1060 but 10% slower)

Well, it's interesting to mention that six months after release, the 480 took over.
https://www.techpowerup.com/forums/...ver-improving-over-time-is-not-a-myth.228443/

You are comparing the least efficient Pascal card to the most efficient Polaris. Average them all and you'll get Pascal being ~50% more efficient.

EDIT: Actually did the calculations, and Pascal is 59% more efficient on average: 460, 470, and 480 vs. 1050 Ti, 1060 3GB, 1060 6GB, 1070, and 1080.

Bullshit:

[chart: upload_2017-1-25_20-42-26.png]


https://www.computerbase.de/2017-01...si-gaming-test/3/#abschnitt_leistungsaufnahme

Thanks to TPU benchmarks (I have yet to find a site that claims the 480 consumes more than a 1070), negligible power consumption differences are in practice hyperbolized into orbit.
 
Joined
Feb 12, 2015
Messages
1,104 (0.31/day)
It's an interesting take, but you need to explain why Vega would have worse perf/TFLOP than Polaris (after AMD explicitly claimed more "non-TFLOP" stuff in it).

Also, the 1060 outperforming the 480 in your chart is... somewhat outdated.




Well, it's interesting to mention that six months after release, the 480 took over.
https://www.techpowerup.com/forums/...ver-improving-over-time-is-not-a-myth.228443/

Exactly.

The odd arguments these people keep making are quite puzzling, no?

-Why do people keep comparing Hawaii and Fiji TFLOPS to Vega, when Polaris is much closer architecturally?

-Why do people keep exaggerating how powerful the Titan is?! As usual, people seem to be misled by the name: it's only ~45% stronger than the Fury X, depending on the game and resolution. Do people really think it's going to be hard for AMD to make a card 50% stronger than a card they made two years ago?!

-Overall, why do people continue to just assume AMD can't make powerful cards? The 5870 and 5970 practically had supremacy in the enthusiast market for a FULL YEAR. The 7970, and then the GHz Edition, essentially held the performance crown for an entire year until Nvidia released a larger card that cost $1,000, and then the 290X crushed the Titan and 780 Ti HARD. AMD is no stranger to performance wins.
 

the54thvoid

-Overall, why do people continue to just assume AMD can't make powerful cards? The 5870 and 5970 practically had supremacy in the enthusiast market for a FULL YEAR. The 7970, and then the GHz Edition, essentially held the performance crown for an entire year until Nvidia released a larger card that cost $1,000, and then the 290X crushed the Titan and 780 Ti HARD. AMD is no stranger to performance wins.

Are you Donald Trump? The misuse of historical accuracy is disturbing. For reference, I simply googled "290X review" and went to the first one, a TPU review of the 290X Lightning model.

I don't see the 780 Ti being crushed HARD, as you put it. I appreciate that you prefer AMD (I've seen you on other sites lauding AMD to the moon), but your abuse of the truth is silly.

Again, I hope the Vega card is more powerful than most people are expecting. I'll be very pissed off if a card I'm stalling an entire build for doesn't match (or beat) the rumoured 1080 Ti. I'm in a win-win here: I want Vega to perform well, and I'm not a zealot like some Nvidia owners can be. But you're always a wee bit too red-pumped to be taken seriously.

 
Joined
Feb 12, 2015
Messages
1,104 (0.31/day)
Are you Donald Trump? The misuse of historical accuracy is disturbing. For reference, I simply googled "290X review" and went to the first one, a TPU review of the 290X Lightning model.

I don't see the 780 Ti being crushed HARD, as you put it. I appreciate that you prefer AMD (I've seen you on other sites lauding AMD to the moon), but your abuse of the truth is silly.

Again, I hope the Vega card is more powerful than most people are expecting. I'll be very pissed off if a card I'm stalling an entire build for doesn't match (or beat) the rumoured 1080 Ti. I'm in a win-win here: I want Vega to perform well, and I'm not a zealot like some Nvidia owners can be. But you're always a wee bit too red-pumped to be taken seriously.


HAHAHAHA, a 900p benchmark. Nice "alternative facts", shill. Or perhaps that's the ideal resolution for Nvidiots.

Here's something from 2016:

http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_1080/images/perfrel_3840_2160.png

http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_1080/images/perfrel_2560_1440.png

The 780 Ti almost losing to a 7970 GHz (which beat the pathetic 780).


To be clear, I didn't cherry-pick at all. I simply googled "290X 780 Ti 2016". That is the first thing that popped up.
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.03/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
-Overall, why do people continue to just assume AMD can't make powerful cards? The 5870 and 5970 practically had supremacy in the Enthusiast market for a FULL YEAR. The 7970 and then GHz edition essentially held the performance crown for an entire year until Nvidia released a larger card that cost $1000, and then the 290X crushed the Titan and 780 Ti HARD. AMD is no stranger to performance wins.

You're talking about release performance, right? It seems so.

a) The 7970 lost to the GTX 680 when the 680 was released, but it held the performance crown for about ~3 months before that. Later, the 7970 GHz Edition equalled the performance of the GTX 680, or was a bit faster, but with far higher power consumption.

b) The 290X was faster than the GTX 780 at release and had a good fight with the GTX Titan. Nvidia then released the 780 Ti to counter the 290X/Hawaii GPUs, and the 780 Ti easily won vs. the 290X. Later, custom cards came closer to the performance of the 780 Ti, but custom 780 Ti versions were again faster.

c) The only thing you had right was the HD 5870 and 5970 being unmatched until the GTX 480 and the GTX 500 series came. The GTX 480 came late and was faster, but consumed a hell of a lot of power. The GTX 580 easily won vs. the HD 5000 and HD 6000 series.

This is all still about release performance, and at most six months after it. It's pretty pointless to compare performance now while disregarding release performance, as nobody buys a GPU to have more performance than the competitor's GPU three years later.

Also @the54thvoid is right, here are the other tables:





https://www.techpowerup.com/reviews/MSI/R9_290X_Lightning/24.html

That's (one of the best) custom 290X cards losing to a reference 780 Ti. A custom 780 Ti is even faster. The 290X simply had no chance.
 
Joined
Feb 12, 2015
Messages
1,104 (0.31/day)
You're talking release performance, right? It seems so.


No, I am talking about right now. Right now the 290X destroys the 780 Ti and even trades blows with the 980. Heck, even in late 2014 the 290X was curb-stomping the 780 Ti.


Sorry, but I don't need to read your long fanboy rant. Continue to live in the past; all Nvidiots do, by necessity.
 

Kanan

No, I am talking about right now. Right now the 290X destroys the 780 Ti and even trades blows with the 980. Heck, even in late 2014 the 290X was curb-stomping the 780 Ti.


Sorry, but I don't need to read your long fanboy rant. Continue to live in the past; all Nvidiots do, by necessity.
Any evidence for your bold statements? So far you're just behaving like a childish fanboy, nothing more.
 
Joined
Feb 12, 2015
Messages
1,104 (0.31/day)
Any evidence for your bold statements? So far you're just behaving like a childish fanboy, nothing more.

lol, scroll up, crybaby. I already posted TechPowerUp benches from 2016.


Oh... he can't read. That's sad.
 

Kanan

lol, scroll up, crybaby. I already posted TechPowerUp benches from 2016.


Oh... he can't read. That's sad.
I already stated that nobody cares about performance in 2-3 years when choosing to buy a GPU in anno 2014; if you're unable to understand this, I'm going to end this discussion. And here I thought fanboys like this were prominent elsewhere.

Also, calling someone else an "Nvidiot" while behaving like a fanboy yourself is really funny. It seems you don't see your own behaviour as fanboyism, but it is.
Sorry, but I don't need to read your long fanboy rant.

Oh... he can't read. That's sad.

You're a really funny person. I'd say the only fanboy here is you.
 
Joined
Feb 12, 2015
Messages
1,104 (0.31/day)
I already stated, nobody cares about performance in 2-3 years when he buys a 290X in anno 2014,

Anno 2014? What are you talking about, lmao! You can't read!

I posted the aggregate framerate; that includes Nvidia's broken "The Way It's Meant to Not Boot" games.

Most people keep their cards for 2-3 years on average. Even at launch, the 780 Ti was tied at 4K and only won by 10% at most at 1080p (1440p depended on the game). Nobody with half a brain pays 30% more money for 0-10% more performance that will only last a year. Considering the 780 Ti came out AFTER the 290X, I would call that pretty pathetic.
 

Kanan

Anno 2014? What are you talking about, lmao! You can't read!

I posted the aggregate framerate; that includes Nvidia's broken "The Way It's Meant to Not Boot" games.

Most people keep their cards for 2-3 years on average. Even at launch, the 780 Ti was tied at 4K and only won by 10% at most at 1080p (1440p depended on the game). Nobody with half a brain pays 30% more money for 0-10% more performance that will only last a year. Considering the 780 Ti came out AFTER the 290X, I would call that pretty pathetic.
Yeah, of course: to a fanboy like yourself, anything Nvidia does is pathetic, and you hate people who use Nvidia cards, calling them "Nvidiots", but you know nothing about them. That is pathetic behaviour.

Yes, the 290X is better now than it was years before, but that doesn't change the fact that nobody can see the future. People paying over 500 bucks for a GPU mostly chose the 780 Ti because it was simply the all-around better GPU: power-consumption-wise, performance-wise by far (custom vs. custom), and Nvidia also simply had better drivers, at least until a few months ago. 4K didn't play the slightest role back then; it's not even really important now. At 1440p and 1080p, which I posted, even the reference 780 Ti had an easy win vs. the Lightning 290X, one of the best 290X cards, and this was many months after the 290X's release, so drivers were already a lot better. You can say that the 290X/390X is on the same level or better now, but power consumption is still a mess (idle, average gaming, multi-monitor, Blu-ray/web). So in the end, it's still not really better in my book, as I use multiple monitors and I don't want to waste 35-40 W. I also don't want a GPU that consumes 250-300 W of power, and that's especially true for the 390X, which is even more power-hungry than the 290X. Nvidia GPUs are far more sophisticated power-gating-wise and much more flexible with core clocking, and only Vega can change that, because Polaris didn't. And yes, I hope that Vega is a success.

Comparing the 290/390X with the 980 is just laughable. The 980 is a ton more efficient. Maybe efficiency is not important to you, but for millions of users it is. It also has to do with noise: the 780 Ti/980 aren't as loud as the 290/390X.

But I'm still laughing about you calling me an "Nvidiot fanboy". I'm a regular poster on the AMD subreddit, I've owned several Radeon cards, and I know ~everything about AMD. If anything, I'm more likely an AMD fanboy than an Nvidia one, but that doesn't change certain facts that you can't change either.
 
Joined
Dec 6, 2016
Messages
748 (0.25/day)
Yeah, of course: to a fanboy like yourself, anything Nvidia does is pathetic, and you hate people who use Nvidia cards, calling them "Nvidiots", but you know nothing about them. That is pathetic behaviour.

Yes, the 290X is better now than it was years before, but that doesn't change the fact that nobody can see the future. People paying over 500 bucks for a GPU mostly chose the 780 Ti because it was simply the all-around better GPU: power-consumption-wise, performance-wise by far (custom vs. custom), and Nvidia also simply had better drivers, at least until a few months ago. 4K didn't play the slightest role back then; it's not even really important now. At 1440p and 1080p, which I posted, even the reference 780 Ti had an easy win vs. the Lightning 290X, one of the best 290X cards, and this was many months after the 290X's release, so drivers were already a lot better. You can say that the 290X/390X is on the same level or better now, but power consumption is still a mess (idle, average gaming, multi-monitor, Blu-ray/web). So in the end, it's still not really better in my book, as I use multiple monitors and I don't want to waste 35-40 W. I also don't want a GPU that consumes 250-300 W of power, and that's especially true for the 390X, which is even more power-hungry than the 290X. Nvidia GPUs are far more sophisticated power-gating-wise and much more flexible with core clocking, and only Vega can change that, because Polaris didn't. And yes, I hope that Vega is a success.

Just a moment here. Did you even read the reviews? The reference 780 Ti consumed 15 W less when gaming (muh better efficiency!!111), was 8% faster, and was 28% more expensive vs. the reference 290X. And even if the 290X was 8% slower, it still managed to push 60+ FPS in most games tested here on TPU at 1080p. People only bought the 780 Ti because it was Nvidia, not because it was that much better, as you claim. The only two problems with the 290X were its multi-monitor power consumption and poor reference cooling. Otherwise it was a great card! Stop making things up ...
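As a quick sanity check on those numbers, here is a performance-per-dollar sketch using the post's own figures (8% faster, 28% pricier); the launch MSRPs of $699 and $549 are my assumption for where that 28% gap comes from.

```python
# Post's figures: 780 Ti ~8% faster and ~28% more expensive than the 290X.
# Launch MSRPs ($699 vs. $549) roughly match that 28% price gap.
perf_780ti, perf_290x = 1.08, 1.00   # relative performance (post's figure)
price_780ti, price_290x = 699, 549   # USD launch MSRP (assumed)

value_780ti = perf_780ti / price_780ti   # performance per dollar
value_290x = perf_290x / price_290x
ratio = value_290x / value_780ti
print(f"290X delivered {ratio:.2f}x the performance per dollar")
```

In other words, on these figures the slower card was still the clearly better value, which is the argument the post is making.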
 
Joined
Feb 7, 2006
Messages
739 (0.11/day)
Location
Austin, TX
System Name WAZAAM!
Processor AMD Ryzen 3900x
Motherboard ASRock Fatal1ty X370 Pro Gaming
Cooling Kraken x62
Memory G.Skill 16GB 3200 MHz
Video Card(s) EVGA GeForce GTX 1070 8GB SC
Storage Micron 9200 Max
Display(s) Samsung 49" 5120x1440 120hz
Case Corsair 600D
Audio Device(s) Onboard - Bose Companion 2 Speakers
Power Supply CORSAIR Professional Series HX850
Keyboard Corsair K95 RGB
Software Windows 10 Pro
It's an interesting take, but you need to explain why Vega would have worse perf/TFLOP than Polaris (after AMD explicitly claimed more "non-TFLOP" stuff in it).

Also, the 1060 outperforming the 480 in your chart is... somewhat outdated.

It's because, as AMD's dies get larger and speeds go up, the performance per TFLOP decreases.

¯\_(ツ)_/¯
 
Joined
Feb 12, 2015
Messages
1,104 (0.31/day)
Yeah, of course: to a fanboy like yourself, anything Nvidia does is pathetic, and you hate people who use Nvidia cards

Comparing the 290/390X with the 980 is just laughable. The 980 is a ton more efficient. Maybe efficiency is not important to you, but for millions of users it is. It also has to do with noise: the 780 Ti/980 aren't as loud as the 290/390X.

But I'm still laughing about you calling me an "Nvidiot fanboy".


No, I have nothing inherently against Nvidia, or any company that makes a product. I don't call people "Nvidiots" because they buy Nvidia cards; I do so when I truly believe they are fanboys. And yeah, I assume most people who defend Kepler are in fact fanboys, because Kepler was a big joke if you actually know what you are talking about.


I have owned plenty of Nvidia cards (haven't owned any for a few years now, though). However, I honestly don't believe you when you say you own AMD cards, considering the continued noob arguments I keep hearing.


The 390X is noisy, huh? Um, no, they were all whisper-quiet AIB cards. Of course you probably don't know that, because you are clearly uninformed on all of these cards from top to bottom. I mean, the 290X was hot, but not loud if using its default fan settings; and again, that's for the cheap launch cards. If you had bought the plentifully available AIB cards, you would know they were very quiet. It's quite funny you bring up noise when the Titan series (and now the 1070/1080 FE) have been heavily criticized for their underperforming and noisy fan systems.


Also, the efficiency argument is my favorite myth. Only with the release of Maxwell did Nvidia start to have any efficiency advantage at all, and that was only against the older-generation AMD cards. I will leave you with this:

[chart: upload_2017-1-27_15-40-20.png]



^WOW! A full 5-10% more efficient (depending on the card)! Anyone who thinks that is worth mentioning is simply looking for reasons to support "their side."

Pascal was really the first generation where Nvidia won on efficiency in any meaningful way.
 