
AMD's Vega-based Cards to Reportedly Launch in May 2017 - Leak

So… Vega.

Yeah, it's a nice little solar system :D

March for NV, May for AMD; seems reasonable. Maybe that's just enough time to fine-tune it against the 1080 Ti.
 
performance per TFLOP decreases
That's completely nonsensical: performance is measured in TFLOPS; there is no such thing as performance per TFLOP.
You probably meant that the ratio of average achieved performance to peak theoretical performance (both in TFLOPS) decreases.
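To make the distinction concrete: peak theoretical FP32 throughput follows from shader count and clock (2 FLOPs per shader per clock, i.e. one FMA), while achieved throughput is whatever a real workload extracts. A minimal sketch, with illustrative clocks and a purely hypothetical achieved figure:

```python
# Peak theoretical FP32 throughput: 2 FLOPs (one fused multiply-add)
# per shader per clock. All numbers are illustrative, not measurements.

def peak_tflops(shaders: int, clock_mhz: float) -> float:
    """Peak theoretical FP32 TFLOPS = 2 * shaders * clock."""
    return 2 * shaders * clock_mhz * 1e6 / 1e12

gtx_780_ti = peak_tflops(2880, 928)   # ~5.3 TFLOPS at boost clock
r9_290x    = peak_tflops(2816, 1000)  # ~5.6 TFLOPS

# The ratio being argued about: achieved throughput / peak theoretical.
achieved = 3.1  # hypothetical measured average, in TFLOPS
print(f"290X utilization = {achieved / r9_290x:.0%}")
```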
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.05/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
Just a moment here. Did you even read the reviews? The reference 780 Ti consumed 15 W less when gaming (muh better efficiency!!111), it was 8% faster and 28% more expensive than the reference 290X. And even if the 290X was 8% slower, it still managed to push 60+ FPS in most games tested here on TPU at 1080p. People only bought the 780 Ti because it was NVIDIA, not because it was that much better as you say. The only two problems with the 290X were its multi-monitor power consumption and poor reference cooling. Otherwise it was a great card! Stop making things up ...
Did you even read or understand what I've written? I was talking mainly about CUSTOM 780 Tis, and those were a great deal faster back then, not only 8%. And AFAIK even the reference 290X ran pretty well, because it was tweaked by W1zzard and AFAIK was a cherry-picked GPU from AMD too. But you can't say that about the reference 780 Ti, which runs at pretty low clocks compared to the custom ones. Over 200 MHz lower than decent customs.

No, I have nothing inherently against NVIDIA, or any company that makes a product. I don't call people "Nvidiots" because they buy NVIDIA cards; I do so when I truly believe they are a fanboy. And yeah, I assume most people who defend Kepler are in fact fanboys, because Kepler was a big joke if you actually know what you are talking about.
It's not, and I still know what I'm talking about. And you're still behaving like a fanboy, or someone who actually has no clue himself. Just an FYI: the 1536-shader GTX 680, which consumed way less power than the HD 7970, was faster than AMD's 2048-shader GPU. AMD had only one way to react: push the HD 7970 to absurd clocks, calling it the "GHz Edition" at 1050 MHz, while also increasing its power consumption a lot. The HD 7970 and R9 290X were all about brute force, with wide buses (384/512-bit) and high shader counts. Basically, the mistakes NVIDIA made before with the GTX 280/480/580 were copied by AMD onto their HD 7000 and later lineups, while NVIDIA basically did what AMD had pulled off with the HD 5000/6000 series, which were pretty efficient compared to the GTX 200/400/500 series. Only when the 290X was released did it put enough pressure on NVIDIA to counter with their own full-force GPU, the GK110 with all of its shaders enabled (2880). It was more expensive, but also better in every single aspect.

I have owned plenty of NVIDIA cards (I haven't owned any for a few years now, though). However, I honestly don't believe you when you say you own AMD cards, considering the continued noob arguments I keep hearing.
Said the angry fanboy. Who cares. I owned an HD 5850 from release until 2013, when I replaced it with an HD 5970 and used it until it started to become defective at the end of 2015. I own an HD 2600 XT now. Basically I owned two of the best AMD/ATI GPUs ever made. The reason I chose the HD 5000 over the GTX 200 series back then was simple: it was a lot better. Because I don't care about brands.

The 390X is noisy, huh? Um, no, they were all whisper-quiet AIB cards.
I said they are 'noisier', not noisy. Also, I was mostly talking about the R9 200 series, not the 300 series, which is pretty irrelevant to this discussion. But if you want to talk about the R9 300 series: yes, compared to the GTX 900 series they were noisy. You can't compare the R9 300 series to the GTX 700 series because they are from different generations. Go and read some reviews.

Of course you probably don't know that because you are clearly uninformed on all of these cards from top-to-bottom.
Childish behaviour all the way.

I mean, the 290X was hot, but not loud using its default fan settings; and again, that's for the cheap launch cards. If you bought the plentifully available AIB cards, you would know they were very quiet. It's quite funny you bring up noise when the Titan series (and now the 1070/1080 FE) have been heavily criticized for their under-performing and noisy fan systems.
We are not talking about the Titan series. And the 1070/1080 are doing relatively well for what they are (exhaust-style coolers). AMD's exhaust coolers were just a mess after the HD 5000 series. Loud and hot. And later ineffective (the R9 290X fiasco).

I can bring up noise every time, since I'm absolutely right that NVIDIA cards are quieter and way less power hungry, which ties into the noise as well.

Also the efficiency argument is my favorite myth. Only with the release of Maxwell did Nvidia start to have any efficiency advantage at all, and that was only against the older generation AMD cards. I will leave you with this:

(attachment: performance-per-watt comparison chart)

^WOW! A full 5-10% more efficient (depending on the card)! Anyone who thinks that is worth mentioning is simply looking for reasons to support "their side."

Pascal was really the first generation where NVIDIA won on efficiency in any meaningful way.
Then compare a custom R9 290X vs. a custom 780 Ti and you'll see I'm right. They are way more efficient and a lot faster. Also, multi-monitor, idle and Blu-ray/web power consumption is still a mess on those AMD cards. Efficiency isn't only about when you play games.

I'm going to drop this discussion now, because you're a fanboy and just a waste of time. Safe to say, you didn't prove anything I've said wrong. The opposite is true, and you only confirmed my point with your cherry-picking. Try that with someone else.

You're on Ignore, byebye.
 
Did you even read or understand what I've written? I was talking mainly about CUSTOM 780 Tis, and those were a great deal faster back then, not only 8%. And AFAIK even the reference 290X ran pretty well, because it was tweaked by W1zzard and AFAIK was a cherry-picked GPU from AMD too. But you can't say that about the reference 780 Ti, which runs at pretty low clocks compared to the custom ones. Over 200 MHz lower than decent customs.

Here are two benchmarks made at the same time with a custom 780 Ti and a custom 290X:
https://www.techpowerup.com/reviews/ASUS/R9_290X_Direct_Cu_II_OC/
https://www.techpowerup.com/reviews/MSI/GTX_780_Ti_Gaming/

The custom 780 Ti was 15% faster than the custom 290X at 1080p and cost 18% more, which gives the custom 290X better performance/price. The custom 780 Ti also consumed 6% more power in an average gaming session, making it only ~10% more efficient at gaming than the custom 290X. Nowadays, in modern games, the reference 780 Ti and 290X are on par in performance at 1080p, which we can probably safely extrapolate to custom cards. You do the math.
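Those percentages compose as simple ratios; here is a quick sketch of the arithmetic, using only the figures quoted above:

```python
# Back-of-the-envelope check of the percentages above (same inputs).
perf_ratio  = 1.15  # custom 780 Ti is 15% faster at 1080p
price_ratio = 1.18  # ...and costs 18% more
power_ratio = 1.06  # ...and draws 6% more power while gaming

perf_per_dollar = perf_ratio / price_ratio  # ~0.97 -> the 290X wins perf/price
perf_per_watt   = perf_ratio / power_ratio  # ~1.08 -> roughly the 10% cited
print(f"{perf_per_dollar:.2f}x perf/price, {perf_per_watt:.2f}x perf/watt")
```

1.15/1.18 ≈ 0.97 and 1.15/1.06 ≈ 1.08, which lines up with the "better performance/price" and "~10% more efficient" figures above.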

As I said before, you either didn't read the reviews or you have a really bad memory. Pick one. I'm pulling out of this debate as it's not the topic of this thread; however, I did enjoy proving you wrong ;).
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.05/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
Here are two benchmarks made at the same time with a custom 780 Ti and a custom 290X:
https://www.techpowerup.com/reviews/ASUS/R9_290X_Direct_Cu_II_OC/
https://www.techpowerup.com/reviews/MSI/GTX_780_Ti_Gaming/

The custom 780 Ti was 15% faster than the custom 290X at 1080p and cost 18% more, which gives the custom 290X better performance/price. The custom 780 Ti also consumed 6% more power in an average gaming session, making it only ~10% more efficient at gaming than the custom 290X. Nowadays, in modern games, the reference 780 Ti and 290X are on par in performance at 1080p, which we can probably safely extrapolate to custom cards. You do the math.

As I said before, you either didn't read the reviews or you have a really bad memory. Pick one. I'm pulling out of this debate as it's not the topic of this thread; however, I did enjoy proving you wrong ;).
You didn't prove me wrong, but thanks for proving me correct. And while you're at it, don't forget the bad power consumption of the R9 290X at idle/multi-monitor and BD/web. And looking into the future is still impossible. Maybe more people would've bought the 290X knowing it would be better in 2-3 years, maybe not, because NVIDIA had better drivers back then. I'm also pretty sure most people didn't care; they wanted the best performance NOW, not in years to come. Most enthusiast users who pay over 600 bucks for a GPU don't keep it for 2-3 years anyway; they replace GPUs much faster than normal users. AMD was always a good brand for people who kept their GPU a long time: their GPUs had more memory and future-oriented technology, like DX11 on the HD 5000 series and low-level API prowess on the R9 200/300 series. That doesn't change the fact that the last AMD GPU that was a real "winner" was the HD 5000 series. Everything after that was just a follow-up to NVIDIA, always one or two steps behind.
 
You didn't prove me wrong, but thanks for proving me correct. And while you're at it, don't forget the bad power consumption of the R9 290X at idle/multi-monitor and BD/web. And looking into the future is still impossible. Maybe more people would've bought the 290X knowing it would be better in 2-3 years, maybe not, because NVIDIA had better drivers back then. I'm also pretty sure most people didn't care; they wanted the best performance NOW, not in years to come. Most enthusiast users who pay over 600 bucks for a GPU don't keep it for 2-3 years anyway; they replace GPUs much faster than normal users.

Look, I know you've got choice-supportive bias because you own a 780 Ti, but avoiding the numbers won't make the NVIDIA card better. In my first post, which you obviously didn't read properly, I said clearly that multi-monitor was one of the two problems with the 290X.

Let's analyze the power consumption in other areas (single monitor only, of course, since the number of multi-monitor users is much lower):
The difference in single-monitor idle is 10 watts. This is also the difference between the cards in average gaming. If you leave the 290X idling, it can drop into ZeroCore Power mode, which consumes virtually no power (I measured it myself on my 7750, and you can find measurements online), but I'm putting 2 W there so you won't say I'm cheating ...

If you leave the computer running 24/7 and it idles 12 hours, you game 6 hours, watch a Blu-ray movie for 2 hours and do other things for 4 hours, the 780 Ti will average (12h*10W + 6h*230W + 2h*22W + 4h*10W)/24h ≈ 66 W and the 290X will average (12h*2W + 6h*219W + 2h*74W + 4h*20W)/24h ≈ 65 W. Virtually the same! You can say whatever you want, but the numbers are on my side :D.
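For anyone who wants to check it, the same time-weighted average falls out directly from the wattages quoted above (a minimal sketch; the 24-hour usage split is the hypothetical one from this post):

```python
# Time-weighted average power over the hypothetical 24 h usage profile.
# Per-state draws (watts) are the figures quoted in the post above.
profile = {
    # state: (hours, 780 Ti watts, 290X watts)
    "idle":    (12,  10,   2),
    "gaming":  ( 6, 230, 219),
    "blu-ray": ( 2,  22,  74),
    "other":   ( 4,  10,  20),
}

total_hours = sum(h for h, *_ in profile.values())  # 24
avg_780ti = sum(h * w for h, w, _ in profile.values()) / total_hours
avg_290x  = sum(h * w for h, _, w in profile.values()) / total_hours
print(f"780 Ti: {avg_780ti:.1f} W avg, 290X: {avg_290x:.1f} W avg")
# -> 780 Ti: 66.0 W avg, 290X: 65.2 W avg -- virtually the same
```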
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.05/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
Look, I know you've got choice-supportive bias because you own a 780 Ti, but avoiding the numbers won't make the NVIDIA card better. In my first post, which you obviously didn't read properly, I said clearly that multi-monitor was one of the two problems with the 290X.

Let's analyze the power consumption in other areas (single monitor only, of course, since the number of multi-monitor users is much lower):
The difference in single-monitor idle is 10 watts. This is also the difference between the cards in average gaming. If you leave the 290X idling, it can drop into ZeroCore Power mode, which consumes virtually no power (I measured it myself on my 7750, and you can find measurements online), but I'm putting 2 W there so you won't say I'm cheating ...

If you leave the computer running 24/7 and it idles 12 hours, you game 6 hours, watch a Blu-ray movie for 2 hours and do other things for 4 hours, the 780 Ti will average (12h*10W + 6h*230W + 2h*22W + 4h*10W)/24h ≈ 66 W and the 290X will average (12h*2W + 6h*219W + 2h*74W + 4h*20W)/24h ≈ 65 W. Virtually the same! You can say whatever you want, but the numbers are on my side :D.
Maybe so, because I'm kind of tired of this discussion. And yeah, I have to admit some bias, though I bought this 780 Ti used after the 390X was already released, and I still don't regret it one bit. Maybe I mixed up the power consumption numbers of the 290X with the 390X in my mind, because the 390X consumes more than the 290X (8 GB vs. 4 GB, higher clocks all around), and because I compared the 780 Ti with the 390X back then. So it's okay; I don't disagree with your numbers. However, I still don't like the multi-monitor power consumption on those GPUs. I didn't like it when I had the HD 5850/5970 either; those had the same problem, and web power consumption is way too high too (I don't care about BD, I just use the wording from TPU). For me it matters, for others maybe not. I didn't have the choice between a 390 and a 780 Ti anyway; I bought this one from a friend at a discount. Originally I had an R9 380 delivered to me, but it was defective from the start, so I asked him if he wanted to sell me his 780 Ti, because he had just bought a 980 Ti, and I returned the 380. That's it.
 
That "people buy good products" crap is annoying in 2017. We have seen it with Prescott we have seen it with Fermi, it is clearly not the case.
 

cdawall

That "people buy good products" crap is annoying in 2017. We have seen it with Prescott we have seen it with Fermi, it is clearly not the case.

Fermi was the superior-performing card? It took two AMD cards in CrossFire to best the GTX 480.
 
Fermi was the superior-performing card? It took two AMD cards in CrossFire to best the GTX 480.
Fermi was hot but great, a big improvement over the 200 series ... however, it wasn't as good as the Evergreen series until the 580; the timeline went something like this:
(image: GPU die size comparison)

Fermi had a huge die size compared to Evergreen and its successors ... AMD already had the 5870, which the 480 couldn't quite dethrone without heavy tessellation (Evergreen's Achilles' heel) ... the thing is, the 580 was a proper Fermi refinement (performance, power, temperature and noise improvements), while the 6000 series wasn't an improvement at all over Evergreen.
Two 5000 or 6000 series GPUs in CrossFire had huge issues with frame pacing back then (and it remained an issue all the way to the Crimson driver suite), so they would actually beat any Fermi in average frame rate, but the measured frame-time variations made it feel almost the same. Maybe that's what you are referring to?
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.05/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
Fermi was the superior-performing card? It took two AMD cards in CrossFire to best the GTX 480.
The GTX 480 wasn't bested anyway, because CrossFire back then was a complete mess, frame variance all over the place; the only difference is that it wasn't public knowledge like it is now. But the GTX 480 bested itself by being loud, hot and power hungry; it wasn't a good GPU if you ask me. Maybe 20-30% faster than the HD 5870, but not really worth the hassle. The best *card* was the HD 5970, if you ignore the CrossFire problems. It was actually pretty nice with frame pacing; too bad that feature was introduced in 2013, not November 2009 when the GPU was released.
 

cdawall

The GTX 480 wasn't bested anyway, because CrossFire back then was a complete mess, frame variance all over the place; the only difference is that it wasn't public knowledge like it is now. But the GTX 480 bested itself by being loud, hot and power hungry; it wasn't a good GPU if you ask me. Maybe 20-30% faster than the HD 5870, but not really worth the hassle. The best *card* was the HD 5970, if you ignore the CrossFire problems. It was actually pretty nice with frame pacing; too bad that feature was introduced in 2013, not November 2009 when the GPU was released.

I agree; my point was that, performance-wise, it was almost two full AMD GPUs ahead.
 
I remember the HD 5870 vs. GTX 480 at launch being neck and neck ... the Radeon was even better in Crysis, which was the uber-benchmark at the time ...
I believe the performance lead for Fermi came later with driver maturity ... and by that time it was GTX 580 vs. HD 6970, a win for NVIDIA, and very soon after GTX 580 vs. 7970, a win for AMD until Kepler, and so on.
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.05/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
I remember the HD 5870 vs. GTX 480 at launch being neck and neck ... the Radeon was even better in Crysis, which was the uber-benchmark at the time ...
I believe the performance lead for Fermi came later with driver maturity ... and by that time it was GTX 580 vs. HD 6970, a win for NVIDIA, and very soon after GTX 580 vs. 7970, a win for AMD until Kepler, and so on.
Off the top of my head, I only remember the GTX 480 being mostly faster, with a few games favouring ATI tech on the HD 5870, and the HD 5970 always being on top (except for a few games that didn't support CF).

btw.
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/32.html
 

cdawall

It was faster in almost everything lol

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.05/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
It was faster in almost everything lol
Yeah, I screwed up the wording lol. Such a nice card, and the cooler was designed like a grill too, so it was ready for barbecue. I think they did a great job fixing the issues with the GTX 580, though.
 

cdawall

Yeah, I screwed up the wording lol. Such a nice card, and the cooler was designed like a grill too, so it was ready for barbecue. I think they did a great job fixing the issues with the GTX 580, though.

I loved my 470s, to the point where I still have three of them with DD blocks, just in case I want to feel nostalgic.
 

cdawall

What on earth:

(quoted image: relative performance graph)

Wait, let's look at this graph: a 100% performance match to the 5970. Would you mind telling everyone how many GPUs the 5970 contained?
 
That's completely nonsensical: performance is measured in TFLOPS; there is no such thing as performance per TFLOP.
You probably meant that the ratio of average achieved performance to peak theoretical performance (both in TFLOPS) decreases.

I'm referencing my previous posts with the graphs and tables. There, I define performance as the "Average Performance" listed in the summary of the Titan X Pascal review.

Post 1: https://www.techpowerup.com/forums/...-launch-in-may-2017-leak.229550/#post-3584964

Post 2: https://www.techpowerup.com/forums/...h-in-may-2017-leak.229550/page-4#post-3585723


The table for reference:

(table attachment not shown)

AMD's larger dies have lower performance per FLOP.

Before you reply that this is a stupid methodology, please put together something based on a methodology you think is good, and bring the supporting data.
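For what it's worth, the metric under debate is just a division: a review's relative-performance summary number over peak theoretical TFLOPS. A minimal sketch with illustrative inputs (the actual values are in the linked posts, not reproduced here):

```python
# "Performance per TFLOP": relative performance / peak theoretical TFLOPS.
# Both inputs below are illustrative placeholders, not the table's data.
cards = {
    # name: (relative performance in %, peak FP32 TFLOPS)
    "GTX 1080": (100.0, 8.9),
    "Fury X":   ( 77.0, 8.6),
}

for name, (perf, tflops) in cards.items():
    print(f"{name}: {perf / tflops:.1f} performance points per TFLOP")
```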
 
Can you tell us why the relationship of die size to FLOPS matters to people? It's not like it translates to anything tangible for the user, like FPS or compute power. It's just some math that divides TFLOPS by die size...

Am I missing something?
 

cdawall

Can you tell us why the relationship of die size to FLOPS matters to people? It's not like it translates to anything tangible for the user, like FPS or compute power. It's just some math that divides TFLOPS by die size...

Am I missing something?

No, you aren't missing anything. He posted graphs in one of the other threads claiming there is such a thing as performance per FLOP, etc. Basically he used Excel and is proud of it. Let him have his Excel moment; remember, not everyone can use Excel.
 
It just seems so arbitrary... like the size of the rims on a car compared to how many windows it has. It really has no bearing on anything. I mean, cool metric, but what is it actually telling us? How can a consumer use that data to gauge anything?

Consumers couldn't care less whether there's something the size of a postage stamp or an 8.5"x11" die under there... really.
 

cdawall

It just seems so arbitrary... like the size of the rims on a car compared to how many windows it has. It really has no bearing on anything. I mean, cool metric, but what is it actually telling us? How can a consumer use that data to gauge anything?

Consumers couldn't care less whether there's something the size of a postage stamp or an 8.5"x11" die under there... really.

Consumers don't care about anything but RGB lights.
 