
Radeon RX Vega Needs a "Damn Lot of Power:" AIB Partner Rep

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,673 (2.86/day)
Location
w
System Name Black MC in Tokyo
Processor Ryzen 5 7600
Motherboard MSI X670E Gaming Plus Wifi
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Corsair Vengeance @ 6000Mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston KC3000 1TB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Plantronics 5220, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Dell SK3205
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
Let's recap this article for a moment:

Argh it's DUTCH. DUTCH. How is this complicated?

Not a big deal.
I would never go below a 1kW PSU because I want to stay far away from any power limitation.
Nowadays you can find cheap, quality 1kW PSUs, and they don't need to be big.
There are models with a very small PSU case, such as the EVGA 1000 GX.
mATX motherboard + Intel 6-10 cores.

This could only be a problem for guys who thought 700W was enough for everything, kept an Intel X-chipset build, and now it's time for a GPU upgrade.
Even then, 750W should be enough, but the fan will work like crazy.

This is ... wow. There is so much that's strange in this post I don't even know where to begin.
 
Joined
May 2, 2013
Messages
489 (0.11/day)
Location
GA
System Name RYZEN RECKER
Processor Ryzen 5 5600X
Motherboard Asus Prime B350-plus
Cooling Arctic Cooler 120mm CPU, Cougar case fans.
Memory 16GB (2x8GB) Corsair LPX 3200mhz
Video Card(s) XFX 6700XT Swift 309
Storage 6.5TB total across 4 different drives
Display(s) Acer 32" 170hz VA
Case Antec 900
Audio Device(s) Logitech G430 headset
Power Supply Corsair CX850m
Mouse Steel Series Sensei Ten
Keyboard Corsair K55
Software Windows 10 64-bit
I just upgraded to an 850W Corsair CX850. This should be enough to handle this beast, correct? Just one of them xD
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
14,019 (2.34/day)
Location
Louisiana
Processor Core i9-9900k
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax ETS-T50 Black CPU cooler
Memory 32GB (2x16) Mushkin Redline DDR-4 3200
Video Card(s) ASUS RTX 4070 Ti Super OC 16GB
Storage 1x 1TB MX500 (OS); 2x 6TB WD Black; 1x 2TB MX500; 1x 1TB BX500 SSD; 1x 6TB WD Blue storage (eSATA)
Display(s) Infievo 27" 165Hz @ 2560 x 1440
Case Fractal Design Define R4 Black -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic Focus GX-1000 Gold
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
I just upgraded to an 850W Corsair CX850. This should be enough to handle this beast, correct? Just one of them xD
Really? I know you are kidding us. :)

Yes, it will be just fine.
 
Joined
May 2, 2013
Messages
489 (0.11/day)
Location
GA
System Name RYZEN RECKER
Processor Ryzen 5 5600X
Motherboard Asus Prime B350-plus
Cooling Arctic Cooler 120mm CPU, Cougar case fans.
Memory 16GB (2x8GB) Corsair LPX 3200mhz
Video Card(s) XFX 6700XT Swift 309
Storage 6.5TB total across 4 different drives
Display(s) Acer 32" 170hz VA
Case Antec 900
Audio Device(s) Logitech G430 headset
Power Supply Corsair CX850m
Mouse Steel Series Sensei Ten
Keyboard Corsair K55
Software Windows 10 64-bit
Really? I know you are kidding us. :)

Yes, it will be just fine.
lol Yes, a tad bit of trolling. I thought about getting an awesome water loop setup and overclocking the hell out of my R9 290 and Ryzen 1600X, but I think I will wait for Vega.
 
Joined
Oct 4, 2013
Messages
58 (0.01/day)
To all the fanboys and girls of NV and AMD, Intel included:

POWER USAGE and POWER AVAILABLE are 2 (TWO) different things!

If the performance/watt is good, based on whatever reference we're using, then 250W and 300W are awesome! The more the better.

Power available is only one part of a graphics card's design specification!

And I'm commenting on a marketing guy!? NVM!!

Cheers!

{@}
^(^
Peace!
 
Joined
Feb 16, 2017
Messages
494 (0.17/day)
It's so funny watching peasants rave about the power consumption of a top-of-the-line card. It's about as stupid as raving about the MPG of a Ferrari F12...
One more thing to claim dominance over while they OC their 1080 Ti and push up its power usage o_O.
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
To all the fanboys and girls of NV and AMD, Intel included:

POWER USAGE and POWER AVAILABLE are 2 (TWO) different things!

If the performance/watt is good, based on whatever reference we're using, then 250W and 300W are awesome! The more the better.

Power available is only one part of a graphics card's design specification!

And I'm commenting on a marketing guy!? NVM!!

Cheers!

{@}
^(^
Peace!

Exactly. If a card has a 500W TDP but consumes just 200W during actual operation, it just means it's that heavily over-engineered (a highly exaggerated example, but you get the picture). It's also important to distinguish core TDP from board TDP, which is a bit tricky with Vega since it has HBM2 on the same interposer as the chip. Is that counted within the core design or the board design?
 
Joined
Apr 19, 2011
Messages
2,198 (0.44/day)
Location
So. Cal.
WOW, HOW QUICKLY WE LOOK AWAY...

An MSI marketing director said this... either he's already out the door, or MSI wants miners to stop holding out for Vega and instead buy MSI's stock of 1080s and 1080 Tis at their now rapidly inflating prices. The idea that you might not mine currency as fast, but do it with less energy, is Nvidia's mantra toward mining, and it's a premise that can work. So here's "some marketing shill" at MSI who has done both MSI and Nvidia a favor to help line their pockets...

I'm not saying Vega isn't going to want power, but it *might* deliver better perf/W, so it's relative. Nvidia is working that same angle with mining: you don't unearth coins as fast, but you do it with less energy; it just takes longer to see a return on the investment. Here's the thing, though: you want to mine as fast as you can, because who knows when it will "go bust" again and you'll need to unload both the currency and either company's GPUs. It comes down to ROI, and the guys with more coin who sold at high prices will almost always come out ahead, not the ones counting on used-equipment resale.
 
Last edited:
Joined
May 11, 2016
Messages
261 (0.08/day)
Exactly. If a card has a 500W TDP but consumes just 200W during actual operation, it just means it's that heavily over-engineered (a highly exaggerated example, but you get the picture). It's also important to distinguish core TDP from board TDP, which is a bit tricky with Vega since it has HBM2 on the same interposer as the chip. Is that counted within the core design or the board design?

The power requirements are usually very close to what they rate them at, within about 25W plus or minus. They aren't going to say it requires 300W when it only pulls 100W. We're talking wattage under load, too. Pretty much any card will pull far less power while sitting in 2D surfing the net.
 
Joined
Oct 4, 2013
Messages
58 (0.01/day)
The power requirements are usually very close to what they rate them at, within about 25W plus or minus. They aren't going to say it requires 300W when it only pulls 100W. We're talking wattage under load, too. Pretty much any card will pull far less power while sitting in 2D surfing the net.

YES, but what is the PERFORMANCE PER WATT?! That's the most critical information, and it's missing. Without it, how do you compare?

The article should have been titled: >>Vega available with 250W and 300W of power!<< THAT'S IT!
 
Joined
Jun 10, 2014
Messages
2,995 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Exactly. If a card has a 500W TDP but consumes just 200W during actual operation, it just means it's that heavily over-engineered (a highly exaggerated example, but you get the picture).
TDPs always include a small margin, and as we can see with CPUs in TDP brackets like 65W, 91W, 140W, etc., some CPUs will of course land at the lower end of each bracket and some at the higher end.

When you see GPUs with TDPs of 120W, 150W, 180W, 250W, 300W, etc., a higher number generally means more energy consumption. GPU TDPs are also much closer to typical gaming consumption than CPU TDPs are (CPUs usually run hotter during encoding), so there is no denying that a graphics card with a 300W TDP is going to consume a lot.

It's also important to distinguish core TDP from board TDP, which is a bit tricky with Vega since it has HBM2 on the same interposer as the chip. Is that counted within the core design or the board design?
All graphics cards use board TDP.
 
Joined
Sep 25, 2007
Messages
5,966 (0.95/day)
Location
New York
Processor AMD Ryzen 9 5950x, Ryzen 9 5980HX
Motherboard MSI X570 Tomahawk
Cooling Be Quiet Dark Rock Pro 4(With Noctua Fans)
Memory 32Gb Crucial 3600 Ballistix
Video Card(s) Gigabyte RTX 3080, Asus 6800M
Storage Adata SX8200 1TB NVME/WD Black 1TB NVME
Display(s) Dell 27 Inch 165Hz
Case Phanteks P500A
Audio Device(s) IFI Zen Dac/JDS Labs Atom+/SMSL Amp+Rivers Audio
Power Supply Corsair RM850x
Mouse Logitech G502 SE Hero
Keyboard Corsair K70 RGB Mk.2
VR HMD Samsung Odyssey Plus
Software Windows 10
You should not care about power draw on such cards.
You only care if you have at least a couple dozen of them working at full load in some sort of supercomputer. That is where it matters.
But for us mostly single-GPU users, it won't matter if it's 250 or 300 or 350W. For what, for dissipated heat? For extra power costs?
What is the advantage of 250W versus 350W?
You are not going to hold your Vega at 100% full load 24/7, and even if you did, the difference from 250W to 350W is 2.4 kWh per day, which translates to 0.72 EUR per day. That's about 21 EUR per month, if you keep it at FULL load, which as a gamer only you never will.
Chances are you are getting more performance for those 21 EUR per month in a full-load scenario. So if you had the couple hundred euros to pay for the card, you sure as hell won't mind paying a bit more for power usage.

That being said, I'm waiting for the dual-GPU version of this, and I'm going to get two of them for 4 GPUs in total; then we'll see how much the power draw is at full load.
I personally won't mind at all if the dual Vega card has 4x 8-pin power plugs. I can easily accommodate 8x 8-pin power plugs from my two Antec High Current Pro 1300W units.
So bring it on AMD, I'm waiting.

It's high, but as long as they can cool it, I agree it shouldn't be an issue. I haven't heard anything about a dual consumer-level Vega card. It looks like there's a FirePro/Radeon Pro dual Vega, but no consumer version so far. Has anyone heard of a consumer dual Vega in development?
 
Last edited:
Joined
Aug 13, 2009
Messages
3,254 (0.58/day)
Location
Czech republic
Processor Ryzen 5800X
Motherboard Asus TUF-Gaming B550-Plus
Cooling Noctua NH-U14S
Memory 32GB G.Skill Trident Z Neo F4-3600C16D-32GTZNC
Video Card(s) AMD Radeon RX 6600
Storage HP EX950 512GB + Samsung 970 PRO 1TB
Display(s) HP Z Display Z24i G2
Case Fractal Design Define R6 Black
Audio Device(s) Creative Sound Blaster AE-5
Power Supply Seasonic PRIME Ultra 650W Gold
Mouse Roccat Kone AIMO Remastered
Software Windows 10 x64
Assuming the rumor is true, how is it possible? I thought power consumption kept going down as CPUs/graphics cards are built on smaller processes.
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
Not when you try to make them ridiculously fast. More transistors mean more power consumption. With smaller nodes, you're just postponing the inevitable. When you hit the limit, as happened with 28nm, you have to go radical, like NVIDIA did with Maxwell 2.

In a way, I'm ok with RX Vega having higher consumption. At least god damn miners won't take all of them immediately.
 
Joined
Aug 13, 2009
Messages
3,254 (0.58/day)
Location
Czech republic
Processor Ryzen 5800X
Motherboard Asus TUF-Gaming B550-Plus
Cooling Noctua NH-U14S
Memory 32GB G.Skill Trident Z Neo F4-3600C16D-32GTZNC
Video Card(s) AMD Radeon RX 6600
Storage HP EX950 512GB + Samsung 970 PRO 1TB
Display(s) HP Z Display Z24i G2
Case Fractal Design Define R6 Black
Audio Device(s) Creative Sound Blaster AE-5
Power Supply Seasonic PRIME Ultra 650W Gold
Mouse Roccat Kone AIMO Remastered
Software Windows 10 x64
What did Nvidia do?
Also, what's Maxwell 2? Do you mean Pascal, or?...
 
Joined
Dec 15, 2006
Messages
1,703 (0.26/day)
Location
Oshkosh, WI
System Name ChoreBoy
Processor 8700k Delided
Motherboard Gigabyte Z390 Master
Cooling 420mm Custom Loop
Memory CMK16GX4M2B3000C15 2x8GB @ 3000Mhz
Video Card(s) EVGA 1080 SC
Storage 1TB SX8200, 250GB 850 EVO, 250GB Barracuda
Display(s) Pixio PX329 and Dell E228WFP
Case Fractal R6
Audio Device(s) On-Board
Power Supply 1000w Corsair
Software Win 10 Pro
Benchmark Scores A million on everything....
Assuming the rumor is true, how is it possible? I thought power consumption kept going down as CPUs/graphics cards are built on smaller processes.
Yeah, sure it does.... Until they increase the core count and clock speed.
 
Joined
Jun 10, 2014
Messages
2,995 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Not when you try to make them ridiculously fast. More transistors mean more power consumption. With smaller nodes, you're just postponing the inevitable. When you hit the limit, as happened with 28nm, you have to go radical, like NVIDIA did with Maxwell 2.

In a way, I'm ok with RX Vega having higher consumption. At least god damn miners won't take all of them immediately.
Heat becomes a really serious issue as you approach 300W. Most of us want a card that can remain stable for ~5 years. I wouldn't complain if AMD managed to get within 5-10% of Nvidia's efficiency, but even a 20-30% gap really limits their competitiveness in the upper range, and now that it's more like ~80% they even struggle in the mid range. Energy efficiency is now the main limiting factor for performance scaling.

The only advantage Vega has right now is cheaper fp16 performance, which wouldn't matter for gaming anytime soon.
 
Joined
Mar 10, 2014
Messages
1,793 (0.45/day)
What did Nvidia do?
Also, what's Maxwell 2? Do you mean Pascal, or?...

Maxwell 1 is GM108 and GM107, aka the GTX 750/Ti. Maxwell 2 is GM206, GM204 and GM200 (the GTX 900 series), all on TSMC 28nm just like Kepler. Nvidia stripped out more of the FP64 hardware with Maxwell than they had with Kepler, and that let them add more FP32 CUDA cores while staying on the same 28nm node.
 
Joined
Dec 31, 2009
Messages
19,372 (3.54/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
To all the fanboys and girls of NV and AMD, Intel included:

POWER USAGE and POWER AVAILABLE are 2 (TWO) different things!

If the performance/watt is good, based on whatever reference we're using, then 250W and 300W are awesome! The more the better.

Power available is only one part of a graphics card's design specification!

And I'm commenting on a marketing guy!? NVM!!
Power use and power available are two different things... true... but how is that relevant here? Power available is just found by adding up the PCIe slot and the two connectors... assuming that's what you meant (it's what was said, anyway...)

It's high, but as long as they can cool it, I agree it shouldn't be an issue. I haven't heard anything about a dual consumer-level Vega card. It looks like there's a FirePro/Radeon Pro dual Vega, but no consumer version so far. Has anyone heard of a consumer dual Vega in development?
They are cooling it with a 120mm AIO. Most would agree those are good for about 100W... and we have up to 375W, stock, to cool with it. Hell, they used a 120mm AIO on the 295X2 at 500W... and it worked... but man, that thing got hot! Sorry... 375W on water, 300W on air...


@RejZoR - I've never heard of a GPU using core power for its power rating... got a link to that being used??? In fact, I've never heard of the term...





For all - board power includes everything, AFAIK. It doesn't matter if it's HBM or not. Board power is the power use of the whole card, not a 'part' of it. And while TDP isn't power used, it is very close, and most would consider it a good ballpark for actual power draw.

YES, but what is the PERFORMANCE PER WATT?! That's the most critical information, and it's missing. Without it, how do you compare?

The article should have been titled: >>Vega available with 250W and 300W of power!<< THAT'S IT!
Well, it needs to be 20% faster than a 1080 Ti to take that away from it... considering we know the 1080 Ti's board power is 250W... this one is 300W, 20% more. Problem is, it's predicted to be as fast, not 20% faster... and that's not even counting the 375W part...
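
To put numbers on that 20% figure, here's a quick sketch using the board-power values mentioned above; the relative performance numbers are placeholders, since Vega's actual performance is still only rumored:

```python
# Rough perf/W comparison using the board-power figures discussed above.
# Relative performance numbers are placeholders -- Vega's performance is still a rumor.

def perf_per_watt(relative_perf: float, board_power_w: float) -> float:
    """Performance per watt for a card with the given relative performance."""
    return relative_perf / board_power_w

GTX_1080TI_POWER = 250  # W, known board power
RX_VEGA_POWER    = 300  # W, rumored air-cooled board power

# If Vega merely matches a 1080 Ti (relative perf = 1.0), it trails on perf/W:
print(round(perf_per_watt(1.0, RX_VEGA_POWER) / perf_per_watt(1.0, GTX_1080TI_POWER), 3))  # 0.833

# To reach perf/W parity it would have to be 300/250 = 1.2x (i.e. 20%) faster:
print(round(perf_per_watt(1.2, RX_VEGA_POWER) / perf_per_watt(1.0, GTX_1080TI_POWER), 3))  # 1.0
```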


Not a big deal.
I would never go below a 1kW PSU because I want to stay far away from any power limitation.
Nowadays you can find cheap, quality 1kW PSUs, and they don't need to be big.
There are models with a very small PSU case, such as the EVGA 1000 GX.
mATX motherboard + Intel 6-10 cores.

This could only be a problem for guys who thought 700W was enough for everything, kept an Intel X-chipset build, and now it's time for a GPU upgrade.
Even then, 750W should be enough, but the fan will work like crazy.
No. Wow...no.

So just how much damn power does this thing need...
From the first paragraph... yes, it's TDP, but again, it's plenty close enough!
air-cooled version has its TDP rated at 300W, and the faster liquid-cooled variant 375W. This is way above the 275W TDP of the TITAN Xp, NVIDIA's fastest client-segment graphics card.
 
Last edited:
Joined
Sep 17, 2014
Messages
22,684 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I won't say 'told you so'

But... told you so. This is just Fury X v2, pointless HBM that is way too fast for the core it is coupled with in a desperate attempt to leave more TDP budget for a crapload of GCN cores that still date back to 2012.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.79/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
But... told you so. This is just Fury X v2, pointless HBM that is way too fast for the core it is coupled with in a desperate attempt to leave more TDP budget for a crapload of GCN cores that still date back to 2012.
The HBM memory controller is a lot smaller than the GDDR5 one, even more so if you're comparing against the 512-bit wide bus on the 290(X) and 390(X) cards. 4096 shaders altogether is a lot of compute, which is going to create a lot of heat, which will self-limit how fast the chip can go. I don't think HBM is the problem. I think the problem is that AMD has to find a better way to spread those GCN cores out so all the heat being generated isn't so concentrated; otherwise you have to resort to more innovative ways to cool the chip. The issue isn't so much that it produces a lot of heat, but rather that the circuit gets hot faster than the heat can move through the GPU material between it and the heat sink. Even with the same temperature gradient, it takes more time for heat to move a larger distance, and as circuits get smaller, heat gets concentrated in a smaller area, which drives up the temperature of the circuit but not necessarily the rate at which heat can be removed from the system.

Think of it this way: if you have a square, 1 meter by 1 meter, producing some amount of heat evenly across its surface, and you cut each side in half (0.5m x 0.5m) while the heat produced stays the same, that smaller surface will be hotter, even though it's producing the same amount of heat. What changed is that the rate of heat dissipation is not constant, because the available area to conduct heat has shrunk, so a larger temperature difference is required to make the heat flow at the same rate. As a result, circuits get hotter the smaller you make them, and when you try to cool them the same way, you end up with higher circuit temperatures to bring everything back to equilibrium, which ultimately becomes the limiting factor in how fast they can operate.

tl;dr: Too much heat in too small of an area is to blame, not HBM.
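
For anyone who wants the area argument as rough numbers, here's a minimal sketch; the power, conductivity, and thickness values are made up purely for illustration and are not Vega's actual figures:

```python
# Back-of-the-envelope illustration of the "same heat, smaller area" argument.
# All numbers are made up for illustration; they are not Vega's actual figures.

def heat_flux(power_w: float, area_m2: float) -> float:
    """Heat flux density (W/m^2) when power_w is dissipated over area_m2."""
    return power_w / area_m2

def delta_t_required(power_w: float, area_m2: float,
                     conductivity_w_mk: float, thickness_m: float) -> float:
    """Temperature difference needed to push power_w through a slab of the
    given area, thermal conductivity, and thickness (Fourier's law)."""
    return power_w * thickness_m / (conductivity_w_mk * area_m2)

POWER = 300.0          # watts of heat, held constant
K_SILICON = 150.0      # W/(m*K), rough value for silicon
THICKNESS = 0.0005     # 0.5 mm of material between circuit and heat sink

for area in (1.0, 0.25):   # 1m x 1m square vs. the same square with each side halved
    print(f"area={area} m^2  flux={heat_flux(POWER, area):7.1f} W/m^2  "
          f"dT needed={delta_t_required(POWER, area, K_SILICON, THICKNESS):.4f} K")
# Quartering the area quadruples the flux and the temperature difference
# needed to move the same heat -- the circuit itself has to run hotter.
```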
 
Joined
Oct 4, 2013
Messages
58 (0.01/day)
Power use and power available are two different things... true... but how is that relevant here? Power available is just found by adding up the PCIe slot and the two connectors... assuming that's what you meant (it's what was said, anyway...)

They are cooling it with a 120mm AIO. Most would agree those are good for about 100W... and we have up to 375W, stock, to cool with it. Hell, they used a 120mm AIO on the 295X2 at 500W... and it worked... but man, that thing got hot! Sorry... 375W on water, 300W on air...


@RejZoR - I've never heard of a GPU using core power for its power rating... got a link to that being used??? In fact, I've never heard of the term...





For all - board power includes everything, AFAIK. It doesn't matter if it's HBM or not. Board power is the power use of the whole card, not a 'part' of it. And while TDP isn't power used, it is very close, and most would consider it a good ballpark for actual power draw.

Well, it needs to be 20% faster than a 1080 Ti to take that away from it... considering we know the 1080 Ti's board power is 250W... this one is 300W, 20% more. Problem is, it's predicted to be as fast, not 20% faster... and that's not even counting the 375W part...


No. Wow...no.

From the first paragraph... yes, it's TDP, but again, it's plenty close enough!

We have no information regarding Vega's performance as of yet. It's all rumored...

As for me, the articles just indicate that Vega will be available with up to 250W and 300W of power usage. Nothing more.
 
Joined
Dec 31, 2009
Messages
19,372 (3.54/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
We have no information regarding Vega's performance as of yet. It's all rumored...

As for me, the articles just indicate that Vega will be available with up to 250W and 300W of power usage. Nothing more.
Correct, it's all rumor. My reply to you was based on a 1080 Ti, which we do know how it performs. I simply stated that performance/W would need to beat something KNOWN, that 1080 Ti I mentioned.

The first post follows another rumor, from Tech Report, showing 300W/375W.

The company already put out specifications of the first consumer product based on this silicon, the Radeon Pro Vega Frontier Edition; and according to listings by online retailers, its power figures aren't looking good. The air-cooled version has its TDP rated at 300W, and the faster liquid-cooled variant 375W.

http://techreport.com/news/32112/rumor-vega-frontier-edition-board-power-ratings-surface
 
Joined
Aug 20, 2007
Messages
21,546 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 5800X Optane 800GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
At least god damn miners won't take all of them immediately.

They will if they perform well at mining for said wattage...
 

crazyeyesreaper

Not a Moderator
Staff member
Joined
Mar 25, 2009
Messages
9,817 (1.71/day)
Location
04578
System Name Old reliable
Processor Intel 8700K @ 4.8 GHz
Motherboard MSI Z370 Gaming Pro Carbon AC
Cooling Custom Water
Memory 32 GB Crucial Ballistix 3666 MHz
Video Card(s) MSI RTX 3080 10GB Suprim X
Storage 3x SSDs 2x HDDs
Display(s) ASUS VG27AQL1A x2 2560x1440 8bit IPS
Case Thermaltake Core P3 TG
Audio Device(s) Samson Meteor Mic / Generic 2.1 / KRK KNS 6400 headset
Power Supply Zalman EBT-1000
Mouse Mionix NAOS 7000
Keyboard Mionix
No one gives a fuck about power usage in reality. It's a BS metric we use to see who can piss farther. What will matter is end-all-be-all performance. If the card is affordable and performs on par with a 1080 Ti, people will buy it. Power usage does not even form a blip on my radar. With a high-end GPU, expect high-end power consumption. All that matters is performance per $ for a GPU.

Example: AMD Vega comes out at the same price as a 1080 Ti; if performance is similar, that's fine, and if it uses more wattage, that is also fine. What I would like to see is minimum frame rate. If the Vega card can sustain a higher frame rate with fewer performance dips thanks to its HBC setup, then it's a worthwhile offering even with the higher consumption. My gut tells me it's too little, too late.

But power usage does not compute, because most systems sit idle or off the majority of the time. The higher power usage means diddly squat. For the sake of argument:

375W vs 250W at 10 cents per kWh works out to $27 vs $18 per month to run said GPUs at maximum performance 24/7. Cut that in half, so 12 hours a day, and it's $13.50 vs $9 per month. None of us play games for 12 hours a damn day. So let's say 6 hours; still a bit much, but a more believable amount: $6.75 vs $4.50 per month. 3 hours per day of gaming on a 375W Vega vs a 250W 1080 Ti ends up being $3.38 vs $2.25.

Efficiency aside, that's a $1-a-damn-month difference. That's what 125W between two GPUs realistically works out to. So unless you live in a third-world country, where you probably can't get a Vega or a 1080 Ti anyway, the wattage gap between GPUs is basically just BS unless you're running the cards 24/7 full bore, something 99% of gamers will not do.

Keep in mind I used $0.10 per kWh; my local rate is $0.07, so even cheaper, lol. I am pretty sure you can save more money driving a more efficient car, buying cheaper shit tickets, or taking shorter showers than by picking a GPU based on power usage.
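
If anyone wants to plug in their own rate and gaming hours, here's a minimal sketch of that math; the wattages and the $0.10/kWh rate are just the example figures from this post, not measurements:

```python
# Minimal sketch of the electricity-cost math above.
# Wattages, hours, and the rate are the example figures from this post.

def monthly_cost(watts: float, hours_per_day: float, rate_per_kwh: float,
                 days: int = 30) -> float:
    """Dollars to run a load of `watts` for `hours_per_day` over `days` days."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * rate_per_kwh

RATE = 0.10  # $/kWh (local rates may be lower, e.g. $0.07)

for hours in (24, 12, 6, 3):
    vega = monthly_cost(375, hours, RATE)    # rumored liquid-cooled Vega board power
    ti   = monthly_cost(250, hours, RATE)    # GTX 1080 Ti board power
    print(f"{hours:>2} h/day: ${vega:.2f} vs ${ti:.2f}")
# 24 h/day: $27.00 vs $18.00 ... 3 h/day: $3.38 vs $2.25
```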
 