
NVIDIA Allegedly Testing a 900 Watt TGP Ada Lovelace AD102 GPU

Joined
Dec 12, 2020
Messages
1,755 (1.20/day)
That picture is of an Nvidia H100 CNX DPU: https://www.nvidia.com/en-gb/data-center/h100cnx/
Why does that HPC GPU have on-board 200Gb and 400Gb Ethernet ports?

Never mind, the marketing material has an answer for that:

The H100 CNX alleviates this problem. With a dedicated path from the network to the GPU, it allows GPUDirect® RDMA to operate at near line speeds. The data transfer also occurs at PCIe Gen5 speeds regardless of host PCIe backplane. Scaling up GPU power in a host can be done in a balanced manner, since the ideal GPU-to-NIC ratio is achieved. A server can also be equipped with more acceleration power, because converged accelerators require fewer PCIe lanes and device slots than discrete cards.
 
Joined
Dec 28, 2012
Messages
3,944 (0.90/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
What you said makes perfect sense, but why did AMD abandon HBM for their consumer cards?
Simple: their two HBM card lines, the Fury and Vega, lost to their respective Nvidia cards, the GDDR5-powered 980 Ti and the GDDR5X-powered 1080 Ti/1080 respectively. Neither card was powerful enough for HBM to help in any way.
And while the bandwidth on HBM is incredible relative to any GDDRx flavour what is the latency like?
The latency is high, but GDDR memory tends to have high latency anyway. GPUs are more about pure bandwidth for gaming applications, typically. IIRC HBM3 latency is comparable to GDDR6X.
 
Joined
Dec 12, 2020
Messages
1,755 (1.20/day)
@TheinsanegamerN
Do you think AMD will give HBM a shot again? They have GPUs competitive with Team Green now; maybe a modern GPU using HBM would push them over the top and give them a clear-cut lead in 4K-8K gaming/mining performance?

Maybe you could vent the 900 watts of heat into a clothes dryer, a central furnace heating system, or a water heater?
 
Joined
Oct 12, 2005
Messages
710 (0.10/day)
I certainly won't buy a 900W card but i don't see how it's an issue for residential power. A single socket can deliver 3520W here, even with the whole computer and screens you will still have no issue. You may eventually need a higher subscription than the medium 6kva i have if you intend to cook and heat at the same time that you are playing but that's pretty much it. I don't think that the kind of people who would be interested in this card would mind the price difference.
In Europe maybe, but in North America, household sockets are on 120 V and most of the time have a maximum rating of 1800 W for a 15 amp circuit. Note that it's not every single socket that can deliver either 3500 W or 1800 W; it's each circuit. Depending on how your house is wired, you can have multiple things on the same circuit. It's not uncommon to have multiple household sockets on the same circuit. It's even worse for older houses.

So let's say you have a 900 W GPU, a 170 W+ CPU (I doubt you would run that card on a low-end CPU), plus 30 W for the rest of the system; you are at 1100 W. Power supplies aren't 100% efficient, so you can add another 100 W of loss on top of that. Then you have monitors; some can easily use more than 50 W each. If you have more than one, you start to get closer to the circuit breaker's limit.

And you haven't plugged anything else in. Normally, for kitchens, they design it so that each socket has its own circuit; if that trend continues, we might have to design home offices to get the same treatment.

But before we get there, we need to consider how much heat those kitchen appliances output. I mean, that PC would have more heat output than the heating system of the room I would put it in.
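The circuit math in the post above can be sketched in a few lines. This is just a back-of-the-envelope check using the post's own example figures; the 90% PSU efficiency is my assumption (the post simply adds a flat 100 W of loss, which works out to roughly the same thing):

```python
# Rough circuit-load check for the 900 W GPU scenario described above.
# All wattages are the post's example figures, not measured values.

BREAKER_LIMIT_W = 15 * 120  # 15 A at 120 V (typical North American circuit) = 1800 W

def system_wall_draw(gpu_w, cpu_w, rest_w, psu_efficiency=0.90):
    """DC component load divided by PSU efficiency gives draw at the wall."""
    return (gpu_w + cpu_w + rest_w) / psu_efficiency

pc = system_wall_draw(gpu_w=900, cpu_w=170, rest_w=30)  # ~1222 W at the wall
monitors = 2 * 50                                       # two 50 W monitors
total = pc + monitors

print(f"PC at the wall:   {pc:.0f} W")
print(f"Total on circuit: {total:.0f} W of {BREAKER_LIMIT_W} W")
print(f"Headroom:         {BREAKER_LIMIT_W - total:.0f} W")
```

With a couple of monitors you are already using about three quarters of the circuit, so sharing that breaker with any other appliance gets risky fast.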

I wonder if Ferrari considered that consumers have to pay for gasoline. I could not justify paying for an additional 10 gallons of gas.

I wonder if pool companies considered that consumers have to pay for water. I could not justify paying for an additional 100 gallons of service.

Etc., etc. If the power draw of a GPU means your power bill will be a concern, you shouldn't be buying a likely $2500+ video card. Compared to a 400 W GPU like the 3090, it's literally a $5-6 difference over an entire YEAR unless you are playing 8+ hours a day, in which case your streaming will more than make up the difference.
It all depends on where you live.

Let's say you are not a hardcore gamer, but you still play 4 hours per day on average (a bit more on the weekend, a bit less during the week, etc.), something normal for someone spending that much on equipment. If you live in an area where electricity is around $0.20 per kWh, it would cost you around $146 per year more.

Where I live, it's still almost 60 bucks. I mean, it's not that much, but way more than a $5-6 difference.

And it depends: some areas have peak-hour rates that are much higher, and some have different tariffs based on consumption.
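The $146 figure above can be reproduced directly. A minimal sketch using the post's own numbers (500 W of extra draw versus a 400 W card, 4 hours a day, $0.20/kWh):

```python
# Annual cost of the extra power draw: watts -> kWh/year -> dollars.

def annual_cost_usd(extra_watts, hours_per_day, price_per_kwh):
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

cost = annual_cost_usd(extra_watts=500, hours_per_day=4, price_per_kwh=0.20)
print(f"~${cost:.0f} per year")  # ~$146, matching the estimate above
```

At a cheap $0.04/kWh rate the same formula gives about $29/year, which is how the "$5-6" and "$146" claims can both be roughly right depending on where you live.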

But still, my main concern would be the additional 500 W of power it would dump into my room. It would be the equivalent of having a second computer running at maximum power next to this one while I game.
 
Joined
Mar 10, 2010
Messages
11,878 (2.20/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
This should be illegal.
I think using that much wattage to game is a bit mental, but it's worth remembering that 1 kW air heaters are also quite popular.

Actually not that dissimilar, besides it could play Crysis.
 
Joined
Dec 12, 2020
Messages
1,755 (1.20/day)
48 GiB of VRAM wouldn't be useful in any gaming context, even 8K, right?

Would there be any advantage to crypto-miners in having a 48 GiB VRAM footprint?
 
Joined
Jul 16, 2014
Messages
8,216 (2.16/day)
Location
SE Michigan
System Name Dumbass
Processor AMD Ryzen 7800X3D
Motherboard ASUS TUF gaming B650
Cooling Artic Liquid Freezer 2 - 420mm
Memory G.Skill Sniper 32gb DDR5 6000
Video Card(s) GreenTeam 4070 ti super 16gb
Storage Samsung EVO 500gb & 1Tb, 2tb HDD, 500gb WD Black
Display(s) 1x Nixeus NX_EDG27, 2x Dell S2440L (16:9)
Case Phanteks Enthoo Primo w/8 140mm SP Fans
Audio Device(s) onboard (realtek?) - SPKRS:Logitech Z623 200w 2.1
Power Supply Corsair HX1000i
Mouse Steeseries Esports Wireless
Keyboard Corsair K100
Software windows 10 H
Benchmark Scores https://i.imgur.com/aoz3vWY.jpg?2
I wonder if Ferrari considered that consumers have to pay for gasoline. I could not justify paying for an additional 10 gallons of gas.

I wonder if pool companies considered that consumers have to pay for water. I could not justify paying for an additional 100 gallons of service.

Etc., etc. If the power draw of a GPU means your power bill will be a concern, you shouldn't be buying a likely $2500+ video card. Compared to a 400 W GPU like the 3090, it's literally a $5-6 difference over an entire YEAR unless you are playing 8+ hours a day, in which case your streaming will more than make up the difference.
I think the math might be off here, going by the wiki example (because I have to use a source to prove my point):

An electric heater consuming 1000 watts (1 kilowatt), and operating for one hour uses one kilowatt-hour of energy.
So that's 1 kWh per hour; 4 hours costs 80 cents (someone else's figure), times 365, is $292, or $24 per month. Of course your current system usage is not figured in. So let's assume you use 450 W currently; that works out to roughly a $12 per month increase. Not a huge sum for 4 hours of gameplay. Yeah, I'd be OK with that, but I don't usually game for just 4 hours. Being retired means I have more time to game if I wanted to.

EDIT: I do know that after 17 kWh per day, I incur an additional rate on top of the base rate. Rates also change for peak hours.


The way inflation is increasing, by the time a 900 watt card is bought, electricity will likely cost much more. The Ferrari owner will be paying $8 a gallon by the time this card comes out, but when he bought the car it was $3; $50 might mean something to some people.
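The kWh arithmetic in the post above can be spelled out explicitly. A minimal sketch, assuming the implied $0.20/kWh rate (the quoted 80 cents for 4 kWh) and the post's 450 W baseline system:

```python
# Monthly electricity cost for a given sustained draw and daily play time.

PRICE_PER_KWH = 0.20  # implied by the quoted 80-cents-for-4-kWh figure

def monthly_cost(watts, hours_per_day, price_per_kwh=PRICE_PER_KWH):
    kwh_per_day = watts / 1000 * hours_per_day
    return kwh_per_day * price_per_kwh * 365 / 12

new = monthly_cost(1000, 4)  # ~$24.33/month, i.e. ~$292/year, as above
old = monthly_cost(450, 4)   # ~$10.95/month for the current 450 W system
print(f"increase: ${new - old:.2f}/month")  # ~$13, close to the $12 above
```

The small gap between this ~$13 and the post's $12 comes down to rounding in the original estimate.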
 
Joined
Dec 26, 2006
Messages
3,859 (0.59/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
For some reason I read that as Over 9000!!!
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,537 (6.67/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
FX5000 series and Fermi 4.0?
 
Joined
Apr 6, 2011
Messages
703 (0.14/day)
Location
Pensacola, FL, USA, Earth
Anyone else got a pocket nuclear power pack on order? :roll:
 
Joined
Dec 5, 2017
Messages
157 (0.06/day)
There are plenty of played out jokes to be made about this. GTX 480 looks like an icebox now.

I'd just like to caution that testing a 900 W card doesn't mean the final product will take 900 W. Especially knowing what we know about this chip (that it uses a Samsung node and the die size is near the reticle limit), you're way up the (presumably terrible) V/F curve there. I would expect the final product to slot in between 500-600 W. Still unacceptable for most, but a lot more reasonable and practical. I personally don't have much issue with it as long as the performance justifies it and efficiency still improves. Back in the day, people who wanted the best ran 2 or 3 GPUs at 200-300 W+ each in CFX/SLI. Now that's dead and you get your half-kilowatt, $1500 monsters instead; I have more of a problem with the price, honestly.

I certainly won't buy a 900W card but i don't see how it's an issue for residential power. A single socket can deliver 3520W here, even with the whole computer and screens you will still have no issue. You may eventually need a higher subscription than the medium 6kva i have if you intend to cook and heat at the same time that you are playing but that's pretty much it. I don't think that the kind of people who would be interested in this card would mind the price difference.
Many people in the US would probably encounter issues. I get the impression most homes/buildings haven't had electrical work done since the 1970s. Standard outlets here deliver a maximum of 1800W, which is enough, but definitely cutting it close for a full system with such a card. But if you ran the PC and another appliance on the same fuse, it'd blow nonetheless... it's not just the individual outlet that needs to be considered.
 
Joined
Nov 15, 2020
Messages
929 (0.62/day)
System Name 1. Glasshouse 2. Odin OneEye
Processor 1. Ryzen 9 5900X (manual PBO) 2. Ryzen 9 7900X
Motherboard 1. MSI x570 Tomahawk wifi 2. Gigabyte Aorus Extreme 670E
Cooling 1. Noctua NH D15 Chromax Black 2. Custom Loop 3x360mm (60mm) rads & T30 fans/Aquacomputer NEXT w/b
Memory 1. G Skill Neo 16GBx4 (3600MHz 16/16/16/36) 2. Kingston Fury 16GBx2 DDR5 CL36
Video Card(s) 1. Asus Strix Vega 64 2. Powercolor Liquid Devil 7900XTX
Storage 1. Corsair Force MP600 (1TB) & Sabrent Rocket 4 (2TB) 2. Kingston 3000 (1TB) and Hynix p41 (2TB)
Display(s) 1. Samsung U28E590 10bit 4K@60Hz 2. LG C2 42 inch 10bit 4K@120Hz
Case 1. Corsair Crystal 570X White 2. Cooler Master HAF 700 EVO
Audio Device(s) 1. Creative Speakers 2. Built in LG monitor speakers
Power Supply 1. Corsair RM850x 2. Superflower Titanium 1600W
Mouse 1. Microsoft IntelliMouse Pro (grey) 2. Microsoft IntelliMouse Pro (black)
Keyboard Leopold High End Mechanical
Software Windows 11
Granted this is all rumour but I do resent generally that such a product would demand so much from a system. Essentially a company making such a product is requiring further sums to be spent cooling and powering the thing. It better be worth it!
 
Joined
Aug 20, 2007
Messages
21,531 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
It's the drawback of going with one big monolithic chip design. AMD on the other hand opts for MCM-based GPUs: multiple smaller dies combined as one.
Monolithic would technically be more efficient power-wise than a chiplet design of equal performance. The thing chiplet design helps with is cost and yields, not power consumption...

Anyways as the owner of an RTX 3090ti I seriously question how this will even work in a PC case...
 
Joined
Jun 2, 2017
Messages
9,354 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
I think using that much wattage to game is a bit mental, but it's worth remembering that 1 kW air heaters are also quite popular.

Actually not that dissimilar, besides it could play Crysis.
A bit?

If you need 4 slots for 450 watts, how big would a 900 watt card be? Could any shroud handle that kind of size?
 

Count von Schwalbe

Nocturnus Moderatus
Staff member
Joined
Nov 15, 2021
Messages
3,160 (2.80/day)
Location
Knoxville, TN, USA
System Name Work Computer | Unfinished Computer
Processor Core i7-6700 | Ryzen 5 5600X
Motherboard Dell Q170 | Gigabyte Aorus Elite Wi-Fi
Cooling A fan? | Truly Custom Loop
Memory 4x4GB Crucial 2133 C17 | 4x8GB Corsair Vengeance RGB 3600 C26
Video Card(s) Dell Radeon R7 450 | RTX 2080 Ti FE
Storage Crucial BX500 2TB | TBD
Display(s) 3x LG QHD 32" GSM5B96 | TBD
Case Dell | Heavily Modified Phanteks P400
Power Supply Dell TFX Non-standard | EVGA BQ 650W
Mouse Monster No-Name $7 Gaming Mouse| TBD
The solution is simple... Only game in winter, and use an open air chassis in your ductwork. I read someone did that with cryptomining (Bitcoin, hold the pitchforks) and netted a profit, even though the cost of electricity was higher than the Bitcoin earned.

Seriously though, to cool the thing must take a custom loop and an outdoor chiller.
 
Joined
Dec 12, 2020
Messages
1,755 (1.20/day)
After reading replies here I'd be more interested in seeing how they're planning on cooling that monstrosity than the actual hardware itself (which will be priced wwwwwwwwwwwaaaaaaaaayyyyyyyyyyyyyyyy out of my price range anyway).

Could they use a thermosiphon to cool it? Or maybe some new NASA tech like the oscillating heatpipe?
 
Joined
Feb 23, 2008
Messages
1,064 (0.17/day)
Location
Montreal
System Name Aryzen / Sairikiki / Tesseract
Processor 5800x / i7 920@3.73 / 5800x
Motherboard Steel Legend B450M / GB EX58-UDP4 / Steel Legend B550M
Cooling Mugen 5 / Pure Rock / Glacier One 240
Memory Corsair Something 16 / Corsair Something 12 / G.Skill 32
Video Card(s) AMD 6800XT / AMD 6750XT / Sapphire 7800XT
Storage Way too many drives...
Display(s) LG 332GP850-B / Sony w800b / Sony X90J
Case EVOLV X / Carbide 540 / Carbide 280x
Audio Device(s) SB ZxR + GSP 500 / board / Denon X1700h + ELAC Uni-Fi 2 + Senn 6XX
Power Supply Seasonic PRIME GX-750 / Corsair HX750 / Seasonic Focus PX-650
Mouse G700 / none / G602
Keyboard G910
Software w11 64
Benchmark Scores I don't play benchmarks...
Are we sure that's enough watts? At this point, might as well round it up to 1000.
 
Joined
Apr 12, 2013
Messages
7,563 (1.77/day)
Monolithic would technically be more efficient power wise than a chiplet design of equal performance. The thing chiplet design helps with is cost and yields not power consumption
Not necessarily; it's mostly about clocks & inter-chip communication. This is why Zen 3 is still more efficient than ADL in a lot of tasks. Yes, if 12xxx chips are clocked lower they do match or beat AMD, but that's with DDR5 & a slight node advantage; Zen 4 would likely tip that firmly to AMD's side once it releases. This of course doesn't apply to GPUs, because we have no working MCM models in the mainstream as yet.
 
Joined
Jun 5, 2021
Messages
284 (0.22/day)
Legislation must be made to ban client GPUs that go over 500 watts. Nvidia is panicking; they know RDNA 3 will crush Lovelace.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.61/day)
Location
Ex-usa | slava the trolls
Legislation must be made to ban client GPUs that go over 500 watts. Nvidia is panicking; they know RDNA 3 will crush Lovelace.

Yes.

Certain prebuilt Alienware gaming PCs can no longer legally be sold in half a dozen US states due to recently passed power consumption laws.

As reported first by The Register (spotted by Vice), some of Dell's Alienware Aurora R10 and Aurora R12 gaming PCs are no longer available for sale in California, Colorado, Hawaii, Oregon, Vermont, or Washington. Heading over to Dell's website and looking to purchase certain configurations will display a warning message to buyers, indicating that it will not be shipped to those states due to power consumption regulations that have been adopted there. Dell notes that any orders slated to ship to those states will now be canceled.
US States Ban Certain Alienware PC Sales Because They Use Too Much Power - IGN
 
Joined
Aug 20, 2007
Messages
21,531 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
Legislation must be made to ban client gpu's that go over 500watts
As much as I hate the idea of greater-than-500 W GPUs, this will never happen, and honestly shouldn't. It's not the state's place to tell end users what electronic wattages are acceptable; otherwise we'd end up banning all sorts of kitchen appliances.

this is why zen3 is still more efficient than ADL in a lot of tasks
No, that's completely down to the design choices of the cores, not a chiplet vs monolithic problem.
 
Joined
Aug 20, 2007
Messages
21,531 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
That ban is based on compute efficiency though, not wattage. And honestly, it's a really dumb piece of legislation in its own right. It uses a lot of antiquated metrics for its required per-watt efficiency requirement (or at least Washington's does).


Won't they just drive out of state to buy it? Transporting it back across state lines in the car is no problem, is it?
The idea is to pressure manufacturers to get more efficient so they can use one design everywhere. But as I said, that ban is not rooted in raw wattage.
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.36/day)
Nobody is panicking or crushing anything just yet.
You can glean this from the rumour that the 4080 is based on AD103, as opposed to the 3080, which was carved out of GA102 at the last minute to be able to compete with the 6800 XT at 4K. A full GA103 would have been 5-10% slower, so it got canceled. And usually the 4070 would contain exactly 7680 cores, as the 3080 was originally supposed to, just as the 3070 carries the 2080's 2944 cores doubled to 5888.

So Nvidia must be pretty confident that a narrow 192-bit 7800 XT is an AD103 counterpart. For all we know, AMD may be counting GPU cores differently now, so a 7680-core part is actually shown as 15360 and performs like 10240.

And the beefy GPU is being developed just for the hell of it, just because they can. The sky is the limit.
 
Joined
Apr 12, 2013
Messages
7,563 (1.77/day)
No, that's completely down to the design choices of the cores, not a chiplet vs monolithic problem.
Design choice like?

Moving data through cores, caches, dies or chiplets is arguably the biggest power hog these days, and that's where AMD excels with IF; this is especially evident in processors with over 12-16 cores.
 