
NVIDIA Experiences Strong Cloud AI Demand but Faces Challenges in China, with High-End AI Server Shipments Expected to Be Below 4% in 2024

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
17,769 (2.42/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/yfsd9w
NVIDIA's most recent FY3Q24 financial reports reveal record-high revenue coming from its data center segment, driven by escalating demand for AI servers from major North American CSPs. However, TrendForce points out that recent US government sanctions targeting China have impacted NVIDIA's business in the region. Despite strong shipments of NVIDIA's high-end GPUs—and the rapid introduction of compliant products such as the H20, L20, and L2—Chinese cloud operators are still in the testing phase, making substantial revenue contributions to NVIDIA unlikely in Q4. Gradual shipment increases are expected from the first quarter of 2024.

The US ban continues to influence China's foundry market as Chinese CSPs' high-end AI server shipments potentially drop below 4% next year
TrendForce reports that North American CSPs like Microsoft, Google, and AWS will remain key drivers of high-end AI servers (including those with NVIDIA, AMD, or other high-end ASIC chips) from 2023 to 2024. Their estimated shipment shares are expected to be 24%, 18.6%, and 16.3%, respectively, in 2024. Chinese CSPs such as ByteDance, Baidu, Alibaba, and Tencent (BBAT) are projected to have a combined shipment share of approximately 6.3% in 2023. However, this could decrease to less than 4% in 2024, considering the current and potential future impacts of the ban.




China to expand investment in proprietary ASICs and develop general-purpose AI chips due to limited high-end AI chip demand
Facing the risk of expanded restrictions arising from the US ban, TrendForce believes Chinese companies will continue to buy existing AI chips in the short term. NVIDIA's GPU AI accelerator chips remain a top priority—including existing A800 or H800 inventories and new models like the H20, L20, and L2, designed specifically for the Chinese market following the ban. In the long term, Chinese CSPs are expected to accelerate the development of in-house ASICs, with Alibaba's T-Head and Baidu being particularly active in this area, relying on foundries like TSMC and Samsung for production.

At the same time, major Chinese AI firms, such as Huawei and Biren, will continue to develop general-purpose AI chips to provide AI solutions for local businesses. Beyond developing AI chips, these companies aim to establish a domestic AI server ecosystem in China. TrendForce recognizes that a key factor in achieving success will come from the support of the Chinese government through localized projects, such as those involving Chinese telecom operators, which encourage the adoption of domestic AI chips.

Edge AI servers: A potential opportunity for Chinese firms amid high-end AI chip development constraints
A notable challenge in developing high-end chips in China is the limited access to advanced manufacturing technology. This is particularly true for Huawei, which remains on the US Entity List and relies on domestic foundries like SMIC for production. Despite SMIC's advancements, it faces similar issues created by the US ban—including difficulties in obtaining key advanced manufacturing equipment and potential yield issues. TrendForce believes that in trying to overcome these limitations, China may find opportunities in the mid- to low-range edge AI server market. These servers, with lower AI computational demands, cater to applications like commercial chatbots, video streaming, internet platforms, and automotive assistance systems. They might not be fully covered by US restrictions, presenting a possible growth direction for Chinese firms in the AI market.

View at TechPowerUp Main Site | Source
 
Joined
Jan 14, 2023
Messages
842 (1.19/day)
System Name Asus G16
Processor i9 13980HX
Motherboard Asus motherboard
Cooling 2 fans
Memory 32gb 4800mhz
Video Card(s) 4080 laptop
Storage 16tb, x2 8tb SSD
Display(s) QHD+ 16in 16:10 (2560x1600, WQXGA) 240hz
Power Supply 330w psu
To the moon on their stock price.
Really, their stock price has been like a rocket this year. I'm happy I got in early enough, but I won't be adding to my position and I'm not selling.

I wonder what the rumor going around that OpenAI made a breakthrough on artificial general intelligence might mean for Nvidia.
 
Joined
Dec 29, 2010
Messages
3,809 (0.75/day)
Processor AMD 5900x
Motherboard Asus x570 Strix-E
Cooling Hardware Labs
Memory G.Skill 4000c17 2x16gb
Video Card(s) RTX 3090
Storage Sabrent
Display(s) Samsung G9
Case Phanteks 719
Audio Device(s) Fiio K5 Pro
Power Supply EVGA 1000 P2
Mouse Logitech G600
Keyboard Corsair K95
I wonder what the rumor going around that OpenAI made a breakthrough on artificial general intelligence might mean for Nvidia.
All the major players, i.e. Google, MSFT/OpenAI, Amazon, and Tesla, are moving or have moved to custom silicon.
 
Joined
Oct 6, 2021
Messages
1,605 (1.37/day)
All the major players, i.e. Google, MSFT/OpenAI, Amazon, and Tesla, are moving or have moved to custom silicon.
Exactly. Running AI on ASICs will significantly improve efficiency, letting companies cut service operating costs. What's noteworthy is that, unlike in the CPU and GPU markets, there is no nearly insurmountable IP barrier, so major players are free to develop their own solutions.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
I love how certain people in all the NVIDIA/China threads are consistently making stupid claims that the Chinese ban is going to somehow destroy the company, and that NVIDIA is breaking the law to ship GPUs to circumvent this ban. Yet the actual numbers show that China makes up barely a fifteenth of all shipments of NVIDIA's ML-focused chips, and this will decrease by an incredible two percentage points in 2024.

Clearly NVIDIA will have to massively step up their imaginary lawbreaking in order to make up for this incredible shortfall! Also, and let me be completely clear here: you conspiracy theorists are complete and utter idiots, which is why you're broke and shitposting on tech forums, and NVIDIA shareholders aren't.

All the major players, i.e. Google, MSFT/OpenAI, Amazon, and Tesla, are moving or have moved to custom silicon.
LMAO, and where do you think they're going to fab that custom silicon? Where everyone else does, i.e. TSMC.

You know, the same TSMC that is massively backlogged with orders?
The company that NVIDIA has a contract with for a large percentage of wafer allocation, and all those other companies don't?
The company that isn't going to give time of day to anyone who isn't willing to commit to massive volumes, which the aforementioned players can't?

Or maybe they're going to use Intel... oh wait, they're outsourcing to TSMC too now.

Guess it's GloFo then, I'm sure that'll work just great!
 
Joined
Oct 6, 2021
Messages
1,605 (1.37/day)
I love how certain people in all the NVIDIA/China threads are consistently making stupid claims that the Chinese ban is going to somehow destroy the company, and that NVIDIA is breaking the law to ship GPUs to circumvent this ban. Yet the actual numbers show that China makes up barely a fifteenth of all shipments of NVIDIA's ML-focused chips, and this will decrease by an incredible two percentage points in 2024.

Clearly NVIDIA will have to massively step up their imaginary lawbreaking in order to make up for this incredible shortfall! Also, and let me be completely clear here: you conspiracy theorists are complete and utter idiots, which is why you're broke and shitposting on tech forums, and NVIDIA shareholders aren't.


LMAO, and where do you think they're going to fab that custom silicon? Where everyone else does, i.e. TSMC.

You know, the same TSMC that is massively backlogged with orders?
The company that NVIDIA has a contract with for a large percentage of wafer allocation, and all those other companies don't?
The company that isn't going to give time of day to anyone who isn't willing to commit to massive volumes, which the aforementioned players can't?

Or maybe they're going to use Intel... oh wait, they're outsourcing to TSMC too now.

Guess it's GloFo then, I'm sure that'll work just great

You get lost in your own arrogance by assuming a lot of things that are virtually impossible to know, whether it's about people's intellect or what Nvidia is doing behind the scenes. Sorry, but that's a stupid attitude that only a blind fanatic would have.

ASICs are so many times more efficient that you can manufacture them in a more mature and cheaper process whose capacity is not compromised. TSMC's capacity is also increasing not decreasing.
 
Joined
Mar 10, 2010
Messages
11,878 (2.20/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506

Or just use GPUs? Bad times for buyers, I'm guessing.

There's certainly enough evidence on show that the AI and ML market in China does matter to someone.

Let's hope the recent raids on Nvidia's offices found a company behaving as it should, not jumping through loopholes.

Because there have been a few coincidences lately:

Like Nvidia BIOSes being cracked.

Oh, the 4090/4080 are EOL soon, ready for the Supers.

Oh look, millions of 4090s are now getting taken apart in China.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
ASICs are so many times more efficient that you can manufacture them in a more mature and cheaper process whose capacity is not compromised.
Fair, but fabbing is only half the equation.

You have to design your custom ASICs first.
You have to design the software that they're going to use.
You have to write that software.
You have to migrate over all your current software, or write a translation layer that does so.

All of the above take time.
All of the above involve significant risk.

Tesla is the only company that has tried this and that's because they're run by a man rich enough to not give a s**t. Other companies would much rather stick with what they know and what works, and just buy more hardware. It's not the cost that's the problem, it's the supply, which leads to...

TSMC's capacity is also increasing not decreasing.
And they're going to offer that capacity to the companies that already have massive long-term contracts with them.

Companies like, I dunno, NVIDIA.

Because there have been a few coincidences lately:

Like Nvidia BIOSes being cracked.

Oh, the 4090/4080 are EOL soon, ready for the Supers.

Oh look, millions of 4090s are now getting taken apart in China.
You need help.
 
Joined
Oct 6, 2021
Messages
1,605 (1.37/day)
Fair, but fabbing is only half the equation.

You have to design your custom ASICs first.
You have to design the software that they're going to use.
You have to write that software.
You have to migrate over all your current software, or write a translation layer that does so.

All of the above take time.
All of the above involve significant risk.

Tesla is the only company that has tried this and that's because they're run by a man rich enough to not give a s**t. Other companies would much rather stick with what they know and what works, and just buy more hardware. It's not the cost that's the problem, it's the supply, which leads to...


And they're going to offer that capacity to the companies that already have massive long-term contracts with them.

Companies like, I dunno, NVIDIA.


You need help.
All large corporations (Meta, Google, Microsoft, Amazon) developing AI services are mostly or partly software companies. Is it so difficult to imagine that companies with not only the expertise to create the software ecosystem for their hardware but also the necessary money have been working on it quietly for some time?

The state of the AI market can be illustrated using Tesla as an example. A while back, the company invested in a market few believed in, until there was an explosion in demand. Tesla, being ahead of the curve, was able to set its prices freely. However, with increasing competition, the battle over pricing is putting pressure on profit margins, compelling companies to invest in maximizing efficiency. Simply swap Tesla for NVIDIA in this historical context and you will see the pattern repeating. Even Huang admitted the existence of this risk:

"Nvidia CEO Jensen Huang says his AI powerhouse is ‘always in peril’ despite a $1.1 trillion market cap: ‘We don’t have to pretend…we feel it’ "

Nvidia CEO says his AI powerhouse is ‘always in peril’ | Fortune
 
Joined
Apr 12, 2013
Messages
7,563 (1.77/day)
Tesla is the only company that has tried this and that's because they're run by a man rich enough to not give a s**t. Other companies would much rather stick with what they know and what works, and just buy more hardware. It's not the cost that's the problem, it's the supply, which leads to...
Nope, Tesla's not the only one ~ Amazon, Google, and now MS are moving to custom ARM chips. Granted, this is about AI, but their experience in developing (R&D) and then making these chips over several generations will help them everywhere, even if Nvidia is the market leader right now. Also, did you forget about Google's own TPU, now at v4 o_O
 
Joined
Dec 29, 2010
Messages
3,809 (0.75/day)
Processor AMD 5900x
Motherboard Asus x570 Strix-E
Cooling Hardware Labs
Memory G.Skill 4000c17 2x16gb
Video Card(s) RTX 3090
Storage Sabrent
Display(s) Samsung G9
Case Phanteks 719
Audio Device(s) Fiio K5 Pro
Power Supply EVGA 1000 P2
Mouse Logitech G600
Keyboard Corsair K95
Nope, Tesla's not the only one ~ Amazon, Google, and now MS are moving to custom ARM chips. Granted, this is about AI, but their experience in developing (R&D) and then making these chips over several generations will help them everywhere, even if Nvidia is the market leader right now. Also, did you forget about Google's own TPU, now at v4 o_O
Yea, Google has been on their iterations for a decade lmao. That poster is going on my ignore list; such arrogance is hilarious yet insulting.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Yea, Google has been on their iterations for a decade lmao. That poster is going on my ignore list; such arrogance is hilarious yet insulting.
Remind me how well Google's AI push is going.
 
Joined
Aug 20, 2007
Messages
21,541 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
ASICs are so many times more efficient
At tasks that are fixed and easily defined. AI is very much outside that category. I do not see any way an ASIC designed for AI would be much different than what a GPU ASIC honestly already is: tons of simple stream processors.
 
Joined
May 3, 2018
Messages
2,881 (1.19/day)
At tasks that are fixed and easily defined. AI is very much outside that category. I do not see any way an ASIC designed for AI would be much different than what a GPU ASIC honestly already is: tons of simple stream processors.
One company has already shown its custom AI solution using far less energy to do the same calculations as the H100, and doing it 7x faster or something like that. The energy requirements for AI are growing exponentially; it'll make Bitcoin's energy use look pathetic. Companies are aware of this. They are not just trying to cut hardware costs but overall operating costs. These custom ASICs are coming about precisely because of Huang's extreme greed and the egregious pricing of the H100/200 and GH200. With little competition, these scumbags charge what they like. Well, no surprise several companies have had enough.
 
Joined
Oct 6, 2021
Messages
1,605 (1.37/day)
At tasks that are fixed and easily defined. AI is very much outside that category. I do not see any way an ASIC designed for AI would be much different than what a GPU ASIC honestly already is: tons of simple stream processors.
Except AI isn't AI at all. There are several different applications within the spectrum of so-called AI, and some companies, including AMD/Xilinx, have already demonstrated solutions that are more efficient than GPUs in some cases.

Nvidia sells GPUs at a very high price, which also encourages companies to look for alternatives.

[Attached images: AI chips by country; AMD VCK5000 benchmark slides]
 
Joined
Aug 20, 2007
Messages
21,541 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
Except AI isn't AI at all.
I knew exactly what "AI" (as it's being marketed) is when I spoke: it's weighted tables used to make decisions. Same outcome.
One company has already shown its custom AI solution using far less energy to do the same calculations as the H100, and doing it 7x faster or something like that.
I'd really like to see that independently validated. You'll gain some from losing the display circuitry sure, but not that much. I'm calling bullshit until someone (other than the company itself) proves that stat.

some companies including AMD/Xilinx have already demonstrated solutions that are more efficient than GPUs in some cases.
More efficient I could buy, but not 7x. Like I said, losing the display circuitry and useless parts like ROPs alone is sure to gain you something, at least.
2x is feasible. Mind you, Nvidia could easily do the same.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
At tasks that are fixed and easily defined. AI is very much outside that category. I do not see any way an ASIC designed for AI would be much different than what a GPU ASIC honestly already is: tons of simple stream processors.
Oh thank god, finally someone who has a basic understanding of how this actually works.

Except AI isn't AI at all. There are several different applications within the spectrum of so-called AI, and some companies, including AMD/Xilinx, have already demonstrated solutions that are more efficient than GPUs in some cases.

Nvidia sells GPUs at a very high price, which also encourages companies to look for alternatives.

[Attached images: AI chips by country; AMD VCK5000 benchmark slides]
Quoting performance numbers from three years ago isn't the slam-dunk you seem to think it is.
 
Joined
Oct 6, 2021
Messages
1,605 (1.37/day)
Oh thank god, finally someone who has a basic understanding of how this actually works.


Quoting performance numbers from three years ago isn't the slam-dunk you seem to think it is.
I just mentioned an example; you know, all these companies are working on their hardware internally. As I said, AI has a huge scope, and ASICs will be molded to fit each of these scenarios.

There's a big incentive to cut operating costs like this when Nvidia sells GPUs costing tens of thousands of dollars each.

"In 2013, Google realized that unless they could create a chip that could handle machine learning inference, they would have to double the number of data centers they possessed. Google claims that the resulting TPU has “15–30X higher performance and 30–80X higher performance-per-watt” than current CPUs and GPUs."

"Although both TPUs and GPUs can do tensor operations, TPUs are better at big tensor operations, which are more common in neural network training than 3D graphics rendering. The TPU core of Google is made up of two parts: a Matrix Multiply Unit and a Vector Processing Unit. When it comes to the software layer, an optimizer is used to switch between bfloat16 and bfloat32 operations (where 16 and 32 are the number of bits) so that developers don't have to rewrite their code. As a result, the TPU systolic array architecture has a large density and power advantage, as well as a non-negligible speed advantage over a GPU."
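For what it's worth, the bfloat16 trade-off the quote describes (keep float32's 8-bit exponent, cut the mantissa from 23 to 7 bits) is easy to play with yourself. Here's a rough NumPy sketch that emulates bfloat16 by masking off the low 16 bits of a float32's bit pattern; it's illustrative only, not actual TPU code, and real hardware rounds rather than truncates:

```python
import numpy as np

def to_bfloat16(x):
    """Emulate bfloat16 by zeroing the low 16 bits of the float32 bit
    pattern: the 8-bit exponent survives intact (so dynamic range is
    preserved), only mantissa precision (23 -> 7 bits) is lost."""
    x = np.asarray(x, dtype=np.float32)
    bits = x.view(np.uint32)
    return (bits & np.uint32(0xFFFF0000)).view(np.float32)

# Compare a float32 matmul against one with bfloat16-quantized inputs
# (accumulation still in float32, as matrix units typically do).
rng = np.random.default_rng(0)
a = rng.standard_normal((64, 64)).astype(np.float32)
b = rng.standard_normal((64, 64)).astype(np.float32)

exact = a @ b
approx = to_bfloat16(a) @ to_bfloat16(b)

rel_err = np.abs(approx - exact).max() / np.abs(exact).max()
print(f"max relative error with bfloat16 inputs: {rel_err:.4f}")
```

The point of the exercise: powers of two and small dyadic values survive exactly, and the matmul error stays at the percent level, which is why models can often train on bfloat16 inputs without code changes.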

 
Joined
Mar 10, 2010
Messages
11,878 (2.20/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Oh thank god, finally someone who has a basic understanding of how this actually works.


Quoting performance numbers from three years ago isn't the slam-dunk you seem to think it is.
It was for Elon when he ditched Nvidia to make his own Dojo.
 
Joined
Aug 20, 2007
Messages
21,541 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
It was for Elon when he ditched Nvidia to make his own Dojo.
Elon is not really the slam-dunk genius many think, either. I'm sure he did it, mind you, but that does not mean it was a good idea or "worth it."

I stand by what I said: independent verification or I'm not buying anything more than double the efficiency.
 