
AMD Ryzen ThreadRipper is Capable of Running Crysis without a GPU

AleksandarK

News Editor
Staff member
Joined
Aug 19, 2017
Messages
2,641 (0.99/day)
AMD has just recently launched its 3rd generation of Ryzen ThreadRipper CPUs, and it is already achieving some impressive stuff. In the world of PC gaming, there used to be a question whenever a new GPU arrived - "But can it run Crysis?". The question became a meme over the years as GPUs outgrew the game's requirements, and nowadays any GPU is capable of running it. However, have you ever wondered if your CPU alone, without a GPU, can run Crysis? Me neither, but Linus from LinusTechTips thought of that.

The CPU, of course, cannot normally run a game on its own, as it lacks the hardware for graphics output, but AMD's ThreadRipper 3990X, a 64-core/128-thread monster, has enough raw compute power to process Crysis. Running in software mode, Linus got the CPU to render the game without any help from a GPU. This alone is a massive achievement for AMD ThreadRipper, as it shows that CPUs have reached a point where their raw computing power is on par with some older GPUs, and that we can achieve a lot of interesting things. You can watch the video down below.



View at TechPowerUp Main Site
 
Joined
Nov 18, 2010
Messages
7,590 (1.48/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX. Water block. Crossflashed.
Storage Optane 900P[Fedora] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO+SN560 1TB(W11)
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) SMSL RAW-MDA1 DAC
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 41
Why are we endorsing Linus?
 
Joined
Feb 11, 2009
Messages
5,569 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb NVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Why are we endorsing Linus?

Silly question tbh.

On the topic, can someone explain this to me?
Like, why is this so amazing? Why is this so hard to do with CPUs and, well, nowadays so easy for GPUs?

Are there CPU tasks we can let an RTX Titan do at an equivalent of 13 fps and be amazed at that?
 
Joined
Apr 24, 2012
Messages
1,606 (0.35/day)
Location
Northamptonshire, UK
System Name Main / HTPC
Processor Ryzen 7 7800X3D / Ryzen 7 2700
Motherboard Aorus B650M Elite AX/ B450i Aorus Pro Wifi
Cooling Lian-Li Galahad 360 / Wraith Spire
Memory Corsair Vengeance 2x16 6000MHz CL30 / HyperX Predator 2x8GB 3200MHz
Video Card(s) RTX 3080 FE / ARC A380
Storage WD Black SN770 1TB / Sabrent Rocket 256GB
Display(s) Aorus FO32U2P / 39" Panasonic HDTV
Case Fractal Arc XL / Cougar QBX
Audio Device(s) Denon AVR-X2800H / Realtek ALC1220
Power Supply Corsair RM850 / BeQuiet SFX Power 2 450W
Mouse Logitech G903
Keyboard Drop Sense75 with WQ Studio Morandi's
VR HMD Rift S
Software Win 11 Pro 64Bit
Joined
Feb 22, 2019
Messages
10 (0.00/day)
Silly question tbh.

On the topic, can someone explain this to me?
Like, why is this so amazing? Why is this so hard to do with CPUs and, well, nowadays so easy for GPUs?

Are there CPU tasks we can let an RTX Titan do at an equivalent of 13 fps and be amazed at that?

If we expect CPUs and GPUs to perform at the same level, why do GPUs exist? Why can't we just use CPUs for everything?
It turns out that some problems benefit hugely from being processed in parallel, while others can't be parallelized at all.
The 3990X, despite its high asking price, only has 64 cores. The Titan RTX, however, has over 4k cores iirc.
So why does the TR have so few cores? Because it can run all 64 cores at ~3.3GHz, whereas the Titan RTX can only do ~1.7GHz, if not less.

Typically, games are very demanding in rendering objects - the details of leaves, dirt, etc. This is a task that can be executed in parallel, so a GPU, with its huge number of cores, benefits from it.
That's why many modern GPUs can run Crysis with ease.

There are also tasks that can't be executed in parallel. In that case, the only way to achieve better performance is to bump up the clock speed. This is where the GHz of a CPU shines.
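To make the parallel-vs-sequential point concrete, here is a minimal Python sketch with made-up throughput numbers (the 1 GHz per-core rate is purely illustrative): independent work divides across cores, while a chain of dependent steps gains nothing from extra cores.

```python
OPS_PER_SEC_PER_CORE = 1e9  # hypothetical per-core throughput, for illustration only

def parallel_time(independent_ops, cores):
    # Independent operations (e.g. per-pixel shading) divide evenly across cores.
    return independent_ops / (cores * OPS_PER_SEC_PER_CORE)

def sequential_time(chained_ops, cores):
    # Each step needs the previous result, so extra cores sit idle.
    return chained_ops / OPS_PER_SEC_PER_CORE

pixels_work = 1920 * 1080 * 100       # ~100 ops per pixel of a 1080p frame
print(parallel_time(pixels_work, 1))     # one core: slow
print(parallel_time(pixels_work, 4608))  # thousands of cores: ~4608x faster
print(sequential_time(10**9, 4608))      # unchanged no matter how many cores
```

The toy model is exactly why rendering (per-pixel, per-vertex work) maps to GPUs while branchy, dependent game logic stays on the CPU.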
 
Joined
Jan 8, 2017
Messages
9,499 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
why is this so hard to do with CPUs and, well, nowadays so easy for GPUs?

A top-of-the-line CPU can do a couple hundred floating-point operations per clock cycle; a top-end GPU, a couple thousand. There are many more reasons, but that's the gist of it. Of course, that extra performance isn't free: a GPU can't handle sequential instructions with a lot of branching and whatnot very well.
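As a rough illustration of that gap, here is a back-of-envelope peak-throughput sketch. The clock and per-cycle figures are approximations commonly cited for these chips, not official specs:

```python
def peak_tflops(units, ghz, flops_per_cycle_per_unit):
    # theoretical peak = units x clock x FLOPs issued per unit per cycle
    return units * ghz * flops_per_cycle_per_unit / 1000

# Threadripper 3990X: 64 cores at ~3.3 GHz all-core,
# ~32 FP32 FLOPs/cycle/core (two 256-bit FMA pipes x 8 lanes x 2 ops)
cpu = peak_tflops(64, 3.3, 32)

# Titan RTX: 4608 FP32 units at ~1.77 GHz boost, 2 FLOPs/cycle each (FMA)
gpu = peak_tflops(4608, 1.77, 2)

print(f"CPU ~{cpu:.1f} TFLOPS, GPU ~{gpu:.1f} TFLOPS")
```

Even granting the CPU its full vector units, the GPU's theoretical FP32 peak comes out a couple of times higher, and in practice software renderers reach far less of the CPU's peak than a game reaches of the GPU's.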

Are there CPU tasks we can let an RTX Titan do at an equivalent of 13 fps and be amazed at that?

Yes - well, maybe not something that can be measured in frames per second, but there are many tasks that have traditionally run on CPUs which are orders of magnitude faster on GPUs.

The Titan RTX however, has 4k cores iirc.
Well, why does the TR has so few cores? Because it can do all 64 cores @ ~3.3GHz, where as the Titan RTX can only do @ ~1.7GHz, if not less.

There is somewhat of a misunderstanding of how GPU cores are actually counted. What Nvidia calls a "CUDA core" is more like a thread; the SM is the real core. That being said, they both have similar numbers of "cores": 64 vs 72. There are other reasons why GPUs are faster, such as having orders of magnitude more threads and execution units. They can fit more cores on a GPU primarily because the caches are much smaller; half of a typical CPU is just cache.
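The arithmetic behind that counting argument, using the commonly cited Turing figures (a sketch of how the marketing number relates to the SM count):

```python
# Nvidia's advertised "CUDA core" count is SMs x FP32 lanes per SM.
titan_rtx_sms = 72        # TU102 as configured in the Titan RTX
fp32_lanes_per_sm = 64    # Turing packs 64 FP32 units into each SM

cuda_cores = titan_rtx_sms * fp32_lanes_per_sm
print(cuda_cores)  # 4608, the number on the spec sheet

# Under the "SM = core" view, the comparison becomes 72 SMs vs 64 Zen 2 cores.
```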
 
Last edited:
Joined
Mar 7, 2010
Messages
993 (0.18/day)
Location
Michigan
System Name Daves
Processor AMD Ryzen 3900x
Motherboard AsRock X570 Taichi
Cooling Enermax LIQMAX III 360
Memory 32 GiG Team Group B Die 3600
Video Card(s) Powercolor 5700 xt Red Devil
Storage Crucial MX 500 SSD and Intel P660 NVME 2TB for games
Display(s) Acer 144htz 27in. 2560x1440
Case Phanteks P600S
Audio Device(s) N/A
Power Supply Corsair RM 750
Mouse EVGA
Keyboard Corsair Strafe
Software Windows 10 Pro
Joined
Oct 5, 2017
Messages
595 (0.23/day)
There's really nothing wrong with Linus's content. No, it's not as in depth as an Anandtech or AdoredTV article, but that's not the point.

He's rarely outright wrong, he usually admits it when he IS wrong, and the greatest sin he commits on any sort of regular basis is just not going into enough depth. He's not lying to his viewers, he doesn't shill, he doesn't encourage fanboying, he's just creating content that is geared towards younger people getting into tech, and not crusty nerds who already have opinions on everything.

If you're tech literate enough to be annoyed at the stuff Linus oversimplifies or omits from explanations, then you're literally not the kind of person that needs Linus. And that's fine. It's not for you. Move on. It's great for the people who it's aimed at, and he's not *wrong* for deciding to explain things at a level of complexity his audience is comfortable with.

If you want more you have plenty of great options - Gamers Nexus, AdoredTV, etc.
 
Joined
Jun 18, 2015
Messages
343 (0.10/day)
Location
Perth , West Australia
System Name schweinestalle
Processor AMD Ryzen 7 3700 X
Motherboard Asus Prime - Pro X 570 + Asus PCI -E AC68 Dual Band Wi-Fi Adapter
Cooling Standard Air
Memory Kingston HyperX 2 x 16 gb DDR 4 3200mhz
Video Card(s) AMD Radeon 5700 XT 8 GB Strix
Storage Intel SSD 240 gb Speed Demon & WD 240 SSD Blue & WD 250 SSD & WD Green 500gb SSD & Seagate 1 TB Sata
Display(s) Asus XG 32 V ROG
Case Corsair AIR ATX
Audio Device(s) Realtech standard
Power Supply Corsair 850 Modular
Mouse CM Havoc
Keyboard Corsair Cherry Mechanical
Software Win 10
Benchmark Scores Unigine_Superposition 4K ultra 7582
What's wrong with Linus? Nothing - this is news worth forwarding.
 
Joined
Feb 3, 2017
Messages
3,806 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Threadripper being able to do this in the first place is an amazing achievement. Although, from the looks of it, whatever software solution they are using is incredibly inefficient. I would be curious to know how it would perform with a well-optimized software implementation.

However, keep in mind that this is a $4,000 CPU: a 416 mm² 14 nm IO die plus 8x 74 mm² CCDs. Not counting the huge IO die, that is 592 mm² of state-of-the-art 7 nm CPU cores. Basically, you could have more than two 5700 XTs in the same die space.
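The die-area arithmetic from those numbers, for the record (Navi 10's ~251 mm² is my addition, the commonly cited figure for the 5700 XT die):

```python
io_die_mm2 = 416   # 14 nm IO die
ccd_mm2 = 74       # each 7 nm CCD
ccds = 8

compute_silicon = ccds * ccd_mm2
print(compute_silicon)  # 592 mm^2 of 7 nm compute silicon

navi10_mm2 = 251        # 5700 XT die size, approximate
print(compute_silicon / navi10_mm2)  # ~2.36, i.e. "more than two 5700 XTs"
```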

There is somewhat of a misunderstanding of how GPU cores are actually counted. What Nvidia calls a "CUDA core" is more like a thread; the SM is the real core. That being said, they both have similar numbers of "cores": 64 vs 72. There are other reasons why GPUs are faster, such as having orders of magnitude more threads and execution units. They can fit more cores on a GPU primarily because the caches are much smaller; half of a typical CPU is just cache.
A Zen 2 core has two 256-bit FMA units that can be split further into narrower operations.
An SM (Streaming Multiprocessor) has 64 32-bit shader units in it.
While their capabilities are widely different and Zen 2's FPU is a lot more powerful in a couple of ways, the SM has a lot more raw FP32 compute power, which is what graphics is geared for.
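Putting per-unit numbers on that comparison (hedged estimates: ~32 FP32 FLOPs/cycle for a Zen 2 core, 64 FMA-capable FP32 units per Turing SM):

```python
# Per-clock FP32 throughput of one "core" under the SM-as-core view.
zen2_core_flops_per_clock = 2 * 8 * 2  # 2 FMA pipes x 8 FP32 lanes x 2 ops/FMA = 32
turing_sm_flops_per_clock = 64 * 2     # 64 FP32 units x 2 ops/FMA = 128

print(turing_sm_flops_per_clock / zen2_core_flops_per_clock)  # 4.0x per clock
# The CPU claws some back with roughly 2x the clock speed, but the SM
# still leads on raw FP32 - which is what graphics workloads want.
```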
 
Last edited:
Joined
Jul 16, 2014
Messages
8,215 (2.16/day)
Location
SE Michigan
System Name Dumbass
Processor AMD Ryzen 7800X3D
Motherboard ASUS TUF gaming B650
Cooling Artic Liquid Freezer 2 - 420mm
Memory G.Skill Sniper 32gb DDR5 6000
Video Card(s) GreenTeam 4070 ti super 16gb
Storage Samsung EVO 500gb & 1Tb, 2tb HDD, 500gb WD Black
Display(s) 1x Nixeus NX_EDG27, 2x Dell S2440L (16:9)
Case Phanteks Enthoo Primo w/8 140mm SP Fans
Audio Device(s) onboard (realtek?) - SPKRS:Logitech Z623 200w 2.1
Power Supply Corsair HX1000i
Mouse Steeseries Esports Wireless
Keyboard Corsair K100
Software windows 10 H
Benchmark Scores https://i.imgur.com/aoz3vWY.jpg?2
I started a thread on this last week; it's neat, but it's old news now.

 
Joined
Aug 10, 2013
Messages
102 (0.02/day)
Location
Denmark
There's really nothing wrong with Linus's content. No, it's not as in depth as an Anandtech or AdoredTV article, but that's not the point.

He's rarely outright wrong, he usually admits it when he IS wrong, and the greatest sin he commits on any sort of regular basis is just not going into enough depth. He's not lying to his viewers, he doesn't shill, he doesn't encourage fanboying, he's just creating content that is geared towards younger people getting into tech, and not crusty nerds who already have opinions on everything.

If you're tech literate enough to be annoyed at the stuff Linus oversimplifies or omits from explanations, then you're literally not the kind of person that needs Linus. And that's fine. It's not for you. Move on. It's great for the people who it's aimed at, and he's not *wrong* for deciding to explain things at a level of complexity his audience is comfortable with.

If you want more you have plenty of great options - Gamers Nexus, AdoredTV, etc.

“AdoredTV”? Lol... Can't watch him because of the big AMD fanboy and Intel/Nvidia hater that he is.

Gamers Nexus all the way - how it should be.
 
Joined
Jun 28, 2016
Messages
3,595 (1.16/day)
A top-of-the-line CPU can do a couple hundred floating-point operations per clock cycle; a top-end GPU, a couple thousand. There are many more reasons, but that's the gist of it. Of course, that extra performance isn't free: a GPU can't handle sequential instructions with a lot of branching and whatnot very well.
That's factually correct, and yet it has very little bearing on gaming performance.
The reason GPUs exist is - basically - that they can produce a video signal. For a very long time they were not suitable for any kind of sensible computing compared to the CPUs available at the time. They were even slower at rendering - it's just that single-core CPUs were too precious to waste on it.

Then 3D gaming became mainstream - built around simplified techniques. GPUs evolved to do them efficiently. CPUs didn't.

Maybe if 3D gaming had gone the ray-tracing route from the start (the path it is moving towards today), we might have got high-core-count CPUs earlier, while GPUs would have remained the humble devices they were ~20 years ago (with coolers similar to what we now put on chipsets...).
 
Joined
Feb 3, 2017
Messages
3,806 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
The term for the card producing the video signal is "video card", and those existed long before GPU was a thing. GPU as a term was brought into use by Sony for the PlayStation and, in the PC space, by Nvidia for the GeForce 256. GPU implies processing, meaning compute. The reason GPUs exist is that specialized hardware is much more efficient at what it is specialized for.
 
Joined
Dec 12, 2016
Messages
1,939 (0.66/day)
Let me test an analogy to help understand the difference between a CPU and a GPU.

You are a king with two groups:
Group A: A drafted army of 5000 simple farmers, artisans, bakers, teachers, etc...
Group B: 16 super highly skilled trained specialists in the art of combat, stealth, mechanics, weapons, etc.

As king you have two orders:
Order 1: Defeat an army of 5000 simple peasants from the neighboring kingdom
Order 2: Kidnap the neighboring king's daughter, who is being protected by the same 5000 simple peasants

Now duplicate each order thousands of times, but keep in mind that order 1 involves the exact same opposing army each time, while order 2 changes: sabotage a mill, steal plans from a general, poison a water supply, etc.

If you know which group to match to which order then you will have complete understanding of the difference between CPUs and GPUs.

In the case of this article, the king now has the resources to grow Group B to 128 trained specialists (TR 3990X) and is given the order to take on a much smaller army of 500 peasants (the old Crysis game). It's an extreme waste of specialists/resources, as you would rather send in the 5000-man drafted army, but hey, sometimes the king does stupid things for fun!
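The analogy can be sketched numerically with toy numbers (assuming each identical mission takes a draftee 10 time units and a specialist 1):

```python
import math

def campaign_time(missions, workers, time_per_mission):
    # Independent, identical missions proceed in waves of `workers` at a time.
    return math.ceil(missions / workers) * time_per_mission

# Group A (GPU-like): 5000 slow draftees. Group B (CPU-like): 16 fast specialists.
print(campaign_time(5000, workers=5000, time_per_mission=10))  # 10: mass wins
print(campaign_time(5000, workers=16, time_per_mission=1))     # 313: too few hands

# One varied, sequential mission: only individual skill matters.
print(campaign_time(1, workers=5000, time_per_mission=10))     # 10
print(campaign_time(1, workers=16, time_per_mission=1))        # 1: specialist wins
```

Same moral as the analogy: mass identical work favors the big slow group (GPU); a single varied, sequential job favors the fast specialist (CPU).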
 
Last edited:
Joined
Jun 28, 2016
Messages
3,595 (1.16/day)
The term for the card producing the video signal is "video card", and those existed long before GPU was a thing. GPU as a term was brought into use by Sony for the PlayStation and, in the PC space, by Nvidia for the GeForce 256. GPU implies processing, meaning compute. The reason GPUs exist is that specialized hardware is much more efficient at what it is specialized for.
The point I was trying to make is that the reason GPUs are so much more efficient in games today is not that creating 3D graphics intrinsically benefits from hundreds of smaller cores.
They are much better because they've been optimized for the particular rendering model we've chosen to use in games.
 
Joined
Oct 5, 2017
Messages
595 (0.23/day)
The game runs windowed at a low resolution in software mode, and it's lagging heavily. So no, it cannot run Crysis.
By definition, running it poorly is still running it.

Also why are you clarifying that it's in software mode? Of course it's in software mode. If it were in hardware mode it would be running on the GPU instead of entirely on the CPU, which is the point of the demonstration.

Lesser CPUs would be utterly incapable of this demonstration. The fact we've gone from being unable to run Crysis on many dedicated cards, to being able to run Crysis without any sort of dedicated GPU at all is a remarkable technological achievement and shows just how far we've come - and this demo is, well, sure, it's not bar charts and graphs, but it's a fun little experiment with a surprising result.
 

Regeneration

NGOHQ.COM
Joined
Oct 26, 2005
Messages
3,131 (0.45/day)
By definition, running it poorly is still running it.

Also why are you clarifying that it's in software mode? Of course it's in software mode. If it were in hardware mode it would be running on the GPU instead of entirely on the CPU, which is the point of the demonstration.

Lesser CPUs would be utterly incapable of this demonstration. The fact we've gone from being unable to run Crysis on many dedicated cards, to being able to run Crysis without any sort of dedicated GPU at all is a remarkable technological achievement and shows just how far we've come - and this demo is, well, sure, it's not bar charts and graphs, but it's a fun little experiment with a surprising result.

Almost every CPU made in the last decade can achieve the same.
 
Joined
Oct 5, 2017
Messages
595 (0.23/day)
Almost every CPU made in the last decade can achieve the same.
Oh, I'm quite sure they'd be able to open the game. I'm not convinced it would be feasible to actually play it, however poorly, on older or lower-end hardware.

In any case, Linus wasn't exactly seeking to demonstrate the obsolescence of the GPU with this. He was clearly just looking for a novel way to demonstrate that TR 3000 is capable of doing things previously reserved for dedicated hardware, through sheer brute force. Mission accomplished. What was being rendered on that screen wasn't exactly 300 FPS 8K-per-eye VR, but it was certainly well beyond anything previously achievable with a desktop CPU and no GPU rendering.
 

Regeneration

NGOHQ.COM
Joined
Oct 26, 2005
Messages
3,131 (0.45/day)
Everyone can run Crysis without a GPU.

Almost every CPU from this decade is capable of running games in software renderer mode.


Above is a video of my very own 10-year-old Xeon running Crysis, much like Linus did.

Despite the nonsense, you'll still need a GPU to run games at the highest quality and resolution at decent FPS.
 
Last edited:
Joined
Oct 12, 2019
Messages
128 (0.07/day)
CPUs and GPUs went their separate ways a long time ago.

In the 'dawn of computing', the graphics card's main function was the ability to put a picture on the screen (something Intel IGPs retain as their only merit to this day - just couldn't resist :) ) - but at *that* time HDDs had separate controller cards, not integrated on the mainboard, and the mouse was an optional piece of hardware. A bunch of other stuff wasn't integrated either. The main competitors to IBM's MDA/CGA/EGA/VGA/XGA at the time offered additional options in colour count or resolution (we could mention Hercules, a monochrome alternative to the text-only MDA and the 4-colour, low-resolution CGA) - NEVER any kind of performance; the CPU did all the work...

Stuff like Voodoo and GeForce changed the picture and started the differentiation, up to the scale we have today.

It's not true that GPUs don't compute - they compute (in non-gaming terms) very much, not just in supercomputers but in all the tasks that suit them - say, F@H or coin mining... They would do MUCH MORE if HSA had ever reached even part of its potential...

The TR achievement is interesting if nothing else - and Linus-bashing is just... immature? What, if somebody else had come up with this (basically) proof-of-concept idea and tested it, then everything would be alright?

Oh, yes - GPU hardware ray tracing is a lie... The best-case and most honest scenario is given on an AMD slide:

Only *selected* parts of a scene can be ray-traced by today's GPUs, and the compute cost of rendering successive frames varies so much that full ray tracing is just not possible in the next generation, or the one after - we shall see how many generations it takes... Therefore, the idea that GPUs could have started out doing ray tracing is... hmmm...
 
Joined
Nov 24, 2017
Messages
853 (0.33/day)
Location
Asia
Processor Intel Core i5 4590
Motherboard Gigabyte Z97x Gaming 3
Cooling Intel Stock Cooler
Memory 8GiB(2x4GiB) DDR3-1600 [800MHz]
Video Card(s) XFX RX 560D 4GiB
Storage Transcend SSD370S 128GB; Toshiba DT01ACA100 1TB HDD
Display(s) Samsung S20D300 20" 768p TN
Case Cooler Master MasterBox E501L
Audio Device(s) Realtek ALC1150
Power Supply Corsair VS450
Mouse A4Tech N-70FX
Software Windows 10 Pro
Benchmark Scores BaseMark GPU : 250 Point in HD 4600