Monday, February 17th 2020

AMD Ryzen ThreadRipper is Capable of Running Crysis without a GPU

AMD has just recently launched its 3rd generation of Ryzen ThreadRipper CPUs, and it is already achieving some impressive feats. In the world of PC gaming, there used to be a question whenever a new GPU arrived: "But can it run Crysis?" The question became a meme over the years as GPUs outgrew the game's requirements, and nowadays practically any GPU can run it. However, have you ever wondered if your CPU could run Crysis alone, without a GPU? Me neither, but Linus from LinusTechTips thought of that.

A CPU, of course, normally can not run a game by itself, as it lacks the hardware for graphics output, but AMD's ThreadRipper 3990X, a 64-core/128-thread monster, has enough raw compute power to process Crysis entirely on its own. Running the game in software rendering mode, Linus got the CPU to render it without any help from a GPU. This alone is a massive achievement for AMD's ThreadRipper, as it shows that CPUs have reached a point where their raw computing power is on par with some older GPUs, and that we can achieve a lot of interesting things with them. You can watch the video down below.
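The "software rendering mode" mentioned above simply means the CPU computes every pixel itself and writes it into a plain memory buffer, with no GPU involved. A toy sketch of the idea (purely illustrative, nothing like Crysis's actual renderer): a minimal triangle rasterizer in Python.

```python
# Hypothetical sketch: "software rendering" = the CPU does all per-pixel work
# itself and writes colours into an in-memory framebuffer. A real software
# renderer does this for millions of triangles per frame.

WIDTH, HEIGHT = 80, 40

def edge(ax, ay, bx, by, px, py):
    # Signed-area test: which side of edge (a -> b) the point (px, py) lies on.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize_triangle(fb, v0, v1, v2, colour):
    # Brute-force scan: test every pixel against all three triangle edges.
    for y in range(HEIGHT):
        for x in range(WIDTH):
            w0 = edge(*v1, *v2, x, y)
            w1 = edge(*v2, *v0, x, y)
            w2 = edge(*v0, *v1, x, y)
            if w0 >= 0 and w1 >= 0 and w2 >= 0:  # inside all three edges
                fb[y][x] = colour

framebuffer = [[0] * WIDTH for _ in range(HEIGHT)]
rasterize_triangle(framebuffer, (10, 5), (70, 20), (20, 35), 1)
filled = sum(row.count(1) for row in framebuffer)
```

Each pixel test is independent of every other, which is exactly why this workload also maps so well onto a GPU's many cores.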

56 Comments on AMD Ryzen ThreadRipper is Capable of Running Crysis without a GPU

#2
ZoneDymo
Ferrum Master: Why are we endorsing Linus?
Silly question tbh.

On the topic, can someone explain this to me?
Like, why is this so amazing? Why is it so hard to do on CPUs and yet, nowadays, so easy on GPUs?

Are there CPU tasks we can let an RTX Titan do at an equivalent of 13 fps and be amazed at that?
#3
R00kie
Ferrum Master: Why are we endorsing Linus?
Why not?
#4
ChingDim
ZoneDymo: Silly question tbh.

On the topic, can someone explain this to me?
Like, why is this so amazing? Why is it so hard to do on CPUs and yet, nowadays, so easy on GPUs?

Are there CPU tasks we can let an RTX Titan do at an equivalent of 13 fps and be amazed at that?
If we expect CPUs and GPUs to perform at the same level, why do GPUs exist? Why can't we just use CPUs for everything?
Turns out that some problems benefit hugely from being processed in parallel, while others can't be parallelized at all.
The 3990X, despite its high asking price, only has 64 cores. The Titan RTX, however, has around 4k cores IIRC.
Well, why does the TR have so few cores? Because it can run all 64 cores at ~3.3 GHz, whereas the Titan RTX can only manage ~1.7 GHz, if not less.

Typically, games are very demanding when rendering objects - the details of leaves, dirt, etc. This is a task that can be executed in parallel, so a GPU, with its huge number of cores, benefits from it.
That's why many modern GPUs can run Crysis with ease.

There are also tasks that can't be executed in parallel. In that case, the only way to achieve better performance is to bump up the clock speed. This is where the GHz of a CPU shines.
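The distinction above can be sketched in a few lines of Python (a hypothetical illustration, with made-up `shade` and `chained` functions): per-pixel shading is "embarrassingly parallel" because each pixel depends only on its own coordinates, while a dependency chain forces one-at-a-time execution no matter how many cores you have.

```python
# Parallel-friendly work vs. inherently sequential work (illustrative only).
from multiprocessing.dummy import Pool  # stdlib thread pool

def shade(pixel):
    # Independent per-pixel work: splits cleanly across many cores.
    x, y = pixel
    return (x * 31 + y * 17) % 256

pixels = [(x, y) for y in range(100) for x in range(100)]
with Pool(8) as pool:
    image = pool.map(shade, pixels)  # order-preserving parallel map

def chained(n):
    # Each step consumes the previous result: no amount of extra cores
    # helps here, only a higher clock speed does.
    v = 1
    for _ in range(n):
        v = (v * 48271) % 2147483647
    return v

last = chained(10000)
```

This is why a 4000-core GPU demolishes the first kind of workload while a high-clocked CPU core wins the second.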
#5
Vya Domus
ZoneDymo: why is this so hard to do on CPUs and yet, nowadays, so easy on GPUs?
A top-of-the-line CPU can do a couple hundred floating-point operations per clock cycle; a top-end GPU, a couple thousand. There are many more reasons, but that's the gist of it. Of course, that extra performance isn't free - a GPU can't handle sequential instructions with a lot of branching and whatnot very well.
ZoneDymo: Are there CPU tasks we can let an RTX Titan do at an equivalent of 13 fps and be amazed at that?
Yes - well, maybe not something that can be measured in frames per second, but there are many tasks that have traditionally run on CPUs and are orders of magnitude faster on GPUs.
ChingDim: The Titan RTX, however, has around 4k cores IIRC.
Well, why does the TR have so few cores? Because it can run all 64 cores at ~3.3 GHz, whereas the Titan RTX can only manage ~1.7 GHz, if not less.
There is somewhat of a misunderstanding of how GPU cores are actually counted. What Nvidia calls a "CUDA core" is more like a thread; the SM is the real core. That being said, they both have similar numbers of "cores": 64 vs 72. There are other reasons why GPUs are faster, such as having orders of magnitude more threads and execution units. They can fit more cores on a GPU primarily because the caches are much smaller - half of a typical CPU die is just cache.
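Those per-clock figures can be roughly sanity-checked. The unit counts below are assumptions taken from public spec sheets (Zen 2's two 256-bit FMA pipes per core; the Titan RTX's 4608 FP32 units across 72 SMs), not measured numbers.

```python
# Whole-chip FP32 FLOPs per clock cycle, back-of-the-envelope.

# Zen 2 core: two 256-bit FMA pipes -> 8 FP32 lanes each, and an FMA
# counts as two FLOPs (multiply + add).
zen2_flops_per_cycle = 2 * 8 * 2          # 32 FP32 FLOPs/cycle per core
tr_3990x = 64 * zen2_flops_per_cycle      # whole chip: 2048 FLOPs/cycle

# Titan RTX: 4608 FP32 "CUDA cores" (72 SMs x 64), one FMA each per cycle.
titan_rtx = 4608 * 2                      # whole chip: 9216 FLOPs/cycle
```

So per clock the gap is roughly 4.5x in the GPU's favour, before clock speeds are taken into account.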
#6
Dave65
Ferrum Master: Why are we endorsing Linus?
We?
#7
GlacierNine
There's really nothing wrong with Linus's content. No, it's not as in-depth as an AnandTech or AdoredTV article, but that's not the point.

He's rarely outright wrong, he usually admits it when he IS wrong, and the greatest sin he commits on any sort of regular basis is just not going into enough depth. He's not lying to his viewers, he doesn't shill, he doesn't encourage fanboying, he's just creating content that is geared towards younger people getting into tech, and not crusty nerds who already have opinions on everything.

If you're tech literate enough to be annoyed at the stuff Linus oversimplifies or omits from explanations, then you're literally not the kind of person that needs Linus. And that's fine. It's not for you. Move on. It's great for the people who it's aimed at, and he's not *wrong* for deciding to explain things at a level of complexity his audience is comfortable with.

If you want more you have plenty of great options - Gamers Nexus, AdoredTV, etc.
#8
r.h.p
What's wrong with Linus? Nothing - this is news worth forwarding.
#9
londiste
Threadripper being able to do this in the first place is an amazing achievement, although from the looks of it, whatever software solution they are using is incredibly inefficient. I would be curious to know how it would perform with a well-optimized software implementation.

However, keep in mind that this is a $4,000 CPU: a 416 mm² 14 nm I/O die plus eight 74 mm² CCDs. Not counting the huge I/O die, that is 592 mm² of state-of-the-art 7 nm CPU cores. Basically, you could have more than two 5700 XTs from the same die area.
Vya Domus: There is somewhat of a misunderstanding of how GPU cores are actually counted. What Nvidia calls a "CUDA core" is more like a thread; the SM is the real core. That being said, they both have similar numbers of "cores": 64 vs 72. There are other reasons why GPUs are faster, such as having orders of magnitude more threads and execution units. They can fit more cores on a GPU primarily because the caches are much smaller - half of a typical CPU die is just cache.
A Zen2 core has two 256-bit FMA units, each of which can be split into narrower lanes.
An SM (Streaming Multiprocessor) has 64 32-bit shader units ("CUDA cores") in it.
While their capabilities are widely different and Zen2's FPU is a lot more powerful in several ways, the SM has a lot more raw FP32 compute power, which is what graphics is geared for.
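Folding clock speeds into that comparison gives a back-of-the-envelope peak-FP32 estimate. The clocks and unit counts below are the approximate figures quoted in this thread (assumed, not measured).

```python
# Peak FP32 throughput estimate: FLOPs/cycle scaled by clock speed.

def peak_fp32_tflops(flops_per_cycle, clock_ghz):
    # GHz = 1e9 cycles/s, so FLOPs/cycle * GHz = GFLOPS; /1000 gives TFLOPS.
    return flops_per_cycle * clock_ghz / 1000.0

# 3990X: 64 cores x 32 FP32 FLOPs/cycle at ~3.3 GHz.
cpu_tflops = peak_fp32_tflops(64 * 32, 3.3)       # ~6.8 TFLOPS

# Titan RTX: 72 SMs x 64 FP32 units x 2 (FMA) at ~1.77 GHz boost.
gpu_tflops = peak_fp32_tflops(72 * 64 * 2, 1.77)  # ~16.3 TFLOPS
```

Even with a huge clock advantage per core, the CPU lands at well under half the GPU's raw FP32 throughput, which is why the software-rendered Crysis demo is impressive rather than practical.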
#11
NicklasAPJ
GlacierNine: There's really nothing wrong with Linus's content. No, it's not as in-depth as an AnandTech or AdoredTV article, but that's not the point.

He's rarely outright wrong, he usually admits it when he IS wrong, and the greatest sin he commits on any sort of regular basis is just not going into enough depth. He's not lying to his viewers, he doesn't shill, he doesn't encourage fanboying, he's just creating content that is geared towards younger people getting into tech, and not crusty nerds who already have opinions on everything.

If you're tech literate enough to be annoyed at the stuff Linus oversimplifies or omits from explanations, then you're literally not the kind of person that needs Linus. And that's fine. It's not for you. Move on. It's great for the people who it's aimed at, and he's not *wrong* for deciding to explain things at a level of complexity his audience is comfortable with.

If you want more you have plenty of great options - Gamers Nexus, AdoredTV, etc.
"AdoredTV"? Lol... Can't watch him because of what a big AMD fanboy and Intel/Nvidia hater he is.

Gamers Nexus all the way. That's how it should be.
#12
notb
Vya Domus: A top-of-the-line CPU can do a couple hundred floating-point operations per clock cycle; a top-end GPU, a couple thousand. There are many more reasons, but that's the gist of it. Of course, that extra performance isn't free - a GPU can't handle sequential instructions with a lot of branching and whatnot very well.
That's factually correct and yet has very little bearing on gaming performance.
The reason GPUs exist is, basically, that they can produce a video signal. For a very long time they were not suitable for any kind of serious computing compared to the CPUs available at the time. They were even slower at rendering - it's just that single-core CPUs were too precious to waste on it.

Then 3D gaming became mainstream - built around simplified techniques. GPUs evolved to do them efficiently. CPUs didn't.

Maybe if 3D gaming had gone the ray-tracing route from the start (the path it is moving towards today), we might have gotten high-core-count CPUs earlier, while GPUs would have remained the humble devices they were ~20 years ago (with coolers similar to what we now put on chipsets...).
#13
londiste
The term for a card producing a video signal is "video card," and these existed long before "GPU" was a thing. The term GPU was brought into use by Sony for the PlayStation and, in the PC space, by Nvidia for the GeForce 256. GPU implies processing, meaning compute. The reason GPUs exist is that specialized hardware is much more efficient at what it is specialized for.
#14
lexluthermiester
Ferrum Master: Why are we endorsing Linus?
Because he makes interesting videos that are relevant to TPU's readers.

Why do you care?
#15
Daven
Let me test an analogy to help understand the difference between a CPU and a GPU.

You are a king with two groups:
Group A: A drafted army of 5000 simple farmers, artisans, bakers, teachers, etc...
Group B: 16 super highly skilled trained specialists in the art of combat, stealth, mechanics, weapons, etc.

As king you have two orders:
Order 1: Defeat an army of 5000 simple peasants from the neighboring kingdom
Order 2: Kidnap the neighboring king's daughter, who is being protected by the same 5000 simple peasants

Now duplicate each order thousands of times, but keep in mind that Order 1 involves the exact same opposing army each time, while Order 2 changes to sabotaging a mill, stealing plans from a general, poisoning a water supply, etc.

If you know which group to match to which order then you will have complete understanding of the difference between CPUs and GPUs.

In the case of this article, the king now has the resources to grow Group B to 128 trained specialists (TR 3990X) and has given the order to take on a much smaller army of 500 peasants (the old Crysis game). It's an extreme waste of specialists/resources, as you would rather send in the 5000-man drafted army, but hey, sometimes the king does stupid things for fun!
#16
Regeneration
NGOHQ.COM
The game runs windowed, at low resolution, in software mode, and it's lagging heavily. So no, it cannot run Crysis.
#17
notb
londiste: The term for a card producing a video signal is "video card," and these existed long before "GPU" was a thing. The term GPU was brought into use by Sony for the PlayStation and, in the PC space, by Nvidia for the GeForce 256. GPU implies processing, meaning compute. The reason GPUs exist is that specialized hardware is much more efficient at what it is specialized for.
The point I was trying to make is this: the reason GPUs are so much more efficient in games today is not that creating 3D graphics intrinsically benefits from hundreds of smaller cores.
They are much better because they've been optimized for the particular rendering model we've chosen to use in games.
#18
GlacierNine
Regeneration: The game runs windowed, at low resolution, in software mode, and it's lagging heavily. So no, it cannot run Crysis.
By definition, running it poorly is still running it.

Also why are you clarifying that it's in software mode? Of course it's in software mode. If it were in hardware mode it would be running on the GPU instead of entirely on the CPU, which is the point of the demonstration.

Lesser CPUs would be utterly incapable of this demonstration. The fact we've gone from being unable to run Crysis on many dedicated cards, to being able to run Crysis without any sort of dedicated GPU at all is a remarkable technological achievement and shows just how far we've come - and this demo is, well, sure, it's not bar charts and graphs, but it's a fun little experiment with a surprising result.
#19
Regeneration
NGOHQ.COM
GlacierNine: By definition, running it poorly is still running it.

Also why are you clarifying that it's in software mode? Of course it's in software mode. If it were in hardware mode it would be running on the GPU instead of entirely on the CPU, which is the point of the demonstration.

Lesser CPUs would be utterly incapable of this demonstration. The fact we've gone from being unable to run Crysis on many dedicated cards, to being able to run Crysis without any sort of dedicated GPU at all is a remarkable technological achievement and shows just how far we've come - and this demo is, well, sure, it's not bar charts and graphs, but it's a fun little experiment with a surprising result.
Almost every CPU made in the last decade can achieve the same.
#20
Aldain
Regeneration: Almost every CPU made in the last decade can achieve the same.
lol

No, they can not.
#21
GlacierNine
Regeneration: Almost every CPU made in the last decade can achieve the same.
Oh, I'm quite sure they'd be able to open the game. I'm not convinced it would be feasible to actually play it, however poorly, on older or lower-end hardware.

In any case, Linus wasn't exactly seeking to demonstrate the obsolescence of the GPU with this. He was clearly just looking for a novel way to demonstrate that TR 3000 is capable of doing things previously reserved for dedicated hardware, through sheer brute force. Mission accomplished. What was being rendered on that screen wasn't exactly 300 FPS, 8K-per-eye VR, but it was certainly well beyond anything previously achievable with a desktop CPU and no GPU rendering.
#22
Regeneration
NGOHQ.COM
Everyone can run Crysis without a GPU.

Almost every CPU from this decade is capable of running games in software renderer mode.


Above is a video of my very own 10-year-old Xeon running Crysis, similar to Linus's demo.

Despite the nonsense, you'll still need a GPU to run games at the highest quality and resolution with decent FPS.
#23
Mouth of Sauron
CPUs and GPUs went their separate ways a long time ago.

In the 'dawn of computing', a GPU's main function was the ability to put a picture on the screen (something Intel IGPs retain as their only merit to this day - just couldn't resist :) ). At *that* time, though, HDDs had separate controller cards not integrated on the mainboard, and the mouse was an optional piece of hardware; a bunch of other stuff wasn't integrated either. The main competitors to IBM's MDA/CGA/EGA/VGA/XGA at the time offered additional options in colour count or resolution (we could mention Hercules, a monochrome alternative to the text-only MDA and the 4-colour, low-resolution CGA) - never any kind of performance advantage; the CPU did all the work...

Stuff like Voodoo and GeForce changed the picture and started the differentiation, up to the scale we have today.

It's not true that GPUs don't compute - they compute (in non-gaming terms) very much, not just in supercomputers but in all the tasks that are suitable for them - say, F@H or coin mining... They would do MUCH MORE if HSA had ever reached a part of its potential...

The TR achievement is interesting if nothing else - and the Linus-bashing is just... immature? What, if somebody else had come up with the (basically) proof-of-concept idea and tested it, everything would be alright?

Oh, yes - GPU hardware ray tracing is a lie... The best-case and most honest scenario is given on AMD's slide:


Only *selected* parts of a scene can be ray-traced by current GPUs, and the compute cost of ray tracing successive frames varies so wildly that doing it fully just isn't possible in the next generation, or the one after - and we shall see how many generations it takes. Therefore, the idea that GPUs could have started out doing ray tracing is... hmmm...
#24
IceShroom
NicklasAPJ: Gamers Nexus all the way. That's how it should be.
The drama queen and most biased YouTuber. The one we need.
#25
Fluffmeister
Regeneration: Everyone can run Crysis without a GPU.

Almost every CPU from this decade is capable of running games in software renderer mode.


Above is a video of my very own 10-year-old Xeon running Crysis, similar to Linus's demo.

Despite the nonsense, you'll still need a GPU to run games at the highest quality and resolution with decent FPS.
Was it really necessary to murder that poor turtle?