Monday, February 17th 2020
AMD Ryzen ThreadRipper is Capable of Running Crysis without a GPU
AMD has just launched its 3rd generation of Ryzen ThreadRipper CPUs, and it is already achieving some impressive things. In the world of PC gaming, there used to be a question whenever a new GPU arrived: "But can it run Crysis?" That question became a meme over the years as GPUs outgrew the game's requirements, and practically any GPU nowadays is capable of running it. However, have you ever wondered if your CPU can run Crysis alone, without a GPU? Me neither, but Linus from LinusTechTips thought of that.
A CPU, of course, normally cannot run a game on its own, as it lacks dedicated graphics hardware, but AMD's ThreadRipper 3990X, a 64-core/128-thread monster, has enough raw compute power to process the game anyway. Running in software rendering mode, Linus got the CPU to render Crysis without any help from a GPU. This alone is a notable achievement for AMD's ThreadRipper, as it shows that CPUs have reached a point where their raw computing power is on par with some older GPUs, and that opens up a lot of interesting possibilities. You can watch the video below.
56 Comments on AMD Ryzen ThreadRipper is Capable of Running Crysis without a GPU
On the topic, can someone explain this to me?
Like, why is this so amazing? Why is this so hard to do on CPUs, yet nowadays so easy on GPUs?
Are there CPU tasks we can let an RTX Titan do at an equivalent of 13 fps and be amazed at that?
It turns out that some problems benefit hugely from being processed in parallel, while others can't be parallelized at all.
The 3990X, despite its high asking price, "only" has 64 cores. The Titan RTX, however, has 4,608 CUDA cores.
Well, why does the TR have so few cores? Because it can run all 64 of them at ~3.3 GHz, whereas the Titan RTX only runs at ~1.7 GHz, if not less.
Typically, games are very demanding when rendering objects - the details of leaves, dirt, etc. This is a task that can be executed in parallel, so a GPU, with its huge number of cores, benefits from it.
That's why many modern GPUs can run Crysis with ease.
There are also tasks that can't be executed in parallel. In that case, the only way to achieve better performance is to bump up the clock speed. This is where a CPU's GHz advantage shines.
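The distinction above can be sketched in a few lines of Python. This is a toy illustration, not a benchmark, and the function names (`shade_pixel`, `simulate_serial`) are made up for the example: per-pixel work is order-independent and splits cleanly across workers, while a dependency chain forces one step after another no matter how many cores you have.

```python
from concurrent.futures import ThreadPoolExecutor

def shade_pixel(i):
    # Independent per-pixel work: results don't depend on each other,
    # so the order of execution doesn't matter -> GPU-friendly.
    return (i * 2654435761) % 255

def render_parallel(n_pixels, workers=4):
    # Split the independent work across a pool of workers.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(shade_pixel, range(n_pixels)))

def simulate_serial(steps, state=1):
    # Each step needs the previous result -> cannot be split across
    # cores; only a faster clock helps. CPU-friendly.
    for _ in range(steps):
        state = (state * 31 + 7) % 1_000_003
    return state

frame = render_parallel(1024)   # 1024 pixels, computed in any order
result = simulate_serial(10)    # one value, computed strictly in order
print(len(frame), result)
```

Adding workers speeds up `render_parallel` but does nothing for `simulate_serial` - which is exactly the GPU-cores-vs-CPU-clocks trade-off described above.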
He's rarely outright wrong, he usually admits it when he IS wrong, and the greatest sin he commits on any sort of regular basis is just not going into enough depth. He's not lying to his viewers, he doesn't shill, he doesn't encourage fanboying, he's just creating content that is geared towards younger people getting into tech, and not crusty nerds who already have opinions on everything.
If you're tech literate enough to be annoyed at the stuff Linus oversimplifies or omits from explanations, then you're literally not the kind of person that needs Linus. And that's fine. It's not for you. Move on. It's great for the people who it's aimed at, and he's not *wrong* for deciding to explain things at a level of complexity his audience is comfortable with.
If you want more you have plenty of great options - Gamers Nexus, AdoredTV, etc.
However, keep in mind that this is a $4,000 CPU consisting of a 416 mm² 14 nm I/O die plus 8× 74 mm² CCDs. Not counting the huge I/O die, that is 592 mm² of state-of-the-art 7 nm CPU cores. Basically, you could get more than two 5700 XTs out of the same die area. Each Zen 2 core's FPU has two 256-bit FMA pipes (plus two 256-bit add pipes), each of which can operate on eight packed 32-bit lanes.
A Turing SM (Streaming Multiprocessor), meanwhile, contains 64 FP32 CUDA cores.
While their capabilities are very different and Zen 2's FPU is a lot more powerful in some respects, the SMs collectively have far more raw FP32 compute power, which is what graphics workloads are geared toward.
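A back-of-envelope peak-FP32 comparison makes the gap concrete. The clocks and per-clock throughput figures below are rough public spec numbers, not measurements, and the all-core boost assumed for the 3990X is an approximation:

```python
def peak_fp32_gflops(cores, fp32_lanes_per_core, flops_per_lane_per_clock, ghz):
    # Theoretical peak: units * lanes * FLOPs-per-lane * clock.
    return cores * fp32_lanes_per_core * flops_per_lane_per_clock * ghz

# TR 3990X: 64 cores; two 256-bit FMA pipes/core -> 16 FP32 lanes,
# 2 FLOPs per lane per clock (fused multiply-add), ~3.3 GHz all-core.
tr_3990x = peak_fp32_gflops(64, 16, 2, 3.3)

# Titan RTX: 4608 CUDA cores, 1 FMA (2 FLOPs) per core per clock, ~1.77 GHz boost.
titan_rtx = peak_fp32_gflops(4608, 1, 2, 1.77)

print(round(tr_3990x), round(titan_rtx))  # ~6.8 TFLOPS vs ~16.3 TFLOPS
```

So even this monster CPU lands at roughly 40% of the Titan RTX's raw FP32 throughput - which is why ~13 FPS in a GPU-class workload is still impressive.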
www.techpowerup.com/forums/threads/tr3990x-can-it-run-crysis.263818/post-4206967
Gamers Nexus all the way - that's how it should be.
The reason GPUs exist is - basically - that they can produce a video signal. And for a very long time they were not suitable for any kind of serious computing compared to the CPUs available at the time. They were even slower at rendering - it's just that single-core CPUs were too precious to waste on it.
Then 3D gaming became mainstream - built around simplified techniques. GPUs evolved to do them efficiently. CPUs didn't.
Maybe if 3D gaming had gone the ray tracing route from the start (the path it is moving toward today), we might have gotten high-core-count CPUs earlier, while GPUs would have remained the humble devices they were ~20 years ago (with coolers similar to what we now put on chipsets...).
Why do you care?
You are a king with two groups:
Group A: A drafted army of 5000 simple farmers, artisans, bakers, teachers, etc...
Group B: 16 super highly skilled trained specialists in the art of combat, stealth, mechanics, weapons, etc.
As king you have two orders:
Order 1: Defeat an army of 5000 simple peasants from the neighboring kingdom
Order 2: Kidnap the neighboring king's daughter, who is being protected by the same 5000 simple peasants
Now duplicate each order thousands of times, but keep in mind that Order 1 involves the exact same opposing army each time, while Order 2 changes: sabotage a mill, steal plans from a general, poison a water supply, etc.
If you know which group to match to which order then you will have complete understanding of the difference between CPUs and GPUs.
In the case of this article, the King now has the resources to grow Group B to 128 trained specialists (the TR 3990X's threads) and has given them the order to take on a much smaller army of 500 peasants (the old Crysis game). It's an extreme waste of specialists/resources - you would rather send in the 5000-man drafted army - but hey, sometimes the king does stupid things for fun!
They are much better because they've been optimized for a particular rendering model that we've chosen to use in games.
Also why are you clarifying that it's in software mode? Of course it's in software mode. If it were in hardware mode it would be running on the GPU instead of entirely on the CPU, which is the point of the demonstration.
Lesser CPUs would be utterly incapable of this demonstration. The fact we've gone from being unable to run Crysis on many dedicated cards, to being able to run Crysis without any sort of dedicated GPU at all is a remarkable technological achievement and shows just how far we've come - and this demo is, well, sure, it's not bar charts and graphs, but it's a fun little experiment with a surprising result.
No, they can not.
In any case, Linus wasn't exactly seeking to demonstrate the obsolescence of the GPU with this. He was clearly just looking for a novel way to demonstrate that TR 3000 is capable of doing things previously reserved for dedicated hardware, through sheer brute force. Mission accomplished. What was being rendered on that screen wasn't exactly 300FPS 8k per-eye VR, but it was certainly well beyond anything previously achievable with a desktop CPU and no GPU rendering.
Almost every CPU from this decade is capable of running games in software renderer mode.
Above is a video of my own 10-year-old Xeon running Crysis similarly to Linus's demo.
Nonsense aside, you'll still need a GPU to run games at the highest quality and resolution at decent FPS.
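For anyone wondering what a "software renderer" actually does: it performs on the CPU the coverage tests that a GPU's fixed-function hardware does for every triangle. This is a toy sketch of the general idea using edge functions - not Crysis's actual renderer, and the function names are invented for the example:

```python
def edge(ax, ay, bx, by, px, py):
    # Signed-area test: which side of the edge (a -> b) point p lies on.
    return (px - ax) * (by - ay) - (py - ay) * (bx - ax)

def rasterize_triangle(w, h, v0, v1, v2):
    # Returns a w*h framebuffer of 0/1 coverage values.
    fb = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            px, py = x + 0.5, y + 0.5   # sample at pixel centers
            inside = (edge(*v0, *v1, px, py) >= 0 and
                      edge(*v1, *v2, px, py) >= 0 and
                      edge(*v2, *v0, px, py) >= 0)
            fb[y][x] = 1 if inside else 0
    return fb

# One triangle covering the lower-left half of an 8x8 buffer.
fb = rasterize_triangle(8, 8, (0, 0), (0, 8), (8, 0))
print(sum(map(sum, fb)))  # number of covered pixels
```

A real game does this (plus depth testing, texturing, and shading) for millions of triangles per frame, which is why even a 64-core CPU only manages low double-digit FPS.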
In the 'dawn of computing', the GPU's main function was the ability to put a picture on the screen (something Intel IGPs retain as their only merit to this day - just couldn't resist :) ). But at *that* time, HDDs had separate controller cards not integrated on the mainboard, and the mouse was an optional piece of hardware - a bunch of other stuff wasn't integrated either. The main competitors to IBM's MDA/CGA/EGA/VGA/XGA at the time offered extra options in colour count or resolution (we could mention Hercules, a monochrome alternative to the text-only MDA and the 4-colour, low-resolution CGA) - never any kind of extra performance; the CPU did all the work...
Stuff like Voodoo and GeForce changed the picture and started the differentiation, up to the scale we have today.
It's not true that GPUs don't compute - they compute (in non-gaming terms) a great deal, not just in supercomputers but in all the tasks that suit them, say F@H or coin mining... They would do much more if HSA had ever reached even part of its potential...
The TR achievement is interesting if nothing else - and the Linus-bashing is just... immature? What, if somebody else had come up with this (basically) proof-of-concept idea and tested it, then everything would be alright?
Oh, yes - GPU hardware ray tracing is a half-truth... The best, most honest scenario is the one given on AMD's slide:
Only *selected* effects in a scene can be ray-traced by individual GPUs, and the compute cost of fully ray-tracing successive frames varies so wildly that it's... just not possible in the next generation, or the one after - we shall see how many generations it takes. So the idea that GPUs could have started out doing ray tracing from the beginning is... hmmm...
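For context on both halves of that argument, here is the core of ray tracing in miniature - a primary-ray/sphere intersection test. It shows why ray tracing parallelizes so well (each pixel's ray is independent) and also hints at the cost variability: rays that hit geometry spawn more follow-up work than rays that miss. This is a generic textbook sketch, not any vendor's implementation:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    # Solve |o + t*d - c|^2 = r^2 for t; a hit needs a real root with t > 0.
    ox, oy, oz = origin; dx, dy, dz = direction; cx, cy, cz = center
    lx, ly, lz = ox - cx, oy - cy, oz - cz
    a = dx*dx + dy*dy + dz*dz
    b = 2 * (lx*dx + ly*dy + lz*dz)
    c = lx*lx + ly*ly + lz*lz - radius*radius
    disc = b*b - 4*a*c
    if disc < 0:
        return False                      # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2*a)
    return t > 0                          # nearest hit must be in front of us

# Fire one independent primary ray per pixel of a 16x16 "screen"
# at a unit sphere 5 units down the z-axis.
hits = 0
for y in range(16):
    for x in range(16):
        d = ((x - 7.5) / 16, (y - 7.5) / 16, 1.0)
        if ray_sphere_hit((0, 0, 0), d, (0, 0, 5), 1.0):
            hits += 1
print(hits)  # pixels whose primary ray hits the sphere
```

Every pixel here could run on its own core with no coordination - which is exactly why GPUs are a natural fit for ray tracing, whatever one thinks of the current hybrid implementations.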