Wednesday, June 21st 2017

CryEngine to Support Vulkan Renderer in Upcoming 5.4 Update

CryEngine, the rendering prodigy responsible for some of the most visually impressive titles ever to grace our personal computing and gaming shores, is getting a Vulkan renderer. The news was broken by the team at Crytek through a blog post, in which they reaffirmed their commitment to proper GitHub support and updates for their game engine. The company puts it this way:

"Vulkan renderer
Following on from the renderer refactoring and DirectX 12 implementation, the team has been hard at work implementing a Vulkan renderer. The code can be seen in Code/RenderDll/XRenderD3D9/Vulkan/… although the feature is not functional, yet. We want to make these changes available to you for review whilst we are currently stabilizing the engine for our 5.4 release. So you can track our progress on GitHub until 5.4 is finally here by the end of July."
This comes as good news for everyone, I wager, since Vulkan has been showing more promise in actual performance improvements in real-world gaming scenarios than Microsoft's poster child, DX12. Granted, CryEngine isn't the resource hog and graphics-card humbler it once was, back when it birthed the famous "But can it run Crysis?" adage; it has turned from one of the more resource-intensive engines into a more streamlined, arguably better-performing one. Here's hoping Crytek's upcoming Hunt: Showdown supports the Vulkan renderer out of the gate. A game as graphically beautiful as that one clearly deserves the performance to go with it.
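To put the announcement in a little technical context, the sketch below shows the very first steps any Vulkan renderer backend has to take: creating a VkInstance and enumerating the GPUs it could target. This is a minimal illustration in plain C against the standard Vulkan loader, not code from CryEngine's repository; the file and application names are made-up placeholders.

/* vulkan_probe.c - a hypothetical, minimal Vulkan bring-up sketch (not CryEngine code).
   Build on a system with the Vulkan SDK installed, e.g.: cc vulkan_probe.c -lvulkan */
#include <stdio.h>
#include <vulkan/vulkan.h>

int main(void)
{
    /* Describe the application; these names are placeholders. */
    VkApplicationInfo app_info = {0};
    app_info.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app_info.pApplicationName = "vulkan-probe";
    app_info.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo create_info = {0};
    create_info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    create_info.pApplicationInfo = &app_info;

    /* Creating the instance is the entry point of every Vulkan backend. */
    VkInstance instance;
    if (vkCreateInstance(&create_info, NULL, &instance) != VK_SUCCESS) {
        fprintf(stderr, "No working Vulkan loader/driver found on this system.\n");
        return 1;
    }

    /* Count the physical devices a renderer could pick from. */
    uint32_t gpu_count = 0;
    vkEnumeratePhysicalDevices(instance, &gpu_count, NULL);
    printf("Vulkan-capable GPUs found: %u\n", (unsigned)gpu_count);

    vkDestroyInstance(instance, NULL);
    return 0;
}

Everything that follows those calls (logical devices, queues, swapchains, pipelines, command buffers) is what makes a full renderer backend such a sizeable effort, which helps explain why Crytek is publishing the work-in-progress code for review before it is functional.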
Source: Cryengine.com

31 Comments on CryEngine to Support Vulkan Renderer in Upcoming 5.4 Update

#1
Unregistered
More widespread Vulkan support is good for me as a likely early Vega adopter. Definitely not complaining! ;)
#2
TheGuruStud
Does this mean they'll stop accepting those truckloads of cash from Nvidia? Lol
#3
Fluffmeister
TheGuruStud: Does this mean they'll stop accepting those truckloads of cash from Nvidia? Lol
Did Nvidia steal your firstborn child or something?
#4
Aquinus
Resident Wat-man
Fluffmeister: Did Nvidia steal your firstborn child or something?
That or a kidney, but that's the price you pay, right? :laugh:
#5
TheGuruStud
Fluffmeister: Did Nvidia steal your firstborn child or something?
I guess everyone forgot about the millions they gave Crytek, and Nvidia cheating with insane tessellation on flat objects and an underground river. Any time devs are questioned about them/Nvidia rigging their games, it's "Our lawyers won't let us comment."

Nope, not gonna let it go.
#6
Fluffmeister
TheGuruStud: I guess everyone forgot about the millions they gave Crytek, and Nvidia cheating with insane tessellation on flat objects and an underground river. Any time devs are questioned about them/Nvidia rigging their games, it's "Our lawyers won't let us comment."

Nope, not gonna let it go.
I doubt people forgot; they probably just have bigger issues to deal with in their lives.

But don't let it go, scream and shout, fight the power! Never ever let it go.
#7
cdawall
where the hell are my stars
To be fair, maybe AMD should just get better at tessellation?
#8
TheGuruStud
cdawall: To be fair, maybe AMD should just get better at tessellation?
That's irrelevant. It hurt Nvidia's perf by 17% (AMD's by 30%) with no visual improvement.

And if anyone recalls, as soon as the new AMD cards came out with great tessellation perf, all of a sudden it wasn't overly crammed into games and wasn't important. Yeah, just a coincidence.

Short memories are for fools to folly again and again (then victim blame out of their own stupidity).
#9
Fluffmeister
Indeed, I for one will never forget the day AMD bought an Advertorial on TPU.
#10
cdawall
where the hell are my stars
TheGuruStud: That's irrelevant. It hurt Nvidia's perf by 17% (AMD's by 30%) with no visual improvement.

And if anyone recalls, as soon as the new AMD cards came out with great tessellation perf, all of a sudden it wasn't overly crammed into games and wasn't important. Yeah, just a coincidence.

Short memories are for fools to folly again and again (then victim blame out of their own stupidity).
You forget: I remember, I just don't care. AMD isn't perfect, nor is Nvidia, but at least one of the two companies actually released a new GPU product this year.
#11
TheGuruStud
cdawall: You forget: I remember, I just don't care. AMD isn't perfect, nor is Nvidia, but at least one of the two companies actually released a new GPU product this year.
AMD or the refresh times 3? /serious
#12
cdawall
where the hell are my stars
TheGuruStud: AMD or the refresh times 3? /serious
The 580 certainly wasn't anything new; it didn't even offer better performance per watt. It was just as bad as, if not worse than, the 480. As it sits right now, we are seeing some massive shortfalls in the way their memory controllers work.
#13
3lfk1ng
This was to be expected when the Star Citizen team announced that they would be dropping support for DX11/DX12 in favor of going 100% Vulkan.
#14
jigar2speed
cdawall: To be fair, maybe AMD should just get better at tessellation?
So that they can tessellate the heck out of it even more??? Nice logic. :shadedshu:
cdawall: You forget: I remember, I just don't care. AMD isn't perfect, nor is Nvidia, but at least one of the two companies actually released a new GPU product this year.
Hey, at least one of the two companies did actually release a new CPU this year. /s
#15
FordGT90Concept
"I go fast!1!11!1!"
3lfk1ng: This was to be expected when the Star Citizen team announced that they would be dropping support for DX11/DX12 in favor of going 100% Vulkan.
They dropped Crytek and went with Amazon. I'm not even certain Star Citizen benefits from this announcement, because RSI changed engines.
#16
cdawall
where the hell are my stars
jigar2speed: So that they can tessellate the heck out of it even more??? Nice logic. :shadedshu:

Hey, at least one of the two companies did actually release a new CPU this year. /s
Nvidia doesn't make CPUs that aren't ARM-based. The new AMD chips, while inexpensive, still disappointed me. So many issues; a real eye-opener as to what's going on there.

If both companies offered similar performance in games, this nonsense where you can nitpick which card to buy based off of one batch of rendering wouldn't exist. AMD needs to stop acting like a second-tier company. Now mind you, I flip-flop between these companies like it's going out of style, but the last couple of AMD generations have been a letdown...
#17
3lfk1ng
FordGT90Concept: They dropped Crytek and went with Amazon. I'm not even certain Star Citizen benefits from this announcement, because RSI changed engines.
Amazon's Lumberyard engine is still Crytek's; in fact, it's based on the same 3.8.1 fork. They are still planning to switch to Vulkan to support all Windows versions and Linux, as linked in my earlier post.
#18
Dethroy
When will the flame wars finally come to an end ... :kookoo:
#19
FordGT90Concept
"I go fast!1!11!1!"
3lfk1ng: Amazon's Lumberyard engine is still Crytek's; in fact, it's based on the same 3.8.1 fork. They are still planning to switch to Vulkan to support all Windows versions and Linux, as linked in my earlier post.
We're talking CryEngine 5.4 here, though. That fork was a long time ago. Until I see Amazon/RSI confirm it's relevant, I'll believe it's not.
#20
Liviu Cojocaru
Ohhh, the old AMD vs. Nvidia war... I used to be an AMD fan when I was younger; they had great products back then (Athlon 64 times). Nowadays I am just realistic. On one hand, I see AMD trying their best to come up with innovative products at a decent price but failing on the reliability side; on the other hand, I see Nvidia releasing powerful and reliable hardware that is really expensive. I am grateful to both companies (with the good and the bad) for the advances in technology that we see today.
#21
Unregistered
Liviu Cojocaru: Ohhh, the old AMD vs. Nvidia war... I used to be an AMD fan when I was younger; they had great products back then (Athlon 64 times). Nowadays I am just realistic. On one hand, I see AMD trying their best to come up with innovative products at a decent price but failing on the reliability side; on the other hand, I see Nvidia releasing powerful and reliable hardware that is really expensive. I am grateful to both companies (with the good and the bad) for the advances in technology that we see today.
RX 580: reliable and cheap. Ryzen a few months after launch: reliable and cheap. Intel i9: unreliable and expensive. i9-7940X and up: good luck cooling them and getting a higher OC than Threadripper (with the 7960X or 7980XE).

Pascal wasn't that reliable either at launch. Some cards ran at 900 MHz or less before BIOS updates!
#22
cdawall
where the hell are my stars
Hugh Mungus: RX 580: reliable and cheap. Ryzen a few months after launch: reliable and cheap. Intel i9: unreliable and expensive. i9-7940X and up: good luck cooling them and getting a higher OC than Threadripper (with the 7960X or 7980XE).

Pascal wasn't that reliable either at launch. Some cards ran at 900 MHz or less before BIOS updates!
How is Intel unreliable on their top-end stuff? Ryzen still has issues, and having to do monthly BIOS updates in the hope your RAM will work at rated speed is BS. AMD released an unfinished product.

For those betting on Threadripper, remember that every single issue Ryzen has will be amplified, since both CCX units have to speak across the Infinity Fabric. That means the PCIe root complex, memory for the quad-channel controller, anything the OS decides to swap between cores, etc. I cannot imagine the bottleneck we will see once you do something like load a game up with two cards in SLI.
#23
Unregistered
cdawall: How is Intel unreliable on their top-end stuff? Ryzen still has issues, and having to do monthly BIOS updates in the hope your RAM will work at rated speed is BS. AMD released an unfinished product.

For those betting on Threadripper, remember that every single issue Ryzen has will be amplified, since both CCX units have to speak across the Infinity Fabric. That means the PCIe root complex, memory for the quad-channel controller, anything the OS decides to swap between cores, etc. I cannot imagine the bottleneck we will see once you do something like load a game up with two cards in SLI.
Some motherboards barely work, CPU BIOSes are crap since Turbo 3.0 barely works, and there's some other stuff as well.

Ryzen hasn't really got issues anymore either, unless you mean the occasional driver update that makes things go wrong, but Intel's driver updates aren't much better, and it's mostly Windows updates ruining things nowadays.

Threadripper is essentially going to use matured drivers, and the extra CCXes aren't going to be an issue. Threadripper is likely going to use Epyc drivers with a few tweaks here and there, so they should be stable enough, and memory support is going to be great from the start since all those updates have already been implemented into motherboard BIOSes.
#24
cdawall
where the hell are my stars
Hugh Mungus: Some motherboards barely work, CPU BIOSes are crap since Turbo 3.0 barely works, and there's some other stuff as well.

Ryzen hasn't really got issues anymore either, unless you mean the occasional driver update that makes things go wrong, but Intel's driver updates aren't much better, and it's mostly Windows updates ruining things nowadays.

Threadripper is essentially going to use matured drivers, and the extra CCXes aren't going to be an issue. Threadripper is likely going to use Epyc drivers with a few tweaks here and there, so they should be stable enough, and memory support is going to be great from the start since all those updates have already been implemented into motherboard BIOSes.
I take it you don't actually own a Ryzen system?
#25
Unregistered
cdawall: I take it you don't actually own a Ryzen system?
Just stick with what works and check that the updates don't have bugs. Simple. Two minutes max.

No, I don't own Ryzen yet. Getting the 16-core Threadripper, though. I have done my research.