
Rise of the Tomb Raider to Get DirectX 12 Eye Candy Soon?

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,233 (7.55/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Rise of the Tomb Raider could be among the first AAA games to take advantage of DirectX 12, with developer Crystal Dynamics planning a massive update that adds a new renderer and new content (VFX, geometry, textures). The latest version of the game features an ominously named "DX12.dll" library in its folder, and while it doesn't support DirectX 12 at the moment, a renderer selection has appeared in the game's launcher. DirectX 12 is currently only offered on Windows 10, with hardware support on NVIDIA "Kepler" and "Maxwell" GPUs, and on AMD Graphics CoreNext 1.1 and 1.2 GPUs.
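For the curious, a renderer selector typically gates its DX12 entry on whether a D3D12 device can actually be created on the machine. A minimal sketch of such a probe, assuming nothing about Crystal Dynamics' actual launcher code:

```cpp
// Hypothetical launcher-side probe, not the game's real code: load d3d12.dll
// dynamically (it only ships with Windows 10) and ask the runtime whether a
// device with at least feature level 11_0 could be created.
#include <windows.h>
#include <d3d12.h>

bool SupportsDX12()
{
    HMODULE d3d12 = LoadLibraryW(L"d3d12.dll");   // absent before Windows 10
    if (!d3d12)
        return false;

    auto createDevice = reinterpret_cast<PFN_D3D12_CREATE_DEVICE>(
        GetProcAddress(d3d12, "D3D12CreateDevice"));
    if (!createDevice)
        return false;

    // Passing nullptr for ppDevice only verifies that creation would
    // succeed on the default adapter; no device is actually instantiated.
    return SUCCEEDED(createDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                  __uuidof(ID3D12Device), nullptr));
}
```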



 
Joined
Jan 2, 2015
Messages
1,099 (0.30/day)
Processor FX-6350 @ 4.2GHz / i5-4670K @ 4GHz
Video Card(s) HD 7850 / R9 290
Thanks.
Too much BS around about what supports what. Yeah, NVIDIA has to add more to the driver to get DX12 working right, and that adds overhead, but whatever; it will balance out, hopefully leaning towards the better side. Like when tessellation went wild, it balanced out enough that both camps had a fair chance. Back then it was NVIDIA holding the lead with tessellation performance, and now it's AMD with compute performance.
In a way they have both held each other back at times, for the sake of monopoly and the Windows ecosystem.
 

Solaris17

Super Dainty Moderator
Staff member
Joined
Aug 16, 2005
Messages
26,943 (3.83/day)
Location
Alabama
System Name RogueOne
Processor Xeon W9-3495x
Motherboard ASUS w790E Sage SE
Cooling SilverStone XE360-4677
Memory 128gb Gskill Zeta R5 DDR5 RDIMMs
Video Card(s) MSI SUPRIM Liquid X 4090
Storage 1x 2TB WD SN850X | 2x 8TB GAMMIX S70
Display(s) 49" Philips Evnia OLED (49M2C8900)
Case Thermaltake Core P3 Pro Snow
Audio Device(s) Moondrop S8's on Schiit Gunnr
Power Supply Seasonic Prime TX-1600
Mouse Lamzu Atlantis mini (White)
Keyboard Monsgeek M3 Lavender, Moondrop Luna lights
VR HMD Quest 3
Software Windows 11 Pro Workstation
Benchmark Scores I dont have time for that.
I can't wait until this becomes a reality. I am really curious (if this comes to pass) as to what the performance numbers will be after the change. I would also love it if TPU spearheaded a DX11 vs. DX12 comparison in this title, should we get the change.
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,048 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850W Gold (ATX3.0)
Software W10
I can't wait until this becomes a reality. I am really curious (if this comes to pass) as to what the performance numbers will be after the change. I would also love it if TPU spearheaded a DX11 vs. DX12 comparison in this title, should we get the change.

Given how good the game looks, any change in performance between DX11 and DX12 would be an interesting debate.
 
Joined
Sep 6, 2013
Messages
3,329 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later I got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
DirectX 12 is currently only offered on Windows 10, with hardware support on NVIDIA "Kepler" and "Maxwell" GPUs, and on AMD Graphics CoreNext 1.1 and 1.2 GPUs.

DX12 has also been supported on Fermi for a couple of months now. And GCN 1.0 has had DX12 support since the first day of Windows 10's official launch.
 
Joined
Apr 30, 2012
Messages
3,881 (0.85/day)
This probably just means the console extensions are being re-activated, since the game was already DX12 on Xbox One and was downgraded to DX11 for the PC.

Rise of the Tomb Raider uses async compute to render breathtaking volumetric lighting on Xbox One

During its presentation at SIGGRAPH 2015, Eidos Montreal revealed some interesting technical details regarding its forthcoming action adventure game, Rise of the Tomb Raider. The developer gave an overview of the advanced rendering techniques that are being used in Lara Croft’s latest outing.

Of all the rendering techniques used in the game, the most fascinating is its use of asynchronous compute for the generation of advanced volumetric lights. For this purpose, the developer has employed a resolution-agnostic voxel method, which allows volumetric lights to be rendered using asynchronous compute after the rendering of shadows, with correctly handled transparency composition.

The developer has also used its own in-house SSAO technique dubbed 'Broad Temporal Ambient Obscurance' (BTAO). The technique is inspired by SAO (Scalable Ambient Obscurance) and is claimed to be vastly superior to the industry-leading ambient occlusion technique, HBAO, in terms of both quality and performance. According to the developer, this new ambient occlusion technique is temporally stable and provides both near and far occlusion.

Like The Order: 1886 on the PS4, Rise of the Tomb Raider will also make use of the Sample Distribution Shadow Maps (SDSM) algorithm in order to allow efficient shadow map rendering. By adjusting itself to scene and camera position, the technique will enhance the overall shadow quality in the game.

Lastly, the game features procedural snow deformation that leverages compute shaders and hardware tessellation for fast and detailed results. Due to the agnostic nature of this approach, the developer believes that this technique will have many other applications in the future.
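To make the "asynchronous compute" part of the write-up concrete: in Direct3D 12 terms it generally means submitting compute work, such as the volumetric-lighting pass described above, on a dedicated compute queue that overlaps the graphics queue, with a fence guarding the hand-off. This is a sketch of the general pattern only; the game's actual renderer code is not public:

```cpp
// A second, compute-only queue runs alongside the direct (graphics) queue;
// a fence makes the graphics queue wait on the GPU for the compute results.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

struct AsyncCompute
{
    ComPtr<ID3D12CommandQueue> queue;
    ComPtr<ID3D12Fence>        fence;
    UINT64                     fenceValue = 0;
};

bool InitAsyncCompute(ID3D12Device* device, AsyncCompute& ac)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;    // compute-only queue
    if (FAILED(device->CreateCommandQueue(&desc, IID_PPV_ARGS(&ac.queue))))
        return false;
    return SUCCEEDED(device->CreateFence(0, D3D12_FENCE_FLAG_NONE,
                                         IID_PPV_ARGS(&ac.fence)));
}

// Call after ac.queue->ExecuteCommandLists(...) has submitted, say, the
// volumetric-light voxel pass:
void SyncGraphicsToCompute(AsyncCompute& ac, ID3D12CommandQueue* gfxQueue)
{
    ac.queue->Signal(ac.fence.Get(), ++ac.fenceValue); // mark compute done
    gfxQueue->Wait(ac.fence.Get(), ac.fenceValue);     // GPU-side wait, no CPU stall
}
```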
 

bug

Joined
May 22, 2015
Messages
13,764 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Or, DX12 may be a feature that got the axe and the DLL is a leftover. We'll see.
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
I wonder if DX12 will actually bring better performance, or if they'll fuck it up just like with EVERY single bloody "new and faster DX" version of a game.
DX11 also promised better performance compared to DX9 and 10, and it ended up being slower just because they crammed 500 trillion metric tons of unnecessary tessellation into games. And I think they'll do exactly the same nonsense with DX12. All the advertised boost will be gone because they'll cram in a bunch of detail you have to look for with a magnifying glass instead of just leaving things as they are and pocketing the performance gain.

I don't understand their logic. Sure, fancier graphics sell a bit better with enthusiast gamers, but better overall performance sells well with average gamers as well as enthusiasts, because it means the game will probably run super fast even at 4K with max details, which also counts for something these days. Not sure why they bet everything on enthusiasts yet cry in the same breath that PC gamers don't buy in big enough numbers. I wonder why. The latest Tomb Raider games already look good; there's no need to bloat them with quasi-quality and derp the performance entirely.
 
Joined
Jun 4, 2004
Messages
480 (0.06/day)
System Name Blackbird
Processor AMD Threadripper 3960X 24-core
Motherboard Gigabyte TRX40 Aorus Master
Cooling Full custom-loop water cooling, mostly Aqua Computer and EKWB stuff!
Memory 4x 16GB G.Skill Trident-Z RGB @3733-CL14
Video Card(s) Nvidia RTX 3090 FE
Storage Samsung 950PRO 512GB, Crucial P5 2TB, Samsung 850PRO 1TB
Display(s) LG 38GN950-B 38" IPS TFT, Dell U3011 30" IPS TFT
Case CaseLabs TH10A
Audio Device(s) Edifier S1000DB
Power Supply ASUS ROG Thor 1200W (SeaSonic)
Mouse Logitech MX Master
Keyboard SteelSeries Apex M800
Software MS Windows 10 Pro for Workstation
Benchmark Scores A lot.
I wonder if DX12 will actually bring better performance, or if they'll fuck it up just like with EVERY single bloody "new and faster DX" version of a game.
DX11 also promised better performance compared to DX9 and 10, and it ended up being slower just because they crammed 500 trillion metric tons of unnecessary tessellation into games. And I think they'll do exactly the same nonsense with DX12. All the advertised boost will be gone because they'll cram in a bunch of detail you have to look for with a magnifying glass instead of just leaving things as they are and pocketing the performance gain.

I don't understand their logic. Sure, fancier graphics sell a bit better with enthusiast gamers, but better overall performance sells well with average gamers as well as enthusiasts, because it means the game will probably run super fast even at 4K with max details, which also counts for something these days. Not sure why they bet everything on enthusiasts yet cry in the same breath that PC gamers don't buy in big enough numbers. I wonder why. The latest Tomb Raider games already look good; there's no need to bloat them with quasi-quality and derp the performance entirely.

I don't quite agree. If you want your games to run faster, tone down the details! What's the point of saying your game runs super fast with full details while at the same time demanding that developers not pack too much detail into these games so that performance stays up? You know that cranking up the details is totally optional, right? :rolleyes:

I mean, if game performance were crippled even at absolutely low settings, this might be a valid argument, but talking about max settings and demanding they be reduced makes no sense at all.
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
I don't quite agree. If you want your games to run faster, tone down the details! What's the point of saying your game runs super fast with full details while at the same time demanding that developers not pack too much detail into these games so that performance stays up? You know that cranking up the details is totally optional, right? :rolleyes:

I mean, if game performance were crippled even at absolutely low settings, this might be a valid argument, but talking about max settings and demanding they be reduced makes no sense at all.

You don't understand what they've been doing with tessellation. 300 billion polygons, all wasted on some pointless shit you can't even see, while railings 2 meters away are all blocky and made of 15 polygons. It's stupid: not only does it make the game run like shit even on top-end rigs, it totally defeats the point of making things faster if you then waste all the gains on pointless, stupid nonsense.

And I have a feeling they'll do exactly the same nonsense with the DX12 gains. The reason I'm ranting about it? Games still look like shite and run like they used to, meaning you don't really improve anything: not performance and not visuals. Priorities. Something devs on consoles actually understand, but on PC, well, fuck that, gamers can just buy an extra 32GB of RAM and quad-SLI cards because devs are lazy and can't be bothered to do shit properly. That's why.

Also, your logic doesn't float. So I'll be decreasing my game details, making the game actually look worse, to gain performance, just because someone decided to spend 300 billion polygons on some pointless rock on the side of the road? I've seen far too many badly designed games to just forget about it. We could make leaps in performance while increasing the overall level of detail in all games thanks to the performance boost of DX12, but instead they'll just waste it all on pointless nonsense.
 
Joined
Jan 2, 2015
Messages
1,099 (0.30/day)
Processor FX-6350 @ 4.2GHz / i5-4670K @ 4GHz
Video Card(s) HD 7850 / R9 290
You don't understand what they've been doing with tessellation. 300 billion polygons, all wasted on some pointless shit you can't even see, while railings 2 meters away are all blocky and made of 15 polygons. It's stupid: not only does it make the game run like shit even on top-end rigs, it totally defeats the point of making things faster if you then waste all the gains on pointless, stupid nonsense.

And I have a feeling they'll do exactly the same nonsense with the DX12 gains. The reason I'm ranting about it? Games still look like shite and run like they used to, meaning you don't really improve anything: not performance and not visuals. Priorities. Something devs on consoles actually understand, but on PC, well, fuck that, gamers can just buy an extra 32GB of RAM and quad-SLI cards because devs are lazy and can't be bothered to do shit properly. That's why.

Also, your logic doesn't float. So I'll be decreasing my game details, making the game actually look worse, to gain performance, just because someone decided to spend 300 billion polygons on some pointless rock on the side of the road? I've seen far too many badly designed games to just forget about it. We could make leaps in performance while increasing the overall level of detail in all games thanks to the performance boost of DX12, but instead they'll just waste it all on pointless nonsense.

Hopefully there will be a nice surprise for us.
 
Joined
Apr 29, 2014
Messages
4,290 (1.11/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtek ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
Sounds awesome, I want to see what it changes!
 
Joined
Aug 15, 2008
Messages
5,941 (1.00/day)
Location
Watauga, Texas
System Name Univac SLI Edition
Processor Intel Xeon 1650 V3 @ 4.2GHz
Motherboard eVGA X99 FTW K
Cooling EK Supremacy EVO, Swiftech MCP50x, Alphacool NexXxoS UT60 360, Black Ice GTX 360
Memory 2x16GB Corsair Vengeance LPX 3000MHz
Video Card(s) Nvidia Titan X Tri-SLI w/ EK Blocks
Storage HyperX Predator 240GB PCI-E, Samsung 850 Pro 512GB
Display(s) Dell UltraSharp 34" Ultra-Wide (U3415W) / (Samsung 48" Curved 4k)
Case Phanteks Enthoo Pro M Acrylic Edition
Audio Device(s) Sound Blaster Z
Power Supply Thermaltake 1350watt Toughpower Modular
Mouse Logitech G502
Keyboard CODE 10 keyless MX Clears
Software Windows 10 Pro
DX12 comes with a lot of advancements that previous iterations can't compare to. Overhead and draw calls aren't really something a lazy dev can fuck up now.
 
Joined
Jun 4, 2004
Messages
480 (0.06/day)
System Name Blackbird
Processor AMD Threadripper 3960X 24-core
Motherboard Gigabyte TRX40 Aorus Master
Cooling Full custom-loop water cooling, mostly Aqua Computer and EKWB stuff!
Memory 4x 16GB G.Skill Trident-Z RGB @3733-CL14
Video Card(s) Nvidia RTX 3090 FE
Storage Samsung 950PRO 512GB, Crucial P5 2TB, Samsung 850PRO 1TB
Display(s) LG 38GN950-B 38" IPS TFT, Dell U3011 30" IPS TFT
Case CaseLabs TH10A
Audio Device(s) Edifier S1000DB
Power Supply ASUS ROG Thor 1200W (SeaSonic)
Mouse Logitech MX Master
Keyboard SteelSeries Apex M800
Software MS Windows 10 Pro for Workstation
Benchmark Scores A lot.
You don't understand what they've been doing with tessellation. 300 billion polygons, all wasted on some pointless shit you can't even see, while railings 2 meters away are all blocky and made of 15 polygons. It's stupid: not only does it make the game run like shit even on top-end rigs, it totally defeats the point of making things faster if you then waste all the gains on pointless, stupid nonsense.

And I have a feeling they'll do exactly the same nonsense with the DX12 gains. The reason I'm ranting about it? Games still look like shite and run like they used to, meaning you don't really improve anything: not performance and not visuals. Priorities. Something devs on consoles actually understand, but on PC, well, fuck that, gamers can just buy an extra 32GB of RAM and quad-SLI cards because devs are lazy and can't be bothered to do shit properly. That's why.

Also, your logic doesn't float. So I'll be decreasing my game details, making the game actually look worse, to gain performance, just because someone decided to spend 300 billion polygons on some pointless rock on the side of the road? I've seen far too many badly designed games to just forget about it. We could make leaps in performance while increasing the overall level of detail in all games thanks to the performance boost of DX12, but instead they'll just waste it all on pointless nonsense.

300 billion is probably a bit much. ;)
Nonetheless, tessellation in combination with displacement maps is a very interesting feature which adds a lot of realism to a rendered scene. All I read out of your rant is that you want control over the tessellation iterations, and probably a way to reduce tessellation when the surface being tessellated is farther away (some kind of LoD). I'm pretty sure the latter is already implemented in most games, if not all (although maybe not configurable).
If you don't like tessellation at all, it can be switched off in the settings dialog. If you want tessellation enabled but need to adjust the amount, this can be configured (at least with NVIDIA) in the application profile for said game, under 'Maximum Tessellation Level'.

Regarding the performance of Tomb Raider: on my system, the 2013 Tomb Raider game looked worse than the 2016 version (obviously) and also performed worse FPS-wise (I game at 1600p, by the way, which is about 2x the 1080p pixel count).
Back then I had an AMD HD 6970; today I own a pair of NVIDIA 980 Tis.
But even today, with the SLI setup, both games stay above 60 FPS at all times with everything turned up and on. So no reason to complain about sub-par performance here.
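The distance-based fall-off described above is easy to picture as a function: full subdivision up close, tapering to none at range. A toy illustration with made-up constants; real engines evaluate this per patch in the hull shader:

```cpp
// Toy distance-based tessellation LoD: returns the subdivision factor for a
// patch at a given camera distance. All names and numbers are illustrative.
#include <algorithm>

float TessellationFactor(float distance,
                         float maxFactor = 16.0f,   // cap, cf. driver override
                         float nearDist  = 5.0f,    // full detail inside this
                         float farDist   = 100.0f)  // no subdivision beyond
{
    // t is 0 at nearDist (or closer) and 1 at farDist (or farther).
    float t = std::clamp((distance - nearDist) / (farDist - nearDist),
                         0.0f, 1.0f);
    // Blend from maxFactor down to 1 (a factor of 1 leaves the triangle
    // alone), so a distant rock never eats the polygon budget.
    return maxFactor + t * (1.0f - maxFactor);
}
```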
 
Joined
Jan 2, 2015
Messages
1,099 (0.30/day)
Processor FX-6350 @ 4.2GHz / i5-4670K @ 4GHz
Video Card(s) HD 7850 / R9 290
300 billion is probably a bit much. ;)
Nonetheless, tessellation in combination with displacement maps is a very interesting feature which adds a lot of realism to a rendered scene. All I read out of your rant is that you want control over the tessellation iterations, and probably a way to reduce tessellation when the surface being tessellated is farther away (some kind of LoD). I'm pretty sure the latter is already implemented in most games, if not all (although maybe not configurable).
If you don't like tessellation at all, it can be switched off in the settings dialog. If you want tessellation enabled but need to adjust the amount, this can be configured (at least with NVIDIA) in the application profile for said game, under 'Maximum Tessellation Level'.

Regarding the performance of Tomb Raider: on my system, the 2013 Tomb Raider game looked worse than the 2016 version (obviously) and also performed worse FPS-wise (I game at 1600p, by the way, which is about 2x the 1080p pixel count).
Back then I had an AMD HD 6970; today I own a pair of NVIDIA 980 Tis.
But even today, with the SLI setup, both games stay above 60 FPS at all times with everything turned up and on. So no reason to complain about sub-par performance here.
AMD has all the tessellation settings too, including an 'AMD Optimized' mode for taming games that go wild with tessellation.
 

Brother Drake

New Member
Joined
Jul 24, 2015
Messages
10 (0.00/day)
I wish developers would stop saying their game is DX12 when they just paste on one or two minor features. If a game is truly DX12, it has to be designed that way from the ground up. To me, the most important feature of DX12 is the ability to use multiple non-identical GPUs. I especially like that DX12 can utilize the iGPU for post-processing while the graphics card(s) do the rest of the work. It's estimated that this will provide at least a 20% performance boost without costing anything at all. Free? I like it! Also, I can keep using my Maxwell GPU when I upgrade to Pascal. No more SLI/CrossFire bridges.
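For context on what "multiple non-identical GPUs" means at the API level: DX12's explicit multi-adapter model hands the application one independent device per GPU, iGPU included, and leaves the work-splitting (such as moving post-processing onto the iGPU) entirely to the developer. A rough sketch of just the enumeration step, illustrative only:

```cpp
// Enumerate every DX12-capable adapter in the system and create a device on
// each. How work is then split between them is up to the application; none
// of this is any particular game's code.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <vector>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateAllDevices()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;                       // skip the WARP software adapter

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);      // one independent device per GPU
    }
    return devices;
}
```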
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
If they offered two controllable settings, something like "the quality you have now, with the awesome DX12 framerate boost" and "use all the settings in an unnecessarily stupendous way", I'd be fine with it. But my hunch says they'll go the stupendous way only...
 
Joined
Nov 18, 2010
Messages
7,531 (1.47/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140. EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX. Water block. Crossflashed.
Storage Optane 900P[Fedora] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO+SN560 1TB(W11)
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) SMSL RAW-MDA1 DAC
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 41
Steam just got an update that mentions DX12 support.
 

Solaris17

Super Dainty Moderator
Staff member
Joined
Aug 16, 2005
Messages
26,943 (3.83/day)
Location
Alabama
System Name RogueOne
Processor Xeon W9-3495x
Motherboard ASUS w790E Sage SE
Cooling SilverStone XE360-4677
Memory 128gb Gskill Zeta R5 DDR5 RDIMMs
Video Card(s) MSI SUPRIM Liquid X 4090
Storage 1x 2TB WD SN850X | 2x 8TB GAMMIX S70
Display(s) 49" Philips Evnia OLED (49M2C8900)
Case Thermaltake Core P3 Pro Snow
Audio Device(s) Moondrop S8's on Schiit Gunnr
Power Supply Seasonic Prime TX-1600
Mouse Lamzu Atlantis mini (White)
Keyboard Monsgeek M3 Lavender, Moondrop Luna lights
VR HMD Quest 3
Software Windows 11 Pro Workstation
Benchmark Scores I dont have time for that.
Joined
Jan 2, 2015
Messages
1,099 (0.30/day)
Processor FX-6350 @ 4.2GHz / i5-4670K @ 4GHz
Video Card(s) HD 7850 / R9 290
I wish developers would stop saying their game is DX12 when they just paste on one or two minor features. If a game is truly DX12, it has to be designed that way from the ground up.

Nah, not really. DX12 is an evolution of what we've had since the DX9 days; there are plenty of DX11 games that can be played in DX9 mode, and DX12 is DX11 with draw calls off the chart, so there's no more wasted GPU power. The feature you call most important will most likely put you in a minority, even though it is really cool, given that only a small percentage of users even run more than one GPU. And getting AMD and NVIDIA to play ball with each other to make it happen is a slim chance, since it would bring slim profit, if any at all.
 
Joined
Feb 9, 2009
Messages
1,618 (0.28/day)
blah blah blah crysis 2
What a load of CRAP. Every single one? I'm not sorry, you're spreading FUD.

Which games or benchmarks (that aren't Crysis 2, HairWorks-enabled, or unoptimized Ubisoft titles) have so much 'tessellation' (as if that's the only new feature devs use) that it causes worse-than-DX9 or even worse-than-OpenGL performance, with no option to turn it off or override it in the driver?

Star Trek Online's DX11 update with identical visuals brought so much performance that they had to put out PR about it.

Even in the DX10 days, when I had a 4870X2, Crysis Warhead was smoother in DX10 than in DX9 even if the FPS counter was a few numbers lower. I have never seen a regression with a new DX version unless it was brand-new buggy drivers or a brand-new OS... plus, NVIDIA has proven multithreaded gains.

The detail I see hurting FPS the most is HBAO/HDAO, so I switch to SSAO or turn it off instead, given that my 570M and 660 aren't powerful GPUs... don't you think I would be among the first to notice problems with DX11? How can I trust your word, with your high-end specs, without real examples?

By the way, Crysis 2 runs great for how it looks, and you are free to turn off tessellation or tweak the cvars rather than switching to DX9.

Also, Breit's logic is fine if you're only turning down the newly added details that are already disabled in the previous DX version of the game.

I wish developers would stop saying their game is DX12 when they just paste on one or two minor features. If a game is truly DX12, it has to be designed that way from the ground up. To me, the most important feature of DX12 is the ability to use multiple non-identical GPUs. I especially like that DX12 can utilize the iGPU for post-processing while the graphics card(s) do the rest of the work. It's estimated that this will provide at least a 20% performance boost without costing anything at all. Free? I like it! Also, I can keep using my Maxwell GPU when I upgrade to Pascal. No more SLI/CrossFire bridges.
But that's an optional feature... if it's running DX12, it IS DX12. What else can they call it? 'Getting DX12' is different from 'is DX12'.

Your dream should be 'built from the ground up for DX12', a.k.a. exclusive.
 
Joined
Aug 15, 2008
Messages
5,941 (1.00/day)
Location
Watauga, Texas
System Name Univac SLI Edition
Processor Intel Xeon 1650 V3 @ 4.2GHz
Motherboard eVGA X99 FTW K
Cooling EK Supremacy EVO, Swiftech MCP50x, Alphacool NexXxoS UT60 360, Black Ice GTX 360
Memory 2x16GB Corsair Vengeance LPX 3000MHz
Video Card(s) Nvidia Titan X Tri-SLI w/ EK Blocks
Storage HyperX Predator 240GB PCI-E, Samsung 850 Pro 512GB
Display(s) Dell UltraSharp 34" Ultra-Wide (U3415W) / (Samsung 48" Curved 4k)
Case Phanteks Enthoo Pro M Acrylic Edition
Audio Device(s) Sound Blaster Z
Power Supply Thermaltake 1350watt Toughpower Modular
Mouse Logitech G502
Keyboard CODE 10 keyless MX Clears
Software Windows 10 Pro
What a load of CRAP. Every single one? I'm not sorry, you're spreading FUD.

Which games or benchmarks (that aren't Crysis 2, HairWorks-enabled, or unoptimized Ubisoft titles) have so much 'tessellation' (as if that's the only new feature devs use) that it causes worse-than-DX9 or even worse-than-OpenGL performance, with no option to turn it off or override it in the driver?

Star Trek Online's DX11 update with identical visuals brought so much performance that they had to put out PR about it.

Even in the DX10 days, when I had a 4870X2, Crysis Warhead was smoother in DX10 than in DX9 even if the FPS counter was a few numbers lower. I have never seen a regression with a new DX version unless it was brand-new buggy drivers or a brand-new OS... plus, NVIDIA has proven multithreaded gains.

The detail I see hurting FPS the most is HBAO/HDAO, so I switch to SSAO or turn it off instead, given that my 570M and 660 aren't powerful GPUs... don't you think I would be among the first to notice problems with DX11? How can I trust your word, with your high-end specs, without real examples?

By the way, Crysis 2 runs great for how it looks, and you are free to turn off tessellation or tweak the cvars rather than switching to DX9.

Also, Breit's logic is fine if you're only turning down the newly added details that are already disabled in the previous DX version of the game.


But that's an optional feature... if it's running DX12, it IS DX12. What else can they call it? 'Getting DX12' is different from 'is DX12'.

Your dream should be 'built from the ground up for DX12', a.k.a. exclusive.
Crysis and Warhead both ran like poop on my 4870 CrossFire setup in DX10. That was with 8GB of RAM on a custom-tweaked Vista install with a high-clocked E8400. Not rebutting the rest of your post; I just found the 4870X2 comment a tad odd compared to my own experience.
 
Joined
Feb 9, 2009
Messages
1,618 (0.28/day)
Crysis and Warhead both ran like poop on my 4870 CrossFire setup in DX10. That was with 8GB of RAM on a custom-tweaked Vista install with a high-clocked E8400. Not rebutting the rest of your post; I just found the 4870X2 comment a tad odd compared to my own experience.
Which driver? I didn't mean it was fine around the game's launch, so I'm also trying to remember; I have the Fraps shots on another hard drive... but there was a definite change in the driver in summer 2009 that greatly reduced input lag during vsync, so it's possibly around that time (Far Cry 2 was also improved; since then the DX10 minimum has been equal to the DX9 minimum, with average and max DX10 higher than DX9, not to mention less stuttering).

It should be in the release notes too, the same way there was a DX11/BF3 golden driver in 2012.

Edit: Q9550 @ 3.6GHz, 4GB RAM... are you additionally saying a single 4870 ran better than CrossFire, or simply that DX9 ran much more acceptably?
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.10/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
I'm less concerned with DX12 eye candy and more concerned with DX12 performance improvements.
 