Short review of Shadow of the Tomb Raider's gameplay mechanics
I own the early access version and ran the built-in benchmark on both DX11 and DX12. My observations:

1. Performance is roughly 20% higher than in Rise of the Tomb Raider, going by the official benchmark scores. At 4213x1764 with everything at ultra settings and SMAA, SOTTR averages around 112 FPS with 1080 Ti SLI. Note that SOTTR offers some extra graphics options not available in RotTR.

2. In DX12, multi-GPU scaling stays near 100%. There is currently no SLI profile for DX11; switching to the profiles from Rise of the Tomb Raider or Tomb Raider (2013) brings GPU load down to about 70% on each card, with FPS steady around 45-50.

3. DX12 tends to be quite unstable. I haven't played the game itself yet, but changing the AA method crashed the benchmark several times and forced restarts, which may be a driver issue; past reports suggest this is linked to DX12 mode. If stability improves, I'd suggest DX12 for SLI users and DX11 for everyone else. By the way, VXAO doesn't work in DX11 for some reason.

I'll update once I try the game myself, but I need to leave now ;-)
My game runs smoothly in DX12; the only hiccups are occasional glitches when adjusting settings like screen mode or resolution, and those usually resolve on their own. Some people think it looks better than DX11, and I've found DX12 often delivers a more polished experience overall. It looks and performs exceptionally well, especially in multi-GPU setups. Overall, I'm really satisfied with the visuals and gameplay.
Agree overall, with a few minor hiccups. The game runs smoothly and looks great, offering an immersive experience. It’s one of those rare titles that really draws you in and keeps you engaged.
DX12 mGPU also supports SMAAT2X and SMAA4X with the updated drivers, with the performance gains noted above.
There you go.

Settings: resolution 4213x1764 via DSR, everything on Ultra/maxed, V-Sync off.

Notes before the results:
1. The benchmark has 3 scenes, and GPU usage varies depending on the scene.
2. There is a discrepancy between the FPS shown by MSI Afterburner and the in-game FPS counter during the benchmark.

Results:
- SMAA: avg. 93 FPS. GPU usage: scene 1: 100%, scene 2: 100%, scene 3: 85-100% (least demanding scene).
- SMAAT2X: avg. 74 FPS. GPU usage: scene 1: 100%, scene 2: 67-84% (mostly 70-80%), scene 3: 53-100% (mostly 60-80%, 100% only for a couple of seconds).
- SMAA4X: avg. 55 FPS. GPU usage: scene 1: 100%, scene 2: 80-90%, scene 3: 55-100% (mostly 60-75%, 100% only for a couple of seconds).

Extra notes: the disparity between the MSI Afterburner FPS counter and the in-game counter is huge with SMAAT2X and SMAA4X; we're talking a 20 FPS difference in extreme cases (50+ shown by MSI vs. 70+ shown by the game). With plain SMAA the difference is relatively small (never more than 5-6 FPS). Both SMAAT2X and SMAA4X also suffer from extreme flickering and stutters during scene 3.

Hahaha. When I first used that one and saw about 50 FPS, I just assumed SLI simply wasn't working. I would never have guessed that the 50 FPS I was getting was ALREADY an improvement over 37 FPS on a single GPU. Geez.
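For what it's worth, the scaling math behind that last remark is easy to sketch. This is just a back-of-the-envelope script using the 37 FPS single-GPU and ~50 FPS dual-GPU figures mentioned above; the function name is my own, not from any tool:

```python
# Rough multi-GPU scaling math from the numbers in the post above.
# With perfect (linear) scaling, 2 GPUs would double the frame rate;
# this computes how much of that ideal uplift was actually achieved.

def scaling_efficiency(multi_fps: float, single_fps: float, gpus: int = 2) -> float:
    """Fraction of ideal linear scaling actually achieved."""
    return (multi_fps / single_fps - 1) / (gpus - 1)

uplift = 50 / 37 - 1              # ~0.35, i.e. about a 35% FPS uplift
eff = scaling_efficiency(50, 37)  # ~0.35 of perfect 2-GPU scaling

print(f"uplift: {uplift:.0%}, 2-GPU scaling efficiency: {eff:.0%}")
```

So ~50 FPS vs. 37 FPS is roughly a 35% uplift, which is real SLI gain even though it looks unimpressive at first glance.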
It's encouraging to see SMAAT2X and SMAA4X performing better with DX12 mGPU than with SLI. That's definitely a positive trend going forward, and I hope more developers take notice. Thanks for checking it out, lol, that's all I needed! From the user tests I've seen, DX11 tends to be less efficient whether running a single GPU or multiple GPUs.
That's odd, since I don't recall Rise performing poorly in DX11; DX12 was still better there, but only by around 10-15 FPS at most. A quick test with the shadow setting reduced to normal (as recommended on Reddit) and AA disabled completely gave me an average of 98 FPS. So it seems safe to say standard SMAA is essentially free. Also, my CPU usage never exceeded 30% in that scenario.
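Those averages also let you express the AA cost in frame-time terms, which is less misleading than raw FPS deltas. A minimal sketch, using the 98/93/74/55 FPS figures posted above; keep in mind the 98 FPS run used normal shadows rather than ultra, so this is only a rough comparison:

```python
# Convert the posted benchmark averages (FPS) into per-frame cost in
# milliseconds relative to the no-AA baseline. FPS deltas understate
# the cost at high frame rates; frame time does not.

BASELINE_FPS = 98.0  # no AA, shadows at normal (figure from the post above)

aa_modes = {"SMAA": 93.0, "SMAAT2X": 74.0, "SMAA4X": 55.0}

def frame_time_ms(fps: float) -> float:
    """Average time per frame in milliseconds."""
    return 1000.0 / fps

for mode, fps in aa_modes.items():
    cost = frame_time_ms(fps) - frame_time_ms(BASELINE_FPS)
    print(f"{mode}: {cost:.2f} ms per frame over the no-AA baseline")
```

By this measure SMAA costs roughly half a millisecond per frame, which backs up the "essentially free" conclusion, while SMAA4X adds several milliseconds.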