The newest game in the Tomb Raider franchise is the 10th entry in the series, a reboot that tells how Lara Croft became the Tomb Raider and carries over none of the storyline from the previous games. This is the fourth Tomb Raider game developed by Crystal Dynamics, built on a heavily modified and updated version of the Crystal engine they used in Tomb Raider: Underworld and Guardian of Light. As a result, the game now supports DX11 features such as hardware tessellation, super-sample anti-aliasing, and the new TressFX developed by AMD, a hair physics effect that renders realistic hair animation.
We're going to take a look at how this game performs on low-end systems and at possible ways to improve its performance.
Test System and Requirements
| | Test System | Minimum Requirements |
|---|---|---|
| Processor | Intel Celeron G550 2.6 GHz dual-core | Dual-core CPU: AMD Athlon 64 X2 2.1 GHz (4050+), Intel Core 2 Duo 1.86 GHz (E6300) |
| Memory | 4 GB DDR3 1600 MHz | 2 GB |
| Video Card | AMD Radeon HD 7750 1 GB GDDR5; nVidia GeForce 9600 GT 512 MB DDR3 | DirectX 9 graphics card with 512 MB video RAM (AMD Radeon HD 2600 XT, nVidia 8600) |
| Driver / Patch Version | AMD Catalyst 13.2 Beta 7; nVidia ForceWare 314.07 | 1.0.722.3 Survival Edition |
| Operating System / DirectX | Windows 8 64-bit; Windows 7 SP1 64-bit / DX11 | Windows XP SP3, Windows Vista, Windows 7, Windows 8 (32-bit/64-bit) |
The requirements are not that high, and older systems can probably still manage to play the game with good performance, though we have to assume that will only be at "Low" settings.
All tests were done with the following components and settings:
- Intel Celeron G550 2.6 GHz Dual-core
- 4GB DDR3 1600MHz
- AMD Radeon HD 7750 1GB DDR5
- Windows 7 SP1 64-bit
- Normal preset, 1920×1080 (1600×900 for individual settings tests)
- Fraps 3.5.9 for recording average and minimum frames per second; each test ran for 60 seconds and was performed twice.
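As a rough illustration of the methodology above, the average and minimum fps figures can be derived from a list of per-frame timestamps, the kind of data a frame-capture tool records. The function and sample numbers below are my own sketch for illustration, not Fraps' actual log format.

```python
def fps_stats(timestamps_ms, window_ms=1000.0):
    """Return (average_fps, minimum_fps) from frame timestamps in milliseconds.

    Average fps is frames rendered divided by total elapsed time; minimum fps
    is the lowest frame count observed in any full one-second window.
    """
    duration_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    average_fps = (len(timestamps_ms) - 1) / duration_s

    # Count frames in each complete 1-second window; the smallest count
    # is the minimum fps for the run.
    counts = []
    start = timestamps_ms[0]
    while start + window_ms <= timestamps_ms[-1]:
        end = start + window_ms
        counts.append(sum(1 for t in timestamps_ms if start <= t < end))
        start = end
    return average_fps, min(counts)

# Hypothetical capture: ~30 fps for one second, then a slowdown to ~20 fps.
frames = [i * 33.0 for i in range(31)] + [1023.0 + i * 50.0 for i in range(1, 21)]
avg, minimum = fps_stats(frames)
print(f"average: {avg:.1f} fps, minimum: {minimum} fps")
```

Running each test twice and averaging the two runs, as done here, smooths out one-off stutters that a single pass would exaggerate.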
Built-in Benchmarking Tool vs Actual Gameplay
I was curious whether the built-in benchmarking tool really represents the performance of the game, so I picked an actual gameplay segment to compare against: the level from Helicopter Hill to the Windmill Campsite.
At the Low, Normal, and Ultimate presets, the two benchmarking methods produced different results, most notably at the Low preset, where the gap was 14 fps. Only at the Ultra preset did the two methods produce nearly identical results, at 21 and 22 frames per second. I also noticed that at the Ultra and Ultimate presets, actual gameplay ran faster than the built-in benchmarking tool.
Given these uneven results, I decided to use actual gameplay for the rest of the tests. I am not saying the built-in benchmarking tool is unreliable; I'm just more comfortable benchmarking with gameplay. Besides, it is the actual gameplay.
You can read a more detailed article from our partner site here, comparing benchmarking with built-in tools against actual gameplay.
Image Comparison and Performance
Texture quality affects the sharpness and crispness of textures on wood, stone, soil, and clothing. The differences between settings are noticeable, and I recommend setting texture quality to Ultra, since there is no performance hit when changing between settings.
Texture filtering makes textures clearer; the higher the value, the farther away textures stay sharp. Performance is the same across every setting except 16x, where it dropped by 1 fps, and even that is negligible. I recommend setting this to 8x.
TressFX improves the look and movement of hair but takes away some frames per second. For a mid-range setup, I recommend setting hair quality to Normal.
Turning on any anti-aliasing makes edges smoother; the higher the value, the better the result, but it also costs frames per second. FXAA improves quality without losing performance, so I recommend using FXAA for this setting.