Low Vs Ultra GeForce RTX 2060 6GB Performance Review

Viewing the performance of the 3-year-old GeForce RTX 2060 6GB while playing Shadow of the Tomb Raider, we quickly see it can return a consistently high 104 FPS, and that's at High settings at 1080p. While you can get a solid 48 FPS at 1440p, you might also consider pushing it up to 4K, where it would get 39 FPS on High and 58 FPS on Medium. The optimum resolution for the GeForce RTX 2060 6GB here is 1080p, where it can push out reliable frame rates all day at Ultra. Put in the required 16GB of system memory and a processor that's as good as the Intel Core i7-9700K 8-Core 3.6GHz, and you have a great Shadow of the Tomb Raider experience.

To summarise, Shadow of the Tomb Raider works very well with a GeForce RTX 2060 6GB. While it can comfortably perform at 1080p, we know it is best served at 1440p and could possibly go up to 4K with some settings adjustments. Are you after a bigger resolution with lower graphics quality, or higher graphics quality at a smaller resolution? Here are a few figures that could help you choose your optimum Low vs Ultra: 1080p Ultra: 75 FPS; 1440p Medium: 100 FPS; 1440p High: 67 FPS; 1440p Ultra: 48 FPS.

Shadow of the Tomb Raider just received a new update featuring support for Intel's new XeSS technology, making it one of the first games on the planet to publicly offer XeSS. If you own the game, you can try XeSS right now for yourself, and it works on many GPUs, including Nvidia's and AMD's best graphics cards.

This is where things can get a bit confusing with XeSS. It was created by Intel for the Arc GPUs, more or less as a direct competitor to DLSS. However, XeSS is supposed to have a fallback mode where it runs using DP4a instructions - four-element 8-bit integer vector dot product instructions. What does that mean, and which graphics cards support the feature? That's a bit more difficult to say. Officially, we know all Nvidia GPUs since the Pascal architecture (GTX 10-series) have supported DP4a. Intel says its Gen11 Graphics (Ice Lake) and later support DP4a, and of course Arc GPUs support Xe Matrix Extensions (XMX) - the faster alternative to DP4a. What about AMD? In theory, Vega 20, Navi 12 (?), and later GPUs have support for it.

Here's where things really get odd, because we did some quick benchmarking of Shadow of the Tomb Raider at 1440p and the Highest preset (without ray-traced shadows, so that we could run the same test on all GPUs). Guess which GPUs allowed us to enable XeSS: all of them. But the results of the benchmarks are telling, as XeSS didn't benefit quite a few of the GPUs and in some cases reduced performance.

What's going on? It looks like DP4a (8-bit integers) is being emulated via 24-bit integers on architectures that don't natively support it. Emulation generally reduces performance quite a bit, so it would make sense that cards like the RX Vega 64 ran slower. The AMD Navi 10 (RX 5600 XT and RX 5700 XT) GPUs also lost performance with XeSS, and if the Navi 12 information is correct, only the Radeon Pro 5600M (used in some MacBooks and laptops) fully supports DP4a from the RDNA generation.

What about newer cards? Results were better, sort of, with most RX 6000-series cards benefitting - not a lot, but something is better than nothing. There were also some oddities with certain cards, like all the GPUs we tested that had 6GB of VRAM. It may be that we were hitting some form of memory bottleneck, but this is a relatively old game and 6GB ought to have been sufficient.

Keep in mind that XeSS is a brand-new AI upscaling algorithm that aims to compete with the likes of Nvidia DLSS 2 and AMD FSR 2. As this is the first game we've been able to test with XeSS, there may be some early bugs that are still being worked out. As with DLSS and FSR, there are multiple upscaling modes: Ultra Quality, Quality, Balanced, and Performance. Ultra Quality aims to provide a "Nvidia DLAA"-like approach, with a focus on image quality over frame rate, while the last three modes prioritize frame rate over image quality.

Anyway, let's get to the benchmark results. We tested native, XeSS Quality, and DLSS Quality (where available). We also tested a few other options on select cards, but those results aren't shown in the charts. We didn't test every possible GPU by any stretch, but we did include at least one card from quite a few different generations of hardware. For the most part, XeSS ran quite smoothly in its current state, with image quality overall being on par with the DLSS 2 implementation in the game.
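To make the DP4a discussion above concrete: a DP4a instruction computes the dot product of two 4-element vectors of 8-bit integers, accumulated into a 32-bit result, in a single operation. Here is a minimal Python sketch of that arithmetic (the function name and structure are my own illustration, not Intel's API); on hardware without native support, each of these multiply-adds has to be emulated with wider integer math, which is the slowdown described above.

```python
def dp4a(a, b, acc=0):
    """Dot product of two 4-element vectors of 8-bit signed integers,
    accumulated into a 32-bit integer - the arithmetic a native DP4a
    instruction performs in a single operation."""
    assert len(a) == len(b) == 4
    for x, y in zip(a, b):
        # each element must fit in a signed 8-bit integer
        assert -128 <= x <= 127 and -128 <= y <= 127
        acc += x * y
    return acc

# One DP4a step, as used heavily inside quantized neural-network layers:
print(dp4a([1, 2, 3, 4], [10, 20, 30, 40]))  # 1*10 + 2*20 + 3*30 + 4*40 = 300
```

AI upscalers like XeSS run many thousands of these dot products per frame, which is why the difference between one native instruction and a multi-instruction emulation path shows up so clearly in the benchmark results.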
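For a sense of what the upscaling modes mentioned above mean in practice, each mode renders internally at a reduced resolution and upscales to the output. The per-axis scale factors below (1.3x Ultra Quality, 1.5x Quality, 1.7x Balanced, 2.0x Performance) are my assumption based on Intel's published XeSS figures, not values taken from this article.

```python
# Per-axis scale factors for XeSS modes - an assumption based on Intel's
# published figures, not confirmed by the article above.
SCALE = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

def render_resolution(out_w, out_h, mode):
    """Approximate internal resolution rendered before upscaling."""
    s = SCALE[mode]
    return round(out_w / s), round(out_h / s)

# The benchmarks above ran at 1440p output:
for mode in SCALE:
    print(mode, render_resolution(2560, 1440, mode))
```

Under these assumed factors, XeSS Quality at a 1440p output renders roughly a 1707x960 internal image, which is where its performance gain over native rendering comes from.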