AMD FSR 2.0 in God of War
As the third game after Deathloop and Farming Simulator 22, God of War has been patched with AMD’s temporal upsampling FSR 2.0, which in this case replaces FSR 1.0. In this test, ComputerBase analyzes the image quality compared to Nvidia’s DLSS and the native resolution; benchmarks were run as well.
God of War for PC now features FSR 2.0 alongside DLSS
The initially PlayStation-exclusive God of War (test) made it to the PC in January of this year and, despite its age, impressed in the ComputerBase review. Back then, the game featured AMD’s FSR 1.0 and Nvidia’s DLSS, the latter of which made a very good impression.
In recent weeks there had already been signs that God of War would soon receive AMD’s new temporal upsampling FSR 2.0 (test) and thus the actual competitor to Nvidia’s DLSS: after a major patch, the entry “AMD FidelityFX Super Resolution 2.0” suddenly appeared in the graphics menu, but behind it was still FSR 1.0. A hotfix corrected this again, yet it was clear that developer Jetpack Interactive was working on an update.
FSR 2.0 replaces FSR 1.0 entirely
And that’s exactly what happened. After Deathloop and Farming Simulator 22 (test), God of War is the third game to support FSR 2.0. Unlike the other two titles, however, FSR 2.0 does not expand the range of options here: the temporal FSR 2.0 completely replaces the spatial FSR 1.0, so FSR 2.0 and DLSS are now the upsampling options available in the game.
Interesting side note: officially, FSR 2.0 only supports DirectX 12 and Vulkan, while God of War uses DirectX 11. As it turns out, FSR 2.0 can also work with APIs other than the two low-level ones, just not officially: in cooperation with AMD, developers can integrate FSR 2.0 into other APIs as well.
God of War offers all four modes of FSR 2.0: “Quality”, “Balanced”, “Performance” and the optional “Ultra Performance”. DLSS, which is integrated in version 2.3, offers the same settings. Also worth mentioning is a re-sharpening function available in both upsampling variants: FSR 2.0 uses AMD’s own CAS, adjustable from 0.0 to 1.0 in increments of 0.1 (default 0.3), while DLSS uses Nvidia’s own sharpening filter, adjustable from 0 to 100 in increments of 1 (default 35). In the test, the editors used the default settings.
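The two sharpening sliders use different scales, which makes their defaults hard to compare at a glance. A minimal sketch, using only the ranges and defaults stated above (the dictionary and function names are illustrative, not from either SDK), normalizes both to [0, 1]:

```python
# Slider ranges and defaults as described in the article; names are illustrative.
FSR2_CAS = {"min": 0.0, "max": 1.0, "step": 0.1, "default": 0.3}
DLSS_SHARPEN = {"min": 0, "max": 100, "step": 1, "default": 35}

def normalize(slider, value):
    """Map a slider value onto [0, 1] so the two scales are comparable."""
    return (value - slider["min"]) / (slider["max"] - slider["min"])

print(normalize(FSR2_CAS, FSR2_CAS["default"]))        # 0.3
print(normalize(DLSS_SHARPEN, DLSS_SHARPEN["default"]))  # 0.35
```

On this normalized scale, the two defaults (0.3 vs. 0.35) are close to each other, which fits the editors’ choice to test both at their standard setting.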
The image quality of FSR 2.0 and DLSS 2.3
God of War has a peculiarity: the depth-of-field effect is apparently always based on the render pixels, so there is significantly less blur with FSR and DLSS than at native resolution. If you ignore that, you quickly realize that, at the same number of render pixels, FSR 2.0 and DLSS produce the significantly better image. If you have an Ultra HD monitor, for example, but the computing power is only sufficient for WQHD, it makes much more sense to play with FSR or DLSS in Quality mode than at a reduced WQHD resolution. Even the Performance mode of both upsampling techniques looks better in Ultra HD than native WQHD, which is impressive. And this carries over to lower resolutions: FSR and DLSS on “Quality” in WQHD, for example, look better than native Full HD, although the latter offers more render pixels.
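The relationship between output resolution and render resolution follows the per-axis scale factors that AMD documents for FSR 2.0 (DLSS uses comparable ratios). A small sketch, assuming those documented factors (the function name is illustrative), reproduces the comparison above, in which Ultra HD in Quality mode renders at WQHD pixel counts:

```python
# Per-axis scale factors per quality mode, from AMD's public FSR 2.0 documentation.
SCALE = {"Quality": 1.5, "Balanced": 1.7,
         "Performance": 2.0, "Ultra Performance": 3.0}

def render_resolution(out_w, out_h, mode):
    """Internal render resolution for a given output resolution and mode."""
    s = SCALE[mode]
    return round(out_w / s), round(out_h / s)

print(render_resolution(3840, 2160, "Quality"))      # (2560, 1440) -> WQHD
print(render_resolution(3840, 2160, "Performance"))  # (1920, 1080) -> Full HD
```

This also makes the second claim concrete: Ultra HD in Performance mode renders at Full HD, i.e. fewer pixels than native WQHD, yet still looks better in the editors’ comparison.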
For the most part, the duel between DLSS and FSR 2.0 in God of War is close: both techniques work at a high level. They differ primarily in details, which can nevertheless make the difference. It is interesting, for example, that the sharpening behavior of AMD’s and Nvidia’s technology apparently differs. Both apply a good amount of sharpening; in Quality mode the image ranges from slightly blurrier to slightly sharper than the native resolution, depending on the object.
There are differences in image sharpness depending on the mode
But things look different in Performance mode. Here DLSS suddenly sharpens significantly more than FSR 2.0. The result is a visibly sharper DLSS image that loses only minimal sharpness compared to Quality mode and thus remains comparable to the native resolution. FSR 2.0 on “Performance”, in contrast, loses a good deal of image sharpness. The sharpness itself is a plus for DLSS, but unlike AMD’s CAS, Nvidia’s sharpening filter produces more artifacts, which can be annoying, especially at resolutions below Ultra HD. Here it is advisable to turn down the sharpness slider with DLSS. It is different with FSR: the default setting can definitely be used, and if desired, the sharpness can even be turned up a bit with FSR 2.0 on “Performance”. As usual, CAS is less susceptible to interference than Nvidia’s re-sharpening, which is why FSR 2.0 struggles with fewer artifacts than DLSS at the same image sharpness.
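The reason CAS tends to produce fewer artifacts is its core idea: the sharpening strength is reduced where local contrast is already high, so strong edges are not over-boosted. The following is only a simplified pure-Python sketch of that contrast-adaptive idea, not AMD’s actual CAS shader; all names and the exact weighting are illustrative:

```python
def adaptive_sharpen(img, strength=0.3):
    """Contrast-adaptive sharpening sketch on a 2D list of floats in [0, 1].

    The unsharp-mask term is scaled down where the local min/max range
    (a crude contrast measure) is high, limiting ringing on strong edges.
    """
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            c = img[y][x]
            neighbors = [img[y-1][x], img[y+1][x], img[y][x-1], img[y][x+1]]
            contrast = max(neighbors + [c]) - min(neighbors + [c])
            weight = strength * (1.0 - contrast)   # less boost on strong edges
            laplacian = 4 * c - sum(neighbors)     # unsharp-mask term
            out[y][x] = min(1.0, max(0.0, c + weight * laplacian))
    return out
```

A fixed-strength filter would apply the full `strength` everywhere, including across high-contrast edges, which is where halo and ringing artifacts appear; the contrast-dependent weight is the part that distinguishes the adaptive approach.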
Interestingly, neither DLSS nor FSR 2.0 makes a particularly good impression when it comes to reconstruction, although this is actually their strength; this is probably because the game’s own TAA already handles it quite well. Both upsampling variants can reconstruct fine details better than the game’s TAA, but only at the same number of render pixels, not compared to the native resolution. In Quality mode in Ultra HD (render resolution WQHD), details are reconstructed better than in native WQHD with TAA, but native Ultra HD with TAA reconstructs details better still. DLSS has slight advantages over FSR with very fine details, especially in Performance mode, but these are hardly noticeable in the game.
As far as image stability is concerned, DLSS and FSR are mostly on a comparable level. Especially in the higher quality modes and at high resolutions there is hardly any difference: some details flicker with FSR, others a little more with DLSS. At lower resolutions and more aggressive settings, DLSS has a slight edge, but the differences are small. Compared to the native resolution, both DLSS and FSR 2.0 perform worse in terms of image stability, since the game’s own TAA simply works too well; even at the same resolution there are no advantages for upsampling.
God of War also has problems
As is usually the case with temporal solutions, FSR and DLSS also show some weaknesses in God of War, and they differ between the two techniques. Depending on the contrast, FSR 2.0 leaves a slight, comet-like ghosting trail behind snowflakes. At the same time, DLSS seems to capture the snowflakes better in general: they are displayed more clearly with Nvidia’s technology, while with AMD’s counterpart they lose opacity. In addition, FSR 2.0 has minor problems with some transparent effects, where a kind of shimmering settles around the object in the foreground. Both issues are barely noticeable while playing, but should be addressed; this is where DLSS comes out on top.
DLSS and FSR usually have comparable image stability, but there is one odd exception in the game. In some areas, reflections are rendered differently when any kind of upscaling is used: for no apparent reason, they then show more detail or are more prominent despite the lower number of render pixels, which also creates a lot of noise. It is this noise that FSR 2.0 struggles with and ultimately fails to reduce significantly. DLSS handles it much better in all resolutions and settings. Although DLSS does not manage to prevent the noise completely either, the qualitative difference between DLSS and FSR 2.0 in such a sequence is large.
[Screenshot gallery: Ultra HD – native resolution, 42 images]
However, DLSS also has its difficulties. The many fine branches in the game tend to smear: the fewer render pixels, the longer the trails the branches draw behind them. Ghosting also occurs, on branches but also on hair. Neither is a big problem, but it is there, and it does not happen at native resolution.