
New for Nvidia GeForce RTX graphics: DLDSR is a faster Dynamic Super Resolution with AI


The new use of the tensor cores in GeForce RTX graphics cards is similar to DLSS, but paradoxically it downscales the image. Nvidia has added AI scaling to Dynamic Super Resolution, promising that your GPU will achieve higher FPS at comparable image quality.

Last week Nvidia quietly released the GeForce RTX 3080 with 12 GB of memory, slipping it into a small announcement accompanying the new driver for the game God of War. But one more piece of news was hidden in that same announcement: Deep Learning Dynamic Super Resolution (DLDSR). It is a new technology for GeForce RTX cards following DLSS, but this time it is not about upscaling but, paradoxically, about downscaling, improving the Dynamic Super Resolution feature.

Deep Learning Dynamic Super Resolution

DLDSR is an evolution of the Dynamic Super Resolution technology that Nvidia added to its graphics drivers in 2014. DSR was essentially a form of full-scene anti-aliasing: when it was enabled, the game was rendered at up to twice the monitor's resolution in each dimension – on a 1920 × 1080 monitor, the game actually drew a 4K image (3840 × 2160 pixels). The image was then scaled back down to the monitor's resolution. This increases the quality of details and provides the equivalent of very good edge smoothing, since each output pixel is computed from a higher number of samples (supersampling).
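The render-high-then-downscale principle can be illustrated with a minimal Python/NumPy sketch. The box filter used here for averaging is our own simplification; as noted below, DSR itself reportedly uses a 13-tap Gaussian filter for the downscale.

```python
import numpy as np

def downscale_box(frame: np.ndarray, factor: int) -> np.ndarray:
    """Average factor x factor pixel blocks of a supersampled frame.

    `frame` is an (H, W, 3) array rendered at the higher virtual
    resolution; the result has the monitor's resolution. A box filter
    is only an illustrative stand-in for DSR's real downscale filter.
    """
    h, w, c = frame.shape
    return frame.reshape(h // factor, factor,
                         w // factor, factor, c).mean(axis=(1, 3))

# DSR 4X on a 1080p monitor: render 3840x2160, downscale to 1920x1080.
rendered = np.random.rand(2160, 3840, 3).astype(np.float32)
displayed = downscale_box(rendered, 2)
print(displayed.shape)  # (1080, 1920, 3)
```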

The downside of DSR was, of course, that although you play at a final resolution of 1920 × 1080, the performance requirements are the same as when playing in 4K. This image enhancer was therefore not suitable for all games. And that is exactly why Nvidia is now adding a neural network – as with DLSS, it should reduce the GPU power needed for DSR.

DLDSR uses a lower virtual resolution for rendering, but the quality achieved should be comparable to traditional DSR. Where a resolution of 3840 × 2160 was previously needed ("DSR 4X", a 4× scaling factor because there are 2× more pixels in each dimension), "DLDSR 2.25X" should now suffice – scaling with a factor of 2.25× (which is 1.5× in each dimension). The use of AI should compensate for the lower number of input pixels with better quality.
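The factor arithmetic is easy to verify: DSR/DLDSR factors count total pixels, so each dimension scales by the square root of the factor. A few lines of Python (our own illustration) show how the factors map to render resolutions on a 1080p monitor:

```python
import math

def render_resolution(mon_w: int, mon_h: int, factor: float) -> tuple[int, int]:
    """Each dimension scales by sqrt(factor), since the factor counts pixels."""
    s = math.sqrt(factor)
    return round(mon_w * s), round(mon_h * s)

for f in (1.78, 2.25, 4.0):
    print(f, render_resolution(1920, 1080, f))
# 1.78 -> (2562, 1441), i.e. ~2560 x 1440 (about 1.33x per dimension)
# 2.25 -> (2880, 1620)                    (1.5x per dimension)
# 4.0  -> (3840, 2160)                    (2x per dimension)
```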

It is not entirely clear from Nvidia's description how the neural network is applied. It could perform the downscaling itself, replacing the original algorithm in DSR (reportedly 13-tap Gaussian scaling). Alternatively, that scaling algorithm could remain in use, with a DLSS-like step inserted between the natively rendered image and the downscaling – it would first perform an upscale, and the image would then be scaled back down to the final resolution.
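The difference between the two theories can be sketched in Python. Everything here is speculative: the function names are purely illustrative, and a nearest-neighbour resize stands in for both the neural network and the Gaussian filter, neither of which is public.

```python
import numpy as np

def resize(frame: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Nearest-neighbour resize, a stand-in for any real scaler."""
    ys = np.arange(out_h) * frame.shape[0] // out_h
    xs = np.arange(out_w) * frame.shape[1] // out_w
    return frame[ys][:, xs]

def theory_1(frame_1620p: np.ndarray) -> np.ndarray:
    # The neural network performs the downscale directly,
    # replacing the 13-tap Gaussian filter of classic DSR.
    return resize(frame_1620p, 1080, 1920)

def theory_2(frame_1620p: np.ndarray) -> np.ndarray:
    # A DLSS-style AI upscale to 4K runs first; a conventional
    # filter then scales the result back down to the monitor.
    upscaled = resize(frame_1620p, 2160, 3840)
    return resize(upscaled, 1080, 1920)
```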

Comparison of native resolution images, conventional DSR 4X and DLDSR 2.25X (Source: Nvidia)

We will hopefully find out in the future what Nvidia used. If it is the second variant, it would probably have to use DLSS 1.0, because the second generation with temporal filtering requires motion vectors, and thus every game must have explicitly written support for it. DLDSR apparently has no such requirement; instead, it is simply turned on in the drivers. The simpler first option would seem more logical, namely that Nvidia uses a neural network directly for downscaling. But in the DLDSR blog post, Nvidia shows a comparison in which DLDSR 2.25X performs almost like native rendering at 1920 × 1080, even though it should be rendering natively at 2880 × 1620 pixels. By this indication, we would rather expect the feature to work according to the second theory – that is, if the game is not limited by the processor in this comparison (in which case the 145 FPS for the native 1080p image would be a red herring). So let's see what further information shows us.

A new feature for graphics cards with tensor cores

The DLDSR function can be turned on in the Nvidia Control Panel via the DSR scaling factor setting. New options labeled "DL", indicating the use of artificial intelligence, are now added alongside the previous ones. In addition to the already mentioned 2.25X DL factor (internal rendering resolution of 2880 × 1620), a DLDSR setting of 1.78× is offered, which means the game will render at 2560 × 1440 pixels before downscaling.

You will probably need to update to the current driver version (511.17) for this new option to be available. The feature also requires a GPU with tensor cores to run the neural network, so you must have a GeForce RTX card; otherwise DLDSR will not work. If these requirements are met, however, DLDSR should, according to Nvidia, work with virtually all games, so developers do not have to add explicit support for each specific title as they do with DLSS.

DLDSR settings in Nvidia Control Panel (Source: Nvidia)

So DLDSR will again be something you can potentially use to improve the image in virtually all games. We do not yet know whether it will have any disadvantages – typically, there could be some artifacts stemming from the fact that the scaling is done by a neural network.

Source: Nvidia
