
Digital Foundry - Control DLSS 2.0


Recommended Posts

When I played this game, DLSS was trash and an obvious, massive downgrade in IQ. This looks a lot better and will be great for next-gen games.  Every game I tried with the old DLSS was bad except for the newest Wolfenstein.

This could add a lot of life to my 2060 

 

 

 

Link to post
Share on other sites

There is currently no better reconstruction technique than DLSS 2.0.  It's godly.  Absolutely NO reason not to use it if the game supports it and you have the hardware.

 

Rendering only a quarter of the pixels, generating the rest, and looking as good as native (in some ways slightly better) is simply incredible... and all of the downsides of the previous implementations have been rectified.  Less ghosting, better clarity in motion.  It uses less VRAM too, since you're rendering at a far lower internal resolution.  There are tons of benefits to this tech.
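To put rough numbers on the "quarter of the pixels" claim (figures are just illustrative arithmetic, using the 1080p-to-4K case as the example):

```python
# Back-of-the-envelope pixel math for DLSS 2.0's quarter-resolution mode.
# The resolutions here are illustrative; 1080p -> 4K is the 4x case.

def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)   # output resolution
internal = pixels(1920, 1080)    # internal render resolution

scale = native_4k / internal
print(f"Shading {internal:,} of {native_4k:,} pixels -> 1/{scale:.0f} of the work")
# Shader cost (and much of the render-target VRAM) scales with the internal
# resolution, which is where the big savings come from.
```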

 

RTX. IT'S ON. :smug: 

Link to post
Share on other sites
3 minutes ago, Team 2019 said:

AMD doesn't have a similar solution for next gen consoles?

AMD doesn't currently have a similar solution for this.  And it's not really up to AMD to provide anything like this for the next-gen consoles: the hardware simply has to be able to accelerate int8 and int4 workloads.  We know Series X has some capability here... we don't know if Sony has the same.  It's entirely possible they don't.

 

Microsoft already has DirectML, and they have their own image-enhancement techniques, which they demonstrated with Forza Horizon 3 a while ago.  It's safe to assume they will have some form of this technology, but there's no telling whether it will be as good quality- or performance-wise.

 

If Sony has it... I'm surprised Cerny didn't mention it during their spec deep dive, which I would have expected... so it remains to be seen if PS5 will have the capability to run these networks in real time.

Link to post
Share on other sites
8 minutes ago, Team 2019 said:

The era of resolution wars is over. Time to get into framerate and loading speed wars.

It's true.

 

IQ will be high and temporally stable enough next gen that slight resolution differences should be barely noticeable.  If it means developers can worry less about hitting arbitrary resolutions and more about actually making better-looking assets, games will benefit.

 

RT quality is going to be the new thing differentiating the platforms.. as well as performance.

Link to post
Share on other sites
  • 2 weeks later...

I'm playing around with DLSS in Control, and from what I can see nothing beats native resolution, despite native eating heavily into the framerate.

 

When I enable DLSS I get 720p, 626p, and 540p render resolutions, so of course I pick the 720p render.  But despite the smoothness and higher framerate, there are still some noticeable setbacks, which I guess is unavoidable.

 

With 720p DLSS, there's a slight delay in fast motion on things like readable print, such as the print on the binders in Central Control: every time I move the view quickly left and right, the text takes a moment to render before becoming visible.  With DLSS off at 1080p it's not an issue.

 

I wish there were a way to do 1080p DLSS at 1080p, but it seems NVIDIA put these roadblocks in here to keep the RTX cards in their lane.

 

Edited by The Mother Fucker
Link to post
Share on other sites
7 minutes ago, The Mother Fucker said:


 

 

Of course, the fewer pixels you render, the less stable it will get.  Not sure what you mean about the delay, though.  The way it works is that it accumulates frames over time, using the info from previous frames to build the next ones.  So when you move the camera back and forth and then stop, it rebuilds the frames.  Because you're running at a low resolution, that makes it more noticeable.

 

It's not a DLSS artifact... it's essentially how temporal reconstruction works.  You're noticing it because the render resolution is really low to begin with, and then the image builds up detail as the frame data comes in.  It's not even noticeable at higher resolutions... which is where it really makes a difference for performance.
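A toy sketch of why detail "builds up" after the camera stops. This assumes a simple exponential-history blend; real DLSS 2.0 uses a learned network rather than a fixed blend, but the accumulation behaviour is the same in spirit:

```python
# Toy model of temporal accumulation: blend each new low-res sample into an
# accumulated history.  A fast camera pan invalidates the history, so fine
# detail (like small text) needs a handful of frames to converge again.

def accumulate(history, new_sample, alpha=0.1):
    """Blend the new sample into the accumulated history."""
    return (1 - alpha) * history + alpha * new_sample

truth = 1.0      # the "true" detail value at a pixel
history = 0.0    # history just invalidated by a fast pan
for frame in range(30):
    history = accumulate(history, truth)

print(f"after 30 still frames: {history:.3f} of full detail")
```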

 

It could also be that because the 2060 has fewer tensor cores, running the network takes more time, so you're seeing a longer accumulation time per frame... which makes it more noticeable to you.

 

Nvidia has said, however, that the real challenge is getting those lower resolutions, like 540p reconstructed to 1080p, to look native.  That's what they're working on the most.  It'll get there with time.  There's gonna be some good stuff coming. :face: 

Link to post
Share on other sites
3 minutes ago, Remij_ said:

 


I think my problem is the roadblocks that are in place on DLSS.  

 

What Resolution and Render Resolution do you play this at with DLSS?

 

 

Link to post
Share on other sites
2 minutes ago, The Mother Fucker said:

I think my problem is the roadblocks that are in place on DLSS.  

 

What Resolution and Render Resolution do you play this at with DLSS?

 

 

What roadblocks?  lmao.. dude.. they've removed the roadblocks that were there.  You can play at any resolution you want.  It uses 2x and 4x pixel-count scales, e.g. 540p > 1080p.
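For what it's worth, those 2x/4x pixel-count scales work out like this (the `internal_res` helper is just my own illustration, not anything from Nvidia):

```python
import math

# Given an output resolution and a pixel-count scale factor, derive the
# internal render resolution.  A 4x pixel scale is 2x per axis.

def internal_res(out_w, out_h, pixel_scale):
    axis = math.sqrt(pixel_scale)
    return round(out_w / axis), round(out_h / axis)

print(internal_res(1920, 1080, 4))  # -> (960, 540): 540p reconstructed to 1080p
print(internal_res(1920, 1080, 2))  # roughly 764p internal for the 2x scale
```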

 

You're asking why they don't let you run 1080p DLSS at a 1080p output?  ... dude.. that's rendering the full pixel count.. what are you expecting to improve?  You're already running at native resolution in that case.

Link to post
Share on other sites
Just now, Remij_ said:

What roadblocks?  lmao.. dude.. they've removed the roadblocks that were there.  You can play at any resolution you want.  It uses 2x and 4x pixel-count scales, e.g. 540p > 1080p.

 

You're asking why they don't let you run 1080p DLSS at a 1080p output?  ... dude.. that's rendering the full pixel count.. what are you expecting to improve?  You're already running at native resolution in that case.

The framerate.  The tensor cores on RTX cards are only used with DLSS; if the game can leverage that processing at 1080p, I'd expect a bump in framerate over relying completely on the Turing shaders with DLSS off.

Link to post
Share on other sites
Just now, The Mother Fucker said:

The framerate.  The tensor cores on RTX cards are only used with DLSS; if the game can leverage that processing at 1080p, I'd expect a bump in framerate over relying completely on the Turing shaders with DLSS off.

Dude, if you're running a 1080p internal resolution with DLSS, you're not going to get any performance increase over 1080p native.  It's actually going to perform worse, because the time it takes to run the neural network on the tensor cores is added to the time it takes to render the pixels.

 

1080p DLSS on a 1080p output... is full resolution..

 

You're asking why they don't let you run full resolution with DLSS.... and it's because at that point you're running native, and there's nothing for the tensor cores to do but slow the process down lol..
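The frame-time arithmetic makes this obvious. The millisecond figures below are made up for illustration; what matters is the relationship, where the network cost is added on top of the shading cost:

```python
# dlss_frame_time = shading time at the internal resolution + fixed network cost.
# All numbers are illustrative, not measured.

def frame_time(render_ms_at_internal, network_ms):
    return render_ms_at_internal + network_ms

native_1080p = 16.0   # ms to shade a full 1080p frame
network = 1.5         # ms the tensor cores spend running the network

# 540p internal -> ~1/4 the shading work, plus the network cost: a big win.
upscaled = frame_time(native_1080p / 4, network)

# 1080p internal -> full shading work, plus the network cost: strictly slower.
same_res = frame_time(native_1080p, network)

print(f"540p->1080p DLSS:  {upscaled:.1f} ms vs native {native_1080p:.1f} ms")
print(f"1080p->1080p DLSS: {same_res:.1f} ms, worse than native")
```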

Link to post
Share on other sites
