
Digital Foundry - Control DLSS 2.0


Recommended Posts

4 minutes ago, Remij_ said:

Dude, if you're running 1080p internal resolution... with DLSS... you're not going to get any performance increase over 1080p native.  It's actually going to perform worse because the time it takes to run the neural network on the tensor cores is added to the time it takes to render the pixels.

 

1080p DLSS on a 1080p output... is full resolution..

 

You're asking why they don't let you run full resolution with DLSS.... and it's because at that point you're running native.. and there's nothing for the tensor cores to do but slow the process down lol..

What resolution and render resolution do you run this game at using DLSS?

Just now, The Mother Fucker said:

With a native res of 4K?

With a monitor resolution of 4K, yes... the render res is the internal DLSS res... so either 720p or 1080p.
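
To put rough numbers on that (just my own pixel math using the two internal resolutions above, not an official mode list):

```python
# Rough arithmetic on how much the tensor cores have to reconstruct,
# using the two internal resolutions mentioned above. Just pixel math,
# not an official mode list.

def upscale_factors(render, output):
    rw, rh = render
    ow, oh = output
    per_axis = ow / rw                    # linear upscale per axis
    pixels = (ow * oh) / (rw * rh)        # output pixels per rendered pixel
    return per_axis, pixels

output_4k = (3840, 2160)
for render in [(1920, 1080), (1280, 720)]:
    axis, pix = upscale_factors(render, output_4k)
    print(f"{render[0]}x{render[1]} -> 4K: {axis:.2f}x per axis, {pix:.1f}x the pixels")

# 1920x1080 -> 4K: 2.00x per axis, 4.0x the pixels
# 1280x720 -> 4K: 3.00x per axis, 9.0x the pixels
```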

1 minute ago, Remij_ said:

With a monitor resolution of 4K, yes... the render res is the internal DLSS res... so either 720p or 1080p.

See, I wouldn't be surprised if the issue I explained earlier is not noticeable on a 4K display with 1080p render resolution DLSS.  I am willing to take the chance on the performance hit you think might happen if I were given the 1080p render resolution on a 1080p display. 

 

I have an RTX 2080 btw.

1 minute ago, The Mother Fucker said:

See, I wouldn't be surprised if the issue I explained earlier is not noticeable on a 4K display with 1080p render resolution DLSS.  I am willing to take the chance on the performance hit you think might happen if I were given the 1080p render resolution on a 1080p display. 

 

I have an RTX 2080 btw.

lmao... you had a 2060 before... and now you have a 2080 and never mentioned it? :mjpls: 

 

Dude... you DO have the option of 1080p render resolution on a 1080p display... it's called 1080p native.... :cosby2: 

12 minutes ago, Remij_ said:

lmao... you had a 2060 before... and now you have a 2080 and never mentioned it? :mjpls: 

 

Dude... you DO have the option of 1080p render resolution on a 1080p display... it's called 1080p native.... :cosby2: 

You're getting me mixed up with jonb.  

 

1080p native with DLSS off?  We know that.  DLSS off does not use the Tensor cores for processing ray-tracing, only the Turing shaders, which at the moment on the 2080 Ti can deliver a smooth 60fps at 1080p. 

 

DLSS is supposed to offset the load by having the tensor cores process the machine-learned ray-tracing information.  If I had a 1080p render option with DLSS available at 1080p, I should at the very least be able to get a higher framerate.  


Otherwise, what's the point?  Might as well just run the game with DLSS off at 720p resolution?

5 minutes ago, The Mother Fucker said:

You're getting me mixed up with jonb.  

 

1080p native with DLSS off?  We know that.  DLSS off does not use the Tensor cores for processing ray-tracing, only the Turing shaders, which at the moment on the 2080 Ti can deliver a smooth 60fps at 1080p. 

 

DLSS is supposed to offset the load by having the tensor cores process the machine-learned ray-tracing information.  If I had a 1080p render option with DLSS available at 1080p, I should at the very least be able to get a higher framerate.  


Otherwise, what's the point?  Might as well just run the game with DLSS off at 720p resolution?

Dude, you really don't seem to understand.

 

DLSS has nothing to do with ray-tracing.. so just stop confusing what the tensor cores are actually doing in Control.

 

You want a 1080p DLSS render option with your 1080p monitor... I get that... but at that point you're literally rendering 1080p, so there are NO performance benefits to be gained.  If you run "DLSS" at native resolution you're not gaining anything... the entire idea behind it is that you're rendering at a fraction of the resolution.

The simple act of running the neural network on the tensor cores takes a fixed amount of time.  So if you're running 1080p DLSS on a 1080p monitor, you're literally going to get worse performance than 1080p native, because you're paying for that full native resolution... PLUS the time it takes for the tensor cores to run the neural network.  That's the reason they don't let you run "native" resolutions with DLSS.  It's literally a waste and will perform worse.  That's NOT something you want.
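
If it helps, here's the napkin math behind that as a toy model. The frame times below are invented purely for illustration; the real costs depend on the GPU and the scene:

```python
# Toy frame-time model: total = shading time (scales with pixel count)
# + a fixed DLSS cost on the tensor cores. Both numbers below are
# invented purely for illustration; real costs vary by GPU and scene.

DLSS_COST_MS = 1.5            # assumed fixed neural-network cost per frame
RENDER_MS_AT_1080P = 16.0     # assumed shading cost at 1920x1080

def frame_ms(render_pixels, dlss=False):
    base_pixels = 1920 * 1080
    shading = RENDER_MS_AT_1080P * (render_pixels / base_pixels)  # naive linear scaling
    return shading + (DLSS_COST_MS if dlss else 0.0)

cases = [
    ("1080p native",        frame_ms(1920 * 1080)),
    ("1080p render + DLSS", frame_ms(1920 * 1080, dlss=True)),  # strictly worse than native
    ("720p render + DLSS",  frame_ms(1280 * 720, dlss=True)),   # the actual win
]
for name, ms in cases:
    print(f"{name}: {ms:.1f} ms (~{1000 / ms:.0f} fps)")

# 1080p native:        16.0 ms (~62 fps)
# 1080p render + DLSS: 17.5 ms (~57 fps)
# 720p render + DLSS:   8.6 ms (~116 fps)
```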

 

The idea is to look native while running lower resolutions.  DLSS 2.0 is amazing at this and works very well.

 

If you want to render 1080p DLSS and actually notice the difference, you're going to want a screen resolution of 1440p or higher.

 

Try enabling DSR in the NVIDIA control panel.  You should be able to enable the 1.78x (1440p) and 4x (4K) factors.  Then change your Windows desktop resolution to one of those, run the game, and you should be able to select a higher DLSS resolution in-game.  Then you could compare.  
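
One note on those DSR factors, since the naming trips people up: they multiply the total pixel count, not the width and height, so the per-axis scale is the square root. A quick sketch assuming a 1920x1080 desktop:

```python
# DSR factors multiply the total pixel count, so the linear scale per
# axis is the square root of the factor. Starting from a 1920x1080 desktop:
import math

native_w, native_h = 1920, 1080
for factor in (1.78, 4.0):
    scale = math.sqrt(factor)
    w, h = round(native_w * scale), round(native_h * scale)
    print(f"{factor}x DSR -> ~{w}x{h}")

# 1.78x DSR -> ~2562x1441  (i.e. the 2560x1440 option)
# 4.0x DSR -> ~3840x2160   (the 4K option)
```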

 

But remember, you're not comparing 1080p DLSS to 1080p native performance... you should be comparing it to 1440p. 

2 minutes ago, Remij_ said:

Dude, you really don't seem to understand.

 

DLSS has nothing to do with ray-tracing.. so just stop confusing what the tensor cores are actually doing in Control.

 

You want a 1080p DLSS render option with your 1080p monitor... I get that... but at that point you're literally rendering 1080p, so there are NO performance benefits to be gained.  If you run "DLSS" at native resolution you're not gaining anything... the entire idea behind it is that you're rendering at a fraction of the resolution.

The simple act of running the neural network on the tensor cores takes a fixed amount of time.  So if you're running 1080p DLSS on a 1080p monitor, you're literally going to get worse performance than 1080p native, because you're paying for that full native resolution... PLUS the time it takes for the tensor cores to run the neural network.  That's the reason they don't let you run "native" resolutions with DLSS.  It's literally a waste and will perform worse.  That's NOT something you want.

 

The idea is to look native while running lower resolutions.  DLSS 2.0 is amazing at this and works very well.

 

If you want to render 1080p DLSS and actually notice the difference, you're going to want a screen resolution of 1440p or higher.

 

Try enabling DSR in the NVIDIA control panel.  You should be able to enable the 1.78x (1440p) and 4x (4K) factors.  Then change your Windows desktop resolution to one of those, run the game, and you should be able to select a higher DLSS resolution in-game.  Then you could compare.  

 

But remember, you're not comparing 1080p DLSS to 1080p native performance... you should be comparing it to 1440p. 

I'll give this a shot.

4 minutes ago, JONBpc said:

You have a 2080 but are stuck with a 1080p monitor?

I don't agree with the diminishing returns that come with high-resolution displays.  A 2080 can do a stable 144Hz remarkably well in Witcher 3 at 1080p, whereas my RX 5700 XT AE card could only do like 116-133Hz in motion, both configured to max settings including HairWorks, which at one point was NVIDIA-exclusive but is supported on Radeon cards now too.


Yea, there are lots of people who run 1080p monitors with the latest GPUs for the highest framerates possible.

 

I don't personally understand it past a point... as you can run 1080p just fine on a high refresh rate 1440p monitor as well... but to each their own.

 

You're definitely missing out on some detail with that monitor though.  


The blurred print processing on the binder is not noticeable anymore at 2560x1440 DSR + 1706x960 render resolution. 

I changed the resolution from what I complained about earlier (1920x1080 + 1280x720 render resolution), and now I don't see the blurring anymore.  What happened here?  Did it machine-learn that font, and is it now displaying it fast enough that I don't see it processing?

 

Also, does VSync in this game not work at all?  I find the only way I can stop screen tearing is turning VSync on from the NVIDIA panel, which says there will be a performance hit.  The VSync option in the game seems to do absolutely nothing. 

4 minutes ago, The Mother Fucker said:

The blurred print processing on the binder is not noticeable anymore at 2560x1440 DSR + 1706x960 render resolution. 

I changed the resolution from what I complained about earlier (1920x1080 + 1280x720 render resolution), and now I don't see the blurring anymore.  What happened here?  Did it machine-learn that font, and is it now displaying it fast enough that I don't see it processing?

 

Also, does VSync in this game not work at all?  I find the only way I can stop screen tearing is turning VSync on from the NVIDIA panel, which says there will be a performance hit.  The VSync option in the game seems to do absolutely nothing. 

I honestly don't know.  It was probably something to do with the 1080p monitor resolution and the game resolution causing some kind of bug or corruption.

 

As for vsync, what Hz is your monitor running at?  Did you remember to increase the Hz when you changed your desktop resolution?  Check it in the NVIDIA control panel... it may have defaulted back to 60, which is causing the vsync issues.  Make sure it's set to 120 or 144 or whatever your monitor supports.

2 minutes ago, Remij_ said:

I honestly don't know.  It was probably something to do with the 1080p monitor resolution and the game resolution causing some kind of bug or corruption.

 

As for vsync, what Hz is your monitor running at?  Did you remember to increase the Hz when you changed your desktop resolution?  Check it in the NVIDIA control panel... it may have defaulted back to 60, which is causing the vsync issues.  Make sure it's set to 120 or 144 or whatever your monitor supports.

I keep this on a 1080p 60Hz display.  

 

I can only notice the post-processing on the map that the security guard is looking at, and on the ground when I move the camera.  It will blur for like a millisecond before focusing. 

 

I'm playing CoD:MW right now and it's doing exactly what I want: rendering at 1440p+ resolution on 1080p/60Hz, and I'm getting decent framerates of 85-110. 

