New Nvidia driver out adds DLSS to Final Fantasy 15


lmao.. dumb how they're comparing it to the 1080 Ti when it should be the 2080 Ti with DLSS on vs. off.. but I guess I can make that comparison on my own.  They obviously just want to make the 2080 Ti look as good as possible next to the 1080 Ti.  But the difference is huge.

28 minutes ago, DynamiteCop! said:

Eww, this is a game-dependent feature. I thought it was some on-the-fly machine learning algorithm that worked with everything.

Proof that you haven't been paying attention to anything while acting like you know everything.

 

That said, I don't believe that you didn't already know that.  I'm pretty sure you're just looking for a different way to hate on it.  I could be wrong... but then that just leaves you looking ignorant. :shrug: 

11 minutes ago, lynux3 said:

I was also under the impression that DLSS was on-the-fly using tensor cores, seeing that it stands for "Deep Learning Super-Sampling". Doesn't matter to me either way.

It is on the fly though.  The tensor cores are actively super-sampling the image based on information from the previous frame, among other things.  It's not, and never was, a driver-level setting.  The game needs to be set up to feed motion vector data to the tensor cores.  Developers also need to provide Nvidia with images to train the network for that particular game.

 

How many jokes were made here about these cards releasing with nothing supporting DLSS or RTX?  If it were a driver-level feature, games wouldn't need to support it.  And how many responses to smartass remarks from you guys specifically have I posted explaining what it is and what it does?  Both of you technically minded guys couldn't put 2 and 2 together?

 

Luckily, it's relatively easy for devs to implement, and it's completely free for developers to use Nvidia's servers to train the network that generates the algorithm.  Things will get better the more training they do.  Nvidia said they will take feedback and update the algorithms over time. :glad: 
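For anyone curious how the "motion vectors plus previous frame" part fits together, here's a rough toy sketch in Python/NumPy. It's purely illustrative: the real DLSS upscaler is a trained neural network running on tensor cores, and every function name, the blend factor, and the warp logic below are made-up stand-ins for the idea, not Nvidia's actual pipeline.

```python
import numpy as np

def bilinear_upscale(frame, factor=2):
    """Toy stand-in for the trained network's upscaler: a plain
    bilinear resize of an HxWxC frame by an integer factor."""
    h, w, c = frame.shape
    ys = np.linspace(0, h - 1, h * factor)
    xs = np.linspace(0, w - 1, w * factor)
    y0 = np.floor(ys).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None, None]
    wx = (xs - x0)[None, :, None]
    top = frame[y0][:, x0] * (1 - wx) + frame[y0][:, x1] * wx
    bot = frame[y1][:, x0] * (1 - wx) + frame[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

def dlss_like_step(lowres_frame, prev_output, motion_vectors, blend=0.8):
    """One hypothetical reconstruction step: upscale the current
    low-res frame, then blend it with the previous high-res output
    warped by per-pixel motion vectors -- the 'history reuse' that
    the game has to be set up to provide."""
    upscaled = bilinear_upscale(lowres_frame)
    h, w, _ = prev_output.shape
    yy, xx = np.mgrid[0:h, 0:w]
    # follow each pixel's motion vector (dy, dx) back to where it
    # came from in the previous frame, clamped to the image bounds
    src_y = np.clip(yy - motion_vectors[..., 0], 0, h - 1).astype(int)
    src_x = np.clip(xx - motion_vectors[..., 1], 0, w - 1).astype(int)
    warped_history = prev_output[src_y, src_x]
    return blend * upscaled + (1 - blend) * warped_history
```

The per-game training Nvidia does is essentially learning what to put in place of `bilinear_upscale` and the fixed `blend` weight, which is why each title needs its own images sent to their servers.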

Just now, Remij_ said:

It is on the fly though.  The tensor cores are actively super-sampling the image based on information from the previous frame, among other things.  It's not, and never was, a driver-level setting.  The game needs to be set up to feed motion vector data to the tensor cores.  Developers also need to provide Nvidia with images to train the network for that particular game.

 

How many jokes were made here about these cards releasing with nothing supporting DLSS or RTX?  If it were a driver-level feature, games wouldn't need to support it.  And how many responses to smartass remarks from you guys specifically have I posted explaining what it is and what it does?  Both of you technically minded guys couldn't put 2 and 2 together?

 

Luckily, it's relatively easy for devs to implement, and it's completely free for developers to use Nvidia's servers to train the network that generates the algorithm.  Things will get better the more training they do.  Nvidia said they will take feedback and update the algorithms over time. :glad: 

The jokes have nothing to do with the card itself; it's more about the pricing and whether what it offers justifies buying it. The card is too expensive and doesn't do enough to warrant the asking price. That, and I don't think anyone really cares enough to get hyped up about yet another super-sampling method and an infant implementation of ray-tracing. I don't think anyone said this was some driver-level feature either.

 

As I've always said, this is the generation to skip. No one is familiar with, or rushing to implement, these new technologies just yet because, well... these cards are very few and far between. At the end of the day it's up to the developer whether or not implementing them is worth the extra effort.

5 minutes ago, lynux3 said:

The jokes have nothing to do with the card itself; it's more about the pricing and whether what it offers justifies buying it. The card is too expensive and doesn't do enough to warrant the asking price. That, and I don't think anyone really cares enough to get hyped up about yet another super-sampling method and an infant implementation of ray-tracing. I don't think anyone said this was some driver-level feature either.

 

As I've always said, this is the generation to skip. No one is familiar with, or rushing to implement, these new technologies just yet because, well... these cards are very few and far between. At the end of the day it's up to the developer whether or not implementing them is worth the extra effort.

That's cool.

 

People in general will never get hyped over shit like this.  People in general don't know what checkerboard rendering and temporal injection are or do either.  It's not about whether people who don't care know about the tech... it's that they see their games look pretty much the same, or very slightly better or worse, and their FPS jump up.

 

Regardless of what anyone thinks.. the more technologies like this that exist, the better.  I fully 100% expect consoles to have some kind of deep learning neural network that devs can use to train their games.  It'll gain traction, one way or another.  DLSS's potential to clean up visual artifacts (such as dithering) is something traditional methods can't match, currently anyway.  I can understand disliking Nvidia.. but the technology is fucking cool.  And it's actually a working, functional technology at this point.  It should only get better. 

 

But yea, it completely depends on support.  We'll have to wait and see.

 

 

1 minute ago, Remij_ said:

That's cool.

 

People in general will never get hyped over shit like this.  People in general don't know what checkerboard rendering and temporal injection are or do either.  It's not about whether people who don't care know about the tech... it's that they see their games look pretty much the same, or very slightly better or worse, and their FPS jump up.

 

Regardless of what anyone thinks.. the more technologies like this that exist, the better.  I fully 100% expect consoles to have some kind of deep learning neural network that devs can use to train their games.  It'll gain traction, one way or another.  DLSS's potential to clean up visual artifacts (such as dithering) is something traditional methods can't match, currently anyway.  I can understand disliking Nvidia.. but the technology is fucking cool.  And it's actually a working, functional technology at this point.  It should only get better. 

 

But yea, it completely depends on support.  We'll have to wait and see.

 

 

I don't follow GPUs extensively, but DLSS seems like an interesting method, mainly because I follow efforts from developers who use Google's TensorFlow and DeepMind (among other technologies) extensively. AI is the future of just about every technology these days, and it's barely getting off the ground.

 

I will say that I'm infinitely more interested in CPUs than I am in GPUs, which is why I'm a fan of Ryzen and what AMD has brought to the table... and what do you know, Intel just announced their "vision" of future CPUs and it's exactly what AMD has been doing for almost two years now. It's the same "vision" they made fun of when AMD revealed Ryzen and Epyc. Oh, the irony. :]

3 minutes ago, lynux3 said:

I don't follow GPUs extensively, but DLSS seems like an interesting method, mainly because I follow efforts from developers who use Google's TensorFlow and DeepMind (among other technologies) extensively. AI is the future of just about every technology these days, and it's barely getting off the ground.

 

I will say that I'm infinitely more interested in CPUs than I am in GPUs, which is why I'm a fan of Ryzen and what AMD has brought to the table... and what do you know, Intel just announced their "vision" of future CPUs and it's exactly what AMD has been doing for almost two years now. It's the same "vision" they made fun of when AMD revealed Ryzen and Epyc. Oh, the irony. :]

I saw that :tom: 

 

Fucking Intel ROFL..  bu but glued together CPUs :cry: 

 

:cruise: 

 

Ryzen 3000 :bow: 

Chiplets :bow: 

Zen 2 :bow: 

 

17 minutes ago, Remij_ said:

I saw that :tom: 

 

Fucking Intel ROFL..  bu but glued together CPUs :cry: 

 

:cruise: 

 

Ryzen 3000 :bow: 

Chiplets :bow: 

Zen 2 :bow: 

 

Glad Infail is following suit though. Should make 2019/2020 very interesting years for CPUs. B) That, and with Intel working on their dedicated GPU, we'll have another big player in the market again. It'll be Navi vs. Intel's GPU vs. NVIDIA's follow-up to Turing.

1 minute ago, lynux3 said:

Glad Infail is following suit though. Should make 2019/2020 very interesting years for CPUs. B) That, and with Intel working on their dedicated GPU, we'll have another big player in the market again. It'll be Navi vs. Intel's GPU vs. NVIDIA's follow-up to Turing.

:whew: 

