
Remij

Hermits
  • Content Count: 38,627
  • Joined
  • Last visited
  • Days Won: 141

Everything posted by Remij

  1. Weird that it won't run. People were saying that drivers just activated it. I should say, though, that I did some deeper digging and you're right that AMD themselves haven't updated their driver with their own specific fallback layer. This is an MS-designed fallback layer for compatibility, so developers could learn the API; it was never about performance, and it currently isn't meant for anything else. That explains why performance is so terrible compared to Nvidia's non-RTX cards. (See the sketch after this list for where the fallback path comes in.)
  2. The fallback layer is supported by ALL DX12 devices... Before yesterday.. the demo wouldn't start... now it does.
  3. Are your drivers and Windows 10 up to date? I just told you that they released new drivers activating it yesterday..
  4. Yes, AMD does. They literally just activated it in their drivers yesterday... which is why you're seeing this now.. Here, try the demo yourself. Maybe someone faked that image? I wouldn't put it past them... http://s000.tinyupload.com/index.php?file_id=75266639847527851391
  5. Maybe this thread is about AMD vs Nvidia... Here's a 1050 getting 26fps with more primary rays per second
  6. No it's not.. lmao You're just taking this hard because you think the thread is about AMD vs Nvidia... when it's really about DXR fallback vs DXR..
  7. Nobody is begging the question... it's a simple matter of knowing how it currently performs through the compute cores vs Nvidia's RTX DXR layer. You know... it's a relevant point of interest for many people who are curious as to how large of a difference it would be.
  8. The fallback layer was created for a reason, wasn't it? These are the results... deal with it.
  9. 324fps vs... 10fps. Also, DXR is now integrated into the latest UE4 build.
  10. They have money. We'll find out just how serious they are soon enough. Apparently Amazon has been building up some dev studios, and they're working on some pretty high-quality stuff. Not little small-to-mid-sized studios either.. New IP could be exciting...
  11. Amazing.. that fucking medley is just edit: I just noticed the FF1 cover poster she has OMFG
  12. Yes, they do prove my point. You're arguing a strawman you created, dude.
  13. What... did you think the comparison ends with one game, or one set of screens? It's hilarious how much TAA quality can vary between games... and even scenes within games... and yet you're here acting like that won't be the case for DLSS rofl... And yes.. I'll keep posting pics which prove my point. Just like Nvidia will keep improving the technology. Keep raging on about something that nobody ever said would be the case... which is that it would look better than native.
  14. No, this shot represents a stress test for the technology.. the foliage and the noise from it are precisely what's hard for DLSS to keep up with while maintaining sharpness and clarity. Yes.. there's a tiny bit of difference... In many cases the foliage actually looks better and sharper in the DLSS image. Yes.. you were right, the top is native and the 2nd is DLSS. To truly see any appreciable difference in fullscreen, which couldn't just be attributed to changes in the shadows and lighting from the dynamic TOD... you have to zoom in. This is even less so while playing. An
  15. The imposed limitations are stupid, for sure. I don't agree with RT needing to be enabled to enable DLSS. I've never defended that... only explained why it's likely the case. When I told you that, I told you what it looked like in Port Royal... in fact.. I showed you. This was an implementation that was extremely bad just a few days ago, and is now extremely comparable. I'm not always trying to be combative with you fucks. Since I can't test 1080p DLSS vs 1440p, I'd like YOU to post a couple of screenshots showing what you say you are seeing. I'm not def
  16. Eh, 1080p DLSS and 1440p DLSS were always going to be more noticeably blurry. It works best at higher resolutions.. everyone knows that. So which is which?
  17. They'll have to be careful how they present everything, that's for sure. Not simply to placate the main Xbox gamers, but to minimize confusion on both sides. They better have things clearly laid out and presented as well as limitations and expectations clearly expressed. It's a lot of shit going on all at once possibly... and is compounded if they're actually talking about their next gen system specs and everything as well. I'm not sure any company could really navigate all of this without causing some confusion... It's going to be a strange and yet intere
  18. And the REAL funny shit here... is that it's right there with the Xbox One X for input lag....
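
For anyone who wants to see where the "fallback vs native DXR" split actually lives, here's a minimal sketch of the device-creation step. I'm assuming the function and flag names from Microsoft's D3D12 Raytracing Fallback Layer sample library (DirectX-Graphics-Samples), so take the exact signatures as illustrative rather than authoritative: the app-facing API is the same either way, but without native driver support every DispatchRays call ends up emulated in compute shaders, which is where the huge performance gap comes from.

```cpp
// Sketch only -- names are from Microsoft's D3D12 Raytracing Fallback Layer
// sample library as best I recall them; treat signatures as illustrative.
#include <d3d12.h>
#include <wrl/client.h>
#include "D3D12RaytracingFallback.h" // fallback layer header from the sample repo

using Microsoft::WRL::ComPtr;

// True only when the driver/OS expose native, driver-backed DXR.
bool SupportsNativeDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

ComPtr<ID3D12RaytracingFallbackDevice> CreateRaytracingDevice(ID3D12Device* device)
{
    // The fallback wrapper exposes the same DXR-style interface either way:
    // backed by the driver when native support exists, emulated in compute
    // shaders when it doesn't. ForceComputeFallback pins it to the emulated
    // path, which is the path every non-RTX card is running in the demo.
    UINT flags = SupportsNativeDXR(device)
                     ? CreateRaytracingFallbackDeviceFlags::None
                     : CreateRaytracingFallbackDeviceFlags::ForceComputeFallback;

    ComPtr<ID3D12RaytracingFallbackDevice> rtDevice;
    D3D12CreateRaytracingFallbackDevice(device, flags, 0, IID_PPV_ARGS(&rtDevice));
    return rtDevice;
}
```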