lynux3 (Author) Posted September 22, 2019

> 34 minutes ago, Remij_ said:
> AMD fanboys always run their mouths. They can never back that shit up. Always late to the conversation. It's like, just fuck right off. You can enter the conversation when you have hardware that supports what you're talking about.

The hurt is real. Who cares? Ray tracing isn't exactly taking over the world like NVIDIA was hoping, and who knows whether AMD's solution will be as poor as NVIDIA's, or even poorer. Like I've been saying, the feature is too expensive for its own good. I doubt we'll even see "medium" level settings in next-gen games, but I'd like to be wrong.
Remij Posted September 22, 2019

> Just now, lynux3 said:
> The hurt is real. Who cares? Ray tracing isn't exactly taking over the world like NVIDIA was hoping, and who knows whether AMD's solution will be as poor as NVIDIA's, or even poorer. Like I've been saying, the feature is too expensive for its own good. I doubt we'll even see "medium" level settings in next-gen games, but I'd like to be wrong.

You will be wrong. Like usual.
lynux3 (Author) Posted September 22, 2019

> 15 minutes ago, Remij_ said:
> You will be wrong. Like usual.

I've been more right than you have in the last year. DUN TALK BOUT WAY TWACING IF U DUN HAB IT. Quit crying, you goddamn pansy.
Remij Posted September 22, 2019

> Just now, lynux3 said:
> I've been more right than you have in the last year. DUN TALK BOUT WAY TWACING IF U DUN HAB IT. Quit crying, you goddamn pansy.

LMAO no you haven't. Like usual. You're like kids wanting to sit at the dinner table with the adults. Go Radeoff yourself.
lynux3 (Author) Posted September 22, 2019

> 1 minute ago, Remij_ said:
> LMAO no you haven't. Like usual. You're like kids wanting to sit at the dinner table with the adults. Go Radeoff yourself.

Yeah I have, like usual. RTX flopped, people are pointing it out, and your butthurt rises exponentially. There's nothing better than watching your NVIDIA cock sucking ass cry about it too. NVIDIOT
Remij Posted September 22, 2019

> 2 minutes ago, lynux3 said:
> Yeah I have, like usual. RTX flopped, people are pointing it out, and your butthurt rises exponentially. There's nothing better than watching your NVIDIA cock sucking ass cry about it too. NVIDIOT

What dear, did you say something?
DynamiteCop Posted September 22, 2019 (edited)

The video makes a lot of sense. When talking about CUDA and Turing and the imbalance, he goes on to talk about hardware acceleration and the like. Hardware acceleration should take your existing performance profile without acceleration and maintain it while handling the additional task it's set out to do. Turing and RTX do not do this; they kill that existing profile via CUDA and raster rendering while bogging down the RT cores to handle ray tracing beyond what they are capable of. It's not adequate for the task it's supposedly intended to handle, and as a result you get a sharp net negative in performance.

Edited September 22, 2019 by DynamiteCop!
Remij Posted September 22, 2019

> 1 minute ago, DynamiteCop! said:
> Hardware acceleration should take your existing performance profile without acceleration and maintain it while handling the additional task it's set out to do. Turing and RTX do not do this; they kill that existing profile via CUDA and raster rendering while bogging down the RT cores to handle ray tracing beyond what they can handle. It's not adequate for the task it's supposedly intended to handle, and as a result you get a sharp net negative in performance.

The guy is an idiot and doesn't have any idea what he's talking about.
DynamiteCop Posted September 22, 2019

> Just now, Remij_ said:
> The guy is an idiot and doesn't have any idea what he's talking about.

Like Turing for RTX, this is not an adequate response.
Remij Posted September 22, 2019

> Just now, DynamiteCop! said:
> Like Turing for RTX, this is not an adequate response.

It's as much of a response as that dipshit deserves.
lynux3 (Author) Posted September 22, 2019

> 1 minute ago, Remij_ said:
> What dear, did you say something?

NVIDIA RTX flopped, and the damage your rectum sustained doesn't appear to be reversible. AMD shipping more GPUs is definitive proof RTX flopped.
lynux3 (Author) Posted September 22, 2019

> 4 minutes ago, Remij_ said:
> The guy is an idiot and doesn't have any idea what he's talking about.

[embedded video]
DynamiteCop Posted September 22, 2019

> 1 minute ago, Remij_ said:
> It's as much of a response as that dipshit deserves.

He's got a point. Turing was obviously built for something else but is being used to try and accelerate RT; it can kind of do it, but not up to any adequate standard. The only hope for RT is purpose-built hardware directly created for the task at hand.
Remij Posted September 22, 2019

> 1 minute ago, lynux3 said:

"Nvidia's ready to crash into a crater... in 2020" - Dipshit youtuber. Ok.. If you say so.
DynamiteCop Posted September 22, 2019

> 1 minute ago, Remij_ said:
> "Nvidia's ready to crash into a crater... in 2020" - Dipshit youtuber. Ok.. If you say so.

If AMD, Sony and Microsoft are developing RT hardware purpose-built for it that is vastly superior in efficiency to Turing? Yes, 100% yes.
Remij Posted September 22, 2019

> 2 minutes ago, DynamiteCop! said:
> He's got a point. Turing was obviously built for something else but is being used to try and accelerate RT; it can kind of do it, but not up to any adequate standard. The only hope for RT is purpose-built hardware directly created for the task at hand.

No he doesn't. His point is fucking ridiculous. The RT cores are SPECIFICALLY there to accelerate one of the key areas of the RT pipeline. The cores are going to evolve over time and become more general purpose... meaning they will completely excel at accelerating what they are meant to, but will ALSO be able to perform other calculations that help different aspects of the rendering pipeline. The guy is trying to blame Nvidia for creating something that is currently specialized and only useful in certain situations, when all hardware starts that way and then becomes more programmable and general purpose over time... I mean... for fuck's sake, remember pixel shaders... then vertex shaders... then x y z...
lynux3 (Author) Posted September 22, 2019

> 3 minutes ago, Remij_ said:
> "Nvidia's ready to crash into a crater... in 2020" - Dipshit youtuber. Ok.. If you say so.

Don't watch this.
Remij Posted September 22, 2019

> 2 minutes ago, DynamiteCop! said:
> If AMD, Sony and Microsoft are developing RT hardware purpose-built for it that is vastly superior in efficiency to Turing? Yes, 100% yes.

And what does that entail to you? How do you think they are going to do that? You're talking... but not explaining anything. Your argument is: if AMD, Sony, and MS develop something that's better, then it will be better, and thus Nvidia is screwed. When has that EVER HAPPENED? How much console hardware has AMD developed over the years? And every time it was... oh well, now everyone supports AMD, Nvidia's dead! Then reality hits. And the simple fact is that Nvidia's architecture is exposed THROUGH DXR, meaning they are working with MS. Nvidia's NEXT ray tracing implementation will be much more efficient. It's hilarious how you guys think RTX = Turing..
DynamiteCop Posted September 22, 2019

> Just now, Remij_ said:
> No he doesn't. His point is fucking ridiculous. The RT cores are SPECIFICALLY there to accelerate one of the key areas of the RT pipeline. The cores are going to evolve over time and become more general purpose... meaning they will completely excel at accelerating what they are meant to, but will ALSO be able to perform other calculations that help different aspects of the rendering pipeline.

You're missing the point, though. Xbox and PlayStation are going to set the standard for a different process of handling RT that has nothing to do with Turing, and Turing is going to fall by the wayside as a failed, expensive experiment. No one wants to deal with it unless they're being paid additionally to do it. The top comment on that video is a perfect example:

> I thought exactly as you did in regards to Digital Foundry. I'm a game engine developer, and in my world using Turing to achieve ray tracing is just as exotic as any compute brute-force method. There's no standard API - Nvidia did release extensions for Vulkan - but by their nature they're not going to be compatible with whatever the standard will be. Using Nvidia ray-tracing extensions in Vulkan, given the vast setup required, is just as locked-in and un-portable as using an Nvidia library directly. It's not like you just flip a switch - your engine has to be designed around it. The future ray-tracing 'API' will not plug into the holes that the Nvidia ray tracing leaves behind - that's a 'hole' lot of work to re-factor code. I'm heavily reminded of when ATI put tessellation onto their cards - long before OpenGL 4 and DirectX 11 - it worked OK if you coded for it directly, but it wasn't powerful enough to deliver the dream, and ATI had to pretty much start from scratch with their tessellation pipeline after Nvidia set the standard. Turing's RT isn't powerful enough to achieve the dream - just like early ATI tessellation. They slapped wings on a car.
Remij Posted September 22, 2019

> 4 minutes ago, lynux3 said:
> Don't watch this.

More AMD fanboy youtubers... these guys are all trying so hard to convince themselves.