AMD IR Day: Next gen consoles based on "Zen 2" and "next gen RDNA"



34 minutes ago, Remij_ said:

AMD fanboys always run their mouths.  They can never back that shit up.  Always late to the conversation.

 

It's like, just fuck right off.  You can enter the conversation when you have hardware that supports what you're talking about :tom: 

The hurt is real. :dame: Who cares? Ray tracing isn't exactly taking over the world like NVIDIA was hoping. Who knows if AMD's solution is going to be as poor as, or even poorer than, NVIDIA's. Like I've been saying, the feature is too expensive for its own good. I doubt we'll even see "medium" level settings in next gen games, but I'd like to be wrong.


Just now, lynux3 said:

The hurt is real. :dame: Who cares? Ray tracing isn't exactly taking over the world like NVIDIA was hoping. Who knows if AMD's solution is going to be as poor as, or even poorer than, NVIDIA's. Like I've been saying, the feature is too expensive for its own good. I doubt we'll even see "medium" level settings in next gen games, but I'd like to be wrong.

You will be wrong.  Like usual :tom: 

Just now, lynux3 said:

I've been more right than you have in the last year. :tom: 

 

DUN TALK BOUT WAY TWACING IF U DUN HAB IT :cries:

 

Quit crying you goddamn pansy. :cruise: 

LMAO no you haven't.  Like usual :geese: 

 

You're like kids wanting to sit at the dinner table with the adults, go Radeoff yourself :pffft:

1 minute ago, Remij_ said:

LMAO no you haven't.  Like usual :geese: 

 

You're like kids wanting to sit at the dinner table with the adults, go Radeoff yourself :pffft:

Yeah I have, like usual. :banderoos:

 

RTX flopped, people are pointing it out and your butthurt rises exponentially. :rofls: There's nothing better than watching your NVIDIA cock sucking ass cry about it too. :kaz:  NVIDIOT :D 

2 minutes ago, lynux3 said:

Yeah I have, like usual. :banderoos:

 

RTX flopped, people are pointing it out and your butthurt rises exponentially. :rofls: There's nothing better than watching your NVIDIA cock sucking ass cry about it too. :kaz:  NVIDIOT :D 

What dear, did you say something?

 

[image: kids seated at the kids' table]

 

:tom: 


The video makes a lot of sense. When talking about CUDA, Turing, and the imbalance between them, he goes on to talk about hardware acceleration and the like. Hardware acceleration should take your existing performance profile without acceleration and maintain it while performing the additional task it's set out to do. Turing and RTX do not do this: they kill that existing profile on the CUDA and raster-rendering side while bogging down the RT cores with work beyond what they can handle. It's not adequate for the task it's supposedly intended to handle, and as a result you get a sharp net negative in performance.
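
To put rough numbers on that "sharp net negative": a minimal frame-budget sketch in C++. The millisecond costs are illustrative assumptions, not measurements of any real card.

// Frame-budget arithmetic only -- the costs below are assumed figures
// for illustration, not benchmarks.
#include <cstdio>

int main() {
    const double raster_ms = 8.0;   // assumed cost of the raster passes
    const double rt_ms     = 10.0;  // assumed added cost of the RT pass

    const double fps_off = 1000.0 / raster_ms;            // 125 fps
    const double fps_on  = 1000.0 / (raster_ms + rt_ms);  // ~55.6 fps

    std::printf("RT off: %.1f fps, RT on: %.1f fps (%.0f%% drop)\n",
                fps_off, fps_on, 100.0 * (1.0 - fps_on / fps_off));
}

Even a modest added pass more than halves the frame rate here, which is the shape of the complaint regardless of the exact figures.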

 

 

Edited by DynamiteCop!
1 minute ago, DynamiteCop! said:

The video makes a lot of sense. When talking about CUDA, Turing, and the imbalance between them, he goes on to talk about hardware acceleration and the like. Hardware acceleration should take your existing performance profile without acceleration and maintain it while performing the additional task it's set out to do. Turing and RTX do not do this: they kill that existing profile on the CUDA and raster-rendering side while bogging down the RT cores with work beyond what they can handle. It's not adequate for the task it's supposedly intended to handle, and as a result you get a sharp net negative in performance.

 

 

The guy is an idiot and doesn't have any idea what he's talking about.

1 minute ago, Remij_ said:

What dear, did you say something?

 

:tom: 

NVIDIA :cruise:

 

RTX flopped and the damage your rectum sustained doesn't appear to be reversible. :cruise: 

 

AMD shipping more GPUs is definitive proof RTX flopped. :rofls: 

 

1 minute ago, Remij_ said:

It's as much of a response as that dipshit deserves :reg: 

He's got a point. Turing was obviously built for something else but is being used to try to accelerate RT; it can kind of do it, but not to any adequate standard. The only hope for RT is purpose-built hardware created directly for the task at hand. 

2 minutes ago, DynamiteCop! said:

He's got a point. Turing was obviously built for something else but is being used to try to accelerate RT; it can kind of do it, but not to any adequate standard. The only hope for RT is purpose-built hardware created directly for the task at hand. 

No he doesn't.  His point is fucking ridiculous.  The RT cores are SPECIFICALLY there to accelerate one of the key areas of the RT pipeline.

 

The cores are going to evolve over time and become more general purpose... meaning they will completely excel at accelerating what they are meant to... but will ALSO be able to perform other calculations that help different aspects of the rendering pipeline.

 

The guy is trying to blame Nvidia... for creating something that is currently specialized and only useful in certain situations... when all hardware does that, and then they build from there and become more programmable and general purpose in the future....  I mean.... for fuck's sake.. remember pixel shaders.... then vertex shaders... then x y z..... :snoop: 
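
For context on what "one of the key areas" means here: the RT cores accelerate BVH traversal, i.e. ray/box and ray/triangle intersection testing. A toy C++ sketch of that inner loop follows; the node layout and names are illustrative, not NVIDIA's actual hardware design. The point is that it's a branchy, divergent tree walk, which is exactly what runs poorly on general-purpose SIMD shader cores.

// Toy version of the loop RT cores implement in fixed function:
// walk a BVH, testing a ray against node bounding boxes.
#include <cstdio>
#include <vector>
#include <algorithm>

struct Vec3 { float x, y, z; };

struct BvhNode {
    Vec3 bmin, bmax;   // axis-aligned bounding box
    int  left, right;  // child indices; -1 marks a leaf
};

// Ray/AABB slab test -- one of the operations Turing runs in hardware.
bool hitAabb(const Vec3& o, const Vec3& inv, const Vec3& bmin, const Vec3& bmax) {
    float t1 = (bmin.x - o.x) * inv.x, t2 = (bmax.x - o.x) * inv.x;
    float tmin = std::min(t1, t2), tmax = std::max(t1, t2);
    t1 = (bmin.y - o.y) * inv.y; t2 = (bmax.y - o.y) * inv.y;
    tmin = std::max(tmin, std::min(t1, t2));
    tmax = std::min(tmax, std::max(t1, t2));
    t1 = (bmin.z - o.z) * inv.z; t2 = (bmax.z - o.z) * inv.z;
    tmin = std::max(tmin, std::min(t1, t2));
    tmax = std::min(tmax, std::max(t1, t2));
    return tmax >= std::max(tmin, 0.0f);
}

// Iterative traversal: branchy and divergent, hence the dedicated silicon.
int countLeafHits(const std::vector<BvhNode>& nodes, const Vec3& o, const Vec3& inv) {
    int hits = 0, stack[64], sp = 0;
    stack[sp++] = 0;  // start at the root
    while (sp > 0) {
        const BvhNode& n = nodes[stack[--sp]];
        if (!hitAabb(o, inv, n.bmin, n.bmax)) continue;
        if (n.left < 0) { ++hits; continue; }  // leaf: would test triangles here
        stack[sp++] = n.left;
        stack[sp++] = n.right;
    }
    return hits;
}

int main() {
    std::vector<BvhNode> nodes = {
        {{-2,-2,-2}, {2,2,2},  1,  2},  // root
        {{-2,-2,-2}, {0,2,2}, -1, -1},  // left leaf
        {{ 0,-2,-2}, {2,2,2}, -1, -1},  // right leaf
    };
    // Ray from (-5,0,0) along +x; 1e8f stands in for 1/0 on the flat axes.
    Vec3 origin{-5, 0, 0}, invDir{1.0f, 1e8f, 1e8f};
    std::printf("leaf hits: %d\n", countLeafHits(nodes, origin, invDir));
}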

2 minutes ago, DynamiteCop! said:

If AMD, Sony, and Microsoft are developing RT hardware purpose-built for it that is vastly superior in efficiency to Turing? Yes, 100% yes. 

And what does that entail to you?  How do you think they are going to do that?  You're talking... but not explaining anything.

 

Your argument is... If AMD, Sony, and MS develop something that's better... then it will be better!!! and thus Nvidia is screwed.

 

When has that EVER HAPPENED?  How much console hardware has AMD developed over the years?  And every time it was... oh well now everyone supports AMD.. Nvidia's dead!!

 

Then reality hits. 

 

And the simple fact is... that Nvidia's architecture is exposed THROUGH DXR... meaning they are working with MS.  Nvidia's NEXT ray tracing implementation will be much more efficient...  It's hilarious how you guys think RTX = Turing..
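
For what "exposed THROUGH DXR" looks like in practice: raytracing capability is queried through the standard D3D12 feature API, not through anything NVIDIA-specific. A minimal sketch (device creation omitted; assumes a valid ID3D12Device5*, and the helper name is mine):

// Any GPU reporting TIER_1_0 or better can run DXR workloads; the API
// does not care whose silicon does the acceleration.
#include <d3d12.h>

bool SupportsDxr(ID3D12Device5* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}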

Just now, Remij_ said:

No he doesn't.  His point is fucking ridiculous.  The RT cores are SPECIFICALLY there to accelerate one of the key areas of the RT pipeline.

 

The cores are going to evolve over time and become more general purpose... meaning they will completely excel at accelerating what they are meant to... but will ALSO be able to perform other calculations that help different aspects of the rendering pipeline.

 

The guy is trying to blame Nvidia... for creating something that is currently specialized and only useful in certain situations... when all hardware does that, and then they build from there and become more programmable and general purpose in the future....  I mean.... for fuck's sake.. remember pixel shaders.... then vertex shaders... then x y z..... :snoop: 

You're missing the point though. Xbox and PlayStation are going to set the standard for a different process of handling RT that has nothing to do with Turing, and Turing is going to fall by the wayside as a failed, expensive experiment. No one wants to deal with it unless they're being paid additionally to do it. 

 

The top comment on that video is a perfect example.

 

I thought exactly as you did in regards to Digital Foundry. I'm a game engine developer and in my world? using Turing to achieve ray-tracing is just as exotic as any compute brute force method. There's no standard API - Nvidia did release extensions for Vulkan - but by their nature? they're not going to be compatible with whatever the standard will be. Using Nvidia ray-tracing extensions in Vulkan, given the vast setup required? is just as locked-in and un-portable as using an Nvidia library directly. It's not like you just flip a switch - your engine has to be designed around it. The future ray-tracing 'API' will not plug into the holes that the Nvidia ray-tracing leaves behind - that's a 'hole' lot of work to re-factor code.

I'm heavily reminded of when ATI put tessellation onto their cards - long before OpenGL 4 and DirectX 11 - it worked OK if you coded for it directly, but it wasn't powerful enough to deliver the dream, and ATI had to pretty much start from scratch with their tessellation pipeline after Nvidia set the standard. Turing's RT hardware isn't powerful enough to achieve the dream - just like early ATI tessellation. They slapped wings on a car.
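
The commenter's lock-in point can be made concrete: the only ray tracing path in Vulkan right now is NVIDIA's VK_NV_ray_tracing vendor extension, and every entry point it adds is NV-suffixed (vkCreateAccelerationStructureNV, vkCmdTraceRaysNV, and so on), so an engine built on them can't be ported to a future cross-vendor standard by flipping a switch. A short detection sketch (the helper name is mine):

// Report whether the GPU exposes NVIDIA's proprietary ray tracing extension.
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

bool hasNvRayTracing(VkPhysicalDevice gpu) {
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());
    for (const auto& e : exts)
        if (std::strcmp(e.extensionName, VK_NV_RAY_TRACING_EXTENSION_NAME) == 0)
            return true;  // "VK_NV_ray_tracing" -- vendor-specific by name
    return false;
}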

