DynamiteCop Posted May 8, 2019
AMD doesn't have driver support for any of this shit yet, you fucking flop.
Remij (Author) Posted May 8, 2019
4 minutes ago, DynamiteCop! said: "AMD doesn't have driver support for any of this shit yet you fucking flop"
God fucking DAMN you are dumb as fuck.
roflpwnedz Posted May 8, 2019
Vega 56 is pretty good. I <3 it.
DynamiteCop Posted May 8, 2019
1 hour ago, roflpwnedz said: "vega 56 is pretty good. I <3 it"
64 master race.
roflpwnedz Posted May 8, 2019
10 minutes ago, DynamiteCop! said: "64 master race"
lol, I got a 24k-something graphics score on Fire Strike, but I bet yours was on stock or turbo ;(
DynamiteCop Posted May 8, 2019
2 minutes ago, roflpwnedz said: "lol i got a 24ksomething graphics score on firestrike but I bet yours was on stock or turbo ;("
It was stock. I messed with undervolting and it works, but it's just too noisy because I have to ramp up the fan.
Edited May 8, 2019 by DynamiteCop!
Remij (Author) Posted May 8, 2019
LMAO, I still can't get over the fact that they outright fucking lied that it was 4K/30 when in reality it's 1080p/30... Also the fact that Deeno thinks Crytek's implementation requires, and would improve with, driver support... which it doesn't. Crytek's implementation is API- and hardware-agnostic... there are no "drivers to improve anything"... it runs how it runs.
It's funny how you were claiming it was all superior before, stating that it was taking advantage of AMD's hardware, when it was "running at 4K/30". But now it's revealed that it runs at 1080p/30, and suddenly it's all "AMD doesn't have drivers yet."
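For anyone wondering what "API- and hardware-agnostic" means here in practice: this style of ray tracing is just ordinary arithmetic evaluated in general-purpose shaders, so there is no vendor-specific function for a driver to expose. A toy sketch of the core operation, ray-sphere intersection, written in plain Python purely for illustration (a real engine would run the equivalent math per pixel in a compute shader; none of these names come from CryEngine):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    This is the same quadratic a compute-shader ray tracer solves per pixel;
    no dedicated RT hardware or driver extension is involved.
    """
    # Vector from the ray origin to the sphere center
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c        # discriminant of the quadratic
    if disc < 0:
        return None                    # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0 else None        # hits behind the origin don't count

# A ray fired down -z from the origin at a unit sphere centered 5 units away
hit = ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
print(hit)  # 4.0 (surface is one radius in front of the center)
```

Which is the whole point of the argument above: this runs anywhere compute shaders run, fast hardware or slow.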
roflpwnedz Posted May 8, 2019
25 minutes ago, DynamiteCop! said: "It was stock, I messed with undervolting and it works but it's just too noisy because I have to ramp up the fan."
I'm just impressed by how much these cards can OC, even the 64. You could slap on a cooler later on, and the drivers keep making them better and better.
DynamiteCop Posted May 8, 2019
The funny thing is you think that in some way we're getting owned here: a middle-of-the-road AMD GPU from 2017 with no special hardware is doing ray tracing at 1080p and 30 FPS with absolutely no drivers or software optimization surrounding it from AMD.
"At the moment we don't benefit from any additional performance that modern APIs like Vulkan or DX12 or dedicated hardware like the latest generation of graphics cards could give us. But of course, we will optimize the feature to achieve an uplift performance from these APIs and graphics cards."
It's still in its infancy, and AMD hasn't even done anything to account for it on their end. The entire purpose of a driver is hardware communication with the software, and right now Crytek is achieving what they are without AMD's drivers; of course once they release drivers this will help increase performance. Don't be retarded.
1 minute ago, roflpwnedz said: "Im just impressed how much these cards can oc. Even the 64. You could slap on a cooler later on and the drivers keep making it better and better."
I'm not too concerned about it; the card seems to handle itself just fine without OC'ing.
Remij (Author) Posted May 8, 2019
34 minutes ago, DynamiteCop! said: "The funny thing is you think that in some way we're getting owned here..."
Funny how it wasn't in its infancy when Nvidia first announced and showed their RT...
Yes, you are getting owned. You touted that a Vega 56 was running this at 4K/30 as a reason why Crytek's implementation was superior, that "compute RT" was good enough, and that Nvidia scammed people...
Again, you seem to be missing the fact that this is a severely pared-back implementation of RT, in a very basic scene with nothing dynamic happening. This implementation does NOT require any drivers from AMD's end... drivers aren't, and wouldn't be, exposing any hardware functions to help accelerate RT on these GPUs. This is done in software in CryEngine... it's agnostic. The "additional performance" they are referring to is the general performance improvement that those APIs can bring... not RT-specific, not something that drivers are going to expose, not something that's going to significantly speed up RT performance.
Have you stopped to ask yourself where AMD's DXR drivers are? Why haven't they released them yet? Straight away it requires 6GB+ of VRAM, which excludes a lot of their GPUs. AMD went on record saying that they won't support DXR until their entire product stack can support it. So it's likely they'll never release driver support for their current GPUs; they know performance is too abysmal to even bother.
But of course... that has nothing to do with the fact you were gloating before about how amazing Crytek's implementation was and how compute RT was good enough... and that it means nothing that there's actually been a 4x reduction in pixels LOL.
Can you imagine if Sony released something, demoed it running at 4K, and then said the final thing would run at 1080p... you'd be laughing your ass off at them. Which is exactly why I'm laughing at you.
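The "4x reduction in pixels" is straightforward arithmetic, since UHD "4K" has exactly four times the pixels of 1080p:

```python
pixels_4k = 3840 * 2160        # 8,294,400 pixels (UHD "4K")
pixels_1080p = 1920 * 1080     # 2,073,600 pixels
ratio = pixels_4k / pixels_1080p
print(ratio)  # 4.0
```

So at the same 30 FPS, the demo is pushing a quarter of the pixel throughput the video implied.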
lynux3 Posted May 8, 2019
AMD already confirmed their DX12 GPUs will support DXR's fallback layer, but it doesn't matter: Microsoft isn't even going to maintain that code anymore.
Edited May 8, 2019 by lynux3
Remij (Author) Posted May 8, 2019
2 minutes ago, lynux3 said: "AMD already confirmed their DX12 GPUs will support DXR's fallback layer."
Yea, they did... and where are they? They have support already but aren't enabling it, because they say they won't support it until their entire product range can do it, which means 6GB+ on all GPUs, from low end to high end. Regardless, this demo doesn't use DXR.
It's just amazing how they had the gall to show a 4K/30 video of a demo and then, much later, after AMD fags gloated so much, reveal that it's actually only 1080p/30 on that GPU. lmao... such fucking liars, trying to make their implementation sound impressive/superior to others. lol
lynux3 Posted May 8, 2019
Just now, Remij_ said: "Yea, they did... and where are they?..."
Not sure, but it doesn't matter. Microsoft isn't even going to maintain the code for it. https://github.com/Microsoft/DirectX-Graphics-Samples/tree/master/Libraries/D3D12RaytracingFallback
Remij (Author) Posted May 8, 2019
1 minute ago, lynux3 said: "Not sure, but it doesn't matter. Microsoft isn't even going to maintain the code for it."
Well of course. There was obviously never any real intention to continually update the fallback layer, because the idea is that future hardware will support hardware-based RT acceleration. They built the fallback layer for development purposes. Nvidia released their fallback-layer driver. AMD already has support ready to go but won't enable it, quite simply because performance is abysmal, it requires 6GB of VRAM, and there's hardly any game support. There are no "drivers" coming from AMD that are going to change their RT situation. DynamiteCop needs to stop acting like AMD has special optimizations up their sleeve to enable their hardware to do shit it's simply not capable of doing well. AMD are holding off because they know exactly how abysmal it is atm.
The simple fact is that this demo was falsely represented, AMD doesn't perform nearly as well as was advertised, and Deeno should have known better than to gloat about it.
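The fallback-layer argument boils down to a dispatch decision: use dedicated RT hardware if the device has it, fall back to compute emulation if the vendor enables it and the card clears the memory floor, otherwise don't expose RT at all. A schematic sketch of that decision, with made-up tier values (these are not the real D3D12 enums) and the 6GB figure taken from the thread:

```python
def pick_raytracing_path(hw_tier, vram_gb):
    """Choose an RT execution path. Hypothetical tiers, not real D3D12 enums.

    hw_tier: 0 = no support, 1 = compute/fallback capable, 2 = dedicated HW.
    A fallback layer emulates the RT API on general compute shaders, which
    is why it carries a steep cost (and, per the thread, a ~6GB VRAM floor).
    """
    if hw_tier >= 2:
        return "hardware"            # dedicated RT units (e.g. RTX cards)
    if hw_tier == 1 and vram_gb >= 6:
        return "compute-fallback"    # emulated on general compute shaders
    return "unsupported"             # vendor declines to expose RT at all

print(pick_raytracing_path(2, 8))    # hardware
print(pick_raytracing_path(1, 4))    # unsupported (below the VRAM floor)
```

Which is the crux of the dispute: a vendor can have the fallback path "ready to go" and still choose to return "unsupported" across the board.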
JonDnD Posted May 8, 2019
20 minutes ago, Remij_ said: "Yea, they did... and where are they?..."
That card can't do 4K anyway in real games. Who expected it to do 4K with RT? RTX flopped tbh tho.
Remij (Author) Posted May 8, 2019
16 minutes ago, JONBpc said: "That card cant do 4k anyway on real games who expected it to do 4k with rt Rtx flopped thhbo"
A tech demo isn't a game... They showed a 4K/30fps video and stated at the end of it that it was run on a Vega 56... because Crytek's RT implementation was "so superior and compute RT is good enuf". I should also state that CryEngine notoriously favors AMD-based hardware. rofl..
1080p/30
extremely basic scene
no dynamic objects
partial reflections, only for mirror surfaces
cheapest form of RT
LMAO. AMDeenoHBR LynnoRTHBR JonnRTXHBR
Reading that thread I posted again... lmfao, you fucks are so smug in your ignorance. Fuck all of you.
The Mother Fucker Posted May 9, 2019
A Vega 56 today is still more bang for your buck than an RTX 2080 Ti.
Remij (Author) Posted May 9, 2019
4 minutes ago, The Mother Fucker said: "a VEGA 56 today is still more bang for your buck than a RTX 2080Ti."
Not when it comes to ray tracing lol.
DynamiteCop Posted May 9, 2019
1 minute ago, Remij_ said: "Not when it comes to ray tracing lol"
In that regard it appears as though it's dead even.