Devil May Cry V: GPU Benchmarks Rolling Out


Recommended Posts

1 minute ago, Jon2B said:

Well, he's right. Nobody here has a 9900k, not even you 

Why would I buy one when my CPU will already get me 200fps in this fucking 60fps console game with braindead AI :drake: 

 

You idiots are dumb as FUCK.


1 minute ago, Remij_ said:

Because it's the typical bullshit he pulls.  Ah lemme just change the CPU to something I think more people will have.... Ah... lemme just saturate and add more contrast to this Gears of War 4 picture to exaggerate how AMAZING HDR looks on the Xbox One X version and how dull the PC version looks... :mj: 

 

I catch him every fucking time.

 

Sorry idiots... your Xbox isn't hanging with my PC regardless of the CPU I have... or any high-end PC.  It's a mid-range PC lite... and you idiots need to fucking RECOGNIZE that shit.

 

MCC best on PC btw get fucked :juggle: 

 

2 minutes ago, DynamiteCop! said:

Because he's emotionally unstable. 6- and 8-core CPUs make up 13% of the PC market, and AMD has offered both for the last 10 years, which is easily the lion's share of that percentage...

 

Most people, even with the best PCs out there, are still using 4-core CPUs; i5s and i7s... Not a newfangled 6-core i7 or an 8-core i9. It's stupid to parse GPU results from something that's a total edge case; I should realistically have put a worse CPU in that chart.

 

Oh and with the Gears image I already explained to you exactly why I did that. HDR isn't captured when taking a screenshot, and the contrast levels look absolutely nothing like what you're given in SDR format.

 

Here's a prime example.

 

SDR capture of RDR2

 

[image: 44675357215_a399192ccc_o.png]

 

Here's what it actually looks like when viewed on an HDR TV; the bottom, with adjusted contrast and lighting levels, is far more indicative of its actual representation.

 

[image: 44865529264_f870cfd722_o.png]
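(For reference, the sort of adjustment being described is only a few lines of image processing. A minimal sketch in Python using Pillow; the filenames and enhancement factors are invented for illustration, not anyone's actual settings:)

```python
# Hypothetical sketch: nudge an SDR capture's contrast/brightness/saturation
# so it reads closer to what the HDR presentation looks like on a TV.
# Requires Pillow; filenames and factors are placeholders.
from PIL import Image, ImageEnhance

img = Image.open("rdr2_sdr_capture.png")          # raw SDR screenshot
img = ImageEnhance.Contrast(img).enhance(1.25)    # deepen contrast ~25%
img = ImageEnhance.Brightness(img).enhance(1.10)  # lift overall light level
img = ImageEnhance.Color(img).enhance(1.15)       # mild saturation boost
img.save("rdr2_adjusted.png")
```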

 

None of these things are "gotcha" moments, you're just being stupid.

3 minutes ago, DynamiteCop! said:

 

Oh and with the Gears image I already explained to you exactly why I did that. HDR isn't captured when taking a screenshot, and the contrast levels look absolutely nothing like what you're given in SDR format.

 

Here's a prime example.

 

SDR capture of RDR2

 

[image: 44675357215_a399192ccc_o.png]

 

Here's what it actually looks like when viewed on an HDR TV; the bottom, with adjusted contrast and lighting levels, is far more indicative of its actual representation.

 

[image: 44865529264_f870cfd722_o.png]

 

None of these things are "gotcha" moments, you're just being stupid.

No... YOU are absolutely being stupid.  Your adjusting the contrast and saturation IS NOT MIMICKING THE EFFECTS OF HDR....

 

I mean... if that WAS the case... then VOILA! all games are now HDR :drake:   

 

You fucking guys :snoop: 

 

And RDR2 is a TERRIBLE example dude.... it's MORE washed out and LESS contrasty with HDR :D 

1 minute ago, Remij_ said:

No... YOU are absolutely being stupid.  Your adjusting the contrast and saturation IS NOT MIMICKING THE EFFECTS OF HDR....

 

I mean... if that WAS the case... then VOILA! all games are now HDR :drake:   

 

You fucking guys :snoop: 

 

And RDR2 is a TERRIBLE example dude.... it's MORE washed out and LESS contrasty with HDR :D 

Of course it's not making it HDR; I'm not stupid. It's making it appear more in line with what is actually seen when viewing this content in play... Even a fucking camera can pick up the difference in light and contrast levels...

 

[image: 32371379717_f8b364cb59_o.jpg]

 

[image: 46589994694_7ac6aeb880_o.jpg]

14 minutes ago, DynamiteCop! said:

How else do you try to convey what is actually seen with your eyes?

You don't.  You especially don't to someone who you know knows what HDR looks like in person and what it does.  And again, it's not conveying what you want it to convey...  what it conveys is a picture that's been adjusted to be more contrasty... something that can be done on any set.  And of course it's especially disingenuous to do it, when whatever you're comparing it to isn't having the same thing done... 

Just now, Remij_ said:

You don't.  You especially don't to someone who you know knows what HDR looks like in person and what it does.  And again, it's not conveying what you want it to convey...  what it conveys is a picture that's been adjusted to be more contrasty... something that can be done on any set.  It's especially disingenuous to do it, when whatever you're comparing it to isn't having the same thing done... 

Then how do you explain those camera shots looking completely different? Because lighting and contrast levels are substantially altered between SDR and HDR, which not only a camera picks up, but which also doesn't translate over to SDR screenshots.

 

It's not disingenuous, it's actually more genuine because you're trying to better replicate what it actually looks like on screen. 

 

 

9 minutes ago, DynamiteCop! said:

Then how do you explain those camera shots looking completely different? Because lighting and contrast levels are substantially altered between SDR and HDR, which not only a camera picks up, but which also doesn't translate over to SDR screenshots.

 

It's not disingenuous, it's actually more genuine because you're trying to better replicate what it actually looks like on screen. 

 

 

No.. lmao.  Dynamite... I'm not stupid man.  Please stop.  There's like 10 different ways I could come at you with this...

 

Imagine review sites doing what you tried to do...  Advertising spots already do this shit and it's ridiculous.

 

The absolutely STUPID thing about what you were attempting to do... is that you were attempting to show that HDR is more contrasty and vivid... except doing what you did... to get your point across... could be done... and applied to the opposing image... thus REDUCING the impact that you were TRYING to demonstrate.

 

You can't SHOW someone what HDR is like using an SDR display... and then tell them that they need an HDR display to understand.....  

 

I know you were just trying to make a dull-looking game in SDR look less dull and more vivid... but you were doing it with the intention of making the difference more pronounced than it actually is (PC vs Xbox)... and it STILL isn't showing the ACTUAL effect of HDR.  

15 minutes ago, Remij_ said:

No.. lmao.  Dynamite... I'm not stupid man.  Please stop.  There's like 10 different ways I could come at you with this...

 

Imagine review sites doing what you tried to do...  Advertising spots already do this shit and it's ridiculous.

 

The absolutely STUPID thing about what you were attempting to do... is that you were attempting to show that HDR is more contrasty and vivid... except doing what you did... to get your point across... could be done... and applied to the opposing image... thus REDUCING the impact that you were TRYING to demonstrate.

 

You can't SHOW someone what HDR is like using an SDR display... and then tell them that they need an HDR display to understand.....  

 

I know you were just trying to make a dull-looking game in SDR look less dull and more vivid... but you were doing it with the intention of making the difference more pronounced than it actually is... and it STILL isn't showing the ACTUAL effect of HDR.  

You're implying that this is some type of bullshot; it's not. This is almost exactly how it looks to the human eye, bar the brighter highlights, which a camera does not pick up. 

 

You can't see HDR via an SDR display, that's correct, but even without the ability to see it you can still see the dramatic shift in light and contrast, and even the apparent color shift which warps all of those levels. 

 

These images are one and the same; the only difference is HDR, which you can clearly see alters the color space, the lighting, and the way contrast is portrayed. You're not picking up the nits, but you can see its effect regardless.
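(A rough illustration of why the nits themselves can't come through: HDR scene luminance has to be squeezed into SDR's 0-1 range somehow, and any such tone map crushes the highlights. A minimal Python sketch using the classic Reinhard operator; this is an assumed, generic example, not necessarily what the game or capture pipeline actually uses, and the paper_white value is a placeholder:)

```python
# Toy tone-mapping example: compress HDR luminance (nits) into SDR's 0-1 range
# with the Reinhard operator. Bright HDR highlights land almost on top of each
# other, which is why a raw SDR capture of HDR content looks flat.
def reinhard(luminance_nits: float, paper_white: float = 100.0) -> float:
    """Map scene luminance in nits to a 0-1 SDR value."""
    l = luminance_nits / paper_white
    return l / (1.0 + l)

print(reinhard(100))   # ~0.50  -> a 100-nit highlight keeps plenty of range
print(reinhard(1000))  # ~0.91  -> a 1000-nit highlight is pushed near white
print(reinhard(4000))  # ~0.98  -> 4x brighter, yet almost no visible difference
```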

 

Manually adjusting brightness and contrast to try and replicate the effect HDR has on a perceived scene in a display is the most honest way you could treat HDR content in an SDR format. You're not trying to pull the wool over people's eyes; you're trying to get it as close to what is actually seen as possible. 

 

It's amazing how dumb you're being about this.

 

[image: 47313772101_d4862ee042_o.jpg]

 

[image: 46590461004_4a620543c2_o.jpg]

Edited by DynamiteCop!
1 hour ago, Remij_ said:

Why would I buy one when my CPU will already get me 200fps in this fucking 60fps console game with braindead AI :drake: 

 

You idiots are dumb as FUCK.

Lol dumbo, you argue against your own argument :D

11 minutes ago, Jon2B said:

Lol dumbo , you argue against your own argument :D

No... it's not against my own argument.  My argument is very different.  It's about him ALTERING things to either be more noticeably in favor of one thing (his "HDR" tuned pic compared to my standard pic) and less in favor of others... like this CPU shit.

 

If we're going to be showing what GPUs are capable of... then you choose the highest-end CPU option available to avoid potentially bottlenecking...  These are GPU tests... you want each GPU to be given the ability to stretch as far as it can.  His manipulation is pure bullshit.  People with 2080 Tis likely have higher-end CPUs that are overclocked... he knows this...
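(The bottleneck point boils down to a one-liner: the frame rate you see is capped by whichever of the CPU or GPU is slower. A minimal Python sketch; the numbers are invented for illustration, not real DMC5 benchmark results:)

```python
# Minimal sketch of the CPU/GPU bottleneck argument: effective frame rate
# is the minimum of what the CPU and the GPU can each sustain.
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

# With a fast CPU the GPUs separate cleanly (a valid GPU test)...
print(effective_fps(cpu_fps=200, gpu_fps=160))  # -> 160, GPU-bound
# ...with a slower CPU, every high-end GPU hits the same wall.
print(effective_fps(cpu_fps=120, gpu_fps=160))  # -> 120, CPU-bound
```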

 

That of course... is on top of the fact that it was a ridiculous comparison to begin with because consoles and PCs run at different quality graphical settings.  When Alex does his "X1X equivalent settings" then we'll see how different it is...

 

And this is even more hilarious now... since we know it uses reconstruction and is rendering 1/2 resolution... on top of those reduced graphical settings... and it's STILL chugging in the 30s and 40s fps in cutscenes, and I've heard it drops quite a bit in busy gameplay sections :smilecry: 

 

35 minutes ago, DynamiteCop! said:

You're implying that this is some type of bullshot; it's not. This is almost exactly how it looks to the human eye, bar the brighter highlights, which a camera does not pick up. 

 

I'm not implying it's some type of bullshot... I'm saying it's bullSHIT.  Stop fucking with your pics, and stop fucking with benchmark results.  It's quite simple.

10 minutes ago, Remij_ said:

No... it's not against my own argument.  My argument is very different.  It's about him ALTERING things to either be more noticeably in favor of one thing (his "HDR" tuned pic compared to my standard pic) and less in favor of others... like this CPU shit.

 

If we're going to be showing what GPUs are capable of... then you choose the highest-end CPU option available to avoid potentially bottlenecking...  These are GPU tests... you want each GPU to be given the ability to stretch as far as it can.  His manipulation is pure bullshit.  People with 2080 Tis likely have higher-end CPUs that are overclocked... he knows this...

 

That of course... is on top of the fact that it was a ridiculous comparison to begin with because consoles and PCs run at different quality graphical settings.  When Alex does his "X1X equivalent settings" then we'll see how different it is...

 

And this is even more hilarious now... since we know it uses reconstruction and is rendering 1/2 resolution... on top of those reduced graphical settings... and it's STILL chugging in the 30s and 40s fps in cutscenes, and I've heard it drops quite a bit in busy gameplay sections :smilecry: 

 

So you argue against what he did but are proof of what he was saying lol ok 

5 minutes ago, Remij_ said:

No... it's not against my own argument.  My argument is very different.  It's about him ALTERING things to either be more noticeably in favor of one thing (his "HDR" tuned pic compared to my standard pic) and less in favor of others... like this CPU shit.

 

If we're going to be showing what GPUs are capable of... then you choose the highest-end CPU option available to avoid potentially bottlenecking...  These are GPU tests... you want each GPU to be given the ability to stretch as far as it can.  His manipulation is pure bullshit.  People with 2080 Tis likely have higher-end CPUs that are overclocked... he knows this...

 

That of course... is on top of the fact that it was a ridiculous comparison to begin with because consoles and PCs run at different quality graphical settings.  When Alex does his "X1X equivalent settings" then we'll see how different it is...

 

And this is even more hilarious now... since we know it uses reconstruction and is rendering 1/2 resolution... on top of those reduced graphical settings... and it's STILL chugging in the 30s and 40s fps in cutscenes, and I've heard it drops quite a bit in busy gameplay sections :smilecry: 

 

GPU results should be indicative of what most people who buy them will be seeing, not what a tiny minority will experience on the CPU front. I know what you're getting at here, and it objectively shows a GPU at peak operation, but the reality is that's just not the use case for most PCs, even for those who have higher-end systems. Most people are using 4-core i5s and i7s even with the best GPUs on the market, and the newer ones turbo to basically overclocked performance.

 

We also don't know anything; we have differing speculations. We know what NX said relating to shading reconstruction and geometric nativity, and it makes sense because the reconstruction in the RE Engine is weird to begin with, and John isn't entirely certain as to what is going on. The "interlaced" mode on PC is traditional checkerboard rendering, but even for Resident Evil 2 it was noted that this interlaced mode is not what is being seen on consoles; it's a different method of reconstruction, something different entirely. This gives near-absolute credence to NX's hypothesis that geometry is native but elements within the shading pipeline are being reconstructed. 
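(For anyone unfamiliar with the term, here is a toy sketch of traditional checkerboard rendering in Python with NumPy. It is a bare-bones illustration only and makes no claim about what the RE Engine actually does: each frame shades half the pixels in an alternating checker pattern and carries the other half over from the previous frame:)

```python
# Toy checkerboard reconstruction for a single grayscale frame buffer.
# Real implementations also use motion vectors and filtering; this only
# shows the basic "half the pixels per frame" idea.
import numpy as np

def checkerboard_reconstruct(prev: np.ndarray, new: np.ndarray, frame: int) -> np.ndarray:
    h, w = new.shape
    ys, xs = np.mgrid[0:h, 0:w]
    mask = (ys + xs + frame) % 2 == 0   # half the pixels, alternating each frame
    out = prev.copy()                   # start from last frame's image
    out[mask] = new[mask]               # overwrite the freshly shaded half
    return out
```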

 

He's going to do yet another video, so it will be nice to see a deeper dive on this; the RE Engine in general is strange in how it draws an image. 

 

7 minutes ago, Remij_ said:

I'm not implying it's some type of bullshot... I'm saying it's bullSHIT.  Stop fucking with your pics, and stop fucking with benchmark results.  It's quite simple.

But it's not bullshit; what's bullshit is publishing a screenshot which you know looks absolutely nothing like it does in use. Trying to represent HDR through contrast and light level adjustments for SDR is the definition of trying to be objectively accurate in representation.

3 hours ago, DynamiteCop! said:

How else do you try to convey what is actually seen with your eyes?

How do you convey something that can ONLY be detected and processed with the capability of the human eye?

 

And your answer is to try and capture that through a camera lens that has a LESSER range than the human eye?

 

LOL

Just now, jehurey said:

How do you convey something that can ONLY be detected and processed with the capability of the human eye?

 

And your answer is to try and capture that through a camera lens that has a LESSER range than the human eye?

 

LOL

Lesser range doesn't mean it's not picking up differences, which it clearly is.

 

Typical dumb jerry post. 

2 minutes ago, DynamiteCop! said:

Lesser range doesn't mean it's not picking up differences, which it clearly is.

 

Typical dumb jerry post. 

But it would specifically NOT pick up the HIGHEST of dynamic ranges in brightness... which is what HDR is.

 

LOL you might as well have taken photos of two plates of food and told us to determine which smells better. :tom:

Edited by jehurey