
Devil May Cry V: GPU Benchmarks Rolling Out



God you're dumb. :hest: 

 

That's MAX graphics... It's gonna be hilarious watching Alex's PC analysis and then finding out that, much like RE2, the Xbox One X is a mix of high and medium.... and is using a reconstruction technique :rofl: 

 

Guaranteed 100+ fps on PC at 4K with Xbox One X settings.  

 

Also, I'm not quite sure why you changed the CPU because the game scales higher than that at 1080p res.

 

[benchmark chart image]

 

 

 

33 minutes ago, Remij_ said:

God you're dumb. :hest: 

 

That's MAX graphics... It's gonna be hilarious watching Alex's PC analysis and then finding out that, much like RE2, the Xbox One X is a mix of high and medium.... and is using a reconstruction technique :rofl: 

 

Guaranteed 100+ fps on PC at 4K with Xbox One X settings.  

 

Also, I'm not quite sure why you changed the CPU because the game scales higher than that at 1080p res.

 

 

 

 

 

High/Medium settings matter not; the game is still rendering out at a native 4K, which is absurdly demanding on this engine. The shader shit, and whatever's going on with that, will get sorted out eventually, but look at where the RX 580 falls at all resolutions: at 1440p at max settings it's performing the same as the X at 4K, 57 average, 43 minimum. If you scale up the geometry and reduce the settings it should effectively be at performance parity at 4K, and that's coupled with a dramatically better CPU in the system.
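
Rough pixel math, just as a back-of-the-envelope sketch (assuming render cost scales roughly with pixel count, which it only loosely does):

# back-of-the-envelope pixel counts per frame
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count / 1e6:.2f} million pixels")
print(f"4K vs 1440p: {pixels['4K'] / pixels['1440p']:.2f}x the pixels")  # -> 2.25x
# so an RX 580 averaging 57 fps at 1440p max has roughly 2.25x as many pixels to push at native 2160p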

 

Also, I changed the CPU because an i9 9900K isn't remotely indicative of what most people have even with high-end PCs; most people buy $200-$300-ish CPUs even for top-end builds, not $500+ CPUs. 

Edited by DynamiteCop!

Wait a minute..........did this moron genuinely believe that the Xbox One X version was producing framerates in the range of a 1080 Ti and a vanilla 2080?

 

And then he said that graphical settings don't matter? That only resolution matters? :tom:

7 minutes ago, jehurey said:

Wait a minute..........did this moron genuinely believe that the Xbox One X version was producing framerates in the range of a 1080 Ti and a vanilla 2080?

 

And then he said that graphical settings don't matter? That only resolution matters? :tom:

Illiterate


He thinks that resolution is the most demanding graphical setting, and that moving between "Medium" and "High" settings doesn't cause as much variation as the resolution setting?

 

Has he ever heard of Anti-Aliasing MAX settings?:drake: You want to find a surefire way to tank the framerate, just turn that up.

15 minutes ago, jehurey said:

He thinks that resolution is the most demanding graphical setting, and that moving between "Medium" and "High" settings doesn't cause as much variation as the resolution setting?

 

Has he ever heard of Anti-Aliasing MAX settings?:drake: You want to find a surefire way to tank the framerate, just turn that up.

I have Resident Evil 2; I know exactly how settings strain this engine. Shut up faggot.

17 minutes ago, DynamiteCop! said:

I have Resident Evil 2; I know exactly how settings strain this engine. Shut up faggot.

You have another Capcom game that has simpler textures, smaller environments and fewer characters on-screen.........so you know exactly how Devil May Cry 5 would perform?

 

:tom:Every post is just new material to work from.

Just now, jehurey said:

You have another Capcom game that has simpler textures, smaller environments and fewer characters on-screen.........so you know exactly how Devil May Cry 5 would perform?

 

:tom:Every post is just new material to work from.

It doesn't have simpler textures; there are environments just as large and arguably more characters on screen.

 

Just further confirmation that you don't play games and you haven't played Resident Evil 2.

 

P.S. Resident Evil 4 sucks :lawl:

10 hours ago, DynamiteCop! said:

High/Medium settings matter not; the game is still rendering out at a native 4K, which is absurdly demanding on this engine. The shader shit, and whatever's going on with that, will get sorted out eventually, but look at where the RX 580 falls at all resolutions: at 1440p at max settings it's performing the same as the X at 4K, 57 average, 43 minimum. If you scale up the geometry and reduce the settings it should effectively be at performance parity at 4K, and that's coupled with a dramatically better CPU in the system.

 

Also, I changed the CPU because an i9 9900K isn't remotely indicative of what most people have even with high-end PCs; most people buy $200-$300-ish CPUs even for top-end builds, not $500+ CPUs. 

LOL 

 

And it's not rendering native 4K.  Sorry to break that to you... again. :( 

 

And you changed the CPU because it's not remotely indicative of what people with high-end PCs have?  Bullllll shit.  Most people who game actually DO overclock their CPUs and are very competitive with the 9900K.  So then I guess we should use the OG Xbox One, since that's most indicative of what most Xbox owners have :drake: 

 

Come the fuck on :tom: 

 


 

Quote

Speaking of platform comparisons, pretty much the entirety of the visual feature set is deployed across all systems with variations mostly in terms of resolution and less noticeably, in performance. Curiously, the standard PlayStation 4 seems to be the only version rendering at a native resolution - in this case, 1080p. The vanilla Xbox One achieves a similar look, but uses a reconstruction technique to get the job done - but despite this, the impact to quality up against PS4's native presentation is slight. Reconstruction techniques are also used on PS4 Pro and Xbox One X, delivering 1800p and 2160p respectively. Anti-aliasing quality seems slightly improved on the X, and the Pro can shimmer slightly in some spots, with an interlacing-style artefact.

https://www.eurogamer.net/articles/digitafoundry-2019-devil-may-cry-5-tech-analysis-all-consoles-tested

 

Stop TESTING me on this shit breh :reg: 

1 hour ago, Remij_ said:

LOL 

 

And it's not rendering native 4K.  Sorry to break that to you... again. :( 

 

And you changed the CPU because it's not remotely indicative of what people with high-end PCs have?  Bullllll shit.  Most people who game actually DO overclock their CPUs and are very competitive with the 9900K.  So then I guess we should use the OG Xbox One, since that's most indicative of what most Xbox owners have :drake: 

 

Come the fuck on :tom: 

 

Do you have a 9900k?

 

Looks like rtx flopped again LOL

7 minutes ago, Jon2B said:

Do you have a 9900k?

 

Looks like rtx flopped again LOL

Why would I need one?  Does that change anything I said dumbo? 

 

LMAO Checkerbox One X.. 1920x2160... you fucking idiots actually thought your console was hanging with high end PCs :rofl: 
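
For reference, a quick sketch of what a 1920x2160 checkerboard works out to in shaded pixels (assuming the usual half-the-pixels-per-frame checkerboard setup):

# shaded pixels per frame: 1920x2160 checkerboard vs native 3840x2160
native_4k = 3840 * 2160       # 8,294,400 pixels
checkerboarded = 1920 * 2160  # 4,147,200 pixels actually shaded each frame
print(f"shaded per frame: {checkerboarded / native_4k:.0%} of native 4K")  # -> 50%, the rest is reconstructed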

1 hour ago, Remij_ said:
Quote

I’m not sure what it’s doing exactly but I’ve seen it impact geometry edges on camera cuts and situations where it looks half-res on horizontal. It’s not usually noticeable but it’s there. 

It’s tricky these days because it can indeed be just part of the pipeline. That’s what I found with Frostbite games which exhibit CBR-like artefacts when using motion blur. 

DMC5’s artefacts mainly only appear on camera cuts, though...but not on base PS4.

:shrug:

1 minute ago, Remij_ said:

Why would I need one?  Does that change anything I said dumbo? 

 

LMAO Checkerbox One X.. 1920x2160... you fucking idiots actually thought your console was hanging with high end PCs :rofl: 

Lol then why u complaining about him changing the CPU?

 

X bangs on PCs 3x more expensive :juggle:

1 minute ago, DynamiteCop! said:

:shrug:

Owned. :hest: 

 

It's almost like you haven't been hearing me talk lately about reconstruction and checkerboard rendering improving and being the more logical approach or anything...  Wait, did I make a thread about that?

 

 

 

Well fuck...

 

The dumbos at DF are constantly fooled... NXGamer just doesn't mention anything when he's not sure... and VG Tech... are usually right.  I mean... it was 100% obvious if you looked at any pictures anyway... 

 

Sure makes a pretty good case for reconstruction when EAGLE EYED DYNO will argue incessantly that his Fanbox One X runs everything at native resolution. :shrug: 

8 minutes ago, Jon2B said:

Lol then why u complaining about him changing the CPU?

 

X bangs on PCs 3x more expensive :juggle:

Because he's emotionally unstable. 6 and 8 core CPUs make up 13% of the PC market, and AMD has offered both for the last 10 years, which is easily the lion's share of that percentage...

 

Most people, even with the best PCs out there, are still using 4-core CPUs: i5s and i7s... not a newfangled 6-core i7 or an 8-core i9. It's stupid to parse GPU results from something that's a total edge case; I should have realistically put a worse CPU in that chart.

 

CPU speed distribution (share / change):
2.7 GHz to 2.99 GHz    13.18%   +0.34%
3.0 GHz to 3.29 GHz    16.64%   +0.31%
3.3 GHz to 3.69 GHz    22.04%   -0.09%
3.7 GHz and above       6.22%

CPU core count distribution (share / change):
4 cores    56.35%   +0.34%
5 cores     0.00%   -0.01%
6 cores    11.35%   +1.33%
8 cores     2.02%   +0.23%
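
That 13% is just the 6-core and 8-core rows above added together; a quick check with the numbers as quoted:

# adding up the 6-core and 8-core shares from the rows quoted above
core_share = {4: 56.35, 5: 0.00, 6: 11.35, 8: 2.02}  # percent
print(f"6 + 8 cores: {core_share[6] + core_share[8]:.2f}%")  # -> 13.37%, i.e. the ~13% figure
print(f"4 cores:     {core_share[4]:.2f}%")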
Edited by DynamiteCop!
5 minutes ago, Jon2B said:

Lol then why u complaining about him changing the CPU?

 

X bangs on PCs 3x more expensive :juggle:

Because it's the typical bullshit he pulls.  Ah lemme just change the CPU to something I think more people will have.... Ah... lemme just saturate and add more contrast to this Gears of War 4 picture to exaggerate how AMAZING HDR looks on the Xbox One X version and how dull the PC version looks... :mj: 

 

I catch him every fucking time.

 

Sorry idiots... your Xbox isn't hanging with my PC regardless of the CPU I have... or any high-end PC.  It's a mid-range PC lite.. and you idiots need to fucking RECOGNIZE that shit.

 

MCC best on PC btw get fucked :juggle: 

