DynamiteCop!

Gears 5, the pursuit of 60 FPS and Scarlett (Ray Tracing)


https://www.gamespot.com/articles/gears-5-dev-explains-why-they-refused-to-compromis/1100-6469734/

 

"We didn’t want to have a compromise like Gears of War 4 where you had to choose 1080p/60fps or 4k/30fps. It was a large amount of work across a lot of disciplines to pull it off, but I really think it was worth it and is probably the best example of harnessing the additional power of the Xbox One X."

 

I have to say, with Microsoft first party they're beginning to reach Nintendo levels of performance-profile execution. Halo 5 is spot-on locked, Forza Horizon 4 and Forza 7 are spot-on locked, and while the framerate in Gears 5 isn't perfect yet, the lowest it drops is around 57 FPS and it averages around 59.1 FPS according to John Linneman, so a few optimization patches should eliminate that. It's really impressive the strides they made from the last game to 5 on this hardware. The game looks better, it runs at double the framerate, and their adaptive scaling with reconstruction keeps the base resolution so high that perceivable resolution shifts go unnoticed.

 

Also on the subject of hardware, Penty said he is "definitely super excited" about the forthcoming release of Microsoft's next-generation console, Project Scarlett, in Holiday 2020. "We don’t have anything to announce right now in terms of Gears with the new hardware--but I’m definitely super excited about what the new hardware could do. Having dedicated ray tracing cores is huge," he said.

 

Well, there's even more confirmation of the type of RT they're going after for Scarlett: it appears the system has dedicated ray tracing cores, so it will not be cannibalizing compute from the GPU to render RT. All they said in the reveal was "hardware accelerated," which could have meant a multitude of things; well, we know what it means now. @Remij_

Just now, Team 2019 said:

It's going to tank performance, RT will suck next gen. Unless it's a current gen port.

I wouldn't be so cynical; there are a lot of things they can do to make it run well, and I'm sure they have some tricks, like console developers do, to figure it all out. If you'd asked me last year whether Gears 5 would be 60 FPS I would have laughed at you. I think Remij even said something about it and I scoffed at the idea, and look where we are now.

Just now, DynamiteCop! said:

I wouldn't be so cynical; there are a lot of things they can do to make it run well, and I'm sure they have some tricks, like console developers do, to figure it all out. If you'd asked me last year whether Gears 5 would be 60 FPS I would have laughed at you. I think Remij even said something about it and I scoffed at the idea, and look where we are now.

RT will probably not even be the full package. It's for lighting only, if I remember the AMD slide.

2 minutes ago, Team 2019 said:

RT will probably not even be the full package. It's for lighting only, if I remember the AMD slide.

They're already doing Ray Tracing in Gears 5 with no dedicated hardware, and at 60 FPS.

 

We abandoned baked shadows and went to a fully real-time shadowing system, as we couldn't afford to store kilometers of shadow map data on disk or in memory. Our shadows in the distance are real-time ray-traced.
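For anyone wondering what a distant ray-traced shadow test actually boils down to, here's a minimal toy sketch: cast one ray from the shaded point toward the light and see if anything blocks it. This uses hypothetical sphere occluders purely for illustration; it has nothing to do with The Coalition's actual implementation.

```python
import math

def sphere_hit_t(origin, direction, center, radius):
    """Smallest positive t where origin + t*direction hits the sphere, or None.
    `direction` is assumed normalized (so the quadratic's a = 1)."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0  # nearest root
    return t if t > 1e-4 else None    # ignore hits at/behind the origin

def in_shadow(point, light_pos, occluders):
    """One shadow ray: is any occluder between `point` and the light?"""
    to_light = tuple(l - p for l, p in zip(light_pos, point))
    dist = math.sqrt(sum(d * d for d in to_light))
    direction = tuple(d / dist for d in to_light)
    for center, radius in occluders:
        t = sphere_hit_t(point, direction, center, radius)
        if t is not None and t < dist:  # hit before reaching the light
            return True
    return False

# A unit sphere at the origin blocks the light for a point directly below it.
occluders = [((0.0, 0.0, 0.0), 1.0)]
print(in_shadow((0.0, -3.0, 0.0), (0.0, 3.0, 0.0), occluders))  # True
print(in_shadow((5.0, -3.0, 0.0), (0.0, 3.0, 0.0), occluders))  # False
```

The point of doing this in real time instead of baking: no shadow-map data has to be stored at all, which is exactly the disk/memory saving the quote describes.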

7 hours ago, DynamiteCop! said:

Well, there's even more confirmation of the type of RT they're going after for Scarlett: it appears the system has dedicated ray tracing cores, so it will not be cannibalizing compute from the GPU to render RT. All they said in the reveal was "hardware accelerated," which could have meant a multitude of things; well, we know what it means now. @Remij_

Well, this pretty much means exactly what I thought it was going to mean all along.  They are going to have dedicated RT cores for accelerating BVH traversal and ray intersection calculations, just like Nvidia does.  There will obviously be some differences and improvements... but fundamentally the goal is the same. 
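To picture what those dedicated cores accelerate, here's a toy sketch of the two operations in question: a slab-test ray/AABB check and a BVH-style traversal that prunes whole subtrees whose bounding box the ray misses. Purely illustrative; real RT cores do this in fixed-function hardware, not Python.

```python
import math

def ray_aabb_hit(origin, inv_dir, box_min, box_max):
    """Slab test: does the ray origin + t*dir (t >= 0) hit the box?
    inv_dir holds 1/direction per axis (math.inf for zero components)."""
    tmin, tmax = 0.0, math.inf
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1, t2 = (lo - o) * inv, (hi - o) * inv
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

def traverse(node, origin, inv_dir, hits):
    """Visit only subtrees whose bounding box the ray actually hits."""
    box_min, box_max, children, items = node
    if not ray_aabb_hit(origin, inv_dir, box_min, box_max):
        return  # prune the whole subtree
    for child in children:
        traverse(child, origin, inv_dir, hits)
    hits.extend(items)

# Two leaves; a ray along +x can only reach the first leaf's box.
leaf_a = ((2, -1, -1), (3, 1, 1), [], ["tri_a"])
leaf_b = ((2, 4, -1), (3, 6, 1), [], ["tri_b"])
root = ((2, -1, -1), (3, 6, 1), [leaf_a, leaf_b], [])

direction = (1.0, 0.0, 0.0)
inv_dir = tuple(1.0 / d if d else math.inf for d in direction)
hits = []
traverse(root, (0.0, 0.5, 0.0), inv_dir, hits)
print(hits)  # ['tri_a']
```

The pruning is the whole trick: instead of testing the ray against every triangle in the scene, only the candidates in boxes the ray passes through ever get an intersection test.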

 

I mean, guys, Nvidia has the R&D money and the best and brightest scientists working in the field.. GPUs are their entire focus... they know as well as anyone what will be the best way forward.  I mean, despite the launch issues with the market being flooded as well as them not having any software to take advantage of it (lol)... this was the absolute biggest architectural change in well over a decade.  They weren't lying when they said this was a fundamental shift in how graphics will be rendered in the future.  Turing GPUs with RT and Tensor cores are just the beginning.

 

RT cores, as well as Tensor cores, are going to become more capable too.  AI and Deep Learning acceleration is going to change the game, not only with regards to visuals and how they can manipulate them.. but also how games are made.  AI is going to write code that humans couldn't even dream of writing.. it's going to give artists and animators the ability to do so much more in less time.  You guys think I just talk it up because I'm an Nvidia fanboy.. but nah man.. it's very easy to see where this is going. 

 

You better bet your goddamn ass that AMD is all in on Ray Tracing, just like Nvidia.  The hardware is ready... developers are ready... programmers are begging for newer, more accurate ways of rendering effects... which won't require nearly as many artists hand-placing fake light and shadow maps to try to recreate "a look"... they will be able to easily simulate those lights and shadows... and that's HUGE.

 

There's a reason why people in the industry are so excited about it.  Now that denoising technology exists, requiring far fewer rays per pixel, we can do these things in real time.. and we now have products shipping.  Now is the part where GPU vendors begin to push hard in that direction.  We'll see huge improvements coming in the next couple generations of GPUs.
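To see why denoising lets you get away with fewer rays per pixel, here's a toy 1-sample-per-pixel estimate plus a crude box-filter "denoiser." Real denoisers are far smarter (edge-aware, often AI-based), but the variance-reduction idea is the same: each pixel's noisy estimate is unbiased, so averaging across neighbors pulls the image toward the converged result.

```python
import random

random.seed(0)
TRUE_VALUE = 0.5  # the converged shading result we're trying to estimate

def shade_1spp():
    """One ray per pixel: an unbiased but very noisy estimate (mean 0.5)."""
    return random.uniform(0.0, 1.0)

# A row of "pixels" rendered at 1 sample per pixel.
noisy = [shade_1spp() for _ in range(64)]

def box_denoise(img, radius=4):
    """Average each pixel with its spatial neighbors."""
    out = []
    for i in range(len(img)):
        window = img[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

denoised = box_denoise(noisy)
err_noisy = sum(abs(v - TRUE_VALUE) for v in noisy) / len(noisy)
err_denoised = sum(abs(v - TRUE_VALUE) for v in denoised) / len(denoised)
print(err_denoised < err_noisy)  # True: averaging neighbors cuts the error
```

Without a denoiser you'd need many samples per pixel to get the same error down, which is exactly why real-time RT was out of reach before.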

 

Speaking of how it's improving.. it's not just the hardware but as you said here in response to Aza..

Quote

I wouldn't be so cynical; there are a lot of things they can do to make it run well, and I'm sure they have some tricks, like console developers do, to figure it all out. If you'd asked me last year whether Gears 5 would be 60 FPS I would have laughed at you. I think Remij even said something about it and I scoffed at the idea, and look where we are now.

 

there's tons of improvements to come.  Devs are still learning best practices, as well as how to best utilize the newer architectures and engines.  You know that Unreal Engine Star Wars tech demo with the Stormtroopers?  That was first running on 4 top-of-the-line Volta GPUs at 24fps.  Then Turing came out and a single GPU could run it at 30fps.  That scene was 5M polys with 4 light sources, and the RT passes were done at half resolution.  Now the latest Unreal Engine tech demo, "Troll," still runs at 24fps but has 62M polys, 16 light sources, and RT passes done at full resolution... all on the same hardware: a single 2080 Ti.
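Back-of-the-envelope on those demo numbers, taking the quoted figures at face value:

```python
# "Reflections" (Star Wars) demo: 4 Volta GPUs at 24 fps vs 1 Turing GPU at 30 fps.
volta_fps_per_gpu = 24 / 4    # ~6 fps of work per Volta GPU
turing_fps_per_gpu = 30 / 1   # 30 fps on a single 2080 Ti
speedup = turing_fps_per_gpu / volta_fps_per_gpu
print(f"Per-GPU speedup: {speedup:.0f}x")  # 5x

# "Troll" demo on that same single 2080 Ti: heavier scene, full-res RT passes.
poly_ratio = 62e6 / 5e6   # 12.4x the polygons
light_ratio = 16 / 4      # 4x the light sources
print(f"Scene growth: {poly_ratio:.1f}x polys, {light_ratio:.0f}x lights")
```

So per GPU that's roughly a 5x jump between demos, before even counting the heavier Troll scene; how much of that comes from the RT cores versus software/engine maturity isn't broken out in the quoted figures.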

 

MS is going hard on ray tracing.  They created the API which exposes all of this to developers.  Without them, this wouldn't be happening right now.. so you can bet that the next Xbox is going to have a strong focus on ray tracing.. and I think we're going to see developers really begin to optimize and utilize RT effects to improve almost every facet of next-generation games.. from visuals to sound and artificial intelligence.

1 hour ago, JONBpc said:

Game Drivers are up, gonna see if this helps at all

Fixed everything. Locked 60fps now



did not want to compromise 60fps in a slow non-twitch shooter against some dumb enemy ai

 

is alright with compromising environment destructibility, hit reactions, weapon feedback, animations, sound fx, and more

 

:pavarotti:

5 minutes ago, Quad Damage said:

did not want to compromise 60fps in a slow non-twitch shooter against some dumb enemy ai

 

is alright with compromising environment destructibility, hit reactions, weapon feedback, animations, sound fx, and more

 

:pavarotti:

:D


After rempee's recent release of his new RTX novel in this thread, I thought it's time to tell you all the truth: I'm actually working on RTX right now.

 

He's right about some things but completely wrong about other technical specifics. Next-gen RTX will be streamlined. After working with specialists in the field, people acknowledged it's not enough anymore to draw shadows and light sources via Microsoft Paint directly into computer simulations. Comparing two pictures of the same content led many experts to believe the best course of action would be to release a technical interference that drains so much hardware power with absolutely no visual gain, to boost hardware sales, because there are enough idiots and early adopters who will buy anything. 

7 minutes ago, Quad Damage said:

did not want to compromise 60fps in a slow non-twitch shooter against some dumb enemy ai

 

is alright with compromising environment destructibility, hit reactions, weapon feedback, animations, sound fx, and more

 

:pavarotti:

:deadkoolaid:


Lmao at fanboys pretending they want 30fps over 60fps so they can shoot some glass bottles off a desk.

 

Gears is graphics king and it's clear that hurts some feelings.

 

 


GameSpot edited the article. It's no longer dedicated "RT cores", but rather dedicated "RT hardware". :tom:

 

SJmD8bU.png

15 minutes ago, lynux3 said:

GameSpot edited the article. It's no longer dedicated "RT cores", but rather dedicated "RT hardware". :tom:

 

SJmD8bU.png

As expected it's going to suck.

5 hours ago, Team 2019 said:

As expected it's going to suck.

That doesn't change anything; it's still dedicated hardware, and the terminology used, i.e. "cores," was likely wrong, so they corrected it. 

 

The way AMD accomplishes RT may be different from the way Nvidia does.

Edited by DynamiteCop!

41 minutes ago, DynamiteCop! said:

That doesn't change anything; it's still dedicated hardware, and the terminology used, i.e. "cores," was likely wrong, so they corrected it. 

 

The way AMD accomplishes RT may be different from the way Nvidia does.

Well, after they just got bit by Faildozer and what they defined as a "core," this is probably the case. :pavarotti: AMD has a patent on how they could do ray tracing with the DCUs.

