
Well, looks like Jedi Survivor is trash on PC - even worse than the TLOU port



 

Quote

In short, Jedi: Survivor not only joins the list of poorly optimized titles on PC, it is one of the worst so far this year, much worse than The Last of Us Part I: at 4K with a Ryzen 9 5950X and an RTX 3080 Ti, the latter never dropped below 40 FPS and averaged around 60, staying at 70-80 FPS in many areas. Broadly speaking, Jedi: Survivor is closer to the optimization of Hogwarts Legacy, or The Callisto Protocol with ray tracing. While there is shader preloading, there is traversal stuttering (stuttering when entering or leaving certain zones), and, as I mentioned previously, the big issue is VRAM, which causes highly inconsistent frametimes, distant texture pop-in, and late or erratic texture loading.

 

Quote

At 1080p with everything maxed out, the game can eat up to 11GB of VRAM in the most open areas, and the bottleneck in some graphics-heavy spots doesn't help either. To put it simply, if you want to avoid VRAM issues at 1080p, you'll need a card with 12GB of VRAM, while 16GB would be ideal for 1440p and 4K (and I don't think more than 16GB will be required in the future, unless the game lacks DLSS 2 or FSR 2, both of which help reduce VRAM usage by 1-1.5GB). While it runs at 4K/30 FPS and 1440p/60 FPS on consoles, in its quality and performance modes respectively, on PC there will probably be a lot of complaints about its performance, and Nvidia will once again be the focus of contention for skimping on the amount of memory in mid-range cards like the RTX 3060 Ti and RTX 3070.
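For a rough sense of why upscalers like DLSS 2 / FSR 2 free up memory: render targets scale with the internal resolution, and Quality mode renders internally at roughly 67% per axis. A back-of-the-envelope sketch (my own illustration, not from the quoted benchmark; the buffer count and bytes-per-pixel are assumed round numbers, not the game's actual pipeline):

```python
# Rough estimate of render-target VRAM at a given internal resolution.
# Assumes ~10 full-resolution 8-bytes-per-pixel buffers in a modern
# deferred pipeline (an illustrative guess, not Jedi: Survivor's layout).

def render_target_mb(width, height, buffers=10, bytes_per_pixel=8):
    return width * height * buffers * bytes_per_pixel / (1024 ** 2)

native_4k = render_target_mb(3840, 2160)
# DLSS 2 / FSR 2 "Quality" renders internally at roughly 67% per axis
dlss_quality = render_target_mb(int(3840 * 0.667), int(2160 * 0.667))

print(f"native 4K targets:    ~{native_4k:.0f} MB")
print(f"DLSS Quality targets: ~{dlss_quality:.0f} MB")
print(f"saved:                ~{native_4k - dlss_quality:.0f} MB")
```

Note that textures, which don't shrink with resolution, dominate a game's VRAM budget, so the 1-1.5GB saving in the quote covers more resolution-dependent resources than this toy estimate does.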

 

 

PC gaming :lul: 

27 minutes ago, Twinblade said:

At some points the game uses TWENTY TWO gigs of VRAM :hest: I'm at a loss for words at this point.

Why aren't consoles affected the same by this high VRAM usage? Are you saying it's just optimization, or do the consoles have some kind of advantage?

43 minutes ago, Ramza said:

Why aren't consoles affected the same by this high VRAM usage? Are you saying it's just optimization, or do the consoles have some kind of advantage?

 

Well, to start, the PS5 and Series X each have 16GB of unified memory that the GPU can draw from, which is more than most Nvidia cards offer, and those are the most common cards on PC (AMD cards tend to have more VRAM, but they make up a much smaller portion of the market). Optimization in general does play a part too. I believe the way console architecture is set up makes it easier for devs to offload work to the SSD or the CPU, whereas PCs support so many different configurations nowadays that it's harder to optimize for that platform. I'm sure with enough time and effort ports like this one would turn out better, but the extra resources that requires might not be worth it for devs on tight deadlines and budgets.

 

I wish Remij still posted here, as I'm sure he would be able to explain this far better than I could.

4 hours ago, Hot Sauce said:

Can't even get stable 60 with a 4090 on 1440p ultra. Pretty disastrous version of the game.

If, at this point in time, somebody on a video game forum had a graphics card more powerful than a 4090 and were able to get a stable 60 frames while playing their version of the game, you'd see them saying there's nothing wrong with PC gaming.

 

Because that's been the mentality 4090 owners have shown with all the other horrible PC ports that 97% of PC gamers are unhappy with.

35 minutes ago, -GD-X said:

After seeing the latest patch for Cyberpunk, which looks a gen better than this game (and everything else), there's no reason for the 4090 to not be able to get 4K and 100+ FPS.

Yeah, it's a joke. Hopefully the day one patch and subsequent patches can fix this. It looks seriously a whole gen behind Cyberpunk maxed out, and on a 4090 I can get 100+ fps in that at 4K with DLSS 3 frame generation on.

