
Low FPS on high-end PC


camsterdude


1 hour ago, camsterdude said:

I seem to hover around 10% CPU utilization, 30% GPU, and around 50-70 FPS.

10% CPU utilization on a 16c/32t system may still be up to 3 cores at 100%.
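That arithmetic can be sketched quickly (the core counts below are the ones from this thread's CPUs, used only for illustration):

```python
# Overall CPU utilization is averaged across all logical processors, so a
# few fully loaded threads barely register in the aggregate number.
def aggregate_utilization(busy_threads: int, total_threads: int) -> float:
    """Overall CPU % when `busy_threads` run at 100% and the rest idle."""
    return 100.0 * busy_threads / total_threads

print(aggregate_utilization(3, 32))   # 3 maxed threads on a 16c/32t CPU: ~9.4%
print(aggregate_utilization(6, 24))   # 6 maxed threads on a 12c/24t CPU: 25%
```

So a game that is completely CPU-bound on a handful of threads still shows up as a near-idle CPU in Task Manager.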

 

Have you perhaps packed your game full of mods?

It might help if you provide your log file.

Edited by Liesel Weppen

I have a 3900X with a 1060 and have no problem holding a stable 60 FPS at 1080p. The game really doesn't do well at higher resolutions, and this isn't the type of game that benefits from a higher framerate.

 

Also note that options with the word "Reflection" in them will absolutely tank FPS because this feature is not optimized.


Looks like a CPU (per-core) bottleneck to me; that would be why your GPU doesn't reach 100% utilization. Either you have something heavy running in the background, like antivirus or firewall software, or the 3090 really is so fast that it leaves your system unbalanced.

 

And like others have mentioned, anything above 1080p tanks the fps in a very big way no matter what hardware you throw at it.


I have a similar-spec system. Out-of-the-box performance for the 3950X in 7DtD isn't great and needs some user intervention to maximise performance. The single biggest thing you can do is lock the game to 4 physical cores, which can improve FPS by up to 50%. To do this you can set up a shortcut following this guide to always launch the game with 4-core affinity.

 

Luckily for you we have the same CPU, so you can just use my hexadecimal value and skip the part about finding it. For 4 physical cores on a 3950X the value is 55, which locks the game to cores 0, 2, 4, 6.
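For anyone with a different CPU, the hex value is just a binary bitmask; a small sketch, assuming the standard Windows affinity semantics where bit n selects logical processor n:

```python
def cores_from_mask(hex_mask: str) -> list:
    """List the logical processors selected by a hex affinity mask."""
    mask = int(hex_mask, 16)
    return [bit for bit in range(mask.bit_length()) if mask >> bit & 1]

print(cores_from_mask("55"))   # 0x55 = 01010101 -> [0, 2, 4, 6]
print(cores_from_mask("F"))    # 0x0F = 00001111 -> [0, 1, 2, 3]
```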

Edited by Naz

  • 2 months later...

Could there be a possible conflict with the hardware scheduling of AMD graphics cards vs. the software scheduling of NVIDIA cards?

 

My 3900X used to run 70 - 90% CPU usage with my old R9 Fury or 7970x2 Crossfire setup and max GPU usage. That was without using any core affinity options.

 

Now that I have a 3090, I max out at 25% CPU usage and 30% GPU usage. The R9 actually got better FPS at 1080p on the same settings than the 3090 does, up until settings that exceeded the older cards' VRAM.

 

1440P is the same FPS with the same settings between the R9 and 3090, until you get into the high and ultra settings.


57 minutes ago, drperry said:

Could there be a possible conflict with the hardware scheduling of AMD graphics cards vs. the software scheduling of NVIDIA cards?

 

My 3900X used to run 70 - 90% CPU usage with my old R9 Fury or 7970x2 Crossfire setup and max GPU usage. That was without using any core affinity options.

 

Now that I have a 3090, I max out at 25% CPU usage and 30% GPU usage. The R9 actually got better FPS at 1080p on the same settings than the 3090 does, up until settings that exceeded the older cards' VRAM.

 

1440P is the same FPS with the same settings between the R9 and 3090, until you get into the high and ultra settings.

25% CPU usage on a 12-core/24-thread processor means a handful of threads are running at 100% while the rest idle. That's expected given the game engine's limited threading, and it means your per-core performance is already maxed out and is the bottleneck, hence why your GPU doesn't need to work any harder. I very much doubt your CPU ever reached 70-90% usage with this game (or any other game, for that matter).

Edited by Fox

10 hours ago, Fox said:

25% CPU usage on a 12-core/24-thread processor means a handful of threads are running at 100% while the rest idle. That's expected given the game engine's limited threading, and it means your per-core performance is already maxed out and is the bottleneck, hence why your GPU doesn't need to work any harder. I very much doubt your CPU ever reached 70-90% usage with this game (or any other game, for that matter).

 

Unless it was picking up the slack of the weaker GPUs, I used to run 75-80 °C on the CPU playing 7 Days; now I tick over at 40-45 °C. There's definitely a difference in CPU usage compared to before.

 

There's also no reason for the GPU to be limited to 30% usage now, unless there's a programming conflict somewhere, otherwise ANY game I play would suffer from the same issue.

 

There's also no difference in 1080p and 1440p for FPS on my system.

 

Ryzen 9 3900X

Gigabyte Aorus Master X570

32GB G.Skill Ripjaws 3200mhz memory

EVGA RTX 3090 FTW3 Ultra

Corsair AX1200

Edited by drperry (added some info)

I have the same video card but a different processor, a 5950X. All settings maxed at 4K resolution, and I get a minimum of 83 FPS during horde night with 12 zeds per player, 4 players. During a normal game day, FPS is a steady 120. So, like others have said, it's got to be your CPU.


5 hours ago, drperry said:

 

Unless it was picking up the slack of the weaker GPUs, I used to run 75-80 °C on the CPU playing 7 Days; now I tick over at 40-45 °C. There's definitely a difference in CPU usage compared to before.

 

There's also no reason for the GPU to be limited to 30% usage now, unless there's a programming conflict somewhere, otherwise ANY game I play would suffer from the same issue.

 

There's also no difference in 1080p and 1440p for FPS on my system.

 

Ryzen 9 3900X

Gigabyte Aorus Master X570

32GB G.Skill Ripjaws 3200mhz memory

EVGA RTX 3090 FTW3 Ultra

Corsair AX1200

Having all those cores does not help you at all in gaming. Per core performance is what matters. Your CPU is the bottleneck. And it makes sense that there'd be little to no difference in fps from 1080p to 1440p if your GPU is only working at 30%. Your 3090 is designed for 4k gaming.


18 hours ago, drperry said:

 

Unless it was picking up the slack of the weaker GPUs, I used to run 75-80 °C on the CPU playing 7 Days; now I tick over at 40-45 °C. There's definitely a difference in CPU usage compared to before.

 

There's also no reason for the GPU to be limited to 30% usage now, unless there's a programming conflict somewhere, otherwise ANY game I play would suffer from the same issue.

 

There's also no difference in 1080p and 1440p for FPS on my system.

 

Ryzen 9 3900X

Gigabyte Aorus Master X570

32GB G.Skill Ripjaws 3200mhz memory

EVGA RTX 3090 FTW3 Ultra

Corsair AX1200

Most games in typical PCs are bottlenecked by the GPU; very few (like 7D2D) are bottlenecked by the CPU. I also suspect another bottleneck somewhere on the memory bus or in the caches, as 7D2D seems to shift a lot more information (the voxel data) between CPU, GPU, RAM and SSD/hard disk.

 

A bottlenecked CPU means: every cycle, the GPU finishes its task of displaying a frame looong before the CPU gives it a new frame to display for the next cycle. That's why it no longer matters how fast the GPU is; a faster GPU would just have to wait longer.
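The same point as a toy frame-time model (the millisecond figures are made up purely for illustration):

```python
# Each displayed frame costs max(CPU time, GPU time): the faster stage
# simply waits for the slower one, so the slower stage sets the FPS.
def effective_fps(cpu_ms: float, gpu_ms: float) -> float:
    """FPS when the CPU and GPU each need the given time per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

print(effective_fps(16.0, 8.0))   # CPU-bound: 62.5 FPS
print(effective_fps(16.0, 4.0))   # a GPU twice as fast: still 62.5 FPS
print(effective_fps(8.0, 4.0))    # only a faster CPU raises FPS: 125.0
```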

 

 

Edited by meganoth

On 4/14/2021 at 9:08 AM, drperry said:

Could there be a possible conflict with the hardware scheduling of AMD graphics cards vs. the software scheduling of NVIDIA cards?

 

My 3900X used to run 70 - 90% CPU usage with my old R9 Fury or 7970x2 Crossfire setup and max GPU usage. That was without using any core affinity options.

 

Now that I have a 3090, I max out at 25% CPU usage and 30% GPU usage. The R9 actually got better FPS at 1080p on the same settings than the 3090 does, up until settings that exceeded the older cards' VRAM.

 

1440P is the same FPS with the same settings between the R9 and 3090, until you get into the high and ultra settings.

Ryzen 3000 CPUs won't ever be fast enough in normal situations to max out a 3090 in 7DtD. If you follow my previously discussed "4-core affinity" advice, you can get a much more desirable 60-80% GPU usage at 1440p.

 

Also, 7DtD doesn't scale past 4 CPU cores, so a 12-core/24-thread CPU will never see 70-90% usage in game, regardless of what GPU is installed. You must have had something running in the background, like a Steam game update or a Windows update.


5 hours ago, Naz said:

Ryzen 3000 CPUs won't ever be fast enough in normal situations to max out a 3090 in 7DtD. If you follow my previously discussed "4-core affinity" advice, you can get a much more desirable 60-80% GPU usage at 1440p.

 

Also, 7DtD doesn't scale past 4 CPU cores, so a 12-core/24-thread CPU will never see 70-90% usage in game, regardless of what GPU is installed. You must have had something running in the background, like a Steam game update or a Windows update.

 

Nope, doesn't make an appreciable difference. Maybe gets me another 5 FPS, but still.

 

There's still an issue, and it's NOT with my system; it's a compatibility issue with the 3000-series GPUs. Otherwise I'd be able to get higher FPS at the same settings than my R9 got.

 

There's zero difference between 1080p and 1440p, both for FPS and for CPU/GPU usage.

 

The game is NOT leveraging the 3000 series GPU. The 7DTD issue isn't bottlenecking. It's a programming/coding/optimization issue somewhere. Otherwise there would be a performance increase, even if it's not a huge one.

Edited by drperry

7 hours ago, drperry said:

 

Nope, doesn't make an appreciable difference. Maybe gets me another 5 FPS, but still.

 

There's still an issue, and it's NOT with my system; it's a compatibility issue with the 3000-series GPUs. Otherwise I'd be able to get higher FPS at the same settings than my R9 got.

 

That "otherwise" doesn't make any sense. SAME CPU, different GPU => SAME FPS is the clearest case of a bottleneck you could get.

 

To prove that you still have an issue you need to show that someone else with a similar CPU has much better FPS than you at the same settings(!).

 

 

 

 


4 hours ago, meganoth said:

 

That "otherwise" doesn't make any sense. SAME CPU, different GPU => SAME FPS is the clearest case of a bottleneck you could get.

 

To prove that you still have an issue you need to show that someone else with a similar CPU has much better FPS than you at the same settings(!).

 

 

 

 

I agree with you. As the person who made this post in the first place, I've already stated that the problem was fixed ages ago by just turning off two settings in the alpha. I also upgraded to a 5950X, and yes, I do get a @%$# ton more FPS than before, so it was a per-core bottleneck, which I in fact already knew; but those two settings actually made it a whole lot worse. After turning them off I could play with all other settings maxed at well over 150 FPS.


13 minutes ago, camsterdude said:

I agree with you. As the person who made this post in the first place, I've already stated that the problem was fixed ages ago by just turning off two settings in the alpha. I also upgraded to a 5950X, and yes, I do get a @%$# ton more FPS than before, so it was a per-core bottleneck, which I in fact already knew; but those two settings actually made it a whole lot worse. After turning them off I could play with all other settings maxed at well over 150 FPS.

Are you talking about the SSAO and reflection options? Or console command options like POIs?


  • 8 months later...

The command below pins the game to four alternating logical processors (affinity mask AA). It won't work on anything with fewer than 8 logical processors, i.e. 4 physical cores with SMT.

 

C:\Windows\System32\cmd.exe /c start "7D2D" /affinity AA "C:\Program Files (x86)\Steam\steamapps\common\7 Days To Die\7DaysToDie_EAC.exe"

 

(Edit: the mask is read in binary starting from bit 0: AA = 10101010, so it actually selects logical processors 1, 3, 5, 7, i.e. one SMT thread on each of the first four physical cores rather than both threads of two cores.)
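Going the other way, the hex value for any desired set of cores can be computed from the bit positions; a small sketch, assuming core numbers are the logical-processor indices Windows uses:

```python
def mask_from_cores(cores) -> str:
    """Build the hex affinity mask that selects the given logical processors."""
    mask = 0
    for core in cores:
        mask |= 1 << core   # set bit n to select logical processor n
    return format(mask, "X")

print(mask_from_cores([1, 3, 5, 7]))   # "AA" -- the mask in the command above
print(mask_from_cores([0, 2, 4, 6]))   # "55" -- the value Naz posted earlier
```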

Edited by Pernicious
