Everything posted by drperry

  1. Nope, doesn't make an appreciable difference. Maybe gets me another 5 FPS, but still. There's still an issue, and it's NOT with my system, it's a compatibility issue with the 3000 series GPUs. Otherwise I'd be able to get higher FPS on the same settings than my R9 got. There's zero difference between 1080p and 1440p, both for FPS and for CPU/GPU usage. The game is NOT leveraging the 3000 series GPU. The 7DTD issue isn't bottlenecking; it's a programming/coding/optimization issue somewhere. Otherwise there would be a performance increase, even if it's not a huge one.
  2. Unless it was picking up the slack of the weaker GPUs. I used to run 75 - 80°C on the CPU playing 7 Days, now I tick over at 40 - 45°C, so there's definitely a difference in CPU usage compared to before. There's also no reason for the GPU to be limited to 30% usage now unless there's a programming conflict somewhere; otherwise ANY game I play would suffer from the same issue. There's also no difference between 1080p and 1440p for FPS on my system (see the usage-logging sketch after these posts). My system:
     Ryzen 9 3900X
     Gigabyte Aorus Master X570
     32GB G.Skill Ripjaws 3200 MHz memory
     EVGA RTX 3090 FTW3 Ultra
     Corsair AX1200
  3. Could there be a possible conflict between the hardware scheduling of AMD graphics cards and the software scheduling of NVIDIA cards? My 3900X used to run 70 - 90% CPU usage with my old R9 Fury or 7970 x2 Crossfire setup, with the GPUs maxed out. That was without using any core affinity options (a minimal affinity sketch follows these posts). Now that I have a 3090 I max out at 25% CPU usage and 30% GPU usage. The R9 actually got better FPS at 1080p when using the same settings, compared to the 3090, until getting to settings that exceeded the VRAM of the older cards. 1440p gives the same FPS with the same settings on the R9 and the 3090, until you get into the high and ultra settings.
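
The posts above reason from rough CPU and GPU utilization figures. As a minimal sketch of how those numbers could be logged while the game runs (not part of the original posts), the following Python script samples system-wide CPU usage with psutil and GPU usage via NVIDIA's nvidia-smi; it assumes both are available, and the function names and sample count are illustrative.

```python
# Hypothetical usage logger: assumes psutil is installed and nvidia-smi is on PATH.
import subprocess

import psutil


def gpu_utilization_percent() -> float:
    """Return overall GPU utilization reported by nvidia-smi (NVIDIA cards only)."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return float(out.stdout.strip().splitlines()[0])


def log_usage(interval_s: float = 1.0, samples: int = 60) -> None:
    """Print CPU and GPU utilization once per interval while the game is running."""
    for _ in range(samples):
        cpu = psutil.cpu_percent(interval=interval_s)  # averaged across all cores
        gpu = gpu_utilization_percent()
        print(f"CPU {cpu:5.1f}%  GPU {gpu:5.1f}%")


if __name__ == "__main__":
    log_usage()
```

Running this at 1080p and again at 1440p would show whether GPU utilization really stays pinned around 30% at both resolutions, which is the observation these posts are built on.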
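
Post 3 mentions running without any core affinity options. As a hedged illustration only, the sketch below pins a running game process to a fixed set of cores using psutil's cpu_affinity (available on Windows and Linux); the process name and core list are assumptions for the example, not values taken from the posts.

```python
# Hypothetical affinity sketch: process name and core list are assumed.
import psutil

GAME_EXE = "7DaysToDie.exe"  # assumed executable name; adjust to your install
CORES = list(range(0, 6))    # example: restrict the game to the first six cores

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        proc.cpu_affinity(CORES)  # limit scheduling of this process to CORES
        print(f"Pinned PID {proc.pid} to cores {CORES}")
```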