
settargetfps


The Lorax


Huh? Is that even a thing? Since when do servers control the client's FPS?

 

I would assume he means the server fps, which you can change with a console command.

 

The standard rate is 20 FPS, which means all entities, ticking blocks, etc. get a tick update 20 times a second.
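
To picture what that target means in practice, here is a minimal, purely illustrative sketch in Python (the real server is a Unity/C# application and looks nothing like this): a fixed-rate loop where a target of 20 gives each logic update a budget of 1/20 = 0.05 s.

import time

TARGET_FPS = 20                    # the default; settargetfps changes this
TICK_INTERVAL = 1.0 / TARGET_FPS   # 0.05 s per logic update at 20 FPS

def update_world():
    pass  # placeholder: tick entities, ticking blocks, AI, etc.

def run_server_loop(ticks_to_run=5):
    for _ in range(ticks_to_run):
        started = time.monotonic()
        update_world()
        # sleep off whatever is left of this tick's time budget
        elapsed = time.monotonic() - started
        time.sleep(max(0.0, TICK_INTERVAL - elapsed))

run_server_loop()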


Yeah, my understanding is the same as StompyNZ's. I was just wondering if anyone changes it on their servers and, if so, what they have found from doing so. I change it on my server from time to time, and I'm currently running it at 120 without any issue.


Wouldn't that be called tps (ticks per second)? I mean... frames per second requires displayed images per second.

 

Honestly, I feel the same way. Though, in my experience with this game and some others, servers sometimes use "FPS" as a metric instead of ticks. To this day, I still find this confusing.


Wouldn't that be called tps (ticks per second)? I mean... frames per second requires displayed images per second.

 

It's how Alloc explained it; basically the server's heartbeat. I don't believe changing it accomplishes anything. Like, at all.


Ticks are a fixed rate, i.e. 20 ticks per second.

 

Frames can vary based on server load, so each frame includes a deltaTime value that gives the time elapsed since the last 'frame'.

 

This is used for animation and gravity, for example, to determine how far to move. If animation were based purely on ticks, you would get weird lag effects whenever the server got loaded, etc.

 

Some things, such as AI, could potentially benefit from an increased target FPS. I'd have to do some tests to confirm that, though.

 

 

You could think of them as 'logic frames' rather than the more common 'render frames' for FPS.
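
As a rough illustration of the deltaTime idea, here is a Python sketch (not the game's actual code, and the numbers are made up): movement scaled by the elapsed time covers the same distance per real second however long each logic frame takes, which is exactly what a fixed per-tick step would lose under load.

import time

def advance(position, speed, delta_time):
    # distance covered depends on real elapsed time, not on how many
    # frames ran, so a slow frame doesn't change the apparent speed
    return position + speed * delta_time

position = 0.0
speed = 3.0                  # made-up units per second
last = time.monotonic()
for _ in range(3):
    time.sleep(0.04)         # pretend each logic frame takes ~40 ms
    now = time.monotonic()
    position = advance(position, speed, now - last)
    last = now
print(round(position, 2))    # roughly 0.36 after roughly 0.12 s, however many frames ran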


If you do test that out, I'd be very interested in what you find. I've been keeping my server at a target of 100 FPS and have found no "major" downside to doing this (it does require more CPU usage). People on my server have said that they find fighting zombies a bit more accurate (locations and attack syncing). This is a bit anecdotal, though, as I haven't done any actual testing myself.


On stuttering:

Pre A16, stutters were caused by a couple of known things and possibly some unknowns.

 

The first known issue was that the UFPS camera was sometimes updating in the wrong order, causing what would look like FPS lag while your FPS counter would still say 60 FPS. That one I fixed early on in A17 development, so it will no longer be an issue. It would happen when the CPU's physics update got out of sync with the game update, so it was hard to track down at first. I tried some random things until I stumbled onto something that kept those calculations in order.

 

The other known issue is garbage collection. Prior to A17 we made a LOT of garbage, creating lots of new structs and method-level vars at run time. Internally, we have now all started pushing vars to class members instead of creating them inside methods. While that adds a tiny bit more to RAM when the class is first created, it doesn't have to be destroyed every time the method is run. The new effect system that runs buffs, progression, items, and events uses this setup, so that processing a lot in one frame won't make a crapton of garbage from all those tiny bits adding up.
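
For anyone who wants to picture that pattern, here is a minimal sketch in Python (the game itself is C#/Unity, and every name below is made up purely for illustration): the temporary list lives on the instance and is reused, rather than being allocated and later garbage-collected on every call.

class Effect:
    def __init__(self, trigger, name):
        self.trigger = trigger
        self.name = name

class EffectProcessor:
    def __init__(self):
        # scratch list created once with the object and reused on every call,
        # instead of allocating a new temporary for the GC to clean up later
        self._matched = []

    def process(self, effects, trigger):
        self._matched.clear()          # reuse, don't reallocate
        for effect in effects:
            if effect.trigger == trigger:
                self._matched.append(effect)
        return len(self._matched)

effects = [Effect("onHit", "bleed"), Effect("onKill", "xpBoost")]
print(EffectProcessor().process(effects, "onHit"))   # -> 1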

 

There are probably more situations and such that have stuttering but those are the main ones I know about and that we've found and started working on reducing.

 

For anyone curious, below are my system stats; I never have issues with any Unity-based games, or any games for that matter.

 

OS: Windows 7 Ultimate 64-bit

Motherboard: Gigabyte GA-78LMT-USB3

Processor: AMD FX-6300 Six Core Processor (6 CPU) ~3.5GHz

Memory: 20480MB (yeah, I had an extra 4GB ram stick laying around lol)

Video Card: GeForce GTX 1060 3GB

Hard Drives: 512G SSD (2 of them), 2TB HDD

 

I have Windows installed on one of my SSDs and I keep all my games installed on the other. :)

 

....


Fox. You don't read the dev diary anymore, huh... Search for kinyajuu's last post or three.

I never read the dev diary; I avoid that section of the forums like the plague.

 

 

 

Thanks StompyNZ for saving me from having to go there and find the post.

 

It took them 3 years to acknowledge and hopefully fix the worst of them, but at least it got done, which is all that matters to me. But now I understand why women nag so much... it works. :p


I would think that if you put the argument into the startup script, it would be applied when the server is restarted. How is your setup configured?

 

Also, how exactly do you apply the argument? I'd love to do some testing with this.

 

Currently I apply it manually via shell access when I restart the server, but it can also be applied in the console if you are an admin on the server (F1 > settargetfps 100).

 

Changing this setting does increase server CPU usage, so keep that in mind before going up to a billion FPS. For my server, which at times sees about 8-10 concurrent players, I found that 100 works rather well without putting much strain on the server under max load.


Anyone have any suggestions for getting "settargetfps 100" to apply every time I restart the server without having to do it manually? Once the server restarts, it reverts to the default 20.

 

Server Tools has an option for that.

https://7daystodie.com/forums/showthread.php?77267-Servertools-updated-to-5-5

 

--> Fps: allows you to set the target server fps on server load up
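
If you would rather not run a full mod for this, one possible alternative is a small script that your restart routine runs once the server has finished loading, sending the command over the dedicated server's telnet admin console. This is only a sketch and assumes telnet is enabled in serverconfig.xml; the host, port, and password below are placeholders for your own TelnetPort / TelnetPassword values.

import socket
import time

TELNET_HOST = "127.0.0.1"     # placeholder: run this on the server box itself
TELNET_PORT = 8081            # placeholder: whatever TelnetPort is set to
TELNET_PASSWORD = "changeme"  # placeholder: the TelnetPassword value

def send_console_command(command):
    with socket.create_connection((TELNET_HOST, TELNET_PORT), timeout=10) as conn:
        time.sleep(1)                                   # wait for the password prompt
        conn.sendall((TELNET_PASSWORD + "\n").encode())
        time.sleep(1)
        conn.sendall((command + "\n").encode())
        time.sleep(1)                                   # let the command go through before closing

send_console_command("settargetfps 100")

Calling that from the same script that starts the server (after a suitable delay) would re-apply the setting on every restart.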


Archived

This topic is now archived and is closed to further replies.
