

Understanding Server Side FPS.

Disclaimer: This article was originally written by 'm.freaks|BD|Lefty' in German. It was translated into English and posted here. Please note that we take absolutely no credit for this article and thank the original author for this solid write-up.

For a couple of months now, many game server providers have been offering special high-performance servers that run at 1000 FPS. These 1000 FPS supposedly make the server run much better and more precisely than standard settings: players hit better, player positions are calculated far more precisely, and if you are a "competitive player" in a "professional gaming league", you of course need one of these 1000 FPS servers.

If you take a closer look at the details of these 1000 FPS servers, you'll find that what most providers tell you is vague and confusing. Sometimes they try to give the impression that their servers are "FPS certified" in some way, without explaining what this certification really is. We can assume that this certification was invented by the game server providers themselves, or that it merely expresses a subjective opinion. Do game servers that run at 1000 FPS really offer smoother gameplay with better hit registration? No, because they simply can't! Here's why …

In theory

All events on a CS:S or CS1.6 server are calculated per frame. All positions, directions and speeds at one point in time are summarized in one frame. A frame is similar to a snapshot or still image of a movie. The more frames a server calculates per second, the more precise its data becomes. At 1000 frames per second (FPS) the server calculates its "world" at a rate of one frame per millisecond. At 100 FPS it calculates the same "world" at a rate of one frame per 10 milliseconds. So far the argument of higher precision is still valid … but only as long as you look at the server in isolation. In practice all this remains useless to the client.
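The arithmetic above can be made concrete with a small sketch. This is purely illustrative (the function name is ours, not engine code): it just computes the interval between two "world" frames at a given frame rate.

```python
def frame_interval_ms(fps: int) -> float:
    """Time between two simulated 'world' frames, in milliseconds."""
    return 1000.0 / fps

# At 1000 FPS the server recalculates its world every millisecond;
# at 100 FPS, every 10 milliseconds.
print(frame_interval_ms(1000))  # 1.0
print(frame_interval_ms(100))   # 10.0
```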

In practice

Here the players (clients) have to be taken into consideration, too. The server may calculate its "world" at 1000 FPS, i.e. one millisecond per calculated frame. On the other end, the clients do not receive their updates at such a high rate, but much more slowly. How often they can be updated per second is determined by the server's tickrate. The tickrate is usually set to 66 or 100 on high-quality servers. The server freezes its "world" each tick and then decides which clients to send its data to. But the server doesn't send all of its information; it only sends the changes relative to the last update. The tickrate defines how often the server takes snapshots of its frames that can then be sent to the clients. This way a client gets only 100 updates per second from a tickrate 100 server. In the other direction, the clients also send the server commands. Here the tickrate likewise determines how many commands per second the server accepts from a client: again only one command per 10 milliseconds on a tickrate 100 server.

This is the point where the house of cards finally collapses. What is the server supposed to calculate its "world" with every millisecond, if it only gets one command from each client every 10 milliseconds? All the server can do is calculate its "world" with old data 90% of the time. If you now also take into consideration that different latencies influence the rates at which the clients send their commands to the server, and that the server buffers these commands in queues, then 500, 600 or even 1000 FPS make no sense at all. The server usually has to work with data that is about 50 milliseconds old or even older. During this time a lot of things may have already changed and new inputs (mouse movement) may have occurred on the client side. So the server has to predict events - it has to guess what the clients will do next. It may calculate a movement completely different from what the player really does … with a precision of one millisecond at 1000 FPS. It does not matter if the server runs at 333 FPS or 1000 FPS. A wrong guess stays a wrong guess. At 1000 FPS it is just "more precisely wrong".
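The "90% of its time" figure follows directly from the two rates. A short sketch (illustrative only; the function is ours) computes the fraction of server frames that must be calculated without a fresh client command:

```python
def stale_frame_fraction(server_fps: int, client_cmdrate: int) -> float:
    """Fraction of server frames calculated without a fresh client command.

    At 1000 FPS with one command per 10 ms (cmdrate 100), only 1 in 10
    frames has new input; the other 9 reuse old data.
    """
    frames_per_command = server_fps / client_cmdrate
    return 1.0 - 1.0 / frames_per_command

print(stale_frame_fraction(1000, 100))  # 0.9 -> 90% of frames use stale data
```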

Now you might ask: "But what if the server guesses right?" True, if the server predicts correctly, the position of a player is indeed more precise - on the server! But not on the client. The engines of both CS:S and CS1.6 assume that the server's and the client's time are in sync. The server's time is used for all clients. So the server saves a so-called frame time for every calculated frame - every millisecond on good Linux servers. The client uses the frame time that it receives with every update as its own time. From a 1000 FPS server with a tickrate of 100, the client should receive updates in which the frame time has advanced by 10 milliseconds. Even if the server has predicted the player's actions correctly, 1 or 2 milliseconds of latency in a packet's journey from the server to the client are enough to make all the data imprecise again. So again, it does not matter if the server runs at 333 or 1000 FPS. Not to mention, latencies are usually a lot higher.
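A toy calculation (ours, not from the engine) shows why the extra frame precision is lost in the noise: with a realistic ~50 ms of network latency, the timing error the client actually experiences is nearly identical for a 333 FPS and a 1000 FPS server.

```python
def effective_error_ms(frame_interval_ms: float, latency_ms: float) -> float:
    """Rough total timing error seen by the client: frame quantization
    plus network latency (a simplification for illustration)."""
    return frame_interval_ms + latency_ms

# 1000 FPS (1 ms/frame) vs ~333 FPS (3 ms/frame), each with 50 ms latency:
print(effective_error_ms(1.0, 50.0))  # 51.0
print(effective_error_ms(3.0, 50.0))  # 53.0 -- latency dominates either way
```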

Some might say: "But the server includes latencies in its calculations!" True again. The server does indeed take the clients' latencies into account. To measure a client's latency, the server needs command packets from that client; the latency value is averaged across a series of command packets and then included in the server's calculations. But even this averaging makes everything imprecise again. More importantly, the server uses old data to calculate that average latency. Again, no difference between a 333 and a 1000 FPS server.

Some players think that with settings of cl_cmdrate 100, cl_updaterate 100 and rate 30000, you get exactly those rates from the server. That is also wrong. In both CS:S and CS1.6 the server is authoritative. That means that under all circumstances it is the server that decides how many updates it sends and how many commands it accepts per second. The client may claim to receive 100 updates per second, but the server sometimes just delivers 90. Here 1000 FPS is also counterproductive. If the server is under heavy load and can't calculate its 1000 FPS anymore, it sends the clients fewer updates and processes fewer commands in order to reach its target FPS, because the engine prioritizes hitting the FPS rate over processing updates and commands. This is important because the Source engine, and also the old Half-Life engine, do all their calculations per frame. No frame, no calculation, no update. The engine will always try to reach its pre-set FPS under all circumstances and will simply drop updates and commands to and from the clients if necessary. This is why a server that runs at a constant 333 FPS is a lot more precise than a server that has to switch between 500 and 1000 FPS all the time - which is exactly what most tuned 1000 FPS servers currently do.
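The prioritization described above can be sketched as a simplified frame loop. This is a hypothetical model (names and costs are ours, not engine code): the simulation always runs first, and queued client commands only get whatever time is left in the frame's budget - under load, they are simply dropped or deferred.

```python
def commands_processed(budget_us: int, sim_cost_us: int,
                       pending_commands: int, cost_per_command_us: int) -> int:
    """How many queued client commands fit into one frame's time budget
    after the world simulation has taken its share (microsecond units)."""
    remaining = budget_us - sim_cost_us   # simulation is prioritized
    if remaining <= 0:
        return 0                          # overloaded: nothing is processed
    return min(pending_commands, remaining // cost_per_command_us)

# A 1000 FPS server has a 1000 microsecond budget per frame. If the
# simulation eats 900 of them, only 2 of 5 queued commands make it in:
print(commands_processed(budget_us=1000, sim_cost_us=900,
                         pending_commands=5, cost_per_command_us=50))  # 2
```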

Conclusion:

It is not important whether a server runs at 333, 500, 600 or even 1000 FPS. Any of these frame rates makes a server fast enough. It is far more important that the server has a high-quality internet connection and always reaches its pre-set FPS. Only Valve Software can make hit registration better, by improving the algorithms responsible for prediction, extrapolation and interpolation and thus making them more precise.

Don’t let yourself be fooled by fake “1000 FPS certificates”. Don’t let anyone force you to play on a 1000 FPS server in wars or matches. It is all just a marketing gimmick – nothing more, nothing less.





© 2006 - 2007. moReal.org