
Interp Ratio and Rates: The Key to Improved Hitboxes



Dusty!
11-25-2010, 08:47 PM
If you play Counter-Strike: Source enough, you may sometimes notice that you can't seem to hit enemies even when you are shooting right at them. Sometimes this is caused by server settings, but more often there are several tweaks you can make to greatly improve your gameplay.

First, let's look at an explanation of how multiplayer works in games based on the Source engine:

Multi-player games based on the Source Engine use a Client-Server networking architecture. Usually a server is a dedicated host that runs the game and is authoritative about world simulation, game rules, and player input processing. A client is a player's computer connected to a game server. The client and server communicate with each other by sending small data packets at a high frequency (usually 20 to 30 packets per second). Clients only communicate with the game server and not between each other. In contrast with a single player game, a multi-player game has to deal with a variety of new problems caused by packet-based communication.

To cope with all these issues introduced by network communication, the Source engine uses multiple techniques to solve these problems, or at least make them less visible to the player. These techniques include data compression, interpolation, prediction, and lag compensation. These techniques are tightly coupled, and changes made within one system may affect other systems.

Essentially, what this tells us is that by fine-tuning our rates and interpolation, we can influence how the game registers our shots. With careful adjustment it is possible to improve hitbox registration and land more shots on the enemy.

A valuable asset in tweaking your settings is Netgraph. It gives you insight into exactly what is happening with your settings, including your precise ping, your lerp (the amount of time being spent on interpolation), packets sent and received, loss, choke, and other statistics.

Let's take a look at how to use and interpret Netgraph:

The following are commands which will affect how and where Netgraph displays.

Enabling Netgraph:

net_graph
Base command to configure how net_graph is displayed

* 0 = No graph (default)
* 1 = Draw basic netgraph (text only) [areas 3, 4, 6 and 7]
* 2 = Draw data on payload as well as latency and interpolation graphs [areas 8 and 9]
* 3 = Draw payload legend [area 1] and packet loss percentage and choked packet percentage [area 4]
* 4 = Draws the server perf statistics area [area 5]

In other words, for no graph you type net_graph 0 in console, for the basic text-only graph net_graph 1, and so on.
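
If you only want the graph on screen while you check your settings, a handy trick is to bind it to a key. A minimal sketch for your autoexec.cfg (the "n" key is just an arbitrary choice; incrementvar cycles the cvar between the min and max given):

bind "n" "incrementvar net_graph 0 3 3"
// pressing n now cycles net_graph between 0 (off) and 3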

Positioning Netgraph:

net_graphpos
Where to position the graph. It is always at the bottom of the screen.

* 0 = left edge
* 1 = right edge
* 2 = centered
* 3 or higher specifies the X co-ordinate of the graph's left edge

To put the Netgraph at the left edge, type net_graphpos 0 in console; for the right edge, net_graphpos 1, and so on.
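
For example, to draw the basic graph centered along the bottom of the screen, you could put these two lines in your config (just a sketch of the options above; pick whatever position suits your HUD):

net_graph 1
net_graphpos 2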


Interpreting Netgraph

http://developer.valvesoftware.com/w/images/thumb/e/e8/Net_graph_annotated2.jpg/640px-Net_graph_annotated2.jpg


Area 1

This area is the legend for the colors used in the payload section of the graph. If a part of the payload arrives but doesn't fit into one of the predetermined buckets, it is represented in the clear area between the last color and the little white dot that represents the full packet size [see indicator "a" in the image].

Area 2

For packets greater than 300 bytes which are in the 95th percentile, the size of the packet is rendered in text at the top of the payload area [see marker 2].

Area 3

The local connection's frames per second and round trip ping to the server are shown in area 3.

Area 4

This area shows the current bandwidth usage. The in/out values show the size in bytes of the last incoming and outgoing packets. The k/s values show the kilobytes per second (rolling average) recently seen in each direction.

Area 5

This area shows the performance of the server the client is connected to. The "sv" tag shows the fps of the server as of the latest networking update delivered to the client. The "var" shows the standard deviation of the server's frametime (where server fps = 1.0 / frametime) over the last 50 frames recorded by the server. If the server's framerate is below 20 fps, then this line will draw in yellow. If the server's framerate is below 10 fps, then this line will draw in red.

Area 6

The "lerp" indicator shows the number of msecs of interpolation being used by the client. Some notes on the value of lerp follow below.

Area 7

This area shows the user's current cl_updaterate setting, the actual number of updates per second received from the server, the actual number of packets per second sent to the server, and the user's cl_cmdrate setting (the desired number of packets per second to send to the server).

Area 8

When net_graphshowlatency is 1, this area shows a historical view of the latency of the connection. The height (indicated by marker "d") corresponds to net_graphmsecs time (actually there is a bit of headroom after net_graphmsecs at the top for the text fields to fit into). Red vertical lines indicate dropped packets from the server down to the client. If the graph shows a yellow marker (such as at marker "c"), this indicates that the server had to choke back one or more packets before sending the client an update.

Area 9

When net_graphshowinterp is 1, this area shows how much interpolation was needed for each client frame. If there is a large gap between packets (packet loss, server framerate too low, etc.), the client will have insufficient data for interpolation and will start to extrapolate. The extrapolation is shown as orange bars rising above the white line (a run of extrapolation can be seen just to the left of the 9 marker). In addition, the very bottom pixel indicates whether a CUserCmd ("usercmd") packet was sent on that rendering frame, or held back by the client and aggregated due to the user's cl_cmdrate setting.
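
To actually see the two history graphs described in Areas 8 and 9, net_graph needs to be at least 2 and the two convars mentioned above need to be enabled. A minimal sketch:

net_graph 2
// latency/loss/choke history (Area 8)
net_graphshowlatency 1
// per-frame interpolation/extrapolation history (Area 9)
net_graphshowinterp 1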

To summarize this section, understanding what Netgraph is displaying and how it affects you is an important part of applying the tweaks and optimizations in the following section.

Optimizing your rates and interp:

Configuring cl_updaterate and cl_cmdrate

According to Valve, all servers are now at a maximum of 66 tick. However, the term 66 tick is a bit misleading: the Source engine runs its internal simulation at a tick interval of 15 msec, which translates to 66.67 ticks per second (1 / 0.015 = 66.67).

Therefore, applying that logic, the ideal cl_updaterate and cl_cmdrate to start at is 67, to cover that extra 0.67 of a tick.
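
As a concrete starting point, that reasoning translates into these two console lines (note that servers can still clamp these to their own limits, e.g. sv_maxupdaterate and sv_maxcmdrate):

cl_updaterate 67
cl_cmdrate 67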

Configuring rate

The most commonly accepted "best" rate is 25000; however, depending on your distance to the server and your bandwidth, this can go higher.

As a starting point for optimizing your rates, I suggest leaving your rate at 25000.
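
In the console, or in your autoexec.cfg, that is simply (rate is the maximum number of bytes per second the server may send to you):

rate 25000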

Configuring cl_interp_ratio and cl_interp

According to a Valve network engineer, "The default for cl_interp_ratio is two to allow for an occasional dropped packet w/o a visual hitch." With that in mind, there is a simple formula to follow to establish the best setting for cl_interp:

cl_interp = cl_interp_ratio / cl_updaterate

Therefore if we plug in our previously established cl_interp_ratio and cl_updaterate, it looks like this:

cl_interp = 2 / 67 = 0.0298507
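
Entered in the console, that works out to the following (just the two values from the formula above; the long decimal is 2/67 written out):

cl_interp_ratio 2
cl_interp 0.0298507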

Final optimizations

Provided your rates are set up as cl_cmdrate 67, cl_updaterate 67, rate 25000, cl_interp 0.0298507, and cl_interp_ratio 2, it is time to connect to your favorite trP pub and test them out. Under ideal conditions you are looking for the following:

loss - 0 (loss is somewhat out of your control. If you are cabled in and experiencing a few lost packets, there is nothing you can do. If you are on wireless, you can change channels, or cable in, to lower your loss)

choke - as small as possible (expect choke spikes at beginning and end of round)

lerp - The lowest amount possible, but the lerp text must remain white. As soon as the color of lerp in netgraph changes to yellow, red, or orange, revert to the previous settings to make it white again.

To establish the lowest of these three, fiddle with your rates by using very small increments. You can change your cl_interp_ratio and cl_interp, but STICK TO THE FORMULA. Your cl_interp must equal your cl_interp_ratio / cl_updaterate. So remember, as you change your cl_updaterate, you MUST change your cl_interp.

Once you have a low (but still white) lerp, you can fiddle with your rate. Increase it in small increments from 25000, and it may lower your lerp even more. If you notice that going any higher makes your lerp increase, revert to the previous setting.

So remember, these settings coordinate with each other. Changes in one necessitate changes in the others.
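
To pull the whole section together, here is what the full set of starting values could look like in an autoexec.cfg, with Netgraph enabled so you can watch lerp, loss, and choke while you tune (this is just the starting point described above, not a final answer; keep adjusting in small increments as explained):

// starting rates
cl_cmdrate 67
cl_updaterate 67
rate 25000

// interpolation: cl_interp = cl_interp_ratio / cl_updaterate = 2 / 67
cl_interp_ratio 2
cl_interp 0.0298507

// netgraph for verifying lerp, loss, and choke
net_graph 3
net_graphpos 2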

I recently went through this whole process of optimizing my rates, and it has really improved my gameplay experience. Fix your rates up and get ready for epic headshots :)