Feedback on Pestily's EFT benchmark video

Hi Pestily! I wanted to address some of the information in your recent 2080 Ti vs 3090 benchmark video, going over what I found to be correct, what was incomplete, and what was wrong. I hope this is helpful to you and other members of the community. I've added references with some extra information and links at the end, marked with [n].

The video, for quick reference: https://www.youtube.com/watch?v=_GaRYhG3eJA

What do Update, GameUpdate, Render and Frame mean?

These values are actually very important when it comes to understanding why the game is running at a higher or lower frame rate. These are all in milliseconds, which means that lower is better.

Update: Non-physics things the game does on the CPU every frame, such as updating the text on the screen, calculating which loot you have and other such things. Again, this is the number of milliseconds this step takes each frame.

GameUpdate [1]: Physics things the game does on the CPU every time it updates the physics, so things like bullet calculations, figuring out where you can walk and if you've hit a wall, SCAV calculations and other similar things.

Render: Drawing the picture to the screen using the GPU. This means calculating how everything looks: all the fog, reflections, weather, materials, textures, and probably character animations too [6].

Frame: The total time a frame took. If you divide 1000 by “Frame”, you should get your FPS (the value might be slightly different [4]). This is the best way to measure performance. The lower this number, the better the game is running. The higher the number, the more time the game took to draw a frame, and the slower it will be. This is much better than measuring FPS, for reasons I'll get to in a bit.

For reference, here's an example from your video:

Update: 1.92
GameUpdate: 4.05
Render: 8.36
Frame: 12.41
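
To make that concrete, here's a tiny Python sketch of the conversion, using the Frame value from the example above:

```python
# Convert the example frame time above to FPS:
# FPS is simply 1000 ms divided by the frame time in ms.
frame_ms = 12.41  # the "Frame" value from the example above

fps = 1000 / frame_ms
print(f"{frame_ms} ms per frame ~= {fps:.1f} FPS")  # ~80.6 FPS

# Going the other way: a 60 FPS target means a ~16.67 ms frame budget.
target_fps = 60
budget_ms = 1000 / target_fps
print(f"{target_fps} FPS target = {budget_ms:.2f} ms per frame")
```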

We can actually confirm this too. When you're shooting your gun in the video (9:00), you can see that GameUpdate and Render increase! This means that the game has to calculate where your bullets are going (GameUpdate, on the CPU) and draw the muzzle flash and bullet trail (Render, on the GPU). Similarly, you'll probably notice that the exact same route through a map on the same GPU might have a different Render time due to weather effects.

How do we know a test is valid?

If we're testing the difference between two GPUs, we care about the difference in Render, not so much the others. At 10:15, you're running around and seeing a difference in performance. Update and GameUpdate are virtually the same between the two raids, but Render is lower on the 3090 side. This pretty much conclusively shows that the GPU is responsible for the speedup, so this is a good test! In fact, so are the majority of the tests in the Ultra section of the video, where you can clearly see a difference in Render time.
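
If you want to be systematic about this, you could put the two breakdowns side by side and check which component actually changed. Here's a small Python sketch of that idea (the timings below are made up for illustration, not taken from the video):

```python
# Compare two raids' timing breakdowns (illustrative, made-up numbers).
# If Update and GameUpdate are roughly the same but Render dropped,
# the GPU is the likely cause of the speedup.
def compare_raids(a, b, tolerance_ms=0.5):
    for key in ("Update", "GameUpdate", "Render", "Frame"):
        delta = b[key] - a[key]
        note = "similar" if abs(delta) < tolerance_ms else f"{delta:+.2f} ms"
        print(f"{key:>10}: {a[key]:.2f} -> {b[key]:.2f} ({note})")

raid_2080ti = {"Update": 1.90, "GameUpdate": 4.10, "Render": 8.40, "Frame": 12.60}
raid_3090   = {"Update": 1.95, "GameUpdate": 4.05, "Render": 6.20, "Frame": 10.40}
compare_raids(raid_2080ti, raid_3090)
# Only Render (and therefore Frame) changes meaningfully -> a good GPU test.
```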

How do we know a test might not be valid?

Measuring FPS differences when at max FPS

At 9:18, you compare two Factory runs and say there is no difference in FPS. However, because your FPS is capped, it may well be that there is an actual difference, but you simply can't see it because the FPS can't go any higher. Indeed, you can see that the Render time is lower on the 3090. Therefore, we can't conclude much from this test.
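
Here's a small Python illustration of why a frame cap hides a real GPU difference (the Render times below are made up, and CPU time is ignored to keep it simple):

```python
# With an FPS cap, both cards report the same FPS as long as they both
# render faster than the cap allows.
CAP_FPS = 120
cap_frame_ms = 1000 / CAP_FPS  # ~8.33 ms minimum frame time at the cap

for gpu, render_ms in [("2080 Ti", 6.5), ("3090", 4.8)]:
    # The displayed frame time can never drop below the cap's frame time.
    displayed_ms = max(render_ms, cap_frame_ms)
    print(f"{gpu}: Render {render_ms} ms -> capped FPS {1000 / displayed_ms:.0f}")

# Both lines print "capped FPS 120", even though the Render times differ.
```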

Large differences in Update and GameUpdate

At the very same time in the video, you can see that GameUpdate is also much higher in the 3090 raid. This means the game was doing different work on the CPU between the two raids as well, so I wouldn't say this test really helps us measure the difference between the GPUs. If you did have an FPS difference, it might have been caused by something else.

How do we account for differences between offline mode and online mode?

Easy: just look at how much time Render is taking. SCAV calculations happen in GameUpdate, so if you ignore that component and only look at Render, you should get an indication of the kind of difference you'll see in online mode.

Myth: Tarkov is more CPU-intensive than GPU-intensive

In the video, you mentioned something many people claim, namely that EFT is more CPU-intensive than GPU-intensive. Based on the numbers above, we can see that this is clearly not true. The CPU work (Update and GameUpdate) adds up to 5.97ms [4], while the GPU work (Render) is 8.36ms. This means the CPU accounts for around 42% of the combined time, while the GPU accounts for the remaining 58%.
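
Here's that arithmetic spelled out in a few lines of Python, using the example numbers from earlier:

```python
# CPU vs GPU share of the work, using the example numbers from the video.
update_ms = 1.92       # CPU: per-frame work
game_update_ms = 4.05  # CPU: physics work
render_ms = 8.36       # GPU: rendering work

cpu_ms = update_ms + game_update_ms  # 5.97 ms
total_ms = cpu_ms + render_ms        # 14.33 ms

print(f"CPU share: {cpu_ms / total_ms:.1%}")     # ~41.7%
print(f"GPU share: {render_ms / total_ms:.1%}")  # ~58.3%
```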

Additionally, in the benchmark video you can clearly see that the better GPU does make a difference on Ultra settings, because that's where it's needed most. The CPU calculations are going to be pretty much the same regardless of your settings, because the game always has to do physics calculations. But the GPU load is heavily affected by your settings, and you can see this on Ultra.

So why do people make this claim? Probably because if you're on quite low settings (for example, if you're running the game on your pestilyMicrowave), you're more likely to be bottlenecked by your CPU. However, on higher settings, your GPU needs to do a lot more work than the CPU (when it comes to time taken).

Myth busted!

Why is FPS by itself not a good measure?

FPS is the easiest way to measure performance, but it is not the most accurate. This is because FPS is the reciprocal of frame time (FPS = 1000 / frame time in ms), so each extra "frame per second" represents a smaller and smaller improvement in frame time as your FPS goes up.

For example, the difference between 56.25 FPS and 60 FPS is 1.11ms. However, the difference between 450 FPS and 900 FPS is…also 1.11ms! [5] In terms of how the game feels, the two improvements are identical, so as your FPS goes up, each additional FPS makes a smaller and smaller difference.
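
Here's that comparison worked out in a few lines of Python:

```python
# Frame time differences, not FPS differences, are what you actually feel.
def frame_ms(fps):
    return 1000 / fps

for low, high in [(56.25, 60), (450, 900)]:
    delta = frame_ms(low) - frame_ms(high)
    print(f"{low} FPS vs {high} FPS: {delta:.2f} ms difference per frame")

# Both pairs print a 1.11 ms difference, even though one is a "3.75 FPS gain"
# and the other is a "450 FPS gain".
```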

Therefore, saying “we had a 10 FPS increase on this map between the two graphics cards” doesn't really mean much. A 10 FPS increase for someone starting out at 40 FPS would be much, much more noticeable. However, I understand it makes the video a bit harder to follow if you bust out the milliseconds. :)

I would recommend also comparing the Render and Frame times between two raids.

Bonus: Mip streaming

I noticed you mentioned on stream that the new “Mip streaming” option offloads more work to the GPU. This is actually incorrect, and I wanted to take this opportunity to share some information here.

What “Mip streaming” does is use a little bit of CPU power to calculate which textures (and which mipmap levels) need to be loaded for the player's current camera position. Because the CPU has figured this out, the game doesn't have to load textures that aren't being used into GPU memory. In other words, we're spending a little bit of CPU power to give the GPU an easier time. This is probably advantageous on higher settings, where the GPU is already doing a lot more work than the CPU. You can read more about this in the Unity documentation [7]. To quote:

It trades a small amount of CPU resources to save a potentially large amount of GPU memory.
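
To make the idea concrete, here's a rough conceptual sketch in Python. This is only the general idea behind mipmap streaming, not Unity's or BSG's actual code, and the numbers are made up for illustration:

```python
import math

# Conceptual sketch of mip streaming: the CPU estimates which mip level a
# texture needs based on how large it appears on screen, so smaller versions
# of distant textures can be kept in GPU memory instead of the full ones.
def required_mip(texture_size_px, on_screen_size_px):
    """Pick the smallest mip whose resolution still covers the on-screen size."""
    if on_screen_size_px >= texture_size_px:
        return 0  # needs the full-resolution mip
    # Each mip level halves the resolution, so the level is a log2 of the ratio.
    return int(math.log2(texture_size_px / on_screen_size_px))

# A 4096 px texture on a distant object covering only ~200 px of the screen:
mip = required_mip(4096, 200)
print(f"Load mip {mip} instead of mip 0")  # mip 4 is 256 px: a fraction of the memory
```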

tl;dr what is this wall of text

Update and GameUpdate are how long CPU stuff takes every frame.

Render is how long GPU stuff takes every frame.

The best way to measure performance is to compare the Frame time, which is in milliseconds, so lower is better.

The best way to compare two GPUs is to look at the differences in Render.

Conclusion

I hope this helped! Lots of love and many thanks for the content.

References

1: I am basing the definition of GameUpdate on the fact that Unity (which EFT uses) names these functions Update and FixedUpdate. I assume that GameUpdate refers to FixedUpdate, and this seems to be supported by the fact that SCAV calculations are done in GameUpdate [2][3].

2: Performance with SCAVs (see GameUpdate), not my video: https://www.youtube.com/watch?v=3bIVgtjQP10

3: Performance with no SCAVs (see GameUpdate), not my video: https://www.youtube.com/watch?v=AzgvkmLY1c0

4: You might notice that if you add up Update, GameUpdate and Render, you get a number slightly different from Frame. This is probably because GameUpdate runs on every physics update, not every frame, so it can run in parallel to other tasks.

5: http://www.mvps.org/directx/articles/fps_versus_frame_time.htm

6: Some of these things depend on the Unity settings BFG have chosen. For example, whether character animations are calculated on the CPU or GPU is affected by the “GPU skinning” setting in Unity, and only BFG know whether they ticked that, so we have limited information.

7: https://docs.unity3d.com/Manual/TextureStreaming.html
