As the next generation of consoles is released, one of the biggest demands from excited fans has been better performance. And in the context of video gaming, performance is synonymous with FPS. On previous-generation consoles, most AAA titles were locked at 30 FPS, simply because the hardware couldn't consistently handle more without compromising the graphics.
Given how powerful new-gen console hardware is, this is no longer going to be the case. 60 FPS AAA gaming is going to become the norm. And even though many of the games featured in the PS5 reveal ran at 30 FPS in 4K, some ran at 60 FPS at a lower resolution. The PS5 now supports frame rates of up to 120 FPS.
Now, the terms FPS and refresh rate are sometimes used interchangeably when talking about in-game performance. The two are closely tied together in the context of gaming, but they are still very different things. In this article, we will clear up some misconceptions about FPS and refresh rates.
Also Read: PC vs Console Gaming: Pros and Cons For Each
What is FPS?
In the gaming world, FPS can mean one of two things: First Person Shooter or Frames Per Second. We'll be referring to the latter in this post. In essence, FPS indicates how many frames your GPU renders and outputs to the display each second. Each frame is a static image, but when frames are flipped quickly enough, they create the illusion of motion.
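To make that concrete, here is a minimal sketch (not taken from any real game engine) of how a frame counter works: each completed frame is a tick, and FPS is simply the ticks counted over a stretch of time. The `render_frame` callback is a hypothetical stand-in for the actual rendering work.

```python
import time

def measure_fps(render_frame, duration=1.0):
    """Count how many frames render_frame() completes in `duration` seconds."""
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration:
        render_frame()  # stand-in for the GPU doing its work on one frame
        frames += 1
    return frames / duration

# Simulate a GPU that takes roughly 16.7 ms per frame (about 60 FPS).
# The measured number lands a bit below 60 because of timer overhead.
fps = measure_fps(lambda: time.sleep(1 / 60), duration=0.5)
print(round(fps))
```

Real engines do the same counting continuously, which is why an in-game FPS counter fluctuates as scene complexity changes.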
The higher the frame rate, the smoother and more responsive the experience. The jump from 30 FPS to 60 FPS is huge, and so is the jump from 60 FPS to 120 FPS. This is why so many gamers value performance over visuals these days, especially in the competitive online multiplayer scene.
Additionally, games running at high frame rates feed your brain more visual information, which effectively improves your reaction time. In a competitive environment, the slightest improvement can mean the difference between victory and defeat.
All of this assumes that you can actually see every frame being rendered. That falls outside the scope of FPS, and it is where refresh rates come in.
Also Read: All You Need To Know Before Your First PC Build
What Is Refresh Rate?
As we’ve emphasized, FPS has to do with the horsepower of your GPU. You could have a powerful GPU capable of rendering well over 200 FPS, but that wouldn’t mean anything unless your monitor can keep up. Refresh rate indicates how many times the display refreshes the image each second, and it is expressed in hertz (Hz).
The majority of displays on the market have a refresh rate of 60Hz, meaning they refresh the image 60 times each second. However, there are monitors that offer much higher refresh rates: 144Hz and 240Hz are the most common above 60Hz, and 75Hz, 120Hz, and 200Hz monitors are becoming more and more available. As you can tell by now, FPS and refresh rate describe two sides of the same process: the GPU renders as many frames as it can, and the display shows as many of them as it can.
Also Read: How To Choose PC Monitors: Features To Look Out For
To determine how smooth your gaming experience will be, you have to look at both the FPS from the GPU and the refresh rate of the monitor. For example, you could have a standard 60Hz monitor, but if your GPU can only produce 40 FPS, then 40 frames per second is all you’ll see.
On the other hand, if you’ve got a 60Hz monitor and your mammoth of a GPU is dishing out 100 FPS, well, tough luck. You’ll only get to see the 60 frames per second that your monitor can handle.
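Both of those examples boil down to one simple rule, which we can sketch in a few lines (a deliberate simplification that ignores sync issues, covered next):

```python
def visible_fps(gpu_fps: float, refresh_hz: float) -> float:
    """The frame rate you actually see is capped by the slower of the two."""
    return min(gpu_fps, refresh_hz)

print(visible_fps(40, 60))   # 40 — the GPU is the bottleneck
print(visible_fps(100, 60))  # 60 — the monitor is the bottleneck
```

Whichever side is slower, GPU or monitor, sets the ceiling on what reaches your eyes.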
Also Read: Smart TV refresh rates (60 or 120Hz): Which is the best?
What If FPS and Refresh Rate Are Out of Sync?
In basic gaming terms, your monitor’s refresh rate places a cap on the frame rate you can actually see; you need a higher refresh rate before you can gain the benefits of higher FPS. If only that were all there is to it. Unfortunately, if your monitor’s refresh rate and your GPU’s FPS are out of sync, you will run into some unpleasant results.
This is especially true when the refresh rate of the monitor is lower than the FPS from the graphics card. This is when we are treated to a problem known as screen tearing: the monitor, as hard as it tries, cannot keep up with the GPU, so the top and bottom portions of the screen end up displaying parts of two or more different frames at once.
Fortunately, there are ways around this issue: Vertical Synchronization (V-Sync), or one of the Variable Refresh Rate (VRR) technologies. The most common VRR technologies are Nvidia’s G-Sync and AMD’s FreeSync, and most of the latest high-end monitors support one of the two.
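V-Sync does have a well-known side effect worth understanding. The sketch below is a simplified model of double-buffered V-Sync on a fixed-refresh display: a frame that isn’t ready by the next refresh has to wait a whole extra refresh interval, so the effective frame rate snaps down to the refresh rate divided by a whole number. (Real drivers and triple buffering complicate this, so treat it as an illustration, not a spec.)

```python
import math

def vsync_fps(refresh_hz: float, frame_time_ms: float) -> float:
    """Effective FPS under simplified double-buffered V-Sync.

    A frame is only presented on a refresh boundary, so if rendering takes
    longer than one refresh interval, the frame waits for the next one.
    """
    interval_ms = 1000 / refresh_hz                     # time between refreshes
    intervals_needed = math.ceil(frame_time_ms / interval_ms)
    return refresh_hz / intervals_needed

print(vsync_fps(60, 10))  # 60.0 — every frame is ready before each refresh
print(vsync_fps(60, 20))  # 30.0 — each frame misses one refresh and waits
```

This stair-step behavior (60 dropping straight to 30 rather than, say, 50) is exactly the problem VRR technologies such as G-Sync and FreeSync solve, by letting the monitor refresh whenever a new frame is ready instead of on a fixed schedule.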
Did you learn something new here today? If you did, please share this post with your friends and leave a comment if you’ve got questions.