Gamers have long faced one big compromise: do their games look better, or run faster? Typically, this means turning down graphics settings to get a higher frame rate, especially if you don’t have a high-end graphics card. Today we’re covering the input lag that gets introduced when your monitor upscales the signal from the GPU. This is what is called upscaling lag.
To be clear, we are not referring to your GPU rendering frames from scratch. What we are talking about is what happens after the GPU finishes rendering a frame, when the display resizes the image to fit a certain resolution. It’s also worth noting that sometimes the GPU does this resizing itself before sending the signal to the display.
Scenarios Where You May Encounter Resolution Upscaling Lag
You may notice upscaling lag if you run a PC game below your monitor’s native resolution to improve performance. It is also almost certain to happen if you connect an old console to a modern television. There are different forms of upscaling, and some look nicer than others.
However, they all require a certain amount of post-processing time, which can introduce noticeable input lag. Input lag is the delay between pressing a button or moving the mouse and the corresponding action appearing on the screen.
That can seriously hinder gameplay for obvious reasons. Upscaling lag is especially damaging on classic platforms, where responsiveness is a big part of making a game feel the way you remember it.
How Upscaling Lag Occurs
But why does upscaling introduce so much lag? To get the image looking as nice as possible, some algorithms examine the frames rendered before and after the frame being upscaled. This way they can better estimate what a higher-resolution version of the same image is supposed to look like.
The algorithms then apply the changes they predict are correct to the frame. This approach does yield visual improvements, but it relies on what is known as a frame buffer: a place where multiple frames are held temporarily before they are analyzed and shown to the user.
This upscaling process is computationally time-consuming. Not only does it cause upscaling lag, it can also worsen image quality. That mostly happens when the frames being examined are heavily compressed, for example when you’re watching a movie encoded in a lossy file format.
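The cost of that frame buffer is easy to estimate: every future frame the scaler waits for adds one full frame interval of delay before anything can appear on screen. A minimal sketch (the function name and two-frame lookahead are illustrative assumptions, not measurements of any specific scaler):

```python
def upscaling_latency_ms(refresh_hz: float, lookahead_frames: int) -> float:
    """Estimate input lag added by a multi-frame upscaler that must
    buffer `lookahead_frames` future frames before showing one."""
    frame_time_ms = 1000.0 / refresh_hz  # duration of one frame
    return lookahead_frames * frame_time_ms

# At 60 Hz, waiting for two future frames adds about 33 ms of lag,
# on top of whatever time the upscaling math itself takes.
print(round(upscaling_latency_ms(60, 2), 1))  # 33.3
```

This is why a scaler that only needs the current frame has such an advantage: its buffering term is zero, leaving only the processing time itself.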
Alternatives to Combat Upscaling Lag
1. Algorithms Focusing On Specific Elements
An alternative approach to reducing upscaling lag is to have the algorithm look at certain elements of a single frame instead of relying on multiple frames at once. The algorithm targets elements that human vision is typically sensitive to. For example, Marseille’s mClassic smart HDMI cable has a built-in library of such elements.
These are features like edges and textures that we naturally key in on. Think about how jaggies caused by poor anti-aliasing are often glaringly noticeable. Interestingly, a game character’s eyes are also a focus for us, because humans are psychologically wired to be very sensitive to what someone else’s eyes are doing.
This strategy of focusing on key visual elements can greatly reduce upscaling lag while still improving visual quality. It works because it relies on predetermined visual cues for the algorithm to target, and because it only examines a single frame. Like other upscaling methods, though, it is not perfect. Can we do better? It turns out the answer is yes.
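To make the single-frame idea concrete, here is a toy sketch of edge-aware 2x upscaling: double the image with nearest-neighbour repetition, then boost only the places where the image differs from its local blur, which is exactly where edges sit. This is purely illustrative and is not Marseille’s actual (proprietary) algorithm; the function name and the 0.5 sharpening strength are assumptions for the example.

```python
import numpy as np

def upscale_2x_edge_aware(frame: np.ndarray) -> np.ndarray:
    """Toy single-frame upscaler: 2x nearest-neighbour, then an
    unsharp mask that emphasizes edges. Grayscale uint8 input."""
    # Nearest-neighbour 2x upscale: repeat each pixel on both axes.
    up = frame.repeat(2, axis=0).repeat(2, axis=1).astype(np.float64)
    # 3x3 box blur built from shifted copies (edges via replicate-pad).
    padded = np.pad(up, 1, mode="edge")
    h, w = up.shape
    blur = sum(padded[i:i + h, j:j + w]
               for i in range(3) for j in range(3)) / 9.0
    # Unsharp mask: boost where the image differs from its blur,
    # i.e. at the edges the eye keys in on. Flat areas are untouched.
    sharpened = np.clip(up + 0.5 * (up - blur), 0, 255)
    return sharpened.astype(np.uint8)
```

Because it touches only one frame, a scaler like this never has to wait on a frame buffer; its entire delay is the arithmetic above.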
2. Artificial Intelligence (AI)
Rather than programming a scaler to spot a few specific elements, computer scientists have been training artificial intelligence to recognize what more complex objects are supposed to look like, though we may still be some years away from this becoming widely available.
Accurately scaling an HD image to 4K or even 8K is a very computationally intensive problem. Extensive AI training reduces reliance on predefined features, allowing a scaler to recognize anything from whether an object is a dog to how to handle scenes with complicated lighting.
We’re already seeing this to some extent with Nvidia’s Deep Learning Super Sampling (DLSS). Here, a supercomputer is fed large numbers of frames from different games and works out an algorithm that produces something close to an ideally anti-aliased image.
These algorithms are then pushed out to individual users through software updates, letting gamers improve how their games look without lowering frame rates. More efficient post-processing algorithms optimized through AI should hopefully make games feel more responsive as well.
Remember, though: if you aren’t good at games like CS:GO because of your terrible reflexes, AI probably won’t help you. You might want to give turn-based games a shot instead. Got any questions about resolution upscaling lag? Let us know in the comments.