Tearing your hair out over V-Sync? Here's how it works and how it influences Cloud Gaming
V-Sync can prevent screen tearing, but often at the expense of performance. Find out how it works, and why it’s useful. — 8 July 2019
Tinkering with a video game’s settings to create the perfect experience to suit your PC is vitally important (and also strangely cathartic), but sometimes it can be overwhelming if you don’t know exactly what each option does. One of the most common settings you’ll find is whether to enable V-Sync or not. Should you leave it off or turn it on? Well, let’s find out.
What is V-Sync?
Vertical Synchronisation, or V-Sync as it’s commonly known, is a technology with one sole purpose: to eradicate screen tearing. Screen tearing occurs when your graphics card and display go out of sync with each other, so that a single screen refresh shows parts of two (or more) different frames, producing a distracting visual artefact.
Most televisions and monitors are capped at a 60Hz refresh rate (although that’s slowly beginning to change), so whenever a game’s fps exceeds the fixed refresh rate of a display (e.g. 90 fps on a 60Hz screen), the graphics card can swap in a new frame while the monitor is still part-way through drawing the previous one. The end result is an image that appears out of alignment or visibly torn in two, hence the term “screen tearing”. And yes, it looks as bad as it sounds.
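To make the mismatch concrete, here’s a toy Python simulation (illustrative numbers only, not any real driver’s behaviour) that counts how many 60Hz refreshes contain a mid-scanout buffer swap when a game renders at 90 fps:

```python
# Toy model of screen tearing: the GPU swaps buffers every 1/fps seconds,
# while the monitor scans out one image every 1/refresh_hz seconds.
# A swap landing inside a scanout window splits that refresh between
# two different frames -> a visible tear.

def count_torn_refreshes(fps, refresh_hz, seconds=1):
    swap_times = [i / fps for i in range(int(fps * seconds))]
    torn = 0
    for r in range(int(refresh_hz * seconds)):
        start, end = r / refresh_hz, (r + 1) / refresh_hz
        if any(start < t < end for t in swap_times):
            torn += 1
    return torn

print(count_torn_refreshes(90, 60))  # -> 60: every refresh catches a mid-scanout swap
print(count_torn_refreshes(60, 60))  # -> 0: swaps align exactly with refresh boundaries
```

At 90 fps on a 60Hz screen, swaps arrive more often than refreshes, so every single refresh gets interrupted; when the rates match and align, none do.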
How does V-Sync work?
Enabling V-Sync caps your GPU’s framerate at the monitor’s refresh rate: the graphics card holds each finished frame until the display has completely drawn the previous one, then swaps buffers during the vertical blanking interval. This eliminates screen tearing, as your GPU never presents more frames than your monitor can display, so no refresh ever shows parts of two overlapping frames. And that’s great news, because most people find screen tearing extremely noticeable and distracting, particularly in fast-motion games.
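The mechanism can be sketched in a few lines of Python — a simplified model of a render loop, not real driver code — showing how holding each swap for the next vblank caps the presented framerate:

```python
import math

# Toy render loop: with V-Sync on, a finished frame is held until the
# next vertical blank, so presentation can never outpace the refresh rate.

def presented_fps(render_ms, refresh_hz, vsync):
    refresh_ms = 1000 / refresh_hz
    now = 0.0
    presents = 0
    while now < 1000:               # simulate one second of rendering
        now += render_ms            # GPU finishes rendering a frame
        if vsync:
            # wait until the next vblank before swapping buffers
            now = math.ceil(now / refresh_ms) * refresh_ms
        presents += 1               # buffer swap = frame shown on screen
    return presents

print(presented_fps(render_ms=5, refresh_hz=60, vsync=False))  # -> 200 (tearing likely)
print(presented_fps(render_ms=5, refresh_hz=60, vsync=True))   # -> 60 (capped)
```

A GPU that could churn out 200 fps gets throttled to exactly the display’s 60Hz — no wasted frames, no tears.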
V-Sync can be a double-edged sword, however, as it can severely impact your overall fps if your framerate isn’t stable. If your graphics card comfortably exceeds your monitor’s refresh rate, enabling V-Sync won’t be a problem, as there’s enough headroom to keep things locked at the desired 60 fps cap. But many titles fail to hold a constant 60 fps, varying dramatically depending on the scene, and with standard double-buffered V-Sync a frame that takes even slightly longer than one refresh (16.7 ms at 60Hz) has to wait for the next vblank, so the framerate drops straight to 30 fps rather than, say, 55 fps — which isn’t ideal. Triple buffering can help negate the impact of V-Sync by letting the GPU keep rendering into a third buffer while it waits, but it isn’t always supported and it can often create problems of its own.
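The 60-to-30 drop falls straight out of a one-line calculation — again a simplified model assuming strict double buffering, where every frame must wait a whole number of refreshes:

```python
import math

# Toy model of the double-buffered V-Sync penalty: a frame that misses
# one vblank must wait for the next, so the effective framerate snaps
# to refresh_hz / n for some whole number n.

def vsync_fps(render_ms, refresh_hz):
    refresh_ms = 1000 / refresh_hz
    vblanks_per_frame = math.ceil(render_ms / refresh_ms)  # whole refreshes waited
    return refresh_hz / vblanks_per_frame

print(vsync_fps(10, 60))  # renders in 10 ms -> holds 60 fps
print(vsync_fps(20, 60))  # misses the 16.7 ms window -> snaps down to 30 fps
```

Note there’s no in-between: a scene that would run at 55 fps unsynced still lands on 30 fps under this model, which is exactly why an unstable framerate and V-Sync make such a bad pairing.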
Why is V-Sync useful?
Apart from banishing screen tearing for good (hurrah!), V-Sync can also provide a more stable overall experience. Fluctuations in framerate can negatively affect input latency, making certain games feel janky and inconsistent, and causing nasty stutter while in motion. The last thing you want is for your inputs to become less responsive in twitch-based shooters or in platformers that require perfect timing, so a locked framerate can help avoid any serious spikes and dips.
How does V-Sync impact cloud gaming?
V-Sync is a trickier proposition when it comes to cloud gaming, then, as unlike with dedicated local hardware, frames arrive over the network at unpredictable intervals. With V-Sync enabled, new frames would either pile up in a queue or be skipped entirely; with it disabled, you’d get good old screen tearing.
Many cloud gaming companies have come up with innovative solutions to overcome this problem. For instance, cloud gaming company Shadow does not rely on V-Sync or on the display technologies G-Sync and FreeSync. According to their Head of Streaming Development, Grégory Gelly, Shadow builds an anti-tearing solution directly into its architecture, and because you’re essentially streaming a video capture, enabling V-Sync or triple buffering would only result in increased latency and poorer performance. That’s one less graphics option to worry about, then!