
Friday, May 3, 2019

Nvidia G-Sync vs AMD FreeSync

Attention all gamers, the format wars are back: the battle between G-Sync and FreeSync is in full swing, especially now that Nvidia has opened up its G-Sync spec. While this does mean that both Team Red and Team Green have solved the problems of screen tearing and frame stuttering in the best PC games, it also means that the market for the best gaming monitors has become even murkier.

What's the problem?

The root of the problem when displaying high-performance PC games is that monitors have a constant refresh rate – 75Hz, for example, means the screen is updated 75 times per second. Meanwhile, graphics cards (GPUs) render new images at a variable rate, depending on the computational load they're bearing.
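To make the mismatch concrete, here's a minimal Python sketch (all the numbers are illustrative, not measurements) comparing a fixed 75Hz refresh interval against variable GPU frame times:

```python
# Illustrative numbers only: a 75Hz monitor redraws every ~13.3ms,
# while GPU frame times vary with scene complexity.
refresh_hz = 75
refresh_interval_ms = 1000 / refresh_hz  # ~13.33ms between screen updates

frame_times_ms = [10.0, 18.0, 12.0, 25.0]  # hypothetical per-frame render times

for i, ft in enumerate(frame_times_ms, start=1):
    fps = 1000 / ft
    print(f"frame {i}: rendered in {ft:.1f}ms -> {fps:.0f}fps "
          f"(monitor still refreshes every {refresh_interval_ms:.2f}ms)")
```

Because the two clocks never agree, new frames keep arriving at arbitrary points within the monitor's fixed redraw cycle.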


When the refresh rate and frame rate don't match, the frame currently being sent to the display can fall out of sync: partway through transmitting one frame to the display, the graphics card moves on to the next frame.


This switch appears as a discontinuity in what you see on-screen. Usually, the discontinuity travels across the screen over consecutive frames as the phase difference between the GPU and monitor shrinks. This discontinuity is what we call tearing, and in extreme cases there can be several tears at once.


Most PC games have a setting called VSync, which can reduce tearing. VSync effectively limits the frame rate of the GPU: if a particular frame takes too long to render and misses its slot on the monitor, the GPU delays sending any graphics data until the next screen refresh.
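That delay logic can be modelled as rounding each frame's presentation up to the next refresh boundary. Here's a rough sketch (the function name and timings are ours for illustration, not from any real driver API):

```python
import math

def next_vsync_present_ms(render_done_ms, refresh_interval_ms):
    """Return the time a frame actually appears under VSync: the first
    refresh boundary at or after the moment rendering finished."""
    return math.ceil(render_done_ms / refresh_interval_ms) * refresh_interval_ms

refresh_interval_ms = 1000 / 60  # 60Hz monitor -> ~16.67ms per refresh

# A frame finishing at 17ms has just missed the ~16.67ms refresh,
# so it waits for the next boundary at ~33.3ms - felt as stutter and input lag.
print(next_vsync_present_ms(17.0, refresh_interval_ms))
```

Note how narrowly missing one refresh costs almost a full extra refresh interval, which is exactly the stutter described below.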


Brilliant! Problem solved then? Well, not quite. VSync isn't perfect: the delay in sending a frame to the monitor causes stuttering and input lag precisely when the GPU is under the heaviest processing load, which is also when a gamer needs the most responsiveness. Hence, despite the ugly tearing effect, many gamers choose to disable VSync in order to get the most responsive system.

Nvidia to the rescue, kind of

Since 2014, Nvidia has been promoting its solution to the VSync problem, which it has dubbed G-Sync. The basic concept is that the GPU controls the refresh rate of the monitor. That way, the monitor and GPU are always in sync, so there is never any tearing or stuttering. Prior to this, Nvidia had already been working on Adaptive VSync.

As PC Perspective notes, there are three regimes in which any variable refresh rate GPU/monitor system needs to operate: A) when the GPU's frame rate is below the minimum refresh rate of the monitor; B) when the GPU's frame rate is between the minimum and maximum refresh rates of the monitor; C) when the GPU's frame rate is greater than the maximum refresh rate of the monitor.


Case B mentioned above is straightforward – the GPU simply sets the refresh rate of the monitor to equal its frame rate.

When a G-Sync compatible GPU and monitor are operating in case C, Nvidia has decided that the GPU should default to VSync mode. In case A, however, G-Sync sets the monitor's refresh rate to an integer multiple of the GPU's current frame rate, so each frame is simply shown more than once. This is similar to VSync's frame-delaying strategy, but the whole-number multiplier keeps the GPU in step with the monitor.
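Pulling cases A to C together, the decision logic as described above can be sketched in a few lines of Python. To be clear, the function and its behaviour are our own illustration of the article's description, not Nvidia's actual implementation:

```python
import math

def gsync_refresh_hz(fps, min_hz, max_hz):
    """Pick a monitor refresh rate for a given GPU frame rate,
    following the three-regime description above."""
    if fps > max_hz:            # case C: cap at the maximum; GPU falls back to VSync
        return max_hz
    if fps >= min_hz:           # case B: match refresh rate to frame rate
        return fps
    # Case A: repeat each frame a whole number of times so the effective
    # refresh rate lands back inside the monitor's supported range.
    multiplier = math.ceil(min_hz / fps)
    return fps * multiplier

# A hypothetical 30-144Hz monitor with the GPU managing only 20fps:
print(gsync_refresh_hz(20, 30, 144))  # 20fps x 2 = 40Hz, each frame shown twice
```

The whole-number multiplier is the key detail: 40Hz divides evenly by 20fps, so every frame still lines up exactly with a refresh.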

The (somewhat literal) price of this solution is that Nvidia requires a proprietary chip in every G-Sync compatible monitor. As a result, G-Sync monitors cost more, since manufacturers must fit the extra electronics and pay the associated license fees to Nvidia. Nor is G-Sync supported by AMD GPUs.

While Nvidia was the first to come up with the idea of the GPU controlling the monitor’s refresh rate, AMD has an answer: FreeSync. This tech is based on an open standard – DisplayPort 1.2a. Team Red worked with the VESA group to modify the DisplayPort standard to incorporate Adaptive Sync, which allows compatible graphics cards and monitors to automatically negotiate the optimal refresh rate for the monitor. It thus requires no proprietary hardware, which keeps costs down.

This might lead you to wonder why it took Nvidia so long to support FreeSync, given that it’s an open technology. Well, your guess is as good as ours, but Nvidia has now opened up G-Sync compatibility with some FreeSync monitors. This is good because, at least according to AMD, G-Sync can reduce your frame rate rather than improve your gaming experience.


The key technical difference between G-Sync and FreeSync, apart from the licensing requirements (or lack thereof!), is the way they handle GPU output that lies outside the refresh rate range of the monitor. FreeSync is limited to matching refresh rates to frame rates via Adaptive Sync.

It cannot perform the refresh rate tricks that G-Sync can when the GPU's frame rate falls outside the monitor's range. Therefore, when a FreeSync GPU's frame rate is outside its monitor's refresh rate range, it defaults back to working with or without VSync, as per the user's preference, which means there will be tearing or stuttering again.
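For contrast with the G-Sync behaviour, here is a similarly hedged sketch of the FreeSync fallback just described (names and return values are ours, purely for illustration): within the monitor's range it matches refresh to frame rate; outside it, it simply honours the user's VSync preference.

```python
def freesync_mode(fps, min_hz, max_hz, vsync_enabled):
    """Return (refresh_hz, note) for a given frame rate, per the
    out-of-range fallback described in the article."""
    if min_hz <= fps <= max_hz:
        return fps, "adaptive sync: refresh matches frame rate"
    if vsync_enabled:
        return max_hz, "out of range: VSync on -> possible stutter and lag"
    return max_hz, "out of range: VSync off -> possible tearing"

# 90fps sits inside a hypothetical 40-144Hz range, so it is matched directly:
print(freesync_mode(90, 40, 144, vsync_enabled=False))
```

Either way, once the frame rate leaves the supported range, you are back to the original VSync trade-off.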

Oh great, another format war

It is early days in the world of variable refresh rate graphics and so your options are limited if you want to try either technology. According to AMD, there are eight FreeSync monitors on the market, and Nvidia reports six G-Sync monitors are now available.

It is almost impossible to compare any of these monitors given their vastly differing specifications, but generally the G-Sync monitors are more expensive than the FreeSync models. As for GPUs, Nvidia's website says that any GPU from the 600 series onwards will support G-Sync. Meanwhile, compatible AMD Radeon GPUs are limited to the "R9 295X2, 290X, R9 290, R9 285, R7 260X and R7 260 GPUs".

An important consideration for gamers is that we tend to upgrade monitors far less regularly than GPUs. So whatever type of variable refresh rate technology you choose now is going to determine your choice of GPU brand for a long time to come.

As ever, when big brands get into a format war, consumers become collateral damage.



from TechRadar - All the latest technology news http://bit.ly/1CZnl9L
 
