Nvidia G-Sync or AMD FreeSync? Pick a side and stick with it

If you've ever experienced screen tearing in a PC game, you know how annoying it can be. An otherwise perfectly rendered title totally ruined by gross horizontal lines and stuttering. You can turn on V-Sync, but if you don't have a high-end system, it can put a huge dent in your performance. Both Nvidia and AMD have stepped up to try to solve the issue while preserving framerates, and both manufacturers have turned to adaptive refresh technology for the solution.


G-Sync and FreeSync are both designed to smooth out gameplay, reduce input lag, and prevent screen tearing. They have different methods for accomplishing these goals, but what really sets them apart is that one is closely guarded and the other is shared openly. While Nvidia's G-Sync relies on a proprietary module built into the monitor, FreeSync uses the video card's functionality to manage the monitor's refresh rate through the Adaptive-Sync feature of the DisplayPort standard.

Screen tearing and stuttering are the result of a mismatch between the number of frames your graphics card renders per second and the number of times your monitor refreshes. If your screen is in the middle of drawing one image and is handed another, it stops what it's doing and draws the new one. That causes stuttering as the image catches up, or a horizontal tear where portions of two frames are on screen at the same time.


If you've ever noticed the problem before, you might have turned on V-Sync to alleviate the issue, only to notice a drop in performance. This is because V-Sync locks frame output to the refresh rate of the monitor, which is almost always 60Hz (60 refreshes per second). That's great if your computer can consistently output 60 frames each second, but if it can't, V-Sync has to drop down to a divisor of 60 -- usually 30 FPS, and if your PC can't hold that, 20 FPS or lower.
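The divisor behavior above can be sketched with a bit of arithmetic. This is a simplified model of double-buffered V-Sync, not Nvidia's or AMD's actual driver logic: each frame has to wait for a refresh, so if the GPU can't finish within one refresh interval, output snaps down to the refresh rate divided by a whole number.

```python
import math

def effective_vsync_fps(render_fps, refresh_hz=60):
    """Effective framerate under a simplified double-buffered V-Sync model.

    A frame that isn't ready at one refresh is held until the next, so
    the displayed rate is refresh_hz / n, where n is the whole number of
    refresh intervals each frame occupies.
    """
    if render_fps >= refresh_hz:
        return refresh_hz  # capped at the monitor's refresh rate
    # number of whole refresh intervals each frame takes to render
    intervals = math.ceil(refresh_hz / render_fps)
    return refresh_hz / intervals

print(effective_vsync_fps(90))   # fast GPU: capped at 60
print(effective_vsync_fps(45))   # can't hold 60, so drops to 30
print(effective_vsync_fps(25))   # can't hold 30 either, so drops to 20
```

This is why a card that renders 45 FPS feels worse with V-Sync on than off: those 45 frames become 30 on screen.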

V-Sync works as it does because a normal monitor cannot talk to the video card. It refreshes at the same rate, all the time, no matter how many frames are sent to it. Adaptive sync technology makes it possible for your monitor's refresh rate to vary to match the output of your video card. That prevents screen tearing without locking the framerate to 60 FPS or lower.
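As a rough sketch of the idea (the 40-144Hz limits here are made-up example numbers, not any particular monitor's spec), an adaptive sync display simply stretches or shrinks its scan-out interval to match when the next frame arrives, clamped to the range the panel supports:

```python
def adaptive_refresh_interval(frame_time_ms, min_hz=40, max_hz=144):
    """Scan-out interval (ms) for a simplified adaptive sync panel.

    Inside the supported range, the panel refreshes exactly when a new
    frame arrives. Outside it, the interval is clamped to the nearest
    limit -- which is where tearing or stutter can reappear.
    """
    shortest = 1000.0 / max_hz  # fastest the panel can refresh
    longest = 1000.0 / min_hz   # slowest the panel can refresh
    return max(shortest, min(frame_time_ms, longest))

print(adaptive_refresh_interval(10.0))  # 100 FPS frame: matched exactly
print(adaptive_refresh_interval(30.0))  # too slow: clamped to the 40Hz floor
```

The clamping at the edges also hints at why the width of a monitor's adaptive range matters: frames that land outside it behave much like they would on a fixed-refresh display.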

Unfortunately, not all technologies are created equal. If you want to get in on some buttery smooth gaming action, you'll have to decide. Nvidia, or AMD?

Pick a side

One of the first differences you'll hear people talk about when it comes to adaptive refresh technology, besides AMD vs. Nvidia, is the difference between a closed and an open standard. While G-Sync is proprietary Nvidia technology, and requires the company's permission and cooperation to use, FreeSync is free to use; wide adoption is the goal of the program, not a way to make money.

That should mean that FreeSync is more widely adopted, but in actuality, the two are about even -- for now. G-Sync has been around longer, and it's also backed by Nvidia, the current leader in GPU sales. That may keep AMD from building a runaway lead in compatible monitors, or it might not. Only time will tell.

Either way, you can't mix and match between the two technologies. You have to choose whether you want to go with Nvidia or AMD, and then purchase a monitor and GPU accordingly.


It's tough to talk about cost when it comes to incorporating a refresh management technology into your rig. The reason is that there's a cost on both sides -- at the GPU, and in the monitor.

If you go the Nvidia route, the module in the monitor is going to handle a lot of the heavy lifting involved in adjusting the refresh rate. That's going to be reflected in the price you pay for the monitor, since each manufacturer has to pay Nvidia for the hardware. The upside is that the technology has been readily available since early 2014, so it's available in monitors as cheap as $500, like the BenQ XL2420G.

Because the G-Sync module does most of the heavy lifting, you can use lower-end cards as long as your monitor is compatible. Nvidia lists the compatible options, which range from the Titan X and 980 Ti all the way down to the 650 Ti Boost, which retails for less than $200.


You won't end up paying much extra for a monitor with FreeSync. There's no premium to the manufacturer to include it, unlike G-Sync. As such, these monitors start around $500 as well, but you'll get a 1440p display and a 144Hz refresh rate at that price point.

You'll also need a card that supports FreeSync -- in practice, that means an R9 or a high-end R7 card. At less than $100, you can install an R7 260, the most basic card in the Radeon lineup that supports FreeSync.


There's also a performance difference between the two standards in a number of different areas.

Users have noted that although tearing and stuttering are reduced with FreeSync enabled, another problem arises -- ghosting. As objects move on the screen, they leave behind a bit of the image of their last position like a shadow. It's an artifact that some people don't notice at all, and really annoys others.

There are a lot of fingers being pointed at what might be causing it, but the physical reason is power management. Apply too little power to the pixels and your image will have gaps in it; apply too much and you'll see ghosting. Balancing adaptive refresh technology against proper power distribution is hard.

Both systems also start to suffer when the framerate isn't consistently within the monitor's refresh range. G-Sync can show problems with flickering at very low framerates, and while the technology usually compensates to fix it, there are exceptions. FreeSync, meanwhile, has stuttering problems if framerate drops below a monitor's stated minimum refresh rate.

Most reviewers who've compared the two side by side seem to prefer the quality of G-Sync, which doesn't show stutter issues at low framerates and is thus smoother in real-world situations. Some FreeSync monitors have an extremely narrow adaptive refresh range, and if your video card can't deliver frames within that range, problems arise.

Compatible monitors

In the past year the number of monitors with adaptive refresh technology built in has exploded, but if you're planning on improving your gaming rig, you also want a monitor with the right specs. We've reviewed some monitors equipped with G-Sync or FreeSync, and here are a few of our favorites.

Acer XB280HK ($799) -- G-Sync

The XB280HK was the first 4K monitor to support G-Sync, and it's still a great choice for gaming, with or without a compatible card. It has a 1ms response time, and a variable refresh rate for G-Sync. We lauded the XB280HK for its sturdy build quality and solid gaming performance, and it's a great choice if you're heading the G-Sync route.

ASUS PG278Q ROG Swift ($799) -- G-Sync

From the ASUS Republic of Gamers line, the PG278Q Swift is an exceptional gaming monitor with 1440p resolution, and a 60-144Hz refresh rate to take full advantage of the G-Sync module built in. The price is a bit high, starting around $700 when it's on sale, but it's a great option if you don't want to make the move to 4K yet.

Acer XG270HU ($799) -- FreeSync

The Acer XG270HU is almost identical to the XB280HK, with a few key differences. The first is that this version sports FreeSync instead of Nvidia's G-Sync protocol. It also sports a thinner side bezel for multi-monitor setups, as well as cutting the screen size by an inch. It's an excellent value if you're choosing AMD.

Samsung LU24E590DS ($499) -- FreeSync

This Samsung display sacrifices a perfect 1ms response time for a slightly slower 4ms, but makes up for it with a very appealing price point, especially for a 4K display. It's only a 23-inch display, which actually gives it a really high pixel density, and should make your games look especially clean and smooth.


Without any other components, you should expect to spend almost a thousand dollars on a G-Sync compatible monitor and graphics card, more if you want to step up to a graphics card that can actually handle 4K gaming. For around $600, you can get into the base level of FreeSync's compatibility, and that includes a 4K display and a baseline Radeon card -- the R7 260, which is not capable of high performance 4K gaming. The upside is that it won't take much more than another $200 to move a lot farther up the AMD scale, into an R9 290, which is capable of some 4K gaming.

Given the price gap, you might wonder why anyone would prefer G-Sync. The answer is simple -- it's superior. Nvidia's adaptive refresh technology doesn't suffer ghosting issues and delivers more consistent overall performance. It's also worth noting that Nvidia video cards are currently the performance kings. Going with FreeSync, and thus buying an AMD Radeon card, might mean purchasing hardware that delivers less bang for your buck.

Ultimately, both of these technologies largely accomplish their goals and deliver an experience that's superior to V-Sync. Your choice will depend on whether you prefer value or a top-notch gaming experience.