Nvidia’s G-Sync monitors cost a steep premium over AMD’s rival FreeSync displays. But are they worth it, especially now that Nvidia lets you use FreeSync panels with GeForce graphics cards? It’s a complicated issue, but PCWorld graphics guru Brad Chacos unravels what you need to know in the video below.
Both G-Sync and FreeSync panels revolve around the same core variable refresh rate technology, also known as adaptive sync. Standard monitors refresh their image at a constant speed, such as 60Hz or 144Hz. Adaptive sync monitors synchronize your monitor's refresh rate to your graphics card’s output—hence their name. Doing so prevents ugly screen tearing and stuttering, giving you a delicious, buttery-smooth gaming experience.
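To make the difference concrete, here’s a tiny conceptual sketch (not from the article, and purely illustrative) of why a fixed refresh rate causes stutter while adaptive sync doesn’t. It models a GPU finishing frames at uneven times and shows when each frame actually hits the screen on a fixed 60Hz display versus an adaptive-sync one:

```python
# Illustrative sketch only: all numbers and function names are assumptions,
# not real driver or display APIs.

def display_times_fixed(frame_times, refresh_hz=60):
    """Each finished frame waits for the next fixed refresh tick (V-Sync-style)."""
    interval = 1000.0 / refresh_hz  # ms between refresh ticks
    shown = []
    for t in frame_times:
        ticks = int(t // interval) + 1  # next refresh after the frame is ready
        shown.append(round(ticks * interval, 2))
    return shown

def display_times_adaptive(frame_times):
    """An adaptive-sync display refreshes the moment a frame is ready."""
    return list(frame_times)

# A GPU rendering at an uneven ~70 fps: frames finish at these times (ms).
frames = [14.0, 29.0, 43.0, 58.0]

print(display_times_fixed(frames))     # frames held until 16.67ms ticks -> uneven pacing
print(display_times_adaptive(frames))  # frames shown as they finish -> smooth pacing
```

On the fixed display, every frame is delayed to the next 16.67ms tick, so on-screen pacing no longer matches how the frames were rendered; the adaptive display simply shows each frame when it arrives.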
But the devil lies in the details. G-Sync and FreeSync take very different approaches to the same goal. Nvidia keeps a tight grip on the quality of its proprietary G-Sync displays, while AMD’s FreeSync builds upon an open VESA standard. There are also technical implementation differences, ranging from the ports that support synchronization to how each technology handles hurdles like ultra-low framerates and ghosting. Brad explains the high-level differences in the video below, and if you want to go even deeper, be sure to check out our detailed G-Sync vs. FreeSync primer.
We’re trying out a new format here, responding to common questions we receive on our Full Nerd podcast. If you like it, let us know!
Have a PC- or gaming-related question? Email [email protected] and we’ll try to answer it in the future. You can also join the PC-related discussions and ask us questions on The Full Nerd’s Discord server. And be sure to follow PCWorld on Facebook, YouTube, and Twitch to watch future episodes live and pick our brains in real time!