Is Vertical Sync Good? What VSync Actually Does and When It Helps
Vertical Sync — almost always called VSync — is one of those settings that shows up in nearly every PC game's graphics menu, yet confuses a surprising number of players. Some swear by it. Others turn it off the moment they install a game. The answer to whether it's "good" depends entirely on what your hardware is doing and what kind of experience you're after.
What VSync Actually Does
Your monitor refreshes its image at a fixed rate — typically 60Hz, 144Hz, or 240Hz — meaning it redraws the screen that many times per second. Your GPU, however, renders frames at its own pace, which can be faster or slower than your display's refresh rate.
When your GPU produces frames faster than your monitor can display them, you get screen tearing — a visual artifact where two or more frames appear sliced together on screen at the same time. It looks like a horizontal split running across the image, most visible during fast camera movements.
VSync solves this by capping the GPU's frame output to match the monitor's refresh rate and synchronizing frame delivery so each new frame only gets sent when the display is ready for it. One frame per refresh cycle — no overlap, no tearing.
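The "wait for the display" behavior is easy to see with arithmetic. Below is a toy model (not any real driver's implementation) of classic double-buffered VSync on a 60Hz display, where a finished frame is held until the next refresh boundary; the numbers are illustrative:

```python
import math

REFRESH_HZ = 60
INTERVAL_MS = 1000 / REFRESH_HZ  # ~16.67 ms between refreshes

def vsync_display_time(render_done_ms: float) -> float:
    """A finished frame is held until the next refresh boundary."""
    return math.ceil(render_done_ms / INTERVAL_MS) * INTERVAL_MS

# A frame that finishes at 5 ms still waits until the 16.67 ms refresh:
done = 5.0
shown = vsync_display_time(done)
print(f"rendered at {done} ms, shown at {shown:.2f} ms, "
      f"added latency {shown - done:.2f} ms")
```

The wait between "frame done" and "frame shown" is exactly the latency penalty discussed in the next section.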
The Tradeoff: What VSync Costs You
VSync isn't a free fix. Locking frame delivery to the monitor's refresh cycle introduces input lag — a delay between your physical input (mouse click, button press) and what appears on screen. The GPU has to wait for the right moment in the refresh cycle before delivering a frame, which adds latency.
In fast-paced games — shooters, fighting games, competitive multiplayer — that added lag is noticeable and can hurt your performance. A few milliseconds of delay matter more than most people expect, especially once you've experienced play without them.
There's a second problem: stuttering. When your GPU can't consistently hit the frame rate VSync is targeting (say, 60fps on a 60Hz monitor), VSync drops to the next available multiple — typically 30fps. The result is a sudden, jarring stutter that can feel worse than tearing.
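The drop to a divisor of the refresh rate follows from the same boundary rule: a frame can only appear on a refresh, so a frame that takes even slightly longer than one interval occupies two. A quick sketch (assuming double-buffered VSync on a 60Hz panel) shows why missing 60fps lands you at 30fps rather than, say, 55fps:

```python
import math

REFRESH_HZ = 60
INTERVAL_MS = 1000 / REFRESH_HZ  # ~16.67 ms

def effective_fps(render_ms: float) -> float:
    """Each frame occupies a whole number of refresh intervals."""
    intervals = math.ceil(render_ms / INTERVAL_MS)
    return REFRESH_HZ / intervals

print(effective_fps(16.0))  # 60.0 -- fits within one refresh interval
print(effective_fps(18.0))  # 30.0 -- just misses, so every frame waits a full extra refresh
```

An 18ms frame is only 8% too slow for 60fps, yet the displayed rate halves; that cliff is what players perceive as stutter. (Triple buffering softens this in some implementations, at the cost of extra latency.)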
VSync vs. Adaptive Sync Technologies 🖥️
Modern alternatives have largely addressed VSync's weaknesses:
| Technology | How It Works | Key Benefit | Limitation |
|---|---|---|---|
| Traditional VSync | Locks GPU to monitor refresh rate | Eliminates tearing | Adds input lag; stutters when fps drops |
| G-Sync (NVIDIA) | Monitor refresh rate adapts to GPU output | Eliminates tearing and stutter | Requires compatible NVIDIA GPU + monitor |
| FreeSync (AMD) | Same principle as G-Sync, open standard | Eliminates tearing and stutter | Requires compatible AMD GPU + monitor |
| Fast Sync / Enhanced Sync | Allows GPU to render freely, selects latest complete frame | Reduces lag vs. traditional VSync | Requires capable GPU; not universal |
Adaptive sync (G-Sync and FreeSync) flips the equation — instead of making the GPU wait for the monitor, the monitor waits for the GPU. This eliminates tearing without the input lag penalty. If your hardware supports it, these technologies generally outperform traditional VSync in most use cases.
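The inverted relationship can be sketched the same way: under adaptive sync there is no added wait as long as the frame time falls inside the panel's variable-refresh range. The 48–144Hz range below is a common spec used purely as an example, not a property of any particular monitor:

```python
# Extra display-side wait under adaptive sync (VRR), as a toy model.
VRR_MIN_HZ, VRR_MAX_HZ = 48, 144
MIN_INTERVAL_MS = 1000 / VRR_MAX_HZ  # ~6.94 ms
MAX_INTERVAL_MS = 1000 / VRR_MIN_HZ  # ~20.83 ms

def adaptive_wait(frame_time_ms: float) -> float:
    """Added latency before display: zero whenever the frame time is in range."""
    if frame_time_ms < MIN_INTERVAL_MS:
        return MIN_INTERVAL_MS - frame_time_ms  # GPU outran the panel's max rate
    return 0.0  # in range (or below it, where panels typically repeat frames)

print(adaptive_wait(12.0))  # 0.0 -- 12 ms/frame (~83 fps) displays immediately
```

Compare that zero with traditional VSync, where the same 12ms frame on a 60Hz panel would sit for several milliseconds waiting on the next refresh.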
When VSync Makes Sense
VSync still has legitimate uses, even today:
- Single-player games where immersion matters more than reaction time — exploring open worlds, narrative-driven games, or slower-paced strategy titles. Tearing ruins visual polish; input lag is less critical.
- When your GPU consistently exceeds your monitor's refresh rate — if you're hitting 200fps on a 60Hz monitor, VSync caps output at the refresh rate, cutting wasted rendering work and often reducing GPU temperature and power consumption.
- Older hardware without adaptive sync support — traditional VSync remains the only built-in tearing fix if G-Sync or FreeSync aren't options.
When VSync Hurts More Than It Helps
- Competitive or fast-paced multiplayer games — the input lag penalty is a real disadvantage when reaction time matters.
- Systems where fps regularly dips below the monitor's refresh rate — you'll experience frequent stutter, which is visually worse than tearing for most people.
- High-refresh-rate monitors (144Hz+) — at higher frame rates, tearing is less perceptible anyway, and the performance cost of VSync becomes harder to justify.
The Variables That Actually Determine Your Outcome 🎮
Whether VSync improves or degrades your experience comes down to several factors that vary by setup:
- Monitor refresh rate — both tearing and VSync's latency penalty are more pronounced at 60Hz than at 144Hz or 240Hz, so the tradeoff lands differently
- GPU performance headroom — can your card consistently hit the target frame rate, or does it fluctuate?
- Adaptive sync support — does your GPU and monitor combination support G-Sync or FreeSync?
- Game type — competitive shooter versus narrative RPG involves completely different priorities
- Input lag sensitivity — some players notice it acutely; others don't register it at all
- Whether tearing bothers you visually — this varies significantly between individuals
A high-end system running a single-player game at locked 60fps on an older monitor gets a genuinely different result from a mid-range PC trying to hit 144fps in a competitive shooter.
One More Layer: In-Game vs. Driver-Level VSync
VSync can be enabled inside the game or forced through your GPU's control panel (NVIDIA Control Panel or AMD Radeon Software). Driver-level VSync applies globally across all applications. Some games also offer their own frame rate limiters, which can achieve similar tearing reduction with less input lag than traditional VSync when set just below the monitor's refresh rate — a frequently overlooked option worth knowing about.
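The frame-limiter trick is simple enough to sketch. Below is a minimal sleep-until-deadline loop, not any engine's actual implementation; the 141fps cap on a hypothetical 144Hz panel is the commonly recommended "slightly below refresh" pattern, which keeps the GPU from queuing frames ahead — a major source of traditional VSync's input lag:

```python
import time

CAP_FPS = 141              # just under an assumed 144 Hz refresh rate
FRAME_BUDGET = 1.0 / CAP_FPS  # seconds allotted per frame

def run_frames(n: int) -> None:
    """Render n frames, sleeping so each frame takes at least FRAME_BUDGET."""
    next_deadline = time.perf_counter() + FRAME_BUDGET
    for _ in range(n):
        # ... render the frame here ...
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)  # real limiters often spin-wait the last ~1 ms for precision
        next_deadline += FRAME_BUDGET

start = time.perf_counter()
run_frames(10)
elapsed = time.perf_counter() - start
print(f"10 frames in {elapsed * 1000:.1f} ms (budget {10 * FRAME_BUDGET * 1000:.1f} ms)")
```

Because the limiter never waits on the display itself, it reduces tearing frequency without the full refresh-boundary stall, though it cannot eliminate tearing entirely the way VSync or adaptive sync can.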
The right answer for your setup depends on your monitor's capabilities, your GPU's output, the games you play, and how much input lag you're willing to tolerate versus how much screen tearing you can live with. Those specifics are entirely yours to weigh.