Yes, there is a difference - v-sync adds input lag, G-Sync shouldn't.
In the end G-Sync also needs (at least) double buffering. So if the fps rate is beneath the maximum monitor frequency, G-Sync adjusts the monitor's refresh rate accordingly. But what happens if the fps is above it? Then the GPU has to wait for the monitor and the rendered frame gets old, which means there is input lag again - at most one frame, like v-sync. Maybe that's why NVIDIA only talks about 120 or 144 Hz devices in combination with G-Sync; that way at least this critical maximum is higher.
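Just to put a rough number on that ceiling (back-of-the-envelope only, nothing NVIDIA has published): once the GPU outruns the panel, a finished frame can sit waiting for up to one full refresh interval, so the worst case shrinks as the maximum refresh rate goes up.

```cpp
#include <cstdio>

int main() {
    // Worst case when fps > refresh: the finished frame waits for the next
    // scanout, i.e. up to one full refresh interval of added lag.
    const double refresh_rates_hz[] = {60.0, 120.0, 144.0};
    for (double hz : refresh_rates_hz) {
        double worst_case_ms = 1000.0 / hz;  // one refresh interval
        std::printf("%.0f Hz panel: up to %.1f ms of added lag when fps > refresh\n",
                    hz, worst_case_ms);
    }
    return 0;
}
```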
Anyway, true triple buffering would be a solution to this problem. Funny enough, it also kills input lag with v-sync. The only problem: DirectX doesn't use it. Its "triple buffering" method is equivalent to a render queue, so it just adds another frame of input lag instead of eliminating it.
Nah, there are better (faster) and cheaper monitors than the Evo monitors.
Thanks for the tip!

In the end G-Sync also needs (at least) double buffering. So if the fps rate is beneath the maximum monitor frequency, G-Sync adjusts the monitor's refresh rate accordingly.
This article sounds like there is no need for double buffering: http://www.anandtech.com/show/7436/nvidias-gsync-attempting-to-revolutionize-gaming-via-smoothness.
Excerpt: "In traditional setups a display will refresh the screen at a fixed interval, but in a G-Sync enabled setup the display won’t refresh the screen until it’s given a new frame from the GPU."
The display just keeps showing the same frame until it’s given a new one. How is there a need for double buffering anymore?

But what happens if the fps is above it? Then the GPU has to wait for the monitor and the rendered frame gets old, which means there is input lag again - at most one frame, like v-sync. Maybe that's why NVIDIA only talks about 120 or 144 Hz devices in combination with G-Sync; that way at least this critical maximum is higher.
True. But 144Hz monitors will probably prevent this from becoming a problem.

Anyway, true triple buffering would be a solution to this problem. Funny enough, it also kills input lag with v-sync. The only problem: DirectX doesn't use it. Its "triple buffering" method is equivalent to a render queue, so it just adds another frame of input lag instead of eliminating it.
This I don’t understand. With triple buffering you store 2 more frames in advance before they are displayed which means at least 2 more frames of input lag, doesn’t it?

The display just keeps showing the same frame until it’s given a new one. How is there a need for double buffering anymore?
Unless the monitor can grab a finished frame from the GPU instantaneously, you still need it, to prevent the GPU from overwriting a finished but not-yet-displayed frame. It's also a fail-safe for the 30 fps floor, because unless a G-Sync monitor works very differently, the monitor itself has no memory. If there is a new sync (and there will be one at the latest after the 30 Hz interval, i.e. 33.3 ms), you have to give the display a frame, otherwise it'll display black.
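To illustrate the deadline I mean (a toy sketch of the logic only, not how the actual G-Sync module is implemented - the render times in it are made up): something has to be scanned out before the panel's maximum interval runs out, and that's only possible if a completed frame is kept around in a buffer.

```cpp
#include <cstdio>

// Toy model: the panel must be fed *something* at least every max_interval_ms.
// If the next frame isn't finished in time, the last completed frame is sent
// again - which only works if a completed frame is kept in a buffer.
int main() {
    const double max_interval_ms = 1000.0 / 30.0;               // ~33.3 ms at a 30 Hz floor
    const double render_times_ms[] = {10.0, 25.0, 60.0, 12.0};  // hypothetical frame times

    for (double render_ms : render_times_ms) {
        if (render_ms > max_interval_ms)
            std::printf("frame took %.1f ms -> deadline (%.1f ms) passed, resend last frame\n",
                        render_ms, max_interval_ms);
        else
            std::printf("frame took %.1f ms -> fresh frame delivered in time\n", render_ms);
    }
    return 0;
}
```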

This I don’t understand. With triple buffering you store 2 more frames in advance before they are displayed which means at least 2 more frames of input lag, doesn’t it?
No, it doesn't. Well, it does for the standard DX implementation, which is more like a render queue. The correct idea of triple buffering is that the GPU writes into the third buffer, and once that frame is finished the third and second buffers swap places and the GPU starts the whole process again, independent of what the monitor is doing. The GPU only ever writes into the third buffer, while the primary buffer (from which the display gets its picture) only ever switches places with the second one. This way the GPU never has to wait, and the frame that gets displayed is always the newest fully rendered one at that moment.
Too bad, though, that it depends on the specific application whether triple buffering is implemented this way, and to my knowledge there is no tool to force it.
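Roughly, the swap logic I'm describing looks like this (my own illustration, not actual DX or driver code - the buffer names and event sequence are made up): the GPU always renders into a spare buffer and swaps it with a "pending" slot when it finishes, and at each refresh the front buffer picks up whatever is newest in "pending", so the GPU never blocks and the display always shows the most recently completed frame.

```cpp
#include <cstdio>
#include <utility>

// Sketch of "true" triple buffering: the GPU never waits, and the display
// always scans out the most recently completed frame. (Contrast with a
// render queue, where frames are shown in FIFO order and every queued
// frame adds a frame of lag.)
struct Buffer { int frame_id = -1; };

int main() {
    Buffer front;    // currently being scanned out by the display
    Buffer pending;  // newest completed frame, waiting to be picked up
    Buffer back;     // the GPU renders into this one

    int next_frame = 0;

    // Interleave some "GPU finished a frame" (G) and "display refresh" (D) events.
    const char events[] = {'G', 'G', 'D', 'G', 'G', 'G', 'D', 'D'};

    for (char e : events) {
        if (e == 'G') {
            back.frame_id = next_frame++;   // GPU finishes a frame...
            std::swap(back, pending);       // ...and swaps it in immediately,
                                            // replacing any older pending frame
            std::printf("GPU finished frame %d\n", pending.frame_id);
        } else {
            if (pending.frame_id > front.frame_id)
                std::swap(front, pending);  // display grabs the newest complete frame
            std::printf("display refresh -> showing frame %d\n", front.frame_id);
        }
    }
    return 0;
}
```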
I see, thanks for the explanation.
I know I'm resurrecting an old thread, but there is tons of misinformation here. First off, the best setting for PC would be v-sync enabled (either in-game or via the control panel - it will not make a difference unless the in-game v-sync implementation is bugged) and maximum pre-rendered frames set to 1 (NOT 0, which is equivalent to the default of 3; newer control panels removed the 0 option to avoid confusion). If anyone noticed, this is the ONLY setting in the NVIDIA control panel that actually mentions affecting input lag, and it is indeed the only one that does besides turning v-sync off. No other setting on a properly powered and running PC will affect input lag with v-sync. Triple buffering does nothing for SF4, as it is a DirectX game and cannot use it, and it might actually make things worse.
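To make the numbers concrete (plain arithmetic, not measurements of SF4 itself): at a locked 60 fps, every frame sitting in the pre-render queue is roughly another 16.7 ms between your input and the picture, so going from the default of 3 down to 1 can shave about two frames off the worst case.

```cpp
#include <cstdio>

int main() {
    const double frame_ms = 1000.0 / 60.0;  // SF4 is locked at 60 fps
    for (int queued : {3, 1}) {             // default pre-rendered frames vs. the recommended 1
        std::printf("max pre-rendered frames = %d -> up to %.1f ms of queued input lag\n",
                    queued, queued * frame_ms);
    }
    return 0;
}
```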
I don't advise turning v-sync off, as the game will play FASTER than the arcade/console standard (input-lag-wise) and you will be thrown badly off if you play on console or arcade. Also, because of screen tearing, at high-level play you will be at a disadvantage from seeing different frames across the screen at the same time, and this could make visual timings impossible to master. The visual timing of the game would always be slightly random.
Finally, as for NVIDIA's new G-Sync technology: indeed, it completely eliminates any frame buffering as long as the fps stays under the refresh rate (SF4 is locked at 60), which means the input lag is equivalent to no v-sync while at the same time giving us a uniform picture. It is literally the best of both worlds. Ideally we would all play with G-Sync, but since this is a rare PC-monitor-only feature, you would still be thrown off by the slight input lag when you play on console with the vast majority of SF4 players. What I recommend for possibly the best experience is to play PC online matches with G-Sync, because you already have input lag from playing online, and G-Sync will get you almost to the equivalent of offline-with-v-sync timing, which is the closest you'll ever get to playing online with zero lag. I would play training mode and offline in general with standard v-sync (pre-rendered frames set to 1) to stay consistent with console/arcade offline timing.
Good bump.
Well, but SF4PC's in-game v-sync implementation definitely IS bugged. With it enabled you get considerably worse input lag than on consoles (tested on 3 different PCs).
edit: Umm…have to admit I didn’t pay attention to the part about setting maximum prerendered frames to 1. Which I haven’t done. So maybe the vsync implementation isn’t bugged after all. I’ll try this.
Okay, you were right - maximum prerendered frames = 1 does the trick. Very responsive input and no more screen tearing.
Less input lag than the PS3 version and about the same as the Xbox version.
Thanks a lot!
Here’s a screenshot if someone wonders where to find the option:

Those are the same settings I'm actually using. It's really effective, and I found that I'm missing my combos way less!
Just to confirm, on this topic, earlier there was some talk about monitors that have less lag and are cheaper than the Evo monitors. Sorry to go somewhat off topic, but could someone point me to where I can find more info about these monitors? I'm interested in purchasing one.
I had the same problem. It’s related to v-sync, but the in-game v-sync option is balls. Lots of input lag. You need RadeonPro. Disable v-sync in-game and enable it in RP. It is possible to get it to feel just like the Xbox version using RP.

Just to confirm, on this topic, earlier there was some talk about monitors that have less lag and are cheaper than the Evo monitors. Sorry to go somewhat off topic, but could someone point me to where I can find more info about these monitors? I'm interested in purchasing one.