Super Turbo Offline Setup Guide & Lag Rating

DGV (Dark Gaiden & Vintage) along with Papasi have been working on an Offline Setup Guide to assist players who are looking to find an offline setup that best suits them. A lot of research, testing, and other work went into making this chart, so special thanks to DGV and Papasi, along with Zaspacer, for creating it.

http://www.strevival.com/strategy/offlinesetupguide

Cheers fellas! Out of curiosity, how did you measure the input lag on the CRT / LCDs? Was it assumed based on the manufacturer’s listed response time and/or refresh rate, or was it physically tested somehow?

There’s also this thing called “micro stutter” which affects Vista/7/8 but not XP, and it definitely has an impact on how the game “feels”.

Big thanks to Rufus as well. And also thanks to all the other people who have contributed to testing and discussing this issue over the years.

Those numbers are higher than the results I got:

Thanks Papasi!

(and DGV)

Thanks for mentioning the default USB polling rate as a source of input lag.

It’s relatively easy to manually set the polling rate for a USB mouse. Let’s see if the TE S+ can handle 250Hz polling.

I’m not sure how to check if it’s working correctly at that rate. Any ideas?
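One rough way to sanity-check it (just my own sketch, not a tested tool): on a Linux box with the python-evdev package, time the gaps between successive axis events while you wiggle the stick. The device node below is a placeholder; yours will differ.

```python
# Rough poll-rate check: time gaps between X-axis updates while wiggling the
# stick. Assumes Linux + python-evdev; /dev/input/event5 is a placeholder.
from evdev import InputDevice, ecodes

dev = InputDevice('/dev/input/event5')  # adjust to your stick's event node
prev = None
gaps = []

for event in dev.read_loop():
    if event.type != ecodes.EV_ABS or event.code != ecodes.ABS_X:
        continue                      # only look at one axis
    t = event.timestamp()             # float seconds
    if prev is not None:
        gaps.append(t - prev)
    prev = t
    if len(gaps) >= 300:
        break

gaps.sort()
median = gaps[len(gaps) // 2]
print("smallest gap: %.2f ms" % (gaps[0] * 1000))
print("median gap:   %.2f ms (~%.0f Hz)" % (median * 1000, 1.0 / median))
```

If the smallest gaps sit around 4 ms the stick is effectively being polled at 250 Hz; gaps around 8 ms mean you’re still at the default 125 Hz.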

@Rufus it’s impossible that the game would have less than 4 frames of lag (66.6 ms) since it runs at a 60 Hz refresh rate.

What are you talking about? One frame can be displayed per refresh, making the theoretical minimum input lag one refresh, which happens 60 times per second.

It means the fastest possible response is 4 frames from the time you hit a button to the time you see it on screen in a game that runs at 60 Hz. That’s the bare minimum; you can’t go lower than that, it’s physically impossible. An input can come out on any frame of the game, but all that means is that you pressed the button 4 frames earlier. An input can be made at any point during a refresh; it still takes 4 frames for it to show up.

On a game that runs at 120 Hz it takes 2 frames. On a game that runs at 30 Hz it takes 8 frames.
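Just to put plain numbers on the frame counts being thrown around, this is only the frames × 1000 ms / refresh arithmetic, not an argument for any particular count:

```python
# Convert "N frames at a given refresh rate" into milliseconds.
def frames_to_ms(frames, refresh_hz):
    return frames * 1000.0 / refresh_hz

print(frames_to_ms(1, 60))    # ~16.7 ms per frame at 60 Hz
print(frames_to_ms(4, 60))    # ~66.7 ms -- the "4 frames" figure above
print(frames_to_ms(2, 120))   # ~16.7 ms
print(frames_to_ms(8, 30))    # ~266.7 ms
```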

In theory, the next ‘pixel’ could be changed when a button is pushed, though Street Fighter won’t do that.

“Those words. I do not think they mean what you think they mean.”
– Inigo Montoya, The Princess Bride (paraphrased)

There’s some confusion here. The CPS-2 A board runs on a Motorola 68000 chip with a 16 MHz clock. (That’s 16,000,000 Hz.) When people talk about the game running at 60 Hz, they’re really only referring to the vsync of the display. (It’s a little sloppy for other reasons too – SF2T is interlaced, so what people refer to as a frame is really a field, and it’s not actually 60 Hz but 59.94 Hz nominal, and so on.)

I don’t want to get too far off track, but I’m a little curious to know what the explanation for requiring four frames of lag is.

Regarding the OP:
I understand that there are differences due to how people count, but I don’t know how something like this could be 6-7 frames:
http://www.pedantic.org/~nate/HDR/misc/delay/turbodelay1.html

The total input lag after an input is the sum of: 1) the time for the game to process the input and generate the next frame, 2) monitor lag, 3) vsync, and 4) cache-ahead buffering.

I have a feeling vsync and cache-ahead buffering are what you’re talking about, as those are what would cause the 4-frame lag on a 60 Hz screen.
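To make that breakdown concrete, here’s a toy version of the sum. Every number below is a placeholder I made up to show how the components stack, not a measurement of any real setup:

```python
# Toy sum of the lag components listed above. All values are made-up
# placeholders, not measurements.
FRAME_MS = 1000.0 / 60.0

components_ms = {
    "input sampling (avg, 8 ms USB poll)": 4.0,
    "game processing (1 frame)":           FRAME_MS,
    "vsync wait (avg half a frame)":       FRAME_MS / 2,
    "cache-ahead / buffered frame":        FRAME_MS,
    "monitor lag":                         10.0,
}

total = sum(components_ms.values())
print("total: %.1f ms (~%.1f frames at 60 Hz)" % (total, total / FRAME_MS))
```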

So other than reposting a chart, how do we know any of this data is valid?
What process or technique was used to gather this data?
What is the significance of these numbers? Are they arbitrarily chosen, or do they represent frame rate?

Also, the poll rate for USB devices can be misleading when it comes to gamepads/joysticks, especially to those who don’t understand USB polling.

I poured a couple of hours into looking at that issue. The conclusion is that USB poll rate doesn’t matter on arcade sticks and gamepads; USB polling really only matters for the mouse. Also, the keyboard has the worst input delay by far, on the order of 20-30 ms before an input is even registered by the machine.

In this thread:
Comparison of HDR Versions (PS3, 360, DC, CPS2)

My testing method is to hook a wire from the button to the video signal and then to record the mixed signal with a Canopus ADVC 100 or 110, which produces full-frame-rate DV video. There are opto-isolators involved, but the change in the video signal should be effectively instant on the time scales we’re talking about.

For the supergun testing, the video signal was modified on the JAMMA side of the video converter to control for converter lag. Xbox 360, PS3 and Dreamcast testing was done using composite out. Dreamcast was with a toodles CD, Xbox 360 and PS3 with HDR remixed in training mode.

I can’t recall the number of samples offhand; I think I got on the order of 100 for the Xbox 360 and 20-50 for the others.

Here’s some stuff about how I counted frames:
http://www.pedantic.org/~nate/HDR/misc/delay/turbodelay1.html
http://www.pedantic.org/~nate/HDR/misc/delay/righttest.html
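For anyone who would rather script the counting than step through by hand, here’s a rough sketch of the idea. This is not Rufus’s actual tooling; it assumes OpenCV, a capture where the wired-in button signal shows up as a bright marker in a fixed corner, and the region coordinates and threshold are placeholders:

```python
# Count frames between the wired-in button marker appearing and the on-screen
# response. ROIs and threshold are placeholders; in practice you would pick
# them per capture, or just eyeball the frames.
import cv2

cap = cv2.VideoCapture("capture.avi")              # hypothetical clip
BUTTON_ROI = (slice(0, 20), slice(0, 20))          # corner carrying the marker
ACTION_ROI = (slice(200, 280), slice(100, 300))    # where the move appears

press_frame, frame_no = None, 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if press_frame is None and gray[BUTTON_ROI].mean() > 128:
        press_frame = frame_no                     # first frame showing the press
    elif press_frame is not None and gray[ACTION_ROI].mean() > 128:
        print("response %d frames after the press" % (frame_no - press_frame))
        break
    frame_no += 1
```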

If you have other questions, feel free to ask.

As for the numbers themselves, I can’t independently verify my own results.

It’s a good chart. I wonder what is causing that controller lag for your setup.

I looked into increasing the polling rate for my Xbox MadCatz TE S+, but further research into the issue revealed that for a game controller, the controller only needs to be polled as fast as the game asks for its state.

I’ve posted this elsewhere, but SRK search leaves something to be desired so:

Let’s assume, for the sake of discussion, that on the PS3 and the XBox 360 the sampling rates for the game and the controller aren’t synchronized.

Let’s assume that the time between an input change and when the controller gets sampled is randomly distributed over the length of the sampling cycle. So, for example, if the controller is sampled every 8 ms, then the time varies from 0-8 ms, average 4 ms, and if the controller is sampled every 1 ms, then the time varies from 0-1 ms, average 0.5 ms. That means that the controller which gets sampled every 8 ms has - on average - 3.5 ms more lag than the one that gets sampled every 1 ms.

Now, typical USB 1 sampling rates include 125Hz (every 8ms) and 1000Hz (every 1ms) so if some sticks are sampled at 125 and others are sampled at 1000, then we would expect to see an average difference of 3.5 ms in lag between the two.
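Spelling the averaging step out (nothing beyond the arithmetic already described above):

```python
# Unsynchronized sampling: the wait until the next poll is uniform over the
# polling interval, so on average it's half the interval.
slow_poll_ms = 8.0   # 125 Hz
fast_poll_ms = 1.0   # 1000 Hz

avg_slow = slow_poll_ms / 2   # 4.0 ms
avg_fast = fast_poll_ms / 2   # 0.5 ms
print("average extra lag: %.1f ms" % (avg_slow - avg_fast))   # 3.5 ms
```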

Now, I wired a button on my 1st-generation Xbox 360 MadCatz SE and a button on a first-generation PS360 together so that they are activated simultaneously, and counted how often I got double hits, how often the SE stick interrupted the PS360 stick, and how often the PS360 interrupted the SE. I don’t recall the exact results, but they were something like:
PS360 first: 9
Double hit: 41
SE stick first: 0
(The actual figures are buried in tech talk, and this experiment has been reported on by others there.)

If the SE samples every 8 ms and the PS360 samples every 1 ms, we would expect the PS360 to beat the SE 3.5/16.68 ≈ 0.21 of the time, which is extremely consistent with the experimental results.
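If you want to check that expectation without the hardware, here’s a quick Monte Carlo of the same setup. This is my own sketch, using only the assumptions above: unsynchronized 1 ms and 8 ms polling and 16.68 ms game frames.

```python
# Simulate a press seen by two sticks with unsynchronized polling. The fast
# stick "wins" a trial when the slow stick's extra sampling delay pushes its
# input past a game-frame boundary.
import random

FRAME_MS = 1000.0 / 59.94    # ~16.68 ms per game frame
TRIALS = 200000

fast_wins = 0
for _ in range(TRIALS):
    press = random.uniform(0, FRAME_MS)          # press time within a frame
    seen_fast = press + random.uniform(0, 1.0)   # wait for the next 1 ms poll
    seen_slow = press + random.uniform(0, 8.0)   # wait for the next 8 ms poll
    if seen_fast // FRAME_MS < seen_slow // FRAME_MS:
        fast_wins += 1

print("fast stick lands a frame earlier in %.1f%% of trials"
      % (100.0 * fast_wins / TRIALS))            # comes out near ~21%
```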

I also got approximately the same 3.5 ms delay differences when I did video-recorder based lag testing of the PS360 and SE stick on my Xbox360.

I don’t have the sophistication to check the actual sampling rates, and I’ve given away my Xbox, but USB sampling rate seems like a very plausible explanation for that difference in the input lag.

Edit: I do suspect that the chart overstates the effect of input lag.

-MaxGrit:

I can confirm that the USB poll rate does in fact have an influence on arcade joysticks (PS3-based sticks only). The difference between playing at 125 Hz (the standard USB setting) and 1000 Hz is pretty noticeable. 360 joysticks are unfortunately not supported by the poll-rate-modifying programs as of yet.

Unfortunately, game programmers have confirmed what I wrote, and testing has proven that the speed at which the game updates does affect the input lag as I specified. It’s why games running without vsync at higher refresh rates can process inputs faster than games running at lower refresh rates. Games refreshing at 30 fps (typical for 3D games) have 7 to 8 frames of lag native to the frame rate (most of the time 8 minimum, and often higher because of post-processing). Additional post-processing or additional CPU/GPU overhead will cause more frames of lag. Killzone 2 is a great example of lots of post adding a bunch of extra lag.

If you really want to read the article, you can. The quotes are direct from a game programmer, in case you don’t want to take my word for it.

Source: http://www.eurogamer.net/articles/digitalfoundry-lag-factor-article

I’m more inclined to believe someone who codes response-time-intensive games for a living.

I’m also quite aware of the fact that the game’s processor isn’t 60 Hz, and well aware that’s the refresh; there isn’t any confusion there. The game is also progressive from what I recall, as you can clearly see scanlines, which you won’t see on an interlaced image since the alternating fields fill in the scanlines to make a solid picture. R-Cade Gaming and I had a huge discussion about this and I learned a lot from him. So it’s actually updating a progressive signal at 60 Hz, not interlaced fields.

There is also a reason action games like DMC (not DmC) and Street Fighter target 60 fps: the natively lower input delay.

No game software or hardware will ever have zero input delay. However, the faster you render the game, the faster the inputs can come out, which is why many twitch-shooter players will lower the graphics, turn off vsync, and run at 120 Hz for the natively lower input delay.

Heyo, can you tell me your testing method? I’d like to see it.