3S:OE Input Lag Testing

Robo, can you post up the numbers from the 50 tests to get an idea of distribution?

How do you figure? Unless one controller board consistently lags in comparison to the other (and they don't), I don't see how it could be any more solid.

Each res for each console got 50 inputs recorded. I didn’t step through every single one of them frame by frame, but I probably saw half of them for each res. There was surprisingly little variation. Since the camera can’t be perfectly in sync with the monitor, there is the potential for difference, but with the shutter set to 1/60 you can catch a lot of it if you know what to look for. The samples I posted were more ideal sequences, but some will look like this:

3SOE Out of Sync Example

This is only true of SF4 frame data, nothing else. Apparently adding 1 to a number was too hard?

Great test man, we really appreciate you posting this. :slight_smile:

You guys are right, the HDFury would only show the difference between the two consoles. I'm still a bit wary of the claim that there's an inherent 4 frames of lag observed (when running through the HDFury), but my head wasn't on straight and I missed the main point. Thanks for your responses, Toodles and RoboKrikit.

I don’t have time to run through them again now, but I put up the source files (added to Notes). The only sequences I noticed much variation with were the ones that AviSynth/VirtualDub (or me) had screwed up with frames out of order or missing frames. I’m no expert with vdub or video processing, so I put up the source files for someone better-equipped. Note that to view them frame by frame (really field by field since it’s 60i), you’ll need something capable of splitting out the fields. If you play those in a normal media player they’ll play back at 30fps. In my older SF4 tests I used HDVxDV on a Mac to capture the footage, which split the frames automatically, so I didn’t have any processing issues.

Robo - are you able to test both the PS3 and 360 when connected via HDMI to an ASUS VH236H?

This game is a disaster.

I’m so used to the Xbox version now. I’m adjusted. It definitely felt “different” at first, but that could very well be that I’m playing on a TV through a console for the first time in ages…

The only time I noticed really bad input delay was online. But that can be rectified by setting GGPO to 0 frames instead of the default 2. While things may get jumpy this way (on bad connections), I still prefer not having that input delay.

Overall I’m happy with the port. Feels better than the PS2 port, and about as good as the old Xbox 1 port, which I thought was great.

Keep that shit out of this thread.

wow… sweet test, thanks for taking the time to do it.

btw i don’t think you need ComplementParity since your source is all from the same camera.

i think this might be a DirectShowSource issue. there might be a better way to open m2t files but i’m not sure.

edit: note: my comments don’t affect the results of the test

double edit, found the source, thanks!

The only way this testing methodology can be completely accurate is if the lag amount happens to be in exact 1 frame increments AND the filming starts exactly at the beginning of a given frame. Otherwise, the actual amount of lag can be somewhere up to a frame different from what is detectable, and the camera can be “misaligned” to the frame by up to a frame as well. A single pass is not dependable for results. Repeatedly pressing the button in a single pass doesn’t make a difference regarding this either.

I had to add it in because the fields were always swapped (2:1 instead of 1:2) without it.
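For anyone following the field-order tangent: here's a rough Python sketch (not AviSynth, just an illustration of the concept) of why the wrong parity makes fields come out 2:1 instead of 1:2. Each interlaced frame carries two fields, and parity decides which one is temporally first; get it wrong and every pair of fields plays in swapped order.

```python
# Conceptual sketch of field separation in 60i video. Even rows form the
# top field, odd rows the bottom field; "parity" decides which field was
# scanned first. Wrong parity swaps each pair (2:1 instead of 1:2).

def separate_fields(frame_rows, top_field_first=True):
    top = frame_rows[0::2]      # even rows (top field)
    bottom = frame_rows[1::2]   # odd rows (bottom field)
    return (top, bottom) if top_field_first else (bottom, top)

# Rows tagged with the instant they were scanned (t0 first, t1 second),
# from a frame whose top field was scanned first:
frame = ["t0", "t1", "t0", "t1"]

print(separate_fields(frame, top_field_first=True))   # correct 1:2 order
print(separate_fields(frame, top_field_first=False))  # swapped 2:1 order
```

With the wrong parity the t1 field comes out before t0, so motion appears to step backward between fields, which is what a filter like ComplementParity corrects.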

I added a link to the m2t files in the Notes section, please check them out if you get a chance. If you come up with better avisynth stuff, let me know; everything I know about avisynth/vdub (almost nothing) I had to figure out on the fly yesterday just to get at the data.

It’s a best effort thing, I’m sure you know. The camera can never be completely accurate, nor can our inputs themselves ever be completely accurate while playing, as we are not synced at 60Hz with the console either.

Edit: If it is the synchronization between the camera capturing the images (and the screen refreshing, and the console) that you are concerned with, you can rest assured that the entire thing was performed in utter reckless chaos and they were always coming in and out of sync. Between each test the camera was stopped, the game stopped, monitor changed resolutions, etc. At some point there is not much else you can do without specialized tools.

Also the camera’s idea of “60Hz” and the monitor and console’s 60Hz are separate. When you take these kinds of videos you can see them drifting into and out of sync as the movement on screen is sometimes well-exposed in each frame, then gradually drifts to partially exposed and back. It doesn’t make the tests useless, you just need to know what to look for to help you zero in on an accurate estimate.
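To picture that drift, here's a quick back-of-the-envelope in Python with made-up rates (real camera and monitor clocks differ; this is just the idea): two clocks that are both "60Hz" but off by a hair beat against each other, which is why the exposure alignment cycles in and out of sync.

```python
# Two independent "60 Hz" clocks drift against each other at their beat
# frequency. Rates below are illustrative, not measured from the tests.
camera_hz = 59.97
monitor_hz = 60.00

beat_hz = abs(camera_hz - monitor_hz)   # how often alignment repeats
drift_cycle_s = 1.0 / beat_hz           # seconds per full in/out-of-sync cycle

print(f"Drift cycle: {drift_cycle_s:.1f} s")
```

Even a 0.03 Hz mismatch means the camera slides through every possible alignment with the monitor over about half a minute, which matches the "well-exposed, then partially exposed, then back" pattern described above.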

Can you perform the same combos on both consoles to test for slowdown?

Yes, that’s my point. That’s why it’s better to do multiple runs and average them rather than just do a single pass if you’re going to use that methodology. 2f is a pretty big practical margin for error, after all.

Check out my edit above. Didn’t see this reply beforehand.

Regardless, over many runs, it was nearly always 4f. Each test was done separately, and I stopped and restarted a lot of runs that I didn’t include here. I never had a run where I thought, Huh, maybe it’s 3 frames, or 5. But the way I think about these kinds of things, I assume I’m wrong from the outset and hope other people run similar tests to see what they come up with. One person’s tests are just that: one person’s tests, vulnerable to that same person’s mistakes.

Just so we all understand each other, how are you coming up with the 2f margin of error? I don’t think everyone understands.

I think what he means is that if you get 4f as a result, it could really be either 3 or 5. Thus the two-frame margin of error.

Someone really needs to test the arcade version to see if it has 4f.

Also, remember the PS2 version. Why did it feel “slower”? Was it due to input lag, or something else?

I don’t know if we can test for slowdown, since we don’t have a baseline for comparison (arcade). Is there a specific scenario that would satisfy anyone?

But that does not explain why and how, just what.

Because a frame is 16.67 milliseconds:

  • When you start the camera, you’re not guaranteed to start on millisecond 0. You could start anywhere from millisecond 0 to 16 and you may or may not be able to tell the difference, depending how the game renders frames.
  • If the lag is an exact whole number of frames, then there’s no variable there. But what if the lag is 9 ms? What if it’s 25 ms? How could you tell the difference between those values with this camera test? In many cases you couldn’t, and this is exacerbated by the fact that you can’t time the camera start to within 1 millisecond. This creates another frame of potential fluctuation.
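Here's a little Python sketch of that argument, with illustrative numbers (not measurements from the tests): a true lag of 9 ms and a true lag of 25 ms can each produce different whole-frame readings depending on where in the 16.67 ms frame the camera happened to start, and their readings can even overlap.

```python
# Sketch of the frame-quantization argument: the measured lag in whole
# frames depends on both the true lag (ms) and the camera's start offset
# within a frame. Lag values here are illustrative.
import math

FRAME_MS = 1000 / 60  # ~16.67 ms per frame at 60 Hz

def measured_frames(true_lag_ms, camera_offset_ms):
    # The camera only resolves lag at frame boundaries, so the reading
    # rounds up to the next whole frame.
    return math.ceil((true_lag_ms + camera_offset_ms) / FRAME_MS)

# Sweep camera start offsets across one frame (0-16 ms):
for lag in (9, 25):
    readings = {measured_frames(lag, off) for off in range(17)}
    print(f"{lag} ms true lag -> possible readings: {sorted(readings)}")
```

Running the sweep, a 9 ms lag can read as 1 or 2 frames and a 25 ms lag as 2 or 3, so a single "2 frame" reading can't distinguish them, which is the two-frame practical margin of error being described.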