That is why you start recording the 60fps video before you start the game; that way you know you did not start recording the footage of the game mid-frame.
So:
Start recording > go into training mode > run test
Though I trust RoboKrikit's results so far - he even posted gifs of when things would be out of sync.
Edit: to double-confirm the results, one should use the fast capture option of a camera in photo mode so they can manually advance frame by frame at a much faster capture rate, then compare the two platforms against each other.
If the lag was really 4.5 frames/75ms, then out of the 50 runs about half would show as 4 and half as 5. Like you said, it depends on the time that the camera is recording a frame and the monitor displays a frame. That is why you do multiple runs and take an average: to take into account that your unit of measure is 16.66ms and the lag is not necessarily an even multiple of 16.66ms.
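To make that averaging argument concrete, here is a minimal sketch in Python, assuming the true lag is a fixed 75ms (4.5 frames) and that the only randomness is where the press lands inside a 16.66ms frame window:

```python
import random

FRAME_MS = 1000 / 60       # one 60fps frame ~= 16.66 ms
TRUE_LAG_MS = 75           # hypothetical "real" lag (4.5 frames)

def measured_frames(true_lag_ms):
    # The press lands at a random point inside a frame window; the camera
    # can only report whole frames elapsed before the reaction shows.
    offset = random.uniform(0, FRAME_MS)
    return int((offset + true_lag_ms) // FRAME_MS)

runs = [measured_frames(TRUE_LAG_MS) for _ in range(50)]
print(runs.count(4), "runs read as 4f,", runs.count(5), "runs read as 5f")
print("average:", sum(runs) / len(runs), "frames")   # hovers around 4.5
```

Run it a few times and the 4f/5f split hovers around 25/25, with the average coming back near 4.5 frames.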
This has nothing to do with it. If you're capturing at 60fps, it's not going to make a difference. A millisecond is 1/1000 of a second. A frame is 1/60 of a second. And no one has the precision to hit the record button at the very beginning of the frame on command.
The 1/60 exposure covers approximately 1 full frame of time. So when you are out of sync (and you will be in and out of sync a lot if you record for even a minute), you see a fainter image of both the LED and the animated frame on screen. This is what I am showing in the sample ‘out of sync’ image. So the 1 frame window you are talking about here is mostly eliminated, and should be exposed over the course of a long enough session (or multiple sessions). The camera is never actually a perfect 60fps, so even when started in sync they drift in and out of sync. If you could start the camera at the perfect time, it would only matter if you had a camera that was in perfect sync with the other components to begin with.
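To put a rough number on that drift, here is a minimal sketch; the 59.94fps figure is only an assumed example rate for a nominally 60fps camera, not a measurement of the camera actually used:

```python
# How a "60fps" camera drifts against a 60Hz display when the rates
# don't match exactly (59.94fps is just an assumed example rate).
cam_period = 1 / 59.94       # seconds per camera frame
disp_period = 1 / 60.0       # seconds per display frame

drift = cam_period - disp_period                # ~16.7 microseconds per frame
frames_until_full_slip = disp_period / drift    # ~1000 camera frames
print(f"drift per frame: {drift * 1e6:.1f} us")
print(f"full frame of slip after ~{frames_until_full_slip * cam_period:.0f} s of recording")
```

So even a tiny rate mismatch walks the camera through every possible phase relative to the display in well under a minute of recording.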
As for the lag itself, it is unlikely the input will fall on a ‘perfect’ frame. If you manage to get your input through your PCB and into the waiting buffer for the game itself right at the end of the 1/60 window, when the buffer is collected, it will appear to come out sooner than if you had gotten it there at the beginning of the 1/60 window, right after the previous buffer had been collected. (I’m just re-explaining what you already said.)
This matters the most if you are measuring milliseconds instead of frames. If you measure milliseconds of lag, like people attempt to do with monitors, your millisecond measurements will vary drastically and will mostly be wrong, and need to be averaged over many measurements to get anything reliable.
Imagine for the sake of the conversation that your stick’s PCB is magic, and deposits the input directly into the game’s input buffer the moment you press it. When the LED lit, the camera captured that moment on camera frame 0. If it happened at the end of the game window, the camera has to wait until the next frame for any potential result to show. If it happened at the beginning of the window, it still has to wait until the next frame for any potential result to show. Since we are measuring in 1 frame units, shown by 1/60s of light captured by a camera, the exact moment of input within that frame is not as crucial as it might sound. If the LED light is strong, it probably happened in the first half of the frame, if it is weak, it probably happened in the last half, and same goes for the on-screen images, but whether it is weak or strong it is counted on that frame. There is some variable amount of exposure time that could cause the image in corner cases to be too faint to be visible in the image, but I think we are dealing in times significantly less than 1 full frame.
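Here is a minimal sketch of that last point, assuming a full 1/60 exposure and a hypothetical LED that lights at the press and stays lit:

```python
FRAME = 1 / 60   # seconds

def led_brightness_on_frame0(press_time):
    # Fraction of camera frame 0's exposure during which the LED was lit.
    # Early press -> bright LED, late press -> faint LED, but either way
    # the press is counted on frame 0 and the earliest reaction on a later frame.
    return (FRAME - press_time) / FRAME

for press_time in (0.1 * FRAME, 0.5 * FRAME, 0.9 * FRAME):
    print(f"press {press_time * 1000:5.2f} ms into frame 0 -> "
          f"LED at {led_brightness_on_frame0(press_time):.0%} brightness")
```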
Of course no stick's PCB is magic, and it takes time to get there, and we don't know which PCBs have which amount of lag, or whether that lag is variable, so we are always dealing with approximations. But I think with all of the above factored in we can get pretty close to 1 frame, especially with numerous measurements. If I had seen much variation I would have had to guess at 4-5 or 3-4 or whatever frames, but it was pretty consistent. SF4 was less consistent.
I’m thinking out loud here so I hope that is all coherent. I don’t promise it is all accurate.
That doesn’t really make sense, since there’s no guarantee you wouldn’t be halfway through a camera exposure when the game started. There is no practical way to sync them up with these kinds of tools.
If I had a 120fps camera I would use that, but I don’t. I don’t even have a 60p camera so I have to deal with interlaced fields. You can set a camera to 1/120 exposure easily, but that doesn’t make it record 120 frames per second, it just decreases the exposure of the 1 frame.
That’s funny and relevant, thanks for posting that. That shows that arcade ST is at “4” frames, but NKI is counting from 1 and I am counting from 0. I’m counting frames of lag, and NKI is counting total frames including the button press. That means that 3SOE input delay (by my results) is 1 more frame than arcade ST. Busted! Or is it? We aren’t sure yet about arcade 3S lag.
So what makes the most sense for describing input lag? Lately in Tech Talk threads we’ve been using 0-based lag, where frame 0 is the input, and frames of delay are counted after. I didn’t remember at all that the NKI vid of arcade ST was different.
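To spell out the conversion between the two conventions (these numbers just restate the results already quoted above, they are not new measurements):

```python
# NKI counts the frame containing the button press as frame 1 (1-based);
# the 3SOE tests here count it as frame 0, so frames of lag = NKI's number - 1.
nki_arcade_st = 4                      # "4" in NKI's counting
arcade_st_lag = nki_arcade_st - 1      # 3 frames of lag, 0-based
my_3soe_lag = 4                        # 3SOE result in this thread, 0-based
print(my_3soe_lag - arcade_st_lag)     # -> 1 frame more than arcade ST
```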
Re: HDR, I want to see other games tested, but I want to let this one simmer a while before starting something else. Say we have bad data here, I don’t want to spread it around to other stuff until it is sorted out.
And that might matter if the game ran on a continuous timeline. It checks the inputs every frame, so it's on a discrete timeline.
But, hey, let's educate. Let's say that the latency on the PS3 was 4.1 frames and the latency on the Xbox360 was 4.9 frames' worth of time; you're free to convert that to the appropriate milliseconds if you like, just know it's a unit of time. Since the game checks the input once per frame, the button is pressed somewhere between 0 and 1 frame of time before the input registers with the game. If that button press time (truly a random number, 0-1 frame in value) plus the inherent continuous-time latency you're expecting is between 3-4 frames, the action would display on frame 4 after the LED lit up, since it registers after frame 3 was read and processed. If the button press time plus the inherent continuous-time latency is between 4-5 frames, then it would display on frame 5 after the LED lit up.
If that were the case, then his recordings would show 1/10th of the PS3 activations on the fifth recorded frame after the button LED lit up; his fifty tests on PS3 would show roughly 5 tests at 5 frames of delay and 45 tests at 4 frames of delay, and the Xbox360 would show roughly 45 tests at 5 frames of delay and 5 tests at 4 frames of delay. Would you disagree with this? Sure, it's a pretty small sample size, so you could even realistically see numbers like 48-2 or 43-7, but it'd be damn close to the 45-5 ratio. Would you agree?
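Here is a minimal sketch of that distribution argument, modeling the observed delay as the whole number of frames elapsed once a random 0-1 frame press offset is added to the assumed true latency:

```python
import random
from collections import Counter

def observed(latency_frames):
    # Press lands at a random point 0-1 frame before the next input poll;
    # the camera only reports whole frames of delay.
    return int(random.uniform(0, 1) + latency_frames)

for console, latency in (("PS3", 4.1), ("Xbox360", 4.9)):
    counts = Counter(observed(latency) for _ in range(50))
    print(console, dict(counts))   # e.g. PS3 {4: 45, 5: 5}, Xbox360 {5: 45, 4: 5}
```

With 4.1 vs 4.9 frames you get roughly mirrored 45/5 splits; identical splits on both consoles would be the "neither one is worse" result.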
Now, I don't have the exact numbers pulled from the recorded video. I did ask for the distribution, but it may be a bit before anyone crunches them. Based on RoboKrikit's setup, if the same results are seen, say 45 hits on frame 4 and 5 hits on frame 5 (just pulling that outta my ass), on both the PS3 and Xbox360 versions, regardless of what the actual continuous latency inherent to the system is, wouldn't you agree that neither console has more or less latency than the other? I mean, maybe the inherent latency is 70ms, or 40ms. Nobody cares. The point is that one system isn't better than the other for playing. The test was strictly to compare latency between the two machines, not to determine a specific continuous-timeline latency of either system.
We're all for critique, but please back it up better than this. You're downplaying the work he's done without any suggestion of how it could be better, and brownfingering stuff and trying to pass it off as fact. There is simply no way the Xbox360 setup tested could be even 1 frame more latent than the PS3, or vice versa, much less the 2 frames you've stated.
However, the better question is how it compares to arcade. The SFAC guide says Ryu's s.jab has a 4-frame startup. Animation starts at frame 4 in his tests, and it is fully extended, for what I have to assume is the first frame with a hitbox, at frame 8. If we assume the CPS-3 does internal double buffering, that would mean 3SO has 3 frames of latency over the arcade version. Possibly 2 (if the CPS-3 does triple buffering for some unknown reason), possibly 4 (no buffering, possible but unlikely), but likely 3. I could definitely understand why arcade enthusiasts would prefer the real deal.
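Rough frame arithmetic behind that guess (the buffering counts are assumptions about the CPS-3, not measurements):

```python
jab_startup = 4                # SFAC guide: Ryu s.jab startup frames
first_hit_frame_observed = 8   # frame where the extended hit frame shows up in these tests
total_overhead = first_hit_frame_observed - jab_startup   # 4 frames of overhead vs. an ideal zero-lag system

for buffering, arcade_frames in (("no buffering", 0),
                                 ("double buffering", 1),
                                 ("triple buffering", 2)):
    print(buffering, "->", total_overhead - arcade_frames, "frames over arcade")
# double buffering (the likely case) -> 3 frames over arcade
```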
It is normal for arcade boards or any game in general to have a delay between inputs and the game taking action on the input. ST Ryu’s crouching roundhouse does not have 3 invisible startup frames before it animates. Where did you get that information?
There's the question of where it lags, since there are reports from folks with the actual arcade board saying that it doesn't lag on their setup. That means there's a possibility it's due to the setup some people are using (an arcade cab with an LCD is still an LCD, which is usually slower than a CRT).
Here, I dug up the old thread. NKI tested other games and reported their input delays when some people were skeptical about arcade ST having input delay. Keep in mind he is still using the 1-base frame numbering at this point.
Shortly after that, Laugh makes the same point I'm making now, that the input frame should be counted as frame 0, and NKI acknowledges it. Laugh also says there that lagless emulator tests show ST natively responding in 2 frames, and speculates that the arcade hardware itself or the supergun is causing 1 more frame of delay. It doesn't look like any further tests were done on arcade ST, at least in that thread.
Derek has posted here saying that they are unable to reproduce the mysterious +1-3f PS3 input lag on their stations, most likely the same stations they had the pros playtest. Yet when people play 3SOE at home on their PS3s, there are complaints of lag.
That and the information posted here just points to the problem being the setups people are using. Some combination of controllers or TVs/monitors or whatever causing problems specifically with PS3 3SOE.
While I find lag discussions interesting (it was integral to me picking out my current gaming TV), I have to wonder how many members of the community are actually sensitive enough to lag for this to matter. I mean, people were convinced the PS3 version lagged more, but these tests showed both versions are identical. I don't think everyone developed Daigo/Fuudo reactions overnight. Maybe I'm underestimating the impact.
It has been proven time and time again that even the very proest of the pros cannot accurately discern input lag of less than 3 frames just by how it looks/plays/feels.
That hurts a lot of people's feelings to say, but it's true.
Something I’ve been curious about for a while is whether or not different model PS3s have different inherent lag or something. It sounds crazy, but when I was at PowerUp, I could’ve sworn that my controllers felt less responsive on consoles with 4 USB ports than they did on consoles with 2 USB ports. I could very well have just been imagining things and it’s been so long that I can’t remember how the setups differed between stations, but I’ve heard from other sources that the USB ports on the console add some lag so I’m just wondering whether that’s ever been verified or not.
Even if that were true, it would still make a difference. For example, a 2-frame link that you do based on visuals and not muscle memory is not possible with the usual timing. I might not be able to tell if I'm having an off day or if there is input lag… but I will still drop the link more often.
So, hold up, where did the entire ‘input lag on 3SO’ come from anyways?
Let’s be real for a second here.
It seemed like some of this mess started to stir up after fubar reported that there was input lag and that other top players agreed when playing. However, you have to understand the setup it's on. Basically, it's a console setup: a PS3 going through converters and into an arcade control panel, so the signal goes through a lot of bullshit in order for something to register. I don't have the exact details on the setup, so you might want to take that with a grain of salt.
The thing is, that setup also gives us a hard time in other games as well. Normally we would play Marvel 3 on that setup, and everyone who takes the game seriously has issues with it. Our top players there always complain of input drops or certain things just not working at the right moment. I can't explain it; however, I have to agree there is definitely something wrong with the setup. We even used that same exact setup to play Super SF4 before AE was out, and sadly we had the same exact issues. We could never figure out what exactly was wrong with the setup, but if it was input lag on our end, then the situation becomes more clear.
In the end, it could just be the way we have it setup and the PS3 version could be just fine.
That, or the PS3 version may actually have lag; however, the tests and feedback from other people are starting to get a bit overwhelming.
NKI's tests are legit, but I wouldn't mind seeing this guy do a 3SO input test, if possible.
Maybe, maybe not, because you're ignoring the important variable of the camera, which is why I said it's important to run the tests in multiple passes using this methodology. And he apparently agrees, since he did that.
See above.
And in order to compare the latency between two machines you have to know that your data is accurate to a point.
“Brownfingering?” Cute. Pretty much everything in this paragraph is wrong. I’m not downplaying the work he’s done, I just asked if he ran it all in one pass. I did make a suggestion of how it could be better: run the test in multiple passes and average the results. He clarified that he did that, so it’s all good. And furthermore, I never said that there was a 2f difference between the two tests in this case, only that there was a potential up to 2f difference between the recorded results and the “actual” lag. (I put “actual” in quotes because, while 1.8 frames of lag isn’t 2 frames of lag, it would feel a hell of a lot closer to 2 frames than to 1 or 0). I was talking about the inherent flaw in the testing methodology, not that there was that much of a difference in the separate system tests.
The 2-frame potential inaccuracy comes from up to just under a frame of potential additional lag, beyond a whole frame, that may not be detectable through a single 60fps camera test, combined with a potential offset between the camera and the display of up to just under 1f that may not be reflected in the capture. As RoboKrikit mentioned above, camera/monitor sync is pretty random, which is just more reason any kind of lag test using a camera should be done in multiple passes for accuracy.
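For the arithmetic behind that worst case, a minimal sketch; the 0.999 factors just stand in for "just under one frame":

```python
FRAME_MS = 1000 / 60   # ~16.66 ms

# Two independent error sources in a single-pass camera test:
quantization_error = 0.999 * FRAME_MS   # input landed just after a poll: almost a full hidden frame
camera_offset      = 0.999 * FRAME_MS   # camera exposure starts just after the display refresh

worst_case = quantization_error + camera_offset
print(f"worst case: ~{worst_case:.1f} ms (~{worst_case / FRAME_MS:.2f} frames)")
# Averaged over many passes, with the camera drifting in and out of sync,
# the expected error drops well under a frame.
```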