Just gonna drop in and say that the reaction-time and Phi-phenomenon claims aren’t quite right. The “skipping” of frames is only one of the ways the human brain fills in gaps in our perception, and it isn’t really relevant to the latency discussion. What matters more is a kind of meta-perception that stops our brains from treating certain perceived things (like individual frames) on their own, outside the context of what’s temporally near them. Which, yes, is roughly why we don’t spot the aforementioned Aladdin frames.
Even that isn’t an entirely accurate statement, but it’s too late to get it 100% correct (and honestly, I’m not an expert; I have, however, talked at length on the topic with a grad-student friend of mine who does research on the brain, image perception, motion tracking, and gender).
I’m not trying to make a major definitive claim here, only that “the human brain can’t detect the difference between 0 and 2 frames” is a gross and inaccurate oversimplification. It absolutely can: screen-capture a mouse with software that lags by only 1–2 frames and you’ll certainly perceive the lag in the preview window next to the mouse that’s moving “in real time” (actually at the base delay of your graphics pipeline plus display). Drawing tablets with sub-two-frame latency? Not at all good enough. Microsoft showed off that crazy-low-latency research drawing surface in their labs a couple of years ago for a reason, the latest Wacoms are only a couple of frames latent, and every traditional artist I’ve talked to has complained about them.
“Perceiving” latency is a matter of the brain; it’s relative, which makes absolute statements nearly impossible. If you had a 1-frame monitor and a 2-frame monitor, mirrored and side by side, and you watched closely (say, moved a mouse quickly), your peripheral vision would be able to catch the different timings of when the mouse stopped or changed direction. (Oh man, the brain is freaking AWESOME at motion stuff, you guys. That gets into some crazy genetics from back when we were hunting, and it looks like that perception has a statistical delta giving males the advantage; women have other perceptual advantages. It’s all real fascinating.) Take it to the limit with a single-frame black-to-white-to-black flash, and the delta is even more likely to be noticed. Now, if you then had only the slower monitor hooked up, would you “notice the lag”? Yeah, maybe. But only because you just saw it slower, so your brain’s primed for it. Would you have noticed without that earlier comparison? Almost certainly not. When our brains want to pair up and associate multiple inputs (the proprioceptive input of a button press with something on screen, like a frame of an animation), there’s some leniency. Sometimes it’s a little, sometimes it’s a lot.
But yeah, TL;DR: the brain’s cool, making sweeping statements about how it works is hubris at best, but nobody out there is reliably noticing a 2-frame monitor, because that’s just silly. And even if we don’t “notice” the latency, it’s still there, which is why we measure. Latency is a measurement, and a valuable one; numbers, when properly and consistently acquired, tell everyone the same thing. Statements like “it feels fine and I’ve been playing games since forever” don’t do others much good, because different people are different. This is supposed to be a discussion of the numbers, because we’re all crazy enough to care about those; go to SlickDeals if you just wanna recommend people decent deals on monitors that feel good to you. They’ll probably complain about your lack of numbers too, but at least it won’t be in a thread whose sole purpose is said numbers.
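And since “frames” means different wall-clock times at different refresh rates, here’s a trivial sketch of the conversion people should be doing before comparing monitors (assuming a fixed refresh rate; real pipelines add scanout, pixel response, and input sampling on top of whole-frame lag):

```python
# Convert "frames of lag" into milliseconds at a given refresh rate.
# This only covers whole-frame buffering delay; scanout time, pixel
# response, and input polling add to the real end-to-end number.

def frames_to_ms(frames: float, refresh_hz: float) -> float:
    """Latency of `frames` refresh intervals at `refresh_hz` Hz."""
    return frames * 1000.0 / refresh_hz

for hz in (60, 144, 240):
    print(f"{hz:>3} Hz: 1 frame = {frames_to_ms(1, hz):5.2f} ms, "
          f"2 frames = {frames_to_ms(2, hz):5.2f} ms")
```

Point being: “2 frames” on a 240 Hz panel is under 9 ms, while on a 60 Hz panel it’s over 33 ms, so frame counts alone aren’t comparable across monitors.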
</walloftext>