Seriously, you can’t be arsed to read a few articles by experts? What do the dates have to do with it? If I write an article about basic snap-action microswitches and 10 years later microswitches are still the same (actually, the ones we use in arcade sticks have been around since the 80s at the very least), is it suddenly not relevant because of the date?
You don’t have time to read, but you have time to sit on a forum and argue about things you don’t understand?
Oh, but hey, it’s not like that article was written by an expert in his field or anything, nah, no way he could be a veteran programmer who knows his shit. I mean, it’s not like he worked on twitch-based arcade racing games or anything like that… -_- How much programming experience do you have? How many games have you programmed to run at bare-minimum input lag?
Hilariously, you’re wrong. The HDTV standard (you know, what people use to play PS3 games on…) has a native refresh rate of 60 Hz, and the camera records at 60 fps. He counts each game frame twice because two monitor refreshes line up with 1 frame of a game running at 30 fps. You can then convert the lag into time using the 1/60th of a second refresh period, 16.7 ms, since the screen is redrawn every 16.7 ms regardless of the game’s frame rate being lower.
This is pretty much a standard. 12 frames of lag at 60 fps = 200.4 ms, and 6 frames of lag at 30 fps still equals 200.4 ms… 1 frame at 30 fps does not equal 1 frame at 60 fps. Learn to do some math please. Learn about HDTV standards please. Read and listen to experts in their fields, please.
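If the frames-to-milliseconds conversion isn’t clicking, here’s a throwaway sketch (plain Python, just the numbers from above) showing why a frame of lag at 30 fps is worth two at 60 fps:

```python
# Convert "frames of lag" into milliseconds for a given frame rate.
def lag_ms(frames_of_lag, fps):
    frame_time_ms = 1000.0 / fps  # duration of one frame/refresh in ms
    return frames_of_lag * frame_time_ms

print(lag_ms(12, 60))  # 12 frames of lag at 60 fps -> ~200 ms
print(lag_ms(6, 30))   # 6 frames of lag at 30 fps  -> ~200 ms, same real time
print(lag_ms(4, 60))   # 4 frames at the 60 Hz arcade standard -> ~66.7 ms
```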
Where did I ever write wonky hardware? Please quote me.
Again, what are the controls for those games? Are they vsynced to the refresh rate? Are they unsynced? If unsynced, then as I’ve already written, you can render many more frames per second. Many FPS tournaments are set up that way: high-refresh monitors, unsynced and uncapped FPS, and very low resolutions to avoid taxing the GPU, for the fastest twitch input times possible.
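Rough back-of-the-envelope for why those uncapped setups help (a Python sketch with made-up frame rates, ignoring scanout time and controller polling): with vsync off, the frame the monitor grabs at each refresh was finished at most one render-frame ago, not one refresh ago.

```python
# With vsync off, the newest completed frame at each refresh is at most
# one render-frame old (tearing aside), so a higher render FPS means a
# fresher frame hits the screen.
def max_frame_age_ms(render_fps):
    return 1000.0 / render_fps

for fps in (60, 144, 300):
    print(f"{fps} fps uncapped: frame is at most {max_frame_age_ms(fps):.1f} ms old at scanout")
```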
Whether you like it or not, FPS is tied to input latency. Why do you think input latency jumps to 133 ms when 60 fps games dip down to 30 fps? Why have things like triple buffering been created to avoid that, at the cost of a little extra latency? Why did companies like nVidia come up with G-Sync so they can keep input latency in check with no screen tearing even when a game dips below its target framerate, as an improvement over vsync + triple buffering? Why are there options in my nVidia driver control panel for shit like adaptive vsync, which keeps vsync on until the GPU is being stressed, then drops it and allows tearing to avoid input latency spikes? Oh no, you’re not telling me all this stuff exists just for shits and giggles and has nothing to do with input latency??? I mean, why would any company spend $$$ on R&D for such things when we have experts like you!
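For anyone who still thinks that’s a coincidence, here’s a bare-bones game loop sketch (Python, hypothetical, not any real engine’s code) showing why frame rate and input latency are joined at the hip: input gets sampled once per rendered frame, so every frame the pipeline sits on is one frame-time of delay.

```python
import time

FRAME_TIME = 1.0 / 60.0  # vsync target: one refresh = 16.7 ms

def read_input():
    # placeholder: pretend we poll the pad/stick here
    return {}

def update(state, inputs):
    return state

def render(state):
    pass  # placeholder: submit the frame to be shown on a later refresh

def game_loop(num_frames=600):
    state = {}
    for _ in range(num_frames):
        start = time.perf_counter()
        inputs = read_input()         # input is sampled ONCE per frame
        state = update(state, inputs)
        render(state)                 # frame is queued; vsync displays it next refresh
        # If the frame misses the 16.7 ms budget, a vsynced double-buffered
        # game waits for the NEXT refresh, so the effective frame time (and
        # the input-to-screen delay) jumps from 16.7 ms to 33.3 ms. That is
        # the 60-to-30 fps latency cliff that triple buffering, G-Sync and
        # adaptive vsync all exist to soften.
        elapsed = time.perf_counter() - start
        time.sleep(max(0.0, FRAME_TIME - elapsed))
```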
Look at nearly every single 30 fps input latency test. They’re usually up around 133 ms unless massive amounts of post-processing create further latency. Look at nearly every 60 fps input latency test. Usually down to 66 ms. There are literally years of data and tests to back these up. Since things haven’t changed and we’re still using tried and true methods like vsync, or no vsync with uncapped FPS, these tests and data are still valid. Old school arcade games updated at 60 Hz, and we have some good people in the ST community and TT community who have tested actual, real, not-emulated-by-MAME arcade hardware at 4 frames of input lag, or, at the 60 Hz standard those games were made to update at and which their CRT monitors ran at, about 66 ms of latency.
Nah, but you know better. You’re the expert. You have tests and data and experts in the field backing you up. Oh wait