I’m working on a new joystick controller. It already works over USB as a normal HID device, with the minimum 8ms of lag that allows. With assistance from the author of vJoy, I have found a way to reduce the lag to 1ms: I use interrupt endpoints with a 1000Hz update rate (versus 125Hz for standard HID polling) plus a special software driver for Windows. A similar driver could be created for Linux/MacOS, but games consoles with USB will never be supported.
I’m not a hardcore player myself. I mostly play retro stuff and find that a 1000Hz update rate helps in some real machines and emulators… Or maybe it’s just in my mind. What I want to know is whether it’s worth developing this hack into a more user-friendly app for Windows, or whether people are happy with 125Hz updates.
FPS players seem to like “overclocking” their mice to get faster updates, but again it isn’t clear if it really makes much difference.
I hate to be that guy, but I’m going to be that guy.
Then my first advice to you as a player would be “do not worry about it”. We can all debate the finer points of lag until we’re both angry with each other.
I think the majority of “lag” in controllers is psychosomatic. Most of these timings are faster than human perception can pick up.
Yes, there are people who can see and identify objects shown for only a single frame at 60fps; there are even fighter jet pilots who can identify aircraft that appear on a single frame of 240fps video. But the actual response time to these quick blips is another thing entirely.
Controllers that have terrible latency scores do not decide the outcome of a match. There have been top-level players at Evo who won despite using some of the laggiest controllers around.
As far as I’m concerned, high polling rates only matter for mice, because they affect the accuracy of the optical sensor rather than the response time (even ball mice use optical encoders with rotary wheels).
Keyboards are another matter, but unless you’re a competitive RTS or MOBA player (not fighting games) or a professional typist, you won’t see the effects.
As far as most games go, detections and calculations happen on a per-frame basis; as long as your inputs are registered within that 16.66ms window, you are fine. Performance is not truly hampered until we get into a frame or more of lag. For example, I find that certain jumps in Mega Man games become impossible with 2 or more frames of input lag, whereas sub-frame lag is not something I notice.
Yes, the faster the controller and the lower the input latency, the faster your inputs register. But does it really affect gameplay? It depends: more than 16.66ms, yes; less than 8ms, no.
If you want to do this for science, sure, do it for science. If you’re just doing this for a better game, quit worrying and play some video games already.
I used to think this, until I realized one detail: time is continuous. There is no guarantee that your 8ms of controller input lag will let the input arrive in time for the next window (e.g., you could be 4 milliseconds away from the threshold), and if it doesn’t, your input will arrive one frame later than was physically possible. The controller PCB and the game are not synchronized (when you press a button, you can be at any point along the frame window)… You can argue one-frame windows are generally not that important, I suppose…
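To put rough numbers on this, here is a quick Monte Carlo sketch. This is my own toy model, not anyone’s actual firmware: a press lands at a random point in the frame, the controller reports it at its next poll (with a random phase relative to the frame), and we count how often the report misses the frame boundary that a zero-latency controller would have made.

```python
import math
import random

FRAME = 1000 / 60  # ms per frame at 60 fps

def report_time(press, period, phase):
    """Time of the first poll at or after the press (poll clock has random phase)."""
    k = math.ceil((press - phase) / period)
    return phase + k * period

def slip_probability(period, trials=100_000, seed=1):
    """Fraction of presses whose report misses the frame boundary that a
    zero-latency controller would have made."""
    rng = random.Random(seed)
    slipped = 0
    for _ in range(trials):
        press = rng.uniform(0, FRAME)    # press anywhere within a frame
        phase = rng.uniform(0, period)   # controller not synced to the game
        if report_time(press, period, phase) > FRAME:
            slipped += 1
    return slipped / trials
```

Under these assumptions, an 8ms poll pushes roughly a quarter of presses into the next frame, while a 1ms poll only loses about 3%. The exact figures depend on the model, but the direction of the effect holds.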
Thanks, you guys have basically summed up my thoughts on the issue.
The engineer part of me thinks that human reactions are at best in the 150-200ms range. Then I sit down and play some retro games on real hardware and it seems to make a difference. Older machines had real zero lag, switches connected to the CPU bus, and games like PacMan seem to benefit from a faster update rate.
One issue I think could be exacerbating this is what ShinMagus alludes to. On HID you get to send one update every 8ms. You can do it one of two ways:
Update -> read inputs -> wait 7.9ms for next update
Update -> wait 7.9ms -> read inputs -> update
In the first example it’s possible that you enter a command after the “read inputs” stage and it takes anywhere from 7.9 to 16ms to arrive. 16ms is one frame…
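Those two schemes can be sketched numerically. This is my own model with illustrative numbers (8ms polls, the input read happening 0.1ms after the report in scheme 1, or 0.1ms before it in scheme 2), not any particular firmware:

```python
import math
import random

POLL = 8.0  # ms; USB full-speed HID interrupt interval

def latency(press, read_offset):
    """ms from button press to the first USB report that reflects it.

    Reports leave at t = POLL, 2*POLL, ...; the state in the report at
    t = k*POLL was sampled at t = (k-1)*POLL + read_offset.
    read_offset near 0    -> read just after reporting (scheme 1).
    read_offset near POLL -> read just before reporting (scheme 2).
    """
    k = math.floor((press - read_offset) / POLL) + 1  # first sample >= press
    sample = k * POLL + read_offset
    return (sample + (POLL - read_offset)) - press    # next report slot

rng = random.Random(0)
presses = [rng.uniform(0, 1000) for _ in range(10_000)]
worst_scheme1 = max(latency(p, 0.1) for p in presses)  # read right after report
worst_scheme2 = max(latency(p, 7.9) for p in presses)  # read right before report
# worst_scheme1 approaches ~16 ms (one full frame); worst_scheme2 stays near ~8 ms
```

So scheme 2 halves the worst case simply by moving the read to the end of the cycle.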
Then again, modern games like SF5 introduce input lag by design. So maybe it only really matters with older games. I feel like MAME benefits too, but it’s hard to say.
I partially agree. Time, real actual time, is continuous. Computer logic, on the other hand, operates in cycles.
How any of this works depends on how the game engine is built. Is the game checking for inputs every CPU cycle? (Unlikely, as that would create major slowdown.)
Most likely the game engine is checking every frame, or sometimes once every 5 frames as with enemy AI (or on some other metric, such as an arbitrary amount of time).
For example, an NES gamepad uses a 4021 parallel-to-serial shift register as its encoder. Its latency is in the nanosecond range, but its inputs are only going to register as fast as the CPU allows.
The NES CPU, a 6502 derivative, runs at 1.79 MHz (NTSC version), with instructions taking a couple of clock cycles or more. I doubt the gamepad is checked per cycle or per instruction.
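The point is that the pad hardware is passive: latency comes from when the game chooses to strobe it, usually once per frame in the NMI routine. A toy model of the read sequence (button ordering per the standard NES pad; this is an illustration, not emulator code):

```python
class ShiftRegister4021:
    """Toy model of the 4021 shift register inside an NES pad."""
    ORDER = ["A", "B", "Select", "Start", "Up", "Down", "Left", "Right"]

    def __init__(self):
        self.bits = [0] * 8

    def latch(self, pressed):
        """Strobe (write to $4016): snapshot all eight buttons at once."""
        self.bits = [1 if name in pressed else 0 for name in self.ORDER]

    def clock(self):
        """Each read of $4016 shifts out one button bit, A first."""
        bit = self.bits.pop(0)
        self.bits.append(1)  # after the 8th bit the real part returns 1s
        return bit

def read_pad(pad, pressed):
    """What a game's once-per-frame input routine effectively does."""
    pad.latch(pressed)
    return {name: pad.clock() for name in ShiftRegister4021.ORDER}
```

However fast the shift register itself is, the snapshot only happens when `latch` is called, so the effective polling rate is the game’s, not the hardware’s.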
And the hit detection and controller input checks of most fighting games happen on a per-frame basis. It’s near impossible to get the timing of the frame rate right, as anyone who has tried to video record a CRT screen will tell you. So a 1000Hz (a thousand times a second) polling rate means nothing to a game engine that at most is going to check once per frame (sixty times per second).
I know a terribly laggy input will affect gameplay; just try to play Mega Man 2 on a display that has 2 or more frames of input lag. It becomes impossible to time the jumps for those platforms.
But as we get into the 0 to 10ms range for a controller, the differences become less and less apparent to the player unless you look for and use certain visual cues, such as the hit method for testing arcade controller lag, or the less accurate Manual Lag Test in the 240p test suite.
The issue I have with that test suite is that it is great for measuring display lag, but not so much for measuring input lag because the human pressing the button will anticipate the movement of the sprite.
I think it’s at least worth trying to do the timing like I said, because then you will always get a maximum of 1/2 frame of input lag, whereas doing it the simple way, which I see a lot of controller firmware do, can create up to 1 frame of lag.
My usual explanation for lag is: if your stick lags 8ms and you’re up against someone whose stick lags 1ms, then ~50% of the time when you both push the same button, you will lose to the faster stick.
This looks more like polling rate, though; I threw together a table to try to account for it. If the polling rate is 8ms, ~25% of the time the stick polling at 8ms will lose to a stick polling at 1ms when you push the button at the same time.
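Modeling that directly: my assumptions are that both players press at the same instant, each stick reports at its next poll (random phase), and the game samples input once per frame at a random phase. A quick simulation lands around 20%, in the same ballpark as the table’s ~25%:

```python
import random

FRAME = 1000 / 60  # ms per frame at 60 fps

def loss_probability(period_slow=8.0, period_fast=1.0, trials=200_000, seed=2):
    """Both players press at the same instant; each stick's report goes
    out at its next poll. The slower stick 'loses' a frame when the
    game's input sample lands between the two reports."""
    rng = random.Random(seed)
    losses = 0
    for _ in range(trials):
        t_fast = rng.uniform(0, period_fast)  # delay until fast stick reports
        t_slow = rng.uniform(0, period_slow)  # delay until slow stick reports
        sample = rng.uniform(0, FRAME)        # when the game reads input
        # fast report made this frame's sample, slow one did not
        if t_fast <= sample < t_slow:
            losses += 1
    return losses / trials
```

With two identical 1ms sticks the same model gives about 1%, which is the baseline noise from the polls simply being unsynchronized.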
https://twitter.com/noodalls/status/828342073170419712 - you can see here the range of inputs occurring from millisecond 1 to millisecond 16 of the frame, with the inputs being read at milliseconds 8 and 16 (i.e. polling twice per frame). “Next” indicates the input has come too late and will therefore be picked up on the next frame.
I think consistency is far more important when you’re talking about minuscule amounts of time (as in less than a frame at 60fps). So if you’re used to 8ms and something is at 14ms, it may throw you off a bit when doing very tight links. If you’re using the same stick all the time, variances in the monitors/TVs you’re playing on should be much more of a concern.
I’ve been experimenting with MAME a bit more. Some games definitely feel different with the 1ms mode. It’s more like playing on a Supergun. Depends on the game… I couldn’t tell with SF but I don’t play it much. Final Fight and PacMan seemed to be affected.
My monitor is pretty good for lag (1 frame) so I don’t think it’s unreasonable to say that 1/2 frame of additional lag might make a difference.
noodalls: You are correct there, but as I say, it’s actually possible to get up to 16ms of delay on an input if the controller isn’t well designed. If it reads inputs immediately after a poll and you hit the button immediately after that, the next poll will get the reading without the button press; only then is your input read, and it has to wait another 8ms before being reported.
Emulation will never compare to the real thing, and Mame is also far, far from the best at emulation. Mame is just one of the older emus, and it emulates games from a huge range of arcade hardware.
Mame is just good for handling a bunch of different hardware configs. Rather than actually synthesizing the arcade hardware, Mame does a high-level conversion to let the game’s software just run in Windows. Low-level emulation simulates all the hardware in code; it gives better compatibility with all roms for that particular hardware, but it takes up much more system resources (for example, BSNES is more demanding on system resources than Crysis).
The problem with this is that the software is molded to respond per game, and earlier versions were made to run bad rom dumps (not actual good roms).
As rom dumps got better, the good non-corrupt roms didn’t work, so the emu had to be rewritten to run non-corrupted romsets. Issues with roms are tackled per rom, and the programmers try to tackle the more popular roms first rather than make sure everything works as it should. Fortunately for Mame, the dev team is a wide and assorted group that values diversity. Often BIOS files from various hardware had to be included to run games from a particular hardware set.
For CPS 2, CPS 3, Neo Geo, etc., FinalBurn Alpha is a much better emulator.
You also have to factor in that some lag can come from the emu, not the controller or display.
I believe Toodles’ Cthulhu is the only controller where you can intentionally change polling rates via firmware updates. This allowed either 1ms or 10ms polling rates, which has a significant impact on the input lag of the PCB.
@kuro68k I say go for it. Sounds like you have a working proof-of-concept and some anecdotal evidence that it improves the feel in certain situations. Will there be interest from competitive fighting gamers? Probably not, since PCs running custom drivers aren’t really tournament standard.
Would it help? I think what DarkSakul says is true: 7ms is nearly instantaneous to human perception. But that just isn’t relevant in an absolute sense. Latency is additive, and reducing latency anywhere marginally improves things. Relatively speaking, is 7ms a large part of the total latency, between display latency, human latency, and maybe network lag? Heck no. But guess what? You have the tech to improve part of that for everyone who uses your software.
As an aside: by far the biggest block of latency is between the monitor and controller (it’s you!). Raw reactions are hard to improve, but you can improve your ability to react situationally by anticipating outcomes. For example, to learn how to confirm Ken’s CA off of cr.mk I learned (1) to space cr.mk to hit at the tip (2) get in the habit of buffering the CA motion (3) learn where on the screen to actively look for the hit spark (4) use negative edge to reduce finger delays.
Anyway, I know some game developers who are generally interested in low-latency gaming. I’ll link them your work and maybe something good will come of it!
Different sources of lag add up. The fact that network lag is significant doesn’t take away the importance of other sources of lag. USF4 on the PC without V-Sync online can be more responsive than on the PS3 offline (!).
Reactions are secondary to game responsiveness on this subject… I always link people to a certain article on Gamasutra when they start talking about reaction time as if input lag isn’t much of an issue just because of it.
Magus: Yeah: responsiveness is good. You said that lag adds up – I said latency adds up. You are right that reactions and responsiveness are different beasts, and I may have muddied up my point by including that paragraph. Just to be clear: I think that shaving millis off of input latency would be a good thing.
Yes, latency adds up, that’s key. But it’s actually slightly worse than that I think, in that overall constant latency like you get from a monitor/TV you can deal with. You get a feel for it, it never changes. But variable lag is a real pain. I think the two main sources of variable lag are the controller/USB and emulators.
Even if I don’t bother building the 1ms Windows update rate system into something more usable than a CLI app, I think it’s a worthwhile experiment and will definitely help with the polling rate when on USB.