You can’t, and that’s the reason you can only be so accurate with the camera method. The 25ms I used as an example was a given. I gave you the answer to illustrate the dissonance presented by trying to solve for it.
That doesn’t make any sense. The question you’re trying to solve in any testing is how much latency is there between when something happens and when it’s shown on my screen? The reason you do this is so you stay outside of whatever your personal lag tolerance is. How would it be ideal to be less accurate?
For example, I can’t say for fighters because I don’t have a good standard for comparison, but in rhythm games like IIDX I have timing down to around 8-10 ms (around the Just Great timing window). With 10 ms lag I could adjust while still staying relatively on beat…that’s a little over half a frame. If it was twice that, it would be approaching unplayable at the level I play. If it was three times that, it would be unplayable. So the fact that any of the above scenarios could be reported simply as “1 frame” is a problem.
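To put those numbers in frame terms, here’s a quick sketch (my own illustration, just assuming a 60 fps display, so one frame is about 16.67 ms):

```python
# Convert milliseconds of lag into (fractional) 60 fps frames.
FRAME_MS = 1000 / 60  # one frame at 60 fps ≈ 16.67 ms

def ms_to_frames(ms: float) -> float:
    """Express a lag time as a fraction of 60 fps frames."""
    return ms / FRAME_MS

print(f"10 ms = {ms_to_frames(10):.2f} frames")  # 0.60 — a little over half a frame
print(f"25 ms = {ms_to_frames(25):.2f} frames")  # 1.50 — between one and two frames
```

Rounding either of those to a whole frame count is exactly the loss of precision being complained about here.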
Alright, alright, is there a definitive list of displays that are safe for either a ps3 or xbox with use of street fighter or tekken? Because so far I’ve had a real fuck of a time trying to get any solid information. If there is no list, can we start one?
I say we make a new thread with a list of lagless HDTVs and monitors as well as a list with close to lagless ones.
We also need to identify official testers, because it’s hard to believe someone who, three pages back, posted five consecutive posts testing random HDTVs that are all apparently lagless :rolleyes:
And none of that has to do with identifying degrees of lag. We’re not talking about frame advantage or strategies in this thread. We’re talking about video input lag and how to avoid it.
The thing is, you keep insisting on using milliseconds for measuring lag.
There is NO test, tool, etc. that will allow you to accurately measure lag in milliseconds.
Like I said…how are you getting your numbers?
At least when using the 60fps method I can confirm visually and count the frames of delay.
Even if you argue that input lag doesn’t always happen in strict whole-frame increments, you can always count the actual frame before the hit frame occurs and match it against the SF4 frame data.
Seriously, try doing the test yourself and you’ll understand.
There are so few tested TVs out there and so few people testing them that we need all the info we can get. If they only have the Rock Band test, I wanna know. If someone has the equipment to do the stopwatch test, I wanna know the results. If someone has the equipment to do the 60fps test, I wanna know, for all the games and consoles they can test.
Do we really have so great a body of evidence out there that we can afford to turn up our noses at tests that aren’t exactly to our liking? Just outline the strengths and weaknesses of each testing method and post the results.
Because microseconds are too small to reasonably measure. :lol: You say this like it’s a problem. Just because you don’t know how to do it doesn’t mean it wouldn’t be better if you did.
Okay…and?
I don’t know what numbers you’re talking about. If you’re referring to the 25 ms lag I used in my above example, I chose that because it was halfway between one frame and another. If you’re referring to me saying I’m sensitive to 8-10 ms lag, I just explained that it comes from years of playing timing-sensitive rhythm games like Beatmania IIDX, where the tightest accuracy window is between 10 and 15 ms, as well as a few other things.
And as I’ve already demonstrated, you could be as much as a frame off even by doing that. If you’re a frame off during the animation, what makes you think you’d be any less off before the animation?
I’m not saying it’s a bad method…it’s about as good as we have currently, but it’s by no means perfect.
I’m not the one not understanding here.
Exactly. The goal should be to eliminate variables as much as possible to achieve more accurate, reproducible results.
I’m buying this: the ASUS VH236H. But does it have an option for speakers or headphones to be connected? If so, how do I go about that with a PS3 and HDMI?
Or should I go with one of the following for my parents and me:
LG 42LH40: 1080p, 120Hz, 2.7 ms response time
Samsung LN40B630: 1080p, 120Hz, 4 ms response time with a game mode
or
Panasonic TC-P42S1: 1080p, 600Hz, and it’s plasma, so no need to worry about lag, right?
I am going to connect it with my PS3…
Thanks in advance guys.
Or what’s the best 32 inch HDTV to get for the PS3 that is 1080p? Just wondering because I am going to use it as a Computer Monitor as well.
Yet you did. Are you saying I can’t feel 10 ms of lag when a 15 ms window effectively becomes a 5 ms window? Because you would be very wrong.
You can’t be serious.
For God’s sake, stop acting like a baby. This is pathetic. Nobody said you don’t know anything or that you’re always wrong, so quit with your juvenile victim act.
Please read the first post. It’s not up to date, but most of your questions are answered there, most notably:
response time has nothing to do with input lag
plasma, DLP, CRT, LCD all lag. The type doesn’t matter; the implementation does.
We can’t really tell you anything about those specific models (outside of the “evo monitor”) because we don’t have them, and there aren’t archived tests by model or anything that I know of.
Basically, Warpticon’s argument is that the camera test can be off by up to a frame. Wouldn’t the answer to this be the same as for clone stopwatch tests? Do the test multiple times and average it out across frames? Pretty sure that’s how clone-testing websites and the RB2 auto test get their specific ms values…just by averaging. Unfortunately, in this case you can’t just re-check the same rolling footage, as the camera should keep the same sync with the display the whole time.
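A minimal sketch of that averaging idea (my own illustration, assuming each trial yields a whole-frame lag count at 60 fps):

```python
# Average several whole-frame camera-test readings into one ms estimate.
FRAME_MS = 1000 / 60  # one frame at 60 fps ≈ 16.67 ms

def estimate_lag_ms(frame_counts: list) -> float:
    """Sub-frame ms estimate from many quantized single-trial readings."""
    return sum(frame_counts) * FRAME_MS / len(frame_counts)

# A true lag halfway between 1 and 2 frames should read as 1 about as
# often as it reads as 2 across independent trials:
trials = [1, 2, 1, 2, 2, 1, 2, 1, 2, 1]
print(f"{estimate_lag_ms(trials):.1f} ms")  # 25.0 ms
```

This only works if the start of each trial is uncorrelated with the camera’s frame clock, which is the catch about the camera staying in sync with the display for the whole recording.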
Anyone know how to split a VGA connection? XD
The best way would be to clone a 360 running the game from an LCD to a CRT monitor and film both of them. While the camera might be up to a frame out of sync, it would be equally out of sync for both. Doing this type of thing would be severely limiting, though, as it only works with 360 VGA and is much harder and more specific to test…which returns us to Shinsho’s main point here: establishing a testing method that is acceptable and practical.
Camera tests could be wrong, but when you use the same camera to record an SD television and then record an HD one and compare them frame by frame, you should get a very accurate result for how many frames of lag there are.
So if the camera is lagging, it will lag on the SD TV as well. It doesn’t really matter; you’re effectively going to get the frame delay of the HD display relative to the SD one.
Yes, the HDFury2 did not disable the post-processing on my Sony KDL40V4100; the Rock Band 2 automatic test gave me roughly the same numbers. I know Rock Band 2 is far from perfect for definitive answers, but I don’t have a 60fps camera.
1.) Hook a computer up to a CRT TV via Composite or S-Video, and then to the HDTV you would like to test via HDMI, DVI, VGA or Component.
2.) Enable both displays in nVidia Control Panel or ATi Catalyst Control Center, and make sure it’s set to Dualview/Mirror Display.
3.) Run this executable (no viruses or spyware, I promise) and make sure you can see both displays at the same time. It’ll run a small 640x480 Direct3D window featuring a stopwatch in [hh:mm:ss.msec] format so you can precisely measure lag.
4.) Get a digital camera (SLR is preferable here) and take a picture of both screens at once at the fastest shutter speed that you possibly can given the lighting conditions, so that you can see the numbers on both screens in the photo clearly.
5.) If both [.msec] numbers are very close (or even the same!), then congratulations, your TV has practically zero lag. If the difference between the two numbers is over say, .25 or so, then you’re going to start to have problems.
Hope this helps some people out. I did not make the application, but I know who did and I’m sure that he would have no qualms about freely distributing it.
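For what it’s worth, the arithmetic in step 5 comes down to subtracting the two readings; here’s a hypothetical sketch (the example timestamps are made up):

```python
# Lag of the HDTV relative to the (effectively lagless) CRT, computed from
# the [ss.msec] portion of the two stopwatch readings captured in one photo.
def lag_ms(crt_seconds: float, hdtv_seconds: float) -> float:
    """The HDTV's displayed time trails the CRT's by the display lag."""
    return (crt_seconds - hdtv_seconds) * 1000

# e.g. the CRT shows 12.415 while the HDTV still shows 12.390:
print(f"{lag_ms(12.415, 12.390):.1f} ms")  # 25.0 ms of display lag
```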
It’s not about the camera lagging, per se. It’s going to record at a steady 60 FPS. Again, the two issues are:
There’s only about a 6% chance of starting the action in sync with the camera. The other ~94% of the time there’s going to be a desync, which can distort the data.
Not all lag (probably most of it, in fact) comes in even one-frame, 16.67 ms increments, meaning that some degree of lag won’t be clearly captured by the camera method.
The two issues above have the possibility of skewing the results of a single camera test in a significant way.
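Both issues are easy to see in a quick simulation (my own sketch, not anyone’s published method; it just models a random phase offset between the action and a 60 fps camera):

```python
import random

FRAME_MS = 1000 / 60  # one frame at 60 fps ≈ 16.67 ms

def single_camera_test(true_lag_ms: float, rng: random.Random) -> int:
    """One camera trial: a random desync phase is added, then the result
    is counted in whole frames, as the camera method does."""
    phase = rng.uniform(0, FRAME_MS)  # desync between action and shutter
    return int((true_lag_ms + phase) // FRAME_MS)

rng = random.Random(0)
results = [single_camera_test(25.0, rng) for _ in range(1000)]
# A true 25 ms (1.5-frame) lag reads as 1 frame in some trials and as
# 2 frames in others; a single trial can't tell you which one is right.
print(sorted(set(results)))  # [1, 2]
```

Averaging many such trials recovers the in-between value, which is exactly why a sample size of one isn’t enough.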
Exactly. The camera test is probably the best test we have right now in terms of consistency, but it’s not strong enough to hold up with a sample size of one test.
Part of the reason the stopwatch test isn’t considered completely dependable is that signals being broadcast to different displays can sometimes cause erratic results on one or both ends. However, most of the time this has been done by cloning the video through a computer video card. I don’t know if that would change things, or whether a split cable could introduce latency. Why specifically 360 VGA, though?