Newer TVs have more delay due to post-processing of the image.
Older analog TVs have zero delay. That said, I don’t recommend gaming on a rear-projection TV; you can get burn-in way too easily.
Like any modern tech, it depends on the RPTV you are using. If it has any sort of processing, even analog hardware, it could have some latency. And if it’s an old TV, you will have a hard time finding information on it.
However, you CAN test it yourself by running a video splitter to both a CRT monitor and your TV. Then use a tool like multirefreshrate by shurcool and take a picture with a camera that captures at 60 fps. multirefreshrate shows a grid that represents each frame rendered (and renders per frame), so you can count the frames and see how far behind one display is compared to the other. It’s not as accurate as a Leo Bodnar device, but it’s probably the next most accurate method when both displays are synced to 60 Hz.
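The math behind the counting method above is simple: at 60 Hz, each frame on the grid is one sixtieth of a second. A minimal sketch of the conversion (the function name `frames_to_ms` is my own, not from multirefreshrate):

```python
# Convert a frame offset counted from a camera photo of two mirrored
# displays into an approximate lag figure. Assumes both displays are
# synced to the same refresh rate (60 Hz by default).

def frames_to_ms(frame_offset, refresh_hz=60):
    """One frame at 60 Hz lasts 1000/60 ~= 16.7 ms."""
    frame_time_ms = 1000.0 / refresh_hz
    return frame_offset * frame_time_ms

# If the TV is 2 frames behind the CRT in the photo:
print(frames_to_ms(2))  # ~33.3 ms of relative lag
```

This only gives lag relative to the CRT, which is why the method works: the CRT is treated as a zero-lag reference.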
Oops, I forgot to add: modern TVs actually have slower processing, which increases frame delay, because they use software to do the processing instead of dedicated hardware. For example, my older Sony WEGA XBR910 had dedicated hardware for minor image processing. That stuff was fast and pretty much as lagless as it gets, because dedicated hardware boards did one thing and one thing only; multiple image-processing steps required separate boards and chips. CRTs had a bunch of extra space for that kind of thing, so they could do it.
Modern TVs all have much less space to work with. Basically, anything they can flat-mount against the back of the chassis will do, but most of that space goes to local LED dimming, the voltage and power boards, and the like. So now they put ARM-based processors on the back running a minimal Linux kernel, with software that does all the video processing and other shit like running Android apps, Skype, Netflix, and so on. My plasma has a quad-core ARM in it, with 1 or 2 GB of SSD storage for apps, a built-in Skype camera, motion control, and voice recognition. All of this is mostly driven in software, which is why they’re packing these things with quad fucking core CPUs, lol. They’re basically Linux PCs built into your TV, which is soooooo dumb, but I digress; I’m in the extreme minority who doesn’t give a fuck about bells and whistles on my TV. I’d be much happier to have that quad core do minor image processing (if any) instead of running a Skype app in the background. This is one reason why modern TVs are so slow: no room for dedicated hardware video processors, so most processing is done in software. Even if there isn’t a lot of processing going on, controls like contrast and brightness may be handled in software as well.
Sure, with 2 displays mirroring each other it is easier to tell when one has lag and one doesn’t. It’s especially noticeable during mouse movement and when dragging windows around. 1 frame is not so noticeable, and 2 frames aren’t so bad. 3 frames and greater is visually noticeable (less so at 3, and it obviously gets more obvious as the frame count climbs), and that’s when you can easily see it in action with 2 displays side by side. It’s a good point, but kind of irrelevant to the question of whether you can notice 1 or 2 frames of lag during normal play. Why? Because we aren’t playing on cloned display setups. On a single display, most people will not notice a delay of up to 2 frames between pressing a button and seeing the result on screen.
Touch screens make display lag easier to see because the pointer trails your finger or stylus. You have a visual indicator showing you the difference, versus a controller in your hand that shows you nothing, plus the touch device sits right on top of the display. Motion-control tracking is another indicator, but the visual difference won’t really show up until about 3 frames (33 ms or greater), IF the motion device itself is lagless.
But for normal gaming in normal situations, where you are not moving a device in sync with your hand on a touch screen or looking at a cloned display, 2 frames of lag is no big deal. Even holding a controller up to the screen, pressing a button, and watching for the response is difficult until you hit the 3-frame threshold.
We talked about dropped frames and such because it makes a good point: extremely short time spans go mostly unnoticed by humans because of our brains and their noise and temporal filters, like mentally removing minor debris from our sight, filtering out a constant droning noise after a while, blending motion together, and all that awesome and interesting stuff. The point was that without some example like a moving finger or a cloned display to show you exactly what is lagging, our brains adapt to how we play and we won’t notice minor input lag. If they didn’t, many inconsistent things in our lives would constantly aggravate our brains and cause us issues, lol.
I liked your post a lot btw. Great stuff, and thank you for sharing.
Man, don’t read me wrong, I’m agreeing with the established people in the thread, not you. “Believe in yourself” is some crap; people are in this thread because they want numbers that are verifiable and repeatable. If you can measure with reproducible methods that don’t depend on human perception and get consistent, precise results, that’s the science we’re wanting. “I know it when I feel it” is not that. Just because I disagreed with your opposition on one point does not mean I’m agreeing with you on… anything. In a vacuum, it’s really easy to subconsciously adjust to small latency deltas, and that’s exactly what we’re trying to avoid.
I’m glad you’ve found a display you dig. Please stop trying to get this thread to accept an opinion as an objective fact.
Thanks! Yeah, cognition is awesome stuff, kinda sad I didn’t study more of it in college. Def. didn’t mean to come off as condescending or anything, I hope nobody took offense. It’s just really easy to oversimplify this stuff, since it’s complex and sometimes counter-intuitive, so I wanted to make sure that a generalization didn’t stay unchallenged. I hope that it’s clear that I’m actually agreeing with the “Yeah, it’s easy to miss a couple frames of lag” point of view, I just wanted to bring in the fact that perception of delay is all about context and stuff.
Fully agreed. We don’t really notice the lag until it’s put into context. I can game fine on a display with 4 or 5 frames of lag and not notice much of a difference (though I am for sure at the point where I’m starting to notice; it just isn’t as drastic as, say, 8 frames, when it becomes extremely noticeable), but the second I clone displays it becomes beyond obvious. Same with going back and forth from one display to another. 2 frames, though, is for sure harder to notice unless it’s side by side with a lagless display; even then it’s a bit hard to see, but within that context you can. 1 frame is pushing it, and most people won’t be able to see it even side by side. Comparing a 1-frame display to a 2-frame display is the same as lagless to 1 frame: no one will notice, even side by side. This is one reason I say just play on a 2-frame display and enjoy it, because most tournaments use the ASUS monitor, which is 1 frame or slightly less. Going from 1 to 2 is nothing, and going from 0 to 3 is noticeable, but not drastic.
“I can game fine on a display with 4 and 5 frames of lag and don’t notice too much of a difference…”
I should remember to not discuss with ego-full morons and science whores. People with 2 fingers of brains will go to a store and check for themselves like I suggested and those don’t comment here.
Unsubscribed from this thread after reading the above… lol, ah well.
I have no idea what “science whores” or “2 fingers of brains” are supposed to mean, but I do know that in a thread that is supposed to be an archive of verified 1-frame displays, saying that your TV is good because you said so is completely useless, and acting like a baby because you are told so just makes you look stupid–almost as much as you do for saying stuff like “science whores” or “2 fingers of brains.” So, have fun unsubscribing from the thread. Can’t say you’ll be missed.
Is there some unwritten rule that every couple of months some clueless poster has to stumble into this thread and tout their own perception as hard fact?
Man I am such a science whore, that I watch Ted Talk documentaries instead of porn.
I think Hedy Lamarr, co-inventor of frequency-hopping spread spectrum communication, is one of the hottest women ever.
Damn, that’s pretty good, man. Too bad it’s LCD. I really want us to move on from LCD already, and OLED is looking like garbage as well. Too bad SED never took off; it would have been nice to have a true successor to CRT.
BTW, did you measure at the top or bottom? Because of the nature of LCDs and pixel response, it has to be measured at the bottom of the display.
At the top, because the bottom is a waste of time: it takes 15–16 ms to draw the whole screen anyway, so any measurement taken down there would be inflated by 15–16 ms. Even lagless CRTs measure 15–16 ms at the bottom.
Not quite. CRTs and plasmas tend to pulse the image, which is why people read the center rather than the bottom. LCD is a little different because of the pixel response in addition to any input lag; it’s a little slower per frame (still no more than 16.7 ms), but the tech is a tad different, which is why the tradition is to read at the bottom.
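The top-vs-bottom argument comes down to scan-out timing: the display draws top to bottom over one refresh period, so a reading taken partway down the screen includes that fraction of the scan-out time on top of any actual processing lag. A rough sketch of that relationship (the function and its name are my own illustration, not an established formula from the thread):

```python
# Why measurement position matters: the display scans out top-to-bottom
# over one refresh period (~16.7 ms at 60 Hz), so a reading taken partway
# down the screen includes that fraction of the scan-out time.

def expected_reading_ms(true_input_lag_ms, screen_fraction, refresh_hz=60):
    """screen_fraction: 0.0 = top of screen, 0.5 = center, 1.0 = bottom."""
    scanout_ms = 1000.0 / refresh_hz  # full-frame scan-out time
    return true_input_lag_ms + screen_fraction * scanout_ms

# Even a zero-lag CRT measured at the bottom shows a full frame of delay:
print(round(expected_reading_ms(0, 1.0), 1))  # 16.7
# The same CRT measured at the top shows ~0 ms:
print(round(expected_reading_ms(0, 0.0), 1))  # 0.0
```

This is why comparing a top-of-screen reading on one display to a bottom-of-screen reading on another is apples to oranges.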
Thanks for your help guys. I mean it is really the only TV I have and it feels great to play on compared to other TVs I have had / played on. I suppose that is all that matters in the end.
It says it is 5 ms response time; doesn’t that mean there is sub-1-frame response time? I am still thinking the rear projection is going to be a better bet; this Best Buy TV was quite cheap.
Response time is not the measurement that you are looking for. You’re looking for input lag (aka input latency), which the vast majority of manufacturers do not advertise.
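To make the distinction concrete: response time is how fast a pixel transitions once the panel receives the frame, while input lag is how long the TV’s processing holds the frame before the panel ever sees it. The delay you actually experience is roughly their sum. A minimal sketch (the function name and the example numbers are my own illustration):

```python
# "Response time" (pixel transition speed) and "input lag" (signal
# processing delay) are separate figures; the delay a player actually
# experiences is approximately their sum.

def total_delay_ms(input_lag_ms, response_time_ms):
    return input_lag_ms + response_time_ms

# A "5 ms" panel can still lag badly if its processing adds ~2 frames
# (about 33.3 ms at 60 Hz):
print(total_delay_ms(33.3, 5))  # 38.3 ms total
```

So a fast-sounding response-time spec tells you nothing about the processing delay, which is exactly why this thread measures input lag directly.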