The New Definitive HDTV Lag FAQ

Do you have a source or are you trying to start a rumor?

I posted because I can’t find a source.

But you just “heard” it, right? Do you know what a rumor is? Good job: all it takes is one other idiot posting a similar question for the damage to be done. Let’s see how long it takes.

Damn dude, did I accidentally hit a soft spot? One simple question in a relevant forum, and you go nuts. And all you’re doing is dragging the thread off-topic.

So I just bought an ASUS VH236H, and Rock Band clocked it in at over 40ms of lag. Isn’t that considerably more than one frame of lag? I know the Rock Band calibration has its own built-in lag, but I was expecting a much better reading. Is it because I’m using HDMI? I’d think that shouldn’t be an issue.

Rock Band calibration isn’t always accurate. Nowhere near as accurate as testing against a CRT. The VH236H is already a proven monitor.
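For anyone wondering how far off 40ms actually is, the frame math is simple. A quick Python sketch (60Hz is the assumed refresh rate, and 40ms is just the reading from the post above):

```python
# Converting between frames and milliseconds of lag, assuming 60Hz.
# One frame at 60Hz lasts 1000/60, roughly 16.7 ms.
REFRESH_HZ = 60
FRAME_MS = 1000 / REFRESH_HZ

def frames_to_ms(frames):
    """Lag counted in frames (e.g. from a camera shot of the test
    display next to a CRT) converted to milliseconds."""
    return frames * FRAME_MS

def ms_to_frames(ms):
    """Lag reported in milliseconds (e.g. by a calibration screen)
    converted to frames."""
    return ms / FRAME_MS

print(frames_to_ms(1))   # ~16.7 ms: "one frame of lag"
print(ms_to_frames(40))  # ~2.4 frames, which is why 40 ms reads high
```

So a 40ms reading would be about two and a half frames, well above what a proven monitor should show, which points at the calibration method rather than the display.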

For anyone who went to the SFxT launch in LA last week, would you happen to know what monitors they were running? I saw a mix of JVC and Samsung monitors, 37" I believe; lag felt very minimal on the particular setups I played on.

Also interested to see if the current-gen Panny X and LG LD/LK are still any good. My bedroom TV is biting the dust, and I don’t really care for using the Asus VE278Q that’s right next to it, since it’s too small to play on from a few feet away. (It is a great TN panel for handling input lag, though, if you want something bigger than the Evo monitor.)

Calm down. WTF?

Evo has in fact used the Alienware monitor as well as the ASUS, so it wouldn’t surprise me if they were looking for a new monitor to use or to supplement their current stock.

I’ve just posted my Viewsonic CD4220 in the Trading Outlet. This is the monitor referenced in the first post as the benchmark for lag-free gaming.

If anyone is interested, send me a PM or email jleverant@gmail.com.

Just a question: in terms of input lag (not response time), is a native 720p screen (mostly) always going to be safe?

I don’t believe so. o_o
Input lag doesn’t only come from scaling…

Nope. If you buy blind, expect lag, because most TVs lag.

Well if I buy blind, I won’t have to worry about it, because I won’t see the lag! /rimshot

In all seriousness, I thought a good portion of the lag introduced by 1080p screens came from scaling and post-processing. If I’m hooking up through D-Sub and outputting at native 720p resolution, doesn’t that cut off most of the issues right there?

A lot of (most?) 720p TVs are really 1366x768 panels, so they are resampling almost any signal they receive.

Using VGA helps on some displays, since it is sometimes assumed that the VGA input will be used for computers, so some of the post-processing is bypassed and a different scaler might be used. But really the only way to know is to try it.
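To put numbers on the resampling point, here’s a minimal sketch, assuming the usual case described above: a 1280x720 signal going into a 1366x768 panel.

```python
# Why a 1366x768 panel can't map a 1280x720 signal pixel-for-pixel.
src_w, src_h = 1280, 720      # the actual 720p signal
panel_w, panel_h = 1366, 768  # the typical "720p" TV panel

print(panel_w / src_w)  # ~1.067: horizontal scale factor
print(panel_h / src_h)  # ~1.067: vertical scale factor

# Neither ratio is 1.0 (or any integer), so the panel has to
# resample (interpolate) every frame it receives, and that
# scaler pass is one place input lag can come from.
```

That is also why a true 1280x720 panel can skip the scaler entirely for a 720p signal, while a 1366x768 panel never can.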

I always thought 720p was a misnomer that meant native 1366x768? In other words, 1366x768 is actually native 720p (not 1280x720). In other other words, the output of a PS3/360, etc. at 720p is 1366x768… not 1280x720. Am I wrong on this?

Quoth the Wiki:
HD / High Definition (720p), 1366x768 (1049k pixels): “This display aspect ratio is among the most common in recent notebook computers and desktop monitors.”

Clarification: I believe on the PS3, games are written at 1280x720, but the PS3 outputs them at 1366x768, since that’s what displays are actually built at. So the PS3 would be handling the scaling, not the TV. Again, please correct me if I’m wrong.

Sorry, you are incorrect.

720p = 720 lines, progressive scan (typically @ 60Hz); HDTV standard resolution 1280x720
1080p = 1080 lines, progressive scan (typically @ 60Hz); HDTV standard resolution 1920x1080
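If it helps, here’s that distinction as a quick sketch. The signal values are the standard ones quoted above; the 1366x768 entry is just the panel situation described earlier in this thread.

```python
# HDTV signal formats and the pixel grids they define ("p" = progressive scan).
HDTV_SIGNALS = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
}

# The panel resolution that commonly gets *marketed* as "720p":
COMMON_720P_PANEL = (1366, 768)  # not the same grid as the 720p signal

for name, (w, h) in HDTV_SIGNALS.items():
    print(f"{name}: {w}x{h} signal")
print(f"typical '720p' TV panel: {COMMON_720P_PANEL[0]}x{COMMON_720P_PANEL[1]}")
```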

Right, I get the original intended definition, but much like “4G,” the accepted use of “720p” has become more interpretive than literal and has come to mean 1366x768, as there are no commercially manufactured 1280x720 displays. That being said, devices would likely follow suit: while the content may be written at 1280x720, the devices themselves (Blu-ray players, consoles, etc.) would handle the scaling and output 1366x768 or 1920x1080 to match the native resolution of the display. Make sense?

Er… what? I have a native 720p display sitting at home (Gateway FPD1775W). They aren’t that common for whatever reason, though. It is nice to be able to play console games at their native resolution without the extra scaling lag that occurs with standard 768p and 1080p displays.

I totally get what you are saying, but this is not the case. 720p still means 720p; the signal has not changed. Displays that are 1366x768 have to upscale. If a device can output 768p, it should say 768p or 1366x768. I think the Xbox 360 can output 1366x768 or 1360x768 over VGA, but don’t quote me on that. :)

There are a few 1280x720 displays out there, mostly computer monitors (like yours), not TVs. However, the common association/definition of “720p” is 1366x768. What consoles are you playing, and how are they hooked up?

No, that isn’t the common definition at all. The only people I’ve heard refer to 1366x768 as “720p” are you and low-end employees trying to sell TVs to ignorant customers at stores like The Source and RadioShack.

I play on 360 hooked up via HDMI. 360 games are output at 1280x720 and are either rescaled by your display (lag) or by the 360’s internal scaler (lag) to fit your display’s resolution.
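If you’re curious what that rescale actually involves, here’s a toy nearest-neighbor upscaler, purely illustrative: real console and TV scalers use better filtering, and this is not the 360’s actual method.

```python
def upscale_nearest(frame, dst_w, dst_h):
    """Toy nearest-neighbor upscale of a frame given as a
    list of rows of pixel values."""
    src_h = len(frame)
    src_w = len(frame[0])
    return [
        [frame[y * src_h // dst_h][x * src_w // dst_w]
         for x in range(dst_w)]
        for y in range(dst_h)
    ]

# A 2x3 "frame" stretched to 4x6: same idea as 1280x720 -> 1366x768,
# just small enough to print.
tiny = [[1, 2, 3],
        [4, 5, 6]]
for row in upscale_nearest(tiny, 6, 4):
    print(row)
```

Either way the work gets done somewhere, which is the point: some link in the chain has to resample the frame, and that pass is where the lag comes from.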