Strictly speaking, V.G.A. is a 640x480 progressive scan resolution, but in a looser sense a single well constructed V.G.A. cable can carry progressive scan video signals of at least 2304x1440 at a refresh rate of 80 Hz, depending on the capabilities of the source device and the display. The 2304x1440 figure assumes a P.C. outputting to a Sony GDM-FW900 or equivalent, which was arguably the best C.R.T. ever produced, considering that, if I recall correctly, it used the same tube as a Sony BVM-D24E1WU professional color grading monitor yet could accept higher resolution signals. An absolutely stunning piece of technology by 2005 standards, especially since C.R.T.s lack fixed pixels, meaning it can display resolutions all of the way down to 480p just as effectively before the signal falls out of its synchronization range.
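To put a rough number on what a cable at that resolution has to carry, here is a back-of-the-envelope sketch of the required pixel clock. The blanking fractions are my own placeholder assumptions, not a real modeline, so treat the outputs as ballpark figures:

```python
# Back-of-the-envelope pixel clock needed to carry a progressive scan mode
# over V.G.A. The blanking fractions below are assumptions (roughly
# C.V.T.-like overhead); a monitor's actual modeline will differ somewhat.

def pixel_clock_mhz(h_active, v_active, refresh_hz,
                    h_blank_frac=0.20, v_blank_frac=0.05):
    """Estimate pixel clock in MHz: total pixels per frame times refresh rate."""
    h_total = h_active * (1 + h_blank_frac)  # active width plus horizontal blanking
    v_total = v_active * (1 + v_blank_frac)  # active height plus vertical blanking
    return h_total * v_total * refresh_hz / 1e6

print(f"{pixel_clock_mhz(640, 480, 60):.0f} MHz")    # ~23 MHz, classic V.G.A.
print(f"{pixel_clock_mhz(2304, 1440, 80):.0f} MHz")  # ~334 MHz, hence the well constructed cable
```

An order of magnitude more bandwidth than classic V.G.A., which is why cable quality starts to matter at the top end.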
Something else that needs to be noted is that component Y/Pb/Pr does not necessitate interlacing. The standards support 480i, 480p, 720p and 1080i. Moreover, I think that if the stars align you can even get 1080p, since an Xbox 360 or Wii U is capable of outputting a 1080p signal through its component video. The caveat is that because 1080p is a nonstandard resolution for the format, many televisions lack support for it over component. Unfortunately, to my knowledge, most of the C.R.T. televisions which directly support progressive scan over component seem to use digital processing which introduces lag for some reason. The effect laggy processing had on playing games on H.D.T.V.s was largely a non-concern until about 2006, yet Sony halted their C.R.T. production in Japan by 2004, and the best C.R.T. they produced was arguably the 32XBR960, which was already in production in 2004.
Anyway, you seem to be under the impression that a C.R.T. does not have drawtime, but my understanding of the situation is different. My understanding is that at a refresh rate of 60 Hz, a C.R.T. draws at a rate of about one frame every 16.7 milliseconds, which would mean that by the half screen measure usually used in lag test reporting, the amount of time it takes for the electron gun to go from the top-left corner of the screen to the middle is approximately 8 milliseconds. Shockdude of ResetEra argues that this means a C.R.T. has 8.3 milliseconds of lag.
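To spell out that arithmetic, assuming the gun sweeps the active picture over essentially the whole refresh interval:

```python
# Scanout timing at 60 Hz: the electron gun paints the frame top to bottom
# over (nearly) the whole refresh interval, so a point at mid-screen is
# drawn about half a frame after the top-left corner.

refresh_hz = 60
frame_ms = 1000 / refresh_hz  # ~16.7 ms per frame
mid_screen_ms = frame_ms / 2  # ~8.3 ms from top-left to middle of screen

print(f"frame time: {frame_ms:.1f} ms, mid-screen drawtime: {mid_screen_ms:.1f} ms")
```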
Now ordinarily I would not link Shockdude's claim, because I am actually skeptical of it. It could be argued that those 8 milliseconds should not be counted in display lag figures, depending on the details of how an analogue video signal is fed into an analogue C.R.T., because according to Shockdude's sources, the definition of lag (which I also see as a more contextually sensitive word than he does, with at least three separate applicable types: overall input lag, H.D.T.V. lag and display lag) is the difference in time between video signal input into the display and visual output.
However, if we abstract the process of what is happening beyond the plastic shells of the devices, the way I see it is that using an outboard digital video processor to reprocess the signal into something an analogue C.R.T. can actually use should be no different from using the onboard digital video processor of a digital flat panel display. Either way the signal is held in a digital frame buffer before being decoded into the electrical pulses that the cathode ray tube or liquid crystal portion of the display actually uses to produce visible output.
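As a toy model of that argument, with purely illustrative numbers rather than measurements: a full frame buffered anywhere in the chain holds the picture for one refresh interval before scanout even starts, regardless of which box the buffer lives in.

```python
# Toy latency model: whether the frame buffer sits in an outboard scaler
# feeding a C.R.T. or inside a flat panel, buffering a full frame delays
# the picture by one refresh interval before scanout begins.
# All numbers are illustrative assumptions, not measurements.

FRAME_MS = 1000 / 60  # ~16.7 ms at 60 Hz

def input_to_output_ms(frames_buffered, scanout_to_middle_ms=FRAME_MS / 2):
    """Delay from signal input to the middle of the screen lighting up."""
    return frames_buffered * FRAME_MS + scanout_to_middle_ms

print(f"{input_to_output_ms(0):.1f} ms")  # direct analogue drive: ~8.3 ms
print(f"{input_to_output_ms(1):.1f} ms")  # one buffered frame, either device: ~25.0 ms
```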
When you use a Time Sleuth to measure lag on a P.V.M. Trinitron, you do not get less than 7 milliseconds worth of lag when measuring from the middle bar, as per the standard recommendation seen in the following two videos:
That means that if we assume the best L.C.D. lags by 8 milliseconds as you suggest, Tripletopper, the digital to analogue conversion process you are suggesting only saves you one millisecond after all factors are considered. Now, the fact that we are seeing less than 8 milliseconds suggests to me that Shockdude's reasoning is seriously flawed, since we should not see such a result unless the C.R.T. finishes refreshing a screen from top to bottom in less than 16 milliseconds, and thus refreshes at a faster rate than the actual frame rate.
However, the Time Sleuth can output video signals that the P.V.M. Trinitron can natively display without downscaling, and you are correct in asserting that simple signal transcoding is a very fast process. Downscaling is another matter: in the Lag Testing Retro Scalers video, Bob of RetroRGB states that GBS Control is the fastest downscaler to his knowledge, measuring a relatively simple 480p to 240p conversion as introducing 8 milliseconds of lag. When Bob makes that statement, it implies that he knows of other downscalers with even worse results.
Although Bob seems impressed with that result, it is a bad one for this particular application. It means that after subtracting the one millisecond lag difference between a C.R.T. and a T.F.T., downscaling the image with the originally suggested process actually adds a net 7 milliseconds of lag, rather than reducing it by any amount. Downscaling high definition consoles to 480i as originally suggested is not the way to go here.
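Putting the figures together makes the bookkeeping plain. The three inputs are the numbers already quoted in this thread, not new measurements:

```python
# Net effect of the originally proposed chain, using the figures above:
# the C.R.T. saves about 1 ms over the best L.C.D., but the fastest known
# downscaler (GBS Control, per Bob of RetroRGB) adds about 8 ms.

best_lcd_ms = 8        # Tripletopper's assumed best-case L.C.D. lag
crt_mid_screen_ms = 7  # Time Sleuth measurement at the middle bar
downscaler_ms = 8      # GBS Control, 480p -> 240p conversion

saved = best_lcd_ms - crt_mid_screen_ms  # ~1 ms gained by using the C.R.T.
net = downscaler_ms - saved              # ~7 ms added overall

print(f"saved by C.R.T.: {saved} ms, net change with downscaler: +{net} ms")
```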
As for transcoding a signal to a computer monitor, that could be a slight improvement. It might not be one worth pursuing in consideration of the downsides, but only you can decide which factors you value more. However, I would like to note that if you wish to do so, it is not going to be as simple as outputting everything at 720p. Video game consoles are weird, in that instead of directly rendering graphics at a given output resolution like P.C. games do, they often render at one fixed resolution then rescale the image to output another. Watch Dogs on Xbox One renders at 960p, for example, even though the console only outputs 720p, 1080p and 4K, and making matters worse, resolutions vary not only on a console by console basis, but on a game by game basis. Most games on the PlayStation 3 only really support 720p resolution, but a select few are true 1080p games. Mismatch resolutions and you will yet again probably gain lag instead of reducing it.
That does not require you to use more than one C.R.T. for any applicable resolution within its synchronization range, because C.R.T.s are not fixed pixel displays and most C.R.T. computer monitors can accept what N.E.C. called MultiSync. You'd want a pretty high end model for this use case though, so it could fully resolve letterboxed 1080p, and it's not as if you can just run down to the nearest computer store and grab one off of the shelf any-… Well actually I have difficulty casually grabbing a C.R.T. off of the shelf in any case, but I trust you understand what I mean.
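For what it's worth, the sync range question reduces to simple arithmetic on the horizontal scan rate. The 30 to 121 kHz window below is my assumption for an FW900 class monitor, so check the spec sheet of whatever model you actually find:

```python
# Whether a multisync C.R.T. monitor can lock onto a mode comes down to
# its horizontal scan rate: total lines per frame times the refresh rate.
# The sync window here is an assumed FW900-class range (30-121 kHz).

def h_scan_khz(v_total_lines, refresh_hz):
    """Horizontal scan frequency for a progressive mode, in kHz."""
    return v_total_lines * refresh_hz / 1000

def in_sync_range(v_total_lines, refresh_hz, h_min=30.0, h_max=121.0):
    return h_min <= h_scan_khz(v_total_lines, refresh_hz) <= h_max

print(h_scan_khz(525, 60))    # 480p with blanking: 31.5 kHz, almost any model
print(h_scan_khz(1125, 60))   # 1080p total lines: 67.5 kHz, needs a better monitor
print(in_sync_range(1125, 60))  # True for the assumed high end range
```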