Scanout and the Display
Alright. So depending on the game, we are now somewhere between 13ms and 58ms after our mouse was moved. The GPU has just finished rendering and swapped the finished frame to the front buffer. What happens next is called scanout: the frame is sent out the DVI-I port, over the cable, and to the monitor.
If our monitor's refresh rate is 60Hz (as is typical these days), it will take something like 16ms to send the full frame to the monitor (plus about half a millisecond of "blanking" between frames), giving us 16.67ms of transmission delay. Here we are limited by the bandwidth of DVI, HDMI, and DisplayPort and by the timing standards put forth by VESA, so sending a full frame of anything to the display adds 16.67ms of input lag. Some monitors will display this data as it is received, while others latch the input, meaning the full frame must arrive before it can be displayed (but let's not get too far ahead of ourselves). Either way, we will consider the latency of this step to be at least one frame, as the monitor still takes about 16ms to draw the image.
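As a quick sanity check, here's a minimal sketch (with the blanking interval simply folded into the refresh period, which is a simplification) of how transmission delay falls out of the refresh rate:

```python
# Back-of-the-envelope scanout timing: at a fixed refresh rate, sending one
# full frame over DVI/HDMI/DisplayPort takes roughly one refresh interval
# (active scanout plus a small blanking period, folded together here).

def scanout_time_ms(refresh_hz):
    """Approximate time to transmit one full frame, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 120):
    print(f"{hz}Hz -> ~{scanout_time_ms(hz):.2f}ms per frame")
# 60Hz  -> ~16.67ms per frame
# 120Hz -> ~8.33ms per frame (one reason high-refresh panels cut this lag)
```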
So now we need to talk about vsync. Let's pretend we aren't using it. Say our game runs at a rock-solid 60 FPS and our refresh rate is 60Hz, but the buffer swap happens halfway between vertical syncs. This means every frame being scanned out is split down the middle: the top half of the frame is an additional 16.67ms behind (for a total of 33.3ms of lag), while the bottom half, though 16.67ms newer than the top, won't have its own top half sent until the next frame, 16.67ms later.
In this particular case, the math works out such that averaging the latency of all the pixels in a split frame gives the same average latency as if we had enabled vsync. Unfortunately, when framerate is either higher or lower than refresh rate, vsync has the potential to cause tons of problems, and this equivalence no longer holds.
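To make the averaging argument concrete, here's a small sketch using the article's accounting (scanout of a full frame counts as one 16.67ms frame, and the swap lands exactly halfway between syncs):

```python
FRAME_MS = 1000.0 / 60  # one 60Hz refresh, ~16.67ms

# Tearing case: the swap lands halfway through scanout, so the top half
# of the screen shows the previous frame and the bottom half the new one.
top_half_lag = 2 * FRAME_MS   # previous frame's content: ~33.3ms
bottom_half_lag = FRAME_MS    # fresh content: ~16.67ms
torn_avg = (top_half_lag + bottom_half_lag) / 2

# Vsync case: the finished frame waits half a refresh for the next sync,
# then takes a full refresh to scan out.
vsync_avg = FRAME_MS / 2 + FRAME_MS

print(f"torn-frame average: {torn_avg:.2f}ms")   # 25.00ms
print(f"vsync average:      {vsync_avg:.2f}ms")  # 25.00ms
```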
If our frametime is just longer than 16.67ms with vsync enabled, we add a full additional frame of latency (with no work being done on the GPU) before we can swap the finished buffer to the front for scanout. That idle time can cause the next frame to miss the next vsync as well, giving us up to two frames of latency (one because we wait to swap and one because of the delay in starting the next frame). If our framerate is higher than 60 FPS, our GPU has to stop working after each frame until the next vsync. This wastes resources and decreases overall performance, but not by nearly as much as running vsync below the monitor's refresh rate: the upper limit of additional delay is 16.67ms minus frametime (less than one frame) rather than two full frames.
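A toy model of that first penalty (assuming, for simplicity, that rendering always starts at a sync boundary on a 60Hz display):

```python
import math

REFRESH_MS = 1000.0 / 60  # 60Hz vertical sync interval

def wait_for_vsync_ms(frametime_ms):
    """Extra delay between finishing a frame and the next sync,
    assuming rendering started at a sync boundary (double-buffered vsync)."""
    next_sync = math.ceil(frametime_ms / REFRESH_MS) * REFRESH_MS
    return next_sync - frametime_ms

for ft in (10.0, 17.0, 25.0):
    print(f"frametime {ft:4.1f}ms -> +{wait_for_vsync_ms(ft):5.2f}ms waiting to swap")
# frametime 10.0ms -> + 6.67ms
# frametime 17.0ms -> +16.33ms  (just missed the sync: nearly a full frame)
# frametime 25.0ms -> + 8.33ms
```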
When framerate is lower than refresh rate, using either a one-frame flip queue with vsync or triple buffering allows the graphics hardware to keep rendering while adding between 0 and 16.67ms of additional latency (the average falls between the two extremes). So we get the potential benefits of vsync (no tearing and synchronization) without the performance loss that occurs when no work gets done on the GPU; a sketch of the difference follows. At framerates higher than refresh rate, however, a render queue adds one additional frame of latency for every frame we render ahead, so this solution isn't a very good one for mitigating input latency in high-framerate games (especially in twitch shooters).
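Here is that sketch: a minimal comparison, under the same assumptions as above, of when the GPU can start its next frame under each scheme when running below the refresh rate:

```python
import math

REFRESH_MS = 1000.0 / 60

def next_render_start_ms(finish_ms, triple_buffered):
    """When the GPU can begin its next frame after finishing one.
    With double-buffered vsync it stalls until the swap at the next sync;
    with triple buffering (or a one-frame flip queue) it continues at once."""
    if triple_buffered:
        return finish_ms
    return math.ceil(finish_ms / REFRESH_MS) * REFRESH_MS

frametime = 20.0  # 50 FPS: below a 60Hz refresh rate
print(next_render_start_ms(frametime, triple_buffered=False))  # 33.33ms (GPU idle ~13ms)
print(next_render_start_ms(frametime, triple_buffered=True))   # 20.0ms  (no idle time)
```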
Once the data is sent to the monitor, we've got more delay in store.
We've already mentioned that some LCDs latch the entire frame before display. Beyond this delay, some displays will perform image processing on the input (including scaling if this is not done on the graphics hardware). In some cases, monitors will save two frames to overdrive LCD cells to get them to respond faster. While this can improve the speed at which the picture on the monitor changes, it can add another 16.67ms to 33.3ms of latency to the input (depending on whether one frame is processed or two). Monitors with a game mode or true 120Hz monitors should definitely add less input lag than monitors that require this sort of processing.
Add, on top of all this, the fact that it takes between 2ms and 16ms for the pixels on the LCD to actually switch (response time varies between panels and with the particular level-to-level transition involved), and we are done: the image is now on the screen.
So what do we have total after the image is flipped to the front buffer?
One frame of lag for transmission (to display a full frame); up to one frame of lag if we enable triple buffering (or one frame of render-ahead when running below the refresh rate); up to two frames of lag if we just turn on vsync; at framerates higher than the refresh rate, an additional frame of lag for every frame we render ahead with vsync on; and zero to two frames of lag for the monitor to display the image (if it does extensive image processing).
So after crazy speed from the mouse to the front buffer, here we are waiting ridiculous amounts of time to get the image to appear on the screen. We add, at the very least, 16.67ms of lag in this stage. At worst we're taking on between 66.67ms and 83.3ms, which is totally unacceptable. And that's after the computer is completely done working on the image.
This brings our totals up to about 33ms to 80ms of input lag for typical cases. Our worst case for what we've outlined, however, is about 135ms of latency between mouse movement and final display, which could be discernible and might start to feel mushy. Sometimes game developers stray a bit and incur more input lag than is reasonable; Oblivion and Fallout 3 come to mind.
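For reference, here's a rough tally of the ranges quoted so far. The figures are the approximations used in this article, pixel response time is left out, and the sums only land near the quoted totals since everything here is rounded:

```python
FRAME_MS = 1000.0 / 60

# (best, worst) milliseconds per stage, taken from the ranges in the text
mouse_to_front_buffer = (13.0, 58.0)           # from the earlier pages
transmission          = (FRAME_MS, FRAME_MS)   # always one full frame
vsync_render_ahead    = (0.0, 2 * FRAME_MS)    # zero to two frames
display_processing    = (0.0, 2 * FRAME_MS)    # zero to two frames

stages = [mouse_to_front_buffer, transmission,
          vsync_render_ahead, display_processing]
best  = sum(lo for lo, _ in stages)   # ~29.7ms
worst = sum(hi for _, hi in stages)   # ~141ms (the text rounds to ~135ms)
print(f"~{best:.0f}ms best case, ~{worst:.0f}ms worst case")
```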
But don't worry, we'll take a look at some specific cases next.
85 Comments
RubberJohnny - Friday, July 17, 2009
OT - I used to be a diehard CRT ONLY user, then I realised there is NO ghosting on modern LCD monitors... you may have seen smearing on LCD TVs, but that's caused by the scaler resizing SD material to fit the panel's native res. On monitors there is no scaling = no ghosting.
Got a 24-inch Samsung 6 months ago and wish I'd done so earlier, crisper, larger image and widescreen being the main reasons I'll NEVER use a CRT again.
DerekWilson - Friday, July 17, 2009
I agree that with modern LCD panels ghosting is not as large an issue, and color (depending on backlight) and contrast (depending on panel) are much better these days as well. Refresh rate is the only real outstanding issue these days (imo). And the only 120Hz display I saw was a bit oversaturated / overbright and didn't have high enough contrast.
jkostans - Saturday, July 18, 2009
I have yet to see an LCD without ghosting; it may be minor, but it's still annoying. And even the 120Hz LCDs supposedly still have measurable input lag regardless of the non-existent ghosting. LCDs are still a downgrade as far as I'm concerned.
DerekWilson - Sunday, July 19, 2009
Well you can actually see how much (or little) ghosting there would be in our high speed footage... even the advertised latencies on the 3007WFP are pretty bad compared to most panels these days (especially TN panels). Despite that there were only a few cases where we could see ghosting take a whole frame (and it never seemed to take more than a whole frame). We should test more panels with high speed cameras and see what happens...
Freeseus - Thursday, July 16, 2009
I feel like there is a HUGE section of delay missing from this article. Perhaps it was chosen specifically not to be included because rigorous testing/comparison would have to be performed in order to show any sort of suggested "average" numbers. Either way, I feel it should have been addressed... even if it were just at the end of the article. I'm referring to the added delay of wireless mice.
Aside from the added delay of transmitting all the mouse action to the receiver, the biggest issue is the inconsistencies of mouse performance and precision. I'm sure there's a direct correlation between this and the battery. I'm referring specifically to the amount of battery used up in order for the mouse to broadcast continuously at extremely high intervals to ensure precise movement. But obviously this includes any issue where the battery needs to be recharged as well. And on top of that, the mouse seems to be non-responsive occasionally during use. Completely unacceptable in a work or 'twitch gaming' environment.
Anyhow, it would have been nice to see this addressed because many people make the argument that wireless mice are better. And when it comes to FPS gaming or even work, I can't think of a reason not to have a wired mouse. Do I really need to have that accuracy all the time? Yes.
DerekWilson - Thursday, July 16, 2009
I agree that the current state of wireless mice is generally pretty poor ... though I've never used a wireless mouse targeted at gaming (gaming mice were the first optical mice I was able to use without mousing too quickly for the sensor even during desktop use). Testing wireless mice is definitely something worth looking into.
Vidmar - Friday, July 17, 2009
What about PS/2 mice? Are they better or worse than USB?
DerekWilson - Friday, July 17, 2009
PS/2 mice are slower ... iirc they come in at about 100Hz (10ms).
Vidmar - Monday, July 20, 2009
Really?? My old PS/2 MS Wheel Optical Mouse v1.1 is currently running at 200Hz, i.e. 200 reports/second. I've never felt like it doesn't keep up in any game.
lopri - Thursday, July 16, 2009
I'm loving this article and chewing through every page right now. Just wanted to say thank you for such in-depth analysis as is rarely found elsewhere.