Wrapping It Up
So there you have it. Triple buffering gives you all the benefits of double buffering with no vsync, plus all the benefits of enabling vsync: we get smooth, complete frames with no tearing. Frames are swapped to the front buffer only on refresh, yet at the start of output to the monitor they carry just as little input lag as double buffering with no vsync. Even though "performance" doesn't always get reported correctly with triple buffering, the graphics hardware works just as hard as it does with double buffering and no vsync, and the end user gets all of the benefit without the potential downside. Triple buffering does take up a handful of extra memory on the graphics hardware, but on modern cards this is not a significant issue.
Just to recap, here is how the three frames we looked at rendering in our previous example stack up side by side.
[Frame rendering diagrams: Triple Buffering; Double Buffering; Double Buffering with vsync]
We've presented both the qualitative and the quantitative arguments in support of triple buffering. So now the question is: does this data change things? Will people start looking for that triple buffering option more often now that they have this information? Let's find out.
[Poll: which buffering method do you use?]
The major difference in the technique we've described here is the ability to drop frames when they are outdated; render ahead forces older frames to be displayed. Queues can help with smoothness and stuttering, as a few really quick frames followed by a slow frame end up being evened out and spread over more frames. But the price you pay is lag: the more frames in the queue, the longer it takes to empty it, and the older the frames that get displayed.
In order to maintain smoothness and reduce lag, it is possible to hold on to a limited number of frames in case they are needed, but to drop them if they get too old. This requires somewhat more intelligent management of already rendered frames, and a full treatment goes a bit beyond the scope of this article.
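To make that concrete, here is a minimal sketch of such a scheme in C++: a small queue of completed frames in which anything older than a staleness threshold is evicted rather than displayed. The `Frame` struct, the timestamps, and the thresholds are all illustrative assumptions, not any real driver's API.

```cpp
#include <cstddef>
#include <cstdint>
#include <deque>

struct Frame {
    uint64_t done_ms;   // hypothetical timestamp: when rendering finished
    int buffer;         // handle to the rendered image
};

class BoundedFrameQueue {
public:
    BoundedFrameQueue(std::size_t max_frames, uint64_t max_age_ms)
        : max_frames_(max_frames), max_age_ms_(max_age_ms) {}

    // Render side: keep at most max_frames_ completed frames around.
    void push(const Frame& f) {
        if (q_.size() == max_frames_) q_.pop_front();  // full: drop the oldest
        q_.push_back(f);
    }

    // Display side, once per refresh: evict frames that have grown too
    // old, then hand back the oldest frame that is still fresh.
    bool pop_for_display(uint64_t now_ms, Frame* out) {
        while (q_.size() > 1 && now_ms - q_.front().done_ms > max_age_ms_)
            q_.pop_front();            // stale: drop instead of displaying
        if (q_.empty()) return false;  // nothing new; re-show the last frame
        *out = q_.front();
        q_.pop_front();
        return true;
    }

private:
    std::deque<Frame> q_;
    std::size_t max_frames_;
    uint64_t max_age_ms_;
};
```

The eviction loop in `pop_for_display` is the whole point: a frame that has waited too long is discarded instead of shown, trading a little wasted GPU work for lower perceived lag.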
Some game developers implement a short render ahead queue and call it triple buffering (because it uses three total buffers). They certainly cannot be faulted for this, as there has been a lot of confusion on the subject, and under certain circumstances this setup will perform the same as triple buffering as we have described it (though definitely not when framerate is higher than refresh rate).
Both techniques allow the graphics card to continue working on a new frame while an already completed frame waits for the vertical refresh. With double buffering (and no render queue) under vsync, once a frame is completed nothing else can be rendered, which stalls the pipeline and degrades actual performance.
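As a rough illustration of why the third buffer prevents that stall, here is a sketch of the buffer rotation in C++. The names and the locking are our own simplification (the real rotation lives in the driver), but the swap logic matches the triple buffering behavior described in this article.

```cpp
#include <mutex>
#include <utility>

// Exactly one of buffers {0, 1, 2} plays each role at any moment.
std::mutex m;
int front = 0;               // being scanned out to the monitor
int back  = 1;               // the GPU is always free to draw here
int spare = 2;               // holds the most recently completed frame
bool spare_is_fresh = false;

// Render side: called whenever the GPU finishes a frame. Never blocks.
void frame_completed() {
    std::lock_guard<std::mutex> lock(m);
    std::swap(back, spare);  // finished frame becomes the pending one;
    spare_is_fresh = true;   // an older unshown frame is silently dropped
}

// Display side: called once per vertical refresh.
void on_vsync() {
    std::lock_guard<std::mutex> lock(m);
    if (spare_is_fresh) {
        std::swap(front, spare);  // flip the newest complete frame out
        spare_is_fresh = false;
    }
    // else: nothing new finished, so the front buffer is shown again
}
```

With only two buffers, `frame_completed` would have to block until `on_vsync` freed one; the spare buffer is what lets the render side keep working, and its swap in `frame_completed` is where an outdated frame gets dropped.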
When vsync is not enabled, nothing more than double buffering is needed for performance, but a render queue can still be used to smooth framerate by keeping a few older frames around. This can keep instantaneous framerate from dipping in some cases, but it will (even with double buffering and vsync disabled) add lag and input latency. Even without vsync, render ahead is required for multi-GPU systems to work efficiently.
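For contrast, here is the render-ahead flip queue in the same style: a plain FIFO in which frames are never dropped, so every queued frame adds at least one refresh interval of input lag. The queue depth and names are again illustrative assumptions.

```cpp
#include <cstddef>
#include <queue>

const std::size_t kMaxRenderAhead = 2;  // two queued + one front = three buffers
std::queue<int> flip_queue;             // completed buffer indices, oldest first

// Render side: a full queue (rather than vsync itself) throttles the GPU.
bool can_start_next_frame() {
    return flip_queue.size() < kMaxRenderAhead;
}

void frame_completed(int buffer) {
    flip_queue.push(buffer);
}

// Display side, once per refresh: always show the OLDEST completed
// frame. Nothing is ever dropped, which is where the lag comes from.
int on_vsync(int current_front) {
    if (flip_queue.empty()) return current_front;  // re-show the last frame
    int next = flip_queue.front();
    flip_queue.pop();
    return next;
}
```

This is the setup some games label "triple buffering" when the queue is two deep: three buffers in total, but first-in, first-out rather than drop-the-outdated-frame.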
So, this article is as much for gamers as it is for developers. If you are implementing render ahead (aka a flip queue), please don't call it "triple buffering"; that name should be reserved for the technique we've described here. There are games out there that list triple buffering as an option when the technique used is actually a short render queue, and we very much hope that this article and the discussion around it help cut down on that confusion.
184 Comments
DerekWilson - Friday, June 26, 2009
unfortunately, you really can't build a practical implementation that starts rendering a frame at the point where it will finish just before the next viable refresh. typically, with anything changing at all on screen, you aren't going to have previous frames be good predictors down to the accuracy level you would need.

I didn't include sub-60 fps or sub-30 fps examples to keep it simple ... but in each case, the frame that starts being drawn at each refresh is equivalent between double buffering with no vsync and triple buffering.
the "odd frame" here or there really add up when you look at an entire second by the way.
velanapontinha - Friday, June 26, 2009
I always try to play with double buffering + V-Sync. I've known about Triple Buffering for quite some time, but I still prefer DB+Vsync. It's just that I never felt the theoretical input lag, while I can feel the benefits of having my CPU and GPU rest instead of always striving to get those useless 100fps.

60fps (heck, even 30fps), if constant, provides a flawless gaming experience, and if you can have a wonderful gaming experience without your hardware being pointlessly pushed to its limits, why make it render frames you will never miss?
Less workload, less heat, less noise, less energy, and still an impeccable gaming experience.
DerekWilson - Friday, June 26, 2009
there is still benefit at 30 FPS as well, and not only when the framerate skyrockets.

as frametime gets longer, input lag starts to become more and more of an issue. minimizing additional lag (as triple buffering can do) can help more at lower framerates when compared to double buffering and vsync.
KikassAssassin - Friday, June 26, 2009
I just ran a test in WoW (I picked it since it has a Triple Buffering option built in), where I ran down a path and back again, running the same path three times: once with double buffering and vsync disabled, once with double buffering and vsync enabled, and once with triple buffering. I had RivaTuner open in the background monitoring my CPU and GPU usage.

In all three tests, the CPU and GPU usage graphs look exactly the same. There's almost no difference between them whatsoever.
velanapontinha - Friday, June 26, 2009
Well, if you can't see any difference, I guess (I'm just guessing) that you're running WoW close to your setup's limits, then.

I'm a beta tester for a software company, and I can assure you that vsync can and will keep your CPU and GPU usage much lower.
Try running a 3D application that goes easy on your hardware (and thus runs at over 100fps with double buffering and no v-sync).
Then run the same software with v-sync enabled, and you'll see that your hardware has a lot less to struggle with.
Try this one:
http://www.theprodukkt.com/downloads/fr-041_debris...
A very small app (177kb) that looks impressive. Run it at a low resolution (1024x768, for example), and then check it out. There's a v-sync option in the app.
velanapontinha - Friday, June 26, 2009
At least I'm sure you'll notice that CPU usage will be lower. As to GPU, it depends, as GPU load indicators usually are not reliable (always showing either 0% or 99%).

randomname - Friday, June 26, 2009
I usually start by switching most of the options on in games. After I realize it isn't running fast enough, I start switching some of those options off. Therefore even triple buffering is a "nice to have" property that I would select (or not) based on an experiment. Unfortunately, that little tryout probably isn't representative of the rest of the game. So often, just when it gets really interesting (a lot of stuff and cool effects start happening), the performance plummets. Then you switch off everything that doesn't have an immediate visual impact (maybe triple buffering as well) and try again.

Absolutely the best part of console gaming is that someone else has made the (artistic) choice of enabling something, and they are in effect saying that your experience is best with these options. The game has been reviewed with those options and the same hardware, and if it sucks, it's the developer's fault. The argument doesn't go towards "you really need a fast machine to appreciate the graphics", which leaves questions about how fast is fast enough (to play through the heaviest scenes), whether there is any sense in making a several hundred dollar investment to play a fifty buck game, and exactly what options and hardware the reviewer used. All that tends to take a lot away from the enjoyment and immersion.
One example is the motion blur in Crysis. It looks really nice and smooths out that FPS-style jerkiness of being able to move your head (optical axis) so fast. But it was also quite a heavy option, and although I really, really didn't want to switch it off, I had to.
SleepyGreg - Friday, June 26, 2009
Having a poll of which buffering method you use under the heading "Triple buffering: Why we love it" is rather flawed. People often answer what they think is the right answer, not what they actually do.

DerekWilson - Friday, June 26, 2009
You know, I agree with you ... I apologize for poisoning the sample. I don't think I'm that great at article titles anyway, but the poll was just something I thought would be a cool idea. I didn't think about how they would impact each other.

I'll try to be more careful with stuff like this if I do it in the future.
Mills - Friday, June 26, 2009
Seems like nobody here really agrees on when it is better.

Some people say it's better only when your FPS is greater than the refresh rate; some say it's better only when FPS is less than the refresh rate.
The article seems to claim it's always better.
I remain confused.