Triple-buffered V-Sync never took off because it did nothing for input lag (if anything, it made it worse) and increased VRAM usage. Adaptive V-Sync made it irrelevant, and VRR was the final nail in its coffin.
You could always turn up the graphics to stay within the optimal VRR range, you know. And VR will turn GPU requirements up a couple of notches with high-res, high-refresh HMDs, so good luck getting 120Hz or even 90Hz then. You're exaggerating a problem that doesn't really exist and pooh-poohing VRR tech when you've never actually seen it in person.
G-Sync/FreeSync with V-Sync on is what you want if you're after tear-free gaming even when FPS output exceeds the monitor's refresh rate, but ofc your FPS will be capped.
-
G-Sync is a failure because it is FreeSync with an Nvidia cookie to prevent other vendors' cards from working with it. FreeSync, on the other hand, is not supported by Nvidia, even though anyone is allowed to support it for free.
G-Sync is just a method to print money and should not be supported. -
G-Sync is basically an Nvidia-made scaler chip with some features required to make adaptive syncing work well (e.g. not blanking when the frame rate goes below the LCD's minimum refresh, overdrive tuned for varying frame times, etc.), plus some modifications to the data transmission protocol that vendor-lock it to this particular chip only.
FreeSync is just a data transmission protocol. It's completely open, but a scaler with adaptive refresh support has to be designed by each monitor manufacturer from scratch. -
Triple-buffered v-sync never took off because Microsoft hasn't given programmers an API for it in DirectX and they're too lazy (not all, but enough) to roll their own, and because the game studios are all "partners" with GPU makers, which have a vested interest in you buying more of their hardware.
G-Sync is snake oil. Simple as that. -
How much RAM do you need for a 4K frame buffer? 32MB. This costs less than a buck a chip in bulk. How much does a DIY G-Sync module cost? $200. The premium on a factory-equipped G-Sync display is upwards of $500. When you buy G-Sync you're paying a markup of 200-500% or more on a tiny chunk of RAM and a DRM "key" to enable said tiny chunk of RAM.
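For what it's worth, the arithmetic behind that 32MB figure is easy to check. A rough sketch in Python (assuming a plain 8-bit RGBA buffer, nothing exotic):

# Back-of-the-envelope check of the 4K frame buffer size quoted above.
width, height = 3840, 2160        # 4K UHD
bytes_per_pixel = 4               # 8-bit RGBA
one_buffer = width * height * bytes_per_pixel
print(round(one_buffer / 2**20, 1))      # ~31.6 MiB per buffer
print(round(3 * one_buffer / 2**20, 1))  # ~94.9 MiB even if you keep three of them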
But hey! It's your money and it's your prerogative if you want to spend it on a scam. -
-
"In computer graphics, triple buffering is similar to double buffering but provides a speed improvement. In double buffering the program must wait until the finished drawing is copied or swapped before starting the next drawing. This waiting period could be several milliseconds during which neither buffer can be touched."
TL;DR: input lag.
"In triple buffering the program has two back buffers and can immediately start drawing in the one that is not involved in such copying. The third buffer, the front buffer, is read by the graphics card to display the image on the monitor. Once the monitor has been drawn, the front buffer is flipped with (or copied from) the back buffer holding the last complete screen. Since one of the back buffers is always complete, the graphics card never has to wait for the software to complete. Consequently, the software and the graphics card are completely independent, and can run at their own pace. Finally, the displayed image was started without waiting for synchronization and thus with minimum lag."
TL;DR: Triple buffering eliminates input lag.
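To make that mechanism concrete, here's a toy sketch in Python (purely illustrative; the class and method names are made up, not any real driver API). The renderer publishes each finished frame into a shared "ready" slot and immediately keeps drawing, while the display grabs whatever is in that slot at each refresh, so neither side ever waits for the other:

import threading

class TripleBufferSketch:
    # Toy model only: three buffer slots identified by index.
    def __init__(self):
        self.front = 0    # buffer the display is currently scanning out
        self.ready = 1    # most recently completed frame
        self.back = 2     # buffer the renderer is drawing into
        self._lock = threading.Lock()

    def frame_finished(self):
        # Renderer completed a frame: publish it and keep going.
        # No waiting on vblank, which is where the lag saving comes from.
        with self._lock:
            self.ready, self.back = self.back, self.ready

    def vblank(self):
        # Display refresh: always scan out the newest completed frame.
        with self._lock:
            self.front, self.ready = self.ready, self.front
        return self.front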
See also:
http://www.anandtech.com/show/2794
" In other words, with triple buffering we get the same high actual performance and similar decreased input lag of a vsync disabled setup while achieving the visual quality and smoothness of leaving vsync enabled." -
As for why Battlefield players don't use triple buffering: it's probably because FRAPS says their frame rates drop with triple buffering enabled, and they mistakenly think this is due to a performance hit incurred by triple buffering.
-
Meaker@Sager Company Representative
There is still input lag with triple buffering (making it unsuitable for VR, for instance) and it won't help you much at low frame rates.
-
Still, there is always latency between the time the player pushes the button, the time the game registers the button press, the time the game does something with that input, and the time it takes to get that something to the screen. This last is exacerbated by digital displays that add their own decoding and processing latency.
Never mind bugs and games like the Battlefield series that use render ahead queues. I find it telling that my adversary above neglected to mention this fact because render ahead causes worse input lag than double buffered vsync. I also find it telling that my adversary neglected to mention what game developers call "cargo-cult configuring" (after cargo-cult programming), where gamers apply all sorts of configuration tweaks because some guide or forum post on the Internet said to do so. They have no idea what they're doing, they convince themselves that every change is an improvement in order to avoid cognitive dissonance, then rave about how much better their games are after making those changes. Why do Battlefield players disable vsync? Because "every other" Battlefield player disables vsync.
Here's a clue: if a tweak were a universal improvement then it would be the default setting.
Low frame rates shouldn't be a problem. The stock ForceWare drivers are capable of compensating for frame rates below 1/2 the display's refresh rate using the same mechanism that G-Sync adopted: doubling, tripling and quadrupling frames. It's half of what the PS_FRAMERATE_LIMITER_FPS_* frame rate limiters in the drivers do. I'm aware of no technical reason why ForceWare couldn't show the same frame twice when the render rate drops below 1/2 the refresh rate of the display, or three times when below 1/3, or four times when below 1/4. Of course, Nvidia has no incentive to give customers something for free when they can sell it for 200-500 dollars.
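A hypothetical sketch of that repeat rule in Python (not actual ForceWare or G-Sync logic; the 60Hz default is just an assumption for illustration):

def scanouts_per_frame(render_fps, refresh_hz=60):
    # Repeat each frame enough times to stay near the display's refresh rate:
    # twice when below roughly 1/2 of it, three times below 1/3, and so on.
    return max(1, int(refresh_hz // render_fps))

for fps in (50, 25, 19, 14):
    print(fps, "fps ->", scanouts_per_frame(fps), "scanouts per frame")
# 50 fps -> 1, 25 fps -> 2, 19 fps -> 3, 14 fps -> 4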
But I repeat myself. -
RenderDevice.TripleBufferingEnable 1
RenderDevice.ForceRenderAheadLimit 0
Does not eliminate input lag. Anybody, if you have BF3 or BF4, go try it out. I guarantee you it will not feel as responsive as V-Sync off, simple as that.
I'm not your adversary, I'm just calling you out on your BS. -
Does anyone sell this laptop with a G-Sync monitor?
-
When done right, triple buffering eliminates input lag caused by vertical sync delays. When done wrong, all bets are off.
Anyway, I've ranted on the subject far too long. The information and the math that backs it up is out there for everyone to see. -
Can you name even one game where triple buffering eliminates V-Sync input lag? Just one. I'm not asking for the world. -
-
You're just one of those people who doesn't feel V-Sync input lag. Triple buffering prevents FPS from dropping by as much as 50%, which ofc makes the game feel more responsive if it means the difference between 30 and 60 FPS. But the input lag inherent to V-Sync is still there even at 60 FPS. 60 FPS w/o V-Sync is much more responsive than 60 FPS with double- or triple-buffered V-Sync. -
I have never noticed any input lag ever whilst running VSYNC. Which I always do because I loathe screen tearing. I never play those twitch shooters that require instantaneous reactions.
-
-
You're missing one important thing here with triple buffering. It doesn't change the fact that the screen only ever updates when a buffer is fully completed. Thus, regardless of triple or double buffering, if the screen refreshes and the new frame is NOT ready for any reason, the screen shows the older frame. For example, if you're averaging a steady 40fps on a 60Hz monitor, one third of your refreshes will inevitably be a frame double. The only difference is that triple buffering allows the engine to continue rendering instead of waiting, thus reducing the chance that the other two thirds ALSO have to frame-double if they don't sync properly.
Since it's extremely unlikely that your frame times will ever actually sync with the screen refresh clock, you'll always be behind. Hence, input lag.
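A quick simulation of that 40fps-on-60Hz example in Python (illustrative only; it assumes the renderer keeps finishing a frame every 25 ms and each refresh simply shows the newest completed frame):

refresh_ms = 1000 / 60     # ~16.7 ms per refresh on a 60 Hz panel
frame_ms = 1000 / 40       # 25 ms per rendered frame at a steady 40 fps

# Index of the newest completed frame at each of one second's 60 refreshes.
shown = [int(n * refresh_ms // frame_ms) for n in range(60)]
doubles = sum(1 for a, b in zip(shown, shown[1:]) if a == b)
print(doubles, "of 60 refreshes just repeat the previous frame")   # prints 20, i.e. one third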
Fact is, the only way to fix the problem is with dynamic update timing, where the screen refresh is triggered by the GPU. Double and triple buffering are products of a time when the screen refresh clock was king and nobody thought differently. G-Sync, FreeSync, or whatever DP officially implements is the way forward, and I'm surprised it wasn't thought of before.
Unfortunately, both G-Sync and FreeSync also expose which game engines generally have frame-timing issues. I have an AOC G-Sync monitor on my desktop, btw, and it's the greatest thing that I don't appreciate until I game on my laptop. Going from 144Hz/G-Sync to 60Hz makes the difference glaringly obvious, especially at lower frame rates, because VRR eliminates weird frame-syncing issues. -
I really don't see any point in continuing this. If you want to waste your money, well, it's your money to waste and I guess I have no business trying to dissuade you. -
-
Meaker@Sager Company Representative
People do tend to be one or the other I think. -
-
I can never play a game without VSYNC. Screen tearing is the most obnoxious thing in video gaming.
-
Meaker@Sager Company Representative
Lol, I remember the LAN parties with low-grav InstaGib on UT2004. I had my X600M overclocked from its 400MHz core up to 600MHz, and V-Sync was certainly disabled.
-
Quake III instagib railgun on The Longest Yard was the best
-
Actually played some UT99 the other week with a friend just for the memories..... low grav, InstaGib, DM-Morpheus. Le sigh...
Still have potato aim though. I was never really good at those games. Super fun to play, but I was never any good: I was the guy getting his butt kicked by Koreans in BroodWar instead! -
*double post*
-
Counter-Strike FTW, started back in the betas.
-
I have a Eurocom P7 Pro (P770ZM). Where can I find the latest BIOS for this model and instructions for updating it? Also, can anyone please tell me how to flash my own logo into the BIOS?
-
Read this thread http://forum.notebookreview.com/threads/mythlogic-dia-1614.775340/
And follow Prema's link (post #6) and read it too. -
Meaker@Sager Company Representative
I would not advise unpacking the BIOS to work in your own image, as you can end up with a brick. Prema may be willing to do it for some beer money if he has free time at some point, but you would have to ask.
-
Anyone tested the newly released, un-optimized Batman: Arkham Knight with their Batman?
-
superkyle1721 Notebook Evangelist
If by "not optimized" you mean that a driver has yet to be released, I thought the driver for the game was released yesterday?
-
I'm waiting for some good feedback before giving it a try. Was very disappointed by the second Batman, which was boring as hell with its blocking technique... The first one was good though.
-
I played a bit tonight. 1080p, all normal settings maxed, all Nvidia GameWorks features off. It ran really well, 60+ fps. I suspect the dynamic fog and light shafts in Nvidia GameWorks are the real killers. -
Meaker@Sager Company Representative
Capped at 30 FPS? Holy console, Batman!
Two days ago I pulled off quite an achievement with my 4790K (which was prone to overheating quite easily, even delidded): I applied the screw mod to my CPU and marked the screws with a line so I'd know how many turns I made, because I figured my heatsink might be slightly warped in a way that isn't visible under normal conditions. Lo and behold, I reran the Prime95 1344K AVX stress test and now max out at 86 °C with fans maxed and the laptop propped up. That was stunning; before, I was mostly around 97 °C with FN+1 and propping it up, and of course some throttling.
The last two days I've played quite a bit of GW2 again (pre-hype for the coming addon), and this game normally brought my CPU to 95 °C and my GPU to 78 °C. Now I'm mostly sitting around 69 °C CPU and 61 °C GPU (stock of course; GW2 isn't that demanding even with supersampling enabled in-game). I'm very, very happy with those temps now, and I absolutely recommend the screw mod.