The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    *** Official Clevo P770ZM / Sager NP9772 and P770ZM-G / Sager NP9773 Owner's Lounge ***

    Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by HTWingNut, Jan 6, 2015.

  1. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Triple buffered V-Sync never took off because it did nothing for input lag, if anything it made it worse, and it increased VRAM usage. Adaptive V-Sync made it irrelevant and VRR was the final nail in its coffin.

    You could always turn up the graphics to stay within the optimal VRR range, you know. And VR will turn the GPU requirements up two notches with high-res, high-refresh HMDs, so good luck trying to get 120Hz or even 90Hz then. You're exaggerating a problem that doesn't really exist and pooh-poohing VRR tech when you've never actually seen it in person. :rolleyes:

    G-Sync/FreeSync with V-Sync on is what you want if you're after tear-free gaming even when FPS output exceeds the monitor's refresh rate, but ofc your FPS will be capped.
     
    Last edited: Jun 17, 2015
  2. Kommando

    Kommando Notebook Evangelist

    Reputations:
    46
    Messages:
    376
    Likes Received:
    271
    Trophy Points:
    76
    G-Sync is a failure because it is FreeSync with an Nvidia cookie added to prevent other cards from working with it. FreeSync, on the other hand, is not supported by Nvidia, although everyone is allowed to support it for free.
    G-Sync is just a method of printing money and should not be supported.
     
  3. Ingvarr

    Ingvarr Notebook Deity

    Reputations:
    292
    Messages:
    1,090
    Likes Received:
    115
    Trophy Points:
    81
    G-Sync is basically an Nvidia-made scaler chip with some features required to make adaptive syncing work well (e.g. not blanking when going below the LCD's minimum refresh, overdrive tuned for varying frame times, etc.), plus some modifications to the data transmission protocol to vendor-lock it to this particular chip only.

    FreeSync is just a data transmission protocol. It's completely open, but a scaler with adaptive refresh rate support has to be designed by each monitor manufacturer from scratch.
     
  4. ratinox

    ratinox Notebook Deity

    Reputations:
    119
    Messages:
    1,047
    Likes Received:
    516
    Trophy Points:
    131
    There is no input lag with triple buffering. Triple buffered v-sync exists specifically to address the cause of input lag with double buffered v-sync. D3D's render ahead queue is not triple buffering, and it does make input lag worse, because that's how queues work.
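
    To see why queues and true triple buffering behave differently, here's a toy latency model (Python, made-up numbers, a sketch of the reasoning rather than any real driver code):

    REFRESH = 1000 / 60  # one 60Hz refresh interval, in milliseconds

    def render_ahead_latency(queue_depth, frame_time):
        # A render-ahead queue is a FIFO: a finished frame waits behind
        # every frame queued before it, so lag grows with queue depth.
        return queue_depth * max(frame_time, REFRESH)

    def triple_buffer_latency(frame_time):
        # True triple buffering overwrites the stale back buffer, so the
        # shown frame is at most one frame old plus the wait for vblank.
        return frame_time + REFRESH / 2  # average half-interval to vblank

    print(render_ahead_latency(3, 10.0))  # 50.0 ms behind with a 3-deep queue
    print(triple_buffer_latency(10.0))    # ~18.3 ms with triple buffering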

    Triple buffered v-sync never took off because Microsoft hasn't given programmers an API in DirectX and they're too lazy (not all but enough) to roll their own, and because the game studios are all "partners" with GPU makers which have vested interests in you buying more of their hardware.

    G-Sync is snake oil. Simple as that.
     
  5. ratinox

    ratinox Notebook Deity

    Reputations:
    119
    Messages:
    1,047
    Likes Received:
    516
    Trophy Points:
    131
    The problem with adaptive sync, a problem that triple buffering does not have, is that there is a lower bound for variable refresh rate. What happens when the rendering rate drops below the lower bound? AMD's solution is to engage traditional v-sync. Problem solved. Nvidia's solution is to display each frame twice or three or four times in order to keep the refresh rate above the VRR lower bound. This requires an additional frame buffer; this is what the hardware is for. Problem... solved?
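
    Nvidia's frame-repeat trick, sketched (a toy Python model assuming a panel whose VRR window bottoms out at 30Hz; the real logic lives in the G-Sync module):

    import math

    VRR_MIN = 30  # assumed lower bound of the panel's VRR window, in Hz

    def scanout_rate(fps):
        # Below the window, repeat each frame just enough times to keep
        # the effective refresh rate at or above the panel's minimum.
        if fps >= VRR_MIN:
            return fps, 1
        repeats = math.ceil(VRR_MIN / fps)
        return fps * repeats, repeats

    print(scanout_rate(24))  # (48, 2): every frame scanned out twice
    print(scanout_rate(9))   # (36, 4): every frame scanned out four times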

    How much RAM do you need for a 4K frame buffer? 32MB. This costs less than a buck a chip in bulk. How much does a DIY G-Sync module cost? $200. The premium on a factory equipped G-Sync display is upwards of $500. When you buy G-Sync you're paying a 200-500% or more markup on a tiny chunk of RAM and a DRM "key" to enable said tiny chunk of RAM.
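
    The arithmetic behind that 32MB figure, for the skeptical (4 bytes per pixel at standard 8-bit RGBA):

    width, height, bytes_per_pixel = 3840, 2160, 4
    frame_mib = width * height * bytes_per_pixel / 2**20
    print(round(frame_mib, 1))  # 31.6 MiB -> one 32MB chip covers a 4K frame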

    But hey! It's your money and it's your prerogative if you want to spend it on a scam.
     
  6. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    No it does not. Triple buffering addresses the progressive FPS drop that occurs when FPS falls below the refresh rate while V-Synced. It does nothing for input lag or judder, both of which are problems inherent to V-Sync. If it did, everybody would be using V-Sync in Battlefield and other games that have triple buffering built in, because who likes tearing? But the fact of the matter is they don't, because the input lag is there regardless of double or triple buffered V-Sync. I've never heard anyone say triple buffering fixes V-Sync input lag, so it makes me wonder if you've ever actually used triple buffering or if you're just spouting nonsense you read somewhere on the Internet.
     
    Last edited by a moderator: Jun 17, 2015
  7. ratinox

    ratinox Notebook Deity

    Reputations:
    119
    Messages:
    1,047
    Likes Received:
    516
    Trophy Points:
    131
    https://en.wikipedia.org/wiki/Multiple_buffering

    "In computer graphics, triple buffering is similar to double buffering but provides a speed improvement. In double buffering the program must wait until the finished drawing is copied or swapped before starting the next drawing. This waiting period could be several milliseconds during which neither buffer can be touched."

    TL;DR: input lag.

    "In triple buffering the program has two back buffers and can immediately start drawing in the one that is not involved in such copying. The third buffer, the front buffer, is read by the graphics card to display the image on the monitor. Once the monitor has been drawn, the front buffer is flipped with (or copied from) the back buffer holding the last complete screen. Since one of the back buffers is always complete, the graphics card never has to wait for the software to complete. Consequently, the software and the graphics card are completely independent, and can run at their own pace. Finally, the displayed image was started without waiting for synchronization and thus with minimum lag."

    TL;DR: Triple buffering eliminates input lag.

    See also:
    http://www.anandtech.com/show/2794

    " In other words, with triple buffering we get the same high actual performance and similar decreased input lag of a vsync disabled setup while achieving the visual quality and smoothness of leaving vsync enabled."
     
  8. ratinox

    ratinox Notebook Deity

    Reputations:
    119
    Messages:
    1,047
    Likes Received:
    516
    Trophy Points:
    131
    As for why Battlefield players don't use triple buffering? It's probably because FRAPS shows their frame rates dropping with triple buffering enabled, and they mistakenly think that this is a performance hit incurred by triple buffering.
     
  9. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Nowhere does it say TB eliminates input lag; that's just your interpretation of what you read. And you're basically proving my point, relying on stuff you read on the Internet instead of going out into the real world and actually trying it, whether it be triple buffering (super easy) or VRR tech (not so easy) or whatever. For over 10 years I've experimented with different combinations of double/triple buffered V-Sync, FPS limiting, and more recently SLI, trying in vain to find a way to get the tear-free experience of V-Sync on with the responsiveness of V-Sync off (VRR is the only way to achieve this; that's why it's a game changer). I'm absolutely certain, beyond a shadow of a doubt, that triple buffering does not eliminate V-Sync input lag. You're believing in a placebo if you think TB makes V-Sync on feel as responsive as V-Sync off, and I'd wager that you wouldn't be able to tell the difference between double and triple buffered V-Sync at 60 FPS unless someone told you beforehand. You just aren't affected by V-Sync input lag the way most people are, which boggles my mind, but it happens.

    That's not what I asked. I asked why Battlefield players don't use V-Sync. The answer is input lag. Triple buffering on is the game's default setting, and you can't disable it without a console command, which keeps most players from ever turning it off. Now if triple buffering were actually V-Sync's anti-lag magic pill, more players would enable V-Sync, but they don't, because they can obviously feel that the input lag is still there, triple buffering or otherwise.
     
    Last edited: Jun 17, 2015
  10. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
    There is still input lag with triple buffering (making it unsuitable for VR, for instance) and it won't help you much at low frame rates.
     
  11. ratinox

    ratinox Notebook Deity

    Reputations:
    119
    Messages:
    1,047
    Likes Received:
    516
    Trophy Points:
    131
    Triple buffering does not incur any lag at all. Vsync can incur lag when the game blocks on locked buffers. This lag is eliminated by triple buffering because the game never blocks. This isn't an interpretation (or misinterpretation) of something I read on the Internet. It's math, and not especially difficult math at that.
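
    Here's that math in toy form (Python, assuming a 60Hz display, all numbers in milliseconds):

    import math

    REFRESH = 1000 / 60  # ~16.7 ms per refresh at 60Hz

    def vsync_block(frame_time):
        # Double-buffered v-sync: the game blocks on the locked buffer
        # until the next vblank, so each frame's cost rounds up to a
        # whole number of refresh intervals.
        intervals = math.ceil(frame_time / REFRESH)
        return intervals * REFRESH - frame_time  # time spent blocked

    print(round(vsync_block(16.0), 1))  # 0.7 ms blocked: just under 60 FPS
    print(round(vsync_block(17.0), 1))  # 16.3 ms blocked: FPS halves to 30
    # Triple buffering removes the blocked time: the game draws into the
    # spare back buffer instead of waiting.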

    Still, there is always latency between the time the player pushes the button, the time the game registers the button press, the time the game does something with that input, and the time it takes to get that something to the screen. This last is exacerbated by digital displays that add their own decoding and processing latency.

    Never mind bugs and games like the Battlefield series that use render ahead queues. I find it telling that my adversary above neglected to mention this fact because render ahead causes worse input lag than double buffered vsync. I also find it telling that my adversary neglected to mention what game developers call "cargo-cult configuring" (after cargo-cult programming), where gamers apply all sorts of configuration tweaks because some guide or forum post on the Internet said to do so. They have no idea what they're doing, they convince themselves that every change is an improvement in order to avoid cognitive dissonance, then rave about how much better their games are after making those changes. Why do Battlefield players disable vsync? Because "every other" Battlefield player disables vsync.

    Here's a clue: if a tweak were a universal improvement then it would be the default setting.

    Low frame rates shouldn't be a problem. The stock ForceWare drivers are capable of compensating for frame rates below 1/2 the display's refresh rate using the same mechanism that G-Sync adopted: doubling, tripling and quadrupling frames. It's half of what the PS_FRAMERATE_LIMITER_FPS_* frame rate limiters in the drivers do. I'm aware of no technical reason why ForceWare couldn't show the same frame twice when the render rate drops below 1/2 the refresh rate of the display, or three times when below 1/3, or four times when below 1/4. Of course, Nvidia has no incentive to give customers something for free when they can sell it for 200-500 dollars.

    But I repeat myself.
     
  12. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    V-Sync on
    RenderDevice.TripleBufferingEnable 1
    RenderDevice.ForceRenderAheadLimit 0

    Does not eliminate input lag. Anybody, if you have BF3 or BF4, go try it out. I guarantee you it will not feel as responsive as V-Sync off, simple as that.

    I'm not your adversary, I'm just calling you out on your BS.
     
  13. Mobius 1

    Mobius 1 Notebook Nobel Laureate

    Reputations:
    3,447
    Messages:
    9,069
    Likes Received:
    6,376
    Trophy Points:
    681
    Does anyone sell this laptop with a G-Sync monitor?

     
  14. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yeah Craig does
     
  15. Support.3@XOTIC PC

    Support.3@XOTIC PC Company Representative

    Reputations:
    1,268
    Messages:
    7,186
    Likes Received:
    1,002
    Trophy Points:
    331
    The Sager NP9773 will always include G-Sync regardless of where you get it.
     
  16. E.D.U.

    E.D.U. Notebook Deity

    Reputations:
    275
    Messages:
    764
    Likes Received:
    117
    Trophy Points:
    56
    Been reading you two's discussion, learned a lot. Ratinox, although I agree with some of your points on Nvidia, as I too am finding them a greedier company with AMD stumbling, I disagree on the input lag point. Like octiceps said, I've tried triple buffering in an old game like Left 4 Dead 2 (which offers that V-Sync option) and I get input lag. Once I switch it to double buffering (or completely off), that perceptible lag vanishes. I've even tried to force triple buffering in the Nvidia Control Panel with the option on, to see if it helps, and same thing. Maybe I'm missing something or there's some other factor, but I do perceive input lag with triple buffering, unfortunately. Don't mean to stray off thread topic.
     
  17. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Remember though, triple buffering in AMD and Nvidia drivers (this includes both Nvidia Control Panel and Inspector) is for OpenGL only. The only way to force it in Direct3D games without native triple buffering support is using external tools like D3DOverrider (part of the original RivaTuner package) and RadeonPro. These two tools work on both AMD and Nvidia GPUs.
     
  18. Mobius 1

    Mobius 1 Notebook Nobel Laureate

    Reputations:
    3,447
    Messages:
    9,069
    Likes Received:
    6,376
    Trophy Points:
    681
    Regardless of whatever panel is used?
     
  19. Support.3@XOTIC PC

    Support.3@XOTIC PC Company Representative

    Reputations:
    1,268
    Messages:
    7,186
    Likes Received:
    1,002
    Trophy Points:
    331
    If you keep the stock panel since there are no other options. If you swap it out after you get it to something else, then that screen would have to support it.
     
  20. ratinox

    ratinox Notebook Deity

    Reputations:
    119
    Messages:
    1,047
    Likes Received:
    516
    Trophy Points:
    131
    If you enable vsync with triple buffering in a game and input lag is not better than vsync without triple buffering, then something about that game is broken. There's an assumption that game developers implement triple buffering correctly. They often don't. Many games use D3D's render ahead and call it triple buffering because they set it to 3 frames. That's not triple buffering; it's a render ahead queue. Some games implement what they call triple buffering but don't drop unused frames, or don't drop them fast enough. Those are bugs or bad designs. It wouldn't surprise me to find games that have triple buffering check boxes that do nothing but toggle the check boxes.

    When done right, triple buffering eliminates the input lag caused by vertical sync delays. When done wrong, all bets are off.
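
    The "done right" versus "done wrong" difference fits in a few lines (a Python sketch with hypothetical names, not any engine's actual code):

    # Done right: at vblank, show the newest complete frame and discard
    # the rest outright. Stale frames are dropped, never queued.
    def present_right(completed):
        if not completed:
            return None           # nothing new: the front buffer repeats
        newest = completed[-1]
        completed.clear()
        return newest

    # Done wrong (a render-ahead queue wearing a triple buffering badge):
    # one frame per vblank pops off a FIFO, a backlog builds up, and
    # every displayed frame is old news.
    def present_wrong(completed):
        return completed.pop(0) if completed else None

    frames = ["f1", "f2", "f3"]      # frames finished since the last vblank
    print(present_right(frames[:]))  # 'f3': f1 and f2 are dropped
    print(present_wrong(frames[:]))  # 'f1': f2 and f3 keep you lagging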

    Anyway, I've ranted on the subject far too long. The information and the math that backs it up is out there for everyone to see.
     
  21. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    And what about games that allow you to set pre-rendered frames to 0? Triple buffering doesn't eliminate input lag there either.

    So every single game does triple buffering wrong? Seems legit. :rolleyes:

    Yeah, math...totally. Instead of admitting that you're wrong, that's your excuse?

    Can you name even one game where triple buffering eliminates V-Sync input lag? Just one. I'm not asking for the world.
     
    Last edited: Jun 18, 2015
  22. ratinox

    ratinox Notebook Deity

    Reputations:
    119
    Messages:
    1,047
    Likes Received:
    516
    Trophy Points:
    131
    Doom 3 plus any games that use Doom 3's engine. Didn't even need to think about this one.
     
  23. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Nope, input lag is still there. I tested it in The Dark Mod using triple buffering in the graphics driver since id Tech 4 is OpenGL.

    You're just one of those people who doesn't feel V-Sync input lag. Triple buffering prevents FPS from dropping by as much as 50%, which ofc makes the game feel more responsive if it means the difference between 30 and 60 FPS. But the input lag inherent to V-Sync is still there even at 60 FPS. 60 FPS w/o V-Sync is much more responsive than 60 FPS with double/triple buffered V-Sync.
     
    Last edited: Jun 18, 2015
  24. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    I have never noticed any input lag whilst running VSYNC, which I always do because I loathe screen tearing. I never play those twitch shooters that require instantaneous reactions.

    Sent from my Nexus 5 using Tapatalk
     
  25. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yeah you and @ratinox are special
     
  26. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    Been following this double/triple buffering back and forth for a while now.

    You're missing one important thing here with triple buffering. It doesn't eliminate the fact that the screen will only ever update when one buffer is fully completed. Thus, regardless of triple or double buffering, if the screen refreshes and the new buffer is NOT ready for any reason, the screen shows the older frame. For example, if you're averaging a steady 40 FPS on a 60Hz monitor, one third of your refreshes will inevitably be a frame double. The only difference is that triple buffering allows the engine to continue rendering instead of waiting, thus reducing the chance that the other two thirds ALSO have to frame-double if they don't sync properly.
    Since it's extremely unlikely that your frame times will ever actually sync with the screen refresh clock, you'll always be behind. Hence, input lag.
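
    That counting, worked through (assuming perfectly even frame times):

    refresh_hz, fps = 60, 40
    doubled = refresh_hz - fps            # refreshes that re-show an old frame
    print(doubled, doubled / refresh_hz)  # 20 of 60, i.e. one third are doubles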

    Fact is, the only way to fix the problem is with dynamic update timing where the screen refresh is triggered by the GPU. Double and triple buffering are products of a time when the screen refresh clock was king and nobody thought differently. G-Sync, or FreeSync, or whatever DisplayPort officially implements, is the way forward, and I'm surprised it wasn't actually thought of before.

    Unfortunately, both G-Sync and FreeSync also expose which game engines generally have frame-timing issues. I have an AOC G-Sync monitor on my desktop, btw, and it's the greatest thing I don't appreciate until I game on my laptop. Going from 144Hz/G-Sync to 60Hz makes it glaringly obvious, especially at lower frame rates, because G-Sync eliminates weird frame-syncing issues.
     
    E.D.U. likes this.
  27. ratinox

    ratinox Notebook Deity

    Reputations:
    119
    Messages:
    1,047
    Likes Received:
    516
    Trophy Points:
    131
    Yeah, see, this is just another "I sense it so I must be right" argument. Well, I don't sense it so you must be wrong.

    I really don't see any point in continuing this. If you want to waste your money, well, it's your money to waste and I guess I have no business trying to dissuade you.
     
  28. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Waste my money on what exactly? The fact of the matter is the input lag exists, but whether you can feel it or not depends on the individual. The majority of people can feel it, which is why they play with V-Sync off despite tearing, which is more tolerable than input lag. #ThatIsAll
     
    Last edited: Jun 19, 2015
  29. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
    I'm the other way round, I don't tend to notice tearing but am very sensitive to lag.

    People do tend to be one or the other I think.
     
  30. Ishatix

    Ishatix Notebook Consultant

    Reputations:
    2
    Messages:
    107
    Likes Received:
    54
    Trophy Points:
    41
    lol, I'm the opposite. Never notice screen tearing so it's an easy choice for me to always disable v-sync. :)

    Finally, thank you! Did nobody else actually read the AnandTech article ratinox linked?
    I rather got the feeling reading that article that it was written by someone like ratinox or Cakefish who is more sensitive to tearing. You could easily argue it the other way round and say "how useful is a tear-free frame if it's outdated?"

    Definitely for arena FPS. Did anyone else find instagib in UT3 unplayable after becoming pretty competent at the UT2004 version?
     
  31. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    I can never play a game without VSYNC. Screen tearing is the most obnoxious thing in video gaming.

    Sent from my Nexus 5 using Tapatalk
     
    alaskajoel likes this.
  32. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yeah well you don't play FPS so there's that
     
  33. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Oh I play FPS, just not online competitive FPS. Only singleplayer campaigns.

    Sent from my Nexus 5 using Tapatalk
     
    alaskajoel likes this.
  34. alaskajoel

    alaskajoel Notebook Deity

    Reputations:
    1,088
    Messages:
    1,031
    Likes Received:
    964
    Trophy Points:
    131
    I love playing FPS online, but still play with vsync because I absolutely hate screen tearing. The latency is noticeable to me, but I have such potato aim, it's the least of my problems.
     
    ajc9988 and Bullrun like this.
  35. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Maybe you have such potato aim because you use V-Sync. XD
     
    Mr Najsman and alaskajoel like this.
  36. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
    Lol, I remember the LAN parties with low-grav InstaGib on UT2004 :D I had my X600M up from 400MHz core to 600MHz core, and vsync was certainly disabled.
     
    alaskajoel likes this.
  37. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Quake III instagib railgun on The Longest Yard was the best
     
    alaskajoel likes this.
  38. alaskajoel

    alaskajoel Notebook Deity

    Reputations:
    1,088
    Messages:
    1,031
    Likes Received:
    964
    Trophy Points:
    131
    Oh, the feels. <3

    Actually played some UT99 the other week with a friend just for the memories..... low grav, InstaGib, DM-Morpheus. Le sigh...

    Still have potato aim though :) I was never really good at those games. Super fun to play, but I was never any good. I was the guy getting his butt kicked by Koreans in BroodWar instead!
     
    Last edited: Jun 21, 2015
  39. alaskajoel

    alaskajoel Notebook Deity

    Reputations:
    1,088
    Messages:
    1,031
    Likes Received:
    964
    Trophy Points:
    131
    *double post*
     
  40. Brent R.

    Brent R. Notebook Evangelist

    Reputations:
    37
    Messages:
    673
    Likes Received:
    148
    Trophy Points:
    56
    Counter-Strike FTW, started back in the betas
     
  41. akm

    akm Notebook Enthusiast

    Reputations:
    0
    Messages:
    43
    Likes Received:
    2
    Trophy Points:
    16
    I have a Eurocom P7 Pro (P770ZM). Where can I find the latest BIOS for this model and the instructions to upgrade the BIOS? Also, can anyone please tell me how to flash my own logo with the BIOS?
     
  42. Bullrun

    Bullrun Notebook Deity

    Reputations:
    545
    Messages:
    1,171
    Likes Received:
    494
    Trophy Points:
    101
    You may want to check with Eurocom for it. I believe they are partnered with @Prema so it's already a modded BIOS.
    Read this thread http://forum.notebookreview.com/threads/mythlogic-dia-1614.775340/
    And follow Prema's link (post #6) and read it too.
     
    Last edited: Jun 21, 2015
  43. Brent R.

    Brent R. Notebook Evangelist

    Reputations:
    37
    Messages:
    673
    Likes Received:
    148
    Trophy Points:
    56
    May I ask why you want to update your BIOS? Are you having problems? Otherwise, the only reason to update the BIOS, I've been told, is to fix something, unless you're wanting to OC?


    Sent from my iPhone using Tapatalk
     
  44. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
    I would not advise unpacking the BIOS to work in your own image, as you can end up with a brick. Prema may be willing to do it for some beer money if he has the free time at some point, but you would have to ask.
     
  45. Mr Najsman

    Mr Najsman Notebook Deity

    Reputations:
    600
    Messages:
    931
    Likes Received:
    697
    Trophy Points:
    106
    Anyone tested the newly released, un-optimized Batman: Arkham Knight with their Batman?
     
  46. superkyle1721

    superkyle1721 Notebook Evangelist

    Reputations:
    12
    Messages:
    474
    Likes Received:
    266
    Trophy Points:
    76
    If by "not optimized" you mean a driver has yet to be released: I thought the driver for the game was released yesterday?
     
  47. Kommando

    Kommando Notebook Evangelist

    Reputations:
    46
    Messages:
    376
    Likes Received:
    271
    Trophy Points:
    76
    I'm waiting for some good feedback before giving it a try. I was very disappointed by the second Batman, which was boring as hell with its blocking mechanic... The first one was good though.
     
  48. Mr Najsman

    Mr Najsman Notebook Deity

    Reputations:
    600
    Messages:
    931
    Likes Received:
    697
    Trophy Points:
    106
    No. Many have severe performance issues regardless of Nvidia or AMD. For starters, it's capped to 30 FPS, which has to be manually edited in an .ini file. Cap or not, many users report horrible performance, stutters, FPS dips and crashes.
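
    For anyone hunting for the cap: by most reports it's a single value in one of the game's config .ini files. The file and key names in the sketch below are from memory of those reports, so treat them as a guess, and back the file up first.

    # Hedged sketch: lift the reported 30 FPS cap by rewriting the
    # "MaxFPS" line. File name and key are from forum reports, not
    # verified here. Back up the file before touching it.
    from pathlib import Path

    ini = Path("BmSystemSettings.ini")  # assumed name, in the game's settings folder
    text = ini.read_text()
    ini.write_text(text.replace("MaxFPS=30.000000", "MaxFPS=120.000000"))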

    I played a bit tonight. 1080p, all normal settings maxed, all Nvidia GameWorks features off. It ran really well, 60+ FPS. I suspect the dynamic fog and light shafts in Nvidia GameWorks are the real killers.
     
  49. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
    Capped at 30 FPS? Holy console, Batman!
     
    ajc9988 and Mr Najsman like this.
  50. Scerate

    Scerate Notebook Evangelist

    Reputations:
    224
    Messages:
    687
    Likes Received:
    650
    Trophy Points:
    106
    I pulled off quite an achievement with my 4790K (which was prone to overheating quite easily, even delidded) 2 days ago. I applied the screw mod for my CPU and marked my screws with a line so that I'd know how many turns I did, because I thought maybe my heatsink was slightly warped in a way that isn't really visible under normal conditions. And lo and behold, I reran the Prime95 1344K AVX stress test and I'm now maxing out at 86 °C with fans maxed and the machine propped up. So yeah, that was stunning; before, I was around 97 °C most of the time with FN+1 and propping up, and of course some throttling.

    The last 2 days I've played quite a bit of GW2 again (pre-hype for the coming addon), and this game normally brought my CPU to 95 °C levels and my GPU to 78 °C. Now I'm mostly sitting around 69 °C CPU and 61 °C GPU (stock of course, GW2 isn't that demanding even with supersampling enabled ingame). I'm very, very happy with those temps now; I absolutely recommend the screw mod ;) :vbthumbsup:

    Looks like Nvidia Shatworks is at it again; even a 980 Ti can't handle those effects right.
     
    Last edited: Jun 24, 2015