The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Thin clevo G-sync laptops

    Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by Legion343, Aug 20, 2015.

  1. Legion343

    Legion343 Notebook Consultant

    Reputations:
    21
    Messages:
    208
    Likes Received:
    109
    Trophy Points:
    56
  2. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    too bad they're not giving out any specs yet on the 6700HQ ;)
     
  3. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
    I'd treat any pre-release information as unreliable at the moment.
     
  4. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Da eff? So either that site is lying, or G-Sync is somehow working with Optimus now, or new P67xSx/P65xSx is MXM.
     
  5. Legion343

    Legion343 Notebook Consultant

    Reputations:
    21
    Messages:
    208
    Likes Received:
    109
    Trophy Points:
    56
    They are quite reliable, and gamenab said that G-Sync should work on Optimus too (more work needed)...
     
  6. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Nvidia ordered a hit on Gamenab. He ded, bro.
     
  7. Legion343

    Legion343 Notebook Consultant

    Reputations:
    21
    Messages:
    208
    Likes Received:
    109
    Trophy Points:
    56
    I know this, but at least we know that there is a way to make G-crap work on Optimus.
     
  8. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    nice, so then we'll have double crap machines with craptimus and crap-sync? :D

    Sent from my Nexus 5 using Tapatalk
     
    Bullrun likes this.
  9. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Just because Gamenab said so? He jumped the gun and said a lot of things in the beginning that were not true.

    What's so bad about G-Sync? VRR tech is inherently a good thing, Nvidia is just being Nvidia and fleecing everyone for it. On another note, Intel will be supporting Adaptive-Sync in the future, which leaves Nvidia the odd man out. But I doubt Nvidia cares when it has 80% share of the dGPU market.
     
    Prema likes this.
  10. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    I was just referring to Legion343's post :)

    I myself have no definitive opinion on G-Sync yet, but it wouldn't surprise me if this tech is really overhyped...

    Sent from my Nexus 5 using Tapatalk
     
  11. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    In reality, it's the Intel GPU that would need to support adaptive sync, which should be possible, and it would surprise me if they didn't try, given their focus on the IGP. The entire point of the PSR feature was to save power, which Intel has been pushing really hard for the past few generations.

    It's a highly misunderstood tech. Fact is, it's incredible, but it's one of those things you don't really notice until it's gone.
     
  12. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    On an iGPU-only notebook, sure, but Intel support alone is not enough for Optimus G-Sync seeing as Nvidia is intent on making money on G-Sync licenses whether or not it's contributing anything beyond a sticker and extra branding. All the extra DRM mobile G-Sync is shackled with is enough proof of that.
     
  13. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    I'm going by the fact that Adaptive Sync and PSR (which is all mobile G-Sync is) are part of the eDP implementation, which happens entirely within the realm of the IGP in an Optimus laptop. I can't see how Nvidia could possibly get around this if the NV chip is only writing out to the Intel framebuffer. It has no control over refresh rates or the display.

    Not that there isn't also some work required by NV, though. Assuming you could get the Intel side to do adaptive sync, the NV chip would have to write to the IGP framebuffer and somehow trigger refreshes through the IGP, and I can't even begin to see how that's possible. That said, if the IGP in the 6700K is anything to go by, the Skylake HQ processors already support eDP 1.3, which is pretty much the barrier to entry for adaptive sync, so on the Intel side it's entirely possible.
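To make the division of labour concrete, here is a toy sketch (plain Python, illustrative only, not real driver code) of the path described above: the dGPU only deposits finished frames into the IGP framebuffer, and the IGP alone decides when the panel refreshes, either on a fixed tick or, with adaptive sync, as soon as a frame lands:

```python
import math

def display_times(frame_ready_times, adaptive, refresh_hz=60.0,
                  min_interval=1 / 144):
    """Return the time each frame actually reaches the panel.

    The dGPU's only job is producing the frame_ready_times; everything
    after that (scan-out timing) is the IGP's decision.
    """
    interval = 1.0 / refresh_hz
    shown = []
    last_refresh = float("-inf")
    for t in frame_ready_times:
        if adaptive:
            # Adaptive sync: refresh the moment the frame lands, but no
            # sooner than the panel's minimum refresh interval allows.
            refresh = max(t, last_refresh + min_interval)
        else:
            # Fixed refresh: the frame waits for the next refresh tick.
            refresh = math.ceil(t / interval) * interval
        shown.append(refresh)
        last_refresh = refresh
    return shown

frames = [0.010, 0.030, 0.055]   # irregular render-finish times (seconds)
fixed = display_times(frames, adaptive=False)
adapt = display_times(frames, adaptive=True)
# With adaptive sync, no frame ever reaches the panel later than it
# would on the fixed 60Hz link.
```

The point matches Meaker's remark in the next post: once the IGP supports variable refresh on its eDP link, the dGPU's role doesn't change; it still just delivers complete frames.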
     
  14. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
    If the IGP supports it then it should not matter too much in optimus mode as the dGPU sends complete frames to the IGP, the display timing is totally up to the IGP.
     
  15. Legion343

    Legion343 Notebook Consultant

    Reputations:
    21
    Messages:
    208
    Likes Received:
    109
    Trophy Points:
    56
    I can't say how they did that. Maybe they managed to do it through Optimus, or they permanently disabled the iGPU, or they used a manual switch like in the latest Gigabyte and MSI GT72...
     
  16. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I don't think it needs MXM... just no Optimus XD.

    Don't the ASUS machines have soldered cards but a Gsync model?
     
  17. Thorne

    Thorne Notebook Evangelist

    Reputations:
    15
    Messages:
    415
    Likes Received:
    27
    Trophy Points:
    41
    Without Optimus the battery life will be quite bad; maybe this has something to do with Intel's just-announced support for adaptive sync?

    I think all Asus laptops for sale ATM have soldered cards, but I could be mistaken.
     
  18. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Well battery life could be bad sure, but you don't have to buy the Gsync model. It's not like it changes the specs of the laptop.

    Gsync model = ~2 hours BL
    Regular model = ~4 hours BL
     
  19. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I know, I just assumed the 6GB 970M and 8GB 980M were MXM since the BGA versions have half the VRAM.
     
  20. XMG

    XMG Company Representative

    Reputations:
    749
    Messages:
    1,755
    Likes Received:
    2,199
    Trophy Points:
    181
    @octiceps - there are 6GB 970M and 8GB 980M integrated versions; the Gigabyte chassis among others have this option. It's just that the current Clevo integrated versions are 3GB/4GB.
     
    Last edited: Aug 21, 2015
  21. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Typos? But yeah I meant Clevo.
     
  22. XMG

    XMG Company Representative

    Reputations:
    749
    Messages:
    1,755
    Likes Received:
    2,199
    Trophy Points:
    181
    Yup, sorry, I realised just after I clicked Post Reply, but my interweb connection dropped out so I couldn't immediately edit!
     
  23. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    As XMG said, there are higher-vRAM integrated versions, but even weirder, there are low-vRAM MXM versions. MSI's GT72 has been selling with 3GB 970Ms in pre-built configs on places like Amazon etc.
     
  24. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
    There are 4GB 980M MXM models around too.
     
  25. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    Given that Intel confirmed Adaptive Sync is coming "after" Skylake, I'm wondering if this is just some poor marketing or mistake (they don't specify desktop or mobile though, so it's still possible mobile may get it simply due to the eDP benefits).

    Alternatively, it may be something like 1/2 of the mini-DP ports being connected directly to the NV GPU allowing you to use an external G-Sync panel.
     
  26. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Not overhyped, overpriced... it's really a treat to see; just the artificial requirements imposed by Ngreedia are a bit troublesome.
     
    D2 Ultima likes this.
  27. belegdol

    belegdol Notebook Geek

    Reputations:
    4
    Messages:
    87
    Likes Received:
    5
    Trophy Points:
    16
    @XMG: Will there be a non-gsync version available, given that Gsync seems to limit the display to 1080p? Am I correct in assuming that p506/p706 will be the gsync versions with Full HD screens, and h506/h706 the non-gsync ones with higher resolution options?
     
    Last edited: Sep 23, 2015
  28. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    No, G-Sync doesn't limit it to 1080p; it's just the screen selected for this model. It can be any resolution, it just needs to be "Nvidia approved". Although I don't know if there are any 4K 17" LCDs yet.
     
  29. Samot

    Samot Notebook Evangelist

    Reputations:
    224
    Messages:
    610
    Likes Received:
    334
    Trophy Points:
    76
  30. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
    Yes, some very new panels.
     
  31. XMG

    XMG Company Representative

    Reputations:
    749
    Messages:
    1,755
    Likes Received:
    2,199
    Trophy Points:
    181
    That's what we have done with the Skylake "XMG U" and "Schenker W" series, so I guess you could assume that the same logic applies to our new P and H series - I couldn't possibly confirm anything until they launch though!

    Customer feedback strongly suggests that gamers want G-Sync but mainly prefer FHD, which fits our gaming-orientated XMG sub-brand, whereas owners with a more professional usage profile are more inclined to want 4K but have no need for G-Sync - so the [W]ork series has those options.
     
    Samot and jaybee83 like this.
  32. wickette

    wickette Notebook Deity

    Reputations:
    241
    Messages:
    1,006
    Likes Received:
    495
    Trophy Points:
    101
    eDP 1.2a will have adaptive sync applied directly at the screen... all of your screens (LP156WFXX, B156HANXXX) are eDP 1.2 panels... when adaptive sync arrives, G-Sync will be an overpriced, useless thing...

    but yes, we'll have to wait a while for that... standards take a lot of time to become... standards, as manufacturers need to deplete their stocks of the previous standard first...
     
  33. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Except the 15" Clevo gets a 4k G-sync panel. I'd so much rather have a 3K or 2560x1440 LCD. But all you get is either 1080p or 4k. Nothing in between.
     
    deepfreeze12 likes this.
  34. deepfreeze12

    deepfreeze12 Notebook Guru

    Reputations:
    0
    Messages:
    57
    Likes Received:
    61
    Trophy Points:
    26
    Yes, I agree, a 3K or 2.5K panel with G-Sync would be the perfect balance between high-res gaming and playable frame rates. FHD isn't asking too much of today's hardware, and 4K pushes it way too hard if you don't have an SLI setup.
     
    HTWingNut likes this.
  35. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    *points at sig* :D :D :D

    Sent from my Nexus 5 using Tapatalk
     
  36. deepfreeze12

    deepfreeze12 Notebook Guru

    Reputations:
    0
    Messages:
    57
    Likes Received:
    61
    Trophy Points:
    26
    *points at missing g-sync* :p
     
  37. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    *laughs it off and keeps pointing happily at sig* :D

    Sent from my Nexus 5 using Tapatalk
     
    TomJGX likes this.
  38. TomJGX

    TomJGX I HATE BGA!

    Reputations:
    1,456
    Messages:
    8,707
    Likes Received:
    3,315
    Trophy Points:
    431
    I still don't see the hype about G-Sync... I've never seen tearing in-game and actually, it's perfectly fine for me without it...
     
  39. deepfreeze12

    deepfreeze12 Notebook Guru

    Reputations:
    0
    Messages:
    57
    Likes Received:
    61
    Trophy Points:
    26
    Well yeah, especially if you have high framerates, but if your framerates are close to 30, it's gonna be better with G-Sync. So if it's a poorly optimized game and/or a very heavy one, I guess it's a nice little bonus to have. :) Never hurts to have more extras. :D
     
  40. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Because you have to see it first *ba-dum tsss*
     
    James D, TomJGX and deepfreeze12 like this.
  41. Support.1@XOTIC PC

    Support.1@XOTIC PC Company Representative

    Reputations:
    203
    Messages:
    4,355
    Likes Received:
    1,099
    Trophy Points:
    231
    Yeah, it helps out most between 30 and 60 fps. So it might not be all that important on newer systems that can run games well and don't have much tearing, but it might be a nice thing to have as a system ages.
     
  42. Ramzay

    Ramzay Notebook Connoisseur

    Reputations:
    476
    Messages:
    3,185
    Likes Received:
    1,065
    Trophy Points:
    231
    I had it for a little while, and I personally didn't see a whole lot of difference, but I see it as a way to future-proof your machine. With G-SYNC, not only will you get 6-12 months extra out of your machine, it will make even current games look more fluid. It essentially negates the drawbacks we currently face (V-SYNC off and screen tearing, or V-SYNC on and lag/latency).

    Most people who've had it now swear by it.

    One way to think of it is like so: Imagine you have enough money for either a GTX 980M, or a GTX 970M + G-SYNC. While the 980M will pump out more FPS, the 970M+G-SYNC combo will look better/smoother, even with less FPS.

    Either way, it currently looks like Clevo machines will cost about $100 more to include G-SYNC, and at that price, I'm thinking there's no reason NOT to get it. You can turn it off if you want.
     
  43. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
    It really works well if you overclock your monitor too, since it allows you to take advantage of scenes where you go over 60fps but does not punish you if you can't.
     
  44. Support.1@XOTIC PC

    Support.1@XOTIC PC Company Representative

    Reputations:
    203
    Messages:
    4,355
    Likes Received:
    1,099
    Trophy Points:
    231
    Even if the screen has more than 60Hz (overclocked or stock refresh), it would still look better and more fluid than just 60Hz. So there would be some benefit if it is 75Hz by default, whether G-Sync is used or not.
     
  45. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
    Yes, but say you are sensitive to tearing and your GPU is putting out 65fps. With V-Sync at 60Hz you get 60fps, but at, say, 80Hz you would only get 40fps (half of 80Hz). With G-Sync you would get your full 65fps (and Hz).
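The numbers in this post fall out of V-Sync rounding every frame up to a whole number of refresh intervals; a quick sketch of the arithmetic (plain Python, values taken from the post):

```python
import math

def vsync_fps(gpu_fps, refresh_hz):
    """Effective frame rate with V-Sync: each frame is held until the
    next refresh tick, so its on-screen time is rounded up to a whole
    number of refresh intervals."""
    frame_time = 1.0 / gpu_fps           # GPU render time per frame
    interval = 1.0 / refresh_hz          # one refresh period
    ticks = math.ceil(frame_time / interval - 1e-9)  # guard float noise
    return 1.0 / (ticks * interval)

print(round(vsync_fps(65, 60)))   # 60 - the GPU outpaces the panel, so it's capped
print(round(vsync_fps(65, 80)))   # 40 - each frame misses a refresh, halving 80Hz
# With G-Sync the panel simply follows the GPU: min(65, panel max Hz) = 65fps
```

This is also why a small monitor overclock can backfire with plain V-Sync (as in the 80Hz case above) but is pure upside with G-Sync.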
     
  46. Ramzay

    Ramzay Notebook Connoisseur

    Reputations:
    476
    Messages:
    3,185
    Likes Received:
    1,065
    Trophy Points:
    231
    I'm pretty sure all the "75Hz" panels are just 60Hz panels that are overclocked by the OEMs to 75Hz anyway.
     
    D2 Ultima and jaybee83 like this.
  47. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    they indeed are, since there aren't any panels that list 75Hz stock in their specs :D

    Sent from my Nexus 5 using Tapatalk
     
  48. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,189
    Likes Received:
    17,900
    Trophy Points:
    931
    Define overclocked? If a part is binned and quality tested at higher frequencies then that's not overclocking. That's shipping it at a higher frequency.
     
  49. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    semantics ;)

    Sent from my Nexus 5 using Tapatalk